Let $T$ be a linear operator on a finite-dimensional vector space $V$.
$T$ is diagonalizable if there exists an ordered basis $\beta $ of $V $ s.t. $[ T ]_{ \beta } $ is a diagonal matrix.
Let $A \in M_{n \times n}(F)$.
The left-multiplication operator $L_A : F^n \to F^n $ is diagonalizable if there exists an ordered basis $\gamma $ of $F^n $ s.t.
$[ L_A ]_{ \gamma }$ is a diagonal matrix.
A square matrix $A $ is called diagonalizable if $L_A $ is diagonalizable.
Equivalently,
$A$ is diagonalizable if there exists an invertible matrix $Q$ s.t. $\inv{ Q }AQ$ is a diagonal matrix.
To diagonalize a matrix or a linear operator is to find a basis of eigenvectors and the corresponding eigenvalues.
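As a numerical sketch of this definition (using numpy; the helper name and the test matrices are chosen here for illustration, not from the text): $A$ is diagonalizable exactly when its eigenvectors form a basis of $F^n$, i.e. when the matrix $Q$ whose columns are eigenvectors is invertible.

```python
import numpy as np

def is_diagonalizable(A, tol=1e-10):
    """Hypothetical helper: numerically test diagonalizability of a real matrix."""
    eigvals, Q = np.linalg.eig(A)      # columns of Q are eigenvectors of A
    # Q invertible  <=>  the eigenvectors form a basis  <=>  A is diagonalizable
    return bool(abs(np.linalg.det(Q)) > tol)

A = np.array([[0.0, -2.0],
              [1.0,  3.0]])           # eigenvalues 1 and 2: diagonalizable
B = np.array([[1.0,  1.0],
              [0.0,  1.0]])           # a Jordan block: not diagonalizable

print(is_diagonalizable(A))  # True
print(is_diagonalizable(B))  # False
```

The determinant test is a numerical heuristic: for a defective matrix the computed eigenvector matrix is singular up to rounding error.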
$L_A : F^n \to F^n $, with ordered bases $\beta \neq \gamma $ of $F^n$:
$L_A(v) = Av, \forall v \in F^n $
$v = {\Vcr{ a_1 }{ a_2 }{ \vdots }{ a_n }} \in F^n$
$[ L_A ]_{ \beta } = A$ if $\beta = \set{ e_1, e_2, …, e_{ n }}$ is the standard basis
$[ L_A ]_{ \gamma } = B $, where
$B = \inv{ Q }A Q$ and $Q$ is the matrix whose columns are the vectors of $\gamma$.
$T: V \to V $, assume that $T $ is diagonalizable.
$\implies $ there exists an ordered basis $\beta = \set{ x_1, x_2, …, x_{ n }} $ s.t.
$[ T ]_{ \beta } = {\Mee{ \lambda_1 }{ 0 }{ 0 }{ 0 }{ \ddots }{ 0 }{ 0 }{ 0 }{ \lambda_n }} $
$\lambda_i$ are not necessarily distinct.
$T(x_1) = \lambda_1 x_1 + 0 x_2 + … + 0 x_n$
$\implies T(x_1) = \lambda_1 x_1$ and $x_1 \neq 0$
$\implies \lambda_1 $ is an eigenvalue of $T$ and $x_1 $ is a corresponding eigenvector.
Conclusion: $\beta $ is a basis in which every vector is an eigenvector of $T$, and the diagonal entries of $[ T ]_{ \beta } $ are the corresponding eigenvalues.
$T: V \to V $, now assume that $\set{ v_1, v_2, …, v_{ n }} $ is a basis of $V $ s.t. each $v_i $ is an eigenvector of $T $ corresponding to the eigenvalue $t_i $.
$$T(v_1) = t_1v_1 = t_1 v_1 + 0v_2 + … + 0v_{ n } \br T(v_2) = t_2v_2 = 0 v_1 + t_2v_2 + … + 0v_{ n } \br \vdots \br T(v_n) = t_nv_n = 0 v_1 + 0v_2 + … + t_nv_{ n }$$
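Reading off these coordinate columns, the matrix of $T$ relative to this basis is diagonal, which completes the converse direction:

```latex
[ T ]_{ \beta } = {\Mee{ t_1 }{ 0 }{ 0 }{ 0 }{ \ddots }{ 0 }{ 0 }{ 0 }{ t_n }}
```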
Let $T$ be a linear operator on a finite-dimensional vector space $V$.
Then $T$ is diagonalizable $\iff $
there exists an ordered basis $\beta = \set{ x_1, x_2, …, x_{ n }}$ of $V$
consisting of eigenvectors of $T $. In this case, $[ T ]_{ \beta }$ is a diagonal matrix whose diagonal entries are the corresponding eigenvalues.
Let $T$ be a linear operator on a finite-dimensional vector space $V$.
Let $\lambda_1, \lambda_2, …, \lambda_{ k }$ be distinct eigenvalues of $T$, $k \leq n$.
If $x_1, x_2, …, x_{ k } $ are corresponding eigenvectors, then $\set{ x_1, x_2, …, x_{ k }}$ is linearly independent.
Let $T$ be a linear operator on a finite-dimensional vector space $V$.
If $T$ has $n$ distinct eigenvalues, then $T$ is diagonalizable.
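A numerical sketch of this corollary (the matrix is chosen here for illustration): a $3 \times 3$ matrix with three distinct eigenvalues has three linearly independent eigenvectors, so the eigenvector matrix $Q$ diagonalizes it.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])      # upper triangular: eigenvalues 2, 3, 5 (distinct)

eigvals, Q = np.linalg.eig(A)
assert len(set(np.round(eigvals, 8))) == 3    # three distinct eigenvalues
assert np.linalg.matrix_rank(Q) == 3          # eigenvectors are linearly independent

D = np.linalg.inv(Q) @ A @ Q                  # change of basis: Q^{-1} A Q
print(np.round(D, 8))                         # diagonal, with 2, 3, 5 on the diagonal
```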
A polynomial $f(x)$ with coefficients in a field $F$ is said to split over $F$ if $f(x)$ can be written as a product of linear polynomials with coefficients in $F$,
i.e. $f(x) = a(x- a_1)(x - a_2)(x- a_3) … (x - a_n)$.
note: $f$ splits over $F$ if it can be completely factored into linear factors over $F$.
e.g. $x^2 + 2x -3 = (x+3)(x -1)$ splits over $\R$, and so does
$5(x-2)^2(x-3)x^3 = 5(x-2)(x-2)(x-3)xxx$
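A quick numerical check of the first example (using numpy.roots; the second polynomial is added here for contrast): a real polynomial splits over $\R$ exactly when all of its roots are real.

```python
import numpy as np

# x^2 + 2x - 3: both roots are real, so it splits over R as (x + 3)(x - 1)
roots = np.roots([1, 2, -3])          # coefficients of x^2 + 2x - 3
print(np.sort(roots))                 # [-3.  1.]

# x^2 + 1: roots are +-i, so it splits over C but not over R
print(np.roots([1, 0, 1]))
```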
Let $T$ be a linear operator on a finite-dimensional vector space $V$.
Let $\lambda $ be an eigenvalue of $T $; the algebraic multiplicity of $\lambda $ is the largest positive integer $k \geq 1 $ for which $(t - \lambda)^k $ is a factor of the characteristic polynomial of $T$.
Let $T$ be a linear operator on a finite-dimensional vector space $V$.
If $\lambda \in F $ is an eigenvalue of $T $, we define a subspace $E_ \lambda \subseteq V $ as $E_ \lambda = \set{ x \in V | T(x) = \lambda x } = N(T - \lambda I_V)$.
$E_ \lambda$ is called the eigenspace of $\lambda. $
The geometric multiplicity of $\lambda$ is defined as $\dim E_{ \lambda }$.
Let $T$ be a linear operator on a finite-dimensional vector space $V$.
If $\lambda $ is an eigenvalue of $T $ with algebraic multiplicity $m $, then $1 \leq \dim E_{ \lambda } \leq m$.
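A numerical illustration of this bound (the matrix is chosen here, not from the text): for the Jordan block below, $\lambda = 2$ has algebraic multiplicity $2$ but geometric multiplicity only $1$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])            # char. polynomial (t - 2)^2, so alg.mult.(2) = 2

lam = 2.0
# algebraic multiplicity: multiplicity of lam among the roots of det(A - tI)
alg_mult = int(np.isclose(np.linalg.eigvals(A), lam).sum())
# geometric multiplicity: dim E_lam = dim N(A - lam*I) = n - rank(A - lam*I)
geo_mult = A.shape[0] - int(np.linalg.matrix_rank(A - lam * np.eye(2)))

print(alg_mult, geo_mult)             # 2 1, consistent with 1 <= geo <= alg
```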
Let $T$ be a linear operator on a finite-dimensional vector space $V$, and let $\lambda_1, \lambda_2, …, \lambda_{ k } $ be distinct eigenvalues of $T $. Let $v_i \in E_{ \lambda_i }$ for $i = 1, 2, …, k .$
If $$v_1 + v_2 + … + v_{ k } = 0 $$ then $$v_1 = v_2 = … = v_k = 0 $$
Let $T$ be a linear operator on a finite-dimensional vector space $V$, and let $\lambda_1, \lambda_2, …, \lambda_{ k } $ be distinct eigenvalues of $T $.
Let $S_i $ be a linearly independent subset of $E_{ \lambda_i } $ for $i = 1, 2, …, k $.
Then $\underset{i=1}{\overset{k} \cup} S_i$ is also linearly independent.
Let $T$ be a linear operator on a finite-dimensional vector space $V$, and let $\lambda_1, \lambda_2, …, \lambda_{ k }$ be the distinct eigenvalues of $T$. Then
- $T $ is diagonalizable $\iff$ the characteristic polynomial of $T$ splits and, for each $i = 1, 2, …, k $, algebraic multiplicity$(\lambda_i) =$ geometric multiplicity$(\lambda_i)$.
- If $T $ is diagonalizable and $\beta_1, \beta_2, …, \beta_{ k } $ are bases of $E_{ \lambda_1}, E_{ \lambda_2}, …, E_{ \lambda_{ k }} $ respectively, then $\beta = \beta_1 \cup \beta_2 \cup … \cup \beta_k $ is a basis of $V $.
Let $T $ be a linear operator on an $n $-dimensional vector space $V $. Then $T $ is diagonalizable if and only if both of the following conditions hold.
- The characteristic polynomial of $T $ splits.
- For each eigenvalue $\lambda$ of $T $, $\text{alg.mult.}(\lambda) = \text{geo.mult.}(\lambda)$.
These same conditions can be used to test if a square matrix $A $ is diagonalizable because diagonalizability of $A $ is equivalent to diagonalizability of the operator $L_A $.
If $T $ is diagonalizable, and $\beta_1, \beta_2, …, \beta_{ k } $ are ordered bases for the eigenspaces of $T $,
then the union $\beta = \beta_1 \cup \beta_2 \cup … \cup \beta_k $ is an ordered basis for $V $ consisting of eigenvectors of $T $, and hence $[T]_ \beta $ is a diagonal matrix.
We test the matrix $$A = {\Mee{ 3 }{ 1 }{ 0 }{ 0 }{ 3 }{ 0 }{ 0 }{ 0 }{ 4 }} \in M_{3 \times 3}(R) $$ for diagonalizability.
The characteristic polynomial of $A $ is $\det(A - t I) = -(t - 4)(t- 3)^2$, which splits.
$A$ has eigenvalues $\lambda_1 = 4$ and $\lambda_2 = 3$ with algebraic multiplicities $1$ and $2$, respectively.
Since $\lambda_1 $ has alg.mult. $1 $, by theorem 5.7, geo.mult. $= 1 =$ alg.mult.
Then we check $\lambda_2$. Since
$$A - \lambda_2I = {\Mee{ 0 }{ 1 }{ 0 }{ 0 }{ 0 }{ 0 }{ 0 }{ 0 }{ 1 }}$$
has $\rank{ 2 }$, we see that $\nullity{ A - \lambda_2 I } = 3 - \rank{ A - \lambda_2 I } = 1 $, which is not the alg.mult. of $\lambda_2 $.
$\therefore A $ is not diagonalizable.
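The rank/nullity computation above can be double-checked numerically (verification only; numpy is an implementation choice here, not part of the text):

```python
import numpy as np

A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])

M = A - 3.0 * np.eye(3)               # A - lambda_2 I, with lambda_2 = 3
rank = int(np.linalg.matrix_rank(M))
nullity = 3 - rank                    # dim E_3 = geometric multiplicity of 3
print(rank, nullity)                  # 2 1: geo.mult.(3) = 1 < 2 = alg.mult.(3)
```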
Let $$A = {\Mww{ 0 }{ -2 }{ 1 }{ 3 }} $$ We show that $A $ is diagonalizable and find a $2 \times 2 $ matrix $Q $ s.t. $\inv{ Q }A Q $ is a diagonal matrix.
$\det(A - t I) = (t -1)(t- 2) $, and hence $A $ has two distinct eigenvalues, $\lambda_1 = 1 $ and $\lambda_2 = 2 $. By applying the corollary to theorem 5.5 to the operator $L_A $, we see that $A $ is diagonalizable.
Moreover,
$$\gamma_1 = \set{{\Vcw{ -2 }{ 1 }}}, \gamma_2 = \set{{\Vcw{ -1 }{ 1 }}} $$
are bases for the eigenspaces $E_{\lambda_1} $ and $E_{ \lambda_2 } $, respectively. Therefore
$\gamma = \gamma_1 \cup \gamma_2 = \set{{\Vcw{ -2 }{ 1 }}, {\Vcw{ -1 }{ 1 }}} $
is an ordered basis for $\R^2 $ consisting of eigenvectors of $A $. Let
$$ Q = {\Mww{ -2 }{ -1 }{ 1 }{ 1 }} $$
be the matrix whose columns are the vectors in $\gamma$. Then, by the corollary to theorem 2.23,
$$ D = \inv{ Q }AQ = [ L_A ]_{ \gamma } = {\Mww{ 1 }{ 0 }{ 0 }{ 2 }} $$
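This final computation is easy to verify numerically (verification only; numpy is an implementation choice here):

```python
import numpy as np

A = np.array([[0.0, -2.0],
              [1.0,  3.0]])
Q = np.array([[-2.0, -1.0],
              [ 1.0,  1.0]])          # columns are the eigenvectors in gamma

D = np.linalg.inv(Q) @ A @ Q          # should be diag(1, 2)
print(np.round(D, 10))
```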