$\newcommand{\br}{\\}$ $\newcommand{\R}{\mathbb{R}}$ $\newcommand{\Q}{\mathbb{Q}}$ $\newcommand{\Z}{\mathbb{Z}}$ $\newcommand{\N}{\mathbb{N}}$ $\newcommand{\C}{\mathbb{C}}$ $\newcommand{\P}{\mathbb{P}}$ $\newcommand{\F}{\mathbb{F}}$ $\newcommand{\L}{\mathcal{L}}$ $\newcommand{\spa}[1]{\text{span}(#1)}$ $\newcommand{\set}[1]{\{#1\}}$ $\newcommand{\emptyset}{\varnothing}$ $\newcommand{\otherwise}{\text{ otherwise }}$ $\newcommand{\if}{\text{ if }}$ $\newcommand{\union}{\cup}$ $\newcommand{\intercept}{\cap}$ $\newcommand{\abs}[1]{| #1 |}$ $\newcommand{\pare}[1]{\left(#1\right)}$ $\newcommand{\t}[1]{\text{ #1 }}$ $\newcommand{\head}{\text H}$ $\newcommand{\tail}{\text T}$ $\newcommand{\d}{\text d}$ $\newcommand{\inv}[1]{{#1}^{-1}}$ $\newcommand{\nullity}[1]{\text{nullity}(#1)}$ $\newcommand{\rank}[1]{\text{rank}(#1)}$ $\newcommand{\oto}{\text{ one-to-one }}$ $\newcommand{\ot}{\text{ onto }}$ $\newcommand{\Vcw}[2]{\begin{pmatrix} #1 \br #2 \end{pmatrix}}$ $\newcommand{\Vce}[3]{\begin{pmatrix} #1 \br #2 \br #3 \end{pmatrix}}$ $\newcommand{\Vcr}[4]{\begin{pmatrix} #1 \br #2 \br #3 \br #4 \end{pmatrix}}$ $\newcommand{\Vct}[5]{\begin{pmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{pmatrix}}$ $\newcommand{\Vcy}[6]{\begin{pmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \end{pmatrix}}$ $\newcommand{\Vcu}[7]{\begin{pmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \br #7 \end{pmatrix}}$ $\newcommand{\Mqw}[2]{\begin{bmatrix} #1 & #2 \end{bmatrix}}$ $\newcommand{\Mqe}[3]{\begin{bmatrix} #1 & #2 & #3 \end{bmatrix}}$ $\newcommand{\Mqr}[4]{\begin{bmatrix} #1 & #2 & #3 & #4 \end{bmatrix}}$ $\newcommand{\Mqt}[5]{\begin{bmatrix} #1 & #2 & #3 & #4 & #5 \end{bmatrix}}$ $\newcommand{\Mwq}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Meq}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Mrq}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Mtq}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Mww}[4]{\begin{bmatrix} #1 & #2 \br #3 & #4 \end{bmatrix}}$ $\newcommand{\Mwe}[6]{\begin{bmatrix} #1 & #2 & #3\br #4 & #5 & #6 \end{bmatrix}}$ $\newcommand{\Mew}[6]{\begin{bmatrix} #1 & #2 \br #3 & #4 \br #5 & #6 \end{bmatrix}}$ $\newcommand{\Mee}[9]{\begin{bmatrix} #1 & #2 & #3 \br #4 & #5 & #6 \br #7 & #8 & #9 \end{bmatrix}}$
Definition: Eigenvalues and Eigenvectors of Linear Operators

$T: V \to V $ is a linear operator on $V$.

A scalar $\lambda \in F $ is called an eigenvalue of $T $ if there exists a non-zero vector $x \in V$ s.t. $T(x) = \lambda x$.

The non-zero vector $x$ is called an eigenvector of $T$ associated to the eigenvalue $\lambda $.

Definition: Eigenvalues and Eigenvectors of Square Matrices

$A_{n \times n} \in M_{n \times n}(F)$. A scalar $\lambda \in F$ is called an eigenvalue of $A$ if $\exists v \in F^n$, $v \neq {\Vcr{ 0 }{ 0 }{ \vdots }{ 0 }}$, s.t. $Av = \lambda v$. Such a $v$ is called an eigenvector of $A$ with respect to $\lambda$.

Example

Let $T: \R^2 \to \R^2$,

$T(x,y) = (x + 3y, 4x + 2y)$

$v_1 = (1, -1), v_2 = (3,4)$

Are $v_1$ and $v_2$ eigenvectors?

$T(v_1) = (-2, 2) = -2(1, -1) = -2v_1 $

$\therefore -2$ is an eigenvalue and $v_1$ is a corresponding eigenvector.

$T(v_2) = (15, 20) = 5(3, 4) = 5v_2$

$5$ is an eigenvalue and $v_2 $ is a corresponding eigenvector.
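The two checks above can be reproduced numerically. A small sketch with NumPy; the matrix of $T$ in the standard basis is read off from the formula $T(x, y) = (x + 3y, 4x + 2y)$:

```python
import numpy as np

# Matrix of T(x, y) = (x + 3y, 4x + 2y) in the standard basis
A = np.array([[1.0, 3.0],
              [4.0, 2.0]])

v1 = np.array([1.0, -1.0])
v2 = np.array([3.0, 4.0])

# T(v1) = -2 * v1 and T(v2) = 5 * v2
print(A @ v1)  # [-2.  2.]
print(A @ v2)  # [15. 20.]
```

Applying $A$ to each candidate vector and comparing with a scalar multiple is exactly the definition-level check done by hand above.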

Lemma

Let $T:V \to V$,

Fix an eigenvalue $\lambda$ and an eigenvector $v$ corresponding to $\lambda$. Then $av$ is also an eigenvector of $T$ corresponding to $\lambda$ for all $a \in F$, $a \neq 0$.

Proof [expand]

$Tv = \lambda v$

We want to show $T(av) = \lambda(av)$.

First, $\because v \neq 0 $ and $a \neq 0 , \therefore av \neq 0$.

Now, $\begin{align*} T(av) &= aT(v) \br &=a(\lambda v) \br &=\lambda (av) \end{align*}$
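The lemma is easy to sanity-check numerically; a sketch reusing the map from the example above, with a few arbitrary non-zero scalars of my choosing:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [4.0, 2.0]])
v = np.array([1.0, -1.0])   # eigenvector for lambda = -2

for a in (2.0, -0.5, 7.0):  # arbitrary non-zero scalars
    # a*v is again an eigenvector for the same eigenvalue -2
    assert np.allclose(A @ (a * v), -2 * (a * v))
```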

Remarks

Eigenvectors are not unique.

Definition: Characteristic Polynomial of Matrices

$A_{n \times n} \in M_{n \times n} (F)$. The characteristic polynomial of $A$ is the degree-$n$ polynomial $f(t) = \det(A - tI_n)$, where $I_n$ is the $n \times n$ identity matrix.

Theorem 5.2

$A_{n \times n} \in M_{n \times n}(F)$. Then $\lambda \in F $ is an eigenvalue of $A$ $\iff$ $\lambda$ is a root of the characteristic polynomial of $A$, i.e. $f(\lambda) = 0$, i.e. $\det(A - \lambda I_n) = 0 $

Proof [expand]

Assume that $\det(A- \lambda I_n) = 0$

$\implies A - \lambda I_n $ is not an invertible matrix

$\implies \rank{ A- \lambda I_n } < n$

By the dimension theorem, $\nullity{ A- \lambda I_n } + \rank{ A- \lambda I_n } = n$

$\implies \nullity{ A - \lambda I_n } > 0$

$\implies \exists v \in N(A- \lambda I_n)$ s.t. $v \neq {\Vcr{ 0 }{ 0 }{ \vdots }{ 0 }}$

$\implies (A- \lambda I_n)(v) = 0$

$\implies Av - \lambda I_n v = 0 $

$\implies Av = \lambda v$ with $v \neq 0$

$\therefore v$ is an eigenvector and $\lambda$ is an eigenvalue of $A$.

Proof [expand]

Assume that $\lambda \in F $ is an eigenvalue.

Then by definition, $\exists v \in F^n$ s.t. $v \neq {\Vcr{ 0 }{ 0 }{ \vdots }{ 0 }}$ and $Av = \lambda v $

$\implies (A- \lambda I_n) v = 0$ with $v \neq 0 $

$\implies N (A - \lambda I_n) \neq \set{0} $

$\implies \nullity{ A - \lambda I_n } > 0 $

Then again by the dimension theorem, $\rank{ A - \lambda I_n } = n - \nullity{ A - \lambda I_n } < n$

$\implies A - \lambda I_n$ is not invertible.

$\implies \det(A- \lambda I_n) = 0$
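Theorem 5.2 reduces finding eigenvalues to root-finding, which is easy to illustrate symbolically. A sketch with SymPy, using a small matrix of my own choosing:

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[2, 1],
               [1, 2]])  # an illustrative 2x2 matrix (not from the notes)

# Characteristic polynomial f(t) = det(A - t*I)
f = (A - t * sp.eye(2)).det()
roots = sp.solve(sp.Eq(f, 0), t)

print(sp.expand(f))  # t**2 - 4*t + 3
print(roots)         # [1, 3]

# Each root lambda really is an eigenvalue: A - lambda*I is singular
for lam in roots:
    assert (A - lam * sp.eye(2)).det() == 0
```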

Lemma

Similar matrices have the same characteristic polynomial.

Proof [expand]

Let $A_{n \times n }, B_{n \times n } $ be similar; then $\exists $ an invertible matrix $Q$ s.t. $B = \inv{ Q } A Q $

We need to prove $\det(B - t I_n) = \det(A - tI_n) $

$\begin{align*} \det(B - tI_n) &=\det(\inv{ Q }A Q - tI_n) \br &=\det(\inv{ Q }A Q - t \inv{ Q } I_n Q) \br &=\det(\inv{ Q }(A - tI_n)Q) \br &=\det(\inv{ Q }) \det(A - tI_n) \det(Q) \br &=\det(A - tI_n) \end{align*}$

using $\det(M_{n \times n} N_{n \times n}) = (\det M) (\det N)$ and $\det(\inv{ Q }) = \frac{ 1 }{ \det Q }$.

Lemma

Let $A_{n \times n}, B_{n \times n} \in M_{n \times n}(F)$.

If $A$ and $B$ are similar, i.e. $\exists$ an invertible matrix $Q $ s.t. $B = \inv{ Q } A Q$, then

$$\det(A - tI_n) = \det (B - t I_n)$$
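The lemma can be checked on a concrete pair of similar matrices; a sketch with SymPy, where $A$ and the invertible $Q$ are my own illustrative choices:

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[1, 2],
               [0, 3]])
Q = sp.Matrix([[1, 1],
               [1, 2]])   # invertible: det(Q) = 1
B = Q.inv() * A * Q       # B is similar to A by construction

charA = sp.expand((A - t * sp.eye(2)).det())
charB = sp.expand((B - t * sp.eye(2)).det())
assert charA == charB     # same characteristic polynomial
print(charA)              # t**2 - 4*t + 3
```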

Definition: Characteristic Polynomial of Linear Operators

Let $T $ be a linear operator on $V $ and $\dim V = n$. The characteristic polynomial of $T $ is defined as the characteristic polynomial of $[ T ]_{ \beta } $, where $\beta$ is any ordered basis of $V $.

Remarks

If $\beta' $ is another ordered basis of $V $, then $\exists Q_{n \times n}$ invertible s.t. $[ T ]_{ \beta' } = \inv{ Q } [ T ]_{ \beta } Q$.

Then by the previous lemma,

$$\det([ T ]_{ \beta' } - t I_n) = \det ([ T ]_{ \beta } - t I_n) $$

WILL APPEAR ON EXAM: FINDING eigenvector and eigenvalue, DIAGONALIZABILITY

Example: IMPORTANT

$T: \P_2(\R) \to \P_2(\R) $

$T(f(x)) = f(x) + (x+1) f'(x) $

Find all eigenvalues and eigenvectors.

$\beta = \set{ 1, x, x^2 }$

$$\begin{align*} T(1) &= 1 + (x+1) \cdot 0 = 1 = 1 \cdot 1 + 0 \cdot x + 0 \cdot x^2 \br T(x) &= x + (x+1) \cdot 1 = 1 + 2x = 1 \cdot 1 + 2 \cdot x + 0 \cdot x^2 \br T(x^2) &= x^2 + (x+1) \cdot 2x = 3x^2 + 2x = 0 \cdot 1 + 2 \cdot x + 3 \cdot x^2 \end{align*}$$

$$A = [ T ]_{ \beta } = {\Mee{ 1 }{ 1 }{ 0 }{ 0 }{ 2 }{ 2 }{ 0 }{ 0 }{ 3 }} $$

Characteristic polynomial of T is:

$$\begin{align*} f(t) &=\det (A - t I_3) \br &= \det {\Mee{ 1-t }{ 1 }{ 0 }{ 0 }{ 2-t }{ 2 }{ 0 }{ 0 }{ 3-t }} \br &= (1-t)(2-t)(3-t) \end{align*}$$

Solve $\det (A - t I_3) = 0 $:

$t = 1, 2,$ or $3 $

The eigenvalues of $T$ are $1$, $2$, and $3$.

For $t = 1$, let $v = {\Vce{ x_1 }{ x_2 }{ x_3 }}$ be an eigenvector of $A $ with respect to $t = 1$.

$$\begin{align*} Av &= 1v \br {\Mee{ 1 }{ 1 }{ 0 }{ 0 }{ 2 }{ 2 }{ 0 }{ 0 }{ 3 }} {\Vce{ x_1 }{ x_2 }{ x_3 }} &= {\Vce{ x_1 }{ x_2 }{ x_3 }} \br {\Vce{ x_1 + x_2 }{ 2 x_2 + 2 x_3 }{ 3x_3 }} &= {\Vce{ x_1 }{ x_2 }{ x_3 }} \end{align*}$$

$\implies \begin{cases} x_1 = x_1 \br x_2 = 0 \br x_3 = 0 \end{cases} $

$v = {\Vce{ x_1 }{ x_2 }{ x_3 }} = {\Vce{ x_1 }{ 0 }{ 0 }} = x_1{\Vce{ 1 }{ 0 }{ 0 }} $

${\Vce{ 1 }{ 0 }{ 0 }}$ is an eigenvector of A with respect to $t = 1 $.

$1 \cdot 1 + 0 \cdot x + 0 \cdot x^2 = 1 $

$\therefore f(x) = 1$ is an eigenvector of $T$ with respect to $t = 1 $.

$t = 2, v = {\Vce{ 1 }{ 1 }{ 0 }} \implies 1 \cdot 1 + 1 \cdot x + 0 \cdot x^2 = 1 + x$

$t=3, v = {\Vce{ 1 }{ 2 }{ 1 }} \implies 1 \cdot 1 + 2 \cdot x + 1 \cdot x^2 = 1 + 2x + x^2 $
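The whole worked example can be verified at once with SymPy's `eigenvects`, applied to the matrix $[T]_\beta$ computed above:

```python
import sympy as sp

# [T]_beta for T(f) = f + (x+1) f' on P_2, with beta = {1, x, x^2}
A = sp.Matrix([[1, 1, 0],
               [0, 2, 2],
               [0, 0, 3]])

# eigenvects() returns (eigenvalue, multiplicity, [basis of eigenspace])
for lam, mult, vecs in A.eigenvects():
    print(lam, [list(v) for v in vecs])
# Eigenvalues 1, 2, 3 with eigenvectors (1,0,0), (1,1,0), (1,2,1),
# i.e. the polynomials 1, 1+x, and 1+2x+x^2 found by hand above.
```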

Theorem

Let $T $ be a linear operator on $V $, and let $\lambda$ be an eigenvalue of $T $.

Then a vector $v \in V $ is an eigenvector of $T$ with respect to $\lambda \iff v \neq 0$ and $v \in N(T - \lambda I_V) $

$$\begin{align*} &\iff v \neq 0, T(v) = \lambda v \br & \iff Tv - \lambda v = 0\br & \iff (T - \lambda I_V) (v) = 0\br & \iff v \in N(T - \lambda I_V) \end{align*}$$
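In coordinates, this theorem says the eigenvectors for $\lambda$ are exactly the non-zero members of the null space of $A - \lambda I$. A sketch with SymPy's `nullspace`, using the matrix from the example above and $\lambda = 2$:

```python
import sympy as sp

A = sp.Matrix([[1, 1, 0],
               [0, 2, 2],
               [0, 0, 3]])
lam = 2

# Basis of N(A - lam*I): its non-zero members are the eigenvectors for lam
ns = (A - lam * sp.eye(3)).nullspace()
v = ns[0]
print(list(v))          # [1, 1, 0]
assert A * v == lam * v  # v is indeed an eigenvector for lam = 2
```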