$\newcommand{\br}{\\}$ $\newcommand{\R}{\mathbb{R}}$ $\newcommand{\Q}{\mathbb{Q}}$ $\newcommand{\Z}{\mathbb{Z}}$ $\newcommand{\N}{\mathbb{N}}$ $\newcommand{\C}{\mathbb{C}}$ $\newcommand{\P}{\mathbb{P}}$ $\newcommand{\F}{\mathbb{F}}$ $\newcommand{\L}{\mathcal{L}}$ $\newcommand{\spa}[1]{\text{span}(#1)}$ $\newcommand{\set}[1]{\{#1\}}$ $\newcommand{\emptyset}{\varnothing}$ $\newcommand{\otherwise}{\text{ otherwise }}$ $\newcommand{\if}{\text{ if }}$ $\newcommand{\union}{\cup}$ $\newcommand{\intercept}{\cap}$ $\newcommand{\abs}[1]{| #1 |}$ $\newcommand{\pare}[1]{\left(#1\right)}$ $\newcommand{\t}[1]{\text{ #1 }}$ $\newcommand{\head}{\text H}$ $\newcommand{\tail}{\text T}$ $\newcommand{\d}{\text d}$ $\newcommand{\inv}[1]{{#1}^{-1}}$ $\newcommand{\nullity}[1]{\text{nullity}(#1)}$ $\newcommand{\rank}[1]{\text{rank }#1}$ $\newcommand{\oto}{\text{ one-to-one }}$ $\newcommand{\ot}{\text{ onto }}$ $\newcommand{\Vcw}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Vce}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Vcr}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Vct}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Vcy}[6]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \end{bmatrix}}$ $\newcommand{\Vcu}[7]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \br #7 \end{bmatrix}}$ $\newcommand{\Mqw}[2]{\begin{bmatrix} #1 & #2 \end{bmatrix}}$ $\newcommand{\Mqe}[3]{\begin{bmatrix} #1 & #2 & #3 \end{bmatrix}}$ $\newcommand{\Mqr}[4]{\begin{bmatrix} #1 & #2 & #3 & #4 \end{bmatrix}}$ $\newcommand{\Mqt}[5]{\begin{bmatrix} #1 & #2 & #3 & #4 & #5 \end{bmatrix}}$ $\newcommand{\Mwq}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Meq}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Mrq}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Mtq}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Mww}[4]{\begin{bmatrix} #1 & #2 \br #3 & #4 \end{bmatrix}}$ $\newcommand{\Mwe}[6]{\begin{bmatrix} #1 & #2 & #3\br #4 & #5 & #6 \end{bmatrix}}$ $\newcommand{\Mew}[6]{\begin{bmatrix} #1 & #2 \br #3 & #4 \br #5 & #6 \end{bmatrix}}$ $\newcommand{\Mee}[9]{\begin{bmatrix} #1 & #2 & #3 \br #4 & #5 & #6 \br #7 & #8 & #9 \end{bmatrix}}$
Theorem 2.22

Throughout this section, $T: V \to V$ is linear and $\dim V$ is finite.

Let $V$ be a finite-dimensional vector space, and let $\beta$ and $\beta'$ be ordered bases of $V$.

Let $I_V: V \to V$ be defined by $I_V(x) = x \ \forall x \in V$, i.e. $I_V$ is the identity transformation on $V$. Let $Q = [ I_V ]_{ \beta' }^{ \beta }$, then

  1. $Q$ is invertible
  2. $Q [ v ]_{ \beta' } = [ v ]_{ \beta }$ for all $v \in V$

The matrix $Q = [ I_V ]_{ \beta' }^{ \beta }$ is called the change of coordinate matrix from $\beta'$-coordinates to $\beta$-coordinates.

Proof

(i) Since the identity transformation $I_V$ is clearly invertible, its matrix $[ I_V ]_{ \beta' }^{ \beta } = Q$ is also invertible.

(ii) Since $I_V(v) = v$, we have $[ I_V(v) ]_{ \beta } = [ v ]_{ \beta }$ for every $v \in V$.

$[ I_V ]_{ \beta' }^{ \beta } [ v ]_{ \beta' } = [ I_V(v) ]_{ \beta } = [ v ]_{ \beta }$

$\therefore Q [ v ]_{ \beta' } = [ v ]_{ \beta }$
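
Example

For concreteness, here is a small illustrative example (the bases are chosen arbitrarily, just for this example). Let $\beta$ be the standard ordered basis of $\R^2$ and $\beta' = \set{(1,1), (1,-1)}$. Writing the vectors of $\beta'$ in $\beta$-coordinates gives

$$Q = [ I_V ]_{ \beta' }^{ \beta } = \Mww{1}{1}{1}{-1}.$$

For $v = (3,1)$ we have $v = 2(1,1) + 1(1,-1)$, so $[ v ]_{ \beta' } = \Vcw{2}{1}$, and indeed

$$Q [ v ]_{ \beta' } = \Mww{1}{1}{1}{-1} \Vcw{2}{1} = \Vcw{3}{1} = [ v ]_{ \beta }.$$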

Theorem 2.25

Let $T: V \to V$ be linear, where $\dim V$ is finite, and let $\beta$ and $\beta'$ be ordered bases of $V$. Then

$$[ T ]_{ \beta' } = \inv{ Q } [ T ]_{ \beta } Q$$

where $Q $ is the change of coordinate matrix from $\beta'$ to $\beta $.

Proof

$I_V T = T = T I_V$

$[ I_V T ]_{ \beta' }^{ \beta } = [ T I_V ]_{ \beta' }^{ \beta }$

$[ I_V ]_{ \beta' }^{ \beta } [ T ]_{ \beta' } = [ T ]_{ \beta } [ I_V ]_{ \beta' }^{ \beta }$ (by the formula for the matrix representation of a composition), i.e. $Q [ T ]_{ \beta' } = [ T ]_{ \beta } Q$

$\implies [ T ]_{ \beta' } = \inv{ Q } [ T ]_{ \beta } Q$
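
Example

To see Theorem 2.25 in a concrete case (reusing the illustrative bases from the example above), let $\beta$ be the standard ordered basis of $\R^2$, $\beta' = \set{(1,1), (1,-1)}$, and $Q = \Mww{1}{1}{1}{-1}$. Take $T(a, b) = (b, a)$, so $[ T ]_{ \beta } = \Mww{0}{1}{1}{0}$. Then

$$[ T ]_{ \beta' } = \inv{ Q } [ T ]_{ \beta } Q = \Mww{1/2}{1/2}{1/2}{-1/2} \Mww{0}{1}{1}{0} \Mww{1}{1}{1}{-1} = \Mww{1}{0}{0}{-1},$$

which matches the direct computation: $T(1,1) = (1,1)$ and $T(1,-1) = -(1,-1)$, so the columns of $[ T ]_{ \beta' }$ are $\Vcw{1}{0}$ and $\Vcw{0}{-1}$.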

Corollary

Let $A \in M_{n \times n}(F)$, and let $\gamma$ be an ordered basis for $F^n$. Then $[ L_A ]_{ \gamma } = \inv{ Q } A Q$, where $Q$ is the $n \times n$ matrix whose $j$th column is the $j$th vector of $\gamma$.
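
Example

As an illustration of the corollary (with $A$ and $\gamma$ chosen arbitrarily), take $A = \Mww{1}{2}{0}{3}$ and $\gamma = \set{(1,0), (1,1)}$, so $Q = \Mww{1}{1}{0}{1}$. Then

$$[ L_A ]_{ \gamma } = \inv{ Q } A Q = \Mww{1}{-1}{0}{1} \Mww{1}{2}{0}{3} \Mww{1}{1}{0}{1} = \Mww{1}{0}{0}{3},$$

which agrees with computing directly: $L_A(1,0) = (1,0)$ and $L_A(1,1) = (3,3) = 3(1,1)$.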

Definition: Similar Matrices

Let $A$ and $B$ be two $n \times n$ matrices. Then $A$ and $B$ are similar if there exists an invertible matrix $Q$ s.t. $B = \inv{ Q } A Q$.

Remarks

The matrices of a linear operator $T$ with respect to two different ordered bases are similar to each other, by Theorem 2.25.
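
For instance, in the example following Theorem 2.25, $[ T ]_{ \beta } = \Mww{0}{1}{1}{0}$ and $[ T ]_{ \beta' } = \Mww{1}{0}{0}{-1}$ are similar, with the change of coordinate matrix $Q = \Mww{1}{1}{1}{-1}$ playing the role of the invertible matrix in the definition.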

Theorem

A matrix $A$ is invertible if and only if it is a change of coordinate matrix.

Proof

$\implies$

If $A$ is invertible, then so is $L_A$, so $L_A$ is onto and the columns of $A$, which span the range of $L_A$, span $F^n$; being $n$ in number, they are linearly independent and form an ordered basis $\beta$ of $F^n$. Then $A$ is the matrix that changes $\beta$-coordinates into standard coordinates in $F^n$, i.e. $A$ is a change of coordinate matrix.

$\impliedby $

If $A$ is a change of coordinate matrix, then $A$ is invertible by Theorem 2.22; its inverse $\inv{ A }$ is the matrix that changes the coordinates back again.
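
For example, $A = \Mww{2}{1}{1}{1}$ is invertible, and it is the change of coordinate matrix from $\beta = \set{(2,1), (1,1)}$ (the ordered basis formed by its columns) to the standard ordered basis of $\R^2$; its inverse $\inv{ A } = \Mww{1}{-1}{-1}{2}$ changes standard coordinates back into $\beta$-coordinates.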