$\newcommand{\br}{\\}$ $\newcommand{\R}{\mathbb{R}}$ $\newcommand{\Q}{\mathbb{Q}}$ $\newcommand{\Z}{\mathbb{Z}}$ $\newcommand{\N}{\mathbb{N}}$ $\newcommand{\C}{\mathbb{C}}$ $\newcommand{\P}{\mathbb{P}}$ $\newcommand{\F}{\mathbb{F}}$ $\newcommand{\L}{\mathcal{L}}$ $\newcommand{\spa}[1]{\text{span}(#1)}$ $\newcommand{\set}[1]{\{#1\}}$ $\newcommand{\emptyset}{\varnothing}$ $\newcommand{\union}{\cup}$ $\newcommand{\intercept}{\cap}$ $\newcommand{\abs}[1]{|#1|}$ $\newcommand{\inv}[1]{#1^{-1}}$ $\newcommand{\t}[1]{\text{#1}}$ $\newcommand{\head}{\text H}$ $\newcommand{\tail}{\text T}$ $\newcommand{\Vcw}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Vce}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Vcr}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Vct}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Vcy}[6]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \end{bmatrix}}$ $\newcommand{\Vcu}[7]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \br #7 \end{bmatrix}}$ $\newcommand{\vcw}[2]{\begin{matrix} #1 \br #2 \end{matrix}}$ $\newcommand{\vce}[3]{\begin{matrix} #1 \br #2 \br #3 \end{matrix}}$ $\newcommand{\vcr}[4]{\begin{matrix} #1 \br #2 \br #3 \br #4 \end{matrix}}$ $\newcommand{\vct}[5]{\begin{matrix} #1 \br #2 \br #3 \br #4 \br #5 \end{matrix}}$ $\newcommand{\vcy}[6]{\begin{matrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \end{matrix}}$ $\newcommand{\vcu}[7]{\begin{matrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \br #7 \end{matrix}}$ $\newcommand{\Mqw}[2]{\begin{bmatrix} #1 & #2 \end{bmatrix}}$ $\newcommand{\Mqe}[3]{\begin{bmatrix} #1 & #2 & #3 \end{bmatrix}}$ $\newcommand{\Mqr}[4]{\begin{bmatrix} #1 & #2 & #3 & #4 \end{bmatrix}}$ $\newcommand{\Mqt}[5]{\begin{bmatrix} #1 & #2 & #3 & #4 & #5 \end{bmatrix}}$ $\newcommand{\Mwq}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Meq}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Mrq}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Mtq}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Mww}[4]{\begin{bmatrix} #1 & #2 \br #3 & #4 \end{bmatrix}}$ $\newcommand{\Mwe}[6]{\begin{bmatrix} #1 & #2 & #3\br #4 & #5 & #6 \end{bmatrix}}$ $\newcommand{\Mew}[6]{\begin{bmatrix} #1 & #2 \br #3 & #4 \br #5 & #6 \end{bmatrix}}$ $\newcommand{\Mee}[9]{\begin{bmatrix} #1 & #2 & #3 \br #4 & #5 & #6 \br #7 & #8 & #9 \end{bmatrix}}$
Theorem 2.9

Let $T:V \to W$ and $U: W \to Z$ be linear transformations, where $V, W, Z$ are vector spaces over a field $F$. Then the composition $U \circ T = UT:V \to Z$ is also a linear transformation.

Proof

Let $x, y \in V$ and $a \in F$.

$\begin{align*} UT(ax + y) &= U(T(ax + y)) \br &=U(aT(x) + T(y)) \br &=aU(T(x)) + U(T(y)) \br &=a(UT)(x) + (UT)(y) \end{align*}$

$\therefore UT$ is linear.
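As a quick numerical sanity check (a sketch, not part of the proof): identifying linear maps between the spaces $\R^n$ with matrices, the composition $UT$ satisfies the linearity identity up to floating-point error. The matrices and the scalar below are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 3))   # T: R^3 -> R^4
U = rng.standard_normal((2, 4))   # U: R^4 -> R^2

UT = lambda v: U @ (T @ v)        # the composition U o T

x, y = rng.standard_normal(3), rng.standard_normal(3)
a = 2.5
# UT(ax + y) == a*UT(x) + UT(y), up to rounding
assert np.allclose(UT(a * x + y), a * UT(x) + UT(y))
```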

Theorem 2.10

Let $V$ be a vector space over a field $F$ and let $T, U_1, U_2 \in \L(V)$. Then the following hold:

  1. $T(U_1 + U_2) = TU_1 + TU_2$
  2. $(U_1 + U_2)T = U_1T + U_2T$
  3. $T(U_1U_2) = (TU_1)U_2$
  4. $TI = IT = T$ (where $I$ is the identity linear transformation)
  5. $a(U_1U_2) = (aU_1)U_2 = U_1(aU_2), \forall a \in F$
Proof

$\begin{align*} (T(U_1 + U_2))(x) &=T((U_1+U_2)(x)) \br &=T(U_1(x)+U_2(x)) \br &=T(U_1(x))+T(U_2(x)) \br &=(TU_1)(x)+(TU_2)(x), \quad \forall x \in V \end{align*}$

$\therefore T \circ (U_1 + U_2) = TU_1 + TU_2$. The remaining parts follow from similar computations.
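A minimal numpy sketch of parts 1 and 5, identifying elements of $\L(V)$ for $V = \R^3$ with $3 \times 3$ matrices (the random matrices are arbitrary stand-ins):

```python
import numpy as np

rng = np.random.default_rng(1)
T, U1, U2 = (rng.standard_normal((3, 3)) for _ in range(3))

# Part 1: T(U1 + U2) = TU1 + TU2 (composition = matrix product)
assert np.allclose(T @ (U1 + U2), T @ U1 + T @ U2)

# Part 5: a(U1 U2) = (aU1)U2 = U1(aU2)
a = -1.5
assert np.allclose(a * (U1 @ U2), (a * U1) @ U2)
assert np.allclose(a * (U1 @ U2), U1 @ (a * U2))
```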

Note

This is special to linear transformations and does not hold for general functions.

e.g. $f(x) = x^2, g_1(x) = x^3, g_2(x)=x^4$ do not satisfy this theorem: $f \circ (g_1 + g_2) \neq f \circ g_1 + f \circ g_2$, since $(x^3 + x^4)^2 \neq x^6 + x^8$.
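Checking the note's counterexample at the single point $x = 2$:

```python
f  = lambda x: x ** 2
g1 = lambda x: x ** 3
g2 = lambda x: x ** 4

x = 2.0
lhs = f(g1(x) + g2(x))      # (x^3 + x^4)^2 = 24^2 = 576
rhs = f(g1(x)) + f(g2(x))   # x^6 + x^8 = 64 + 256 = 320
assert lhs != rhs           # f(g1 + g2) != f.g1 + f.g2
```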

Definition: Matrix Multiplication

Let $A $ be an $m \times n $ matrix and $B $ be an $n \times p $ matrix. We define the product of $A $ and $B $, denoted $AB $, to be the $m \times p $ matrix s.t.

$$(AB)_{ij} = \sum_{ k=1 }^{ n } A_{ik}B_{kj}, \text{ for } 1 \leq i \leq m, 1 \leq j \leq p$$
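The definition translates directly into code. The following sketch (numpy is used only to compare against the built-in product) computes each entry $(AB)_{ij}$ as the sum in the definition:

```python
import numpy as np

def matmul(A, B):
    """(AB)_ij = sum over k of A_ik * B_kj, per the definition."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "inner dimensions must agree"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))
    return C

A = np.arange(6).reshape(2, 3)    # m x n = 2 x 3
B = np.arange(12).reshape(3, 4)   # n x p = 3 x 4
assert np.allclose(matmul(A, B), A @ B)   # result is 2 x 4
```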

Proof

Analysis

Let $T: V \to W$ and $U: W \to Z$ be linear transformations of vector spaces over $F$.

Let $\alpha = \set{v_1, v_2, …, v_{n}}, \beta = \set{w_1, w_2, …, w_{m}}, \gamma = \set{z_1, z_2, …, z_{p}}$ be ordered bases of $V, W, Z$ respectively.

Let $A = [U]^\gamma_\beta$ and $B=[T]^\beta_\alpha$, and write $A = (a_{ij})_{p \times m}$, $B = (b_{ij})_{m \times n}$.

We define the product of $A$ and $B$ denoted by $AB$ as the new matrix obtained from the matrix representation of $UT$ with respect to ordered basis $\alpha$ and $\gamma$, i.e. $AB:= [UT]^\gamma_\alpha$.

We want to find the matrix representation of $UT$, i.e. the coefficients $c_{ij}$ in

$UT(v_j) = \sum_{i=1}^{p} c_{ij}z_i$

Proof

Let $A=[U]^\gamma_\beta$ and $B=[T]^\beta_\alpha$.

$A = (a_{ij})_{p\times m},$

$B=(b_{ij})_{m \times n}.$

We want to compute the matrix representation of $UT$.

$$\begin{align} U(w_j) &= \sum_{i=1}^{p}a_{ij}z_i & (1)\br T(v_j) &= \sum_{i=1}^{m} b_{ij} w_i & (2) \end{align}$$

$\begin{align*} (UT)(v_j)&=U(T(v_j)) \br &= U\Big(\sum_{i=1}^{m}b_{ij}w_i\Big) \br &= \sum_{i=1}^{m}b_{ij}U(w_i) \br &= \sum_{k=1}^{m}b_{kj}U(w_k) \quad \text{(relabel } i \text{ as } k\text{)} \br &= \sum_{k=1}^{m}b_{kj}\Big(\sum_{i=1}^{p} a_{ik} z_i\Big)\br &= \sum_{i=1}^{p} \Big(\sum_{k=1}^{m} a_{ik}b_{kj}\Big)z_i \end{align*}$

So the $(i,j)$ entry of $[UT]^\gamma_\alpha$ is $\sum_{k=1}^{m} a_{ik}b_{kj} = (AB)_{ij}$, which is exactly the definition of the product $AB$.

Theorem 2.11

$T:V \to W, U: W \to Z$ are linear transformations, and

$\alpha, \beta, \gamma$ are ordered bases of $V, W$ and $Z$ respectively. Then

$$[UT]^\gamma_\alpha = [U]^\gamma_\beta[T]^\beta_\alpha$$

Proof

This follows from the previous computation: the $(i,j)$ entry of $[UT]^\gamma_\alpha$ is $\sum_{k=1}^{m} a_{ik}b_{kj}$, which is the $(i,j)$ entry of $[U]^\gamma_\beta[T]^\beta_\alpha$.
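A concrete check (an illustration, not from the text): take $T(p) = x\,p(x)$ from $P_2(\R)$ to $P_3(\R)$ and $U = d/dx$ from $P_3(\R)$ to $P_2(\R)$, with the monomial bases. Then $UT(p) = (xp)' = p + xp'$, and the matrix identity can be verified entry by entry:

```python
import numpy as np

# B = [T]^beta_alpha for T(p) = x*p(x): T(1)=x, T(x)=x^2, T(x^2)=x^3,
# coordinates taken in the basis {1, x, x^2, x^3} of P_3
B = np.array([[0, 0, 0],
              [1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]])

# A = [U]^gamma_beta for U = d/dx: U(1)=0, U(x)=1, U(x^2)=2x, U(x^3)=3x^2
A = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3]])

# UT(p) = (x*p)' = p + x*p': UT(1)=1, UT(x)=2x, UT(x^2)=3x^2
UT = np.diag([1, 2, 3])
assert np.array_equal(A @ B, UT)   # [UT]^gamma_alpha = [U]^gamma_beta [T]^beta_alpha
```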

Corollary

Let $V $ be a finite-dimensional vector space with an ordered basis $\beta $. Let $T, U \in \L(V)$, then

$$[UT]_\beta = [U]_\beta [T]_\beta$$

Theorem 2.14

Let $V, W $ be two finite-dimensional vector spaces and let $T: V \to W$ be linear. Let $\beta$ and $\gamma$ be ordered bases of $V $ and $W $, respectively. Then

$$[ T ]_{ \beta }^{ \gamma } [ v ]_{ \beta } = [ T(v) ]_{ \gamma }, \forall v \in V$$

Proof

Fix $v \in V$. Let $f: F \to V $ and $g: F \to W $ be the two linear transformations

defined by $f(a) = av, \forall a \in F$
and $g(a) = aT(v), \forall a \in F$ (viewing $F$ as a vector space over itself).

$g(a) = aT(v) = T(av) = T(f(a)) = (T \circ f)(a), \forall a \in F$

$\therefore g = T \circ f … (i) $

Let $\alpha = \set{ 1 }$ be the standard ordered basis of $F$.

$f(1) = v \implies [ f ]_{ \alpha }^{ \beta } = [ v ]_{ \beta } … (ii) $

$g(1) = T(v) \implies [ g ]_{ \alpha }^{ \gamma } = [ T(v) ]_{ \gamma } … (iii)$

$\underset{ \alpha }{ F } \overset{f}{\to} \underset{ \beta }{ V } \overset{T}{\to} \underset{ \gamma }{ W } $

By (i), $[ g ]_{ \alpha }^{ \gamma } = [ T \circ f ]_{ \alpha }^{ \gamma } $, and by Theorem 2.11,

$[ T \circ f ]_{ \alpha }^{ \gamma } = [ T ]_{ \beta }^{ \gamma } [ f ]_{ \alpha }^{ \beta } $

Substituting (ii) and (iii) gives

$[ T(v) ]_{ \gamma } = [ T ]_{ \beta }^{ \gamma } [ v ]_{ \beta } $
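For instance (an illustrative example, not from the text), take $T = d/dx : P_2(\R) \to P_1(\R)$ with the monomial bases $\beta = \set{1, x, x^2}$ and $\gamma = \set{1, x}$:

```python
import numpy as np

# [T]^gamma_beta for T = d/dx: T(1)=0, T(x)=1, T(x^2)=2x
T = np.array([[0, 1, 0],
              [0, 0, 2]])

v = np.array([3, 5, 7])            # v = 3 + 5x + 7x^2, i.e. [v]_beta
# T(v) = 5 + 14x, so [T(v)]_gamma = (5, 14)
assert np.array_equal(T @ v, np.array([5, 14]))
```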

Definition: Left-Multiplication Transformation

Let $A $ be an $m \times n $ matrix with entries from a field $F $. We denote by $L_A $ the mapping $L_A: F^n \to F^m $ defined by $L_A(x) = Ax $(the matrix product of $A $ and $x $) for each column vector $x \in F^n $. We call $L_A $ a left-multiplication transformation.
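In code, $L_A$ is simply the map $x \mapsto Ax$; a minimal numpy sketch:

```python
import numpy as np

def L(A):
    """The left-multiplication transformation L_A : F^n -> F^m."""
    return lambda x: A @ x

A = np.array([[1, 2, 3],
              [4, 5, 6]])          # 2 x 3, so L_A : R^3 -> R^2
L_A = L(A)
print(L_A(np.array([1, 0, 0])))   # first column of A: [1 4]
```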

Theorem 2.15

Let $A $ be an $m \times n $ matrix with entries from $F $. Then the left-multiplication transformation $L_A: F^n \to F^m $ is linear. Furthermore, if $B $ is any other $m \times n $ matrix (with entries from $F $), and $\beta$ and $\gamma $ are the standard ordered bases for $F^n $ and $F^m $, respectively, then we have the following properties.

  1. $[L_A]^ \gamma _ \beta = A $
  2. $L_A = L_B $ if and only if $A = B $
  3. $L_{A+B} = L_A + L_B $ and $L_{aA} = a L_A $ for all $a \in F$.
  4. If $T : F^n \to F^m $ is linear, then there exists a unique $m \times n $ matrix $C $ s.t. $T = L_C $. In fact, $C = [ T ]_{ \beta}^{ \gamma } $
  5. If $E $ is an $n \times p $ matrix, then $L_{AE} = L_A L_E $
  6. If $m = n $, then $L_{I_n} = I_{F^n} $
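A numerical spot check of properties 3 and 5 with arbitrary random matrices (a sketch; floating-point comparisons use a tolerance):

```python
import numpy as np

rng = np.random.default_rng(2)
A, B = rng.standard_normal((2, 3)), rng.standard_normal((2, 3))
E = rng.standard_normal((3, 4))
a = 3.0

L = lambda M: (lambda v: M @ v)

# Property 3: L_{A+B} = L_A + L_B and L_{aA} = a*L_A
x = rng.standard_normal(3)
assert np.allclose(L(A + B)(x), L(A)(x) + L(B)(x))
assert np.allclose(L(a * A)(x), a * L(A)(x))

# Property 5: L_{AE} = L_A composed with L_E (the argument lives in F^4)
y = rng.standard_normal(4)
assert np.allclose(L(A @ E)(y), L(A)(L(E)(y)))
```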
Theorem 2.16: Associativity of Matrix Multiplication

Let $A, B $ and $C $ be matrices s.t. $A(BC)$ is defined. Then $(AB)C $ is also defined and $A(BC) = (AB)C $; that is, matrix multiplication is associative.
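A quick numerical check with shapes chosen so both products are defined ($2 \times 3$, $3 \times 4$, $4 \times 5$; the random entries are stand-ins):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 5))

# (AB)C and A(BC) are both 2 x 5 and agree up to rounding
assert np.allclose(A @ (B @ C), (A @ B) @ C)
```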