$\newcommand{\br}{\\}$ $\newcommand{\R}{\mathbb{R}}$ $\newcommand{\Q}{\mathbb{Q}}$ $\newcommand{\Z}{\mathbb{Z}}$ $\newcommand{\N}{\mathbb{N}}$ $\newcommand{\C}{\mathbb{C}}$ $\newcommand{\P}{\mathbb{P}}$ $\newcommand{\F}{\mathbb{F}}$ $\newcommand{\L}{\mathcal{L}}$ $\newcommand{\spa}[1]{\text{span}(#1)}$ $\newcommand{\set}[1]{\{#1\}}$ $\newcommand{\emptyset}{\varnothing}$ $\newcommand{\otherwise}{\text{ otherwise }}$ $\newcommand{\if}{\text{ if }}$ $\newcommand{\union}{\cup}$ $\newcommand{\intercept}{\cap}$ $\newcommand{\abs}[1]{| #1 |}$ $\newcommand{\proj}{\text{proj}}$ $\newcommand{\norm}[1]{\left\lVert#1\right\rVert}$ $\newcommand{\pare}[1]{\left\(#1\right\)}$ $\newcommand{\t}[1]{\text{ #1 }}$ $\newcommand{\head}{\text H}$ $\newcommand{\tail}{\text T}$ $\newcommand{\d}{\text d}$ $\newcommand{\limu}[2]{\underset{#1 \to #2}\lim}$ $\newcommand{\inv}[1]{{#1}^{-1}}$ $\newcommand{\inner}[2]{\langle #1, #2 \rangle}$ $\newcommand{\nullity}[1]{\text{nullity}(#1)}$ $\newcommand{\rank}[1]{\text{rank }#1}$ $\newcommand{\tr}[1]{\text{tr}(#1)}$ $\newcommand{\oto}{\text{ one-to-one }}$ $\newcommand{\ot}{\text{ onto }}$ $\newcommand{\Re}[1]{\text{Re}(#1)}$ $\newcommand{\Im}[1]{\text{Im}(#1)}$ $\newcommand{\Vcw}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Vce}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Vcr}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Vct}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Vcy}[6]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \end{bmatrix}}$ $\newcommand{\Vcu}[7]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \br #7 \end{bmatrix}}$ $\newcommand{\vcw}[2]{\begin{matrix} #1 \br #2 \end{matrix}}$ $\newcommand{\vce}[3]{\begin{matrix} #1 \br #2 \br #3 \end{matrix}}$ $\newcommand{\vcr}[4]{\begin{matrix} #1 \br #2 \br #3 \br #4 \end{matrix}}$ $\newcommand{\vct}[5]{\begin{matrix} #1 \br #2 \br #3 \br #4 \br #5 \end{matrix}}$ 
$\newcommand{\vcy}[6]{\begin{matrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \end{matrix}}$ $\newcommand{\vcu}[7]{\begin{matrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \br #7 \end{matrix}}$ $\newcommand{\Mqw}[2]{\begin{bmatrix} #1 & #2 \end{bmatrix}}$ $\newcommand{\Mqe}[3]{\begin{bmatrix} #1 & #2 & #3 \end{bmatrix}}$ $\newcommand{\Mqr}[4]{\begin{bmatrix} #1 & #2 & #3 & #4 \end{bmatrix}}$ $\newcommand{\Mqt}[5]{\begin{bmatrix} #1 & #2 & #3 & #4 & #5 \end{bmatrix}}$ $\newcommand{\Mwq}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Meq}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Mrq}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Mtq}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Mqw}[2]{\begin{bmatrix} #1 & #2 \end{bmatrix}}$ $\newcommand{\Mwq}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Mww}[4]{\begin{bmatrix} #1 & #2 \br #3 & #4 \end{bmatrix}}$ $\newcommand{\Mqe}[3]{\begin{bmatrix} #1 & #2 & #3 \end{bmatrix}}$ $\newcommand{\Meq}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Mwe}[6]{\begin{bmatrix} #1 & #2 & #3\br #4 & #5 & #6 \end{bmatrix}}$ $\newcommand{\Mew}[6]{\begin{bmatrix} #1 & #2 \br #3 & #4 \br #5 & #6 \end{bmatrix}}$ $\newcommand{\Mee}[9]{\begin{bmatrix} #1 & #2 & #3 \br #4 & #5 & #6 \br #7 & #8 & #9 \end{bmatrix}}$
Definition: Orthonormal Basis

Let $V $ be an inner product vector space over a field $F = \R $ or $\C$.

Then a basis $\beta $ of $V $ is called an orthonormal basis of $V $ if $\beta $ is an ordered basis and $\beta $ is an orthonormal set.

Theorem 6.3

Let $V $ be an inner product vector space over a field $F = \R $ or $\C$.

Let $S = \set{ v_1, v_2, …, v_{ k }} $ be an orthogonal set of non-zero vectors. Let $v $ be a vector in $V $ s.t. $v \in \spa{ S } $. Then

$$v = \sum_{i = 1}^{k} \frac{ \inner{ v }{ v_i }}{ \norm{ v_i }^2 } v_i$$

Proof

$\because v \in \spa{ S }, \exists a_1, a_2, …, a_{ k } \in F $ s.t.

$\begin{align*} v &= a_1v_1 + a_2v_2 + … + a_{ k }v_{ k } \br &= \sum_{ i=1 }^{ k }a_iv_i \end{align*}$

To solve for $a_1 $, take the inner product $\inner{ \cdot }{ v_1 } $ of both sides:

$\inner{ v }{ v_1 } = \inner{ \sum_{ i=1 }^{ k }a_iv_i }{ v_1 }$

$\begin{align*} \inner{ v }{ v_1 } &= \sum_{ i=1 }^{ k } a_i \inner{ v_i }{ v_1 } \br &= a_1 \inner{ v_1 }{ v_1 } (\because S \text{ is orthogonal}) \br &= a_1 \norm{ v_1 }^2 \end{align*}$

$\implies a_1 = \frac{ \inner{ v }{ v_1 }}{ \norm{ v_1 }^2 } (\because v_1 \neq 0)$

Similarly, to solve for $a_i$, take inner product $\inner{ \cdot }{ v_i } $.

$$a_i = \frac{ \inner{ v }{ v_i }}{ \norm{ v_i }^2 }, \forall i = 1, 2, …, k $$

Now, $v = \sum_{ i=1 }^{ k } a_iv_i = \sum_{ i=1 }^{ k } (\frac{ \inner{ v }{ v_i }}{ \norm{ v_i }^2 })v_i $
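Theorem 6.3 can be checked numerically. The sketch below is our own illustration in pure Python over $\R^3$ with the standard dot product; the helper `inner` and the specific vectors are made up for the example, not taken from the text.

```python
# Check Theorem 6.3: for v in span(S) with S orthogonal,
# v = sum_i (<v, v_i> / ||v_i||^2) v_i.

def inner(x, y):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

# An orthogonal (not orthonormal) set in R^3.
v1 = [1.0, -1.0, 0.0]
v2 = [1.0, 1.0, 0.0]
S = [v1, v2]

# Build some v in span(S): v = 3*v1 + 2*v2.
v = [3 * a + 2 * b for a, b in zip(v1, v2)]

# Recover v from the coefficients <v, v_i> / ||v_i||^2.
recovered = [0.0, 0.0, 0.0]
for vi in S:
    c = inner(v, vi) / inner(vi, vi)  # ||v_i||^2 = <v_i, v_i>
    recovered = [r + c * a for r, a in zip(recovered, vi)]

print(recovered)  # [5.0, -1.0, 0.0], the same as v
```

The recovered coefficients are exactly $3$ and $2$, matching how $v$ was built, because the cross terms $\inner{v_i}{v_j}$ ($i \neq j$) vanish.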

Corollary

If $S = \set{ v_1, v_2, …, v_{ k }}$ is an orthonormal set, then for any $v \in \spa{ S }$,

$$v = \sum_{ i = 1 }^{ k } \inner{ v }{ v_i }v_i $$

Proof

Since $S = \set{ v_1, v_2, …, v_{ k }}$ is an orthonormal set, $\norm{ v_i } = 1, \forall i = 1, 2, …, k $.

$\therefore$ the formula follows from Theorem 6.3, with each $\norm{ v_i }^2 = 1 $.

Corollary

Let $V $ be an inner product vector space over a field $F = \R $ or $\C$, and let $S$ be an orthogonal set of non-zero vectors. Then $S $ is linearly independent.

Proof

Let $v_1, v_2, …, v_{ k } $ be arbitrary vectors in $S $. Consider the equation $a_1v_1 + a_2v_2 + … + a_{ k }v_{ k } = 0 $.

$$\begin{align*} \inner{ a_1v_1 + a_2v_2 + … + a_{ k }v_{ k }}{ v_1 } &= \inner{ 0 }{ v_1 } \br a_1 \inner{ v_1 }{ v_1 } + a_2 \inner{ v_2 }{ v_1 } + … + a_{ k } \inner{ v_{ k }}{ v_1 } &= 0 \br a_1 \norm{ v_1 }^2 &= 0 (\because S \text{ is orthogonal}) \end{align*}$$

$\because $ the vectors in $S $ are non-zero by hypothesis, $\norm{ v_1 } \neq 0 $

$\therefore a_1 = 0 $

Similarly, $a_2 = 0 , …, a_k = 0$

$\therefore S $ is linearly independent.

Theorem : Gram-Schmidt Orthogonalization Process

Let $V $ be an inner product vector space over a field $F = \R $ or $\C$, and $S = \set{ w_1, w_2, …, w_{ n }}$ be a linearly independent set.

Define a set $S' = \set{ v_1, v_2, …, v_{ n }} $ in the following way:

$$\begin{align*} v_1 &= w_1 \br v_k &= w_k - \sum_{ i=1 }^{ k-1 } \frac{ \inner{ w_k }{ v_i }}{ \norm{ v_i }^2 } v_i, \forall k = 2, 3, …, n \end{align*}$$

Then $S' $ is an orthogonal set of non-zero vectors and $\spa{ S' } = \spa{S} $.

Proof

For $n = 1, S = \set{ w_1 } , S' = \set{ v_1 } = \set{ w_1 }$

Then $S'$ is orthogonal and $\spa{ S } = \spa{ S' } $.

Assume that the statement is true for $n = k - 1 $, i.e. the set $S'_{k-1} = \set{ v_1, …, v_{ k-1 }}$ constructed by the algorithm is an orthogonal set of non-zero vectors with $\spa{ S'_{k-1}} = \spa{ S_{k-1}} $.

$S_k = \set{ w_1, w_2, …, w_{ k }} = S_{k-1} \cup \set{ w_k } $,

$S'_k = \set{ v_1, v_2, …, v_{ k }} = S'_{k-1} \cup \set{ v_k } $.

$v_k = w_k - \sum_{ i=1 }^{ k-1 } \frac{ \inner{ w_k }{ v_i }}{ \norm{ v_i }^2 } v_i $

If $v_k = 0 $, then $w_k = \sum_{ i=1 }^{ k-1 } \frac{ \inner{ w_k }{ v_i }}{ \norm{ v_i }^2 } v_i $

$\implies w_k \in \spa{ v_1, v_2, …, v_{ k-1 }} = \spa{ S'_{k-1}} = \spa{ S_{k-1}} $

i.e. $w_k \in \spa{ S_{k-1}} $

A contradiction, since $S_k = S_{k-1} \cup \set{ w_k }$ is linearly independent. Hence $v_k \neq 0 $, so $S'_k $ consists of non-zero vectors.

Next, for any $v_j \in S'_{k-1} = \set{ v_1, v_2, …, v_{ k-1 }} $,

$\inner{ v_k }{ v_j } = \inner{ w_k }{ v_j } - \sum_{ i=1 }^{ k-1 } \frac{ \inner{ w_k }{ v_i }}{ \norm{ v_i }^2 } \inner{ v_i }{ v_j } = \inner{ w_k }{ v_j } - \frac{ \inner{ w_k }{ v_j }}{ \norm{ v_j }^2 } \norm{ v_j }^2 = \inner{ w_k }{ v_j } - \inner{ w_k }{ v_j } = 0$

since $\inner{ v_i }{ v_j } = 0 $ for $i \neq j $ by the induction hypothesis. $\therefore S'_k $ is orthogonal.

We know

$v_1 = w_1 \in \spa{ S_k } $

$v_2 = w_2 - \frac{ \inner{ w_2 }{ v_1 }}{ \norm{ v_1 }^2 } v_1 \in \spa{ S_k } $

and in general each $v_i \in \spa{ S_k } $, so $\spa{ S'_k } \subseteq \spa{ S_k } $.

Since $S_k $ is linearly independent, $\dim \spa{ S_k } = k $.

Since $S'_k $ is an orthogonal set of non-zero vectors, it is linearly independent by the previous corollary, and $\abs{ S'_k } = k $

$\therefore \dim \spa{ S'_k } = k = \dim \spa{ S_k } $

$\implies \spa{S'_k} = \spa{ S_k } $
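The Gram-Schmidt process in the theorem can be sketched directly in pure Python over $\R^n$ with the standard dot product. The function name `gram_schmidt` and helper `inner` are our own for illustration.

```python
# A minimal sketch of the Gram-Schmidt orthogonalization process:
#   v_1 = w_1,  v_k = w_k - sum_{i<k} (<w_k, v_i> / ||v_i||^2) v_i.

def inner(x, y):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(ws):
    """Orthogonalize a linearly independent list of vectors."""
    vs = []
    for w in ws:
        v = list(w)
        for u in vs:
            c = inner(w, u) / inner(u, u)  # <w_k, v_i> / ||v_i||^2
            v = [a - c * b for a, b in zip(v, u)]
        vs.append(v)
    return vs

vs = gram_schmidt([[1, -1, 0], [1, 0, 1], [0, -1, 1]])
# All pairwise inner products of the output vectors are 0.
```

The input here is the basis from the example below; the output reproduces $v_2 = (\frac{1}{2}, \frac{1}{2}, 1)$ and $v_3 = (-\frac{2}{3}, -\frac{2}{3}, \frac{2}{3})$ up to floating-point rounding.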

Note: What Happens in an Induction if We Don't Check the Base Case?

Prove that $n^2 + 5n + 6 $ is an odd integer for all $n \geq 1 $.

Suppose we do not check the base case.

Assume for $n = k, k^2 + 5k + 6 $ is odd. Then

$$\begin{align*} (k+1)^2 + 5(k +1) + 6 &= k^2 + 2k + 1 + 5k + 5 + 6 \br &= (k^2 + 5k + 6) + 2k + 6 \br &= m + 2(k +3) \end{align*}$$

where $m = k^2 + 5k + 6 $.

Since $m $ is odd by assumption and $2(k +3) $ is even, $(k+1)^2 + 5(k +1) + 6 $ is therefore odd?

NO, because the base case is FALSE.
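In fact $n^2 + 5n + 6 $ is even for every $n $, since $n(n+5) $ is a product of two numbers of opposite parity. A quick check (our own illustration):

```python
# The "theorem" above is false: n^2 + 5n + 6 is even for every n >= 1,
# because n(n + 5) is always even. The inductive step went through,
# but the base case n = 1 fails.

for n in range(1, 100):
    assert (n * n + 5 * n + 6) % 2 == 0  # always even, never odd

print(1 * 1 + 5 * 1 + 6)  # n = 1 gives 12, which is even
```

The inductive step was valid reasoning from a false hypothesis; without a true base case, nothing is proved.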

Example

In $\R^3 $, $S = \set{(1, -1, 0), (1, 0, 1), (0, -1, 1)} $ is a basis. Starting from $S $, apply the Gram-Schmidt process to construct an orthonormal basis of $\R^3 $.

$w_1 = (1, -1, 0), w_2 = (1, 0, 1), w_3 = (0, -1, 1) $

$v_1 = w_1 = (1, -1, 0) $

$v_2 = w_2 - \frac{ \inner{ w_2 }{ v_1 }}{ \norm{ v_1 }^2 } v_1 = (\frac{ 1 }{ 2 }, \frac{ 1 }{ 2 }, 1)$

$v_3 = w_3 - \frac{ \inner{ w_3 }{ v_1 }}{ \norm{ v_1 }^2 } v_1 - \frac{ \inner{ w_3 }{ v_2 }}{ \norm{ v_2 }^2 } v_2 = (- \frac{ 2 }{ 3 }, - \frac{ 2 }{ 3 }, \frac{ 2 }{ 3 })$

$\beta = \set{ \frac{ v_1 }{ \norm{ v_1 }}, \frac{ v_2 }{ \norm{ v_2 }}, \frac{ v_3 }{ \norm{ v_3 }}} $, where $\norm{ v_1 } = \sqrt{ 2 }, \norm{ v_2 } = \sqrt{ \frac{ 3 }{ 2 }}, \norm{ v_3 } = \frac{ 2 }{ \sqrt{ 3 }} $.
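The worked example can be verified numerically: normalizing $v_1, v_2, v_3 $ should yield an orthonormal set. A pure-Python check (the helper `inner` is ours):

```python
# Verify the example: beta = { v_i / ||v_i|| } is orthonormal,
# i.e. <b_i, b_j> = 1 if i == j and 0 otherwise.
from math import sqrt, isclose

def inner(x, y):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

v1 = [1, -1, 0]
v2 = [1/2, 1/2, 1]
v3 = [-2/3, -2/3, 2/3]

# Normalize each vector by its norm sqrt(<v, v>).
beta = [[a / sqrt(inner(v, v)) for a in v] for v in [v1, v2, v3]]

for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert isclose(inner(beta[i], beta[j]), expected, abs_tol=1e-12)
```

`abs_tol` is used rather than the default relative tolerance because the expected off-diagonal value is exactly $0 $.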