Basis, Dimensions of Vector Spaces.

$\newcommand{\br}{\\}$ $\newcommand{\R}{\mathbb{R}}$ $\newcommand{\Q}{\mathbb{Q}}$ $\newcommand{\Z}{\mathbb{Z}}$ $\newcommand{\N}{\mathbb{N}}$ $\newcommand{\C}{\mathbb{C}}$ $\newcommand{\P}{\mathbb{P}}$ $\newcommand{\F}{\mathbb{F}}$ $\newcommand{\L}{\mathcal{L}}$ $\newcommand{\spa}[1]{\text{span}(#1)}$ $\newcommand{\set}[1]{\{#1\}}$ $\newcommand{\emptyset}{\varnothing}$ $\newcommand{\otherwise}{\text{ otherwise }}$ $\newcommand{\if}{\text{ if }}$ $\newcommand{\union}{\cup}$ $\newcommand{\intercept}{\cap}$ $\newcommand{\abs}[1]{| #1 |}$ $\newcommand{\pare}[1]{\left(#1\right)}$ $\newcommand{\t}[1]{\text{ #1 }}$ $\newcommand{\head}{\text H}$ $\newcommand{\tail}{\text T}$ $\newcommand{\inv}[1]{{#1}^{-1}}$ $\newcommand{\nullity}[1]{\text{nullity}(#1)}$ $\newcommand{\rank}[1]{\text{rank }#1}$ $\newcommand{\oto}{\text{ one-to-one }}$ $\newcommand{\ot}{\text{ onto }}$ $\newcommand{\Vcw}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Vce}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Vcr}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Vct}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Vcy}[6]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \end{bmatrix}}$ $\newcommand{\Vcu}[7]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \br #6 \br #7 \end{bmatrix}}$ $\newcommand{\Mqw}[2]{\begin{bmatrix} #1 & #2 \end{bmatrix}}$ $\newcommand{\Mqe}[3]{\begin{bmatrix} #1 & #2 & #3 \end{bmatrix}}$ $\newcommand{\Mqr}[4]{\begin{bmatrix} #1 & #2 & #3 & #4 \end{bmatrix}}$ $\newcommand{\Mqt}[5]{\begin{bmatrix} #1 & #2 & #3 & #4 & #5 \end{bmatrix}}$ $\newcommand{\Mwq}[2]{\begin{bmatrix} #1 \br #2 \end{bmatrix}}$ $\newcommand{\Meq}[3]{\begin{bmatrix} #1 \br #2 \br #3 \end{bmatrix}}$ $\newcommand{\Mrq}[4]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \end{bmatrix}}$ $\newcommand{\Mtq}[5]{\begin{bmatrix} #1 \br #2 \br #3 \br #4 \br #5 \end{bmatrix}}$ $\newcommand{\Mww}[4]{\begin{bmatrix} #1 & #2 \br #3 & #4 \end{bmatrix}}$ $\newcommand{\Mwe}[6]{\begin{bmatrix} #1 & #2 & #3\br #4 & #5 & #6 \end{bmatrix}}$ $\newcommand{\Mew}[6]{\begin{bmatrix} #1 & #2 \br #3 & #4 \br #5 & #6 \end{bmatrix}}$ $\newcommand{\Mee}[9]{\begin{bmatrix} #1 & #2 & #3 \br #4 & #5 & #6 \br #7 & #8 & #9 \end{bmatrix}}$
Definition: Basis

Let $V/F$. A subset $\beta\subseteq V$ is called a basis of $V$ if

  1. $\beta$ generates $V$, i.e. $\spa{\beta}=V$.
  2. $\beta$ is linearly independent.
Example

Prove that $\beta=\set{(1,0), (0,1)}$ is a basis of $\R^2$.

Proof
  1. need to prove $\beta$ generates $\R^2$.
    $\forall (a,b) \in \R^2,$
    $(a, b)=a(1,0)+b(0,1)$, i.e. $\spa{\beta}=\R^2$.

  2. need to prove $\beta$ is linearly independent.
    $x(1,0) + y(0,1) = (0,0)$
    $\implies (x,y) = (0,0)$
    $\implies x = 0, y = 0$.

$\therefore$ $\beta$ is a basis of $\R^2$.
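
A quick numerical sanity check of this example, sketched with NumPy (an illustration only, playing no role in the proof): $\beta$ is a basis of $\R^2$ exactly when the matrix whose columns are the vectors of $\beta$ is invertible.

```python
# Sanity check (illustration, not a proof): beta is a basis of R^2 iff
# the matrix with the candidate vectors as columns is invertible,
# i.e. has rank 2 (equivalently, nonzero determinant).
import numpy as np

beta = np.array([[1, 0],
                 [0, 1]], dtype=float)   # columns: (1,0) and (0,1)

print(np.linalg.matrix_rank(beta))       # 2 -> linearly independent and spanning
print(np.linalg.det(beta))               # 1.0 != 0 -> invertible

# Spanning concretely: the coordinates of any (a, b) are the unique
# solution of beta @ [x, y] = [a, b].
a, b = 3.0, -2.0
print(np.linalg.solve(beta, np.array([a, b])))   # [ 3. -2.]
```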

Example

$e_1=(1,0,0,…)$, $e_2=(0,1,0, …)$, $e_3=(0,0,1, …)$, …,
$e_n=(0,0,…,0,1,0,…)$ ($1$ in the $n$th position).
Is $S= \set{e_1, e_2, e_3, …}$ a basis of $\R^\infty$?

Proof

No, it is not.

Let $e = (1, 1, 1, 1, …) \in \R^\infty$.

Every vector in $\spa{S}$ is a linear combination of finitely many $e_i$, so it has only finitely many non-zero entries. Hence $e \notin \spa{S}$, but $e \in \R^\infty$.

$\spa{S} \neq \R^\infty$, i.e. $S$ cannot generate $\R^\infty$.

Hence, $S$ is not a basis of $\R^\infty$.

Remarks

$\R^\infty$ does have a basis, although we don’t know what it is.
Every vector space has a basis (this relies on the axiom of choice / Zorn’s lemma).

Example: $\R^n/\R$

$e_1=(1,0,…,0)$, $e_2=(0,1,0,…,0)$, $e_3=(0,0,1,…,0)$, …,
$e_n=(0,0,…,0,1)$.
$S= \set{e_1, e_2, e_3, …, e_n}$ is a basis of $\R^n$ (the standard basis).

Proof
  1. need to prove that $S$ generates $\R^n$.
    $\forall (a_1, a_2, …, a_{n}) \in \R^n$,
    $(a_1, a_2, …, a_{n}) = a_1e_1 + a_2e_2 + … + a_{n} e_{n}$
    $\therefore \spa{S} = \R^n$, i.e. $S$ generates $\R^n$.

  2. need to prove $S$ is linearly independent.
    $a_1e_1 + a_2e_2 + … + a_{n} e_{n} = 0$
    $\implies (a_1, a_2, …, a_{n}) = 0$
    $\implies a_i = 0, i = 1, 2, …, n$. $\therefore S$ is linearly independent.

Example: $\P\_n(X)/\R$

$\beta = \set{1, x, x^2, …, x^n}$ is a basis of $\P_n(x)$ over $\R$.

Example: $\P(X)/\R$

Prove that $\beta = \set{1, x, x^2, …}$ is a basis of $\P(x) / \R$.

  1. need to prove that $\beta$ generates $\P(x)$.
    Every $p \in \P(x)$ has the form $p = a_0 + a_1x + a_2x^2 + … + a_{n} x^{n}$, a linear combination of finitely many elements of $\beta$. $\therefore \spa{\beta} = \P(x)$.

  2. need to prove that $\beta$ is linearly independent.
    To prove an infinite set is linearly independent, we only need to prove that every finite subset is linearly independent, i.e. that the only representation of $0$ is the trivial one (see the check below).
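
A small symbolic sketch of this finite-subset criterion using SymPy (the subset $\set{1, x, x^2}$ is an arbitrary choice for illustration): a polynomial is identically zero iff all of its coefficients vanish, so the only representation of $0$ is the trivial one.

```python
# Illustration for the finite subset {1, x, x^2}: if a0 + a1*x + a2*x^2
# is the zero polynomial, every coefficient must vanish.
import sympy as sp

x, a0, a1, a2 = sp.symbols('x a0 a1 a2')
p = a0 + a1*x + a2*x**2

coeffs = sp.Poly(p, x).all_coeffs()      # [a2, a1, a0]
print(sp.solve(coeffs, [a0, a1, a2]))    # {a0: 0, a1: 0, a2: 0}
```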

Remarks: the basis of $\set{0}/F$

The empty set $\emptyset$ is the basis of $V=\set{0}/F$.

Proof
  1. need to prove that $\emptyset$ generates $\set{0}$.
    Trivial. It is defined that $\spa{\emptyset} = \set{0}$.
  2. need to prove that $\emptyset$ is linearly independent.
    Trivial: vacuously, there is no non-trivial representation of $0$ by vectors of $\emptyset$.
Theorem 1.8: Uniqueness of Linear Combination With Bases

$\beta \subseteq V$,
$\beta$ is a basis of $V \iff $ $\forall v \in V, \exists v_1, v_2, …, v_n \in \beta$ and unique $a_1, a_2, …, a_n \in F$ s.t. $$v = \sum^n_{i=1}a_iv_i$$

Proof

$\implies$

$\spa{\beta} = V$, since $\beta$ is a basis of $V$.

Assume that for $v\in V$, $\exists v_1, v_2, …, v_n \in \beta$ and $a_1, a_2, …, a_n, b_1, b_2, …, b_n \in F$ s.t.

$$v = \sum_{i=1}^na_iv_i= \sum_{i=1}^nb_iv_i$$

then we have

$$\sum_{i=1}^n(a_i-b_i)v_i= 0$$

$\because \beta$ linearly independent $\therefore a_i-b_i = 0, \forall i = 1 … n$.

$\therefore \forall i, a_i=b_i$, the coefficients are unique.

$\impliedby$

  1. need to prove $\spa{\beta} = V$.
    Since $\forall v \in V, v \in \spa{\beta}, V\subseteq \spa{\beta}$.
    $\beta \subseteq V \implies \spa{\beta} \subseteq V$.
    $\therefore V = \spa{\beta}$.

  2. need to prove $\beta$ is linearly independent.
    Choose $v = 0$. Then $\forall v_1, v_2, …, v_k \in \beta$, there exist unique $a_1, a_2, …, a_k$ s.t.
    $$a_1v_1 + a_2v_2 + … + a_kv_k = 0$$
    Since all-zero coefficients also represent $0$, uniqueness forces every $a_i$ to be zero. Hence $\beta$ is linearly independent.

$\therefore$ by definition, $\beta$ is a basis of $V$.
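
A numerical illustration of the uniqueness statement (a NumPy sketch; the basis below is an arbitrary choice, not one from the notes): when the columns of $B$ form a basis, the coefficients of any $v$ are the unique solution of the linear system $Bx = v$.

```python
# Theorem 1.8 in R^3: if the columns of B form a basis, np.linalg.solve
# returns the *unique* coefficients expressing v in that basis.
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])   # columns: a (non-standard) basis of R^3

v = np.array([2.0, 3.0, 4.0])
coeffs = np.linalg.solve(B, v)    # unique, since B is invertible
print(coeffs)                     # [ 3. -1.  4.]
print(B @ coeffs)                 # recovers v
```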

Theorem 1.9: Deletion Theorem

$V/F$, $S$ is a finite generating subset of $V$. Then $\exists T \subseteq S$ s.t. $T$ is a basis of $V$.

Proof

case 1: If $S=\emptyset$ or $S= \set{0}$, then $\spa{S} = \set{0} = V.$

Then $T = \emptyset \subseteq S$ is a basis of $V = \set{0}$.


case 2: If $S \neq \emptyset$ and $S \neq \set{0}$, then $\exists v_1 \in S$ s.t. $v_1 \neq 0$, and $\set{v_1}$ is linearly independent.

Step 1: Is $\spa{\set{v_1}} = V$?

If yes, then stop and set $T=\set{v_1}$.

Otherwise, we claim that $S \setminus \spa{\set{v_1}} \neq \emptyset$.

Indeed, assume that $S\setminus \spa{\set{v_1}} = \emptyset$,

then $S \subseteq \spa{\set{v_1}}$,

then $\spa{S} \subseteq \spa{\set{v_1}}$,

then $V \subseteq \spa{\set{v_1}}$, contradicting $\spa{\set{v_1}} \neq V$.

So pick $v_2\in S \setminus \spa{\set{v_1}}$. Then $\set{v_1,v_2} = \set{v_1}\cup\set{v_2}$ is linearly independent by Theorem 1.7.

Step 2: Check if $\spa{\set{v_1, v_2}} = V$. If yes, then stop and set $T = \set{v_1, v_2}$; otherwise repeat the argument above.

Repeating this way, after finitely many steps (since $S$ is finite) we get a set $T=\set{v_1, v_2, …, v_k} \subseteq S$ s.t. $T$ is linearly independent and $\spa{T} = V$, i.e. $T$ is a basis of $V$.
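
The proof above is effectively an algorithm. Here is a minimal NumPy sketch of it for subspaces of $\R^n$, with rank computations standing in for exact linear algebra over $F$ (the name `extract_basis` and the tolerance are my own choices):

```python
# Deletion theorem as an algorithm: scan a finite spanning set S and
# keep a vector only if it increases the rank, i.e. lies outside the
# span of the vectors already kept. What remains is a linearly
# independent set with the same span, i.e. a basis of span(S).
import numpy as np

def extract_basis(S, tol=1e-10):
    kept = []
    for v in S:
        candidate = np.array(kept + [v])
        if np.linalg.matrix_rank(candidate, tol=tol) > len(kept):
            kept.append(v)
    return kept

S = [np.array([1.0, 0.0, 1.0]),
     np.array([2.0, 0.0, 2.0]),   # redundant: 2 * first vector
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 1.0])]   # redundant: first + third
print(extract_basis(S))           # keeps the 1st and 3rd vectors only
```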

Theorem 1.10: Replacement Theorem

Let $V/F$ and $\beta = \set{v_1, v_2, …, v_n}$ be a basis of $V$.

Let $v \in V$ be a non-zero vector with $v=c_1v_1 + c_2v_2 + … + c_jv_j + … + c_nv_n$ s.t. $c_j \neq 0$. Then $\set{v_1, v_2, …, v_{j-1}, v, v_{j+1}, …, v_n}$ is a basis of $V$, i.e. the $j$th vector of $\beta$ can be replaced by $v$.

Proof

Denote $\set{v_1, v_2, …, v_{j-1}, v, v_{j+1}, …, v_n}$ by $\beta'$.

Want to prove: $\spa{\beta'} = V$ and $\beta'$ is linearly independent.

1. $\spa{\beta'} = V$: let $u \in V$. Since $\beta = \set{v_1, v_2, …, v_j, …, v_n}$ is a basis of $V$, $\exists a_1, a_2, …, a_n \in F$ s.t. $u = \sum^n_{i=1}a_iv_i$.

We are given that $v = c_1v_1+ c_2v_2+ …+ c_jv_j + …+ c_nv_n$ and $c_j \neq 0$.

Solve for $v_j$: $c_jv_j = v - \sum_{i\neq j}c_iv_i$.

$\because c_j \neq 0$, $\frac1{c_j}$ exists, so

$$v_j = \frac{1}{c_j} v - \sum_{i\neq j}\frac{c_i}{c_j}v_i$$

Substituting into $u = \sum^n_{i=1}a_iv_i$,

$$u = \sum_{i\neq j}\pare{a_i-\frac{a_jc_i}{c_j}}v_i + \frac{a_j}{c_j} v$$

so $u \in \spa{\beta'}$, and $\spa{\beta'} = V$.

2. $\beta'$ is linearly independent: suppose

$$b_1v_1 +b_2v_2 + … + b_{j-1}v_{j-1} + bv + b_{j+1}v_{j+1} + … + b_nv_n = 0$$

Substituting $v = c_1v_1 + … + c_nv_n$,

$$(b_1 + bc_1)v_1 + (b_2 + bc_2)v_2 + … + (b_{j-1} + bc_{j-1})v_{j-1} + bc_jv_j + (b_{j+1} + bc_{j+1})v_{j+1} + …+ (b_n + bc_n) v_n = 0$$

Since $\beta = \set{v_1, v_2, …, v_j, …, v_n}$ is a basis, it is linearly independent, so every coefficient above is zero.

$\therefore bc_j = 0$ and $c_j \neq 0 \implies b = 0$; then $b_i + bc_i = b_i = 0$ for all $i \neq j$.

Hence $\beta'$ is linearly independent, and $\beta'$ is a basis of $V$.

Example: $\R^3$

$\beta= \set{e_1,e_2,e_3}$, standard basis.

$u = (-1, 0, \frac12) = -1e_1 + 0e_2+ \frac12 e_3$, where the coefficient of $e_1$ is $-1 \neq 0$.

$\therefore$ by the Replacement Theorem, $\beta' = \set{u, e_2, e_3}$ is a basis.
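
A quick numerical check of this example (illustration only):

```python
# u = (-1, 0, 1/2) has a nonzero coefficient on e_1, so replacing e_1
# by u should again give a basis of R^3.
import numpy as np

u  = np.array([-1.0, 0.0, 0.5])
e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

B = np.column_stack([u, e2, e3])
print(np.linalg.matrix_rank(B))   # 3 -> beta' is linearly independent, hence a basis
```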

Corollary 1: Of Theorem 1.10

$V/F$, $V$ is finite-dimensional.

Then any two bases of $V$ have exactly the same number of elements, namely $\dim V$.

Proof

Let $\beta = \set{v_1, v_2, …, v_m}$ and $\gamma = \set{u_1, u_2, …, u_n}$ be two bases of $V$.

Need to prove $n \leq m$ and $m \leq n$.

$u_1 \neq 0$, since $\gamma$ is linearly independent.

Write $u_1 = a_1v_1 + a_2v_2 + … + a_mv_m$. At least one of the $a_i$ is non-zero; without loss of generality we may assume that $a_1 \neq 0$. Then by the Replacement Theorem, $\delta_1 = \set{u_1, v_2, v_3, …, v_m}$ is a basis of $V$.

Write $u_2$ as a linear combination from $\delta_1$, i.e. $u_2 = b_1u_1 + b_2v_2 + b_3v_3 + … + b_mv_m$.

$u_2 \neq 0$.

Claim: at least one of $b_2, b_3, …, b_m$ is non-zero.

If not, assume they are all zero, i.e. $b_2=b_3=…= b_m =0$.

Then $u_2= b_1u_1$, so $\set{u_1, u_2}$ is linearly dependent. This is a contradiction, since $\gamma$ is linearly independent.

Assume W.L.O.G. that $b_2 \neq 0$.

Then by the Replacement Theorem applied to $\delta_1$, $\delta_2 = \set{u_1, u_2, v_3, v_4, …, v_m}$ is a basis.

Continue this way until $\delta_n = \set{u_1, u_2, u_3, …, u_n, v_{n+1}, v_{n+2}, …, v_m}$. The process cannot exhaust $\gamma$ before $\beta$: if $n > m$, then after $m$ steps $\delta_m = \set{u_1, …, u_m}$ is a basis, so $u_{m+1} \in \spa{\delta_m}$, contradicting the linear independence of $\gamma$. $\therefore n \leq m$. By symmetry (swapping the roles of $\beta$ and $\gamma$), $m \leq n$, so $m = n$.

Definition: Dimensions of Vector Spaces

$V/F$. If $V$ is generated by a finite subset, then $V$ is called a finite-dimensional vector space. In this case the number of vectors in a basis of $V$ is called the dimension of $V$ denoted by $\dim V$.

If $V$ is NOT a finite-dimensional vector space, or equivalently if there exists an infinite linearly independent subset $S$ of $V$, i.e. $\abs{S} = \infty$ and $S$ is linearly independent, then $V$ is called an infinite-dimensional vector space, and we write $\dim V = \infty$.

Example: $\P(x) / \R$

$\beta = \set{1, x, x^2, x^3, …}$

$\dim_\R \P(x) = \infty$

Example: $\R^\infty/\R$

$\dim_\R \R^\infty = \infty$

Example: $\R^n/\R$

$e_1 = (1, 0, …, 0)$

$e_2 = (0, 1, …, 0)$

$e_n = (0, 0, …, 1)$

$\beta = \set{e_1, e_2, …, e_n}, \dim_\R \R^n = n$.

Example: $M\_{m\times n}(\R)$

$\dim_\R M_{m\times n}(\R) = mn$; a basis is given by the $mn$ matrices $E_{ij}$ ($1$ in entry $(i,j)$, $0$ elsewhere).

Example: $W \subseteq M\_{2\times 2}(\R)$

$W = \set{A \in M_{2\times 2}(\R) \mid \text{tr}(A) = 0}$

$\dim_\R W = 3$; e.g. $\set{\Mww{1}{0}{0}{-1}, \Mww{0}{1}{0}{0}, \Mww{0}{0}{1}{0}}$ is a basis (see the check below).
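
A check of this count with NumPy (an illustration; the three matrices are the suggested basis above, flattened into $\R^4$):

```python
# dim W = 3: the trace is one linear constraint on the 4-dimensional
# space M_{2x2}(R), and these three trace-zero matrices are independent.
import numpy as np

basis_W = [np.array([[1.0, 0.0], [0.0, -1.0]]),   # E11 - E22
           np.array([[0.0, 1.0], [0.0,  0.0]]),   # E12
           np.array([[0.0, 0.0], [1.0,  0.0]])]   # E21

# Flatten each matrix to a vector in R^4 and check independence via rank.
M = np.array([A.flatten() for A in basis_W])
print(np.linalg.matrix_rank(M))          # 3
print([np.trace(A) for A in basis_W])    # [0.0, 0.0, 0.0] -> all lie in W
```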

Example: $\R/\R$

$\dim_\R \R = 1$

In fact,

$\dim_F F = 1$

Example: $\C/\R$

$\dim_\R \C = 2$

Example: $\C^2/\R$
  1. $V = \C^2, \F = \C, \dim_\C \C^2 = 2$
    $\beta_1 = \set{(1, 0), (0, 1)}/\C$
  2. $V = \C^2, \F = \R, \dim_\R \C^2 = 4$ (see the check below)
    $\beta_2 = \set{(1, 0), (i, 0), (0, 1), (0, -i)}/\R$
  3. $V = \C^n, \F = \R, \dim_\R \C^n = 2n$
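A numerical check of case 2 (a NumPy sketch; the helper `realify` is my own name for the standard identification of $\C^2$ with $\R^4$):

```python
# dim_R C^2 = 4: viewing C^2 as a real vector space, split each complex
# coordinate into (real, imaginary) parts, giving a real 4-vector. The
# four vectors of beta_2 then have rank 4 over R.
import numpy as np

def realify(z):  # C^2 -> R^4
    return np.array([z[0].real, z[0].imag, z[1].real, z[1].imag])

beta2 = [np.array([1, 0], dtype=complex),
         np.array([1j, 0], dtype=complex),
         np.array([0, 1], dtype=complex),
         np.array([0, -1j], dtype=complex)]

M = np.array([realify(z) for z in beta2])
print(np.linalg.matrix_rank(M))  # 4 -> a basis of C^2 over R
```
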
Remarks: $\R/\Q$

$\dim_\Q \R = \infty$ (will be useful in MATH 131A)

Corollary 2: Of Theorem 1.9 & 1.10

Let $V$ be an $n$-dimensional vector space over a field $F$,

i.e. $\dim V = n$

  1. (a) Any finite generating set of $V$ must contain at least $n$ elements.
    $\forall S \subseteq V, \spa{S} = V \implies \abs{S} \geq n.$
    (b) Any finite generating set of $V$ containing exactly $n$ elements is a basis.
    $\forall S \subseteq V, \spa{S} = V, \abs{S} = n \implies S$ is a basis of $V$.

  2. Any linearly independent subset of $V$ containing exactly $n$ elements is a basis.
    $\forall S \subseteq V, S$ is linearly independent, $\abs{S} = n \implies S$ is a basis of $V$.

  3. Every linearly independent subset of $V$ can be extended to a basis of $V$.

Proof

Let $S$ be a finite set and $\spa{S} = V$.

On the contrary, assume that $\abs{S} < n$.

Since $S$ is finite and $\spa{S} = V$,
by possibly removing some vectors from $S$, let $T$ be a subset of $S$ s.t. $T$ is a basis (Theorem 1.9).

$\dim V = \abs{T} \leq \abs{S} < n = \dim V$. Contradiction.

$\therefore \abs{S} \geq \dim V. $

Proof

$\abs{S} = n$ and $\spa{S} = V$. We want to prove that $S$ is a basis.

On the contrary, assume that $S$ is not a basis. Since $\spa{S} = V$, $S$ must be linearly dependent.

By Theorem 1.9, let $T \subsetneq S$ be a proper subset s.t. $T$ is a basis.

Then $\dim V = \abs{T} < \abs{S} = n = \dim{V}$. Contradiction.

$\therefore S $ must be a basis.

Proof

Let $S$ be linearly independent with $\abs{S} = n = \dim V$.

Let $\beta = \set{v_1, v_2, …, v_n}$ be a basis of $V$.

Let $S = \set{u_1, u_2, …, u_n}$. Since $S$ is linearly independent, $u_i \neq 0, \forall i = 1, 2, …, n$.

Then by Theorem 1.10, arguing as in Corollary 1, we can replace $v_1, v_2, v_3, …, v_n$ with $u_1, u_2, …, u_n$ one by one.

$\therefore \set{u_1, u_2, …, u_n} = S$ is a basis.

Proof

Let $S = \set{v_1, v_2, …, v_k}$ be a linearly independent set in $V$.

If $\spa{S} = V$, then $S$ is already a basis (definition of basis).

If not, there exists a vector $w_{k+1} \in V \setminus \spa{S}$ s.t.

$T_{k + 1} = S \cup \set{w_{k+1}}$ is linearly independent (Theorem 1.7).

Check: is $\spa{T_{k+1}} = V$?

If yes, then stop: $T_{k + 1}$ is a basis.

If not, continue in the same way.

Continuing this way, we will stop after finitely many steps with the basis $T_n = S \cup \set{w_{k+1}, w_{k+2}, w_{k+3}, … w_{n}}$, since $\dim V = n$ and a linearly independent set contains at most $n$ vectors.
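
This proof, too, is an algorithm. A minimal NumPy sketch for $V = \R^n$ (the name `extend_to_basis` and the choice of standard basis vectors as candidates are my own):

```python
# Extend a linearly independent set to a basis of R^n: append a standard
# basis vector e_i whenever it increases the rank, mirroring the proof's
# "add a vector outside the current span" step.
import numpy as np

def extend_to_basis(S, n, tol=1e-10):
    basis = list(S)
    for i in range(n):
        e = np.zeros(n); e[i] = 1.0
        if np.linalg.matrix_rank(np.array(basis + [e]), tol=tol) > len(basis):
            basis.append(e)
        if len(basis) == n:
            break
    return basis

S = [np.array([1.0, 1.0, 0.0])]   # linearly independent, doesn't span R^3
print(extend_to_basis(S, 3))      # S extended by two standard basis vectors
```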

Theorem

Let $V$ be a finite-dimensional vector space over a field $F$, and $W$ a subspace of $V$. Then $\dim W \leq \dim V$.

Moreover, if $\dim W = \dim V$, then $W = V$.

Proof


Let $\dim V = n \geq 0$.

If $W = \set{0}$, then $\dim W = 0 \leq n = \dim V$,
so assume that $W \neq \set{0}$.

Then $\exists w_1 \in W$ s.t. $w_1 \neq 0$.

Continuing this way, let $\set{w_1, w_2, …, w_k}$ be a linearly independent subset of $W$ s.t. for any other vector $w \in W$ with $w \notin \set{w_1, w_2, …, w_k}$, the set $\set{w_1, w_2, …, w_k}\cup \set{w}$ is linearly dependent. Then every $w \in W$ lies in $\spa{\set{w_1, w_2, …, w_k}}$, so $\set{w_1, w_2, …, w_k}$ is a basis of $W$ and $\dim W = k$.

But $w_1, w_2, …, w_k$ are also linearly independent in $V$, since $W \subseteq V$. A linearly independent subset of $V$ has at most $n$ elements (Corollary 2), so $k \leq n$. $\therefore \dim W = k \leq n = \dim V$.

Proof

Assume that $\dim W = \dim V = n$. Let $\beta = \set{w_1, w_2, …, w_n}$ be a basis of $W$. Then $\spa{\beta} = W$, and $w_1, w_2, …, w_n$ is a linearly independent set in $V$, since $W\subseteq V$. This implies that $w_1, w_2, …, w_n$ is a basis of $V$, since $\dim V = n$ (Corollary 2, part 2). $\therefore \spa{w_1, w_2, …, w_n}= V$, i.e. $\spa{\beta} = V$. We get $W=V$.