Mathematical Analysis

Lecture 21


11.1. Vector spaces

Definition 11.1.1. Let $X$ be a set together with the operations of addition, $+\, \colon X \times X \to X,$ and multiplication, $\cdot \,\colon \R \times X \to X,$ (we usually write $ax$ instead of $a \cdot x$). $X$ is called a vector space (or a real vector space) if the following conditions are satisfied:

  1. If $u, v, w \in X,$ then $u+(v+w) = (u+v)+w.$
  2. If $u, v \in X,$ then $u+v = v+u.$
  3. There is a $0 \in X$ such that $v+0=v$ for all $v \in X.$
  4. For every $v \in X,$ there is a $-v \in X,$ such that $v+(-v)=0.$
  5. If $a \in \R,$ $u,v \in X,$ then $a(u+v) = au+av.$
  6. If $a,b \in \R,$ $v \in X,$ then $(a+b)v = av+bv.$
  7. If $a,b \in \R,$ $v \in X,$ then $(ab)v = a(bv).$
  8. $1v = v$ for all $v \in X.$


Example 11.1.1.

An example of a vector space is $\R^n.$ Addition and multiplication by a scalar are done componentwise.

If $v = (v_1,v_2,\ldots,v_n),$ $w = (w_1,w_2,\ldots,w_n) \in \R^n,$ and $a \in \R$, then

$\ds v+w \coloneqq (v_1,v_2,\ldots,v_n) + (w_1,w_2,\ldots,w_n) = (v_1+w_1,v_2+w_2,\ldots,v_n+w_n),$

$\ds a v \coloneqq a (v_1,v_2,\ldots,v_n) = (a v_1, a v_2,\ldots, a v_n).$
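The componentwise operations on $\R^n$ above can be sketched in a few lines of code. This is a minimal pure-Python illustration, not part of the lecture; vectors are modelled as tuples.

```python
# Sketch (illustrative, not from the lecture): the R^n operations of
# Example 11.1.1, with vectors modelled as Python tuples.

def vec_add(v, w):
    """Componentwise addition: (v1+w1, ..., vn+wn)."""
    assert len(v) == len(w)
    return tuple(vi + wi for vi, wi in zip(v, w))

def vec_scale(a, v):
    """Scalar multiplication: (a*v1, ..., a*vn)."""
    return tuple(a * vi for vi in v)

v = (1.0, 2.0, 3.0)
w = (4.0, 5.0, 6.0)
assert vec_add(v, w) == (5.0, 7.0, 9.0)
assert vec_scale(2.0, v) == (2.0, 4.0, 6.0)
```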



Example 11.1.2.

The set $X \coloneqq \{ 0 \}$ is a vector space. 🤯

The operations are defined as:

$0 + 0 \coloneqq 0\;$ and $\;a0 \coloneqq 0\;$ ($a\in \R$).

$X$ is the smallest possible vector space.





Example 11.1.3.

The space $C([0,1],\R)$ of continuous functions on the interval $[0,1]$ is a vector space. For two functions $f,g\in C([0,1],\R)$ and $a \in \R,$ we make the obvious definitions of $f+g$ and $af$:

$(f+g)(x) \coloneqq f(x) + g(x), \qquad (af)(x) \coloneqq a\bigl(f(x)\bigr).$

The zero vector is the function that is identically zero, that is, the function $f$ with $f(x)=0$ for all $x\in [0,1].$
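The operations on $C([0,1],\R)$ can be mirrored directly with callables. A minimal sketch (illustrative only, not from the lecture), modelling functions as Python lambdas:

```python
# Sketch: the operations on C([0,1], R) from Example 11.1.3, with
# functions modelled as Python callables. Names are illustrative.

def f_add(f, g):
    """(f+g)(x) := f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def f_scale(a, f):
    """(af)(x) := a * f(x)."""
    return lambda x: a * f(x)

zero = lambda x: 0.0  # the zero vector of the function space

f = lambda x: x * x
g = lambda x: 1.0 - x

assert f_add(f, g)(0.5) == f(0.5) + g(0.5)  # (f+g)(x) = f(x) + g(x)
assert f_add(f, zero)(0.3) == f(0.3)        # f + 0 = f
assert f_scale(2.0, f)(0.5) == 0.5          # 2 * (0.5)^2
```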



11.1. Vector spaces

Remark: If $X$ is a vector space, to check that a subset $S \subset X$ is a vector subspace, we only need to show:

  1. $0 \in S,$
  2. $S$ is closed under addition: adding two vectors in $S$ gives a vector in $S,$ and
  3. $S$ is closed under scalar multiplication: multiplying a vector in $S$ by a scalar gives a vector in $S.$
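The three conditions can be spot-checked numerically. The following pure-Python sketch (an illustration of the criterion, not a proof; the subset $S$ is an invented example) tests them for the line $S = \{ (t, 2t) : t \in \R \}$ in $\R^2$:

```python
# Sketch: spot-checking the three subspace conditions for the sample set
# S = { (t, 2t) : t in R } in R^2. A finite check cannot prove closure;
# it only illustrates the criterion on a few vectors.

def in_S(v):
    """Membership in S = {(t, 2t)}: second coordinate is twice the first."""
    return v[1] == 2 * v[0]

# 1. the zero vector is in S
assert in_S((0.0, 0.0))

# 2. closed under addition (sampled)
u, v = (1.0, 2.0), (3.0, 6.0)
assert in_S((u[0] + v[0], u[1] + v[1]))

# 3. closed under scalar multiplication (sampled)
a = -5.0
assert in_S((a * u[0], a * u[1]))
```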


Vector space representation













11.2 Linear combinations and dimension


The following concepts should be familiar to you from MATH1051.







11.2 Linear combinations and dimension

Definition 11.2.1. Suppose $X$ is a vector space, $x_1, x_2, \ldots, x_k \in X$ are vectors, and $a_1, a_2, \ldots, a_k \in \R$ are scalars. Then \begin{equation*} a_1 x_1 + a_2 x_2 + \cdots + a_k x_k \end{equation*} is called a linear combination of the vectors $x_1, x_2, \ldots, x_k$.

If $Y \subset X$ is a set, then the span of $Y,$ or in notation $\text{span}(Y),$ is the set of all linear combinations of all finite subsets of $Y.$ By convention, define $\text{span}(\emptyset) \coloneqq \{ 0 \}$.


11.2 Linear combinations and dimension

Definition 11.2.2. A set of vectors $\{ x_1, x_2, \ldots, x_k \} \subset X$ is linearly independent if the equation \begin{equation*} a_1 x_1 + a_2 x_2 + \cdots + a_k x_k = 0 \end{equation*} has only the trivial solution $a_1 = a_2 = \cdots = a_k = 0.$

A set that is not linearly independent is linearly dependent.

A linearly independent set of vectors $B$ such that $\text{span}(B) = X$ is called a basis of $X.$
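Linear independence of finitely many vectors in $\R^n$ can be tested by row reduction: the set is independent exactly when the $k \times n$ matrix whose rows are the vectors has rank $k$. A pure-Python sketch (illustrative only; no robust pivoting):

```python
# Sketch: testing linear independence of {x_1, ..., x_k} in R^n via
# Gaussian elimination. The set is independent iff the k x n matrix of
# the vectors has rank k.

def rank(rows, eps=1e-12):
    rows = [list(r) for r in rows]
    r = 0
    for c in range(len(rows[0])):
        # find a row at or below position r with a nonzero entry in column c
        piv = next((i for i in range(r, len(rows)) if abs(rows[i][c]) > eps), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][c]) > eps:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def independent(vectors):
    return rank(vectors) == len(vectors)

assert independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)])  # standard basis
assert not independent([(1, 2, 3), (2, 4, 6)])         # x_2 = 2 x_1
```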


11.2 Linear combinations and dimension

Remarks about dimension

If a vector space $X$ contains a linearly independent set of $d$ vectors, but no linearly independent set of $d+1$ vectors, then we say the dimension of $X$ is $d$, and we write $\dim \, X \coloneqq d.$

If for all $d \in \N$ the vector space $X$ contains a set of $d$ linearly independent vectors, we say $X$ is infinite-dimensional and write $\dim \, X \coloneqq \infty.$



11.2 Linear combinations and dimension

Remarks about dimension

$\dim \, X = d$ if and only if $X$ has a basis of $d$ vectors (and so every basis has $d$ vectors).


If $\dim \, X = d$ and a set $Y$ of $d$ vectors spans $X,$ then $Y$ is linearly independent.





11.2 Linear combinations and dimension

For $\R^n$ we define the standard basis of $\R^n$ as

$e_1 \coloneqq (1,0,0,\ldots,0) ,$

$e_2 \coloneqq (0,1,0,\ldots,0) , $

$\vdots $

$e_n \coloneqq (0,0,0,\ldots,1). $

The dimension of $\R^n$ is $n;$ that is, $\dim (\R^n)=n.$
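A quick sketch (illustrative, not from the lecture) building the standard basis and checking that a vector decomposes as $v = \sum_i v_i e_i$:

```python
# Sketch: the standard basis e_1, ..., e_n of R^n, and a check that any
# v recombines as the linear combination sum_i v_i * e_i.

def standard_basis(n):
    return [tuple(1.0 if i == j else 0.0 for j in range(n)) for i in range(n)]

n = 4
basis = standard_basis(n)
v = (3.0, -1.0, 0.0, 2.0)

# componentwise sum of v_i * e_i
recombined = tuple(sum(v[i] * basis[i][j] for i in range(n)) for j in range(n))
assert recombined == v
assert len(basis) == n  # the basis has n vectors, matching dim R^n = n
```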



11.3 Linear mappings

Definition 11.3.1. A mapping $A \colon X \to Y$ of vector spaces $X$ and $Y$ is linear (we also say $A$ is a linear transformation or a linear operator) if for all $a \in \R$ and all $x,y \in X,$ \begin{equation*} A(a x) = a A(x), \;\text{and} \; A(x+y) = A(x)+A(y) . \end{equation*}

We usually write $Ax$ instead of $A(x)$ if $A$ is linear. If $A$ is one-to-one and onto, then we say $A$ is invertible, and we denote the inverse by $A^{-1}.$ If $A \colon X \to X$ is linear, then we say $A$ is a linear operator on $X.$
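The prototypical linear map on $\R^n$ is multiplication by a matrix. A small sketch (the matrix is an arbitrary example, not from the lecture) spot-checking the two conditions of Definition 11.3.1:

```python
# Sketch: a map A : R^2 -> R^2 given by a matrix, with a spot check of
# A(ax) = aA(x) and A(x+y) = A(x) + A(y). The matrix M is an arbitrary
# illustrative choice.

M = ((2.0, 0.0),
     (1.0, 3.0))

def A(x):
    """Matrix-vector product: the prototypical linear map."""
    return tuple(sum(row[j] * x[j] for j in range(len(x))) for row in M)

a, x, y = 3.0, (1.0, 2.0), (-1.0, 4.0)

ax = tuple(a * xi for xi in x)
xy = tuple(xi + yi for xi, yi in zip(x, y))

assert A(ax) == tuple(a * t for t in A(x))                 # A(ax) = aAx
assert A(xy) == tuple(s + t for s, t in zip(A(x), A(y)))   # A(x+y) = Ax + Ay
```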



11.3 Linear mappings

The vector space $L(X,Y)$

We write $L(X,Y)$ for the set of all linear transformations from $X$ to $Y,$ and just $L(X)$ for the set of linear operators on $X.$

If $a \in \R$ and $A,B \in L(X,Y),$ define the operations $A+B$ and $aA$ by \begin{equation*} (A+B)(x) := Ax + Bx, \;\;\, (aA)(x) := aAx . \end{equation*}



11.3 Linear mappings

The vector space $L(X,Y)$

Furthermore, if $A \in L(Y,Z)$ and $B \in L(X,Y),$ define the operation $AB$ as the composition $A \circ B,$ that is, \begin{equation*} ABx := A(Bx) . \end{equation*}

Finally, denote by $I \in L(X)$ the identity: the linear operator such that $Ix = x$ for all $x.$

With these operations, $L(X,Y)$ is a vector space. 😃



11.3 Linear mappings

Theorem 11.3.1. If $A \in L(X,Y)$ is invertible, then $A^{-1}$ is linear.

Proof. Let $a \in \R$ and $y \in Y.$ Since $A$ is invertible, it is onto and one-to-one; that is:

  1. (Onto): there is an $x$ such that $y = Ax,$
  2. (1-1): $A^{-1}(Az) = z$ for all $z \in X.$




Thus

$A^{-1}(ay) = A^{-1}(aAx) = A^{-1}\bigl(A(ax)\bigr) = ax = aA^{-1}(y).$

Now let $y_1,y_2 \in Y,$ and $x_1, x_2 \in X$ such that $Ax_1 = y_1$ and $Ax_2 = y_2.$ Then

$A^{-1}(y_1+y_2) = A^{-1}(Ax_1+Ax_2) = A^{-1}\bigl(A(x_1+x_2)\bigr) = x_1+x_2 = A^{-1}(y_1) + A^{-1}(y_2). \;\bs$



11.3 Linear mappings


Theorem 11.3.2. If $X$ is a finite-dimensional vector space and $A \in L(X),$ then $A$ is one-to-one if and only if it is onto.
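Theorem 11.3.2 can be illustrated numerically for operators on $\R^2$ given by $2\times 2$ matrices: a singular matrix fails injectivity and surjectivity together, while a nonsingular one has both. A sketch (the matrices are arbitrary examples, not from the lecture):

```python
# Sketch illustrating Theorem 11.3.2 for operators on R^2 given by 2x2
# matrices. Singular matrix: fails both properties; nonsingular: has both.

def apply(M, x):
    return (M[0][0] * x[0] + M[0][1] * x[1],
            M[1][0] * x[0] + M[1][1] * x[1])

# Singular example: rows proportional, det = 0.
S = ((1.0, 2.0),
     (2.0, 4.0))
assert apply(S, (2.0, -1.0)) == (0.0, 0.0)  # nonzero kernel: not one-to-one
# The image of S is the line {(t, 2t)}, so e.g. (1, 0) is never hit: not onto.

# Nonsingular example: det = 1 != 0, so one-to-one AND onto.
N = ((1.0, 1.0),
     (0.0, 1.0))
N_inv = ((1.0, -1.0),
         (0.0, 1.0))
x = (3.0, 5.0)
assert apply(N_inv, apply(N, x)) == x       # invertible: both properties hold
```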







Proof of Theorem 11.3.2

Let $\{ x_1,x_2,\ldots,x_n \}$ be a basis for $X$ and $A\in L(X).$

$\nec$ First suppose $A$ is one-to-one. Let $c_1,c_2,\ldots,c_n$ be such that

$0 = \ds \sum_{k=1}^n c_k \, Ax_k = A\sum_{k=1}^n c_k \, x_k .$

Since $A$ is one-to-one, the only vector taken to $0$ is $0$ itself. Hence $0 = \ds \sum_{k=1}^n c_k \, x_k,$ and since the $x_k$ are linearly independent, $c_k = 0$ for all $k.$



Proof of Theorem 11.3.2

Let $\{ x_1,x_2,\ldots,x_n \}$ be a basis for $X$ and $A\in L(X).$

$\nec$ This means that $\{ Ax_1, Ax_2, \ldots, Ax_n \}$ is a linearly independent set. Since the dimension is $n$, we can deduce that $\{ Ax_1, Ax_2, \ldots, Ax_n \}$ spans $X.$ Thus any point $x \in X$ can be written as

$x = \ds \sum_{k=1}^n a_k \, Ax_k = A\sum_{k=1}^n a_k \, x_k ,$

so $A$ is onto.






Proof of Theorem 11.3.2

Let $\{ x_1,x_2,\ldots,x_n \}$ be a basis for $X$.

$\suf$ Now suppose $A$ is onto. Since every $x \in X$ equals $Az$ for some $z \in X,$ and $z$ is a linear combination of the basis vectors, every element of $X$ is in the span of $\{ Ax_1, Ax_2, \ldots, Ax_n \}.$ Suppose that for some $c_1,c_2,\ldots,c_n,$

$0 = \ds A\sum_{k=1}^n c_k \, x_k = \sum_{k=1}^n c_k \, Ax_k .$

As $\{ Ax_1, Ax_2, \ldots, Ax_n \}$ spans $X$ and has $n$ elements, the set is linearly independent, and hence $c_k = 0$ for all $k.$ In other words, if $Ax = 0,$ then $x=0.$ If $Ax = Ay,$ then $A(x-y) = 0$ and so $x=y.$ This means that $A$ is one-to-one. $\;\bs$



11.3 Linear mappings

The last fact to keep in mind about the vector space $L(X,Y)$


Theorem 11.3.3. If $X$ and $Y$ are finite-dimensional vector spaces, then $L(X,Y)$ is also finite-dimensional.






