Calculus &
Linear Algebra II

Chapter 7

7 Vector spaces

By the end of this section, you should understand:

  • The definition of a vector space.
  • Many new examples of vector spaces.
  • The significance of a basis.
  • How to find the transition matrix from one basis to another.

Notation. For the next set of lectures on linear algebra, $\BF$ stands for $\R$ or $\C.$ Thus, if a statement applies to both number sets, we may simply state it for $\BF.$ Elements of $\BF$ are often called scalars.



7.1 Definition

Let $V$ be a nonempty set on which are defined operations "$+$" (called addition) and "$\cdot$" (called scalar multiplication).

$V$ is a vector space (over $\BF$) if the following hold for all $\bfu,\bfv,\bfw\in V$ and all $k,\ell\in\BF$:

(V1) $\bfu+\bfv\in V$ (closure)
(V2) $\bfu+\bfv=\bfv+\bfu$ (additive commutativity)
(V3) $\bfu+(\bfv+\bfw)=(\bfu+\bfv)+\bfw$ (additive associativity)
(V4) $\exists\,{\bf 0}\in V$ such that $\bfu+{\bf 0}=\bfu$ (zero vector, or additive identity)
(V5) For each $\bfu\in V$, $\exists\,(-\bfu)\in V$ such that $\bfu+(-\bfu)={\bf 0}$ (additive inverse)
(V6) $k\cdot\bfu\in V$ (closure)
(V7) $k\cdot(\bfu+\bfv)=k\cdot\bfu+k\cdot\bfv$ (multiplicative-additive distributivity)
(V8) $(k+\ell)\cdot\bfu=k\cdot\bfu+\ell\cdot\bfu$ (additive-multiplicative distributivity)
(V9) $k\cdot(\ell\cdot\bfu)=(k\ell)\cdot\bfu$ (multiplicative-multiplicative distributivity)
(V10) $1\cdot\bfu=\bfu$ (multiplicative identity)

The scalar multiplication symbol is often omitted. Elements of a vector space are usually called vectors.



7.2 Example: $\BF^n$ - set of $n$-tuples

1) Identify elements of the set: $$\BF^n = \big\{ \u = \left( u_1, u_2, \ldots , u_n\right)~|~ u_1, u_2, \ldots , u_n \in \BF \big\}$$

2) & 3) Check for closure of addition and scalar multiplication: $$\u+ \v = \left( u_1+v_1, u_2+v_2, \ldots , u_n+v_n\right)$$ $$\;\,k \cdot \u = \left( ku_1, ku_2, \ldots , ku_n\right)$$

4) Identify the zero vector: $\mathbf 0 = \left( 0, 0, \ldots , 0\right)$

5) Identify the additive inverse: $ - \mathbf u = \left( -u_1, -u_2, \ldots , -u_n\right)$




Now you can continue checking that the other properties hold. 📝

Remark: This is just one strategy you can use. If you prefer, you can instead verify each property one by one in the given order, from (V1) to (V10).
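To see the axioms in action, here is a small Python sanity check for $\R^3$ (the helper names `add` and `smul` are mine, not standard notation); it spot-checks a few axioms at particular vectors, which of course is no substitute for a proof:

```python
# Spot-check a few vector-space axioms in R^3, with n-tuples as Python tuples
# and real scalars as floats. Helper names are ad hoc.

def add(u, v):
    """Componentwise addition, as in Example 7.2."""
    return tuple(a + b for a, b in zip(u, v))

def smul(k, u):
    """Scalar multiplication k * u, componentwise."""
    return tuple(k * a for a in u)

u, v = (1.0, 2.0, 3.0), (4.0, -5.0, 0.5)
k, l = 2.0, -3.0

assert add(u, v) == add(v, u)                             # (V2) commutativity
assert smul(k, add(u, v)) == add(smul(k, u), smul(k, v))  # (V7)
assert smul(k + l, u) == add(smul(k, u), smul(l, u))      # (V8)
assert smul(1.0, u) == u                                  # (V10) identity
```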


7.3 Example: $M_{m,n}(\BF)$ - set of $m\times n$ matrices

1) $M_{m,n} = \left\{ \left( \begin{array}{ccc} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \\ \end{array} \right) ~\Bigg|~ a_{ij}\in \BF, 1\leq i\leq m, 1\leq j\leq n \right\}$

2) & 3) Usual addition and scalar multiplication for matrices.

4) $\mathbf 0 = \left( \begin{array}{ccc} 0 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & 0 \\ \end{array} \right)$

5) $- \mathbf u = \left( \begin{array}{ccc} -a_{11} & \cdots & -a_{1n} \\ \vdots & \ddots & \vdots \\ -a_{m1} & \cdots & -a_{mn} \\ \end{array} \right)$
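The same kind of numerical spot-check works for $M_{2,3}(\R)$ using NumPy arrays (a sketch with made-up matrices, assuming NumPy is available):

```python
import numpy as np

# Spot-check vector-space behaviour of 2x3 real matrices under the usual
# entrywise addition and scalar multiplication.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
B = np.array([[0.5, -1.0, 2.0],
              [0.0,  3.0, -2.0]])
k = 2.0

Z = np.zeros((2, 3))                               # zero vector of M_{2,3}(R)
assert (A + B).shape == (2, 3)                     # (V1) closure
assert np.array_equal(A + Z, A)                    # (V4) additive identity
assert np.array_equal(A + (-A), Z)                 # (V5) additive inverse
assert np.array_equal(k * (A + B), k * A + k * B)  # (V7) distributivity
```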


7.4 Example: $C[a,b]$ - set of continuous real-valued functions on $[a,b]$

1) $\mathbf f, \mathbf g\in C[a,b]$, often represented as $f(x), g(x).$

2) & 3) $\left(\,f+g\right)(x) = f(x) + g(x)$ and $(k \cdot f)(x) = kf(x)$

4) $\mathbf 0 = \,?$ 🤔

5) $ - \mathbf f = -f(x) $
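Functions really can be manipulated as vectors; the sketch below (Python, with ad-hoc helpers `f_add`, `f_smul`, `zero`) implements the pointwise operations and checks them at a few sample points:

```python
import math

# Functions on [a, b] as vectors: operations are defined pointwise.

def f_add(f, g):
    return lambda x: f(x) + g(x)

def f_smul(k, f):
    return lambda x: k * f(x)

zero = lambda x: 0.0   # a natural candidate for the zero vector

h = f_add(math.sin, math.cos)
for x in [0.0, 0.5, 1.0]:
    assert h(x) == math.sin(x) + math.cos(x)
    assert f_add(math.sin, zero)(x) == math.sin(x)       # f + 0 = f pointwise
    assert f_smul(2.0, math.sin)(x) == 2.0 * math.sin(x)
```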




7.5 Example: $P_n(\BF)$ - set of polynomials of degree at most $n$

1) $\mathbf p \in P_n(\BF)$, with $\mathbf p = a_0 + a_1x + \cdots + a_n x^n$ and $a_k\in \BF,$ $\forall k.$

2) & 3) Operations similar to Example 7.4.

4) $\mathbf 0 = \,?$ 🤔

5) $ - \mathbf p = -p(x) .$
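A polynomial in $P_n(\BF)$ can be stored as its coefficient list $(a_0,\ldots,a_n)$, which makes the operations coefficient-wise, just as in $\BF^{n+1}$. A Python sketch (helper names are mine):

```python
from itertools import zip_longest

# A polynomial a_0 + a_1 x + ... + a_n x^n as its coefficient list
# [a_0, a_1, ..., a_n]; trailing zero coefficients are left implicit.

def p_add(p, q):
    return [a + b for a, b in zip_longest(p, q, fillvalue=0.0)]

def p_smul(k, p):
    return [k * a for a in p]

p = [1.0, 0.0, 2.0]        # 1 + 2x^2
q = [0.0, 3.0, -2.0, 1.0]  # 3x - 2x^2 + x^3
assert p_add(p, q) == [1.0, 3.0, 0.0, 1.0]   # 1 + 3x + x^3
assert p_smul(2.0, p) == [2.0, 0.0, 4.0]     # 2 + 4x^2
```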





7.6 Example: Set of solutions to a homogeneous linear ODE

For example: $y''+p(x) y' + q(x) y = 0 $ ($y=y(x)$).

1) Let $y_1$ and $y_2$ be solutions.

2) Operations similar to Example 7.4.

3) Superposition principle: $y_1+y_2 $ is also a solution.

4) The zero vector is the zero function.

5) Given a solution $y$, $-y$ is also a solution.
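A numerical illustration (my own example, not from the notes, and not a proof): take the particular equation $y''+y=0$, whose solutions include $\sin$ and $\cos$, and check with a finite-difference approximation that their sum also leaves a small residual:

```python
import math

# For y'' + y = 0, check numerically that sin, cos and sin + cos all
# (approximately) satisfy the equation, illustrating superposition.
h = 1e-4

def second_deriv(y, x):
    # Central finite-difference approximation to y''(x).
    return (y(x + h) - 2.0 * y(x) + y(x - h)) / h**2

def residual(y, x):
    # |y''(x) + y(x)| should be near zero for a solution.
    return abs(second_deriv(y, x) + y(x))

y1, y2 = math.sin, math.cos
y_sum = lambda x: y1(x) + y2(x)

for x in [0.3, 1.0, 2.5]:
    assert residual(y1, x) < 1e-5
    assert residual(y2, x) < 1e-5
    assert residual(y_sum, x) < 1e-5
```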




Vector space representation

[figure]

7.7 Familiar concepts in linear algebra

Here we list several concepts with which you should already be familiar.

  • Linear combination For $v_1,v_2,\ldots,v_n\in V$ and $\alpha_1,\alpha_2,\ldots,\alpha_n\in \mathbb{F}$, we call $$ \alpha_1 v_1+\alpha_2 v_2+\cdots +\alpha_n v_n $$ a linear combination of the vectors $v_1,v_2,\ldots,v_n$.
  • Linear independence A non-empty set of vectors $S=\{v_1,v_2,\ldots,v_n\}$ in $V$ is said to be linearly dependent if there exist scalars $\alpha_1,\alpha_2,\ldots,\alpha_n$ not all zero such that $$ \alpha_1 v_1+\alpha_2v_2 + \cdots + \alpha_n v_n=0. $$ Otherwise, $S$ is called linearly independent, i.e. $S$ is linearly independent if $$ \alpha_1 v_1+\alpha_2v_2 + \cdots + \alpha_n v_n=0 \ \ \Rightarrow \ \ \alpha_1=\alpha_2=\cdots=\alpha_n=0. $$


  • Subspace A subset $W\subseteq V$ is called a subspace if $W$ is itself a vector space over $\BF$ with the same addition and scalar multiplication. In particular, $W$ must be closed under addition and scalar multiplication.
  • Span The span of a non-empty set of vectors $S=\{v_1,v_2,\ldots,v_n\}$ in $V$ is the set of all linear combinations of vectors in $S$, denoted span$(S)$. The set span$(S)$ is a subspace of $V$.
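These definitions can be tested numerically: stacking the vectors as columns of a matrix, linear independence is equivalent to full column rank, and membership in the span is a solvable linear system. A NumPy sketch (the example vectors are mine):

```python
import numpy as np

# Independence test in R^3: columns are independent iff the rank of the
# matrix they form equals the number of columns.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])          # v3 = v1 + v2: dependent

A = np.column_stack([v1, v2, v3])
assert np.linalg.matrix_rank(A) == 2            # rank 2 < 3: dependent
assert np.linalg.matrix_rank(A[:, :2]) == 2     # {v1, v2} independent

# Span membership: w is in span{v1, v2} iff the residual is (numerically) zero.
w = np.array([2.0, 3.0, 5.0])                   # equals 2*v1 + 3*v2
coeffs, *_ = np.linalg.lstsq(A[:, :2], w, rcond=None)
assert np.allclose(A[:, :2] @ coeffs, w)        # w lies in the span
```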





7.8 Basis

Let $\beta=\{\bfv_1,\ldots,\bfv_n\}$ be a set of vectors in the vector space $V$. $\,\beta$ is a basis for $V$ if

(B1) $\,\beta$ is linearly independent;
(B2) $\,\beta$ spans $V$.

Note that the notion of a basis is defined here only for finite sets. A nonzero vector space is finite-dimensional if it contains a finite set of vectors that forms a basis. If no such set exists, the vector space is infinite-dimensional.

Let $V$ be a finite-dimensional vector space. The number of vectors in any basis for $V$ is the same, and this number is known as the dimension of $V$.




An ordered basis for a vector space is a basis endowed with a specific order. For some vector spaces, there is a canonical ordered basis, called a standard basis.

For example, for $\R^3$ we have \[ \beta = \left\{ \left( \begin{array}{c} 1 \\ 0\\ 0 \\ \end{array} \right), \left( \begin{array}{c} 0 \\ 1\\ 0 \\ \end{array} \right), \left( \begin{array}{c} 0 \\ 0\\ 1 \\ \end{array} \right) \right\}. \]

For $P_3 \left( \R \right)$ we have \[ \beta = \left\{1, x, x^2, x^3\right\}. \]



7.9 Decomposition theorem

Let $\beta=\{\bfv_1,\ldots,\bfv_n\}$ be a set of vectors in the vector space $V$. Then, $\beta$ is a basis for $V$ iff each $\bfw\in V$ can be uniquely expressed as a linear combination of vectors in $\beta$.

Proof: Exercise 📝
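For $\R^n$ the theorem is easy to check computationally: with the basis vectors as the columns of an invertible matrix $B$, the unique coefficients of $\bfw$ are the solution of $B\mathbf a=\bfw$. A NumPy sketch with a made-up basis:

```python
import numpy as np

# Unique decomposition in R^3: solve B a = w, where the basis vectors
# are the columns of B (invertible precisely because they form a basis).
B = np.column_stack([
    np.array([1.0, 0.0, 0.0]),
    np.array([1.0, 1.0, 0.0]),
    np.array([1.0, 1.0, 1.0]),
])
w = np.array([6.0, 5.0, 3.0])

a = np.linalg.solve(B, w)                # the unique coordinate vector
assert np.allclose(a, [1.0, 2.0, 3.0])   # w = 1*b1 + 2*b2 + 3*b3
assert np.allclose(B @ a, w)
```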






7.10 Transition matrix

Let $\beta=\{\bfv_1,\ldots,\bfv_n\}$ be an ordered basis for the vector space $V$. For $\bfu\in V$, let $a_1,\ldots,a_n$ be (the unique) scalars such that $$ \bfu=\sum_{i=1}^na_i\bfv_i. $$

The coordinate vector of $\bfu$ relative to $\beta$ is given by $$ [\bfu]_\beta = \begin{pmatrix} a_1\\ \vdots\\ a_n \end{pmatrix}. $$ Another common notation for this is $[\bfu]^\beta$.



Let $\beta'$ be another ordered basis for $V$. The coordinate vector of $\bfu$ relative to $\beta'$ is thus denoted by $[\bfu]_{\beta'}$. The transition matrix from $\beta$ to $\beta'$, denoted by $P_{\beta\to\beta'}$, relates the two coordinate vectors of $\bfu$ as $$ [\bfu]_{\beta'}=P_{\beta\to\beta'}[\bfu]_\beta. $$ If $\beta''$ is yet another ordered basis for $V$, then $$ P_{\beta'\to\beta''}P_{\beta\to\beta'}=P_{\beta\to\beta''}\;\, \Ra \;\, P_{\beta'\to\beta}P_{\beta\to\beta'}=P_{\beta\to\beta}=I, $$ where $I$ is the $n\times n$ identity matrix.
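For $\R^n$, if the vectors of $\beta$ and $\beta'$ are placed as columns of matrices $B$ and $B'$, then $B[\bfu]_\beta=\bfu=B'[\bfu]_{\beta'}$, so $P_{\beta\to\beta'}=(B')^{-1}B$. The following NumPy sketch (bases chosen arbitrarily by me) verifies this and the composition rule above:

```python
import numpy as np

# Transition matrices in R^2: coordinates satisfy B [u]_beta = B' [u]_beta',
# hence P_{beta->beta'} = inv(B') B.
B  = np.array([[1.0, 1.0],
               [0.0, 1.0]])         # columns: the vectors of beta
Bp = np.array([[2.0, 0.0],
               [1.0, 1.0]])         # columns: the vectors of beta'

P      = np.linalg.inv(Bp) @ B      # P_{beta -> beta'}
P_back = np.linalg.inv(B) @ Bp      # P_{beta' -> beta}

u_beta = np.array([3.0, -1.0])      # some coordinate vector relative to beta
u = B @ u_beta                      # the underlying vector u itself
assert np.allclose(Bp @ (P @ u_beta), u)    # beta' coordinates reproduce u
assert np.allclose(P_back @ P, np.eye(2))   # P_{beta'->beta} P_{beta->beta'} = I
```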





To illustrate, let us consider the two ordered bases $\beta=\{1,x\}$ and $\beta'=\{1+x,2x\}$ for $P_1(\BF)$. Since the vector (or polynomial) $\bfu=a+bx$ can also be written as $$ \bfu=a(1+x)+\tfrac{1}{2}(b-a)(2x), $$ we have

$$ [\bfu]_\beta=\begin{pmatrix} a\\ b \end{pmatrix} \quad\text{and}\quad [\bfu]_{\beta'}= \begin{pmatrix} a\\ \frac{1}{2}(b-a) \end{pmatrix}. $$

The corresponding transition matrix $P_{\beta\to\beta'}$ is given as follows:




\[ \begin{pmatrix} a\\ \frac{1}{2}(b-a) \end{pmatrix} = \begin{pmatrix} 1 & 0\\ -\frac{1}{2} & \frac{1}{2} \end{pmatrix} \begin{pmatrix} a\\ b\end{pmatrix}, \] so \[ P_{\beta\ra \beta'}= \begin{pmatrix} 1 & 0\\ -\frac{1}{2} & \frac{1}{2} \end{pmatrix}. \]



In general we have that \[ P_{\beta\ra \beta'}= \left( \left[ \v_1\right]_{\beta'} ~|~ \left[ \v_2\right]_{\beta'}~|~\cdots ~|~ \left[ \v_n\right]_{\beta'}\right) \] where $\left[ \v_i\right]_{\beta'}$ is the coordinate vector of the $i$-th vector $\v_i$ of $\beta$ relative to the basis $\beta'$.

We can check this for our example:

We have that $\beta = \left\{\v_1, \v_2\right\}= \left\{1, x\right\}$ and $\beta' = \left\{\v_1', \v_2'\right\}= \left\{1+x, 2x\right\}.$




$\v_1 = 1 = a_1 \v_1' + a_2 \v_2' = a_1\left(1+x\right) + a_2\left(2x\right) = a_1 + \left(a_1+ 2a_2\right)x.$

Comparing coefficients we get $a_1=1$ and $a_1+2a_2 = 0$, that is, $a_2 = -\dfrac{1}{2}.$

Thus $\ds\left[\v_1\right]_{\beta'}=\begin{pmatrix} 1\\ -\frac{1}{2} \end{pmatrix}.$



Similarly, we have

$\v_2 = x = a_1 \v_1' + a_2 \v_2' = a_1+ \left(a_1+ 2a_2\right)x.$

Comparing coefficients we obtain $a_1=0$ and $a_2 = \dfrac{1}{2},$ $$ \Ra\, \ds\left[\v_2\right]_{\beta'}=\begin{pmatrix} 0\\ \frac{1}{2} \end{pmatrix}. $$




👉 $\;\ds\left[\v_1\right]_{\beta'}=\begin{pmatrix} 1\\ -\dfrac{1}{2} \end{pmatrix}\;\;$ and $\;\;\ds\left[\v_2\right]_{\beta'}=\begin{pmatrix} 0\\ \dfrac{1}{2} \end{pmatrix}$


👉 $\,\ds P_{\beta\ra \beta'} = \left( \left[ \v_1\right]_{\beta'} ~|~ \left[ \v_2\right]_{\beta'} \right) =\begin{pmatrix} 1 & 0 \\ -\dfrac{1}{2} & \dfrac{1}{2}\end{pmatrix}.$
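The whole worked example can be confirmed numerically by identifying each polynomial $a_0+a_1x\in P_1(\R)$ with its coefficient vector $(a_0,a_1)$:

```python
import numpy as np

# beta = {1, x} and beta' = {1+x, 2x}, with each basis polynomial stored
# as its coefficient vector (a_0, a_1) relative to the standard basis {1, x}.
B  = np.array([[1.0, 0.0],
               [0.0, 1.0]])      # columns: 1 and x
Bp = np.array([[1.0, 0.0],
               [1.0, 2.0]])      # columns: 1 + x and 2x

P = np.linalg.inv(Bp) @ B        # transition matrix from beta to beta'
assert np.allclose(P, [[1.0, 0.0], [-0.5, 0.5]])

a, b = 4.0, 6.0                  # u = 4 + 6x, so [u]_beta = (4, 6)
u_beta_prime = P @ np.array([a, b])
assert np.allclose(u_beta_prime, [4.0, 1.0])   # u = 4*(1+x) + 1*(2x)
```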



