Mathematical Analysis

Lecture 22

11.4 Norms

Definition 11.4.1. If $X$ is a vector space, then we say a function $\snorm{\pd} \colon X \to \R$ is a norm if

  1. $\snorm{x} \geq 0,$ with $\snorm{x}=0$ if and only if $x=0.$
  2. $\snorm{cx} = \sabs{c} \, \snorm{x}$ for all $c \in \R$ and $x \in X.$
  3. $\snorm{x+y} \leq \snorm{x}+\snorm{y}$ for all $x,y \in X.$

A vector space equipped with a norm is called a normed vector space.



11.4 Norms

Definition 11.4.2.
Let $x=(x_1,x_2,\ldots,x_n)$ and $y=(y_1,y_2,\ldots,y_n) \in \R^n$ be two vectors. The dot product is defined as

$ \ds x \pd y := \sum_{j=1}^n x_j\, y_j . $

Remark 1: The dot product is bilinear. That is, it is linear in each variable separately: if $y$ is fixed, the map $x\mapsto x\pd y$ is a linear map from $\R^n$ to $\R$, and similarly, if $x$ is fixed, the map $y\mapsto x\pd y$ is linear.

Remark 2: It is also symmetric. That is, $x\pd y = y\pd x.$
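Both remarks are easy to spot-check numerically. A minimal sketch in Python (the sample vectors and scalars are arbitrary choices, not from the lecture):

```python
import numpy as np

# Arbitrary sample vectors and scalars for illustration.
x = np.array([1.0, -2.0, 3.0])
y = np.array([4.0, 0.5, -1.0])
z = np.array([2.0, 2.0, 2.0])
a, b = 3.0, -0.5

# Bilinearity (linearity in the first variable, y fixed):
# (a*x + b*z) . y == a*(x . y) + b*(z . y)
print(np.isclose(np.dot(a * x + b * z, y),
                 a * np.dot(x, y) + b * np.dot(z, y)))  # True

# Symmetry: x . y == y . x
print(np.isclose(np.dot(x, y), np.dot(y, x)))           # True
```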





11.4 Norms

Definition 11.4.3. For $x=(x_1,x_2,\ldots,x_n) \in \R^n,$ the Euclidean norm is defined as

$\ds \snorm{x} := \snorm{x}_{\R^n} := \sqrt{x \pd x} = \sqrt{(x_1)^2+(x_2)^2 + \cdots + (x_n)^2}.$

It is easy to see that the Euclidean norm satisfies properties 1 and 2 in Definition 11.4.1.

The triangle inequality follows from

$\ds \snorm{x+y}^2 = x \pd x + y \pd y + 2 (x \pd y) \leq \snorm{x}^2 + \snorm{y}^2 + 2 \snorm{x} \,\snorm{y} = {\bigl(\snorm{x} + \snorm{y}\bigr)}^2 ,$

where the inequality in the middle is the Cauchy-Schwarz inequality (Theorem 11.4.1 below).
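The computation above is easy to verify numerically; a minimal sketch with arbitrary sample vectors:

```python
import numpy as np

x = np.array([1.0, 2.0, -1.0])   # arbitrary sample vectors
y = np.array([0.5, -3.0, 2.0])

lhs = np.dot(x + y, x + y)       # ||x+y||^2
expansion = np.dot(x, x) + np.dot(y, y) + 2 * np.dot(x, y)
bound = (np.linalg.norm(x) + np.linalg.norm(y)) ** 2

print(np.isclose(lhs, expansion))  # True: the displayed expansion
print(lhs <= bound)                # True: the triangle inequality
```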



11.4 Norms

Theorem 11.4.1. (Cauchy-Schwarz inequality) Let $x, y \in \R^n,$ then \begin{equation*} \sabs{x \pd y} \leq \snorm{x} \, \snorm{y} = \sqrt{x\pd x}\, \sqrt{y\pd y}, \end{equation*} with equality if and only if $x = \lambda y$ or $y = \lambda x$ for some $\lambda \in \R$.
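A quick numerical illustration of the theorem, including the equality case for parallel vectors (the sample values are arbitrary):

```python
import numpy as np

x = np.array([3.0, -1.0, 2.0])   # arbitrary sample vectors
y = np.array([1.0, 4.0, -2.0])

# Strict inequality when x is not a multiple of y:
print(abs(np.dot(x, y)) <= np.linalg.norm(x) * np.linalg.norm(y))  # True

# Equality when x = lambda * y:
x2 = 2.0 * y
print(np.isclose(abs(np.dot(x2, y)),
                 np.linalg.norm(x2) * np.linalg.norm(y)))          # True
```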




Proof of Cauchy-Schwarz inequality

If $x=0$ or $y = 0,$ the result follows immediately. So assume $x\not= 0$ and $y \not= 0.$

If $x$ is a scalar multiple of $y,$ that is $x = \lambda y$ for some $\lambda \in \R$, then the theorem holds with equality:

$\ds \sabs{ x \pd y } = \sabs{\lambda (y \pd y)} = \sabs{\lambda} \, \sabs{y\pd y} = \sabs{\lambda} \, \snorm{y}^2 = \snorm{\lambda y} \, \snorm{y} = \snorm{x} \, \snorm{y} .$




Proof of Cauchy-Schwarz inequality

Now fix $x$ and $y,$ and consider the variable $t \in \R.$ Then we have

$\ds \snorm{x+ty}^2 = (x+ty) \pd (x+ty) = x \pd x + x \pd ty + ty \pd x + ty \pd ty$   (bilinearity)

$\ds \phantom{\snorm{x+ty}^2} = \snorm{x}^2 + 2t(x \pd y) + t^2 \snorm{y}^2 ,$   (symmetry)

which is a polynomial of degree 2 in $t.$ If $x$ is not a scalar multiple of $y,$ then $x+ty \neq 0$ for every $t,$ so $\snorm{x+ty}^2 > 0$ for all $t.$ That is, the polynomial $\snorm{x+ty}^2$ is never zero.


Proof of Cauchy-Schwarz inequality

Since $\snorm{x+ty}^2>0$ for all $t,$ the quadratic has no real root, so elementary algebra says that its discriminant must be negative. That is,

$4 {(x \pd y)}^2 - 4 \snorm{x}^2\snorm{y}^2 \lt 0, $

or in other words,

${(x \pd y)}^2 \lt \snorm{x}^2\snorm{y}^2.\; \bs$




11.4 Norms

Standard distance in $\R^n$

The distance $$d(x,y) \coloneqq \snorm{x-y}$$ is the standard distance (standard metric) on $\R^n$ that we used when we talked about metric spaces.
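In code, the standard metric is one line on top of the Euclidean norm; a small sketch with arbitrary sample points:

```python
import numpy as np

def d(x, y):
    """Standard distance on R^n induced by the Euclidean norm."""
    return np.linalg.norm(x - y)

x = np.array([1.0, 2.0])
y = np.array([4.0, 6.0])
print(d(x, y))  # 5.0, the familiar planar distance
```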





11.4 Norms

Operator norm

Definition 11.4.4. Let $A \in L(X,Y).$ Define \begin{equation*} \snorm{A} := \sup \bigl\{ \snorm{Ax} : x \in X \text{ with } \snorm{x} = 1 \bigr\} . \end{equation*} The number $\snorm{A}$ (possibly $\infty$) is called the operator norm.

In particular, the operator norm is a norm for finite-dimensional spaces. When it is necessary to emphasize which norm we are talking about, we may write it as

$\ds\snorm{A}_{L(X,Y)}.$



11.4 Norms

Operator norm

For example, if $X=\R^1$ with norm $\snorm{x}=\sabs{x},$ we think of elements of $L(X)$ as multiplication by scalars: $$x \mapsto ax.$$ If $\snorm{x} =\sabs{x}=1,$ then $\sabs{a x} = \sabs{a},$ so the operator norm of $a$ is $\sabs{a}.$



11.4 Norms

Operator norm

By linearity, $\ds \left|\left|A \frac{x}{\snorm{x}}\right|\right| = \frac{\snorm{Ax}}{\snorm{x}}$ for all nonzero $x \in X$.

The vector $\ds\frac{x}{\snorm{x}}$ is of norm 1. Therefore

$\ds \snorm{A} = \sup \bigl\{ \snorm{Ax} : x \in X \text{ with } \snorm{x} = 1 \bigr\} = \sup_{\substack{x \in X\\x\neq 0}} \frac{\snorm{Ax}}{\snorm{x}} .$
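This supremum formula suggests a way to estimate $\snorm{A}$ numerically: sample many nonzero vectors and take the largest ratio $\snorm{Ax}/\snorm{x}$. A minimal sketch (the matrix is an arbitrary example; for the Euclidean norm, NumPy's spectral norm gives the exact value to compare against):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 1.0],        # arbitrary example matrix
              [0.0, 3.0]])

# Sample random nonzero vectors and compute ||Ax|| / ||x||.
xs = rng.standard_normal((10_000, 2))
ratios = np.linalg.norm(xs @ A.T, axis=1) / np.linalg.norm(xs, axis=1)

print(ratios.max())          # close to (never above) the operator norm
print(np.linalg.norm(A, 2))  # exact: the spectral norm of A
```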



11.4 Norms

Operator norm

👉   $\snorm{A} = \ds\sup_{\substack{x \in X\\x\neq 0}} \frac{\snorm{Ax}}{\snorm{x}} $

Assuming $\snorm{A}$ is finite, this implies that for every $x \in X,$

$\ds\snorm{Ax} \leq \snorm{A} \snorm{x} .$

It follows from the definition that $\snorm{A} = 0$ if and only if $A = 0,$ where by $A=0$ we mean that $A$ takes every vector to the zero vector.



11.4 Norms

Operator norm

👉   $\snorm{A} = \ds\sup_{\substack{x \in X\\x\neq 0}} \frac{\snorm{Ax}}{\snorm{x}} $

What would be the operator norm
of the identity operator? 🤔

$\ds \snorm{I} = \sup_{\substack{x \in X\\x\neq 0}} \frac{\snorm{Ix}}{\snorm{x}} = \sup_{\substack{x \in X\\x\neq 0}} \frac{\snorm{x}}{\snorm{x}} = 1. $   😃





11.4 Norms

Operator norm

Theorem 11.4.2. Let $X$ and $Y$ be normed vector spaces. Suppose that $X$ is finite-dimensional. If $A \in L(X,Y),$ then $\snorm{A} \lt \infty,$ and $A$ is uniformly continuous.

Remark: To prove that $A$ is uniformly continuous, it suffices to prove that it is Lipschitz continuous. That is, we need to prove that there exists a $K>0$ such that

$\snorm{Av - Aw }\leq K \, \snorm{v-w} \;$ for all $\;v,w\in X.$

The proof below shows that $K = \snorm{A}$ works.




11.4 Norms

Theorem 11.4.2. Let $X$ and $Y$ be normed vector spaces. Suppose that $X$ is finite-dimensional. If $A \in L(X,Y)$, then $\snorm{A} \lt \infty$, and $A$ is uniformly continuous.

Proof. Assume $X = \R^n.$ 😃 Let $\{ e_1,e_2,\ldots,e_n \}$ be the standard basis of $X$.

For $x\in X$, with $\snorm{x} = 1$, write $\ds x = \sum_{k=1}^n c_k \, e_k .$

Since $e_k \pd e_\ell = 0$ whenever $k\not=\ell$ and $e_k \pd e_k = 1$, we have $c_k = x \pd e_k$, and by Cauchy-Schwarz,

$\ds \sabs{c_k} = \sabs{ x \pd e_k } \leq \snorm{x} \, \snorm{e_k} = 1 . $




11.4 Norms

Theorem 11.4.2. Let $X$ and $Y$ be normed vector spaces. Suppose that $X$ is finite-dimensional. If $A \in L(X,Y)$, then $\snorm{A} \lt \infty$, and $A$ is uniformly continuous.

Proof. Then, since $\sabs{c_k} \leq 1,$

$\ds \snorm{Ax} = \left|\left|\sum_{k=1}^n c_k \, Ae_k\right|\right| \leq \sum_{k=1}^n \sabs{c_k} \, \snorm{Ae_k} \leq \sum_{k=1}^n \snorm{Ae_k} . $

The right-hand side does not depend on $x.$

Thus we have found a finite upper bound for $\snorm{Ax}$ independent of $x,$ which means that $\snorm{A} \lt \infty$.
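Since $Ae_k$ is the $k$-th column of the matrix of $A$, the bound in the proof says the operator norm is at most the sum of the column norms; a numerical spot-check on an arbitrary matrix:

```python
import numpy as np

A = np.array([[1.0, -2.0, 0.5],   # arbitrary example matrix
              [3.0,  0.0, 1.0]])

# sum_k ||A e_k|| = sum of the Euclidean norms of the columns of A
column_norm_sum = sum(np.linalg.norm(A[:, k]) for k in range(A.shape[1]))
operator_norm = np.linalg.norm(A, 2)  # spectral norm

print(operator_norm <= column_norm_sum)  # True
```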



11.4 Norms

Theorem 11.4.2. Let $X$ and $Y$ be normed vector spaces. Suppose that $X$ is finite-dimensional. If $A \in L(X,Y)$, then $\snorm{A} \lt \infty$, and $A$ is uniformly continuous.

Proof. So we know that $\snorm{A} \lt \infty.$ Using this fact we can prove that $A$ is uniformly continuous.

In fact, for any normed vector spaces $X$ and $Y$ and any $A \in L(X,Y),$ we have, for all $v,w \in X,$

$\ds \snorm{Av - Aw} = \snorm{A(v-w)} \leq \snorm{A} \, \snorm{v-w} . $

Since $\snorm{A} \lt \infty$, then $A$ is Lipschitz continuous, with constant $K = \snorm{A}$.

Therefore, $A$ is uniformly continuous. $\;\bs $



11.4 Norms

Operator norm

Theorem 11.4.3. Let $X,Y,$ and $Z$ be finite-dimensional normed vector spaces.

  1. If $A,B \in L(X,Y)$ and $c \in \R,$ then \begin{equation*} \snorm{A+B} \leq \snorm{A}+\snorm{B}, \;\; \snorm{cA} = \sabs{c} \, \snorm{A} . \end{equation*} In particular, the operator norm is a norm on the vector space $L(X,Y).$
  2. If $A \in L(X,Y)$ and $B \in L(Y,Z),$ then $ \snorm{BA} \leq \snorm{B} \, \snorm{A} . $

Proof.   📝    👀 Complementary reading 📖
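All three inequalities in the theorem are easy to spot-check numerically for the spectral norm on random matrices; a minimal sketch (with a tiny tolerance for floating-point error):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))       # arbitrary sample matrices
B = rng.standard_normal((3, 3))
op = lambda M: np.linalg.norm(M, 2)   # operator (spectral) norm
eps = 1e-12                           # floating-point slack

print(op(A + B) <= op(A) + op(B) + eps)       # triangle inequality
print(np.isclose(op(-2.5 * A), 2.5 * op(A)))  # absolute homogeneity
print(op(B @ A) <= op(B) * op(A) + eps)       # submultiplicativity
```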



Relationship between Linear operators and Matrices

Consider the vector space $M_{m\times n}$ consisting of all $m\times n$ matrices, and let $X$ and $Y$ be vector spaces.

If $\{x_1,\ldots,x_n\}$ is a basis of $X$ and $\{y_1,\ldots,y_m\}$ is a basis of $Y,$ then for each $A\in L(X,Y)$, we have a matrix $\mathcal M(A)\in M_{m\times n}.$ In other words, once bases have been fixed for $X$ and $Y,$ $\mathcal M$ becomes a linear mapping from $L(X,Y)$ to $M_{m\times n}.$

Moreover, $\mathcal M$ is a bijection between $L(X,Y)$ and $M_{m\times n}.$
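Concretely, with the standard bases, the $k$-th column of $\mathcal M(A)$ records the coordinates of $A(e_k)$. A minimal sketch (the linear map below is an arbitrary example):

```python
import numpy as np

def A(v):
    """An arbitrary linear map R^2 -> R^3 for illustration."""
    x, y = v
    return np.array([2 * x + y, x - y, 3 * y])

# Columns of M(A) are the images of the standard basis vectors.
M = np.column_stack([A(e) for e in np.eye(2)])
print(M)                          # the 3x2 matrix representing A

v = np.array([1.0, 4.0])
print(np.allclose(M @ v, A(v)))   # True: M reproduces the map
```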







11.5 Determinants

Let $A,B\in M_{n\times n}.$ Some important facts about determinants:

  1. $\det(I) = 1,$ where $I$ is the identity matrix.
  2. If two columns of $A$ are equal, then $\det(A) = 0.$
  3. If a column of $A$ is zero, then $\det(A) = 0.$
  4. $A \mapsto \det(A)$ is a continuous function on $L(\R^n).$
  5. $\det\left( \begin{bmatrix} a & b \\ c &d \end{bmatrix} \right) = ad-bc,\,$ and $\,\det \bigl( [a] \bigr) = a.$
  6. $\det(AB) = \det(A)\det(B).$
  7. $A$ is invertible if and only if $\det(A) \not= 0$ and in this case, $\det\left(A^{-1}\right) = \dfrac{1}{\det(A)}.$
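Facts 1, 6, and 7 are easy to spot-check numerically (the sample matrices are arbitrary; a random matrix is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))   # arbitrary sample matrices
B = rng.standard_normal((3, 3))

print(np.isclose(np.linalg.det(np.eye(3)), 1.0))        # fact 1
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))  # fact 6
print(np.isclose(np.linalg.det(np.linalg.inv(A)),
                 1.0 / np.linalg.det(A)))               # fact 7
```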


11.5 Determinants

Remark: The determinant is a number assigned to square matrices that measures how the corresponding linear mapping stretches the space. In particular, this number can be used to test for invertibility of a matrix.





Linear transformations in 2D

Source: SCiMS


Linear transformations in 3D

Source: Matrix transformations


Affine transformations

$ \ds f(x,y)=\begin{bmatrix} a & b\\ c & d \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} e \\ f \end{bmatrix} $

$ \ds f_1(x,y)=\begin{bmatrix} 0.00 & 0.00 \\ 0.00 & 0.16 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} $

$ \ds f_2(x,y)=\begin{bmatrix} 0.85 & 0.04 \\ -0.04 & 0.85 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} 0.00 \\ 1.60 \end{bmatrix} $

$ \ds f_3(x,y)=\begin{bmatrix} 0.20 & -0.26 \\ 0.23 & 0.22 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} 0.00 \\ 1.60 \end{bmatrix} $

$ \ds f_4(x,y)=\begin{bmatrix} -0.15 & 0.28 \\ 0.26 & 0.24 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} 0.00 \\ 0.44 \end{bmatrix} $


Barnsley fern in JavaScript

Click on Play. Explore: Modify the code and click again on Play to see the changes. Have fun! 🌿 🤓
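The live editor is not reproduced here; as a stand-in, here is a minimal Python sketch of the same chaos-game iteration with the four maps $f_1,\ldots,f_4$ above. The selection weights (1%, 85%, 7%, 7%) are the classical Barnsley choices and an assumption here; the embedded code may use different ones.

```python
import numpy as np
import matplotlib.pyplot as plt

# The four affine maps f_1..f_4 from the previous slide.
mats = [np.array(m) for m in (
    [[0.00,  0.00], [0.00,  0.16]],
    [[0.85,  0.04], [-0.04, 0.85]],
    [[0.20, -0.26], [0.23,  0.22]],
    [[-0.15, 0.28], [0.26,  0.24]],
)]
shifts = [np.array(s) for s in ([0.0, 0.0], [0.0, 1.6], [0.0, 1.6], [0.0, 0.44])]
probs = [0.01, 0.85, 0.07, 0.07]   # assumed classical weights

rng = np.random.default_rng(0)
p = np.zeros(2)
points = []
for _ in range(50_000):
    k = rng.choice(4, p=probs)     # pick a map at random
    p = mats[k] @ p + shifts[k]    # apply f_k
    points.append(p)

xs, ys = zip(*points)
plt.scatter(xs, ys, s=0.1, color="green")
plt.axis("equal"); plt.axis("off")
plt.show()
```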





Credits