Mathematical Analysis

Summary Week 11


Vector spaces

Definition: Let $X$ be a set together with the operations of addition, $+\, \colon X \times X \to X$, and scalar multiplication, $\cdot \,\colon \R \times X \to X$ (we usually write $ax$ instead of $a \cdot x$). Then $X$ is called a vector space (or a real vector space) if the following conditions are satisfied:

  1. If $u, v, w \in X$, then $u+(v+w) = (u+v)+w$.
  2. If $u, v \in X$, then $u+v = v+u$.
  3. There is a $0 \in X$ such that $v+0=v$ for all $v \in X$.
  4. For every $v \in X$, there is a $-v \in X$, such that $v+(-v)=0$.
  5. If $a \in \R$, $u,v \in X$, then $a(u+v) = au+av$.
  6. If $a,b \in \R$, $v \in X$, then $(a+b)v = av+bv$.
  7. If $a,b \in \R$, $v \in X$, then $(ab)v = a(bv)$.
  8. $1v = v$ for all $v \in X$.
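
As an illustration, the axioms can be spot-checked numerically. Below is a minimal NumPy sketch for $\R^3$ with componentwise operations, with random sample vectors and scalars standing in for arbitrary ones:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = (rng.standard_normal(3) for _ in range(3))
a, b = rng.standard_normal(2)
zero = np.zeros(3)

# R^3 with componentwise operations: spot-check the eight axioms.
assert np.allclose(u + (v + w), (u + v) + w)    # 1. associativity of +
assert np.allclose(u + v, v + u)                # 2. commutativity of +
assert np.allclose(v + zero, v)                 # 3. additive identity
assert np.allclose(v + (-v), zero)              # 4. additive inverse
assert np.allclose(a * (u + v), a * u + a * v)  # 5. distributivity over vector sums
assert np.allclose((a + b) * v, a * v + b * v)  # 6. distributivity over scalar sums
assert np.allclose((a * b) * v, a * (b * v))    # 7. compatibility of multiplications
assert np.allclose(1 * v, v)                    # 8. scalar identity
```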

Subvector spaces

Remark: If $X$ is a vector space, to check that a subset $S \subset X$ is a vector subspace, we only need

  1. $0 \in S$,
  2. $S$ is closed under addition: adding two vectors in $S$ gets us a vector in $S$, and
  3. $S$ is closed under scalar multiplication: multiplying a vector in $S$ by a scalar gets us a vector in $S$.
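
For instance, the plane $S = \{ x \in \R^3 : x_1 + x_2 + x_3 = 0 \}$ passes this three-point test; a short sketch (with random samples standing in for arbitrary elements of $S$):

```python
import numpy as np

rng = np.random.default_rng(1)

def in_S(x):
    # Membership test for S = { x in R^3 : x_1 + x_2 + x_3 = 0 }.
    return np.isclose(x.sum(), 0.0)

def sample_S():
    # Project a random vector along (1,1,1) so its coordinates sum to 0.
    x = rng.standard_normal(3)
    return x - x.mean()

u, v = sample_S(), sample_S()
assert in_S(np.zeros(3))                # 1. S contains the zero vector
assert in_S(u + v)                      # 2. closed under addition
assert in_S(rng.standard_normal() * u)  # 3. closed under scalar multiplication
```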

Span & Dimension

Linear mappings

Definition: A mapping $A \colon X \to Y$ of vector spaces $X$ and $Y$ is linear (we also say $A$ is a linear transformation or a linear operator) if for all $a \in \R$ and all $x,y \in X$, \begin{equation*} A(a x) = a A(x), \qquad \text{and} \qquad A(x+y) = A(x)+A(y) . \end{equation*}

We usually write $Ax$ instead of $A(x)$ if $A$ is linear. If $A$ is one-to-one and onto, then we say $A$ is invertible, and we denote the inverse by $A^{-1}$. If $A \colon X \to X$ is linear, then we say $A$ is a linear operator on $X$.
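
A quick numerical sanity check of the two conditions, using a matrix mapping on $\R^3$ and random vectors and scalars in place of arbitrary ones:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))  # a matrix acts as the linear map x -> A x
x, y = rng.standard_normal(3), rng.standard_normal(3)
a = rng.standard_normal()

assert np.allclose(A @ (a * x), a * (A @ x))    # A(ax) = a A(x)
assert np.allclose(A @ (x + y), A @ x + A @ y)  # A(x+y) = A(x) + A(y)
```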

We write $L(X,Y)$ for the set of all linear transformations from $X$ to $Y$, and just $L(X)$ for the set of linear operators on $X$. If $a \in \R$ and $A,B \in L(X,Y)$, define the transformations $aA$ and $A+B$ by \begin{equation*} (aA)(x) := aAx , \qquad (A+B)(x) := Ax + Bx . \end{equation*} If $A \in L(Y,Z)$ and $B \in L(X,Y)$, define the transformation $AB$ as the composition $A \circ B$, that is, \begin{equation*} ABx := A(Bx) . \end{equation*} Finally, denote by $I \in L(X)$ the identity: the linear operator such that $Ix = x$ for all $x$.
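
In finite dimensions linear maps can be represented by matrices (see the section on matrices below), and the operations above then become the familiar matrix operations; a short sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
A, B = rng.standard_normal((2, 3)), rng.standard_normal((2, 3))  # L(R^3, R^2)
C = rng.standard_normal((3, 4))                                  # L(R^4, R^3)
x, z = rng.standard_normal(3), rng.standard_normal(4)
a = rng.standard_normal()

assert np.allclose((a * A) @ x, a * (A @ x))    # (aA)(x) = a Ax
assert np.allclose((A + B) @ x, A @ x + B @ x)  # (A+B)(x) = Ax + Bx
assert np.allclose((A @ C) @ z, A @ (C @ z))    # composition: (AC)z = A(Cz)
assert np.allclose(np.eye(3) @ x, x)            # identity: Ix = x
```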

Theorem: If $A \in L(X,Y)$ is invertible, then $A^{-1}$ is linear.

Theorem: A linear mapping $A \in L(X,Y)$ is completely determined by its values on a basis of $X$. Furthermore, if $B$ is a basis of $X$, then every function $\widetilde{A} \colon B \to Y$ extends uniquely to a linear function $A$ on $X$.

Theorem: If $X$ is a finite-dimensional vector space and $A \in L(X)$, then $A$ is one-to-one if and only if it is onto.

Theorem: If $X$ and $Y$ are finite-dimensional vector spaces, then $L(X,Y)$ is also finite-dimensional.
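
To illustrate the second of these theorems with the standard basis of $\R^3$: the columns of a matrix are exactly the images of the basis vectors, so the whole map is recovered from its values on the basis. A sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((2, 3))
E = np.eye(3)  # columns are the standard basis vectors e_1, e_2, e_3

# Rebuild the matrix of the map from the images A(e_j) alone.
M = np.column_stack([A @ E[:, j] for j in range(3)])
assert np.allclose(M, A)
```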

Convexity

Definition: A subset $U$ of a vector space is convex if whenever $x,y \in U$, the line segment from $x$ to $y$ lies in $U$. That is, if the convex combination $(1-t)x+ty$ is in $U$ for all $t \in [0,1]$. Sometimes we write $[x,y]$ for this line segment.

Theorem: Let $x \in \R^n$ and $r \ge 0$. The ball $B(x,r) \subset \R^n$ (using the standard metric on $\R^n$) is convex.
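
A numerical spot-check of this theorem for the unit ball $B(0,1) \subset \R^2$ (the actual proof is a short application of the triangle inequality):

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_ball():
    # A random point of B(0, 1): random direction, norm at most 0.9.
    p = rng.standard_normal(2)
    return 0.9 * rng.random() * p / np.linalg.norm(p)

x, y = sample_ball(), sample_ball()
for t in np.linspace(0.0, 1.0, 11):
    # Every convex combination stays inside the ball.
    assert np.linalg.norm((1 - t) * x + t * y) < 1.0
```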

Norms

Definition: If $X$ is a vector space, then we say a function $\snorm{\cdot} \colon X \to \R$ is a norm if

  1. $\snorm{x} \geq 0$, with $\snorm{x}=0$ if and only if $x=0$.
  2. $\snorm{cx} = \sabs{c} \, \snorm{x}$ for all $c \in \R$ and $x \in X$.
  3. $\snorm{x+y} \leq \snorm{x}+\snorm{y}$ for all $x,y \in X$.

A vector space equipped with a norm is called a normed vector space.
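
For example, the 1-norm $\snorm{x}_1 := \sabs{x_1} + \cdots + \sabs{x_n}$ on $\R^3$ satisfies all three conditions; a sketch spot-checking them on random vectors:

```python
import numpy as np

rng = np.random.default_rng(6)

def norm1(x):
    # The 1-norm: sum of absolute values of the coordinates.
    return np.abs(x).sum()

x, y = rng.standard_normal(3), rng.standard_normal(3)
c = rng.standard_normal()

assert norm1(x) >= 0 and np.isclose(norm1(np.zeros(3)), 0.0)  # 1. nonnegativity
assert np.isclose(norm1(c * x), abs(c) * norm1(x))            # 2. homogeneity
assert norm1(x + y) <= norm1(x) + norm1(y) + 1e-12            # 3. triangle inequality
```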

Euclidean norm

Definition: For two vectors $x=(x_1,x_2,\ldots,x_n) \in \R^n$ and $y=(y_1,y_2,\ldots,y_n) \in \R^n$, the dot product is defined as \begin{equation*} x \cdot y := \sum_{j=1}^n x_j\, y_j . \end{equation*}

Definition: For $x=(x_1,x_2,\ldots,x_n) \in \R^n$, the euclidean norm is defined as \begin{equation*} \snorm{x} := \snorm{x}_{\R^n} := \sqrt{x \cdot x} = \sqrt{(x_1)^2+(x_2)^2 + \cdots + (x_n)^2}. \end{equation*}
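
A quick worked check for $x = (3,4) \in \R^2$, where $\snorm{x} = \sqrt{3^2 + 4^2} = 5$:

```python
import numpy as np

x = np.array([3.0, 4.0])
assert np.isclose(np.sqrt(x @ x), 5.0)                # sqrt(x . x) = sqrt(9 + 16) = 5
assert np.isclose(np.sqrt(x @ x), np.linalg.norm(x))  # agrees with NumPy's norm
```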

Cauchy-Schwarz inequality

Theorem: Let $x, y \in \R^n$. Then \begin{equation*} \sabs{x \cdot y} \leq \snorm{x} \, \snorm{y} = \sqrt{x\cdot x}\, \sqrt{y\cdot y}, \end{equation*} with equality if and only if $x = \lambda y$ or $y = \lambda x$ for some $\lambda \in \R$.
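
A sketch spot-checking the inequality on random vectors, together with the equality case for parallel ones:

```python
import numpy as np

rng = np.random.default_rng(7)
x, y = rng.standard_normal(5), rng.standard_normal(5)

# Strict inequality is typical for independently drawn vectors.
assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y)

# Equality holds for parallel vectors z = lambda * x.
lam = rng.standard_normal()
z = lam * x
assert np.isclose(abs(x @ z), np.linalg.norm(x) * np.linalg.norm(z))
```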

Operator norm

Definition: Let $A \in L(X,Y)$. Define \begin{equation*} \snorm{A} := \sup \bigl\{ \snorm{Ax} : x \in X \text{ with } \snorm{x} = 1 \bigr\} . \end{equation*} The number $\snorm{A}$ (possibly $\infty$) is called the operator norm. In particular, the operator norm is a norm for finite-dimensional spaces. When it is necessary to emphasize which norm we are talking about, we may write it as $\snorm{A}_{L(X,Y)}$.
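
Since the supremum runs over the unit sphere, sampling unit vectors gives a lower estimate of $\snorm{A}$. For the euclidean norms on $\R^n$, the operator norm equals the largest singular value, which NumPy computes directly; a sketch comparing the two:

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((3, 3))

# Sample many unit vectors and take the largest ||Ax||.
xs = rng.standard_normal((10000, 3))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)  # normalize rows onto the unit sphere
estimate = np.linalg.norm(xs @ A.T, axis=1).max()

exact = np.linalg.norm(A, 2)  # matrix 2-norm: the largest singular value
assert estimate <= exact + 1e-9  # sampling approaches the supremum from below
print(f"sampled: {estimate:.4f}, largest singular value: {exact:.4f}")
```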

Theorem: Let $X$ and $Y$ be normed vector spaces. Suppose that $X$ is finite-dimensional. If $A \in L(X,Y)$, then $\snorm{A} \lt \infty$, and $A$ is uniformly continuous (Lipschitz with constant $\snorm{A}$).

Theorem: Let $X$, $Y$, and $Z$ be finite-dimensional normed vector spaces.

  1. If $A,B \in L(X,Y)$ and $c \in \R$, then \begin{equation*} \snorm{A+B} \leq \snorm{A}+\snorm{B}, \qquad \snorm{cA} = \sabs{c} \, \snorm{A} . \end{equation*} In particular, the operator norm is a norm on the vector space $L(X,Y)$.
  2. If $A \in L(X,Y)$ and $B \in L(Y,Z)$, then $ \snorm{BA} \leq \snorm{B} \, \snorm{A} . $
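
A sketch spot-checking both theorems for the euclidean operator norm (computed as the largest singular value): the Lipschitz estimate, the norm properties, and submultiplicativity:

```python
import numpy as np

rng = np.random.default_rng(9)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
x, y = rng.standard_normal(3), rng.standard_normal(3)
c = rng.standard_normal()

def op(M):
    # Euclidean operator norm of a matrix: its largest singular value.
    return np.linalg.norm(M, 2)

assert np.linalg.norm(A @ x - A @ y) <= op(A) * np.linalg.norm(x - y) + 1e-12  # Lipschitz
assert op(A + B) <= op(A) + op(B) + 1e-12     # triangle inequality
assert np.isclose(op(c * A), abs(c) * op(A))  # homogeneity
assert op(B @ A) <= op(B) * op(A) + 1e-12     # ||BA|| <= ||B|| ||A||
```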

Linear operators & Matrices

Consider the vector space $M_{m\times n}$ consisting of all $m\times n$ matrices.

If $\{x_1,\ldots,x_n\}$ is a basis of $X$ and $\{y_1,\ldots,y_m\}$ is a basis of $Y$, then each $A\in L(X,Y)$ determines a matrix $\mathcal M(A)\in M_{m\times n}$, whose $j$th column consists of the coordinates of $A x_j$ with respect to the basis $\{y_1,\ldots,y_m\}$. In other words, once bases have been fixed for $X$ and $Y$, $\mathcal M$ becomes a linear mapping from $L(X,Y)$ to $M_{m\times n}$.

Moreover, $\mathcal M$ is a one-to-one correspondence between $L(X,Y)$ and $M_{m\times n}$.
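
A sketch computing $\mathcal M(A)$ for randomly chosen bases (random matrices are invertible with probability one, so their columns form bases): column $j$ solves the linear system expressing $A x_j$ in the basis of $Y$.

```python
import numpy as np

rng = np.random.default_rng(10)
A = rng.standard_normal((2, 3))  # a linear map R^3 -> R^2 (in standard coordinates)
X = rng.standard_normal((3, 3))  # columns x_1, x_2, x_3: a basis of R^3
Y = rng.standard_normal((2, 2))  # columns y_1, y_2: a basis of R^2

# Column j of M(A) holds the coordinates of A x_j in the basis {y_1, y_2},
# i.e. it solves Y c = A x_j; solve handles all columns at once.
M = np.linalg.solve(Y, A @ X)

# Sanity check: with the standard bases, M(A) is A itself.
assert np.allclose(np.linalg.solve(np.eye(2), A @ np.eye(3)), A)
```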

Linear transformations in 2D

Linear transformations in 3D

Determinants

Definition: Let $S_n$ be the set of all permutations on $n$ elements. Let $A= [a_{i,j}]$ be a square $n$-by-$n$ matrix. Define the determinant of $A$ as \begin{equation*} \det(A) := \sum_{\sigma \in S_n} \operatorname{sgn} (\sigma) \prod_{i=1}^n a_{i,\sigma_i} , \end{equation*} where \begin{equation*} \operatorname{sgn}(\sigma) = \operatorname{sgn}(\sigma_1,\ldots,\sigma_n) := \prod_{p \lt q} \operatorname{sgn}(\sigma_q-\sigma_p) . \end{equation*}
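
The formula can be evaluated directly, though at $O(n! \cdot n)$ cost it is practical only for small $n$; a sketch implementing it and checking against NumPy:

```python
import numpy as np
from itertools import permutations
from math import prod

def sgn(sigma):
    # sgn(sigma) = product over p < q of sgn(sigma_q - sigma_p).
    n = len(sigma)
    return prod(1 if sigma[q] > sigma[p] else -1
                for p in range(n) for q in range(p + 1, n))

def det(A):
    # Sum of sgn(sigma) * a_{1,sigma_1} * ... * a_{n,sigma_n} over all permutations.
    n = len(A)
    return sum(sgn(sigma) * prod(A[i][sigma[i]] for i in range(n))
               for sigma in permutations(range(n)))

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 10.0]])
assert np.isclose(det(A), np.linalg.det(A))  # both give -3
```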

Remark: The determinant is a number assigned to square matrices that measures how the corresponding linear mapping stretches the space. In particular, this number can be used to test a matrix for invertibility.

Some important facts about determinants:

  1. $\det(I) = 1$, where $I$ is the identity matrix.
  2. If two columns of $A$ are equal, then $\det(A) = 0$.
  3. If a column is zero, then $\det(A) = 0$.
  4. $A \mapsto \det(A)$ is a continuous function on $L(\R^n)$.
  5. $\det\left( \left[\begin{smallmatrix} a & b \\ c &d \end{smallmatrix}\right] \right) = ad-bc$, and $\det \bigl( [a] \bigr) = a$.
  6. If $A$ and $B$ are $n$-by-$n$ matrices, then $\det(AB) = \det(A)\det(B)$.
  7. $A$ is invertible if and only if $\det(A) \not= 0$, and in this case, $\det\left(A^{-1}\right) = \frac{1}{\det(A)}$.
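
A sketch spot-checking facts 1, 6, and 7 numerically:

```python
import numpy as np

rng = np.random.default_rng(11)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))

assert np.isclose(np.linalg.det(np.eye(3)), 1.0)  # 1. det(I) = 1
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))  # 6. multiplicativity
assert np.isclose(np.linalg.det(np.linalg.inv(A)),
                  1 / np.linalg.det(A))  # 7. det of the inverse
```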
