Mathematical Analysis

Lecture 19








Metric spaces

10.1 Metric spaces

Definition 10.1.1. Let $X$ be a set, and let $d \colon X \times X \to \R$ be a function such that for all $x,y,z \in X$

  1. $d(x,y) \geq 0$.
  2. $d(x,y) = 0$ if and only if $x = y.$
  3. $d(x,y) = d(y,x).$
  4. $d(x,z) \leq d(x,y)+ d(y,z).$

The pair $(X,d)$ is called a metric space. The function $d$ is called the metric or the distance function. Sometimes we write just $X$ as the metric space instead of $(X,d)$ if the metric is clear from context.
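For a computational sanity check, the four axioms can be tested on finitely many sample points. The Python sketch below is purely illustrative (the helper name `check_metric_axioms`, the tolerance, and the sample points are my own choices, not part of the lecture); a failed assertion refutes a candidate metric, but passing proves nothing.

```python
import itertools

def check_metric_axioms(d, points, tol=1e-12):
    """Spot-check the four metric axioms on a finite list of sample points.

    A failed assertion disproves that d is a metric on these points;
    passing is only evidence, never a proof.
    """
    for x, y, z in itertools.product(points, repeat=3):
        assert d(x, y) >= -tol                            # 1. nonnegativity
        assert (abs(d(x, y)) <= tol) == (x == y)          # 2. d(x,y) = 0 iff x = y
        assert abs(d(x, y) - d(y, x)) <= tol              # 3. symmetry
        assert d(x, z) <= d(x, y) + d(y, z) + tol         # 4. triangle inequality
    return True

# The standard metric d(x,y) = |x - y| on a few sample points of R.
print(check_metric_axioms(lambda x, y: abs(x - y), [-2.0, 0.0, 1.5, 3.0]))
```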



Example 10.1.1.

The set of real numbers $\R$ is a metric space with the metric \begin{equation*} d(x,y) := \abs{x-y} . \end{equation*}

Items (1)-(3) of the definition are easy to verify. 😃

1. $d(x,y)=\abs{x-y} \geq 0$ (properties of $\abs{\cdot}$).

2. $d(x,y) = 0 \iff \abs{x-y} = 0 \iff x = y.$

3. $d(x,y) = \abs{x-y} = \abs{y-x} = d(y,x).$



The triangle inequality follows immediately from the standard triangle inequality for real numbers.

That is,
\begin{align*}
d(x,z) &= \abs{x-z} = \abs{x-y+y-z} \\
&\leq \abs{x-y}+\abs{y-z} \\
&= d(x,y)+ d(y,z) .
\end{align*}

This is the standard metric on $\R$. If we talk about $\R$ as a metric space without mentioning a specific metric, we mean this particular metric.



10.1 Metric spaces

A well-known metric space is the $n$-dimensional Euclidean space

$\R^n = \R \times \R \times \cdots \times \R.$

We use the following notation for points: $x =(x_1,x_2,\ldots,x_n) \in \R^n$.





Remark 1: We will not write $\vec{x}$ nor $\mathbf{x}$ for a vector, we just give it a name such as $x$ and keep in mind that $x$ is a vector.

Remark 2: We also write simply $0 \in \R^n$ to mean the point $(0,0,\ldots,0)$.


10.1 Metric spaces

Theorem 10.1.1. (Cauchy-Schwarz inequality) If $x =(x_1,x_2,\ldots,x_n),\,$ $\,y =(y_1,y_2,\ldots,y_n) \in \R^n,$ then \begin{equation*} {\biggl( \sum_{k=1}^n x_k y_k \biggr)}^2 \leq \biggl(\sum_{k=1}^n x_k^2 \biggr) \biggl(\sum_{k=1}^n y_k^2 \biggr) . \end{equation*}









Proof of Theorem 10.1.1 (Cauchy-Schwarz inequality)

Any square of a real number is nonnegative. Thus any sum of squares is nonnegative. That is:

\begin{align*}
0 &\leq \sum_{j=1}^n \sum_{k=1}^n {(x_j y_k - x_k y_j)}^2 \\
&= \sum_{j=1}^n \sum_{k=1}^n \bigl( x_j^2 y_k^2 + x_k^2 y_j^2 - 2 x_j x_k y_j y_k \bigr) \\
&= \biggl( \sum_{j=1}^n x_j^2 \biggr) \biggl( \sum_{k=1}^n y_k^2 \biggr) + \biggl( \sum_{j=1}^n y_j^2 \biggr) \biggl( \sum_{k=1}^n x_k^2 \biggr) - 2 \biggl( \sum_{j=1}^n x_j y_j \biggr) \biggl( \sum_{k=1}^n x_k y_k \biggr) .
\end{align*}



We relabel the summation indices (the first two products on the right are equal) and divide by 2 to get \begin{equation*} 0 \leq \biggl( \sum_{k=1}^n x_k^2 \biggr) \biggl( \sum_{k=1}^n y_k^2 \biggr) - {\biggl( \sum_{k=1}^n x_k y_k \biggr)}^2. \end{equation*}

That is, $\ds {\biggl( \sum_{k=1}^n x_k y_k \biggr)}^2 \leq \biggl( \sum_{k=1}^n x_k^2 \biggr) \biggl( \sum_{k=1}^n y_k^2 \biggr). $ $\;\blacksquare$
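As a quick numerical sanity check (not a substitute for the proof above), one can compare both sides of the inequality on random vectors. The sketch below assumes NumPy is available; the seed and vector length are arbitrary choices of mine.

```python
import numpy as np

# Compare both sides of the Cauchy-Schwarz inequality on random vectors.
rng = np.random.default_rng(seed=0)
x = rng.normal(size=10)
y = rng.normal(size=10)

lhs = np.dot(x, y) ** 2                 # (sum_k x_k y_k)^2
rhs = np.dot(x, x) * np.dot(y, y)       # (sum_k x_k^2)(sum_k y_k^2)
print(lhs <= rhs)                       # True
```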



Example 10.1.2.

Using the previous result we can construct the standard metric for $\R^n$. Define

\begin{equation*} d(x,y) := \sqrt{ {(x_1-y_1)}^2 + {(x_2-y_2)}^2 + \cdots + {(x_n-y_n)}^2 } = \sqrt{ \sum_{k=1}^n {\left(x_k-y_k\right)}^2 } . \end{equation*}

Again proving items (1)-(3) is super easy! 😃





1. $d(x,y)=\sqrt{ \sum_{k=1}^n {\left(x_k-y_k\right)}^2 } \geq 0$.

2. $d(x,y) = 0 \iff \sqrt{ \sum_{k=1}^n {\left(x_k-y_k\right)}^2 } = 0 \iff x_k = y_k \text{ for all } k \iff x = y.$

3. $d(x,y) =\sqrt{ \sum_{k=1}^n {\left(x_k-y_k\right)}^2 } =\sqrt{ \sum_{k=1}^n {\left(y_k-x_k\right)}^2 } =d(y,x).$




Using the Cauchy-Schwarz inequality, we can establish the triangle inequality by working with the square of the metric:

\begin{align*}
{\bigl(d(x,z)\bigr)}^2 &= \sum_{k=1}^n {(x_k-z_k)}^2 = \sum_{k=1}^n {(x_k-y_k+y_k-z_k)}^2 \\
&= \sum_{k=1}^n {(x_k-y_k)}^2 + \sum_{k=1}^n {(y_k-z_k)}^2 + 2 \sum_{k=1}^n (x_k-y_k)(y_k-z_k) \\
&\leq \sum_{k=1}^n {(x_k-y_k)}^2 + \sum_{k=1}^n {(y_k-z_k)}^2 + 2 \sqrt{ \sum_{k=1}^n {(x_k-y_k)}^2 \sum_{k=1}^n {(y_k-z_k)}^2 } \\
&= {\left( \sqrt{ \sum_{k=1}^n {(x_k-y_k)}^2 } + \sqrt{ \sum_{k=1}^n {(y_k-z_k)}^2 } \right)}^2 \\
&= {\bigl( d(x,y) + d(y,z) \bigr)}^2 .
\end{align*}




Thus we have that

${\bigl(d(x,z)\bigr)}^2 \leq {\bigl( d(x,y) + d(y,z) \bigr)}^2 . $

Therefore

$d(x,z) \leq d(x,y) + d(y,z) . $


Note: Both sides are nonnegative, and the square root is an increasing function, so the inequality is preserved when we take the square root of both sides. 🧐
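As an illustration (not part of the lecture), the standard metric on $\R^n$ translates directly into a short Python function, and the triangle inequality can be spot-checked on concrete points; the function name and sample points are my own choices.

```python
import math

def euclidean_d(x, y):
    """Standard metric on R^n: square root of the sum of squared coordinate differences."""
    return math.sqrt(sum((xk - yk) ** 2 for xk, yk in zip(x, y)))

# Spot-check the triangle inequality on three points of R^2.
x, y, z = (0.0, 0.0), (1.0, 2.0), (4.0, 6.0)
print(euclidean_d(x, z) <= euclidean_d(x, y) + euclidean_d(y, z))  # True
```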



Example 10.1.3.

Consider the set of complex numbers $$\C = \left\{ x+iy ~|~ x,y\in \R, i^2=-1\right\}.$$

For $z=x+iy\in \C,$ we define the modulus by \[ \abs{z} := \sqrt{x^2+y^2}. \] Then for two complex numbers $z_1 = x_1 + iy_1$ and $z_2 = x_2 + iy_2,$ the distance is

$ d(z_1,z_2) = \sqrt{{\left(x_1-x_2\right)}^2+ {\left(y_1-y_2\right)}^2} =\sabs{z_1-z_2}.$



Thus, the set of complex numbers $\C$ is a metric space with the metric $$ d(z_1,z_2) = \sqrt{{\left(x_1-x_2\right)}^2+ {\left(y_1-y_2\right)}^2} =\sabs{z_1-z_2}.$$ Under the identification $z = x+iy \leftrightarrow (x,y)$, this is just the standard metric on $\R^2$, so items (1)-(4) follow from Example 10.1.2.
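For illustration only: Python's built-in complex type already provides the modulus via `abs`, so this metric is computed as `abs(z1 - z2)`; the two sample numbers below are my own.

```python
# Python's built-in complex type: abs(z) is the modulus sqrt(x^2 + y^2).
z1 = 1 + 2j
z2 = 4 - 2j

print(abs(z1 - z2))   # 5.0, since sqrt((1-4)^2 + (2-(-2))^2) = sqrt(9 + 16) = 5
```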






[Interactive figure: the Mandelbrot set, generated by iterating $z_{n+1}=z_n^2+z_0$ in $\C$.]

Example 10.1.4.

For any set $X$, define the discrete metric as \begin{equation*} d(x,y) \coloneqq \begin{cases} 1 & \text{if } x \not= y, \\ 0 & \text{if } x = y. \end{cases} \end{equation*}

Items 1-3 of the metric definition are easy to check. Try it! 📝






It remains to establish the triangle inequality \[ d(x, y) \leq d(x, z) + d(z, y). \] If $x=y $, then $d(x, y)=0$, and the inequality certainly holds.

If $x\neq y$, then $d(x, y)=1$. Since $x\neq y$, we must have either $z\neq x$ or else $z\neq y$. Thus, the right hand side is at least 1 and the triangle inequality holds in any case.
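A minimal illustrative sketch of the discrete metric in Python (the function name and sample points are my own); the example triple mirrors the case analysis above, where the right-hand side is at least 1 whenever $x \neq y$.

```python
def discrete_d(x, y):
    """Discrete metric: distance 1 between distinct points, 0 from a point to itself."""
    return 0 if x == y else 1

# If x != y, the right-hand side is at least 1, so the triangle inequality holds.
x, y, z = "a", "b", "a"
print(discrete_d(x, y) <= discrete_d(x, z) + discrete_d(z, y))  # 1 <= 0 + 1, True
```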



Example 10.1.5.

Let $C\bigl([a,b],\R\bigr)$ be the set of continuous real-valued functions on the interval $[a,b]$. Define the metric on $C\bigl([a,b],\R\bigr)$ as \begin{equation*} d(f,g) \coloneqq \sup_{x \in [a,b]} \abs{f(x)-g(x)} . \end{equation*}

Again, items 1-3 of the metric definition are easy to check. Try it! 📝




Let's prove the triangle inequality. For $f,g,h \in C\bigl([a,b],\R\bigr)$,

\begin{align*}
d(f,g) &= \sup_{x \in [a,b]} \abs{f(x)-g(x)} \\
&= \sup_{x \in [a,b]} \abs{f(x)-h(x)+h(x)-g(x)} \\
&\leq \sup_{x \in [a,b]} \bigl( \abs{f(x)-h(x)}+\abs{h(x)-g(x)} \bigr) \\
&\leq \sup_{x \in [a,b]} \abs{f(x)-h(x)}+ \sup_{x \in [a,b]} \abs{h(x)-g(x)} \\
&= d(f,h) + d(h,g) . \qquad \blacksquare
\end{align*}
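Numerically, the supremum over $[a,b]$ can only be approximated, for instance by sampling on a finite grid. The sketch below is illustrative; the helper name `sup_d`, the grid size, and the choice of $\sin$ and $\cos$ on $[0,\pi]$ are my own.

```python
import math

def sup_d(f, g, a, b, n=10_001):
    """Approximate sup_{x in [a,b]} |f(x) - g(x)| by sampling n equally spaced points."""
    xs = (a + (b - a) * k / (n - 1) for k in range(n))
    return max(abs(f(x) - g(x)) for x in xs)

# d(sin, cos) on [0, pi]; the true supremum is sqrt(2), attained at x = 3*pi/4.
print(sup_d(math.sin, math.cos, 0.0, math.pi))   # about 1.41421...
```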



10.1 Metric spaces

Theorem 10.1.2. Let $(X,d)$ be a metric space and $Y \subset X.$ Then the restriction $d|_{Y \times Y}$ is a metric on $Y.$


Definition 10.1.2. If $(X,d)$ is a metric space, $Y \subset X,$ and $d' := d|_{Y \times Y},$ then $(Y,d')$ is said to be a subspace of $(X,d).$





Definition 10.1.3. Let $(X,d)$ be a metric space. A subset $S \subset X$ is said to be bounded if there exists a $p \in X$ and a $B \in \R$ such that \begin{equation*} d(p,x) \leq B \quad \text{for all } x \in S. \end{equation*} We say $(X,d)$ is bounded if $X$ itself is a bounded subset.
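For a finite sample of points one can check the defining condition directly; the sketch below is illustrative only (the helper name, the point $p$, the bound $B$, and the sample are my own, and a genuine subset $S$ may be infinite, so this checks boundedness only of the sample).

```python
def is_bounded_sample(d, p, sample, B):
    """Check d(p, x) <= B for every x in a finite sample of the subset S."""
    return all(d(p, x) <= B for x in sample)

# The sample {-1, 2.5, 3} of R is bounded: every point is within distance 5 of p = 0.
print(is_bounded_sample(lambda x, y: abs(x - y), 0.0, [-1.0, 2.5, 3.0], B=5.0))  # True
```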


