Contraction Theorem

Keywords: fixed point with error bounds, Banach’s Fixed Point Theorem

1. Definitions

$\textbf{Fixed point}$: A fixed point of a mapping $T: X \to X$ of a set $X$ into itself is an $x \in X$ that is mapped onto itself, that is, $T(x) = x$.

$\textbf{Contraction}$: Let $(X, d)$ denote a metric space. A mapping $T: X \to X$ is called a contraction on $X$ if there exists a positive constant $K < 1$ such that $d(T(x), T(y)) \leq K\,d(x, y)$ for all $x, y \in X$.
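For example, $T(x) = \frac{x}{2} + 1$ on $\mathbb{R}$ with the usual metric satisfies $|T(x) - T(y)| = \frac{1}{2}|x - y|$, so it is a contraction with $K = \frac{1}{2}$; its unique fixed point is $x = 2$.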

2. Banach’s Fixed Point Theorem

Let $(X, d)$ denote a complete metric space (recall: every Cauchy sequence in $X$ converges in $X$) and let $T: X \to X$ be a contraction on $X$. Then $T$ has a unique fixed point $x \in X$ (such that $T(x) = x$).

Proof:

We first prove uniqueness; existence will follow from the iteration constructed in Section 3. Suppose $x$ and $\tilde{x}$ are both fixed points of $T$.

$d(x, \tilde{x}) = d(T(x), T(\tilde{x})) \leq K\,d(x, \tilde{x})$. Since $0 < K < 1$, this forces $d(x, \tilde{x}) = 0$, i.e., $x = \tilde{x}$. This proves uniqueness.

3. Iterations and error bounds

Let’s start with any $x_0 \in X$, and define the sequence $(x_n)$ by $x_{n+1} = T(x_n)$.

$d(x_{m+1}, x_{m}) = d(T(x_{m}), T(x_{m-1})) \leq K\,d(x_{m}, x_{m-1}) \leq \dots \leq K^m\, d(x_1, x_0)$

By the triangle inequality and the geometric series, we obtain the generalized distance inequality for any $m > n$:

$d(x_m, x_n) \leq d(x_m, x_{m-1}) + \dots + d(x_{n+1}, x_n) \leq (K^{m-1} + \dots + K^{n})\, d(x_1, x_0) \leq \frac{K^n}{1-K}\, d(x_1, x_0)$

Since $0 < K < 1$, the right-hand side tends to $0$ as $n \to \infty$, so $(x_n)$ is Cauchy. By completeness, $x_n \to x$ for some $x \in X$, and since $T$ is continuous (it is Lipschitz), $T(x) = \lim T(x_n) = \lim x_{n+1} = x$. This proves existence.
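As a concrete illustration, here is a minimal Python sketch of this iteration, assuming the standard example $T(x) = \cos x$ on $[0, 1]$, which is a contraction there with $K = \sin 1 \approx 0.8415$; the function name `fixed_point_iteration` and the tolerance are illustrative choices, not part of the theorem.

```python
import math

def fixed_point_iteration(T, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{n+1} = T(x_n) until successive iterates differ by less than tol."""
    x = x0
    for _ in range(max_iter):
        x_next = T(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# T(x) = cos(x) maps [0, 1] into [cos 1, 1] and is a contraction there
# with K = sin(1) ~ 0.8415, since |cos'(x)| = |sin(x)| <= sin(1) on [0, 1].
x_star = fixed_point_iteration(math.cos, 1.0)
print(x_star, math.cos(x_star))  # both ~0.7390851, the unique fixed point
```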

$\textbf{Prior estimation of error bounds}$: Note that $d(x_m, x) \leq \frac{K^m}{1-K}\, d(x_1, x_0)$, where $x$ is the fixed point of the contraction $T$.

Proof:

$d(x_m, x) = d(T(x_{m-1}), T(x)) \leq K\,d(x_{m-1}, x) \leq K\,(d(x_{m-1}, x_m) + d(x_m, x))$

Rearranging gives $(1 - K)\,d(x_m, x) \leq K\,d(x_{m-1}, x_m)$, and since $d(x_{m-1}, x_m) \leq K^{m-1}\, d(x_1, x_0)$,

$d(x_m, x) \leq \frac{K}{1-K}\,d(x_{m-1}, x_m) \leq \frac{K^m}{1-K}\,d(x_1, x_0)$

$\textbf{Posterior estimation of error bounds}$: the intermediate inequality above directly gives

$d(x_m, x) \leq \frac{K}{1-K}d(x_{m-1}, x_{m})$
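To see both bounds in action, here is a sketch (again assuming $T = \cos$ on $[0, 1]$ with $K = \sin 1$; the helper `iterate_with_bounds` is a hypothetical name) that uses the posterior bound as the stopping rule and compares it with the prior bound:

```python
import math

K = math.sin(1.0)  # contraction constant for T = cos on [0, 1]

def iterate_with_bounds(T, x0, tol):
    """Run x_{n+1} = T(x_n); stop when the posterior bound
    K/(1-K) * d(x_{m-1}, x_m) drops below tol."""
    x_prev, x = x0, T(x0)
    d01 = abs(x - x0)  # d(x_1, x_0), used by the prior bound
    m = 1
    while K / (1 - K) * abs(x - x_prev) >= tol:
        x_prev, x = x, T(x)
        m += 1
    prior = K**m / (1 - K) * d01               # prior bound after m steps
    posterior = K / (1 - K) * abs(x - x_prev)  # posterior bound
    return x, m, prior, posterior

x_star, m, prior, posterior = iterate_with_bounds(math.cos, 1.0, 1e-10)
print(m, prior, posterior)  # the posterior bound is never worse than the prior one
```

The prior bound is also useful before running the iteration: it guarantees $d(x_m, x) \leq \varepsilon$ once $m \geq \log\big(\varepsilon(1-K)/d(x_1, x_0)\big) / \log K$, so the number of required steps can be predicted from $d(x_1, x_0)$ alone.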
