Weierstrass Approximation Theorem/Proof 2


Theorem

Let $f$ be a real function which is continuous on the closed interval $\Bbb I = \closedint a b$.


Then $f$ can be uniformly approximated on $\Bbb I$ by a polynomial function to any given degree of accuracy.


Proof

Without loss of generality, assume $\Bbb I = \closedint 0 1$: the general case follows by composing with the affine bijection $t \mapsto a + \paren {b - a} t$ from $\closedint 0 1$ onto $\closedint a b$, which preserves continuity and uniform approximation.

For each $n \in \N_{>0}$, let:

$\ds \map {P_n} x := \sum_{k \mathop = 0}^n \map f {\dfrac k n } \dbinom n k x^k \paren {1 - x}^{n - k}$

We shall show that $\ds \lim_{n \mathop \to \infty} \norm {P_n - f}_\infty = 0$.
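
Though not part of the formal proof, the claimed convergence can be observed numerically. The following is a minimal sketch, assuming NumPy and SciPy are available; the name bernstein_approx and the test function are illustrative, not part of this page:

```python
# Minimal sketch: evaluate the Bernstein polynomial P_n defined above.
# Assumes NumPy and SciPy; all names here are illustrative only.
import numpy as np
from scipy.stats import binom

def bernstein_approx(f, n, x):
    """P_n(x) = sum_{k=0}^n f(k/n) C(n,k) x^k (1-x)^(n-k).

    binom.pmf(k, n, x) equals C(n,k) x^k (1-x)^(n-k), so P_n(x) is a
    pmf-weighted average of the sampled values f(k/n)."""
    k = np.arange(n + 1)
    return np.sum(f(k / n) * binom.pmf(k, n, x))

# Test function: continuous on [0, 1] but not a polynomial.
f = lambda t: np.abs(t - 0.5)
xs = np.linspace(0.0, 1.0, 201)
for n in (10, 100, 1000):
    err = max(abs(bernstein_approx(f, n, x) - f(x)) for x in xs)
    print(f"n = {n:5d}   sup-norm error on grid ~ {err:.4f}")
```

The grid error shrinks as $n$ grows, illustrating the uniform convergence to be proved.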


Let $\epsilon \in \R_{>0}$.

By the Heine-Cantor Theorem, $f$ is uniformly continuous on $\Bbb I$, so there is a $\delta \in \R_{>0}$ such that:

$\forall x,y \in \Bbb I : \size {x - y} \le \delta \implies \size {\map f x - \map f y} \le \epsilon $


Let $p \in \Bbb I$.

Let $Z_n$ be a random variable such that:

$\ds n Z_n \sim \Binomial n p$

where $\Binomial n p$ denotes the binomial distribution with parameters $n$ and $p$.

Observe that:

$\ds \begin{aligned}
\expect {\map f {Z_n} } &= \sum_{k \mathop = 0}^n \map f {\dfrac k n} \map \Pr {Z_n = \dfrac k n} && \text {Definition of Expectation of Discrete Random Variable} \\
&= \sum_{k \mathop = 0}^n \map f {\dfrac k n} \map \Pr {n Z_n = k} \\
&= \sum_{k \mathop = 0}^n \map f {\dfrac k n} \dbinom n k p^k \paren {1 - p}^{n - k} && \text {Definition of Binomial Distribution} \\
&= \map {P_n} p && \text {(1)}
\end{aligned}$
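
Identity $(1)$ can be sanity-checked by Monte Carlo: sampling $n Z_n \sim \Binomial n p$ and averaging $\map f {Z_n}$ should reproduce $\map {P_n} p$. A sketch under the same illustrative assumptions as above:

```python
# Monte Carlo check of identity (1): E[f(Z_n)] = P_n(p).
# Assumes NumPy and SciPy; the setup is illustrative only.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
f = lambda t: np.abs(t - 0.5)
n, p = 50, 0.3

# Sample n*Z_n ~ Binomial(n, p), so Z_n = k/n; average f over the samples.
samples = rng.binomial(n, p, size=200_000) / n
mc_estimate = f(samples).mean()

# P_n(p) computed directly from the defining sum.
k = np.arange(n + 1)
exact = np.sum(f(k / n) * binom.pmf(k, n, p))

print(mc_estimate, exact)  # the two values agree to within Monte Carlo error
```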

Furthermore:

$\ds \begin{aligned}
\map \Pr {\size {Z_n - p} > \delta} &\le \dfrac {\expect {\size {Z_n - p}^2} } {\delta^2} && \text {Bienaymé-Chebyshev Inequality} \\
&= \dfrac {\expect {\size {n Z_n - n p}^2} } {\delta^2 n^2} && \text {Expectation is Linear} \\
&= \dfrac {\expect {\size {n Z_n - \expect {n Z_n} }^2} } {\delta^2 n^2} && \text {Expectation of Binomial Distribution} \\
&= \dfrac {\var {n Z_n} } {\delta^2 n^2} && \text {Definition of Variance} \\
&= \dfrac {p \paren {1 - p} } {\delta^2 n} && \text {Variance of Binomial Distribution} \\
&\le \dfrac 1 {4 \delta^2 n} && \text {Cauchy's Mean Theorem}
\end{aligned}$
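
This tail bound can be verified exactly, since $\map \Pr {\size {Z_n - p} > \delta}$ is a finite sum of binomial probabilities. A sketch, same illustrative assumptions as above:

```python
# Exact check of the tail bound Pr(|Z_n - p| > delta) <= 1/(4 delta^2 n).
# Assumes NumPy and SciPy; delta and p are illustrative choices.
import numpy as np
from scipy.stats import binom

delta, p = 0.05, 0.3
for n in (10, 100, 1000, 10000):
    k = np.arange(n + 1)
    tail = np.abs(k / n - p) > delta
    prob = binom.pmf(k, n, p)[tail].sum()
    print(f"n = {n:6d}   Pr = {prob:.6f}   bound = {1 / (4 * delta**2 * n):.6f}")
```

The bound is far from tight (the true tail decays exponentially in $n$), but the $1/n$ rate suffices for the proof.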


On the other hand, by the choice of $\delta$, on the event $\set {\size {Z_n - p} \le \delta}$ we have $\size {\map f {Z_n} - \map f p} \le \epsilon$, so pointwise:

$\ds \begin{aligned}
\size {\map f {Z_n} - \map f p} &\le \epsilon + \size {\map f {Z_n} - \map f p} \chi_{\set {\size {Z_n - p} > \delta} } \\
&\le \epsilon + 2 \norm f_\infty \chi_{\set {\size {Z_n - p} > \delta} }
\end{aligned}$
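
The pointwise decomposition above can be checked directly on a grid, for a concrete choice of $f$, $\epsilon$ and a matching $\delta$ (for a $1$-Lipschitz $f$, $\delta = \epsilon$ works). A sketch with illustrative values:

```python
# Grid check of the pointwise bound
#   |f(z) - f(p)| <= eps + 2 * ||f||_inf * 1{|z - p| > delta}.
# Assumes NumPy; f, eps, delta, p are illustrative (delta = eps suits 1-Lipschitz f).
import numpy as np

f = lambda t: np.abs(t - 0.5)   # ||f||_inf = 0.5 on [0, 1]
eps, p = 0.05, 0.3
delta, f_sup = eps, 0.5

zs = np.linspace(0.0, 1.0, 1001)
lhs = np.abs(f(zs) - f(p))
rhs = eps + 2 * f_sup * (np.abs(zs - p) > delta)
assert np.all(lhs <= rhs + 1e-12), "bound violated"
print("pointwise bound holds on the grid")
```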


Therefore:

$\ds \begin{aligned}
\size {\map {P_n} p - \map f p} &= \size {\expect {\map f {Z_n} } - \map f p} && \text {by } (1) \\
&= \size {\expect {\map f {Z_n} } - \expect {\map f p} } && \text {Expectation of Almost Surely Constant Random Variable} \\
&= \size {\expect {\map f {Z_n} - \map f p} } && \text {Expectation is Linear} \\
&\le \expect {\size {\map f {Z_n} - \map f p} } && \text {modulus of expectation is at most expectation of modulus} \\
&\le \epsilon + 2 \norm f_\infty \map \Pr {\size {Z_n - p} > \delta} && \text {pointwise bound above, as } \expect {\chi_A} = \map \Pr A \\
&\le \epsilon + \dfrac {\norm f_\infty} {2 \delta^2 n} && \text {Bienaymé-Chebyshev bound above}
\end{aligned}$

Thus for all $n \in \N$ such that $n > \dfrac {\norm f_\infty} {2 \delta^2 \epsilon}$ we have:

$\size {\map {P_n} p - \map f p} \le 2 \epsilon$

As $\delta$ was chosen independently of $p$, the above bound holds for every $p \in \Bbb I$ simultaneously, so:

$\forall n \in \N_{> \norm f_\infty / \paren {2 \delta^2 \epsilon} } : \norm {P_n - f}_\infty \le 2 \epsilon$

Since $\epsilon \in \R_{>0}$ was arbitrary, it follows that $\ds \lim_{n \mathop \to \infty} \norm {P_n - f}_\infty = 0$.

$\blacksquare$
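
As a numerical illustration of the quantitative conclusion, the threshold $n > \norm f_\infty / \paren {2 \delta^2 \epsilon}$ can be computed for a concrete $f$ and checked against the grid sup-norm error. A sketch, with the same illustrative choices as before (the observed error is typically far below $2 \epsilon$, since the Chebyshev step is loose):

```python
# Check the proof's threshold n > ||f||_inf / (2 delta^2 eps) for a concrete f.
# Assumes NumPy and SciPy; f, eps, delta are the illustrative choices used above.
import numpy as np
from scipy.stats import binom

f = lambda t: np.abs(t - 0.5)   # 1-Lipschitz, so delta = eps works; ||f||_inf = 0.5
eps = 0.05
delta, f_sup = eps, 0.5

n = int(f_sup / (2 * delta**2 * eps)) + 1   # smallest integer above the threshold
k = np.arange(n + 1)
xs = np.linspace(0.0, 1.0, 401)
sup_err = max(abs(np.sum(f(k / n) * binom.pmf(k, n, x)) - f(x)) for x in xs)
print(f"n = {n}, grid sup-norm error ~ {sup_err:.4f}, 2*eps = {2 * eps}")
```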


Source of Name

This entry was named for Karl Weierstrass.