Central Limit Theorem


Theorem

Let $X_1, X_2, \ldots$ be a sequence of independent identically distributed random variables with:

Mean $E \left[{X_i}\right] = \mu \in \left({-\infty \,.\,.\, \infty}\right)$
Variance $V \left({X_i}\right) = \sigma^2 > 0$

Let:

$\displaystyle S_n = \sum_{i \mathop = 1}^n X_i$

Then:

$\displaystyle \frac {S_n - n \mu} {\sqrt {n \sigma^2} } \xrightarrow {D} N \left({0, 1}\right)$ as $n \to \infty$

that is, $\dfrac {S_n - n \mu} {\sqrt {n \sigma^2} }$ converges in distribution to a standard normal random variable as $n \to \infty$.
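
As an informal illustration of the statement (not part of the proof), the standardized sums can be simulated for a non-normal choice of the $X_i$ and compared against the standard normal distribution. The following is a minimal sketch in Python, assuming NumPy and SciPy are available; the exponential distribution and the sample sizes are arbitrary choices.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # X_i ~ Exponential(1), so mu = 1 and sigma^2 = 1 (an arbitrary non-normal example)
    mu, sigma2 = 1.0, 1.0
    n, trials = 1000, 20000

    # Draw 'trials' independent copies of S_n = X_1 + ... + X_n
    s_n = rng.exponential(scale=1.0, size=(trials, n)).sum(axis=1)

    # Standardize: U_n = (S_n - n * mu) / sqrt(n * sigma^2)
    u_n = (s_n - n * mu) / np.sqrt(n * sigma2)

    # The empirical distribution of U_n should be close to N(0, 1) for large n
    # (the Kolmogorov-Smirnov statistic should be small)
    print(stats.kstest(u_n, "norm"))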


Proof

Let $Y_i = \dfrac {X_i - \mu} {\sigma}$.

We have that:

$E \left[{Y_i}\right] = 0$

and:

$E \left[{Y_i^2}\right] = 1$
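
Spelled out, both follow from linearity of expectation and the definition of variance:

$E \left[{Y_i}\right] = \dfrac {E \left[{X_i}\right] - \mu} \sigma = 0 \qquad \text{and} \qquad E \left[{Y_i^2}\right] = \dfrac {E \left[{\left({X_i - \mu}\right)^2}\right]} {\sigma^2} = \dfrac {\sigma^2} {\sigma^2} = 1$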


Then, since $E \left[{Y_i}\right] = 0$ and $E \left[{Y_i^2}\right] = 1$, Taylor's Theorem allows the characteristic function of $Y_i$ to be written:

$\phi_{Y_i} \left({t}\right) = E \left[{e^{i t Y_i} }\right] = 1 + i t \, E \left[{Y_i}\right] - \dfrac {t^2} 2 E \left[{Y_i^2}\right] + o \left({t^2}\right) = 1 - \dfrac {t^2} 2 + o \left({t^2}\right)$ as $t \to 0$

Now let:

$\displaystyle U_n = \frac {S_n - n \mu} {\sqrt {n \sigma^2} } = \sum_{i \mathop = 1}^n \frac {X_i - \mu} {\sqrt {n \sigma^2} } = \frac 1 {\sqrt n} \sum_{i \mathop = 1}^n \frac {X_i - \mu} \sigma = \frac 1 {\sqrt n} \sum_{i \mathop = 1}^n Y_i$


Then its characteristic function is given by:

$\displaystyle \phi_{U_n} \left({t}\right) = E \left[{e^{i t U_n} }\right] = E \left[{\exp \left({\frac {i t} {\sqrt n} \sum_{i \mathop = 1}^n Y_i}\right)}\right] = \prod_{i \mathop = 1}^n E \left[{\exp \left({\frac {i t} {\sqrt n} Y_i}\right)}\right]$

where the product form follows because the $Y_i$ are independent.

Since the $Y_i$ are also identically distributed:

$\displaystyle \phi_{U_n} \left({t}\right) = \prod_{i \mathop = 1}^n \phi_{Y_i} \left({\frac t {\sqrt n} }\right) = \left({\phi_{Y_1} \left({\frac t {\sqrt n} }\right)}\right)^n = \left({1 - \frac {t^2} {2 n} + o \left({\frac {t^2} n}\right)}\right)^n$
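
This computation can be checked numerically. The sketch below (again only a rough illustration, assuming NumPy and the same exponential example as above) estimates $\phi_{U_n} \left({t}\right) = E \left[{e^{i t U_n} }\right]$ by Monte Carlo and compares it with $\left({1 - \frac {t^2} {2 n} }\right)^n$ and $e^{-\frac 1 2 t^2}$.

    import numpy as np

    rng = np.random.default_rng(0)

    # X_i ~ Exponential(1): mu = 1, sigma^2 = 1 (an arbitrary non-normal example)
    n, trials, t = 50, 200_000, 1.5
    s_n = rng.exponential(scale=1.0, size=(trials, n)).sum(axis=1)
    u_n = (s_n - n * 1.0) / np.sqrt(n * 1.0)

    # Monte Carlo estimate of phi_{U_n}(t) = E[exp(i t U_n)]
    phi_mc = np.mean(np.exp(1j * t * u_n))

    print(phi_mc)                     # roughly agrees with the two values below (a small imaginary part remains for finite n)
    print((1 - t**2 / (2 * n)) ** n)  # the finite-n expression from the proof, without the o-term
    print(np.exp(-t**2 / 2))          # the standard normal characteristic function at t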



Recall that the characteristic function of the standard normal distribution is given by:

$\phi \left({t}\right) = e^{-\frac 1 2 t^2}$

Indeed, the characteristic functions $\phi_{U_n}$ converge pointwise to this characteristic function:

$\left({1 - \dfrac {t^2} {2 n} + o \left({\dfrac {t^2} n}\right)}\right)^n \to e^{-\frac 1 2 t^2}$ as $n \to \infty$
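
To justify this limit, one can, for each fixed $t$, take logarithms and use $\ln \left({1 + x}\right) = x + o \left({x}\right)$ as $x \to 0$:

$\displaystyle n \ln \left({1 - \frac {t^2} {2 n} + o \left({\frac {t^2} n}\right)}\right) = n \left({-\frac {t^2} {2 n} + o \left({\frac 1 n}\right)}\right) \to -\frac {t^2} 2$ as $n \to \infty$

and exponentiating gives the claimed convergence.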

Then Lévy’s continuity theorem applies.

In particular, since the characteristic functions $\phi_{U_n}$ converge pointwise to $e^{-\frac 1 2 t^2}$, the $U_n$ converge in distribution to a random variable with the standard normal distribution provided that this limiting characteristic function is continuous at $t = 0$.

But $e^{-\frac 1 2 t^2}$ is clearly continuous at $t = 0$.

So we have that $\dfrac{S_n - n \mu} {\sqrt{n \sigma^2} }$ converges in distribution to a standard normal random variable.

$\blacksquare$
