Central Limit Theorem

Theorem
Let $X_1, X_2, \ldots$ be a sequence of independent identically distributed random variables with:


 * expectation $\expect {X_i} = \mu \in \openint {-\infty} \infty$
 * variance $\var {X_i} = \sigma^2 > 0$

Let:
 * $\ds S_n = \sum_{i \mathop = 1}^n X_i$

Then:


 * $\ds \dfrac {S_n - n \mu} {\sqrt {n \sigma^2} } \xrightarrow D \map N {0, 1}$ as $n \to \infty$

that is, $\dfrac {S_n - n \mu} {\sqrt {n \sigma^2} }$ converges in distribution to a standard Gaussian random variable.
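
The theorem can be checked empirically. A minimal simulation sketch in Python, using $\operatorname {Uniform} \paren {0, 1}$ summands (an illustrative assumption; here $\mu = \frac 1 2$ and $\sigma^2 = \frac 1 {12}$):

```python
import math
import random

def standardized_sum(n, sample, mu, sigma2):
    # Draw S_n = X_1 + ... + X_n and return (S_n - n mu) / sqrt(n sigma^2)
    s_n = sum(sample() for _ in range(n))
    return (s_n - n * mu) / math.sqrt(n * sigma2)

random.seed(0)
# Uniform(0, 1) summands: mu = 1/2, sigma^2 = 1/12
draws = [standardized_sum(500, random.random, 0.5, 1.0 / 12.0)
         for _ in range(5000)]

# The empirical mean and variance should be close to 0 and 1,
# and P(U_n <= 0) close to Phi(0) = 1/2
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
frac_below_zero = sum(d <= 0 for d in draws) / len(draws)
```

Swapping `random.random` for any other distribution with finite variance (adjusting $\mu$ and $\sigma^2$ accordingly) leaves the limiting behaviour unchanged, which is precisely the content of the theorem.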

Proof
Let $Y_i = \dfrac {X_i - \mu} {\sigma}$.

We have that:
 * $\expect {Y_i} = 0$

and:
 * $\expect {Y_i^2} = 1$
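
Both moments follow directly from linearity of expectation; a one-line check:

```latex
\expect {Y_i} = \frac {\expect {X_i} - \mu} \sigma = 0,
\qquad
\expect {Y_i^2} = \var {Y_i} = \frac {\var {X_i} } {\sigma^2} = 1
```

where $\expect {Y_i^2} = \var {Y_i}$ uses $\expect {Y_i} = 0$.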

Then by Taylor's Theorem, the characteristic function of $Y_i$ can be written:


 * $\map {\phi_{Y_i} } t = 1 - \dfrac {t^2} 2 + \map o {t^2}$
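
This expansion is obtained by expanding $e^{i t Y_i}$ to second order inside the expectation and using the moments computed above:

```latex
\map {\phi_{Y_i} } t
  = \expect {e^{i t Y_i} }
  = 1 + i t \expect {Y_i} - \frac {t^2} 2 \expect {Y_i^2} + \map o {t^2}
  = 1 - \frac {t^2} 2 + \map o {t^2}
```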

Now let:
 * $\ds U_n = \dfrac {S_n - n \mu} {\sqrt {n \sigma^2} } = \dfrac 1 {\sqrt n} \sum_{i \mathop = 1}^n Y_i$

Then its characteristic function is given by:
 * $\ds \map {\phi_{U_n} } t = \paren {\map {\phi_{Y_i} } {\dfrac t {\sqrt n} } }^n = \paren {1 - \dfrac {t^2} {2 n} + \map o {\dfrac {t^2} n} }^n$

Recall that the Characteristic Function of Gaussian Distribution is given by:
 * $\map \phi t = e^{i t \mu - \frac 1 2 t^2 \sigma^2}$

For the standard Gaussian, substituting $\mu = 0$ and $\sigma = 1$:
 * $\map \phi t = e^{-\frac 1 2 t^2}$

Indeed, the characteristic functions of the sequence converge pointwise to the Characteristic Function of Gaussian Distribution:
 * $\ds \paren {1 - \dfrac {t^2} {2 n} + \map o {\dfrac {t^2} n} }^n \to e^{-\frac 1 2 t^2}$ as $n \to \infty$
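
This limit is an instance of $\paren {1 + \frac {x_n} n}^n \to e^x$ for $x_n \to x$; it can also be checked numerically (a sketch that drops the $\map o {\frac {t^2} n}$ remainder):

```python
import math

def finite_product(t, n):
    # (1 - t^2 / (2 n))^n: the characteristic function with the o(t^2 / n) term dropped
    return (1.0 - t * t / (2.0 * n)) ** n

t = 1.5
limit = math.exp(-0.5 * t * t)   # e^{-t^2 / 2}
approx = finite_product(t, 10 ** 6)
err = abs(approx - limit)        # shrinks as n grows
```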

Then Lévy's Continuity Theorem applies.

In particular, the convergence in distribution of the $U_n$ to a random variable with standard Gaussian distribution holds provided that the limiting characteristic function is continuous at $t = 0$.

But $e^{-\frac 1 2 t^2}$ is clearly continuous at $0$.

So we have that $\dfrac {S_n - n \mu} {\sqrt{n \sigma^2} }$ converges in distribution to a standard Gaussian random variable.