Kolmogorov's Law

Theorem
Let $P$ be a population.

Let $P$ have mean $\mu$ and finite variance.

Let $\sequence {X_n}_{n \mathop \ge 1}$ be a sequence of random variables forming a random sample from $P$.

Let:


 * $\ds {\overline X}_n = \frac 1 n \sum_{i \mathop = 1}^n X_i$

Then:


 * $\ds {\overline X}_n \xrightarrow {\text {a.s.} } \mu$

where $\xrightarrow {\text {a.s.} }$ denotes almost sure convergence.
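As a numerical illustration (not part of the proof), the running sample means of an i.i.d. sample can be simulated directly; the choice of the $\map {\operatorname {Exponential} } 1$ distribution (which has mean $\mu = 1$ and finite variance) and the sample size are illustrative assumptions:

```python
import random

# Sketch: running sample means X-bar_i of i.i.d. Exponential(1) draws,
# whose population mean is mu = 1, settle near mu as i grows,
# in line with the strong law.
random.seed(0)

mu = 1.0       # population mean of Exponential(1)
n = 100_000    # illustrative sample size

total = 0.0
means = []
for i in range(1, n + 1):
    total += random.expovariate(1.0)  # one draw X_i
    means.append(total / i)           # running mean X-bar_i

# The late-stage running mean should lie close to mu.
print(abs(means[-1] - mu))
```

A single simulated path such as this shows convergence along one realization, which is exactly the pathwise (almost sure) mode of convergence the theorem asserts.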

Proof
Let $\varepsilon \in \R_{>0}$.

For $k \in \N_{>0}$ let:
 * $\ell_k := \floor {\paren {1 + \varepsilon}^k }$

be the floor of $\paren {1 + \varepsilon}^k$.

Then by Chebyshev's Inequality:
 * $\ds \map \Pr {\size { {\overline X}_{\ell_k} - \mu} \ge \paren {1 + \varepsilon}^{-k/4} } \le \frac {\sigma^2 \paren {1 + \varepsilon}^{k/2} } {\ell_k}$

where $\sigma^2$ denotes the variance of $P$.

Thus:
 * $\ds \sum_{k \mathop = 1}^\infty \map \Pr {\size { {\overline X}_{\ell_k} - \mu} \ge \paren {1 + \varepsilon}^{-k/4} } < \infty$
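The convergence of this series can be checked explicitly. Chebyshev's Inequality bounds each term by $\sigma^2 \paren {1 + \varepsilon}^{k/2} / \ell_k$, and the floor function gives $\ell_k \ge \frac 1 2 \paren {1 + \varepsilon}^k$ for all $k$ beyond some $k_0$; the constant $C$ below collects the finitely many earlier terms:

```latex
\begin{aligned}
\sum_{k \ge 1} \Pr\left( \left| \overline X_{\ell_k} - \mu \right| \ge (1 + \varepsilon)^{-k/4} \right)
  &\le \sum_{k \ge 1} \frac{\sigma^2 (1 + \varepsilon)^{k/2}}{\ell_k}
    && \text{(Chebyshev's Inequality)} \\
  &\le C + \sum_{k \ge k_0} \frac{2 \sigma^2 (1 + \varepsilon)^{k/2}}{(1 + \varepsilon)^k}
    && \text{(since } \ell_k \ge \tfrac 1 2 (1 + \varepsilon)^k \text{ for } k \ge k_0 \text{)} \\
  &= C + 2 \sigma^2 \sum_{k \ge k_0} (1 + \varepsilon)^{-k/2}
    < \infty
    && \text{(geometric series, ratio } (1 + \varepsilon)^{-1/2} < 1 \text{)}
\end{aligned}
```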

By the first Borel-Cantelli Lemma:
 * $\ds \map \Pr { \limsup_{k \mathop \to \infty} \set { \size { {\overline X}_{\ell_k} - \mu} \ge \paren {1 + \varepsilon}^{-k/4} } } = 0$

That is:
 * $\ds \map \Pr { \liminf_{k \mathop \to \infty} \set { \size { {\overline X}_{\ell_k} - \mu} < \paren {1 + \varepsilon}^{-k/4} } } = 1$

In particular:
 * $\ds \map \Pr { \lim_{k \mathop \to \infty} \size { {\overline X}_{\ell_k} - \mu} = 0 } = 1$

We claim:
 * $\ds \map \Pr { \limsup_{n \mathop \to \infty} \size { {\overline X}_n - \mu} \le 2 \varepsilon \expect {\size {X_1} } } = 1$

Since $\varepsilon \in \R_{>0}$ was arbitrary, it follows that:
 * $\ds \map \Pr { \lim_{n \mathop \to \infty} {\overline X}_n = \mu } = 1$

that is:
 * $\ds {\overline X}_n \xrightarrow {\text {a.s.} } \mu$

$\blacksquare$

Also see

 * Bernoulli's Theorem, also known as the Law of Large Numbers
 * Khinchin's Law, also known as the Weak Law of Large Numbers