Bienaymé-Chebyshev Inequality/Proof 1

Proof
Let $f$ be the function:


 * $\map f x = \begin{cases} k^2\sigma^2 & : \size {x - \mu} \ge k\sigma \\

0 & : \text{otherwise} \end{cases}$

By construction, we see that $\map f x \le \size {x - \mu}^2 = \paren {x - \mu}^2$ for all $x$: when $\size {x - \mu} \ge k\sigma$ we have $\paren {x - \mu}^2 \ge k^2\sigma^2 = \map f x$, and otherwise $\map f x = 0 \le \paren {x - \mu}^2$.

By monotonicity of expectation, this means that $\expect {\map f X} \le \expect {\paren {X - \mu}^2}$.

From the definition of variance,


 * $\expect {\paren {X - \mu}^2} = \var X = \sigma^2$

From the definition of expectation of a discrete random variable, we can show that:


 * $\expect {\map f X} = k^2\sigma^2 \map \Pr {\size {X - \mu} \ge k\sigma} + 0 \cdot \map \Pr {\size {X - \mu} < k\sigma} = k^2\sigma^2 \map \Pr {\size {X - \mu} \ge k\sigma}$
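The computation above can be checked numerically. The following sketch uses a hypothetical example, a fair six-sided die, and an arbitrary choice $k = 1.2$, to confirm that $\expect {\map f X}$ computed directly from the definition of discrete expectation equals $k^2\sigma^2 \map \Pr {\size {X - \mu} \ge k\sigma}$:

```python
# Hypothetical example: X is a fair six-sided die (values 1..6, each with
# probability 1/6); k = 1.2 is an arbitrary illustrative choice.
mu = 3.5
var = sum((x - mu) ** 2 for x in range(1, 7)) / 6  # sigma^2 = 35/12
sigma = var ** 0.5
k = 1.2

def f(x):
    """The two-valued function from the proof: k^2 sigma^2 on the tail event, 0 otherwise."""
    return k * k * var if abs(x - mu) >= k * sigma else 0.0

# E[f(X)] directly from the definition of discrete expectation
ef = sum(f(x) for x in range(1, 7)) / 6

# The same quantity as k^2 sigma^2 * Pr(|X - mu| >= k sigma)
p = sum(1 for x in range(1, 7) if abs(x - mu) >= k * sigma) / 6

assert abs(ef - k * k * var * p) < 1e-12
```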

Putting this together, we have:


 * $\expect {\map f X} \le \expect {\paren {X - \mu}^2} \leadsto k^2\sigma^2 \map \Pr {\size {X - \mu} \ge k\sigma} \le \sigma^2$

Since $k > 0$ and $\sigma > 0$, we may divide both sides by $k^2\sigma^2$ to obtain:


 * $\map \Pr {\size {X - \mu} \ge k\sigma} \le \dfrac 1 {k^2}$
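The final bound can be sanity-checked by Monte Carlo simulation. The sketch below samples from an exponential distribution with rate $1$ (a hypothetical choice, for which $\mu = \sigma = 1$) and confirms that the empirical tail probability stays below $1/k^2$ for several values of $k$:

```python
import random

# Monte Carlo check of Pr(|X - mu| >= k sigma) <= 1/k^2 for a
# hypothetical distribution: exponential with rate 1, so mu = sigma = 1.
random.seed(0)
n = 100_000
sample = [random.expovariate(1.0) for _ in range(n)]
mu, sigma = 1.0, 1.0

for k in (1.5, 2.0, 3.0):
    p = sum(1 for x in sample if abs(x - mu) >= k * sigma) / n
    # The empirical tail probability must respect the Chebyshev bound
    assert p <= 1 / k**2
```

The bound is loose here: for this distribution the true tail probabilities are far smaller than $1/k^2$, which is typical, as Chebyshev's inequality holds for every distribution with finite variance.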