Variance as Expectation of Square minus Square of Expectation/Continuous

Theorem

Let $X$ be a continuous random variable.

Then the variance of $X$ can be expressed as:

$\var X = \expect {X^2} - \paren {\expect X}^2$


That is, it is the expectation of the square of $X$ minus the square of the expectation of $X$.
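
For example, let $X$ be uniformly distributed on $\closedint 0 1$, so that $\map {f_X} x = 1$ for $x \in \closedint 0 1$ and $\map {f_X} x = 0$ otherwise. Then:

$\ds \expect X = \int_0^1 x \rd x = \frac 1 2$ and $\ds \expect {X^2} = \int_0^1 x^2 \rd x = \frac 1 3$

so that:

$\var X = \dfrac 1 3 - \paren {\dfrac 1 2}^2 = \dfrac 1 {12}$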


Proof

Let $\mu = \expect X$.

Let $X$ have probability density function $f_X$.


As $f_X$ is a probability density function:

$\ds \int_{-\infty}^\infty \map {f_X} x \rd x = \map \Pr {-\infty < X < \infty} = 1$


Then:

\(\ds \var X\) \(=\) \(\ds \expect {\paren {X - \mu}^2}\) Definition of Variance of Continuous Random Variable
\(\ds \) \(=\) \(\ds \int_{-\infty}^\infty \paren {x - \mu}^2 \map {f_X} x \rd x\) Definition of Expectation of Continuous Random Variable
\(\ds \) \(=\) \(\ds \int_{-\infty}^\infty \paren {x^2 - 2 \mu x + \mu^2} \map {f_X} x \rd x\)
\(\ds \) \(=\) \(\ds \int_{-\infty}^\infty x^2 \map {f_X} x \rd x - 2 \mu \int_{-\infty}^\infty x \map {f_X} x \rd x + \mu^2 \int_{-\infty}^\infty \map {f_X} x \rd x\)
\(\ds \) \(=\) \(\ds \expect {X^2} - 2 \mu^2 + \mu^2\) Definition of Expectation of Continuous Random Variable, and $\ds \int_{-\infty}^\infty \map {f_X} x \rd x = 1$ from above
\(\ds \) \(=\) \(\ds \expect {X^2} - \mu^2\)
\(\ds \) \(=\) \(\ds \expect {X^2} - \paren {\expect X}^2\)

$\blacksquare$
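

Numerical verification

The identity can also be checked numerically. The following is a minimal sketch, assuming NumPy and SciPy are available; the standard exponential distribution, with $\map {f_X} x = e^{-x}$ on $\hointr 0 \infty$, is an arbitrary choice of test case:

```python
# Check Var X = E[X^2] - (E[X])^2 against the direct definition
# E[(X - mu)^2], for the standard exponential distribution Exp(1).
import numpy as np
from scipy.integrate import quad

def f(x):
    return np.exp(-x)  # pdf of Exp(1) on [0, inf)

mu, _ = quad(lambda x: x * f(x), 0, np.inf)        # E[X], exactly 1
ex2, _ = quad(lambda x: x ** 2 * f(x), 0, np.inf)  # E[X^2], exactly 2
var_direct, _ = quad(lambda x: (x - mu) ** 2 * f(x), 0, np.inf)

print(ex2 - mu ** 2)  # 1.0: variance via the theorem
print(var_direct)     # 1.0: variance via the definition
```

Both computations agree, up to quadrature error.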