Variance as Expectation of Square minus Square of Expectation

Theorem
Let $X$ be a random variable.

Then the variance of $X$ can be expressed as:
 * $\var X = \expect {X^2} - \paren {\expect X}^2$

That is, it is the expectation of the square of $X$ minus the square of the expectation of $X$.
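The identity follows directly from the first-principles definition of variance by expanding the square and using the linearity of expectation (noting that $\expect X$ is a constant):

```latex
\begin{align}
\var X &= \expect {\paren {X - \expect X}^2} \\
&= \expect {X^2 - 2 X \expect X + \paren {\expect X}^2} \\
&= \expect {X^2} - 2 \expect X \, \expect X + \paren {\expect X}^2 \\
&= \expect {X^2} - \paren {\expect X}^2
\end{align}
```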

Comment
This is a significantly more convenient expression for the variance than the first-principles definition. In particular, it is far easier to program a computer to calculate it, as there is no need to maintain a record of all the deviations from the mean. It is therefore by far the more usually encountered of the two forms.
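To illustrate the computational convenience, here is a minimal one-pass sketch in Python: it accumulates only the running sum and sum of squares, then applies the identity above. (The function name and structure are illustrative, not from the source; note that for large datasets this form can suffer catastrophic cancellation, so numerically robust methods such as Welford's algorithm are often preferred in practice.)

```python
def variance(xs):
    """Population variance via Var X = E[X^2] - (E X)^2.

    Single pass: no record of individual deviations is needed,
    only the running sum and the running sum of squares.
    """
    n = 0
    total = 0.0
    total_sq = 0.0
    for x in xs:
        n += 1
        total += x
        total_sq += x * x
    mean = total / n
    return total_sq / n - mean * mean


# Example: for the data [1, 2, 3, 4], the mean is 2.5,
# E[X^2] = 30/4 = 7.5, so the variance is 7.5 - 6.25 = 1.25.
print(variance([1, 2, 3, 4]))  # → 1.25
```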

Also see

 * Sum of Squared Deviations from Mean