From ProofWiki


Let $X$ be a discrete random variable.

Then the variance of $X$, written $\var X$, is a measure of how much the values of $X$ vary from the expectation $\expect X$, and is defined as:

Definition 1

$\var X := \expect {\paren {X - \expect X}^2}$

That is: it is the expectation of the squares of the deviations from the expectation.
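As a quick numerical sketch of Definition 1 (the fair-die example below is an illustration of my own, not part of the definition), the variance of a fair six-sided die can be computed directly as the expectation of the squared deviations:

```python
from fractions import Fraction

# Fair six-sided die: each outcome 1..6 has probability 1/6.
outcomes = range(1, 7)
p = Fraction(1, 6)

# Expectation E[X].
mu = sum(x * p for x in outcomes)  # 7/2

# Definition 1: Var(X) = E[(X - E[X])^2],
# the expectation of the squares of the deviations from the expectation.
var = sum((x - mu) ** 2 * p for x in outcomes)  # 35/12
```

Using `Fraction` keeps the arithmetic exact, so the result is the rational number $\dfrac {35} {12}$ rather than a floating-point approximation.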

Definition 2

Using $\mu = \expect X$, we can consider $\paren {X - \mu}^2$ as a function of $X$ and apply Expectation of Function of Discrete Random Variable, to obtain:

$\ds \var X := \sum_{x \mathop \in \Omega_X} \paren {x - \mu}^2 \map \Pr {X = x}$


where:
$\mu := \expect X$ is the expectation of $X$
$\Omega_X$ is the image of $X$
$\map \Pr {X = x}$ is the probability mass function of $X$.
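Definition 2 can be evaluated directly as a weighted sum over the image of $X$. The following sketch uses a small hypothetical probability mass function (the values are assumptions chosen for illustration, not taken from this page):

```python
from fractions import Fraction

# A hypothetical pmf on Omega_X = {0, 1, 2} (illustrative values):
#   Pr(X = 0) = 1/2,  Pr(X = 1) = 3/10,  Pr(X = 2) = 1/5.
pmf = {0: Fraction(1, 2), 1: Fraction(3, 10), 2: Fraction(1, 5)}

# mu := E[X], by Expectation of Function of Discrete Random Variable.
mu = sum(x * pr for x, pr in pmf.items())  # 7/10

# Definition 2: Var(X) = sum over x in Omega_X of (x - mu)^2 * Pr(X = x).
var = sum((x - mu) ** 2 * pr for x, pr in pmf.items())  # 61/100
```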

Definition 3

Far easier to work with than the above definitions is the result:

$\var X := \expect {X^2} - \paren {\expect X}^2$

Also denoted as

In contexts where the standard deviation is of interest, the variance is often denoted ${\sigma^2}_X$.

Some sources present it as ${\sigma_X}^2$.

Others do not bother with such niceties, and merely present it as $\sigma^2_X$.


For a given discrete random variable, it is useful to know how far its values can stray from its "central" value. The variance gives a convenient way of measuring this.

For a given value $x \in \Omega_X$, when $x$ is fairly close to the expected value, $\size {x - \expect X}$ tends to be small, and so will $\paren {x - \expect X}^2$.

But if $x$ is further away from the expected value, then $\size {x - \expect X}$, and therefore $\paren {x - \expect X}^2$, will be bigger.

Taking the expectation of those squared distances, each weighted by its probability, then gives a single number measuring how far from the expected value a typical value of $X$ is going to be.

Hence the motivation behind this definition.


Technical Note

The $\LaTeX$ code for \(\var {X}\) is \var {X}.

When the argument is a single character, it is usual to omit the braces:

\var X