Definition:Variance/Discrete

Definition
Let $$X$$ be a discrete random variable.

Then the variance of $$X$$, written $$\operatorname{var} \left({X}\right)$$, is a measure of how much the values of $$X$$ vary from the expectation $$E \left({X}\right)$$, and is defined as:
 * $$\operatorname{var} \left({X}\right) = E \left({\left({X - E \left({X}\right)}\right)^2}\right)$$

That is: it is the expectation of the squares of the distances from the expectation.

Using $$\mu = E \left({X}\right)$$, we can consider $$\left({X - \mu}\right)^2$$ as a function of $$X$$ and apply Expectation of Function of Discrete Random Variable, to obtain:
 * $$\operatorname{var} \left({X}\right) = \sum_{x \in \Omega_X} \left({x - \mu}\right)^2 \Pr \left({X = x}\right)$$

where $$\Omega_X$$ is the image of $$X$$.
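The sum above can be evaluated directly for any finite discrete distribution. As a minimal illustrative sketch (the function name and the choice of a fair die are my own, not part of the definition), a distribution given as a mapping from each $$x \in \Omega_X$$ to $$\Pr \left({X = x}\right)$$ yields:

```python
from fractions import Fraction

def variance(dist):
    """Variance of a discrete random variable.

    `dist` maps each value x in the image Omega_X to Pr(X = x).
    """
    # mu = E(X) = sum of x * Pr(X = x)
    mu = sum(x * p for x, p in dist.items())
    # var(X) = E((X - mu)^2) = sum of (x - mu)^2 * Pr(X = x)
    return sum((x - mu) ** 2 * p for x, p in dist.items())

# Example: a fair six-sided die, Omega_X = {1, ..., 6}, each with probability 1/6
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(variance(die))  # 35/12
```

Using exact `Fraction` probabilities avoids floating-point rounding; here $$\mu = \frac 7 2$$ and $$\operatorname{var} \left({X}\right) = \frac {35} {12}$$.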

Comment
For a given discrete random variable, it is useful to know how far its values can stray from its "central" value. The variance gives a convenient way of measuring this.

For a given value $$x \in \Omega_X$$, when $$x$$ is fairly close to the expected value, then $$\left|{x - E \left({X}\right)}\right|$$ tends to be small, and so will $$\left({x - E \left({X}\right)}\right)^2$$.

But if $$x$$ is further away from the expected value, then $$\left|{x - E \left({X}\right)}\right|$$, and therefore $$\left({x - E \left({X}\right)}\right)^2$$, will be bigger.

Adding up all those squared distances, each weighted by its probability, gives the expected value of the squared distance of a value of $$X$$ from the expected value.

Hence the motivation behind this definition.