Definition:Expectation

Definition
Let $$X$$ be a discrete random variable.

The expectation of $$X$$ is written $$E \left({X}\right)$$, and is defined as:
 * $$E \left({X}\right) \ \stackrel {\mathbf {def}} {=\!=} \ \sum_{x \in \operatorname{Im} \left({X}\right)} x \Pr \left({X = x}\right)$$

whenever the sum is absolutely convergent, i.e. when:
 * $$\sum_{x \in \operatorname{Im} \left({X}\right)} \left|{x \Pr \left({X = x}\right)}\right| < \infty$$
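As a concrete illustration of the definition, the following sketch computes the expectation of a fair six-sided die (a hypothetical example, not from the text above), summing $x \Pr \left({X = x}\right)$ over the image of $X$ with exact rational arithmetic:

```python
from fractions import Fraction

# Pr(X = x) = 1/6 for each x in Im(X) = {1, ..., 6}
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E(X) = sum over Im(X) of x * Pr(X = x)
expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 7/2
```

Since the image is finite, the sum is trivially absolutely convergent here; the convergence condition only bites when $\operatorname{Im} \left({X}\right)$ is countably infinite.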

Note that the index of summation is not actually limited to the image of $$X$$, as $$\forall x \in \R: x \notin \operatorname{Im} \left({X}\right) \implies \Pr \left({X = x}\right) = 0$$.

Hence we can express the expectation as:
 * $$E \left({X}\right) = \sum_{x \in \R} x \Pr \left({X = x}\right)$$
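The point that enlarging the index set changes nothing can be checked directly: wherever $\Pr \left({X = x}\right) = 0$, the term $x \Pr \left({X = x}\right)$ vanishes. A minimal sketch, using a hypothetical Bernoulli-type variable:

```python
from fractions import Fraction

def p_X(x):
    """Pmf of a hypothetical variable with image {0, 1}; zero elsewhere."""
    return {0: Fraction(3, 4), 1: Fraction(1, 4)}.get(x, Fraction(0))

image = [0, 1]
superset = [-1, 0, 1, 2]  # strictly larger index set

# Summing over a superset of Im(X) gives the same expectation,
# since p_X vanishes off the image.
e_image = sum(x * p_X(x) for x in image)
e_superset = sum(x * p_X(x) for x in superset)
print(e_image, e_superset)  # 1/4 1/4
```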

From the definition of probability mass function, the expectation can also be written:
 * $$E \left({X}\right) = \sum_{x \in \R} x p_X \left({x}\right)$$

The expectation of $$X$$ is also called the expected value of $$X$$ or the mean of $$X$$, and (for a given discrete random variable) is often denoted $$\mu$$.

The terminology is appropriate, as it can be seen that an expectation is an example of a normalized weighted mean. This follows from the fact that a probability mass function is a normalized weight function.
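The weighted-mean view can be made concrete: since the weights $p_X \left({x}\right)$ sum to $1$, no division by a total weight is needed. A sketch with a hypothetical skewed distribution:

```python
from fractions import Fraction

# Weights given by a pmf: a normalized weight function (they sum to 1).
weights = {1: Fraction(1, 2), 2: Fraction(1, 4), 3: Fraction(1, 4)}
assert sum(weights.values()) == 1  # normalization

# The normalized weighted mean of the values is exactly E(X):
# no denominator is needed, because the total weight is already 1.
mean = sum(x * w for x, w in weights.items())
print(mean)  # 7/4
```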

It can also be seen that the expectation of a discrete random variable is its first moment.
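The first-moment observation amounts to the identity $E \left({X}\right) = E \left({X^1}\right)$, i.e. the case $n = 1$ of the $n$-th raw moment $\sum_x x^n \Pr \left({X = x}\right)$. A minimal check, reusing a fair die as a hypothetical example:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def moment(n):
    # n-th raw moment: E(X^n) = sum over Im(X) of x^n * Pr(X = x)
    return sum(x**n * p for x, p in pmf.items())

# The first moment coincides with the expectation.
print(moment(1))  # 7/2
```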