Definition:Expectation

Definition
Let $X$ be a discrete random variable.

The expectation of $X$ is written $\operatorname E \paren X$, and is defined as:
 * $\displaystyle \operatorname E \paren {X} := \sum_{x \mathop \in \image X} x \Pr \paren {X = x}$

whenever the sum is absolutely convergent, that is, when:
 * $\displaystyle \sum_{x \mathop \in \image X} \size {x \Pr \paren {X = x} } < \infty$

where $\Pr \paren {X = x}$ is the probability mass function of $X$.
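The absolute convergence requirement is not vacuous. A standard illustration (an assumed example, not part of the definition above) is a random variable whose defining sum converges conditionally but not absolutely, so its expectation is undefined:

```latex
% Let X take the value x_k = (-1)^{k + 1} 2^k / k
% with probability \Pr \paren {X = x_k} = 2^{-k}, for k \ge 1.
% The probabilities sum to 1, and:
\sum_{k \mathop \ge 1} x_k \Pr \paren {X = x_k}
  = \sum_{k \mathop \ge 1} \frac {\paren {-1}^{k + 1} } k
  = \ln 2
% but the sum of absolute values is the harmonic series, which diverges:
\sum_{k \mathop \ge 1} \size {x_k \Pr \paren {X = x_k} }
  = \sum_{k \mathop \ge 1} \frac 1 k
  = \infty
```

Hence for this $X$ the expectation does not exist, even though the unordered sum can be made to converge under one particular ordering of the terms.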

Note that the index of summation need not be restricted to the image of $X$, as:

 * $\forall x \in \R: x \notin \image X \implies \Pr \paren {X = x} = 0$

Hence we can express the expectation as:
 * $\displaystyle \operatorname E \paren X := \sum_{x \mathop \in \R} x \Pr \paren {X = x}$

From the definition of the probability mass function, this can equivalently be written:
 * $\displaystyle \operatorname E \paren X := \sum_{x \mathop \in \R} x p_X \paren x$
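This last form lends itself directly to computation. As a concrete sketch (the fair-die distribution is an assumed example, not taken from the text above), the defining sum $\sum_x x \, p_X \paren x$ can be evaluated from any finitely supported probability mass function:

```python
from fractions import Fraction

def expectation(pmf):
    """Compute E(X) = sum of x * p_X(x) over the support of X,
    where pmf maps each value x to its probability p_X(x)."""
    return sum(x * p for x, p in pmf.items())

# Fair six-sided die: p_X(x) = 1/6 for x in {1, ..., 6}.
die = {x: Fraction(1, 6) for x in range(1, 7)}

print(expectation(die))  # 7/2, i.e. the mean mu = 3.5
```

For a finite image the sum is trivially absolutely convergent, so no convergence check is needed here; exact rational arithmetic via `Fraction` merely avoids floating-point rounding.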

Also known as
The expectation of $X$ is also called the expected value of $X$ or the mean of $X$, and (for a given discrete random variable) is often denoted $\mu$.

The terminology is appropriate, as it can be seen that an expectation is an example of a normalized weighted mean. This follows from the fact that a probability mass function is a normalized weight function.

Also see
It can also be seen that the expectation of a discrete random variable is its first moment.

Linguistic Note
The word expectation is not to be confused with the similar-sounding expectoration, which means something else entirely.