Definition:Uncertainty

Definition
Let $X$ be a random variable which takes a finite number of values with probabilities $p_1, p_2, \dotsc, p_n$.

The uncertainty of $X$ is defined to be:

 * $\displaystyle \map H X = -\sum_k p_k \lg p_k$

where:
 * $\lg$ denotes logarithm base $2$
 * the summation is over those $k$ where $p_k > 0$.
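The definition above can be sketched directly in code. The following is a minimal Python illustration (the function name `uncertainty` is chosen here for illustration); note that terms with $p_k = 0$ are omitted from the sum, matching the convention in the definition:

```python
import math

def uncertainty(probs):
    """Uncertainty (entropy) of a finite probability distribution, in bits.

    Terms with p_k = 0 are skipped, matching the convention that the
    summation runs only over those k where p_k > 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

For example, a fair coin with $p_1 = p_2 = \dfrac 1 2$ has uncertainty $-\paren {\dfrac 1 2 \lg \dfrac 1 2 + \dfrac 1 2 \lg \dfrac 1 2} = 1$ bit, while a certain outcome with $p_1 = 1$ has uncertainty $0$.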

Also known as
The uncertainty of a random variable is also known as its entropy.