Definition:Uncertainty


Definition

Let $X$ be a random variable.

Let $X$ take a finite number of values with probabilities $p_1, p_2, \dotsc, p_n$.


The uncertainty of $X$ is defined to be:

$\displaystyle \map H X = -\sum_k p_k \lg p_k$

where:

$\lg$ denotes the logarithm to base $2$
the summation is over all $k$ such that $p_k > 0$.
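
For example, let $X$ be the result of the toss of a fair coin, so that $X$ takes each of $2$ values with probability $\dfrac 1 2$.

Then:

$\map H X = -\dfrac 1 2 \lg \dfrac 1 2 - \dfrac 1 2 \lg \dfrac 1 2 = \lg 2 = 1$

That is, the uncertainty of a fair coin toss is $1$ bit.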


Also known as

The uncertainty of a random variable is also known as its entropy.


Examples

Example 1

Let $R_1$ and $R_2$ be horse races.


Let $R_1$ have $7$ runners:

$3$ of which each have probability $\dfrac 1 6$ of winning
$4$ of which each have probability $\dfrac 1 8$ of winning.


Let $R_2$ have $8$ runners:

$2$ of which each have probability $\dfrac 1 4$ of winning
$6$ of which each have probability $\dfrac 1 {12}$ of winning.


Then $R_1$ and $R_2$ have equal uncertainty.
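
This can be verified directly from the definition:

$\map H {R_1} = -3 \times \dfrac 1 6 \lg \dfrac 1 6 - 4 \times \dfrac 1 8 \lg \dfrac 1 8 = \dfrac 1 2 \lg 6 + \dfrac 3 2 = 2 + \dfrac 1 2 \lg 3$

$\map H {R_2} = -2 \times \dfrac 1 4 \lg \dfrac 1 4 - 6 \times \dfrac 1 {12} \lg \dfrac 1 {12} = 1 + \dfrac 1 2 \lg 12 = 2 + \dfrac 1 2 \lg 3$

Hence $\map H {R_1} = \map H {R_2} = 2 + \dfrac 1 2 \lg 3 \approx 2.79$ bits.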

