Definition:Entropy (Probability Theory)

Definition

Let $X$ be a discrete random variable taking values in $\set {x_1, x_2, \ldots, x_n}$ with probability mass function $\map p {x_i}$.

Then the entropy of $X$ is:

$\ds \map H X := -\sum_{i \mathop = 1}^n \map p {x_i} \log_2 \map p {x_i}$

and is measured in units of bits.
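As a quick illustration (a fair coin flip, not part of the definition itself): if $X$ takes two values each with probability $\dfrac 1 2$, then:

$\ds \map H X = -\paren {\dfrac 1 2 \log_2 \dfrac 1 2 + \dfrac 1 2 \log_2 \dfrac 1 2} = 1$

so a fair coin flip carries exactly $1$ bit of entropy, while a random variable concentrated on a single value has entropy $0$.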


By convention, $0 \log_2 0 = 0$; this is justified by the fact that $\ds \lim_{x \mathop \to 0^+} x \log_2 x = 0$.
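A minimal computational sketch of this definition in Python (the function name `entropy` is an illustrative choice, not part of the definition); note how the $0 \log_2 0 = 0$ convention is handled explicitly by skipping zero-probability terms:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits of a discrete distribution.

    `probabilities` is a sequence p(x_1), ..., p(x_n) summing to 1.
    Terms with p(x_i) = 0 are skipped, implementing the convention
    0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin has entropy 1 bit; a certain outcome has entropy 0.
assert entropy([0.5, 0.5]) == 1.0
assert entropy([1.0, 0.0]) == 0.0
```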