Definition:Entropy (Probability Theory)

Let $$X$$ be a discrete random variable taking values in $$\{x_1, x_2, \ldots, x_n\}$$ with probability mass function $$p(x_i)$$.

Then the entropy of $$X$$ is:

 * $$H(X) := -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)$$

and is measured in units of bits.

By convention $$0 \log_2 0 = 0$$, which is justified since $$\lim_{x \to 0^+} x \log_2 x = 0$$.
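As a computational aside, here is a minimal Python sketch of the definition (the function name `entropy` is illustrative, not a library API); it skips zero-probability terms, matching the convention above:

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy, in bits, of a discrete distribution.

    Zero-probability terms are skipped, matching the convention
    0 log2 0 = 0.
    """
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))    # biased coin: ~0.469 bits
print(entropy([1.0, 0.0]))    # degenerate case: 0 bits (may print as -0.0)
```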

Note
The logarithm may be taken to a base other than $$2$$; for example, base $$e$$ gives entropy measured in nats, and base $$10$$ in hartleys.

By Change of Base of Logarithm:
 * $$\log_b p = \log_b a \cdot \log_a p$$

a different choice of base amounts merely to a change of scale, since $$H_b(X) = \log_b a \cdot H_a(X)$$, where $$H_a$$ denotes entropy computed with base-$$a$$ logarithms.
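To illustrate this rescaling numerically, the following sketch (the helper names `entropy_bits` and `entropy_nats` are illustrative) computes the same entropy in bits and in nats and checks that the two differ only by the constant factor $$\log_e 2$$:

```python
from math import log, log2

def entropy_bits(probabilities):
    """Entropy with base-2 logarithms (bits)."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

def entropy_nats(probabilities):
    """Entropy with natural logarithms (nats)."""
    return -sum(p * log(p) for p in probabilities if p > 0)

p = [0.5, 0.25, 0.25]
h_bits = entropy_bits(p)   # 1.5
h_nats = entropy_nats(p)   # ~1.0397

# Changing the base only rescales the result: H_nats = H_bits * log_e(2).
assert abs(h_nats - h_bits * log(2)) < 1e-12
```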