Definition:Differential Entropy

Definition
Differential entropy extends the concept of entropy to continuous random variables.

Let $X$ be a continuous random variable with probability density function $f_X$.

Then the differential entropy of $X$, denoted $h \left({X}\right)$ and measured in nats, is given by:


 * $\displaystyle h \left({X}\right) = -\int_{-\infty}^\infty f_X \left({x}\right) \ln f_X \left({x}\right) \rd x$

At points where $f_X \left({x}\right) = 0$, we take $f_X \left({x}\right) \ln f_X \left({x}\right) = 0$ by convention, consistent with $\displaystyle \lim_{t \mathop \to 0^+} t \ln t = 0$.
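For example, if $X$ is uniformly distributed on $\left[{0 \,.\,.\, a}\right]$, so that $f_X \left({x}\right) = \dfrac 1 a$ on that interval and $0$ elsewhere, then:


 * $\displaystyle h \left({X}\right) = -\int_0^a \frac 1 a \ln \frac 1 a \rd x = \ln a$

Note that, unlike the entropy of a discrete random variable, this can be negative: $\ln a < 0$ whenever $a < 1$.

The definition can also be checked numerically. The following is a minimal sketch, assuming NumPy and SciPy are available, which approximates $h \left({X}\right)$ for a standard normal random variable by quadrature and compares it with the known closed form $\dfrac 1 2 \ln \left({2 \pi e}\right)$; the helper name `differential_entropy` is chosen here purely for illustration.

```python
# Minimal numerical sketch: approximate h(X) = -int f(x) ln f(x) dx by quadrature.
# Assumes NumPy and SciPy are installed; `differential_entropy` below is an
# illustrative helper, not a standard API.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def differential_entropy(pdf, lo=-np.inf, hi=np.inf):
    """Approximate h(X) = -integral of f(x) ln f(x) dx, in nats."""
    def integrand(x):
        f = pdf(x)
        # Convention: take f ln f = 0 wherever f = 0.
        return -f * np.log(f) if f > 0 else 0.0
    value, _ = quad(integrand, lo, hi)
    return value

h_numeric = differential_entropy(norm.pdf)  # standard normal N(0, 1)
h_exact = 0.5 * np.log(2 * np.pi * np.e)    # closed form: (1/2) ln(2 pi e)
print(h_numeric, h_exact)                   # both approximately 1.4189 nats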