Definition:Differential Entropy


Definition

Differential entropy extends the concept of entropy to continuous random variables.

Let $X$ be a continuous random variable.

Let $X$ have probability density function $f_X$.

Then the differential entropy of $X$, denoted $\map h X$ and measured in nats, is given by:

$\ds \map h X = -\int_{-\infty}^\infty \map {f_X} x \ln \map {f_X} x \rd x$


Where $\map {f_X} x = 0$, we take $\map {f_X} x \ln \map {f_X} x = 0$ by convention.
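
For example, let $X$ be uniformly distributed on the closed interval $\closedint 0 a$ for some $a > 0$, so that $\map {f_X} x = \dfrac 1 a$ on $\closedint 0 a$ and $0$ elsewhere. Then:

$\ds \map h X = -\int_0^a \frac 1 a \ln \frac 1 a \rd x = \ln a$

Note that $\ln a < 0$ for $a < 1$: unlike the entropy of a discrete random variable, differential entropy can be negative.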


Also see

  • Results about differential entropy can be found here.

