Differential Entropy of Gaussian Distribution

Theorem
Let $X \sim N \left({\mu, \sigma^2}\right)$ for some $\mu \in \R, \sigma \in \R_{> 0}$, where $N$ is the Gaussian distribution.

Then the differential entropy of $X$, $h \left({X}\right)$, is given by:


 * $h \left({X}\right) = \ln \left({ \sigma \sqrt{2 \pi} }\right) + \dfrac 1 2$
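
As an informal numerical check (not part of the proof), this closed form can be compared against the differential entropy computed by SciPy: `scipy.stats.norm.entropy` returns the differential entropy in nats, and the values of $\mu$ and $\sigma$ below are arbitrary example choices.

```python
# Check ln(sigma * sqrt(2*pi)) + 1/2 against SciPy's differential entropy.
import numpy as np
from scipy.stats import norm

mu, sigma = 1.5, 0.7  # arbitrary example parameters

closed_form = np.log(sigma * np.sqrt(2 * np.pi)) + 0.5
scipy_value = norm(loc=mu, scale=sigma).entropy()  # differential entropy in nats

print(closed_form, scipy_value)  # both are approximately 1.0623
assert np.isclose(closed_form, scipy_value)
```

Note that the result does not depend on $\mu$: shifting a density does not change its differential entropy.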

Proof
From the definition of the Gaussian distribution, $X$ has probability density function:


 * $f_X \left({x}\right) = \dfrac 1 {\sigma \sqrt{2 \pi} } \, \exp \left({-\dfrac { \left({x - \mu}\right)^2} {2 \sigma^2} }\right)$
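
As an aside, this density as written can be evaluated directly and checked against `scipy.stats.norm.pdf`, and its integral over $\R$ confirmed to be $1$; this is only a sketch, reusing the arbitrary example parameters from above.

```python
# Check the stated density against scipy.stats.norm.pdf and confirm
# that it integrates to 1 over the real line.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu, sigma = 1.5, 0.7  # arbitrary example parameters

def f_X(x):
    # Gaussian density exactly as written in the statement above.
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

xs = np.linspace(mu - 5 * sigma, mu + 5 * sigma, 101)
assert np.allclose(f_X(xs), norm.pdf(xs, loc=mu, scale=sigma))

total, _ = quad(f_X, -np.inf, np.inf)
assert np.isclose(total, 1.0)
```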

From the definition of differential entropy:


 * $\displaystyle h \left({X}\right) = - \int_{-\infty}^\infty f_X \left({x}\right) \ln f_X \left({x}\right) \rd x$
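
Before evaluating this integral symbolically, it can be evaluated numerically and compared with the claimed closed form; again this is a sketch using the arbitrary example parameters from above, not part of the proof.

```python
# Numerically evaluate -integral of f_X * ln(f_X) and compare with
# the closed form ln(sigma * sqrt(2*pi)) + 1/2.
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.5, 0.7  # arbitrary example parameters

def f_X(x):
    # Gaussian density from the statement above.
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Integrate over mu +/- 20 sigma: the tail contribution beyond this range
# is negligible, and it avoids evaluating log(0) where f_X underflows.
h_numeric, _ = quad(lambda x: -f_X(x) * np.log(f_X(x)),
                    mu - 20 * sigma, mu + 20 * sigma)
h_closed = np.log(sigma * np.sqrt(2 * np.pi)) + 0.5

print(h_numeric, h_closed)  # both are approximately 1.0623
assert np.isclose(h_numeric, h_closed)
```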

So: