Differential Entropy of Gaussian Distribution

Theorem
Let $X \sim \Gaussian \mu {\sigma^2}$ for some $\mu \in \R$ and $\sigma \in \R_{> 0}$, where $\Gaussian \mu {\sigma^2}$ denotes the Gaussian distribution with mean $\mu$ and variance $\sigma^2$.

Then the differential entropy $\map h X$ of $X$ is given by:


 * $\map h X = \map \ln {\sigma \sqrt {2 \pi} } + \dfrac 1 2$
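
As a quick numerical sanity check, this closed form can be compared with SciPy's built-in differential entropy for the normal distribution. The following is a minimal sketch, assuming SciPy and NumPy are available; the values of $\mu$ and $\sigma$ are arbitrary illustrative choices:

```python
import numpy as np
from scipy.stats import norm

mu, sigma = 1.5, 0.7

# Closed form from the theorem: h(X) = ln(sigma * sqrt(2 * pi)) + 1/2
h_closed = np.log(sigma * np.sqrt(2 * np.pi)) + 0.5

# SciPy's differential entropy (in nats) for N(mu, sigma^2)
h_scipy = norm(loc=mu, scale=sigma).entropy()

print(h_closed, h_scipy)  # both approximately 1.0623
```

Note that the result depends only on $\sigma$, not on $\mu$: translating a density leaves its differential entropy unchanged.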

Proof
From the definition of the Gaussian distribution, $X$ has probability density function:


 * $\map {f_X} x = \dfrac 1 {\sigma \sqrt {2 \pi} } \, \map \exp {-\dfrac {\paren {x - \mu}^2} {2 \sigma^2} }$
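
This expression can be checked numerically against SciPy's implementation of the normal density, and confirmed to integrate to $1$; a minimal sketch, reusing the same arbitrary $\mu$ and $\sigma$ as above:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu, sigma = 1.5, 0.7

def f_X(x):
    # The density exactly as written in the proof
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

x = 2.3
print(f_X(x), norm(loc=mu, scale=sigma).pdf(x))  # identical values

total, _ = quad(f_X, -np.inf, np.inf)
print(total)  # approximately 1.0
```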

From the definition of differential entropy:


 * $\displaystyle \map h X = -\int_{-\infty}^\infty \map {f_X} x \, \map \ln {\map {f_X} x} \rd x$
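
Before carrying out the integral symbolically, the definition can be evaluated numerically and compared with the theorem's closed form; a sketch under the same assumptions as the snippets above (the integration limits $\mu \pm 12 \sigma$ are a pragmatic choice: the tails beyond them contribute negligibly, and avoiding $\pm \infty$ sidesteps floating-point underflow inside the logarithm):

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.5, 0.7

def f_X(x):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def integrand(x):
    # -f_X(x) * ln(f_X(x)), the integrand in the definition above
    fx = f_X(x)
    return -fx * np.log(fx)

h_numeric, _ = quad(integrand, mu - 12 * sigma, mu + 12 * sigma)
h_closed = np.log(sigma * np.sqrt(2 * np.pi)) + 0.5

print(h_numeric, h_closed)  # agree to many decimal places
```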

So: