Differential Entropy of Continuous Uniform Distribution

Theorem
Let $X \sim \operatorname U \left[{a \,.\,.\, b}\right]$ for some $a, b \in \R$, $a < b$, where $\operatorname U$ is the continuous uniform distribution.

Then the differential entropy of $X$, $h \left({X}\right)$, is given by:


 * $h \left({X}\right) = \ln \left({b - a}\right)$

Proof
From the definition of the continuous uniform distribution, $X$ has probability density function:


 * $\displaystyle f_X \left({x}\right) = \begin{cases} \frac 1 {b - a} & a \le x \le b \\ 0 & \text{otherwise} \end{cases}$
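This density can be sketched numerically; the following is a minimal Python illustration (the function name `uniform_pdf` and the sample endpoints are chosen for this example, not taken from the statement above), together with a midpoint-rule check that it integrates to $1$:

```python
def uniform_pdf(x, a, b):
    """Density of U[a .. b]: constant 1/(b - a) on [a, b], zero elsewhere."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

# Midpoint-rule check that the density integrates to 1 over [a, b].
a, b, n = 2.0, 5.0, 100_000
dx = (b - a) / n
total = sum(uniform_pdf(a + (i + 0.5) * dx, a, b) * dx for i in range(n))
print(total)  # ~ 1.0
```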

From the definition of differential entropy:


 * $\displaystyle h \left({X}\right) = - \int_{-\infty}^\infty f_X \left({x}\right) \ln f_X \left({x}\right) \rd x$
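As a sanity check on the result, this integral can be approximated directly. The sketch below (pure Python, with illustrative names and endpoints of my choosing) evaluates $- \int f_X \ln f_X \rd x$ by the midpoint rule and compares it against $\ln \left({b - a}\right)$:

```python
import math

def uniform_pdf(x, a, b):
    """Density of U[a .. b]: constant 1/(b - a) on [a, b], zero elsewhere."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

def differential_entropy(a, b, n=100_000):
    """Midpoint-rule approximation of -integral of f ln f over [a, b].
    The integrand vanishes outside the support, so only [a, b] contributes."""
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        fx = uniform_pdf(a + (i + 0.5) * dx, a, b)
        if fx > 0.0:
            total -= fx * math.log(fx) * dx
    return total

a, b = 2.0, 5.0
print(differential_entropy(a, b))  # ~ ln 3 ~ 1.0986
print(math.log(b - a))
```

Because the integrand is constant on the support, the midpoint sum here agrees with $\ln \left({b - a}\right)$ up to floating-point error.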

So, as $f_X \left({x}\right) = 0$ outside $\left[{a \,.\,.\, b}\right]$ (taking $0 \ln 0 = 0$ by convention):


 * $\displaystyle h \left({X}\right) = - \int_a^b \frac 1 {b - a} \ln \left({\frac 1 {b - a}}\right) \rd x$

Then, since $- \ln \left({\dfrac 1 {b - a}}\right) = \ln \left({b - a}\right)$:


 * $\displaystyle h \left({X}\right) = \frac {\ln \left({b - a}\right)} {b - a} \int_a^b \rd x = \frac {\ln \left({b - a}\right)} {b - a} \left({b - a}\right) = \ln \left({b - a}\right)$

$\blacksquare$