Expectation of Gaussian Distribution


Theorem

Let $X \sim N \paren {\mu, \sigma^2}$ for some $\mu \in \R, \sigma \in \R_{> 0}$, where $N$ is the Gaussian distribution.

Then:

$\expect X = \mu$
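As an informal illustration of the statement (not part of the theorem or its proofs), the sample mean of a large number of Gaussian draws should cluster around $\mu$. The sketch below assumes NumPy is available; the values $\mu = 1.7$ and $\sigma = 0.6$ are arbitrary choices for the check.

import numpy as np

# Draw a large Gaussian sample and compare its sample mean with mu.
# mu = 1.7 and sigma = 0.6 are arbitrary illustrative parameters.
rng = np.random.default_rng(0)
mu, sigma = 1.7, 0.6
samples = rng.normal(loc=mu, scale=sigma, size=1_000_000)
print(samples.mean())  # approximately 1.7, consistent with E[X] = mu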


Proof 1

From the definition of the Gaussian distribution, $X$ has probability density function:

$f_X \left({x}\right) = \dfrac 1 {\sigma \sqrt{2 \pi} } \, \exp \left({-\dfrac { \left({x - \mu}\right)^2} {2 \sigma^2} }\right)$

From the definition of the expected value of a continuous random variable:

$\displaystyle \expect X = \int_{-\infty}^\infty x f_X \left({x}\right) \rd x$

So:

\(\displaystyle \expect X\) \(=\) \(\displaystyle \frac 1 { \sigma \sqrt{2 \pi} } \int_{-\infty}^\infty x \exp \left({- \frac {\left({x - \mu}\right)^2} {2 \sigma^2} }\right) \rd x\)
\(\displaystyle \) \(=\) \(\displaystyle \frac {\sqrt 2 \sigma} { \sigma \sqrt{2 \pi} } \int_{-\infty}^\infty \left({\sqrt 2 \sigma t + \mu}\right) \exp \left({-t^2}\right) \rd t\) substituting $t = \dfrac {x - \mu} {\sqrt 2 \sigma}$, so that $x = \sqrt 2 \sigma t + \mu$ and $\rd x = \sqrt 2 \sigma \rd t$
\(\displaystyle \) \(=\) \(\displaystyle \frac 1 {\sqrt \pi} \left({\sqrt 2 \sigma \int_{-\infty}^\infty t \exp \left({-t^2}\right) \rd t + \mu \int_{-\infty}^\infty \exp \left({-t^2}\right) \rd t}\right)\)
\(\displaystyle \) \(=\) \(\displaystyle \frac 1 {\sqrt \pi} \left({\sqrt 2 \sigma \left[{-\frac 1 2 \exp \left({-t^2}\right)}\right]_{-\infty}^\infty + \mu \sqrt \pi}\right)\) Fundamental Theorem of Calculus, Gaussian Integral
\(\displaystyle \) \(=\) \(\displaystyle \frac {\mu \sqrt \pi} {\sqrt \pi}\) Exponential Tends to Zero and Infinity
\(\displaystyle \) \(=\) \(\displaystyle \mu\)

$\blacksquare$
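As an informal numerical cross-check of the integral evaluated above (again, not part of the proof), one can integrate $x f_X \left({x}\right)$ over $\R$ and compare the result with $\mu$. The sketch assumes NumPy and SciPy are available; $\mu = 1.7$ and $\sigma = 0.6$ are arbitrary illustrative parameters.

import numpy as np
from scipy.integrate import quad

mu, sigma = 1.7, 0.6

def integrand(x):
    # x * f_X(x), with f_X the Gaussian probability density function
    return x * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

value, error = quad(integrand, -np.inf, np.inf)
print(value)  # approximately 1.7, in agreement with E[X] = mu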


Proof 2

By Moment Generating Function of Gaussian Distribution, the moment generating function of $X$ is given by:

$M_X \left({t}\right) = \exp \left({\mu t + \dfrac 1 2 \sigma^2 t^2}\right)$

From Moment in terms of Moment Generating Function:

$\expect X = M_X' \left({0}\right)$

We have:

\(\displaystyle M_X' \left({t}\right)\) \(=\) \(\displaystyle \frac \d {\d t} \exp \left({\mu t + \dfrac 1 2 \sigma^2 t^2}\right)\)
\(\displaystyle \) \(=\) \(\displaystyle \frac \d {\d t} \left({\mu t + \frac 1 2 \sigma^2 t^2}\right) \frac \d {\d \left({\mu t + \dfrac 1 2 \sigma^2 t^2}\right)} \exp \left({\mu t + \dfrac 1 2 \sigma^2 t^2}\right)\) Chain Rule
\(\displaystyle \) \(=\) \(\displaystyle \left({\mu + \sigma^2 t}\right) \exp \left({\mu t + \dfrac 1 2 \sigma^2 t^2}\right)\) Derivative of Power, Derivative of Exponential Function

Setting $t = 0$:

\(\displaystyle M_X' \left({0}\right)\) \(=\) \(\displaystyle \left({\mu + \sigma^2 \times 0}\right) \exp \left({\mu \times 0 + \dfrac 1 2 \sigma^2 \times 0^2}\right)\)
\(\displaystyle \) \(=\) \(\displaystyle \mu \exp 0\)
\(\displaystyle \) \(=\) \(\displaystyle \mu\) Exponential of Zero

$\blacksquare$
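As an informal symbolic check of this computation (not part of the proof), the derivative of the moment generating function can be evaluated with a computer algebra system. The sketch below assumes SymPy is available.

import sympy as sp

# Symbols for the argument of the moment generating function and the parameters
t, mu, sigma = sp.symbols('t mu sigma', real=True)

# M_X(t) = exp(mu*t + sigma^2*t^2/2), as given above
M = sp.exp(mu * t + sp.Rational(1, 2) * sigma**2 * t**2)

# First moment: M_X'(0)
first_moment = sp.diff(M, t).subs(t, 0)
print(sp.simplify(first_moment))  # mu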