Variance of Gamma Distribution/Proof 1


Theorem

Let $X \sim \map \Gamma {\alpha, \beta}$ for some $\alpha, \beta > 0$, where $\Gamma$ is the Gamma distribution.

The variance of $X$ is given by:

$\var X = \dfrac \alpha {\beta^2}$


Proof

From the definition of the Gamma distribution, $X$ has probability density function:

$\map {f_X} x = \dfrac {\beta^\alpha x^{\alpha - 1} e^{-\beta x} } {\map \Gamma \alpha}$ for $x > 0$, and $\map {f_X} x = 0$ otherwise.
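As a quick numerical sanity check (not part of the proof), the density above can be coded directly and integrated by a Riemann sum; the particular values $\alpha = 3$, $\beta = 2$ and the truncation of the integral at $x = 50$ are arbitrary choices for illustration:

```python
import math

def gamma_pdf(x, alpha, beta):
    # f_X(x) = beta^alpha * x^(alpha - 1) * exp(-beta * x) / Gamma(alpha), for x > 0
    return beta ** alpha * x ** (alpha - 1) * math.exp(-beta * x) / math.gamma(alpha)

# Left Riemann sum of the density over (0, 50]; the tail beyond 50 is negligible here
alpha, beta = 3.0, 2.0
h = 0.001
total = sum(gamma_pdf(i * h, alpha, beta) * h for i in range(1, 50001))
print(total)  # should be close to 1, confirming f_X is a probability density
```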

From Variance as Expectation of Square minus Square of Expectation:

$\ds \var X = \int_0^\infty x^2 \map {f_X} x \rd x - \paren {\expect X}^2$

So:

\(\ds \var X\) \(=\) \(\ds \frac {\beta^\alpha} {\map \Gamma \alpha} \int_0^\infty x^{\alpha + 1} e^{-\beta x} \rd x - \paren {\frac \alpha \beta}^2\) Expectation of Gamma Distribution
\(\ds \) \(=\) \(\ds \frac {\beta^\alpha} {\map \Gamma \alpha} \int_0^\infty \paren {\frac t \beta}^{\alpha + 1} e^{-t} \frac {\d t} \beta - \frac {\alpha^2} {\beta^2}\) substituting $t = \beta x$
\(\ds \) \(=\) \(\ds \frac {\beta^\alpha} {\beta^{\alpha + 2} \map \Gamma \alpha} \int_0^\infty t^{\alpha + 1} e^{-t} \rd t - \frac {\alpha^2} {\beta^2}\)
\(\ds \) \(=\) \(\ds \frac {\map \Gamma {\alpha + 2} } {\beta^2 \map \Gamma \alpha} - \frac {\alpha^2} {\beta^2}\) Definition of Gamma Function
\(\ds \) \(=\) \(\ds \frac {\map \Gamma {\alpha + 2} - \alpha^2 \map \Gamma \alpha} {\beta^2 \map \Gamma \alpha}\)
\(\ds \) \(=\) \(\ds \frac {\alpha \paren {\alpha + 1} \map \Gamma \alpha - \alpha^2 \map \Gamma \alpha} {\beta^2 \map \Gamma \alpha}\) Gamma Difference Equation, applied twice
\(\ds \) \(=\) \(\ds \frac {\alpha \map \Gamma \alpha \paren {\alpha + 1 - \alpha} } {\beta^2 \map \Gamma \alpha}\)
\(\ds \) \(=\) \(\ds \frac \alpha {\beta^2}\)
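The closed form $\var X = \alpha / \beta^2$ can also be checked numerically, again outside the proof proper: compute the first two moments of the density by trapezoidal quadrature and form $\expect {X^2} - \paren {\expect X}^2$. The truncation point and step count below are arbitrary illustrative choices:

```python
import math

def gamma_pdf(x, alpha, beta):
    # f_X(x) = beta^alpha * x^(alpha - 1) * exp(-beta * x) / Gamma(alpha), for x > 0
    return beta ** alpha * x ** (alpha - 1) * math.exp(-beta * x) / math.gamma(alpha)

def variance_by_quadrature(alpha, beta, upper=40.0, n=100000):
    # Trapezoidal quadrature of x * f(x) and x^2 * f(x) over (0, upper];
    # the integrand vanishes at both endpoints for alpha > 1, beta > 0
    h = upper / n
    m1 = m2 = 0.0
    for i in range(1, n + 1):
        x = i * h
        w = 0.5 if i == n else 1.0
        f = gamma_pdf(x, alpha, beta)
        m1 += w * x * f * h        # accumulates E[X]
        m2 += w * x * x * f * h    # accumulates E[X^2]
    return m2 - m1 * m1            # Var X = E[X^2] - (E[X])^2

alpha, beta = 3.0, 2.0
print(variance_by_quadrature(alpha, beta))  # should be close to alpha / beta^2 = 0.75
```

This mirrors the structure of the proof: the two moments are the integrals evaluated above, and their combination recovers $\alpha / \beta^2$.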

$\blacksquare$