Variance of Gamma Distribution

Theorem

Let $X \sim \map \Gamma {\alpha, \beta}$ for some $\alpha, \beta > 0$, where $\Gamma$ is the Gamma distribution.

The variance of $X$ is given by:

$\var X = \dfrac \alpha {\beta^2}$
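
As an informal numerical check (not part of any proof below), the result can be confirmed against SciPy's gamma distribution. Note that SciPy parametrises by shape $a = \alpha$ and scale $\theta = 1/\beta$; treating $\beta$ as a rate parameter, as this page does, is the assumption behind `scale=1.0 / beta` in this sketch:

```python
# Sanity check of Var(X) = alpha / beta^2 using SciPy's gamma distribution.
# SciPy uses shape a and scale theta; a rate-beta Gamma has theta = 1/beta.
from scipy.stats import gamma

alpha, beta = 3.0, 2.0
print(gamma.var(a=alpha, scale=1.0 / beta))  # 0.75
print(alpha / beta**2)                       # 0.75
```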


Proof 1

From the definition of the Gamma distribution, $X$ has probability density function:

$\map {f_X} x = \dfrac {\beta^\alpha x^{\alpha - 1} e^{-\beta x} } {\map \Gamma \alpha}$

for $x > 0$.

From Variance as Expectation of Square minus Square of Expectation:

$\ds \var X = \int_0^\infty x^2 \map {f_X} x \rd x - \paren {\expect X}^2$

So:

\(\ds \var X\) \(=\) \(\ds \frac {\beta^\alpha} {\map \Gamma \alpha} \int_0^\infty x^{\alpha + 1} e^{-\beta x} \rd x - \paren {\frac \alpha \beta}^2\) Expectation of Gamma Distribution
\(\ds \) \(=\) \(\ds \frac {\beta^\alpha} {\map \Gamma \alpha} \int_0^\infty \paren {\frac t \beta}^{\alpha + 1} e^{-t} \frac {\rd t} \beta - \frac {\alpha^2} {\beta^2}\) substituting $t = \beta x$
\(\ds \) \(=\) \(\ds \frac {\beta^\alpha} {\beta^{\alpha + 2} \map \Gamma \alpha} \int_0^\infty t^{\alpha + 1} e^{-t} \rd t - \frac {\alpha^2} {\beta^2}\)
\(\ds \) \(=\) \(\ds \frac {\map \Gamma {\alpha + 2} } {\beta^2 \map \Gamma \alpha} - \frac {\alpha^2} {\beta^2}\) Definition of Gamma Function
\(\ds \) \(=\) \(\ds \frac {\map \Gamma {\alpha + 2} - \alpha^2 \map \Gamma \alpha} {\beta^2 \map \Gamma \alpha}\)
\(\ds \) \(=\) \(\ds \frac {\alpha \paren {\alpha + 1} \map \Gamma \alpha - \alpha^2 \map \Gamma \alpha} {\beta^2 \map \Gamma \alpha}\) Gamma Difference Equation
\(\ds \) \(=\) \(\ds \frac {\alpha \map \Gamma \alpha \paren {\alpha + 1 - \alpha} } {\beta^2 \map \Gamma \alpha}\)
\(\ds \) \(=\) \(\ds \frac \alpha {\beta^2}\)

$\blacksquare$
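
The key step above, $\ds \int_0^\infty t^{\alpha + 1} e^{-t} \rd t = \map \Gamma {\alpha + 2}$, can be spot-checked numerically for a sample value of $\alpha$; a minimal sketch assuming SciPy:

```python
# Numerical check that the integral in Proof 1 equals Gamma(alpha + 2).
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as Gamma

alpha = 2.5
integral, _ = quad(lambda t: t ** (alpha + 1) * np.exp(-t), 0, np.inf)
print(np.isclose(integral, Gamma(alpha + 2)))  # True
```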


Proof 2

By Moment Generating Function of Gamma Distribution, the moment generating function of $X$ is given by:

$\map {M_X} t = \paren {1 - \dfrac t \beta}^{-\alpha}$

for $t < \beta$.


From Variance as Expectation of Square minus Square of Expectation:

$\var X = \expect {X^2} - \paren {\expect X}^2$

From Expectation of Gamma Distribution:

$\expect X = \dfrac \alpha \beta$


From Moment Generating Function of Gamma Distribution: Second Moment:

$\map { {M_X}''} t = \dfrac {\beta^\alpha \alpha \paren {\alpha + 1} } {\paren {\beta - t}^{\alpha + 2} }$


From Moment in terms of Moment Generating Function, we also have:

$\expect {X^2} = \map { {M_X}''} 0$


Setting $t = 0$, we obtain the second moment:

\(\ds \map { {M_X}''} 0\) \(=\) \(\ds \expect {X^2}\)
\(\ds \) \(=\) \(\ds \frac {\beta^\alpha \alpha \paren {\alpha + 1} } {\paren {\beta - 0}^{\alpha + 2} }\)
\(\ds \) \(=\) \(\ds \frac {\beta^\alpha \alpha \paren {\alpha + 1} } {\beta^{\alpha + 2} }\)
\(\ds \) \(=\) \(\ds \frac {\alpha \paren {\alpha + 1} } {\beta^2}\)

So:

\(\ds \var X\) \(=\) \(\ds \frac {\alpha \paren {\alpha + 1} } {\beta^2} - \frac {\alpha^2} {\beta^2}\)
\(\ds \) \(=\) \(\ds \frac {\alpha^2 + \alpha - \alpha^2} {\beta^2}\)
\(\ds \) \(=\) \(\ds \frac \alpha {\beta^2}\)

$\blacksquare$
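
The central computation of Proof 2, differentiating the moment generating function twice and evaluating at $t = 0$, can be reproduced symbolically; a sketch assuming SymPy:

```python
# Symbolic check of Proof 2: M_X''(0) = alpha*(alpha + 1)/beta^2.
import sympy as sp

t = sp.symbols('t')
alpha, beta = sp.symbols('alpha beta', positive=True)
M = (1 - t / beta) ** (-alpha)               # MGF of Gamma(alpha, beta), t < beta
second_moment = sp.diff(M, t, 2).subs(t, 0)  # E[X^2] = M_X''(0)
print(sp.simplify(second_moment - alpha * (alpha + 1) / beta ** 2))  # 0
```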


Proof 3

From Expectation of Power of Gamma Distribution, we have:

$\expect {X^n} = \dfrac {\alpha^{\overline n} } {\beta^n}$

where $\alpha^{\overline n}$ denotes the $n$th rising factorial of $\alpha$.


Hence:

\(\ds \var X\) \(=\) \(\ds \expect {X^2} - \paren {\expect X}^2\) Variance as Expectation of Square minus Square of Expectation
\(\ds \) \(=\) \(\ds \dfrac {\alpha^{\overline 2} } {\beta^2} - \paren {\dfrac {\alpha^{\overline 1} } \beta}^2\)
\(\ds \) \(=\) \(\ds \dfrac {\alpha \paren {\alpha + 1} } {\beta^2} - \paren {\dfrac \alpha \beta}^2\) Definition of Rising Factorial
\(\ds \) \(=\) \(\ds \frac {\alpha^2 + \alpha - \alpha^2} {\beta^2}\)
\(\ds \) \(=\) \(\ds \frac \alpha {\beta^2}\)

$\blacksquare$
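
The rising factorial moment formula used here can likewise be spot-checked numerically; the sketch below assumes SciPy, whose `poch` implements the Pochhammer symbol (rising factorial) and whose `moment` method returns raw (non-central) moments:

```python
# Check E[X^n] = alpha^(n rising) / beta^n for a few values of n.
from scipy.stats import gamma
from scipy.special import poch  # poch(a, n) = a (a+1) ... (a+n-1)

alpha, beta = 3.0, 2.0
for n in (1, 2, 3):
    lhs = gamma.moment(n, a=alpha, scale=1.0 / beta)  # raw n-th moment
    rhs = poch(alpha, n) / beta ** n
    print(n, lhs, rhs)
```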

