Variance of Poisson Distribution

From ProofWiki

Theorem

Let $X$ be a discrete random variable with the Poisson distribution with parameter $\lambda$.


Then the variance of $X$ is given by:

$\var X = \lambda$
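
Before the formal proofs, the theorem can be illustrated numerically. The following Python sketch (an aside, not a proof; $\lambda = 3$ is an arbitrary choice) compares the sample variance of simulated Poisson draws with $\lambda$:

```python
import numpy as np

# Illustrative simulation of the theorem (an aside, not a proof): the sample
# variance of Poisson(lam) draws should be close to lam for large samples.
rng = np.random.default_rng(0)  # fixed seed for reproducibility
lam = 3.0                       # arbitrary choice of lambda

samples = rng.poisson(lam, size=200_000)
print(samples.var())  # close to lam = 3.0
```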


Proof 1

From the definition of Variance as Expectation of Square minus Square of Expectation:

$\var X = \expect {X^2} - \paren {\expect X}^2$

From Expectation of Function of Discrete Random Variable:

$\displaystyle \expect {X^2} = \sum_{x \mathop \in \Omega_X} x^2 \, \map \Pr {X = x}$


So:

\(\displaystyle \expect {X^2}\) \(=\) \(\displaystyle \sum_{k \mathop \ge 0} {k^2 \dfrac 1 {k!} \lambda^k e^{-\lambda} }\) $\quad$ Definition of Poisson Distribution $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda e^{-\lambda} \sum_{k \mathop \ge 1} {k \dfrac 1 {\paren {k - 1}!} \lambda^{k - 1} }\) $\quad$ Note change of limit: term is zero when $k=0$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda e^{-\lambda} \paren {\sum_{k \mathop \ge 1} {\paren {k - 1} \dfrac 1 {\paren {k - 1}!} \lambda^{k - 1} } + \sum_{k \mathop \ge 1} {\dfrac 1 {\paren {k - 1}!} \lambda^{k - 1} } }\) $\quad$ as $k = \paren {k - 1} + 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda e^{-\lambda} \paren {\lambda \sum_{k \mathop \ge 2} {\dfrac 1 {\paren {k - 2}!} \lambda^{k - 2} } + \sum_{k \mathop \ge 1} {\dfrac 1 {\paren {k - 1}!} \lambda^{k - 1} } }\) $\quad$ Again, note change of limit: term is zero when $k-1=0$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda e^{-\lambda} \paren {\lambda \sum_{i \mathop \ge 0} {\dfrac 1 {i!} \lambda^i} + \sum_{j \mathop \ge 0} {\dfrac 1 {j!} \lambda^j} }\) $\quad$ putting $i = k - 2, j = k - 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda e^{-\lambda} \paren {\lambda e^\lambda + e^\lambda}\) $\quad$ Taylor Series Expansion for Exponential Function $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda \paren {\lambda + 1}\) $\quad$ as $e^{-\lambda} e^\lambda = 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda^2 + \lambda\) $\quad$ $\quad$


Then:

\(\displaystyle \var X\) \(=\) \(\displaystyle \expect {X^2} - \paren {\expect X}^2\) $\quad$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda^2 + \lambda - \lambda^2\) $\quad$ Expectation of Poisson Distribution: $\expect X = \lambda$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda\) $\quad$ $\quad$

$\blacksquare$
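
The series manipulations above can be checked numerically. The following Python sketch (an illustrative aside, with $\lambda = 2.5$ as an arbitrary choice) sums the Poisson series directly and confirms that $\expect {X^2} = \lambda^2 + \lambda$ and $\var X = \lambda$:

```python
import math

# Illustrative numerical check of Proof 1 (not part of the formal proof):
# sum the Poisson series for E[X] and E[X^2] directly and compare
# E[X^2] - (E[X])^2 with lam.
lam = 2.5  # arbitrary choice of lambda

def poisson_pmf(k, lam):
    """Pr(X = k) for a Poisson(lam) random variable."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

# Truncate the infinite series; terms decay factorially, so 100 terms suffice.
e_x  = sum(k * poisson_pmf(k, lam) for k in range(100))
e_x2 = sum(k ** 2 * poisson_pmf(k, lam) for k in range(100))

variance = e_x2 - e_x ** 2
print(e_x2)      # ~ lam**2 + lam = 8.75
print(variance)  # ~ lam = 2.5
```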


Proof 2

From Variance of Discrete Random Variable from PGF, we have:

$\var X = \map {\Pi''_X} 1 + \mu - \mu^2$

where $\mu = \expect X$ is the expectation of $X$.


From the Probability Generating Function of Poisson Distribution, we have:

$\map {\Pi_X} s = e^{-\lambda \paren {1 - s} }$


From Expectation of Poisson Distribution, we have:

$\mu = \lambda$


From Derivatives of PGF of Poisson Distribution, we have:

$\map {\Pi''_X} s = \lambda^2 e^{-\lambda \paren {1 - s} }$


Setting $s = 1$:

$\var X = \lambda^2 e^{-\lambda \paren {1 - 1} } + \lambda - \lambda^2$

As $e^0 = 1$, this simplifies to:

$\var X = \lambda^2 + \lambda - \lambda^2 = \lambda$

$\blacksquare$
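
This argument can be reproduced with symbolic differentiation. The following Python sketch (an illustrative aside, using the sympy library) differentiates the PGF twice and evaluates $\Pi''_X \left({1}\right) + \mu - \mu^2$:

```python
import sympy as sp

# Illustrative check of Proof 2 (an aside, not part of the formal proof):
# Pi(s) = exp(-lambda*(1 - s)) is the PGF of the Poisson distribution.
s, lam = sp.symbols('s lambda', positive=True)
Pi = sp.exp(-lam * (1 - s))

Pi2 = sp.diff(Pi, s, 2)               # second derivative of the PGF
var = Pi2.subs(s, 1) + lam - lam**2   # Pi''(1) + mu - mu^2 with mu = lambda

print(sp.simplify(var))  # lambda
```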


Proof 3

From Moment Generating Function of Poisson Distribution, the moment generating function of $X$, $M_X$, is given by:

$\displaystyle \map {M_X} t = e^{\lambda \paren {e^t - 1} }$

From Variance as Expectation of Square minus Square of Expectation, we have:

$\displaystyle \var X = \expect {X^2} - \paren {\expect X}^2$

From Moment in terms of Moment Generating Function:

$\displaystyle \expect {X^2} = \map {M_X''} 0$

In Expectation of Poisson Distribution, it is shown that:

$\displaystyle \map {M_X'} t = \lambda e^t e^{\lambda \paren {e^t - 1} }$

Then:

\(\displaystyle \map {M''_X} t\) \(=\) \(\displaystyle \frac \d {\d t} \paren {\lambda e^t e^{\lambda \paren {e^t - 1} } }\) $\quad$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda \frac \d {\d t} \paren {e^{\lambda \paren {e^t - 1} + t} }\) $\quad$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda \frac \d {\d t} \paren {\lambda \paren {e^t - 1} + t} \frac \d {\d \paren {\lambda \paren {e^t - 1} + t} } \paren {e^{\lambda \paren {e^t - 1} + t} }\) $\quad$ Chain Rule $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda \paren {\lambda e^t + 1} e^{\lambda \paren {e^t - 1} + t}\) $\quad$ Derivative of Power, Derivative of Exponential Function $\quad$

Setting $t = 0$:

\(\displaystyle \expect {X^2}\) \(=\) \(\displaystyle \lambda \paren {\lambda e^0 + 1} e^{\lambda \paren {e^0 - 1} + 0}\) $\quad$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda \paren {\lambda + 1}\) $\quad$ Exponential of Zero $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \lambda^2 + \lambda\) $\quad$ $\quad$

From Expectation of Poisson Distribution:

$\displaystyle \expect X = \lambda$

So:

$\displaystyle \var X = \lambda^2 + \lambda - \lambda^2 = \lambda$

$\blacksquare$
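
As with the PGF argument, this computation can be reproduced symbolically. The following Python sketch (an illustrative aside, using sympy) differentiates the MGF twice and evaluates at $t = 0$:

```python
import sympy as sp

# Illustrative check of Proof 3 (an aside, not part of the formal proof):
# differentiate the MGF M_X(t) = exp(lambda*(e^t - 1)) twice at t = 0.
t, lam = sp.symbols('t lambda', positive=True)
M = sp.exp(lam * (sp.exp(t) - 1))

e_x  = sp.diff(M, t, 1).subs(t, 0)   # E[X]   = M'(0)  = lambda
e_x2 = sp.diff(M, t, 2).subs(t, 0)   # E[X^2] = M''(0) = lambda^2 + lambda

var = sp.simplify(e_x2 - e_x**2)
print(var)  # lambda
```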

