Variance of Discrete Random Variable from PGF


Theorem

Let $X$ be a discrete random variable whose probability generating function is $\map {\Pi_X} s$.


Then the variance of $X$ can be obtained from the second derivative of $\map {\Pi_X} s$ with respect to $s$, evaluated at $s = 1$:

$\var X = \map {\Pi''_X} 1 + \mu - \mu^2$

where $\mu = \expect X$ is the expectation of $X$.


Proof

From the definition of the probability generating function:

$\ds \map {\Pi_X} s = \sum_{x \mathop \ge 0} \map p x s^x$


Differentiating twice with respect to $s$:

$\ds \map {\Pi''_X} s = \sum_{x \mathop \ge 2} x \paren {x - 1} \map p x s^{x - 2}$

The sum can equally well be taken over all $x \ge 0$: for $x = 0$ and $x = 1$ the factor $x \paren {x - 1}$ is zero, so those terms vanish.

So:

$\ds \map {\Pi''_X} s = \sum_{x \mathop \ge 0} x \paren {x - 1} \map p x s^{x - 2}$


Plugging in $s = 1$ gives:

\(\ds \map {\Pi''_X} 1\) \(=\) \(\ds \sum_{x \mathop \ge 0} x \paren {x - 1} \map p x 1^{x - 2}\)
\(\ds \) \(=\) \(\ds \sum_{x \mathop \ge 0} x^2 \map p x - \sum_{x \mathop \ge 0} x \map p x\)
\(\ds \) \(=\) \(\ds \expect {X^2} - \expect X\)


Rearranging, $\expect {X^2} = \map {\Pi''_X} 1 + \expect X$.

The result then follows from the definition of variance:

$\var X = \expect {X^2} - \paren {\expect X}^2 = \map {\Pi''_X} 1 + \mu - \mu^2$

$\blacksquare$
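
Example

As an illustration (not part of the proof above), let $X$ be binomially distributed with parameters $n$ and $p$, so that its PGF is:

$\map {\Pi_X} s = \paren {1 - p + p s}^n$

Differentiating twice:

$\map {\Pi''_X} s = n \paren {n - 1} p^2 \paren {1 - p + p s}^{n - 2}$

so $\map {\Pi''_X} 1 = n \paren {n - 1} p^2$, while $\mu = n p$. Hence:

$\var X = n \paren {n - 1} p^2 + n p - n^2 p^2 = n p \paren {1 - p}$

which is the familiar variance of the binomial distribution.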


Comment

So, to find the variance of a discrete random variable, there is no need to go through what might be a tedious, complicated and fiddly summation.

All you need to do is differentiate its PGF twice and plug in $s = 1$.

Assuming, of course, you know what its PGF is.
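
For those who prefer to let a computer algebra system do the differentiation, here is a minimal SymPy sketch (an illustration, not part of the original page) that checks the theorem against the Poisson PGF $\map {\Pi_X} s = e^{\lambda \paren {s - 1}}$:

# Minimal SymPy sketch (illustrative, not from the original page):
# verify Var(X) = Pi''(1) + mu - mu^2 for the Poisson PGF exp(lam*(s - 1)).
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)

Pi = sp.exp(lam * (s - 1))               # PGF of a Poisson(lam) random variable

mu = sp.diff(Pi, s).subs(s, 1)           # E[X] = Pi'(1) = lam
second = sp.diff(Pi, s, 2).subs(s, 1)    # Pi''(1) = lam**2

var = sp.simplify(second + mu - mu**2)   # Pi''(1) + mu - mu^2
print(var)                               # prints lam, the known Poisson variance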

