Expectation of Discrete Random Variable from PGF

Theorem

Let $X$ be a discrete random variable whose probability generating function is $\Pi_X \left({s}\right)$.


Then the expectation of $X$ is the value of the first derivative of $\Pi_X \left({s}\right)$ with respect to $s$ at $s = 1$.

That is:

$E \left({X}\right) = \Pi'_X \left({1}\right)$
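
For example, if $X$ is a Poisson random variable with parameter $\lambda$, its PGF is $\Pi_X \left({s}\right) = e^{\lambda \left({s - 1}\right)}$, so:

$\Pi'_X \left({1}\right) = \lambda e^{\lambda \left({1 - 1}\right)} = \lambda$

which is indeed the expectation of such an $X$.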


Proof

For ease of notation, write $p \left({x}\right)$ for $\Pr \left({X = x}\right)$.


From the definition of the probability generating function:

$\displaystyle \Pi_X \left({s}\right) = \sum_{x \mathop \ge 0} p \left({x}\right) s^x = p \left({0}\right) + p \left({1}\right) s + p \left({2}\right) s^2 + p \left({3}\right) s^3 + \cdots$

Differentiate this with respect to $s$:

$\displaystyle \Pi'_X \left({s}\right) = \frac {\mathrm d} {\mathrm d s} \sum_{k \mathop = 0}^\infty \Pr \left({X = k}\right) s^k$

$\displaystyle \phantom{\Pi'_X \left({s}\right)} = \sum_{k \mathop = 0}^\infty \frac {\mathrm d} {\mathrm d s} \Pr \left({X = k}\right) s^k$ by Abel's Theorem

$\displaystyle \phantom{\Pi'_X \left({s}\right)} = \sum_{k \mathop = 0}^\infty k \Pr \left({X = k}\right) s^{k - 1}$ by the Power Rule for Derivatives


Plugging in $s = 1$ gives:

$\displaystyle \Pi'_X \left({1}\right) = \sum_{k \mathop = 0}^\infty k \Pr \left({X = k}\right) 1^{k - 1} = p \left({1}\right) + 2 p \left({2}\right) + 3 p \left({3}\right) + \cdots$

where the $k = 0$ term vanishes because of the factor $k$.

But:

$\displaystyle \sum_{k \mathop = 0}^\infty k \Pr \left({X = k}\right) 1^{k-1} = \sum_{k \mathop = 0}^\infty k \Pr \left({X = k}\right)$

is precisely the definition of the expectation of $X$. Hence $E \left({X}\right) = \Pi'_X \left({1}\right)$.

$\blacksquare$


Comment

So, in order to find the expectation of a discrete random variable, there is no need to go through the tedious process of evaluating what might be a complicated and fiddly summation.

All you need to do is differentiate the PGF and plug in $1$.

Assuming, of course, you know what the PGF is.
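
As a quick illustration of this recipe (a minimal sketch using Python's sympy library; the binomial distribution and the values $n = 10$, $p = 3/10$ are arbitrary choices, not part of the theorem), one can differentiate a PGF symbolically and plug in $1$:

    import sympy as sp

    # PGF of a Binomial(n, p) random variable: Pi(s) = (1 - p + p s)^n
    # (n = 10, p = 3/10 are arbitrary example values)
    s = sp.symbols('s')
    n, p = 10, sp.Rational(3, 10)
    Pi = (1 - p + p * s) ** n

    # E(X) = Pi'(1): differentiate the PGF and plug in s = 1
    expectation = sp.diff(Pi, s).subs(s, 1)

    print(expectation)           # 3
    print(expectation == n * p)  # True

The printed value agrees with the known expectation $n p$ of a binomial random variable.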

