Expectation of Discrete Random Variable from PGF

Theorem
Let $$X$$ be a discrete random variable whose probability generating function is $$\Pi_X \left({s}\right)$$.

Then the expectation of $$X$$ is the value of the first derivative of $$\Pi_X \left({s}\right)$$ WRT $$s$$ at $$s = 1$$.

That is:
 * $$E \left({X}\right) = \Pi'_X \left({1}\right)$$
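
For instance (a standard illustration, not part of the statement): if $$X$$ is a Bernoulli random variable with parameter $$p$$, then $$\Pi_X \left({s}\right) = \left({1 - p}\right) + p s$$, so $$\Pi'_X \left({1}\right) = p$$, which is indeed $$E \left({X}\right)$$.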

Proof
From the definition of the probability generating function:
 * $$\Pi_X \left({s}\right) = \sum_{x \ge 0} p \left({x}\right) s^x = p \left({0}\right) + p \left({1}\right) s + p \left({2}\right) s^2 + p \left({3}\right) s^3 + \cdots$$

Differentiate this WRT $$s$$ term by term (valid within the radius of convergence of the power series):
 * $$\Pi'_X \left({s}\right) = \sum_{x \ge 0} x p \left({x}\right) s^{x-1} = p \left({1}\right) + 2 p \left({2}\right) s + 3 p \left({3}\right) s^2 + \cdots$$

Plugging in $$s = 1$$ gives:
 * $$\Pi'_X \left({1}\right) = \sum_{x \ge 0} x p \left({x}\right) 1^{x-1} = p \left({1}\right) + 2 p \left({2}\right) + 3 p \left({3}\right) + \cdots$$

But $$\sum_{x \ge 0} x p \left({x}\right) 1^{x-1} = \sum_{x \ge 0} x p \left({x}\right)$$ is precisely the definition of the expectation of $$X$$.

Hence:
 * $$E \left({X}\right) = \Pi'_X \left({1}\right)$$
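
As a quick sanity check (not part of the original proof), here is a minimal sketch in Python using the sympy library. It assumes the standard p.g.f. of a Poisson distribution with rate $$\lambda$$, namely $$\Pi_X \left({s}\right) = e^{\lambda \left({s - 1}\right)}$$; differentiating and evaluating at $$s = 1$$ should recover the known expectation $$\lambda$$.

```python
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)

# Standard p.g.f. of a Poisson(lam) random variable: Pi_X(s) = exp(lam * (s - 1))
pgf = sp.exp(lam * (s - 1))

# Theorem: E(X) = Pi'_X(1), i.e. differentiate WRT s and substitute s = 1
expectation = sp.diff(pgf, s).subs(s, 1)

print(expectation)  # prints: lam, matching E(X) = lambda for the Poisson distribution
```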

Comment
So, in order to find the expectation of a discrete random variable, there is no need to go through what might be a tedious, complicated and fiddly summation.

All you need to do is differentiate the p.g.f. and plug in $$s = 1$$.

Assuming, of course, you know what the p.g.f. is.
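
To make this concrete, here is a minimal sketch in Python with sympy, using a hypothetical distribution on $$\left\{{0, 1, 2, 3}\right\}$$ (the probabilities below are chosen purely for illustration). It builds the p.g.f. from the probability mass function, then checks that the differentiate-and-plug-in-one shortcut agrees with the direct summation $$\sum_{x \ge 0} x p \left({x}\right)$$.

```python
import sympy as sp

s = sp.symbols('s')

# Hypothetical p.m.f. on {0, 1, 2, 3}; probabilities sum to 1 (illustrative values only)
p = {0: sp.Rational(1, 8), 1: sp.Rational(3, 8),
     2: sp.Rational(3, 8), 3: sp.Rational(1, 8)}

# Build the p.g.f.: Pi_X(s) = sum over x of p(x) * s^x
pgf = sum(prob * s**x for x, prob in p.items())

# Expectation via the theorem: differentiate the p.g.f. and plug in s = 1
e_from_pgf = sp.diff(pgf, s).subs(s, 1)

# Expectation via the direct (possibly fiddly) summation: sum of x * p(x)
e_direct = sum(x * prob for x, prob in p.items())

print(e_from_pgf, e_direct)  # both print 3/2
```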