# Probability Generating Function as Expectation

## Theorem

Let $X$ be a discrete random variable whose codomain, $\Omega_X$, is a subset of the natural numbers $\N$.

Let $p_X$ be the probability mass function for $X$.

Let $\Pi_X \left({s}\right)$ be the probability generating function for $X$.

Then:

$\Pi_X \left({s}\right) = E \left({s^X}\right)$

where $E \left({s^X}\right)$ denotes the expectation of $s^X$.

## Proof

By definition of probability generating function:

$\displaystyle \Pi_X \left({s}\right) := \sum_{n \mathop = 0}^\infty p_X \left({n}\right) s^n = p_X \left({0}\right) + p_X \left({1}\right) s + p_X \left({2}\right) s^2 + \cdots$

where $p_X$ is the probability mass function for $X$.
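As an illustration of this definition (not part of the source proof), the series can be evaluated numerically for a concrete distribution. The sketch below, assuming a Binomial distribution as the example, sums the series directly and compares it with the known closed-form PGF $\left({1 - p + p s}\right)^n$ of a Binomial random variable:

```python
from math import comb

def binomial_pmf(k, n, p):
    """Probability mass function of a Binomial(n, p) random variable."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def pgf(pmf, s, support):
    """Probability generating function: sum of pmf(k) * s**k over the support."""
    return sum(pmf(k) * s**k for k in support)

n, p, s = 10, 0.3, 0.5
series = pgf(lambda k: binomial_pmf(k, n, p), s, range(n + 1))
closed_form = (1 - p + p * s) ** n  # known PGF of Binomial(n, p)
assert abs(series - closed_form) < 1e-12
```

The agreement is exact up to floating-point rounding, since expanding $\left({1 - p + p s}\right)^n$ by the binomial theorem reproduces the series term by term.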

For any real function $g: \R \to \R$, by Expectation of Function of Discrete Random Variable:

$\displaystyle E \left({g \left({X}\right)}\right) = \sum_{x \mathop \in \Omega_X} g \left({x}\right) \Pr \left({X = x}\right)$

whenever the sum is absolutely convergent.

In this case, $g \left({X}\right) = s^X$.

Thus:

$\displaystyle E \left({s^X}\right) = \sum_{x \mathop \in \Omega_X} s^x \Pr \left({X = x}\right)$

or, using $p_X \left({x}\right) := \Pr \left({X = x}\right)$:

$\displaystyle E \left({s^X}\right) = \sum_{x \mathop \in \Omega_X} p_X \left({x}\right) s^x$

Since $\Omega_X \subseteq \N$ and $p_X \left({x}\right) = 0$ for all $x \notin \Omega_X$, this can then be expressed as a sum over all of $\N$:

$\displaystyle E \left({s^X}\right) = \sum_{n \mathop = 0}^\infty p_X \left({n}\right) s^n$

Thus whenever $\displaystyle \sum_{n \mathop = 0}^\infty p_X \left({n}\right) s^n$ is absolutely convergent:

$\Pi_X \left({s}\right) = E \left({s^X}\right)$

by definition of probability generating function.
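The identity $\Pi_X \left({s}\right) = E \left({s^X}\right)$ can also be checked by treating $E \left({s^X}\right)$ as an ordinary expectation and estimating it by simulation. The sketch below, assuming a Poisson distribution as the example, draws samples of $X$ by CDF inversion and compares the Monte Carlo average of $s^X$ against the known closed-form PGF $e^{\lambda \left({s - 1}\right)}$ of a Poisson random variable:

```python
import random
from math import exp

random.seed(0)
lam, s = 2.0, 0.7

def poisson_sample(lam):
    """Sample X ~ Poisson(lam) by inverting the CDF."""
    u, k = random.random(), 0
    term = exp(-lam)   # p_X(0)
    cdf = term
    while u > cdf:
        k += 1
        term *= lam / k  # p_X(k) from p_X(k - 1)
        cdf += term
    return k

# Monte Carlo estimate of E[s^X]
n_samples = 100_000
estimate = sum(s ** poisson_sample(lam) for _ in range(n_samples)) / n_samples
closed_form = exp(lam * (s - 1))  # known PGF of Poisson(lam)
assert abs(estimate - closed_form) < 0.01
```

The estimate converges to $\Pi_X \left({s}\right)$ as the sample count grows, which is exactly what the theorem asserts: the PGF is nothing more than the expectation of $s^X$.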

As $p_X$ is a probability mass function for $X$:

$\displaystyle \sum_{k \mathop = 0}^\infty p_X \left({k}\right) = 1$

Thus the condition of absolute convergence is satisfied for all $s$ such that $\left|{s}\right| \le 1$, since then $\left|{s^k}\right| \le 1$ and hence:

$\displaystyle \sum_{k \mathop = 0}^\infty \left|{p_X \left({k}\right) s^k}\right| \le \sum_{k \mathop = 0}^\infty p_X \left({k}\right) = 1$
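This bound can be observed numerically even for a distribution with infinite support. The sketch below, assuming a Geometric distribution on $\left\{ {0, 1, 2, \ldots}\right\}$ as the example, checks that every partial sum of $\left|{p_X \left({k}\right) s^k}\right|$ stays at or below $1$ for an $s$ with $\left|{s}\right| \le 1$:

```python
p, s = 0.4, -0.9  # |s| <= 1

def geometric_pmf(k):
    """PMF of a Geometric(p) random variable on {0, 1, 2, ...}."""
    return (1 - p) ** k * p

# Partial sums of |p_X(k) s^k| are monotone and bounded above by 1,
# witnessing the absolute convergence of the PGF series.
partial = 0.0
for k in range(1000):
    partial += abs(geometric_pmf(k) * s ** k)
    assert partial <= 1.0
```

Here the partial sums approach $\displaystyle \sum_{k \mathop = 0}^\infty p \left({1 - p}\right)^k \left|{s}\right|^k = \frac p {1 - \left({1 - p}\right) \left|{s}\right|} \approx 0.87$, comfortably within the bound of $1$.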

$\blacksquare$