Expectation of Function of Discrete Random Variable

Theorem
Let $$X$$ be a discrete random variable.

Let $$E \left({X}\right)$$ be the expectation of $$X$$.

Let $$g: \R \to \R$$ be a real function.

Then:
 * $$E \left({g \left({X}\right)}\right) = \sum_{x \in \Omega_X} g \left({x}\right) \Pr \left({X = x}\right)$$

whenever the sum is absolutely convergent.
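The formula can be checked numerically on a concrete case. The following sketch assumes a fair six-sided die for $$X$$ and $$g \left({x}\right) = x^2$$; both the distribution and the choice of $$g$$ are illustrative, not part of the theorem.

```python
from fractions import Fraction

# Hypothetical example: X is a fair six-sided die, g(x) = x^2.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def g(x):
    return x * x

# E(g(X)) = sum over Omega_X of g(x) * Pr(X = x)
expectation = sum(g(x) * p for x, p in pmf.items())

print(expectation)  # 91/6
```

Since the support is finite, the sum is trivially absolutely convergent and the theorem applies directly.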

Proof
Let $$\Omega_X = \operatorname{Im} \left({X}\right) = I$$.

Let $$Y = g \left({X}\right)$$.

Thus $$\Omega_Y = \operatorname{Im} \left({Y}\right) = g \left({I}\right)$$.

So:

$$\begin{aligned} E \left({g \left({X}\right)}\right) = E \left({Y}\right) &= \sum_{y \in g \left({I}\right)} y \Pr \left({Y = y}\right) && \text{Definition of Expectation} \\ &= \sum_{y \in g \left({I}\right)} y \sum_{x \in g^{-1} \left({y}\right)} \Pr \left({X = x}\right) && \text{Probability Mass Function of } Y = g \left({X}\right) \\ &= \sum_{x \in I} g \left({x}\right) \Pr \left({X = x}\right) && \text{as the sets } g^{-1} \left({y}\right) \text{ partition } I \end{aligned}$$

From the definition of expectation, this result holds only when the last sum is absolutely convergent; absolute convergence is also what justifies rearranging the terms of the double sum.
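The key step of the proof, grouping the terms of the sum by the preimages $$g^{-1} \left({y}\right)$$, can be demonstrated numerically. This sketch assumes a uniform distribution on $$\left\{{-2, -1, 0, 1, 2}\right\}$$ and $$g \left({x}\right) = x^2$$, chosen so that $$g$$ is not injective and distinct values of $$x$$ genuinely collapse into one value of $$y$$:

```python
from collections import defaultdict
from fractions import Fraction

# Hypothetical example: X uniform on {-2, -1, 0, 1, 2}, g(x) = x^2.
pmf_x = {x: Fraction(1, 5) for x in (-2, -1, 0, 1, 2)}

def g(x):
    return x * x

# Build the pmf of Y = g(X): Pr(Y = y) is the sum of Pr(X = x)
# over the preimage g^{-1}(y).
pmf_y = defaultdict(Fraction)
for x, p in pmf_x.items():
    pmf_y[g(x)] += p

lhs = sum(y * p for y, p in pmf_y.items())     # sum over y in g(I)
rhs = sum(g(x) * p for x, p in pmf_x.items())  # sum over x in I

print(lhs, rhs)  # 2 2
```

Both sums agree, as the proof requires: summing $$y \Pr \left({Y = y}\right)$$ over the image is the same as summing $$g \left({x}\right) \Pr \left({X = x}\right)$$ over the domain.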