Expectation of Function of Discrete Random Variable

From ProofWiki

Theorem

Let $X$ be a discrete random variable.

Let $E \left({X}\right)$ be the expectation of $X$.

Let $g: \R \to \R$ be a real function.


Then:

$\displaystyle E \left({g \left({X}\right)}\right) = \sum_{x \mathop \in \Omega_X} g \left({x}\right) \Pr \left({X = x}\right)$

whenever the sum is absolutely convergent.
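As a concrete check of the theorem (this is an illustrative example, not part of the ProofWiki article): take a hypothetical random variable $X$ with $\Pr \left({X = -1}\right) = \Pr \left({X = 0}\right) = 1/4$, $\Pr \left({X = 2}\right) = 1/2$, and $g \left({x}\right) = x^2$, and compute $E \left({g \left({X}\right)}\right)$ both from the probability mass function of $Y = g \left({X}\right)$ and directly from the sum over $\Omega_X$:

```python
from fractions import Fraction

# Hypothetical pmf for X on Omega_X = {-1, 0, 2} (exact rationals, so no rounding).
pmf = {-1: Fraction(1, 4), 0: Fraction(1, 4), 2: Fraction(1, 2)}
g = lambda x: x * x  # g(x) = x^2

# Left-hand side: build the pmf of Y = g(X) by summing Pr(X = x) over
# all x with g(x) = y, then compute E(Y) from the definition of expectation.
pmf_y = {}
for x, p in pmf.items():
    pmf_y[g(x)] = pmf_y.get(g(x), Fraction(0)) + p
lhs = sum(y * p for y, p in pmf_y.items())

# Right-hand side: the theorem's formula, summing g(x) Pr(X = x) over Omega_X.
rhs = sum(g(x) * p for x, p in pmf.items())

print(lhs, rhs)  # both equal 9/4
```

Since $\Omega_X$ is finite here, absolute convergence is automatic; the two sums agree exactly.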


Proof

Let $\Omega_X = \operatorname{Im} \left({X}\right) = I$.

Let $Y = g \left({X}\right)$.

Thus $\Omega_Y = \operatorname{Im} \left({Y}\right) = g \left({I}\right)$.

So:

\(\displaystyle E \left({Y}\right)\) \(=\) \(\displaystyle \sum_{y \mathop \in g \left({I}\right)} y \Pr \left({Y = y}\right)\)
\(\displaystyle \) \(=\) \(\displaystyle \sum_{y \mathop \in g \left({I}\right)} y \sum_{ {x \mathop \in I} \atop {g \left({x}\right) \mathop = y} } \Pr \left({X = x}\right)\) Probability Mass Function of Function of Discrete Random Variable
\(\displaystyle \) \(=\) \(\displaystyle \sum_{x \mathop \in I} g \left({x}\right) \Pr \left({X = x}\right)\) as the sets $\left\{ {x \in I: g \left({x}\right) = y}\right\}$ for $y \in g \left({I}\right)$ partition $I$

From the definition of expectation, this identity holds only when the last sum is absolutely convergent. Absolute convergence also justifies rearranging the double sum into a single sum over $I$.

$\blacksquare$

