Total Expectation Theorem

From ProofWiki

Theorem

Let $\mathcal E = \struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $X$ be a discrete random variable on $\mathcal E$.

Let $\set {B_1, B_2, \ldots}$ be a partition of $\Omega$ such that $\map \Pr {B_i} > 0$ for each $i$.


Then:

$\displaystyle \expect X = \sum_i \expect {X \mid B_i} \, \map \Pr {B_i}$

whenever this sum converges absolutely.


In the above:

$\expect X$ denotes the expectation of $X$
$\expect {X \mid B_i}$ denotes the conditional expectation of $X$ given $B_i$.
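The identity can be checked numerically. The following sketch (not part of the ProofWiki article; the die example is illustrative) computes both sides for a fair six-sided die with the partition into odd and even outcomes:

```python
from fractions import Fraction

# X: fair six-sided die; Pr(X = x) = 1/6 for each x in 1..6
outcomes = range(1, 7)
pr = {x: Fraction(1, 6) for x in outcomes}

# A partition of the sample space: B_1 = odd outcomes, B_2 = even outcomes
partition = [
    {1, 3, 5},  # B_1
    {2, 4, 6},  # B_2
]

# Direct expectation: E[X] = sum_x x Pr(X = x)
e_x = sum(x * pr[x] for x in outcomes)

# Total expectation: sum_i E[X | B_i] Pr(B_i)
total = Fraction(0)
for b in partition:
    pr_b = sum(pr[x] for x in b)                      # Pr(B_i)
    e_given_b = sum(x * pr[x] for x in b) / pr_b      # E[X | B_i]
    total += e_given_b * pr_b

assert e_x == total == Fraction(7, 2)  # both sides equal 7/2
```

Here $\expect {X \mid B_1} = 3$ and $\expect {X \mid B_2} = 4$, each weighted by probability $\dfrac 1 2$, recovering $\expect X = \dfrac 7 2$.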


Proof

\(\displaystyle \sum_i \expect {X \mid B_i} \, \map \Pr {B_i}\) \(=\) \(\displaystyle \sum_i \sum_x x \, \map \Pr {\set {X = x} \cap B_i}\) Definition of Conditional Expectation
\(\displaystyle \) \(=\) \(\displaystyle \sum_x x \map \Pr {\set {X = x} \cap \paren {\bigcup_i B_i} }\) as the $B_i$ are pairwise disjoint
\(\displaystyle \) \(=\) \(\displaystyle \sum_x x \map \Pr {X = x}\) as $\bigcup_i B_i = \Omega$
\(\displaystyle \) \(=\) \(\displaystyle \expect X\) Definition of Expectation

$\blacksquare$


Also known as

Some sources refer to this as the partition theorem, which causes ambiguity, as that name is also used for other results.
