Sum of Expectations of Independent Trials/Proof 1


Theorem

Let $\EE_1, \EE_2, \ldots, \EE_n$ be a sequence of experiments whose outcomes are independent of each other.

Let $X_1, X_2, \ldots, X_n$ be discrete random variables on $\EE_1, \EE_2, \ldots, \EE_n$ respectively.


Let $\expect {X_j}$ denote the expectation of $X_j$ for $j \in \set {1, 2, \ldots, n}$.


Then we have, whenever both sides are defined:

$\ds \expect {\sum_{j \mathop = 1}^n X_j} = \sum_{j \mathop = 1}^n \expect {X_j}$


That is, the expectation of the sum equals the sum of the expectations.
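
As an illustration (not part of the formal proof), the identity can be checked numerically for a hypothetical concrete case: the sum of two independent fair six-sided dice. A minimal Python sketch, assuming nothing beyond the theorem statement:

```python
import itertools

# Two independent fair six-sided dice: Pr(X_j = x) = 1/6 for x in 1..6.
faces = range(1, 7)
p = 1.0 / 6.0

# Expectation of a single die: 3.5.
e_single = sum(x * p for x in faces)

# Expectation of the sum, computed from the joint distribution,
# which factors into a product of the marginals by independence.
e_sum = sum((x1 + x2) * p * p for x1, x2 in itertools.product(faces, faces))

assert abs(e_sum - 2 * e_single) < 1e-12
print(e_sum, 2 * e_single)  # 7.0 7.0
```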


Proof

The proof proceeds by induction on the number of terms $n$ in the sum.

For all $n \in \Z_{> 0}$, let $\map P n$ be the proposition:

$\ds \expect {\sum_{j \mathop = 1}^n X_j} = \sum_{j \mathop = 1}^n \expect {X_j}$


Basis for the Induction

$\map P 1$ is the case:

$\ds \expect {\sum_{j \mathop = 1}^1 X_j} = \sum_{j \mathop = 1}^1 \expect {X_j}$

That is:

$\expect {X_1} = \expect {X_1}$

which is tautologically true.


This is our basis for the induction.


Induction Hypothesis

Now it needs to be shown that if $\map P k$ is true, where $k \ge 1$, then it logically follows that $\map P {k + 1}$ is true.


So this is the induction hypothesis:

$\ds \expect {\sum_{j \mathop = 1}^k X_j} = \sum_{j \mathop = 1}^k \expect {X_j}$


from which it is to be shown that:

$\ds \expect {\sum_{j \mathop = 1}^{k + 1} X_j} = \sum_{j \mathop = 1}^{k + 1} \expect {X_j}$


Induction Step

This is our induction step:


Denote $Y = \ds \sum_{j \mathop = 1}^k X_j$.

Since the outcomes of $\EE_1, \EE_2, \ldots, \EE_{k + 1}$ are independent, $X_{k + 1}$ is independent of each of $X_1, X_2, \ldots, X_k$, and hence of $Y$, which is a function of $X_1, X_2, \ldots, X_k$ alone.

Then we compute:

\(\ds \expect {\sum_{j \mathop = 1}^{k + 1} X_j}\) \(=\) \(\ds \expect {Y + X_{k + 1} }\)
\(\ds \) \(=\) \(\ds \sum_{s \mathop \in \R} s \, \map \Pr {Y + X_{k + 1} = s}\) Definition of Expectation
\(\ds \) \(=\) \(\ds \sum_{y \mathop \in \R} \sum_{x_{k + 1} \mathop \in \R} \paren {y + x_{k + 1} } \map \Pr {Y = y, X_{k + 1} = x_{k + 1} }\) Definition of Joint Probability Mass Function, partitioning the event $\set {Y + X_{k + 1} = s}$ over the values of $Y$ and $X_{k + 1}$
\(\ds \) \(=\) \(\ds \sum_{y \mathop \in \R} \sum_{x_{k + 1} \mathop \in \R} \paren {y + x_{k + 1} } \map \Pr {Y = y} \, \map \Pr {X_{k + 1} = x_{k + 1} }\) Independence of $Y$ and $X_{k + 1}$
\(\ds \) \(=\) \(\ds \sum_{y \mathop \in \R} \sum_{x_{k + 1} \mathop \in \R} y \, \map \Pr {Y = y} \, \map \Pr {X_{k + 1} = x_{k + 1} } + \sum_{y \mathop \in \R} \sum_{x_{k + 1} \mathop \in \R} x_{k + 1} \, \map \Pr {Y = y} \, \map \Pr {X_{k + 1} = x_{k + 1} }\) splitting the summation
\(\ds \) \(=\) \(\ds \sum_{y \mathop \in \R} y \, \map \Pr {Y = y} + \sum_{x_{k + 1} \mathop \in \R} x_{k + 1} \, \map \Pr {X_{k + 1} = x_{k + 1} }\) because $\ds \sum_{x_{k + 1} \mathop \in \R} \map \Pr {X_{k + 1} = x_{k + 1} } = 1 = \sum_{y \mathop \in \R} \map \Pr {Y = y}$
\(\ds \) \(=\) \(\ds \expect Y + \expect {X_{k + 1} }\) Definition of Expectation
\(\ds \) \(=\) \(\ds \sum_{j \mathop = 1}^{k + 1} \expect {X_j}\) by the Induction Hypothesis
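
The crucial manipulation above is splitting the double sum and then collapsing each inner sum of probabilities to $1$. As a sanity check (not part of the proof), the following Python sketch verifies that step for two small hypothetical distributions:

```python
# Two hypothetical independent discrete distributions, given as
# dictionaries mapping value -> probability (each sums to 1).
pY = {0: 0.2, 1: 0.5, 2: 0.3}
pX = {10: 0.4, 20: 0.6}

# Double sum over the product distribution, as on the left of the split.
lhs = sum((y + x) * py * px for y, py in pY.items() for x, px in pX.items())

# Split the sum and marginalise, as in the proof: the inner sums of
# probabilities each collapse to 1, leaving E[Y] + E[X].
rhs = sum(y * py for y, py in pY.items()) + sum(x * px for x, px in pX.items())

assert abs(lhs - rhs) < 1e-12
print(lhs, rhs)  # 17.1 17.1
```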

The result follows by the Principle of Mathematical Induction.

$\blacksquare$