Expectation of Product of Independent Random Variables is Product of Expectations


Theorem

Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $X$ and $Y$ be non-negative real-valued random variables that are independent.


Then:

$\expect {X Y} = \expect X \expect Y$
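As an informal sanity check (not part of the proof), both sides can be estimated by Monte Carlo simulation. In the following sketch the exponential and gamma distributions are illustrative assumptions only; any pair of independent non-negative random variables would do:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Independent non-negative random variables; the distributions are
# illustrative assumptions, not part of the theorem statement.
X = rng.exponential(scale=2.0, size=n)        # E[X] = 2
Y = rng.gamma(shape=3.0, scale=1.0, size=n)   # E[Y] = 3

print(np.mean(X * Y))           # Monte Carlo estimate of E[XY]
print(np.mean(X) * np.mean(Y))  # Monte Carlo estimate of E[X] E[Y]
```

Both printed estimates should agree with $\expect X \expect Y = 2 \times 3 = 6$ up to sampling error.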


Corollary

Let $X$ and $Y$ be integrable random variables that are independent.


Then:

$\expect {X Y} = \expect X \expect Y$


Proof

We first prove the claim in the case that $X = \chi_A$ for $A \in \Sigma$ and $Y = \chi_B$ for $B \in \Sigma$.

In particular, we have $A \in \map \sigma X$ and $B \in \map \sigma Y$, where $\map \sigma X$ and $\map \sigma Y$ are the $\sigma$-algebras generated by $X$ and $Y$ respectively.

Then, we have:

\(\ds \expect {X Y}\) \(=\) \(\ds \expect {\chi_A \cdot \chi_B}\)
\(\ds \) \(=\) \(\ds \expect {\chi_{A \cap B} }\) Characteristic Function of Intersection
\(\ds \) \(=\) \(\ds \map \Pr {A \cap B}\) Integral of Characteristic Function
\(\ds \) \(=\) \(\ds \map \Pr A \map \Pr B\) Definition of Independent Sigma-Algebras
\(\ds \) \(=\) \(\ds \expect {\chi_A} \expect {\chi_B}\) Integral of Characteristic Function
\(\ds \) \(=\) \(\ds \expect X \expect Y\)
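As an aside, this base case can be illustrated numerically. The sketch below (not part of the proof) builds independent events $A$ and $B$ from independent uniform samples; the probabilities $p$ and $q$ are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
p, q = 0.3, 0.6   # illustrative event probabilities

# Independent events A = {U < p} and B = {V < q} from independent uniforms.
U = rng.random(n)
V = rng.random(n)
chi_A = (U < p).astype(float)
chi_B = (V < q).astype(float)

print(np.mean(chi_A * chi_B))           # estimates Pr(A ∩ B) = p * q = 0.18
print(np.mean(chi_A) * np.mean(chi_B))  # estimates Pr(A) Pr(B) = 0.18
```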


Now let $X$ and $Y$ be general non-negative real-valued random variables.

Let $X'$ and $Y'$ be positive simple real-valued random variables that are $\map \sigma X$-measurable and $\map \sigma Y$-measurable respectively.

From Simple Function has Standard Representation, there exist:

a finite sequence $a_1, \ldots, a_n$ of real numbers
a partition $A_1, A_2, \ldots, A_n$ of $\Omega$ into $\map \sigma X$-measurable sets

with:

$\ds X' = \sum_{i \mathop = 1}^n a_i \chi_{A_i}$

and:

a finite sequence $b_1, \ldots, b_m$ of real numbers
a partition $B_1, B_2, \ldots, B_m$ of $\Omega$ into $\map \sigma Y$-measurable sets

with:

$\ds Y' = \sum_{j \mathop = 1}^m b_j \chi_{B_j}$

Then we have:

\(\ds \expect {X' Y'}\) \(=\) \(\ds \expect {\paren {\sum_{i \mathop = 1}^n a_i \chi_{A_i} } \paren {\sum_{j \mathop = 1}^m b_j \chi_{B_j} } }\)
\(\ds \) \(=\) \(\ds \expect {\sum_{i \mathop = 1}^n \sum_{j \mathop = 1}^m a_i b_j \chi_{A_i} \chi_{B_j} }\)
\(\ds \) \(=\) \(\ds \expect {\sum_{i \mathop = 1}^n \sum_{j \mathop = 1}^m a_i b_j \chi_{A_i \cap B_j} }\) Characteristic Function of Intersection
\(\ds \) \(=\) \(\ds \sum_{i \mathop = 1}^n \sum_{j \mathop = 1}^m a_i b_j \map \Pr {A_i \cap B_j}\) Integral of Characteristic Function
\(\ds \) \(=\) \(\ds \sum_{i \mathop = 1}^n \sum_{j \mathop = 1}^m a_i b_j \map \Pr {A_i} \map \Pr {B_j}\) Definition of Independent Sigma-Algebras
\(\ds \) \(=\) \(\ds \paren {\sum_{i \mathop = 1}^n a_i \map \Pr{A_i} } \paren {\sum_{j \mathop = 1}^m b_j \map \Pr {B_j} }\)
\(\ds \) \(=\) \(\ds \expect {\sum_{i \mathop = 1}^n a_i \chi_{A_i} } \expect {\sum_{j \mathop = 1}^m b_j \chi_{B_j} }\) Integral of Characteristic Function, Expectation is Linear
\(\ds \) \(=\) \(\ds \expect {X'} \expect {Y'}\)
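Again as an informal illustration, the simple-function case can be checked numerically. In the sketch below the partitions are generated from independent uniforms, so the induced $\sigma$-algebras are independent; the coefficients $a_i$, $b_j$ and partition sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6

U = rng.random(n)  # drives the partition A_1, A_2, A_3
V = rng.random(n)  # drives the partition B_1, B_2 (independent of U)

a = np.array([1.0, 4.0, 2.5])  # values a_i of X' on A_i
b = np.array([3.0, 0.5])       # values b_j of Y' on B_j

# X' = sum_i a_i chi_{A_i} with A_i = {(i-1)/3 <= U < i/3}; similarly Y'.
X_simple = a[np.minimum((U * 3).astype(int), 2)]
Y_simple = b[np.minimum((V * 2).astype(int), 1)]

print(np.mean(X_simple * Y_simple))           # estimates E[X' Y']
print(np.mean(X_simple) * np.mean(Y_simple))  # estimates E[X'] E[Y']
```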

Now, from Measurable Function is Pointwise Limit of Simple Functions, there exist increasing sequences of positive simple random variables $\sequence {X_n}_{n \mathop \in \N}$, $\sequence {Y_n}_{n \mathop \in \N}$ with:

$\ds \map X \omega = \lim_{n \mathop \to \infty} \map {X_n} \omega$

and:

$\ds \map Y \omega = \lim_{n \mathop \to \infty} \map {Y_n} \omega$

for all $\omega \in \Omega$.

Furthermore, we can show that $X_n$ is $\map \sigma X$-measurable and $Y_n$ is $\map \sigma Y$-measurable for each $n \in \N$.

Note that from Measurable Function is Pointwise Limit of Simple Functions, we have:

$\map {X_n} \omega = \ds \sum_{k \mathop = 0}^{n 2^n} k 2^{-n} \map {\chi_{ {A_k}^n} } \omega$

where:

${A_k}^n := \begin{cases} \set {k 2^{-n} \le X < \paren {k + 1} 2^{-n} } & : k \ne n 2^n \\ \set {X \ge n} & : k = n 2^n \end{cases}$

Each ${A_k}^n$ is $\map \sigma X$-measurable, since by Characterization of Sigma-Algebra Generated by Collection of Mappings, $\map \sigma X$ is generated by the preimages of Borel sets under $X$.

So from Characteristic Function Measurable iff Set Measurable and Pointwise Sum of Measurable Functions is Measurable, we have that $X_n$ is $\map \sigma X$-measurable for each $n \in \N$.

Similarly, $Y_n$ is $\map \sigma Y$-measurable for each $n \in \N$.
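In closed form, this construction reads $\map {X_n} \omega = \min \set {2^{-n} \floor {2^n \map X \omega}, n}$. The following sketch (an informal illustration, not part of the proof) shows it increasing pointwise towards $X$:

```python
import numpy as np

def dyadic_approx(x, n):
    """X_n = min(floor(2^n x) / 2^n, n), matching the cases displayed above."""
    return np.minimum(np.floor((2.0 ** n) * x) / (2.0 ** n), n)

x = np.array([0.1, 0.7, 2.3, 5.9, 100.0])
for n in [1, 2, 4, 8, 16]:
    print(n, dyadic_approx(x, n))  # non-decreasing in n, converging to x
```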

So, for each $n \in \N$ we have from our previous computation on positive simple random variables:

$\expect {X_n Y_n} = \expect {X_n} \expect {Y_n}$

Since $\sequence {X_n}_{n \mathop \in \N}$ and $\sequence {Y_n}_{n \mathop \in \N}$ are non-negative and increasing, $\sequence {X_n Y_n}_{n \mathop \in \N}$ is increasing with pointwise limit $X Y$, so we have:

$\expect {X_n Y_n} \to \expect {X Y}$

by the Monotone Convergence Theorem (Measure Theory).
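Numerically (an informal illustration only, with exponential distributions chosen as an assumed example), this convergence can be observed by applying the dyadic approximation above to independent samples:

```python
import numpy as np

def dyadic_approx(x, n):
    # X_n = min(floor(2^n x) / 2^n, n), as constructed above.
    return np.minimum(np.floor((2.0 ** n) * x) / (2.0 ** n), n)

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=10**6)  # samples of X, E[X] = 1
y = rng.exponential(scale=2.0, size=10**6)  # independent samples of Y, E[Y] = 2

for n in [1, 2, 4, 8]:
    # E[X_n Y_n] increases towards E[XY] = E[X] E[Y] = 2 by independence.
    print(n, np.mean(dyadic_approx(x, n) * dyadic_approx(y, n)))
```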

Also applying the Monotone Convergence Theorem (Measure Theory) we have:

$\expect {X_n} \to \expect X$

and:

$\expect {Y_n} \to \expect Y$

So we have:

$\expect {X_n} \expect {Y_n} \to \expect X \expect Y$

and so:

$\expect {X Y} = \expect X \expect Y$

as desired.

$\blacksquare$