Condition for Independence from Product of Expectations

Theorem
Let $\left({\Omega, \Sigma, \Pr}\right)$ be a probability space.

Let $X$ and $Y$ be discrete random variables on $\left({\Omega, \Sigma, \Pr}\right)$.

Let $E$ denote the expectation function.

Then $X$ and $Y$ are independent iff:
 * $E \left({g \left({X}\right) h \left({Y}\right)}\right) = E \left({g \left({X}\right)}\right) E \left({h \left({Y}\right)}\right)$

for all functions $g, h: \R \to \R$ for which the latter two expectations exist.

Corollary
Let $X$ and $Y$ be independent discrete random variables on $\left({\Omega, \Sigma, \Pr}\right)$.

Then:
 * $E \left({X Y}\right) = E \left({X}\right) E \left({Y}\right)$

assuming the latter expectations exist.

Further, let $X_1, X_2, \ldots, X_n$ be independent discrete random variables.

Then:
 * $\displaystyle E \left({\prod_{k=1}^n {X_k}}\right) = \prod_{k=1}^n E \left({X_k}\right)$

assuming the latter expectations exist.
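As a quick numerical sanity check of the corollary, the following Python sketch takes two independent fair dice (an illustrative assumption, not part of the theorem) and confirms $E \left({X Y}\right) = E \left({X}\right) E \left({Y}\right)$ by direct summation with exact rational arithmetic:

```python
from itertools import product
from fractions import Fraction

# A fair six-sided die: uniform pmf on {1, ..., 6}.
die = {k: Fraction(1, 6) for k in range(1, 7)}

# E(XY) from the joint distribution of two independent dice,
# whose joint pmf is the product of the marginals.
e_xy = sum(x * y * px * py
           for (x, px), (y, py) in product(die.items(), die.items()))

# E(X) = E(Y) from the common marginal.
e_x = sum(x * p for x, p in die.items())

assert e_xy == e_x * e_x   # both sides equal (7/2)^2 = 49/4
```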

Sufficient Condition
Suppose that $X$ and $Y$ are not independent.

That is:
 * $\Pr \left({X = a, Y = b}\right) \ne \Pr \left({X = a}\right)\Pr \left({Y = b}\right)$

for some $a, b \in \R$.

Now, suppose that
 * $E \left({g \left({X}\right) h \left({Y}\right)}\right) = E \left({g \left({X}\right)}\right) E \left({h \left({Y}\right)}\right)$

for all functions $g, h: \R \to \R$ for which the latter two expectations exist.

Let us define $g$ and $h$ as examples of such functions as follows:

 * $g \left({x}\right) := \begin{cases} 1 & : x = a \\ 0 & : x \ne a \end{cases}$

 * $h \left({y}\right) := \begin{cases} 1 & : y = b \\ 0 & : y \ne b \end{cases}$

where $a, b \in \R$ are the real numbers identified above for which independence fails.

Then:
 * $E \left({g \left({X}\right) h \left({Y}\right)}\right) = \Pr \left({X = a, Y = b}\right)$
 * $E \left({g \left({X}\right)}\right) = \Pr \left({X = a}\right)$
 * $E \left({h \left({Y}\right)}\right) = \Pr \left({Y = b}\right)$

So by the supposition on expectations:
 * $\Pr \left({X = a, Y = b}\right) = \Pr \left({X = a}\right) \Pr \left({Y = b}\right)$

But by the assumption of non-independence:
 * $\Pr \left({X = a, Y = b}\right) \ne \Pr \left({X = a}\right) \Pr \left({Y = b}\right)$

This contradicts our supposition that:
 * $E \left({g \left({X}\right) h \left({Y}\right)}\right) = E \left({g \left({X}\right)}\right) E \left({h \left({Y}\right)}\right)$

for all functions $g, h: \R \to \R$ for which the latter two expectations exist.

So, if that supposition holds, then $X$ and $Y$ must be independent.
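The indicator argument above can be checked numerically. The following Python sketch uses a hypothetical joint distribution of a dependent pair $(X, Y)$ (chosen purely for illustration) and shows that for indicator functions $g$ and $h$, the product identity for expectations fails exactly because the joint probability differs from the product of the marginals:

```python
from fractions import Fraction

# Hypothetical joint pmf of a dependent pair (X, Y): X and Y are always equal.
joint = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}

a, b = 0, 0

def g(x): return 1 if x == a else 0   # indicator of the event X = a
def h(y): return 1 if y == b else 0   # indicator of the event Y = b

# E(g(X) h(Y)) recovers the joint probability Pr(X = a, Y = b) = 1/2 ...
e_gh = sum(g(x) * h(y) * p for (x, y), p in joint.items())

# ... while E(g(X)) E(h(Y)) is the product of the marginals, here 1/2 * 1/2 = 1/4.
e_g = sum(g(x) * p for (x, y), p in joint.items())
e_h = sum(h(y) * p for (x, y), p in joint.items())

assert e_gh != e_g * e_h   # the product identity fails, witnessing dependence
```

Exact `Fraction` arithmetic is used so the inequality is not an artefact of floating-point rounding.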

Necessary Condition
Suppose $X$ and $Y$ are independent.

Let $g, h: \R \to \R$ be any real functions such that $E \left({g \left({X}\right)}\right)$ and $E \left({h \left({Y}\right)}\right)$ exist.

Then, summing over the values $x$ taken by $X$ and $y$ taken by $Y$:
 * $\displaystyle E \left({g \left({X}\right) h \left({Y}\right)}\right) = \sum_x \sum_y g \left({x}\right) h \left({y}\right) \Pr \left({X = x, Y = y}\right)$
 * $\displaystyle = \sum_x \sum_y g \left({x}\right) h \left({y}\right) \Pr \left({X = x}\right) \Pr \left({Y = y}\right)$ by independence of $X$ and $Y$
 * $\displaystyle = \left({\sum_x g \left({x}\right) \Pr \left({X = x}\right)}\right) \left({\sum_y h \left({y}\right) \Pr \left({Y = y}\right)}\right)$
 * $= E \left({g \left({X}\right)}\right) E \left({h \left({Y}\right)}\right)$

thus proving that:
 * $E \left({g \left({X}\right) h \left({Y}\right)}\right) = E \left({g \left({X}\right)}\right) E \left({h \left({Y}\right)}\right)$

for all such $g$ and $h$.
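The factorisation of the double sum can be verified directly in Python. The marginal distributions and the functions $g$, $h$ below are illustrative assumptions; for an independent pair the joint pmf is the product of the marginals, and the left- and right-hand sides of the identity agree exactly:

```python
from fractions import Fraction

# Hypothetical marginals of two independent discrete random variables.
p_x = {-1: Fraction(1, 4), 0: Fraction(1, 2), 1: Fraction(1, 4)}
p_y = {2: Fraction(1, 3), 5: Fraction(2, 3)}

def g(x): return x * x    # an example function of X
def h(y): return y + 1    # an example function of Y

# Left-hand side: double sum over the joint pmf Pr(X=x) Pr(Y=y).
lhs = sum(g(x) * h(y) * px * py
          for x, px in p_x.items() for y, py in p_y.items())

# Right-hand side: product of the two marginal expectations.
rhs = (sum(g(x) * px for x, px in p_x.items())
       * sum(h(y) * py for y, py in p_y.items()))

assert lhs == rhs
```

Any other choice of $g$ and $h$ with finite expectations would work equally well, since the factorisation depends only on independence, not on the particular functions.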

Proof of Corollary
This follows immediately from the main result, setting both $g$ and $h$ to the identity functions:
 * $\forall x \in \R: g \left({x}\right) = x$
 * $\forall y \in \R: h \left({y}\right) = y$

It follows directly that if $X$ and $Y$ are independent, then:
 * $E \left({X Y}\right) = E \left({X}\right) E \left({Y}\right)$

assuming the latter expectations exist.

An inductive proof can be used to demonstrate the general case.

Note on Converse to Corollary
Note that the converse of the corollary does not necessarily hold.

Suppose that:
 * $E \left({X Y}\right) = E \left({X}\right) E \left({Y}\right)$

Then it is not necessarily the case that $X$ and $Y$ are independent.

Example
Let $X$ be a discrete random variable whose distribution is defined as:
 * $p_X \left({-1}\right) = p_X \left({0}\right) = p_X \left({1}\right) = \frac 1 3$

Let $Y$ be the discrete random variable defined as:
 * $Y = \begin{cases} 0 & : X = 0 \\ 1 & : X \ne 0 \end{cases}$

We have:
 * $\Pr \left({X = 0, Y = 1}\right) = 0$

but:
 * $\Pr \left({X = 0}\right) \Pr \left({Y = 1}\right) = \frac 1 3 \times \frac 2 3 = \frac 2 9 \ne 0$

So $X$ and $Y$ are dependent.

But:
 * $E \left({X}\right) = \left({-1}\right) \times \frac 1 3 + 0 \times \frac 1 3 + 1 \times \frac 1 3 = 0$
 * $E \left({X Y}\right) = \left({-1}\right) \times 1 \times \frac 1 3 + 0 \times 0 \times \frac 1 3 + 1 \times 1 \times \frac 1 3 = 0$

So $E \left({X Y}\right) = 0 = E \left({X}\right) E \left({Y}\right)$, but $X$ and $Y$ are not independent.
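The counterexample can be reproduced in a few lines of Python, again with exact arithmetic. The joint pmf of $(X, Y)$ is built from the definition of $Y$ as a function of $X$, and both the product identity and the failure of independence are checked directly:

```python
from fractions import Fraction

# X is uniform on {-1, 0, 1}; Y = 0 if X == 0, else 1 (Y is a function of X).
p_x = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}
joint = {(x, 0 if x == 0 else 1): p for x, p in p_x.items()}   # joint pmf of (X, Y)

e_x = sum(x * p for x, p in p_x.items())              # E(X)  = 0
e_y = sum(y * p for (_, y), p in joint.items())       # E(Y)  = 2/3
e_xy = sum(x * y * p for (x, y), p in joint.items())  # E(XY) = 0

# The product identity holds ...
assert e_xy == e_x * e_y

# ... yet X and Y are not independent: Pr(X=0, Y=1) = 0, but Pr(X=0) Pr(Y=1) = 2/9.
pr_x0_y1 = joint.get((0, 1), Fraction(0))
pr_y1 = sum(p for (_, y), p in joint.items() if y == 1)
assert pr_x0_y1 != p_x[0] * pr_y1
```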