Linearity of Expectation Function


Theorem

Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $X$ and $Y$ be random variables on $\struct {\Omega, \Sigma, \Pr}$.

Let $\expect X$ denote the expectation of $X$.


Then:

$\forall \alpha, \beta \in \R: \expect {\alpha X + \beta Y} = \alpha \, \expect X + \beta \, \expect Y$
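
For illustration (this example is not part of the formal statement): let $X$ and $Y$ be the scores on two fair six-sided dice, not necessarily independent, so that $\expect X = \expect Y = \dfrac 7 2$. Taking $\alpha = 2$ and $\beta = 3$:

$\expect {2 X + 3 Y} = 2 \cdot \dfrac 7 2 + 3 \cdot \dfrac 7 2 = \dfrac {35} 2$

Note that no assumption of independence of $X$ and $Y$ is required.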


Proof

Discrete Random Variable

This follows directly from Expectation of Function of Joint Probability Mass Distribution:


\(\ds \expect {\alpha X + \beta Y}\) \(=\) \(\ds \sum_x \sum_y \paren {\alpha x + \beta y} \, \map \Pr {X = x, Y = y}\) Expectation of Function of Joint Probability Mass Distribution
\(\ds \) \(=\) \(\ds \alpha \sum_x x \sum_y \map \Pr {X = x, Y = y}\)
\(\ds \) \(+\) \(\ds \beta \sum_y y \sum_x \map \Pr {X = x, Y = y}\)
\(\ds \) \(=\) \(\ds \alpha \sum_x x \, \map \Pr {X = x} + \beta \sum_y y \, \map \Pr {Y = y}\) Definition of Marginal Probability Mass Function
\(\ds \) \(=\) \(\ds \alpha \, \expect X + \beta \, \expect Y\) Definition of Expectation

$\blacksquare$
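
The derivation above can also be checked numerically. The following Python sketch (illustrative only, not part of the proof) fixes a small hypothetical joint probability mass function and confirms that $\expect {\alpha X + \beta Y}$ computed from the joint distribution agrees with $\alpha \, \expect X + \beta \, \expect Y$ computed from the marginals:

    # Illustrative numerical check (not part of the proof): a small
    # hypothetical joint pmf Pr(X = x, Y = y) on a four-point support.
    alpha, beta = 2.0, -3.0

    joint_pmf = {
        (0, 1): 0.1, (0, 2): 0.2,
        (1, 1): 0.3, (1, 2): 0.4,
    }

    # E[alpha X + beta Y] computed directly from the joint pmf
    lhs = sum((alpha * x + beta * y) * p for (x, y), p in joint_pmf.items())

    # alpha E[X] + beta E[Y] computed via the marginal pmfs
    e_x = sum(x * p for (x, y), p in joint_pmf.items())
    e_y = sum(y * p for (x, y), p in joint_pmf.items())
    rhs = alpha * e_x + beta * e_y

    print(lhs, rhs)            # both equal -3.4
    assert abs(lhs - rhs) < 1e-9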


Continuous Random Variable

Let $\map {\operatorname {supp} } X$ and $\map {\operatorname {supp} } Y$ be the supports of $X$ and $Y$ respectively.

Let $f_{X, Y} : \map {\operatorname {supp} } X \times \map {\operatorname {supp} } Y \to \R$ be the joint probability density function of $X$ and $Y$.

Let $f_X$ and $f_Y$ be the marginal probability density functions of $X$ and $Y$.


Then:

\(\ds \expect {\alpha X + \beta Y}\) \(=\) \(\ds \int_{y \in \map {\operatorname {supp} } Y} \int_{x \in \map {\operatorname {supp} } X} \paren {\alpha x + \beta y} \map {f_{X, Y} } {x, y} \rd x \rd y\)
\(\ds \) \(=\) \(\ds \alpha \int_{y \in \map {\operatorname {supp} } Y} \int_{x \in \map {\operatorname {supp} } X} x \map {f_{X, Y} } {x, y} \rd x \rd y + \beta \int_{y \in \map {\operatorname {supp} } Y} \int_{x \in \map {\operatorname {supp} } X} y \map {f_{X, Y} } {x, y} \rd x \rd y\) Linear Combination of Definite Integrals
\(\ds \) \(=\) \(\ds \alpha \int_{x \in \map {\operatorname {supp} } X} x \paren {\int_{y \in \map {\operatorname {supp} } Y} \map {f_{X, Y} } {x, y} \rd y} \rd x + \beta \int_{y \in \map {\operatorname {supp} } Y} y \paren {\int_{x \in \map {\operatorname {supp} } X} \map {f_{X, Y} } {x, y} \rd x} \rd y\) rewriting
\(\ds \) \(=\) \(\ds \alpha \int_{x \in \map {\operatorname {supp} } X} x \map {f_X} x \rd x + \beta \int_{y \in \map {\operatorname {supp} } Y} y \map {f_Y} y \rd y\) Definition of Marginal Probability Density Function
\(\ds \) \(=\) \(\ds \alpha \expect X + \beta \expect Y\) Definition of Expectation of Continuous Random Variable

$\blacksquare$
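
As with the discrete case, the result can be checked numerically. The following Python sketch is illustrative only: the joint density $\map {f_{X, Y} } {x, y} = x + y$ on the unit square is a hypothetical choice, with marginals $\map {f_X} x = x + \dfrac 1 2$ and $\map {f_Y} y = y + \dfrac 1 2$. It compares the double integral on the left with the marginal-based expression on the right, using SciPy's quadrature routines:

    # Illustrative numerical check (not part of the proof), assuming the
    # hypothetical joint density f(x, y) = x + y on the unit square.
    from scipy.integrate import dblquad, quad

    alpha, beta = 2.0, -3.0

    # E[alpha X + beta Y] from the joint density;
    # dblquad expects its integrand as func(y, x) with y the inner variable
    lhs, _ = dblquad(lambda y, x: (alpha * x + beta * y) * (x + y),
                     0, 1, lambda x: 0, lambda x: 1)

    # alpha E[X] + beta E[Y] from the marginal densities (E[X] = E[Y] = 7/12)
    e_x, _ = quad(lambda x: x * (x + 0.5), 0, 1)
    e_y, _ = quad(lambda y: y * (y + 0.5), 0, 1)
    rhs = alpha * e_x + beta * e_y

    print(lhs, rhs)            # both approximately -7/12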


Sources