Expectation of Binomial Distribution


Theorem

Let $X$ be a discrete random variable with the binomial distribution with parameters $n$ and $p$ for some $n \in \N$ and $0 \le p \le 1$.


Then the expectation of $X$ is given by:

$\expect X = n p$
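
As a quick sanity check, independent of the proofs below, the defining sum can be evaluated directly. The following is a minimal Python sketch (the function name binomial_expectation is ours, for illustration only):

from math import comb

def binomial_expectation(n: int, p: float) -> float:
    # Evaluate E[X] = sum_k k * C(n, k) * p^k * (1 - p)^(n - k) directly.
    return sum(k * comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1))

# The direct sum agrees with n * p up to floating-point error.
for n, p in [(10, 0.3), (25, 0.5), (40, 0.9)]:
    assert abs(binomial_expectation(n, p) - n * p) < 1e-9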


Proof 1

From the definition of expectation:

$\displaystyle \expect X = \sum_{x \mathop \in \Omega_X} x \map \Pr {X = x}$

Thus:

\(\displaystyle \expect X\) \(=\) \(\displaystyle \sum_{k \mathop = 0}^n k \binom n k p^k q^{n - k}\) Definition of Binomial Distribution, with $p + q = 1$
\(\displaystyle \) \(=\) \(\displaystyle \sum_{k \mathop = 1}^n k \binom n k p^k q^{n - k}\) since for $k = 0$, $k \dbinom n k p^k q^{n - k} = 0$
\(\displaystyle \) \(=\) \(\displaystyle \sum_{k \mathop = 1}^n n \binom {n - 1} {k - 1} p^k q^{n - k}\) Factors of Binomial Coefficient: $k \dbinom n k = n \dbinom {n - 1} {k - 1}$
\(\displaystyle \) \(=\) \(\displaystyle n p \sum_{k \mathop = 1}^n \binom {n - 1} {k - 1} p^{k - 1} q^{\paren {n - 1} - \paren {k - 1} }\) taking out $n p$ and using $\paren {n - 1} - \paren {k - 1} = n - k$
\(\displaystyle \) \(=\) \(\displaystyle n p \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j}\) putting $m = n - 1, j = k - 1$
\(\displaystyle \) \(=\) \(\displaystyle n p\) Binomial Theorem and $p + q = 1$

$\blacksquare$
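
The crucial step in this proof is the absorption identity $k \dbinom n k = n \dbinom {n - 1} {k - 1}$. A short Python sketch verifying it exhaustively for small arguments:

from math import comb

# Check k * C(n, k) == n * C(n - 1, k - 1) for all 1 <= k <= n <= 30.
for n in range(1, 31):
    for k in range(1, n + 1):
        assert k * comb(n, k) == n * comb(n - 1, k - 1)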


Proof 2

From Bernoulli Process as Binomial Distribution, we see that $X$ as defined here is the sum of $n$ discrete random variables $Y_i$, each of which models the Bernoulli distribution with parameter $p$:

$\displaystyle X = \sum_{i \mathop = 1}^n Y_i$

By definition of a Bernoulli process, the Bernoulli trials are independent of each other. It follows that:

\(\displaystyle \expect X\) \(=\) \(\displaystyle \expect {\sum_{i \mathop = 1}^n Y_i}\)
\(\displaystyle \) \(=\) \(\displaystyle \sum_{i \mathop = 1}^n \expect {Y_i}\) Sum of Expectations of Independent Trials
\(\displaystyle \) \(=\) \(\displaystyle \sum_{i \mathop = 1}^n p\) Expectation of Bernoulli Distribution
\(\displaystyle \) \(=\) \(\displaystyle n p\) Sum of Identical Terms

$\blacksquare$
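
This argument is also easy to check by simulation. A minimal sketch, using only Python's standard library (parameter values are illustrative):

from random import random

def simulate_binomial_mean(n: int, p: float, trials: int = 200_000) -> float:
    # Each trial draws n independent Bernoulli(p) variables and sums them.
    total = 0
    for _ in range(trials):
        total += sum(1 for _ in range(n) if random() < p)
    return total / trials

# The sample mean should be close to n * p = 8.
print(simulate_binomial_mean(20, 0.4))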


Proof 3

From the Probability Generating Function of Binomial Distribution, we have:

$\displaystyle \map {\Pi_X} s = \paren {q + p s}^n$

where $q = 1 - p$.


From Expectation of Discrete Random Variable from PGF, we have:

$\displaystyle \expect X = \map {\Pi'_X} 1$


We have:

\(\displaystyle \map {\Pi'_X} s\) \(=\) \(\displaystyle \frac \d {\d s} \paren {q + p s}^n\)
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {q + p s}^{n - 1}\) Derivatives of PGF of Binomial Distribution


Setting $s = 1$:

$\displaystyle \map {\Pi'_X} 1 = n p \paren {q + p}^{n - 1}$


Hence the result, as $q + p = 1$.

$\blacksquare$
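
The differentiation can be reproduced symbolically. A sketch assuming the third-party SymPy library is available:

import sympy as sp

n, p, s = sp.symbols('n p s', positive=True)
pgf = (1 - p + p * s) ** n          # PGF of the binomial distribution
mean = sp.diff(pgf, s).subs(s, 1)   # E[X] = Pi'_X(1)
print(sp.simplify(mean))            # n*p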


Proof 4

From Moment Generating Function of Binomial Distribution, the moment generating function of $X$, $M_X$, is given by:

$\displaystyle \map {M_X} t = \paren {1 - p + pe^t}^n$

By Moment in terms of Moment Generating Function:

$\displaystyle \expect X = \map {M_X'} 0$

We have:

\(\displaystyle \map {M_X'} t\) \(=\) \(\displaystyle \frac \d {\d t} \paren {1 - p + pe^t}^n\)
\(\displaystyle \) \(=\) \(\displaystyle \frac \d {\d t} \paren {1 - p + pe^t} \frac \d {\d \paren {1 - p + pe^t} } \paren {1 - p + pe^t}^n\) Chain Rule
\(\displaystyle \) \(=\) \(\displaystyle n p e^t \paren {1 - p + pe^t}^{n - 1}\) Derivative of Exponential Function, Derivative of Power

Setting $t = 0$ gives:

\(\displaystyle \expect X\) \(=\) \(\displaystyle n p e^0 \paren {1 - p + pe^0}^{n - 1}\)
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {1 - p + p}^{n - 1}\) Exponential of Zero
\(\displaystyle \) \(=\) \(\displaystyle n p\)

$\blacksquare$
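
The same conclusion can be checked numerically by approximating $\map {M_X'} 0$ with a central difference (the step size and parameter values below are illustrative):

from math import exp

def mgf(t: float, n: int, p: float) -> float:
    # Moment generating function of the binomial distribution.
    return (1 - p + p * exp(t)) ** n

# Central-difference approximation of M_X'(0); should be close to n * p = 4.2.
n, p, h = 12, 0.35, 1e-6
derivative = (mgf(h, n, p) - mgf(-h, n, p)) / (2 * h)
assert abs(derivative - n * p) < 1e-4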

