Expectation of Binomial Distribution

From ProofWiki

Theorem

Let $X$ be a discrete random variable with the binomial distribution with parameters $n$ and $p$.


Then the expectation of $X$ is given by:

$E \left({X}\right) = n p$
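As a quick numerical sanity check (not part of the proof), the expectation can be summed directly from its definition over the support $\{0, 1, \ldots, n\}$; the helper name `binomial_expectation` is illustrative, not standard:

```python
from math import comb

def binomial_expectation(n, p):
    """Sum x * Pr(X = x) over the support of the binomial distribution,
    following the definition of expectation directly."""
    q = 1 - p
    return sum(k * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

# For any parameters the sum agrees with n p up to floating-point error.
assert abs(binomial_expectation(10, 0.3) - 10 * 0.3) < 1e-12
```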


Proof 1

From the definition of expectation:

$\displaystyle E \left({X}\right) = \sum_{x \mathop \in \Omega_X} x \Pr \left({X = x}\right)$


Thus:

\(\displaystyle E \left({X}\right)\) \(=\) \(\displaystyle \sum_{k \mathop = 0}^n k \binom n k p^k q^{n - k}\) $\quad$ Definition of Binomial Distribution, with $p + q = 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \sum_{k \mathop = 1}^n k \binom n k p^k q^{n - k}\) $\quad$ since for $k = 0$, $k \dbinom n k p^k q^{n - k} = 0$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \sum_{k \mathop = 1}^n n \binom {n - 1} {k - 1} p^k q^{n - k}\) $\quad$ Factors of Binomial Coefficient: $k \dbinom n k = n \dbinom {n - 1} {k - 1}$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \sum_{k \mathop = 1}^n \binom {n - 1} {k - 1} p^{k - 1} q^{\left({n - 1}\right) - \left({k - 1}\right)}\) $\quad$ taking out $n p$ and using $\left({n - 1}\right) - \left({k - 1}\right) = n-k$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j}\) $\quad$ putting $m = n - 1, j = k - 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p\) $\quad$ Binomial Theorem and $p + q = 1$ $\quad$

$\blacksquare$
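The pivotal step above is the identity $k \dbinom n k = n \dbinom {n - 1} {k - 1}$, which lets the factor $n$ (and then $p$) be taken outside the sum. A small exhaustive check of this identity, as an illustration only:

```python
from math import comb

# Verify the absorption identity k * C(n, k) == n * C(n-1, k-1),
# the step that pulls n out of the sum in Proof 1.
for n in range(1, 25):
    for k in range(1, n + 1):
        assert k * comb(n, k) == n * comb(n - 1, k - 1)
```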


Proof 2

From Bernoulli Process as Binomial Distribution, we see that $X$ as defined here is a sum of discrete random variables $Y_i$ that model the Bernoulli distribution:

$\displaystyle X = \sum_{i \mathop = 1}^n Y_i$

The Bernoulli trials are independent of one another, by definition of a Bernoulli process. It follows that:

\(\displaystyle E \left({X}\right)\) \(=\) \(\displaystyle E \left({ \sum_{i \mathop = 1}^n Y_i }\right)\) $\quad$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \sum_{i \mathop = 1}^n E \left({Y_i}\right)\) $\quad$ Sum of Expectations of Independent Trials $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \sum_{i \mathop = 1}^n p\) $\quad$ Expectation of Bernoulli Distribution $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p\) $\quad$ Sum of Identical Terms $\quad$

$\blacksquare$
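The construction of $X$ as a sum of $n$ independent Bernoulli trials can also be illustrated by simulation; the estimated mean approaches $n p$. This sketch (with hypothetical helper names and an arbitrary trial count) is not part of the proof:

```python
import random

def simulate_binomial_mean(n, p, trials=100_000, seed=0):
    """Estimate E(X) by realising X as a sum of n independent
    Bernoulli(p) indicator variables, as in the Bernoulli-process
    construction, and averaging over many samples."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += sum(1 for _ in range(n) if rng.random() < p)
    return total / trials
```

With, say, $n = 8$ and $p = 0.25$, the estimate comes out close to $n p = 2$.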


Proof 3

From the Probability Generating Function of Binomial Distribution, we have:

$\displaystyle \Pi_X \left({s}\right) = \left({q + ps}\right)^n$

where $q = 1 - p$.


From Expectation of Discrete Random Variable from PGF, we have:

$\displaystyle E \left({X}\right) = \Pi'_X \left({1}\right)$


We have:

\(\displaystyle \Pi'_X \left({s}\right)\) \(=\) \(\displaystyle \frac {\mathrm d} {\mathrm d s} \left({q + ps}\right)^n\) $\quad$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \left({q + ps}\right)^{n-1}\) $\quad$ Derivatives of PGF of Binomial Distribution $\quad$


Plugging in $s = 1$:

$\displaystyle\Pi'_X \left({1}\right) = n p \left({q + p}\right)$


Hence the result, as $q + p = 1$.

$\blacksquare$
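The derivative computation can be checked numerically: a central-difference approximation of $\Pi'_X \left({1}\right)$ agrees with $n p$. The function names below are illustrative, and the step size $h$ is an arbitrary choice:

```python
def pgf(s, n, p):
    """PGF of the binomial distribution: Pi_X(s) = (q + p s)^n."""
    q = 1 - p
    return (q + p * s) ** n

def pgf_derivative_at_1(n, p, h=1e-6):
    """Approximate Pi'_X(1) by a central difference; Proof 3 shows
    the exact value is n p (q + p)^(n-1) = n p."""
    return (pgf(1 + h, n, p) - pgf(1 - h, n, p)) / (2 * h)
```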

