Variance of Binomial Distribution

From ProofWiki

Theorem

Let $X$ be a discrete random variable with the binomial distribution with parameters $n$ and $p$.


Then the variance of $X$ is given by:

$\var X = n p \paren {1 - p}$


Proof 1

From the definition of Variance as Expectation of Square minus Square of Expectation:

$\var X = \expect {X^2} - \paren {\expect X}^2$

From Expectation of Function of Discrete Random Variable:

$\displaystyle \expect {X^2} = \sum_{x \mathop \in \Img X} x^2 \Pr \paren {X = x}$


To simplify the algebra a bit, let $q = 1 - p$, so $p + q = 1$.

So:

\(\displaystyle \expect {X^2}\) \(=\) \(\displaystyle \sum_{k \mathop = 0}^n k^2 \binom n k p^k q^{n - k}\) $\quad$ Definition of Binomial Distribution: $p + q = 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle \sum_{k \mathop = 0}^n k n \binom {n - 1} {k - 1} p^k q^{n - k}\) $\quad$ Factors of Binomial Coefficient: $k \dbinom n k = n \dbinom {n - 1} {k - 1}$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \sum_{k \mathop = 1}^n k \binom {n - 1} {k - 1} p^{k - 1} q^{\paren {n - 1} - \paren {k - 1} }\) $\quad$ Change of limit: term is zero when $k = 0$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \sum_{j \mathop = 0}^m \paren {j + 1} \binom m j p^j q^{m - j}\) $\quad$ putting $j = k - 1, m = n - 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {\sum_{j \mathop = 0}^m j \binom m j p^j q^{m - j} + \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j} }\) $\quad$ splitting sum up into two $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {\sum_{j \mathop = 0}^m m \binom {m - 1} {j - 1} p^j q^{m - j} + \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j} }\) $\quad$ Factors of Binomial Coefficient: $j \dbinom m j = m \dbinom {m - 1} {j - 1}$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {\paren {n - 1} p \sum_{j \mathop = 1}^m \binom {m - 1} {j - 1} p^{j - 1} q^{\paren {m - 1} - \paren {j - 1} } + \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j} }\) $\quad$ Change of limit: term is zero when $j = 0$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {\paren {n - 1} p \paren {p + q}^{m - 1} + \paren {p + q}^m}\) $\quad$ Binomial Theorem $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {\paren {n - 1} p + 1}\) $\quad$ as $p + q = 1$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n^2 p^2 + n p \paren {1 - p}\) $\quad$ by algebra $\quad$


Then:

\(\displaystyle \var X\) \(=\) \(\displaystyle \expect {X^2} - \paren {\expect X}^2\) $\quad$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {1 - p} + n^2 p^2 - \paren {n p}^2\) $\quad$ Expectation of Binomial Distribution: $\expect X = n p$ $\quad$
\(\displaystyle \) \(=\) \(\displaystyle n p \paren {1 - p}\) $\quad$ $\quad$

as required.

$\blacksquare$
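As a numerical sanity check (not part of the proof), the identity $\var X = \expect {X^2} - \paren {\expect X}^2$ can be evaluated directly from the binomial pmf and compared against the closed form $n p \paren {1 - p}$. The function name below is ours, chosen for illustration:

```python
from math import comb

def binomial_var_from_pmf(n: int, p: float) -> float:
    """Compute Var(X) = E[X^2] - (E[X])^2 directly from the binomial pmf."""
    q = 1.0 - p
    # pmf[k] = C(n, k) p^k q^(n - k) for k = 0, ..., n
    pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]
    e_x = sum(k * pk for k, pk in enumerate(pmf))
    e_x2 = sum(k * k * pk for k, pk in enumerate(pmf))
    return e_x2 - e_x * e_x

# Agrees with the closed form n p (1 - p) up to floating-point error:
n, p = 10, 0.3
assert abs(binomial_var_from_pmf(n, p) - n * p * (1 - p)) < 1e-12
```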


Proof 2

From Variance of Discrete Random Variable from PGF:

$\var X = \map {\Pi''_X} 1 + \mu - \mu^2$

where $\mu = \expect X$ is the expectation of $X$.


From the Probability Generating Function of Binomial Distribution:

$\map {\Pi_X} s = \paren {q + p s}^n$

where $q = 1 - p$.


From Expectation of Binomial Distribution:

$\mu = n p$


From Derivatives of PGF of Binomial Distribution:

$\map {\Pi''_X} s = n \paren {n - 1} p^2 \paren {q + p s}^{n - 2}$


Setting $s = 1$ and using the formula $\map {\Pi''_X} 1 + \mu - \mu^2$:

$\var X = n \paren {n - 1} p^2 + n p - n^2 p^2 = n^2 p^2 - n p^2 + n p - n^2 p^2 = n p \paren {1 - p}$

Hence the result.

$\blacksquare$
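The PGF route can also be checked numerically (again, not part of the proof): approximate $\map {\Pi''_X} 1$ by a central finite difference on $\map {\Pi_X} s = \paren {q + p s}^n$ and plug into $\var X = \map {\Pi''_X} 1 + \mu - \mu^2$. The helper name and step size below are our choices:

```python
def binomial_var_via_pgf(n: int, p: float, h: float = 1e-5) -> float:
    """Estimate Var(X) from the PGF Pi(s) = (q + p s)^n via
    Var X = Pi''(1) + mu - mu^2, with Pi''(1) approximated by a
    central finite difference of step h."""
    q = 1.0 - p
    pgf = lambda s: (q + p * s) ** n
    second_deriv = (pgf(1.0 + h) - 2.0 * pgf(1.0) + pgf(1.0 - h)) / (h * h)
    mu = n * p
    return second_deriv + mu - mu * mu

# Close to n p (1 - p), up to finite-difference error:
n, p = 12, 0.4
assert abs(binomial_var_via_pgf(n, p) - n * p * (1 - p)) < 1e-3
```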


Proof 3

From Bernoulli Process as Binomial Distribution, we see that $X$ as defined here is the sum of $n$ independent discrete random variables, each modelling a Bernoulli trial with parameter $p$.

These Bernoulli trials are independent of one another.

Hence we can use Sums of Variances of Independent Trials.

The Variance of Bernoulli Distribution is $p \paren {1 - p}$.

Thus the variance of $B \paren {n, p}$ is $n p \paren {1 - p}$.

$\blacksquare$
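A small Monte Carlo simulation (our illustration, not part of the proof) makes the sum-of-trials view concrete: simulate $X$ as a sum of $n$ independent Bernoulli trials and observe that the sample variance approaches $n p \paren {1 - p}$. The function name and trial count are our choices:

```python
import random

def simulate_binomial_variance(n: int, p: float,
                               trials: int = 100_000, seed: int = 0) -> float:
    """Sample variance of X = sum of n independent Bernoulli(p) trials."""
    rng = random.Random(seed)
    samples = [sum(1 for _ in range(n) if rng.random() < p)
               for _ in range(trials)]
    mean = sum(samples) / trials
    return sum((x - mean) ** 2 for x in samples) / trials

# For n = 8, p = 0.25 the sample variance should be close to
# n p (1 - p) = 1.5, within Monte Carlo error:
assert abs(simulate_binomial_variance(8, 0.25) - 1.5) < 0.05
```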


Sources