# Variance of Binomial Distribution

## Theorem

Let $X$ be a discrete random variable with the binomial distribution with parameters $n$ and $p$.

Then the variance of $X$ is given by:

$\operatorname{var} \left({X}\right) = n p \left({1-p}\right)$
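As an illustrative sanity check (not part of the proof), the closed form can be compared against the variance computed directly from the probability mass function. This sketch assumes Python with the standard library's `math.comb`; the function name `binomial_variance_from_pmf` is ours, not from any reference:

```python
from math import comb

def binomial_variance_from_pmf(n, p):
    """Compute var(X) for X ~ B(n, p) directly from the pmf,
    using var(X) = E(X^2) - (E(X))^2."""
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    mean = sum(k * pk for k, pk in enumerate(pmf))
    return sum(k**2 * pk for k, pk in enumerate(pmf)) - mean**2

# Agrees with the closed form n p (1 - p):
n, p = 10, 0.3
assert abs(binomial_variance_from_pmf(n, p) - n * p * (1 - p)) < 1e-12
```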

## Proof 1

From the definition of Variance as Expectation of Square minus Square of Expectation:

$\operatorname{var} \left({X}\right) = E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2$
$\displaystyle E \left({X^2}\right) = \sum_{x \mathop \in \operatorname{Im} \left({X}\right)} x^2 \Pr \left({X = x}\right)$

To simplify the algebra a bit, let $q = 1 - p$, so $p+q = 1$.

So:

$$\begin{aligned}
E \left({X^2}\right) &= \sum_{k \mathop = 0}^n k^2 \binom n k p^k q^{n - k} && \text{Definition of Binomial Distribution, with } p + q = 1 \\
&= \sum_{k \mathop = 0}^n k n \binom {n - 1} {k - 1} p^k q^{n - k} && \text{Factors of Binomial Coefficients: } k \binom n k = n \binom {n - 1} {k - 1} \\
&= n p \sum_{k \mathop = 1}^n k \binom {n - 1} {k - 1} p^{k - 1} q^{\left({n - 1}\right) - \left({k - 1}\right)} && \text{change of limit: the } k = 0 \text{ term is zero} \\
&= n p \sum_{j \mathop = 0}^m \left({j + 1}\right) \binom m j p^j q^{m - j} && \text{putting } j = k - 1, \ m = n - 1 \\
&= n p \left({\sum_{j \mathop = 0}^m j \binom m j p^j q^{m - j} + \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j} }\right) && \text{splitting the sum in two} \\
&= n p \left({\sum_{j \mathop = 0}^m m \binom {m - 1} {j - 1} p^j q^{m - j} + \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j} }\right) && \text{Factors of Binomial Coefficients: } j \binom m j = m \binom {m - 1} {j - 1} \\
&= n p \left({\left({n - 1}\right) p \sum_{j \mathop = 1}^m \binom {m - 1} {j - 1} p^{j - 1} q^{\left({m - 1}\right) - \left({j - 1}\right)} + \sum_{j \mathop = 0}^m \binom m j p^j q^{m - j} }\right) && \text{change of limit: the } j = 0 \text{ term is zero} \\
&= n p \left({\left({n - 1}\right) p \left({p + q}\right)^{m - 1} + \left({p + q}\right)^m}\right) && \text{Binomial Theorem} \\
&= n p \left({\left({n - 1}\right) p + 1}\right) && \text{as } p + q = 1 \\
&= n^2 p^2 + n p \left({1 - p}\right) && \text{by algebra}
\end{aligned}$$

Then:

$$\begin{aligned}
\operatorname{var} \left({X}\right) &= E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2 \\
&= n p \left({1 - p}\right) + n^2 p^2 - \left({n p}\right)^2 && \text{Expectation of Binomial Distribution: } E \left({X}\right) = n p \\
&= n p \left({1 - p}\right)
\end{aligned}$$

as required.

$\blacksquare$
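The key intermediate result of this proof, $E \left({X^2}\right) = n^2 p^2 + n p \left({1 - p}\right)$, can also be verified numerically. This is an illustrative check only, assuming Python's `math.comb`; the helper name `e_x_squared` is ours:

```python
from math import comb

def e_x_squared(n, p):
    """E(X^2) for X ~ B(n, p), summed directly over the pmf."""
    q = 1 - p
    return sum(k**2 * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

# Agrees with the derived form n^2 p^2 + n p (1 - p):
n, p = 12, 0.4
assert abs(e_x_squared(n, p) - (n**2 * p**2 + n * p * (1 - p))) < 1e-9
```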

## Proof 2

From Variance of Discrete Random Variable from PGF, we have:

$\operatorname{var} \left({X}\right) = \Pi''_X \left({1}\right) + \mu - \mu^2$

where $\mu = E \left({X}\right)$ is the expectation of $X$.

From the Probability Generating Function of Binomial Distribution, we have:

$\Pi_X \left({s}\right) = \left({q + ps}\right)^n$

where $q = 1 - p$.

From Expectation of Binomial Distribution, we have:

$\mu = n p$

From Derivatives of PGF of Binomial Distribution, we have:

$\Pi''_X \left({s}\right) = n \left({n-1}\right) p^2 \left({q + ps}\right)^{n-2}$

Setting $s = 1$ in the formula $\operatorname{var} \left({X}\right) = \Pi''_X \left({1}\right) + \mu - \mu^2$:

$\operatorname{var} \left({X}\right) = n \left({n - 1}\right) p^2 + n p - n^2 p^2 = n p \left({1 - p}\right)$

and hence the result.

$\blacksquare$
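The PGF identity used here can also be checked numerically: since $\Pi_X \left({s}\right) = \sum_k \Pr \left({X = k}\right) s^k$, we have $\Pi''_X \left({1}\right) = \sum_k k \left({k - 1}\right) \Pr \left({X = k}\right)$. The following sketch (illustrative only, assuming Python's `math.comb`; the function name is ours) confirms both $\Pi''_X \left({1}\right) = n \left({n - 1}\right) p^2$ and the resulting variance:

```python
from math import comb

def pgf_second_derivative_at_1(n, p):
    """Pi''_X(1) = sum over k of k (k - 1) Pr(X = k) for X ~ B(n, p)."""
    q = 1 - p
    return sum(k * (k - 1) * comb(n, k) * p**k * q**(n - k)
               for k in range(n + 1))

n, p = 8, 0.25
mu = n * p
# Pi''_X(1) matches the closed form n (n - 1) p^2:
assert abs(pgf_second_derivative_at_1(n, p) - n * (n - 1) * p**2) < 1e-12
# var(X) = Pi''_X(1) + mu - mu^2 recovers n p (1 - p):
var = pgf_second_derivative_at_1(n, p) + mu - mu**2
assert abs(var - n * p * (1 - p)) < 1e-12
```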

## Proof 3

From Bernoulli Process as Binomial Distribution, we see that $X$ as defined here is the sum of $n$ discrete random variables, each modelling a Bernoulli trial with parameter $p$.

These Bernoulli trials are independent of each other.

Hence we can use Sums of Variances of Independent Trials.

The Variance of Bernoulli Distribution is $p \left({1 - p}\right)$, so the variance of $B \left({n, p}\right)$ is $n p \left({1 - p}\right)$.

$\blacksquare$
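The structure of this proof can be illustrated numerically: convolving $n$ copies of the Bernoulli pmf gives the exact distribution of the sum, whose variance matches the sum of the $n$ individual variances, $n p \left({1 - p}\right)$. A minimal sketch (illustrative only; the helper `convolve` is ours):

```python
def convolve(a, b):
    """Pmf of the sum of two independent integer-valued variables."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

n, p = 6, 0.35
bern = [1 - p, p]          # pmf of a single Bernoulli trial
pmf = [1.0]                # pmf of the empty sum
for _ in range(n):
    pmf = convolve(pmf, bern)   # distribution of the running sum

mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean)**2 * pk for k, pk in enumerate(pmf))
# Matches n times the Bernoulli variance p (1 - p):
assert abs(var - n * p * (1 - p)) < 1e-12
```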