Probability Mass Function of Binomial Distribution

Theorem
The probability mass function (pmf) of a binomially distributed random variable $X$ is equal to:


 * $\displaystyle \Pr \left({X=x}\right) = \binom n x p^x(1-p)^{n-x} $

where $n$ is the number of trials and $p$ is the probability of success.
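The formula can be checked numerically. The following is a minimal sketch (the function name `binomial_pmf` and the sample values of $n$ and $p$ are illustrative choices, not from the source) verifying that the pmf sums to $1$ over $x = 0, 1, \ldots, n$, as any pmf must:

```python
from math import comb

def binomial_pmf(x, n, p):
    """Pr(X = x) = C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Sanity check: the pmf sums to 1 over x = 0..n (the binomial theorem
# applied to (p + (1 - p))^n).
n, p = 10, 0.3
total = sum(binomial_pmf(x, n, p) for x in range(n + 1))
print(round(total, 12))  # → 1.0
```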

Proof
Let $\displaystyle B_i: \ i=1, 2, \ldots, \binom n x$ be events such that:


 * (i) $B_i$ is the $i^{th}$ possible way to see $x$ successes in $n$ Bernoulli trials


 * (ii) $\forall \ i \ne j: B_i \cap B_j = \varnothing$

We can see that:


 * $ \forall i: \Pr \left({B_i}\right) = p^x \left({1-p}\right)^{n-x} $

This is true since there will be $x$ successes, each with probability $p$ of occurring, and $n-x$ failures, each with probability $1-p$ of occurring.

Furthermore, the trials in a Bernoulli process are independent, so the probability of any particular sequence of outcomes is the product of the probabilities of its individual outcomes, and the result follows.

See Bernoulli Process as Binomial Distribution for further analysis of this.
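The claim that every sequence with exactly $x$ successes has the same probability can be verified by brute force. This sketch (with illustrative values of $n$, $p$ and $x$) enumerates all outcome sequences of $n$ independent trials and checks each one containing $x$ successes against $p^x (1-p)^{n-x}$:

```python
from itertools import product
from math import isclose

n, p, x = 4, 0.3, 2

def seq_prob(seq, p):
    """Probability of one particular sequence of independent trials,
    encoded as a tuple of 1s (successes) and 0s (failures)."""
    prob = 1.0
    for trial in seq:
        prob *= p if trial == 1 else (1 - p)
    return prob

# Every sequence with exactly x successes has probability p^x (1-p)^(n-x).
target = p**x * (1 - p)**(n - x)
for seq in product((0, 1), repeat=n):
    if sum(seq) == x:
        assert isclose(seq_prob(seq, p), target)
```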

Now our task becomes finding:


 * $\displaystyle \Pr \left({X=x}\right) = \Pr \left({\bigcup_{i=1}^{\binom n x} B_i}\right)$, that is, the probability that any one of the $\binom n x$ mutually exclusive outcomes occurs.

Then by the Inclusion-Exclusion Principle, considered as an extension of the Addition Law of Probability, we have that for any finite union of events:


 * $\displaystyle \Pr \left({\bigcup_{i=1}^{n} A_i}\right) = \sum_{i=1}^{n} \Pr \left({A_i}\right) - \sum_{1 \le i < j \le n} \Pr \left({A_i \cap A_j}\right) + \cdots + \left({-1}\right)^{n+1} \Pr \left({\bigcap_{i=1}^{n} A_i}\right)$
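The Inclusion-Exclusion Principle can be checked on a small example. This sketch uses three hypothetical overlapping events on a uniform ten-point sample space (the events themselves are illustrative, not from the source) and compares the alternating sum over all non-empty sub-collections against the probability of the union computed directly:

```python
from itertools import combinations

# Uniform probability on the sample space {0, ..., 9}.
omega = set(range(10))
events = [{0, 1, 2, 3}, {2, 3, 4, 5}, {5, 6, 7}]
pr = lambda s: len(s) / len(omega)

union = set().union(*events)

# Inclusion-Exclusion: alternating sum over all non-empty sub-collections,
# with sign (-1)^(k+1) for sub-collections of size k.
total = 0.0
for k in range(1, len(events) + 1):
    for sub in combinations(events, k):
        total += (-1) ** (k + 1) * pr(set.intersection(*sub))

assert abs(total - pr(union)) < 1e-12
```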

Fortunately in this case the above reduces to:


 * $\displaystyle \Pr \left({\bigcup_{i=1}^{n} A_i}\right) = \sum_{i=1}^{n} \Pr \left({A_i}\right)$

since the events $B_i$ are pairwise disjoint: every intersection of two or more distinct events is $\varnothing$, and $\Pr \left({\varnothing}\right) = 0$.
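The final step, summing the identical probabilities of the $\binom n x$ disjoint events, can also be verified exhaustively. This sketch (with illustrative values of $n$, $p$ and $x$) lists the sequences with exactly $x$ successes, confirms there are $\binom n x$ of them, and checks that their total probability equals the pmf:

```python
from itertools import product
from math import comb, isclose

n, p, x = 5, 0.4, 2

# The events B_i partition {X = x}: one event per sequence (tuple of 1s
# and 0s) containing exactly x successes.
seqs = [s for s in product((0, 1), repeat=n) if sum(s) == x]
assert len(seqs) == comb(n, x)  # there are C(n, x) such sequences

# Disjoint union: Pr(X = x) is the sum of the identical Pr(B_i).
pr_x = sum(p**sum(s) * (1 - p)**(n - sum(s)) for s in seqs)
assert isclose(pr_x, comb(n, x) * p**x * (1 - p)**(n - x))
```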

Thus we have:


 * $\displaystyle \Pr \left({X=x}\right) = \sum_{i=1}^{\binom n x} \Pr \left({B_i}\right) = \binom n x p^x \left({1-p}\right)^{n-x}$

$\blacksquare$