Probability Mass Function of Binomial Distribution

Theorem
The probability mass function (pmf) of a binomially distributed random variable $X$ is equal to:


 * $\displaystyle \Pr \left({X = x}\right) = \binom n x p^x(1-p)^{n-x}$

where $n$ is the number of trials, $p$ is the probability of success on each trial, and $x \in \left\{ {0, 1, \ldots, n}\right\}$.
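As a quick numerical sanity check (a sketch using only the Python standard library; the helper name `binomial_pmf` is ours, not part of the theorem), the pmf can be evaluated with `math.comb`, and by the Binomial Theorem its values sum to $1$ over $x = 0, 1, \ldots, n$:

```python
from math import comb

def binomial_pmf(x, n, p):
    # Pr(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3

# By the Binomial Theorem, summing the pmf over x = 0, ..., n gives 1.
total = sum(binomial_pmf(x, n, p) for x in range(n + 1))
```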

Proof
Let $\displaystyle B_i: i = 1, 2, \ldots, \binom n x$ be events such that:


 * $(1): \quad B_i$ is the $i$th possible way to see $x$ successes in $n$ Bernoulli trials


 * $(2): \quad \forall i \ne j: B_i \cap B_j = \varnothing$

We can see that:


 * $\forall i: \Pr \left({B_i}\right) = p^x \left({1-p}\right)^{n-x}$

This is true since there will be $x$ successes, each with probability $p$ of occurring, and $n-x$ failures each with probability $1-p$ of occurring.

Furthermore, the trials are by definition independent, so the probabilities of the individual outcomes multiply, and the result follows.

See Bernoulli Process as Binomial Distribution for further analysis of this.
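Both claims $(1)$ and $(2)$ above can be checked by brute force for small $n$ (a sketch; the encoding of a trial sequence as a tuple of $0$s and $1$s is our own choice): there are exactly $\binom n x$ sequences with $x$ successes, and each such sequence has probability $p^x \left({1-p}\right)^{n-x}$ by independence.

```python
from itertools import product
from math import comb, isclose

n, x, p = 5, 2, 0.4

# Enumerate every length-n sequence of outcomes (1 = success, 0 = failure)
# containing exactly x successes: each one is an event B_i.
sequences = [s for s in product((0, 1), repeat=n) if sum(s) == x]

# (1): there are C(n, x) ways to place x successes among n trials.
assert len(sequences) == comb(n, x)

# Each B_i has probability p^x * (1 - p)^(n - x), since the independent
# trial probabilities multiply along the sequence.
for s in sequences:
    prob = 1.0
    for outcome in s:
        prob *= p if outcome == 1 else 1 - p
    assert isclose(prob, p**x * (1 - p)**(n - x))
```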

Now our task becomes finding:


 * $\displaystyle \Pr \left({X = x}\right) = \Pr \left({\bigcup_{i \mathop = 1}^{\binom n x} B_i}\right)$ which is the probability of one of the $\binom n x$ outcomes occurring.

Then by the Inclusion-Exclusion Principle, considered as an extension of the Addition Law of Probability, we have that for any finite union of events:


 * $\displaystyle \Pr \left({\bigcup_{i \mathop = 1}^n A_i}\right) = \sum_{i \mathop = 1}^n \Pr \left({A_i}\right) - \sum_{1 \mathop \le i \mathop < j \mathop \le n} \Pr \left({A_i \cap A_j}\right) + \cdots + \left({-1}\right)^{n - 1} \Pr \left({\bigcap_{i \mathop = 1}^n A_i}\right)$

Fortunately in this case the above reduces to:


 * $\displaystyle \Pr \left({\bigcup_{i \mathop = 1}^n A_i}\right) = \sum_{i \mathop = 1}^n \Pr \left({A_i}\right)$

since the events are pairwise disjoint and $\Pr \left({\varnothing}\right) = 0$.
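This reduction can also be verified numerically (a sketch under the same tuple encoding as before): summing the probabilities of the pairwise-disjoint events $B_i$ directly recovers $\binom n x p^x \left({1-p}\right)^{n-x}$.

```python
from itertools import product
from math import comb, isclose

n, x, p = 6, 4, 0.25

# Pr(X = x) is the probability of the union of the pairwise-disjoint
# events B_i, so it equals the sum of their individual probabilities.
union_prob = sum(
    p**sum(s) * (1 - p)**(n - sum(s))
    for s in product((0, 1), repeat=n)
    if sum(s) == x
)

# This matches the closed form C(n, x) * p^x * (1 - p)^(n - x).
assert isclose(union_prob, comb(n, x) * p**x * (1 - p)**(n - x))
```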

Thus:


 * $\displaystyle \Pr \left({X = x}\right) = \sum_{i \mathop = 1}^{\binom n x} \Pr \left({B_i}\right) = \binom n x p^x \left({1 - p}\right)^{n-x}$