Bernoulli Process as Binomial Distribution

Theorem

Let $\left \langle{X_i}\right \rangle$ be a finite Bernoulli process of length $n$ such that each of the $X_i$ in the sequence is a Bernoulli trial with parameter $p$.


Then the number of successes in $\left \langle{X_i}\right \rangle$ is modelled by a binomial distribution with parameters $n$ and $p$.


Hence it can be seen that:

$X \sim \operatorname{B} \left({1, p}\right)$ is the same thing as $X \sim \operatorname{Bern} \left({p}\right)$
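
As an informal sanity check (not part of the proof), the theorem can be illustrated numerically: simulate many length-$n$ Bernoulli processes and compare the empirical frequency of each success count with the binomial probability mass function. A minimal Python sketch, with illustrative values of `n`, `p` and `runs`:

```python
import random
from math import comb

n, p, runs = 5, 0.3, 200_000  # illustrative parameters

# Count the number of successes in each simulated length-n Bernoulli process.
counts = [0] * (n + 1)
for _ in range(runs):
    successes = sum(1 for _ in range(n) if random.random() < p)
    counts[successes] += 1

for k in range(n + 1):
    empirical = counts[k] / runs
    binomial = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"k={k}: empirical {empirical:.4f} vs binomial pmf {binomial:.4f}")
```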


Proof

Consider the sample space $\Omega$ of all sequences $\left \langle{X_i}\right \rangle$ of length $n$.

The $i$th entry of any such sequence is the result of the $i$th trial.

We have that $\Omega$ is finite.

Let us take the event space $\Sigma$ to be the power set of $\Omega$.


As the trials are independent, by definition of the Bernoulli process, we have that:

$\forall \omega \in \Omega: \Pr \left({\omega}\right) = p^{s \left({\omega}\right)} \left({1 - p}\right)^{n - s \left({\omega}\right)}$

where $s \left({\omega}\right)$ is the number of successes in $\omega$.


In the same way:

$\displaystyle \forall A \in \Sigma: \Pr \left({A}\right) = \sum_{\omega \mathop \in A} \Pr \left({\omega}\right)$
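
To make these two formulas concrete, here is a small Python sketch (all names illustrative): each outcome $\omega$ is a tuple of $0$s and $1$s with $1$ encoding success, its probability is $p^{s \left({\omega}\right)} \left({1 - p}\right)^{n - s \left({\omega}\right)}$, and the probability of an event is the sum over its member outcomes.

```python
from itertools import product

n, p = 3, 0.3  # illustrative parameters

def pr_omega(omega):
    """Pr(omega) = p^{s(omega)} * (1 - p)^{n - s(omega)}."""
    s = sum(omega)  # number of successes in omega
    return p**s * (1 - p)**(n - s)

# An event A is a set of outcomes; Pr(A) is the sum over its members.
A = [omega for omega in product((0, 1), repeat=n) if omega[0] == 1]
print(sum(pr_omega(omega) for omega in A))  # ~0.3: Pr(first trial succeeds) = p
```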


Now, let us define the discrete random variable $Y_i$ as follows:

$Y_i \left({\omega}\right) = \begin{cases} 1 & : \omega_i \text { is a success} \\ 0 & : \omega_i \text { is a failure} \\ \end{cases}$

where $\omega_i$ is the $i$th element of $\omega$.
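
In code, each $Y_i$ is simply an indicator of the $i$th coordinate of $\omega$. A minimal sketch, assuming (as in the snippets above) that a success is encoded as $1$, a failure as $0$, and that indices are zero-based:

```python
def Y(i, omega):
    """Indicator Y_i: 1 if the i-th trial of omega is a success, else 0."""
    return 1 if omega[i] == 1 else 0

print(Y(1, (1, 0, 1)))  # 0: the second trial is a failure
```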

Thus, each $Y_i$ has image $\left\{{0, 1}\right\}$, and its probability mass function is determined by:

$\Pr \left({Y_i = 1}\right) = \Pr \left({\left\{{\omega \in \Omega: \omega_i \text{ is a success}}\right\}}\right)$


Thus we have:

$\displaystyle \begin{aligned}
\Pr \left({Y_i = 1}\right) &= \sum_{\omega: \omega_i \text{ success}} p^{s \left({\omega}\right)} \left({1 - p}\right)^{n - s \left({\omega}\right)} \\
&= \sum_{r \mathop = 1}^n \sum_{\substack{\omega: \omega_i \text{ success} \\ s \left({\omega}\right) = r}} p^r \left({1 - p}\right)^{n - r} \\
&= \sum_{r \mathop = 1}^n \binom {n - 1} {r - 1} p^r \left({1 - p}\right)^{n - r} && \text{as one success (namely at $i$) is fixed, the other $r - 1$ fall among the remaining $n - 1$ trials} \\
&= p \sum_{r \mathop = 0}^{n - 1} \binom {n - 1} r p^r \left({1 - p}\right)^{\left({n - 1}\right) - r} && \text{switching summation index: $r \mapsto r + 1$} \\
&= p \left({p + \left({1 - p}\right)}\right)^{n - 1} && \text{Binomial Theorem} \\
&= p
\end{aligned}$
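
This chain of equalities can be double-checked by brute force: enumerate all $2^n$ outcomes, sum $p^{s \left({\omega}\right)} \left({1 - p}\right)^{n - s \left({\omega}\right)}$ over those whose $i$th trial succeeds, and compare with $p$. A sketch with illustrative parameters:

```python
from itertools import product

n, p, i = 4, 0.25, 2  # illustrative parameters; i is a zero-based trial index

total = 0.0
for omega in product((0, 1), repeat=n):  # 1 = success, 0 = failure
    if omega[i] == 1:  # restrict to outcomes where trial i succeeds
        s = sum(omega)
        total += p**s * (1 - p)**(n - s)

print(total)  # agrees with p = 0.25 up to floating-point error
```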


Then:

$\Pr \left({Y_i = 0}\right) = 1 - \Pr \left({Y_i = 1}\right) = 1 - p$


So (by a roundabout route) we have confirmed that $Y_i$ has the Bernoulli distribution with parameter $p$.


Now, let us define the random variable:

$\displaystyle S_n \left({\omega}\right) = \sum_{i \mathop = 1}^n Y_i \left({\omega}\right)$

By definition:

  • $S_n \left({\omega}\right)$ is the number of successes in $\omega$
  • $S_n$ takes values in $\left\{{0, 1, 2, \ldots, n}\right\}$ (as each $Y_i$ can be $0$ or $1$).
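
In the running Python sketch, $S_n$ is just the coordinate sum, since each coordinate of $\omega$ is itself the value of the corresponding indicator $Y_i$:

```python
def S_n(omega):
    """Number of successes in omega: the sum of the indicators Y_i."""
    return sum(omega)  # each coordinate is 0 or 1

print(S_n((1, 0, 1, 1)))  # 3 successes, a value in {0, 1, ..., n}
```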

Also, we have that:

$\displaystyle \begin{aligned}
\Pr \left({S_n = k}\right) &= \Pr \left({\left\{{\omega \in \Omega: s \left({\omega}\right) = k}\right\}}\right) \\
&= \sum_{\omega: s \left({\omega}\right) \mathop = k} \Pr \left({\omega}\right) \\
&= \sum_{\omega: s \left({\omega}\right) \mathop = k} p^k \left({1 - p}\right)^{n - k} \\
&= \binom n k p^k \left({1 - p}\right)^{n - k} && \text{as exactly $\binom n k$ outcomes satisfy $s \left({\omega}\right) = k$}
\end{aligned}$
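
As before, this final identity can be confirmed by exhaustive enumeration; the sketch below (illustrative parameters again) checks that the summed outcome probabilities match $\dbinom n k p^k \left({1 - p}\right)^{n - k}$ for every $k$:

```python
from itertools import product
from math import comb, isclose

n, p = 4, 0.25  # illustrative parameters

for k in range(n + 1):
    # Sum Pr(omega) over all outcomes with exactly k successes.
    total = sum(
        p**k * (1 - p)**(n - k)
        for omega in product((0, 1), repeat=n)
        if sum(omega) == k
    )
    assert isclose(total, comb(n, k) * p**k * (1 - p)**(n - k))

print("Pr(S_n = k) matches the binomial pmf for every k")
```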

That is, $S_n$ has the binomial distribution with parameters $n$ and $p$. Hence the result.

$\blacksquare$

