Expectation of Binomial Distribution

Theorem
Let $X$ be a discrete random variable with the binomial distribution with parameters $n$ and $p$.

Then the expectation of $X$ is given by:
 * $E \left({X}\right) = n p$
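The theorem can be sanity-checked numerically. The following Python sketch (the parameter values $n = 10$, $p = 0.3$ are arbitrary choices for illustration) computes $\sum_x x \Pr \left({X = x}\right)$ exactly from the binomial probability mass function and compares it with $n p$:

```python
from math import comb

def binomial_expectation(n: int, p: float) -> float:
    """Compute E(X) directly from the binomial pmf: sum of x * Pr(X = x)."""
    q = 1 - p
    return sum(x * comb(n, x) * p**x * q**(n - x) for x in range(n + 1))

n, p = 10, 0.3
assert abs(binomial_expectation(n, p) - n * p) < 1e-12
```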

Proof 1
From the definition of expectation:
 * $\displaystyle E \left({X}\right) = \sum_{x \in \Omega_X} x \Pr \left({X = x}\right)$

Thus:
 * $\displaystyle E \left({X}\right) = \sum_{x = 0}^n x \binom n x p^x q^{n - x}$

where $q = 1 - p$.

The $x = 0$ term vanishes, and since $\displaystyle x \binom n x = n \binom {n - 1} {x - 1}$:
 * $\displaystyle E \left({X}\right) = \sum_{x = 1}^n n \binom {n - 1} {x - 1} p^x q^{n - x} = n p \sum_{j = 0}^{n - 1} \binom {n - 1} j p^j q^{\left({n - 1}\right) - j}$

substituting $j = x - 1$. By the Binomial Theorem, the remaining sum is $\left({p + q}\right)^{n - 1} = 1$.

Hence:
 * $E \left({X}\right) = n p$

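Proof 1 turns on the absorption identity $x \binom n x = n \binom {n - 1} {x - 1}$, which removes the factor $x$ and allows the sum to be re-indexed. A quick exhaustive Python check of that identity over small parameters (the bounds are arbitrary):

```python
from math import comb

def identity_holds(n: int, x: int) -> bool:
    """Check the absorption identity x * C(n, x) == n * C(n - 1, x - 1)."""
    return x * comb(n, x) == n * comb(n - 1, x - 1)

# Exhaustive check over a small range of n and x.
assert all(identity_holds(n, x) for n in range(1, 20) for x in range(1, n + 1))
```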
Proof 2
Alternatively, we can derive this directly from the Expectation of Bernoulli Distribution.

From Bernoulli Process as Binomial Distribution, we see that $X$ as defined here is a sum of discrete random variables $Y_i$ that model the Bernoulli distribution:


 * $\displaystyle X = \sum_{i = 1}^n Y_i$

Each of the Bernoulli trials is independent of the others, by definition of a Bernoulli process.

From Expectation of Bernoulli Distribution, $E \left({Y_i}\right) = p$ for each $i$. It follows from Linearity of Expectation that:
 * $\displaystyle E \left({X}\right) = \sum_{i = 1}^n E \left({Y_i}\right) = \sum_{i = 1}^n p = n p$

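Proof 2 can be illustrated by constructing the distribution of $X$ as the $n$-fold convolution of the Bernoulli $\left({p}\right)$ mass function and checking that its mean is $n p$. A minimal Python sketch (the parameters $n = 8$, $p = 0.25$ are arbitrary):

```python
def convolve(f, g):
    """Convolve two pmfs given as lists indexed by outcome value."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b
    return out

def binomial_pmf(n: int, p: float):
    """pmf of the sum of n independent Bernoulli(p) variables."""
    bernoulli = [1 - p, p]   # Pr(Y = 0), Pr(Y = 1)
    pmf = [1.0]              # distribution of the empty sum
    for _ in range(n):
        pmf = convolve(pmf, bernoulli)
    return pmf

n, p = 8, 0.25
pmf = binomial_pmf(n, p)
mean = sum(x * pr for x, pr in enumerate(pmf))
assert abs(mean - n * p) < 1e-12
```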
Proof 3
From the Probability Generating Function of Binomial Distribution, we have:
 * $\displaystyle \Pi_X \left({s}\right) = \left({q + ps}\right)^n$

where $q = 1 - p$.

From Expectation of Discrete Random Variable from P.G.F., we have:
 * $\displaystyle E \left({X}\right) = \Pi'_X \left({1}\right)$

Differentiating with respect to $s$:
 * $\displaystyle \Pi'_X \left({s}\right) = n p \left({q + p s}\right)^{n - 1}$

Plugging in $s = 1$:
 * $\displaystyle \Pi'_X \left({1}\right) = n p \left({q + p}\right)^{n - 1}$

Hence the result, as $q + p = 1$ and so $\left({q + p}\right)^{n - 1} = 1$.
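Proof 3 can be checked numerically: approximate the derivative of $\Pi_X \left({s}\right) = \left({q + p s}\right)^n$ at $s = 1$ by a central finite difference and compare with $n p$. A short Python sketch (the step size and parameters are arbitrary):

```python
def pgf(s: float, n: int, p: float) -> float:
    """Probability generating function of the binomial distribution."""
    q = 1 - p
    return (q + p * s) ** n

def pgf_derivative_at_1(n: int, p: float, h: float = 1e-6) -> float:
    """Central finite-difference approximation to Pi'_X(1)."""
    return (pgf(1 + h, n, p) - pgf(1 - h, n, p)) / (2 * h)

n, p = 12, 0.4
assert abs(pgf_derivative_at_1(n, p) - n * p) < 1e-6
```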