Expectation of Binomial Distribution

Theorem
Let $$X$$ be a discrete random variable with the binomial distribution with parameters $$n$$ and $$p$$.

Then the expectation of $$X$$ is given by:
 * $$E \left({X}\right) = n p$$

Proof 1
From the definition of expectation:
 * $$E \left({X}\right) = \sum_{x \in \operatorname{Im} \left({X}\right)} x \Pr \left({X = x}\right)$$

Thus, writing $$q = 1 - p$$:
 * $$E \left({X}\right) = \sum_{x = 0}^n x \binom{n}{x} p^x q^{n - x}$$

The $$x = 0$$ term vanishes, and for $$x \ge 1$$ we have $$x \binom{n}{x} = n \binom{n - 1}{x - 1}$$, so:
 * $$E \left({X}\right) = n p \sum_{x = 1}^n \binom{n - 1}{x - 1} p^{x - 1} q^{\left({n - 1}\right) - \left({x - 1}\right)}$$

Substituting $$j = x - 1$$ and applying the Binomial Theorem:
 * $$E \left({X}\right) = n p \sum_{j = 0}^{n - 1} \binom{n - 1}{j} p^j q^{n - 1 - j} = n p \left({p + q}\right)^{n - 1} = n p$$
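The defining sum $$\sum_x x \Pr \left({X = x}\right)$$ can also be evaluated numerically as a sanity check. The following is a minimal sketch using only Python's standard library; the helper name `binomial_expectation` and the sample parameters are our own choices:

```python
from math import comb

def binomial_expectation(n, p):
    """Evaluate E(X) = sum over x of x * Pr(X = x) for X ~ B(n, p)."""
    return sum(x * comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1))

# Agrees with the closed form n * p, up to floating-point rounding.
print(binomial_expectation(10, 0.3))  # approximately 3.0
```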

Proof 2
Alternatively, we can derive this directly from the Expectation of Bernoulli Distribution.

From Bernoulli Process as Binomial Distribution, we see that $$X$$ as defined here is the sum of the discrete random variables that model the Bernoulli distribution.

The Bernoulli trials are independent of each other.

Hence we can use Sum of Expectations of Independent Trials.

The Expectation of Bernoulli Distribution is $$p$$, so the expectation of $$B \left({n, p}\right)$$ is $$n p$$.
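This view of $$X$$ as a sum of $$n$$ Bernoulli trials lends itself to a Monte Carlo sketch. The seed, trial count, and tolerance below are arbitrary illustrative choices, not part of the proof:

```python
import random

# X ~ B(n, p) realised as a sum of n independent Bernoulli(p) indicators;
# by the law of large numbers the sample mean approaches n * p.
random.seed(0)
n, p, runs = 10, 0.3, 100_000
sample_mean = sum(
    sum(1 for _ in range(n) if random.random() < p)  # one run of n trials
    for _ in range(runs)
) / runs
print(sample_mean)  # close to n * p = 3.0
```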

Proof 3
From the Probability Generating Function of Binomial Distribution, we have:
 * $$\Pi_X \left({s}\right) = \left({q + ps}\right)^n$$

where $$q = 1 - p$$.

From Expectation of Discrete Random Variable from P.G.F., we have:
 * $$E \left({X}\right) = \Pi'_X \left({1}\right)$$

Differentiating with respect to $$s$$:
 * $$\Pi'_X \left({s}\right) = n p \left({q + p s}\right)^{n - 1}$$

Plugging in $$s = 1$$:
 * $$\Pi'_X \left({1}\right) = n p \left({q + p}\right)^{n - 1}$$

Hence the result, as $$q + p = 1$$.
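The derivative computation above can be cross-checked numerically with a central finite difference. This is a sketch only; the parameters and step size $$h$$ are arbitrary choices:

```python
# Finite-difference check that the derivative of the p.g.f.
# Pi_X(s) = (q + p*s)^n at s = 1 equals n * p.
n, p = 10, 0.3
q = 1 - p
pgf = lambda s: (q + p * s) ** n
h = 1e-6  # small step for the central difference
derivative_at_1 = (pgf(1 + h) - pgf(1 - h)) / (2 * h)
print(derivative_at_1)  # approximately n * p = 3.0
```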