Expectation of Bernoulli Distribution

Theorem
Let $$X$$ be a discrete random variable with the Bernoulli distribution with parameter $$p$$.

Then the expectation of $$X$$ is given by:
 * $$E \left({X}\right) = p$$

Proof 1
From the definition of expectation:
 * $$E \left({X}\right) = \sum_{x \in \operatorname{Im} \left({X}\right)} x \Pr \left({X = x}\right)$$

By definition of the Bernoulli distribution, $$\Pr \left({X = 1}\right) = p$$ and $$\Pr \left({X = 0}\right) = 1 - p$$, so:
 * $$E \left({X}\right) = 1 \times p + 0 \times \left({1-p}\right) = p$$

Hence the result.
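The definitional sum above can be checked numerically; here is a minimal sketch that evaluates $$E \left({X}\right) = \sum_x x \Pr \left({X = x}\right)$$ for a Bernoulli variable. The function name and the example parameter value $$0.3$$ are illustrative choices, not part of the proof.

```python
# Sketch: compute E[X] for a Bernoulli(p) variable directly from the
# definition E[X] = sum over the image of X of x * Pr(X = x).

def bernoulli_expectation(p: float) -> float:
    # Im(X) = {0, 1} with Pr(X = 0) = 1 - p and Pr(X = 1) = p
    pmf = {0: 1 - p, 1: p}
    return sum(x * prob for x, prob in pmf.items())

print(bernoulli_expectation(0.3))  # 1 * 0.3 + 0 * 0.7 = 0.3
```

As the proof shows, the sum collapses to the single term $$1 \times p$$, so the returned value always equals $$p$$.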

Proof 2
We can also use the Expectation of Binomial Distribution, which gives $$E \left({X}\right) = n p$$; setting $$n = 1$$ yields $$E \left({X}\right) = p$$.
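To illustrate the specialization, the following sketch computes the binomial expectation directly from its probability mass function and confirms that $$n = 1$$ recovers the Bernoulli case. The function name and the parameter value are illustrative assumptions.

```python
# Sketch: the binomial expectation E[X] = n * p, computed here from the
# pmf  Pr(X = k) = C(n, k) * p^k * (1 - p)^(n - k),  reduces to p at n = 1.
from math import comb

def binomial_expectation(n: int, p: float) -> float:
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

p = 0.25  # arbitrary example parameter
print(binomial_expectation(1, p))  # equals p, the Bernoulli case
```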

Proof 3
From the Probability Generating Function of Bernoulli Distribution, we have:
 * $$\Pi_X \left({s}\right) = q + ps$$

where $$q = 1 - p$$.

From Expectation of Discrete Random Variable from P.G.F., we have:
 * $$E \left({X}\right) = \Pi'_X \left({1}\right)$$

Differentiating with respect to $$s$$ gives:
 * $$\Pi'_X \left({s}\right) = p$$

As $$p$$ does not depend on $$s$$, evaluating at $$s = 1$$ gives $$E \left({X}\right) = \Pi'_X \left({1}\right) = p$$.

Hence the result.
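The PGF computation can be reproduced symbolically; a minimal sketch using sympy (a tool choice of this sketch, not part of the original proof) differentiates $$\Pi_X \left({s}\right) = q + p s$$ and evaluates at $$s = 1$$:

```python
# Sketch: verify E[X] = Pi'_X(1) for the Bernoulli PGF Pi_X(s) = q + p*s,
# with q = 1 - p, via symbolic differentiation.
import sympy as sp

p, s = sp.symbols('p s')
pgf = (1 - p) + p * s               # q + p*s with q = 1 - p
expectation = sp.diff(pgf, s).subs(s, 1)
print(expectation)  # p
```

Since the derivative is the constant $$p$$, the substitution $$s = 1$$ leaves it unchanged, matching the proof.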