Variance of Bernoulli Distribution

Theorem
Let $$X$$ be a discrete random variable with the Bernoulli distribution with parameter $$p$$.

Then the variance of $$X$$ is given by:
 * $$\operatorname{var} \left({X}\right) = p \left({1-p}\right)$$
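
For example, a fair coin toss has $$p = \dfrac 1 2$$, so $$\operatorname{var} \left({X}\right) = \dfrac 1 2 \left({1 - \dfrac 1 2}\right) = \dfrac 1 4$$, which is the largest value that $$p \left({1 - p}\right)$$ attains for $$0 \le p \le 1$$.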

Proof 1
From the definition of variance:
 * $$\operatorname{var} \left({X}\right) = E \left({\left({X - E \left({X}\right)}\right)^2}\right)$$

From the Expectation of Bernoulli Distribution, we have $$E \left({X}\right) = p$$.
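
Explicitly, $$E \left({X}\right) = 0 \times \Pr \left({X = 0}\right) + 1 \times \Pr \left({X = 1}\right) = 0 \times \left({1 - p}\right) + 1 \times p = p$$.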

Then by definition of Bernoulli distribution:

 * $$\operatorname{var} \left({X}\right) = \left({0 - p}\right)^2 \Pr \left({X = 0}\right) + \left({1 - p}\right)^2 \Pr \left({X = 1}\right)$$
 * $$= p^2 \left({1 - p}\right) + \left({1 - p}\right)^2 p$$
 * $$= p \left({1 - p}\right) \left({p + 1 - p}\right)$$
 * $$= p \left({1 - p}\right)$$

Proof 2
From the definition of Variance as Expectation of Square minus Square of Expectation:
 * $$\operatorname{var} \left({X}\right) = E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2$$

From Expectation of Function of Discrete Random Variable:
 * $$E \left({X^2}\right) = \sum_{x \in \operatorname{Im} \left({X}\right)} x^2 \Pr \left({X = x}\right)$$

So:

 * $$E \left({X^2}\right) = 0^2 \times \left({1 - p}\right) + 1^2 \times p = p$$

Then:

 * $$\operatorname{var} \left({X}\right) = E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2 = p - p^2 = p \left({1 - p}\right)$$

Proof 3
We can also use the Variance of Binomial Distribution, setting $$n = 1$$: the Bernoulli distribution is the special case of the binomial distribution with a single trial.
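
The variance of a binomial distribution with parameters $$n$$ and $$p$$ is $$n p \left({1 - p}\right)$$, so with $$n = 1$$:
 * $$\operatorname{var} \left({X}\right) = 1 \times p \left({1 - p}\right) = p \left({1 - p}\right)$$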

Proof 4
From Variance of Discrete Random Variable from P.G.F., we have:
 * $$\operatorname{var} \left({X}\right) = \Pi''_X \left({1}\right) + \mu - \mu^2$$

where $$\mu = E \left({X}\right)$$ is the expectation of $$X$$.

From the Probability Generating Function of Bernoulli Distribution, we have:
 * $$\Pi_X \left({s}\right) = q + ps$$

where $$q = 1 - p$$.
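
This follows since $$\Pi_X \left({s}\right) = E \left({s^X}\right) = s^0 \Pr \left({X = 0}\right) + s^1 \Pr \left({X = 1}\right) = q + p s$$.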

From Expectation of Bernoulli Distribution, we have $$\mu = p$$.

We have $$\Pi'_X \left({s}\right) = p$$ and so $$\Pi''_X \left({s}\right) = 0$$ from Differentiation of a Constant.

Hence $$\operatorname{var} \left({X}\right) = 0 + \mu - \mu^2 = p - p^2 = p \left({1-p}\right)$$.