Variance of Bernoulli Distribution

Theorem
$\newcommand{\var} [1] {\operatorname{var} \left({#1}\right)}$ Let $X$ be a discrete random variable with the Bernoulli distribution with parameter $p$.

Then the variance of $X$ is given by:
 * $\var X = p \left({1 - p}\right)$
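As a quick numerical sanity check (not part of the proofs below), the claimed variance can be computed directly from the definition for several values of $p$:

```python
# Sanity check: Var(X) computed from the definition matches p * (1 - p).
# A Bernoulli variable takes the value 1 with probability p, else 0.
for p in [0.1, 0.25, 0.5, 0.9]:
    mean = 0 * (1 - p) + 1 * p                            # E(X) = p
    var = (0 - mean) ** 2 * (1 - p) + (1 - mean) ** 2 * p
    assert abs(var - p * (1 - p)) < 1e-12
```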

Proof 1
From the definition of variance:
 * $\var X = E \left({\left({X - E \left({X}\right)}\right)^2}\right)$

From the Expectation of Bernoulli Distribution, we have $E \left({X}\right) = p$.

Then by definition of the Bernoulli distribution, $X$ takes the value $1$ with probability $p$ and the value $0$ with probability $1 - p$, so:
 * $\var X = \left({0 - p}\right)^2 \left({1 - p}\right) + \left({1 - p}\right)^2 p = p \left({1 - p}\right) \left({p + 1 - p}\right) = p \left({1 - p}\right)$

Proof 2
From the definition of Variance as Expectation of Square minus Square of Expectation:
 * $\var X = E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2$

From Expectation of Function of Discrete Random Variable:
 * $\displaystyle E \left({X^2}\right) = \sum_{x \in \operatorname{Im} \left({X}\right)} x^2 \Pr \left({X = x}\right)$

So, as $\operatorname{Im} \left({X}\right) = \left\{{0, 1}\right\}$:
 * $E \left({X^2}\right) = 0^2 \left({1 - p}\right) + 1^2 p = p$

Then:
 * $\var X = E \left({X^2}\right) - \left({E \left({X}\right)}\right)^2 = p - p^2 = p \left({1 - p}\right)$
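The computation in Proof 2 can be checked numerically: since $\operatorname{Im} \left({X}\right) = \left\{{0, 1}\right\}$, both $E \left({X}\right)$ and $E \left({X^2}\right)$ are short sums.

```python
# Check of Proof 2: E(X^2) - (E(X))^2 for the Bernoulli distribution.
# Im(X) = {0, 1}, so E(X^2) = 0^2 * (1 - p) + 1^2 * p = p.
for p in [0.2, 0.5, 0.8]:
    e_x = 0 * (1 - p) + 1 * p
    e_x2 = 0**2 * (1 - p) + 1**2 * p
    assert abs((e_x2 - e_x**2) - p * (1 - p)) < 1e-12
```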

Proof 3
We can also use the Variance of Binomial Distribution, putting $n = 1$: the variance $n p \left({1 - p}\right)$ of the binomial distribution reduces to $p \left({1 - p}\right)$.
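This reduction can be illustrated with a short check, computing the binomial variance by direct summation over its probability mass function and specialising to $n = 1$:

```python
from math import comb

# Check: the variance of Binomial(n, p), computed by summation over the
# pmf, equals n * p * (1 - p); setting n = 1 gives the Bernoulli variance.
def binomial_variance(n, p):
    pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]
    mean = sum(k * pk for k, pk in enumerate(pmf))
    return sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf))

for p in [0.3, 0.5, 0.7]:
    assert abs(binomial_variance(1, p) - p * (1 - p)) < 1e-12
```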

Proof 4
From Variance of Discrete Random Variable from P.G.F., we have:
 * $\var X = \Pi''_X \left({1}\right) + \mu - \mu^2$

where $\mu = E \left({X}\right)$ is the expectation of $X$.

From the Probability Generating Function of Bernoulli Distribution, we have:
 * $\Pi_X \left({s}\right) = q + ps$

where $q = 1 - p$.

From Expectation of Bernoulli Distribution, we have $\mu = p$.

We have $\Pi''_X \left({s}\right) = 0$ from Derivatives of PGF of Bernoulli Distribution.

Hence $\var X = 0 + \mu - \mu^2 = p - p^2 = p \left({1 - p}\right)$.
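Proof 4 can also be checked numerically: $\Pi_X \left({s}\right) = q + p s$ is linear in $s$, so its second derivative vanishes, which a central difference confirms.

```python
# Check of Proof 4: Var(X) = Pi''(1) + mu - mu^2, where Pi(s) = q + p*s.
# Pi is linear in s, so Pi''(s) = 0 identically.
p = 0.3
q = 1 - p
Pi = lambda s: q + p * s
h = 1e-4
second = (Pi(1 + h) - 2 * Pi(1) + Pi(1 - h)) / h**2   # numerically ~ 0
mu = p                                                # E(X) = p
var = second + mu - mu**2
assert abs(var - p * (1 - p)) < 1e-6
```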