Chain Rule for Probability


Theorem

Let $\EE$ be an experiment with probability space $\struct {\Omega, \Sigma, \Pr}$.

Let $A, B \in \Sigma$ be events of $\EE$ such that $\map \Pr B > 0$.

The conditional probability of $A$ given $B$ is:

$\condprob A B = \dfrac {\map \Pr {A \cap B} } {\map \Pr B}$
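
For example, suppose a fair $6$-sided die is rolled, so that $\Omega = \set {1, 2, 3, 4, 5, 6}$ with each outcome having probability $\dfrac 1 6$.

Let $B$ be the event that the result is even, and let $A$ be the event that the result is no greater than $3$.

Then $\map \Pr B = \dfrac 1 2$ and $\map \Pr {A \cap B} = \map \Pr {\set 2} = \dfrac 1 6$, so:

$\condprob A B = \dfrac {1 / 6} {1 / 2} = \dfrac 1 3$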


Proof



Suppose it is given that $B$ has occurred.

Then the probability of $A$ having occurred may not be $\map \Pr A$ after all.

In fact, we can say that $A$ has occurred if and only if $A \cap B$ has occurred.


So, if we know that $B$ has occurred, the conditional probability of $A$ given $B$ is determined by $\map \Pr {A \cap B}$, rescaled so that the reduced sample space $B$ itself carries probability $1$.

It follows that if we do not actually know whether $B$ has occurred or not, but we do know its probability $\map \Pr B$, then we can say that:

the probability that $A$ and $B$ have both occurred is the conditional probability of $A$ given $B$ multiplied by the probability that $B$ has occurred.

That is:

$\map \Pr {A \cap B} = \condprob A B \map \Pr B$


Hence, dividing both sides by $\map \Pr B > 0$:

$\condprob A B = \dfrac {\map \Pr {A \cap B} } {\map \Pr B}$

$\blacksquare$
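
As an illustration of the multiplication form $\map \Pr {A \cap B} = \condprob A B \map \Pr B$, suppose two cards are drawn at random, without replacement, from a standard deck of $52$ cards.

Let $A_1$ be the event that the first card is an ace, and let $A_2$ be the event that the second card is an ace.

Then $\map \Pr {A_1} = \dfrac 4 {52}$ and $\condprob {A_2} {A_1} = \dfrac 3 {51}$, so:

$\map \Pr {A_1 \cap A_2} = \condprob {A_2} {A_1} \map \Pr {A_1} = \dfrac 3 {51} \times \dfrac 4 {52} = \dfrac 1 {221}$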

