# Definition:Probability Measure/Definition 2


## Definition

Let $\EE$ be an experiment.

Let $\Omega$ be the sample space of $\EE$.

Let $\Sigma$ be the event space of $\EE$.

A **probability measure on $\EE$** is a mapping $\Pr: \Sigma \to \R$ which fulfils the Kolmogorov axioms:

$(1): \quad \forall A \in \Sigma: 0 \le \map \Pr A \le 1$

That is, the probability of an event occurring is a real number between $0$ and $1$.

$(2): \quad \map \Pr \Omega = 1$

That is, the probability of some elementary event occurring in the sample space is $1$.

$(3): \quad \ds \map \Pr {\bigcup_{i \mathop \ge 1} A_i} = \sum_{i \mathop \ge 1} \map \Pr {A_i}$ where $\set {A_1, A_2, \ldots}$ is a countable (possibly countably infinite) set of pairwise disjoint events.

That is, the probability of any one of countably many pairwise disjoint events occurring is the sum of the probabilities of the occurrence of each of the individual events.
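The axioms can be checked directly in the finite case, where axiom $(3)$ reduces to finite additivity. The following is a minimal sketch; the function and variable names (`is_probability_measure`, `powerset`, `pr`) are illustrative and not part of the definition above.

```python
from itertools import combinations
from fractions import Fraction

def powerset(omega):
    """All subsets of omega: the event space for a finite sample space."""
    s = list(omega)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def is_probability_measure(omega, pr):
    """Check the Kolmogorov axioms for pr, defined on every event of a
    finite sample space omega.  Exact Fractions avoid floating-point error.
    """
    events = powerset(omega)
    # Axiom (1): 0 <= Pr(A) <= 1 for every event A
    if not all(0 <= pr[a] <= 1 for a in events):
        return False
    # Axiom (2): Pr(Omega) = 1
    if pr[frozenset(omega)] != 1:
        return False
    # Axiom (3), finite case: additivity over pairwise disjoint events
    for a in events:
        for b in events:
            if a & b == frozenset() and pr[a | b] != pr[a] + pr[b]:
                return False
    return True

# Example: a fair coin.  Pr assigns each event half its cardinality.
omega = {"H", "T"}
pr = {e: Fraction(len(e), 2) for e in powerset(omega)}
print(is_probability_measure(omega, pr))  # True
```

For countably infinite sample spaces axiom $(3)$ demands the full countable additivity stated above, which a finite check of this kind cannot exhibit.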

## Sources

- 1986: Geoffrey Grimmett and Dominic Welsh: *Probability: An Introduction*: $1$: Events and probabilities: $1.3$: Probabilities