Definition:Probability Measure


Definition

Let $\mathcal E$ be an experiment.

Let $\Omega$ be the sample space of $\mathcal E$, and let $\Sigma$ be the event space of $\mathcal E$.


A probability measure on $\mathcal E$ is a mapping $\Pr: \Sigma \to \R$ which fulfils the Kolmogorov axioms.


Kolmogorov Axioms

The Kolmogorov axioms are as follows:


First Axiom

$\forall A \in \Sigma: 0 \le \Pr \left({A}\right) \le 1$

The probability of an event occurring is a real number between $0$ and $1$ inclusive.


Second Axiom

$\Pr \left({\Omega}\right) = 1$

The probability that some elementary event in the sample space occurs is $1$.


Third Axiom

Let $A_1, A_2, \ldots$ be a countable (possibly infinite) sequence of pairwise disjoint events.

Then:

$\displaystyle \Pr \left({\bigcup_{i \mathop \ge 1} A_i}\right) = \sum_{i \mathop \ge 1} \Pr \left({A_i}\right)$


The probability that any one of countably many pairwise disjoint events occurs is the sum of the probabilities of the individual events.
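
For a concrete illustration (an example added here, not part of the formal definition): let $\mathcal E$ be the rolling of a fair six-sided die, with $\Omega = \left\{{1, 2, 3, 4, 5, 6}\right\}$ and $\Sigma$ the power set of $\Omega$, and define:

$\forall A \in \Sigma: \Pr \left({A}\right) = \dfrac {\left\vert{A}\right\vert} 6$

Then $0 \le \Pr \left({A}\right) \le 1$ because $0 \le \left\vert{A}\right\vert \le 6$, and $\Pr \left({\Omega}\right) = \dfrac 6 6 = 1$. For pairwise disjoint events the cardinality of a union is the sum of the cardinalities, so the third axiom holds as well.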


Notes


From the definition of event space, we already have that $\varnothing \in \Sigma$ and $\Omega \in \Sigma$.
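
One immediate consequence (a short derivation added here for completeness): applying the third axiom to the sequence $\varnothing, \varnothing, \ldots$, which is trivially pairwise disjoint:

$\displaystyle \Pr \left({\varnothing}\right) = \Pr \left({\bigcup_{i \mathop \ge 1} \varnothing}\right) = \sum_{i \mathop \ge 1} \Pr \left({\varnothing}\right)$

which can hold only if $\Pr \left({\varnothing}\right) = 0$.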


If $\mathcal E$ is defined as being a measure space $\left({\Omega, \Sigma, \Pr}\right)$, then $\Pr$ is a measure on $\left({\Omega, \Sigma}\right)$ such that $\Pr \left({\Omega}\right) = 1$.
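
A standard way to construct such a $\Pr$ (a construction noted here, not taken from this page): given any measure $\mu$ on $\left({\Omega, \Sigma}\right)$ with $0 < \mu \left({\Omega}\right) < \infty$, the normalization:

$\forall A \in \Sigma: \Pr \left({A}\right) = \dfrac {\mu \left({A}\right)} {\mu \left({\Omega}\right)}$

is a probability measure, as each of the Kolmogorov axioms follows directly from the corresponding property of $\mu$.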


Also known as

In the context of probability theory, a probability measure is sometimes referred to as a probability function.

