Definition:Probability Measure

Context
Probability Theory.

Definition
Let $$\mathcal E$$ be an experiment.

Let $$\Omega$$ be the sample space of $$\mathcal E$$, and let $$\Sigma$$ be the event space of $$\mathcal E$$.

A probability measure on $$\mathcal E$$ is a mapping $$\Pr: \Sigma \to \R$$ which fulfils the following conditions:
 * $$\forall A \in \Sigma: 0 \le \Pr \left({A}\right) \le 1$$;
 * $$\Pr \left({\varnothing}\right) = 0$$;
 * $$\Pr \left({\Omega}\right) = 1$$;
 * $$\forall A, B \in \Sigma: A \cap B = \varnothing \implies \Pr \left({A \cup B}\right) = \Pr \left({A}\right) + \Pr \left({B}\right)$$, and more generally, for any sequence $$\left({A_i}\right)_{i \ge 1}$$ of pairwise disjoint events in $$\Sigma$$: $$\Pr \left({\bigcup_{i \ge 1} A_i}\right) = \sum_{i \ge 1} \Pr \left({A_i}\right)$$ (countable additivity).
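As a concrete sanity check (not part of the definition), the conditions above can be verified mechanically for the classical probability $$\Pr \left({A}\right) = \left\vert{A}\right\vert / \left\vert{\Omega}\right\vert$$ on a fair six-sided die, where the event space is the power set of the sample space. On a finite event space, additivity only needs to be checked for pairs of disjoint events. This is an illustrative sketch, not an implementation from the source:

```python
from itertools import combinations

# Hypothetical finite experiment: one roll of a fair six-sided die.
omega = frozenset(range(1, 7))

def powerset(s):
    """All subsets of s: the event space of a finite experiment."""
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

sigma = powerset(omega)

def pr(event):
    """Classical probability: Pr(A) = |A| / |Omega|."""
    return len(event) / len(omega)

# Condition 1: 0 <= Pr(A) <= 1 for every event A.
assert all(0 <= pr(a) <= 1 for a in sigma)

# Conditions 2 and 3: Pr(empty set) = 0 and Pr(Omega) = 1.
assert pr(frozenset()) == 0 and pr(omega) == 1

# Condition 4 (additivity, which is pairwise on a finite event space):
# Pr(A u B) = Pr(A) + Pr(B) whenever A and B are disjoint.
assert all(
    abs(pr(a | b) - (pr(a) + pr(b))) < 1e-12
    for a in sigma for b in sigma if not (a & b)
)

print("all conditions hold")
```

Running the script prints `all conditions hold`, confirming that the classical probability on a finite sample space satisfies every condition in the definition.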

(From the definition of event space, we already have that $$\varnothing \in \Sigma$$ and $$\Omega \in \Sigma$$.)

That is, if $$\mathcal E$$ is modelled by the measure space $$\left({\Omega, \Sigma, \Pr}\right)$$, then $$\Pr$$ is a measure on the measurable space $$\left({\Omega, \Sigma}\right)$$ such that $$\Pr \left({\Omega}\right) = 1$$.