Definition:Probability Measure/Definition 3
Definition
Let $\EE$ be an experiment.
Let $\Omega$ be the sample space of $\EE$.
Let $\Sigma$ be the event space of $\EE$.
A probability measure on $\EE$ is a mapping $\Pr: \Sigma \to \R$ which fulfils the following axioms:
\((\text I): \quad \forall A \in \Sigma: \map \Pr A \ge 0\)

\((\text {II}): \quad \map \Pr \Omega = 1\)

\((\text {III}): \quad \forall A \in \Sigma: \map \Pr A = \ds \sum_{\bigcup \set e \mathop = A} \map \Pr {\set e}\), where $e$ denotes the elementary events of $\EE$
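The axioms above can be illustrated with a minimal sketch in Python, assuming a hypothetical discrete example (a fair six-sided die): the sample space $\Omega = \set {1, \ldots, 6}$, the event space $\Sigma$ taken as the power set of $\Omega$, and axiom $(\text {III})$ used to define $\Pr$ from the probabilities of the elementary events. The names `omega`, `elementary` and `pr` are illustrative, not part of the definition.

```python
from itertools import combinations

# Hypothetical example: a fair six-sided die.
# Sample space Omega; event space Sigma is the power set of Omega.
omega = frozenset(range(1, 7))

def events(space):
    """Enumerate all subsets of the sample space (the event space Sigma)."""
    s = list(space)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# Probabilities assigned to the elementary events {e}.
elementary = {e: 1 / 6 for e in omega}

def pr(event):
    """Axiom (III): Pr(A) is the sum of Pr({e}) over the elementary
    events whose union is A."""
    return sum(elementary[e] for e in event)

# Axiom (I): Pr(A) >= 0 for every event A in Sigma.
assert all(pr(a) >= 0 for a in events(omega))
# Axiom (II): Pr(Omega) = 1.
assert abs(pr(omega) - 1) < 1e-12
```

For instance, `pr(frozenset({2, 4, 6}))` evaluates the probability of the event "an even number is thrown" as the sum of three elementary probabilities, giving $1/2$.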
Sources
- 1965: A.M. Arthurs: Probability Theory: Chapter $2$: Probability and Discrete Sample Spaces: $2.3$ Probabilities in discrete sample spaces