# Axioms of Uncertainty

## Axiom Schema

Let $Z$ be a random variable.

Let $Z$ take the values $a_i$ with probability $p_i$, where $i \in \set {1, 2, \ldots, n}$.

Let $H_n: \closedint 0 1^n \to \R$ be a mapping of the probabilities $\tuple {p_1, p_2, \ldots, p_n}$ which is to be defined as the uncertainty of $Z$.

Then $H_n$ fulfils the following conditions, which are to be known as the **axioms of uncertainty**:

### Axiom 1

- $\map {H_n} {p_1, p_2, \ldots, p_n}$ is a maximum when $p_1 = p_2 = \dotsb = p_n = \dfrac 1 n$
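As an illustrative check (a numerical sketch, not part of the axiom system), Shannon's entropy function $-\sum_i p_i \log_2 p_i$, the canonical function satisfying these axioms, is indeed largest at the uniform distribution:

```python
import math

def H(ps):
    # Shannon entropy in bits; terms with p = 0 contribute 0 by convention
    return sum(-p * math.log2(p) for p in ps if p > 0)

uniform = [1 / 4] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(H(uniform))  # 2.0 -- the maximum for n = 4
print(H(skewed))   # approx 1.357, strictly smaller
```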

### Axiom 2

For any permutation $\pi$ of $\tuple {1, 2, \dotsc, n}$:

- $\map {H_n} {p_1, p_2, \ldots, p_n} = \map {H_n} {p_{\map \pi 1}, p_{\map \pi 2}, \ldots, p_{\map \pi n} }$

That is, $H_n$ is a symmetric function of the arguments $p_1, p_2, \dotsc, p_n$.
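Symmetry can be checked numerically for Shannon entropy (used here purely as an example of a function satisfying the axioms): every ordering of the arguments gives the same value.

```python
import math
from itertools import permutations

def H(ps):
    # Shannon entropy in bits
    return sum(-p * math.log2(p) for p in ps if p > 0)

ps = (0.5, 0.3, 0.2)
# All 3! orderings of the arguments yield the same uncertainty
values = {round(H(perm), 12) for perm in permutations(ps)}
print(len(values))  # 1
```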

### Axiom 3

- $\map {H_n} {p_1, p_2, \ldots, p_n} \ge 0$

and:

- $\map {H_n} {p_1, p_2, \ldots, p_n} = 0$ if and only if $\exists i \in \set {1, 2, \ldots, n}: p_i = 1$
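A quick numerical illustration with Shannon entropy (again only as an exemplar): a genuinely uncertain distribution has positive uncertainty, while a distribution concentrated on one outcome has none.

```python
import math

def H(ps):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing
    return sum(-p * math.log2(p) for p in ps if p > 0)

print(H([0.2, 0.5, 0.3]))  # approx 1.485, strictly positive
print(H([0.0, 1.0, 0.0]))  # 0.0: one outcome is certain, so no uncertainty
```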

### Axiom 4

- $\map {H_{n + 1} } {p_1, p_2, \ldots, p_n, 0} = \map {H_n} {p_1, p_2, \ldots, p_n}$
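This axiom (sometimes called expansibility) says that adjoining an impossible outcome changes nothing, which Shannon entropy satisfies by the convention $0 \log 0 = 0$:

```python
import math

def H(ps):
    # Shannon entropy in bits; the p = 0 term is dropped, encoding 0 log 0 = 0
    return sum(-p * math.log2(p) for p in ps if p > 0)

# Appending an impossible outcome leaves the uncertainty unchanged
print(H([0.6, 0.4]) == H([0.6, 0.4, 0.0]))  # True
```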

### Axiom 5

- $\map {H_n} {\dfrac 1 n, \dfrac 1 n, \dotsc, \dfrac 1 n} \le \map {H_{n + 1} } {\dfrac 1 {n + 1}, \dfrac 1 {n + 1}, \dotsc, \dfrac 1 {n + 1} }$
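For Shannon entropy the uniform uncertainty equals $\log_2 n$, which is increasing in $n$, so this monotonicity axiom can be checked directly:

```python
import math

def H(ps):
    # Shannon entropy in bits
    return sum(-p * math.log2(p) for p in ps if p > 0)

# Uniform uncertainty grows with the number of outcomes (it equals log2 n)
hs = [H([1 / n] * n) for n in range(1, 7)]
print(all(a <= b for a, b in zip(hs, hs[1:])))  # True
```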

### Axiom 6

- $H_n$ is a continuous function of its arguments.

### Axiom 7

- $\map {H_{m n} } {\dfrac 1 {m n}, \dfrac 1 {m n}, \dotsc, \dfrac 1 {m n} } = \map {H_m} {\dfrac 1 m, \dfrac 1 m, \dotsc, \dfrac 1 m} + \map {H_n} {\dfrac 1 n, \dfrac 1 n, \dotsc, \dfrac 1 n}$
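For Shannon entropy this additivity reduces to $\log_2 \paren {m n} = \log_2 m + \log_2 n$, verified here numerically for one choice of $m$ and $n$:

```python
import math

def H(ps):
    # Shannon entropy in bits
    return sum(-p * math.log2(p) for p in ps if p > 0)

m, n = 3, 4
lhs = H([1 / (m * n)] * (m * n))   # uncertainty of mn equiprobable outcomes
rhs = H([1 / m] * m) + H([1 / n] * n)
print(math.isclose(lhs, rhs))  # True: log2(mn) = log2 m + log2 n
```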

### Axiom 8

Let:

- $p = p_1 + p_2 + \dotsb + p_m$
- $q = q_1 + q_2 + \dotsb + q_n$

such that:

- each of $p_i$ and $q_j$ are non-negative
- $p + q = 1$

Then:

- $\map {H_{m + n} } {p_1, p_2, \dotsc, p_m, q_1, q_2, \dotsc, q_n} = \map {H_2} {p, q} + p \map {H_m} {\dfrac {p_1} p, \dfrac {p_2} p, \dotsc, \dfrac {p_m} p} + q \map {H_n} {\dfrac {q_1} q, \dfrac {q_2} q, \dotsc, \dfrac {q_n} q}$
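This grouping property is the chain rule of Shannon entropy: the uncertainty of the full distribution equals the uncertainty of which group occurs, plus the weighted uncertainties within each group. A numerical check with arbitrarily chosen probabilities (an illustrative sketch only):

```python
import math

def H(ps):
    # Shannon entropy in bits
    return sum(-p * math.log2(p) for p in ps if p > 0)

ps = [0.1, 0.2, 0.3]   # m = 3 outcomes, total p = 0.6
qs = [0.15, 0.25]      # n = 2 outcomes, total q = 0.4
p, q = sum(ps), sum(qs)

lhs = H(ps + qs)
rhs = H([p, q]) + p * H([x / p for x in ps]) + q * H([x / q for x in qs])
print(math.isclose(lhs, rhs))  # True
```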

## Also known as

The **axioms of uncertainty** are also known as the **axioms of entropy**.

## Historical Note

The axioms of uncertainty as defined here are essentially the same as those proposed by Claude Elwood Shannon in his $1948$ paper *A Mathematical Theory of Communication*.

## Sources

- 1988: Dominic Welsh: *Codes and Cryptography*: $\S 1$: Entropy = uncertainty = information: $1.1$ Uncertainty