Information Contained in Letter of Alphabet
Theorem
Let $\psi$ be a letter of the English alphabet.
To a first degree of approximation, the quantity of information contained in $\psi$ is $4 \cdotp 7$ bits.
That is, the transmission of a single letter conveys approximately $4 \cdotp 7$ times as much information as the transmission of a single bit.
Further Analysis
Note that this analysis of the quantity of information in a letter of the English alphabet is appropriate only if each letter is equally likely to occur.
In practice the letters occur with unequal frequencies, and the concept of entropy, also known as uncertainty, is used instead.
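As a sketch of the distinction, the Shannon entropy $H = -\sum_i p_i \lg p_i$ of a uniform distribution over $26$ letters equals $\lg 26 \approx 4 \cdotp 7$ bits, while any non-uniform distribution has strictly smaller entropy. The skewed distribution below is purely hypothetical, chosen only for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 26 letters: entropy equals lg 26 ~ 4.7 bits.
uniform = [1 / 26] * 26
print(round(entropy(uniform), 1))  # 4.7

# A hypothetical skewed distribution (one letter with probability 1/2,
# the rest equally sharing the remainder): entropy is strictly smaller.
skewed = [0.5] + [0.5 / 25] * 25
print(round(entropy(skewed), 2))  # 3.32
```

This illustrates why the figure $4 \cdotp 7$ is only a first approximation for English, where letter frequencies are far from uniform.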
Proof
Let it be assumed for the purposes of this exercise that each letter has an equal probability of occurring as an element of a message.
There are $26$ letters of the English alphabet.
A message consisting of one of $n$ equally likely symbols conveys $\lg n$ bits of information.
The amount of information contained in $\psi$ is therefore:
- $\map I \psi = \lg 26 \approx 4 \cdotp 7$
where $\lg$ denotes the binary logarithm.
$\blacksquare$
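The numerical value above can be checked directly. The following computes $\lg 26$, including via the change-of-base rule from natural logarithms:

```python
import math

# I(psi) = lg 26 for one of 26 equally likely letters.
info = math.log2(26)
print(round(info, 1))  # 4.7

# The same value via change of base from natural logarithms.
via_change_of_base = math.log(26) / math.log(2)
print(abs(info - via_change_of_base) < 1e-12)  # True
```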
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.) ... (previous) ... (next): information theory
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.) ... (previous) ... (next): information theory