Real Numbers are Uncountably Infinite

Theorem
The set of real numbers $\R$ is uncountably infinite.

The Classic Proof
We show that the unit interval $[0,1]$ is uncountable, from which the uncountability of $\R$ follows immediately.

Suppose that $[0,1]$ is countable.

Clearly $[0,1]$ is not finite, because the numbers $\displaystyle \frac 1 n$ for $n \ge 1$ are pairwise distinct.

Therefore there is an injection $[0,1] \hookrightarrow \N$, which enumerates $[0,1]$ by a subset of the natural numbers.

By relabeling, we can associate each $x \in [0,1]$ to precisely one natural number to obtain a bijection.

Let $g$ be such a correspondence:

 * $g(1) = 0.d_{11} d_{12} d_{13} d_{14} \ldots$
 * $g(2) = 0.d_{21} d_{22} d_{23} d_{24} \ldots$
 * $g(3) = 0.d_{31} d_{32} d_{33} d_{34} \ldots$
 * $\qquad \vdots$

where juxtaposition of digits describes the decimal expansion of a number, so that $d_{nk}$ is the $k$th digit after the decimal point of $g(n)$.

Note that the decimal expansion of a real number is not necessarily unique: for example, $0.999\ldots = 1$.

This problem is overcome by disallowing infinite strings of $9$'s in the decimal expansion.
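The fact behind $0.999\ldots = 1$ can be checked by exact arithmetic. The following minimal Python sketch (the use of `fractions.Fraction` and the truncation points are our illustrative choices, not part of the proof) confirms that the $k$-digit truncation of $0.999\ldots$ falls short of $1$ by exactly $10^{-k}$:

```python
from fractions import Fraction

# Partial sums of 0.999... = 9/10 + 9/100 + ..., computed exactly:
# the k-digit truncation differs from 1 by exactly 10**-k.
for k in (1, 5, 10):
    s = sum(Fraction(9, 10**n) for n in range(1, k + 1))
    assert Fraction(1) - s == Fraction(1, 10**k)
    print(k, s)
```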

For every $n \in \N$ define $f_n = \left({d_{nn} + 1}\right) \bmod 9$. By construction $f_n \ne d_{nn}$, and $f_n$ is never $9$, so no forbidden string of $9$'s can arise in what follows.
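This rule can be sanity-checked exhaustively. Here is a minimal Python sketch (purely illustrative; the loop is ours, not part of the proof) verifying that $\left({d + 1}\right) \bmod 9$ always changes the digit and never yields $9$:

```python
# For every possible decimal digit d, the replacement (d + 1) % 9
# differs from d and is never 9, so no string of 9's can arise in y.
for d in range(10):
    f = (d + 1) % 9
    assert f != d and f != 9
```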

Let $y$ be defined by the decimal expansion:


 * $y = 0.f_1 f_2 f_3 \ldots$

Now


 * $y$ differs from $g(1)$ in the first digit of the decimal expansion
 * $y$ differs from $g(2)$ in the second digit of the decimal expansion

and in general, the $n^\text{th}$ digits of the decimal expansions of $g(n)$ and $y$ differ.
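To make the diagonal step concrete, here is a minimal Python sketch on an invented finite truncation (the list `sample` is a hypothetical beginning of an enumeration, chosen purely for illustration; the actual argument concerns an infinite enumeration):

```python
# Each row lists the first few decimal digits of a supposed g(n).
sample = [
    [1, 4, 1, 5, 9],   # g(1) = 0.14159...
    [3, 3, 3, 3, 3],   # g(2) = 0.33333...
    [7, 1, 8, 2, 8],   # g(3) = 0.71828...
    [9, 9, 9, 0, 0],   # g(4) = 0.99900...
    [0, 0, 0, 0, 1],   # g(5) = 0.00001...
]

# Apply f_n = (d_nn + 1) mod 9 along the diagonal to build y.
f = [(row[n] + 1) % 9 for n, row in enumerate(sample)]
print("y = 0." + "".join(map(str, f)))   # prints: y = 0.24012

# y differs from each listed g(n) at the nth decimal place.
for n, row in enumerate(sample):
    assert f[n] != row[n]
```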

By Existence of Base-N Representation, the decimal expansion of a real number is unique once infinite strings of $9$'s are disallowed.

So $y$ can be none of the numbers $g(n)$ for $n \in \N$.

But $y \in [0,1]$ and $g$ is a bijection from $\N$ onto $[0,1]$, so $y = g(n)$ for some $n \in \N$: a contradiction. Hence $[0,1]$ is uncountable, and so is $\R$.

The Set-Theoretical Approach
By definition, a set $A$ is countable iff there is a surjection $f: \N \to A$.

Suppose there were a surjection $f: \N \to \R$.

Then $\forall x \in \R: \exists n \in \N: f \left({n}\right) = x$ as $f$ is surjective.

Let $d_{n, 0}$ be the integer before the decimal point of $f \left({n}\right)$.

Similarly, for all $m > 0$, let $d_{n, m}$ be the $m$th digit in the decimal expansion of $f \left({n}\right)$.

Let $e_0$ be an integer different from $d_{0, 0}$.

Similarly, for all $m > 0$, let $e_m$ be an integer different from $d_{m, m}$.

Specifically, we can define $e_0$ to be $d_{0, 0} + 1$, and:
 * $e_m = \begin{cases} 1 & : d_{m, m} \ne 1 \\ 2 & : d_{m, m} = 1 \end{cases}$
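A minimal sketch of this digit rule (illustrative Python; the helper name `e` is our invention) confirms that it always produces a digit in $\left\{{1, 2}\right\}$ differing from the diagonal digit:

```python
# Pick 1 unless the diagonal digit is already 1, in which case pick 2.
def e(d):
    return 2 if d == 1 else 1

for d in range(10):
    assert e(d) != d and e(d) in (1, 2)
```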

Now consider the real number $\displaystyle x = e_0 + \sum_{n=1}^\infty \frac {e_n} {10^n}$.

Its decimal expansion is:
 * $x = \left[{e_0 . e_1 e_2 e_3 \ldots}\right]_{10}$
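As a worked illustration of the series defining $x$ (a hedged Python sketch; the digit values are invented for the example), a finite truncation can be evaluated exactly:

```python
from fractions import Fraction

# Invented sample digits: e_0 = 5 and e_1, e_2, e_3 = 1, 2, 1.
e = [5, 1, 2, 1]

# Truncation of x = e_0 + sum(e_n / 10**n), evaluated exactly.
x = sum(Fraction(d, 10**n) for n, d in enumerate(e))
print(x, float(x))   # prints: 5121/1000 5.121, i.e. [5.121]_10
```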

Since $e_0 \ne d_{0, 0}$, $x \ne f \left({0}\right)$.

Similarly, for each $n \in \N$ such that $n \ge 1$, we have that $e_n \ne d_{n, n}$ and so $x \ne f \left({n}\right)$. Note that since every $e_m$ lies in $\left\{{1, 2}\right\}$, the expansion of $x$ contains no infinite string of $9$'s, so it is the unique decimal expansion of $x$ and the digit-by-digit comparison is justified.

Thus $x$ is a real number which is not in the set $\left\{{f \left({n}\right): n \in \N}\right\}$.

Hence $f$ cannot be surjective, contrary to assumption. So no surjection $\N \to \R$ exists, and $\R$ is uncountable.

History
This was first demonstrated by Georg Cantor.

These proofs are examples of Cantor's Diagonal Argument.