Monday, January 5, 2026

Cantor's Diagonal Argument

Cantor's diagonal argument is famous for proving the uncountability of the real numbers. It is not so well known that Cantor's first set theory article proved it differently.

Here is the proof. First of all, we need to assume the completeness of the reals, in some form.

Theorem. If { Ak } is a collection of nonempty, nested, compact subsets of R, the real numbers, then the intersection is nonempty.

Compact means closed and bounded, like the interval [0,1], including the endpoints. Nested means each set contains the next. The proof depends on how R has been defined. For example, for nested intervals you could get an intersection point as the least upper bound of the left endpoints.

This is a way of expressing the completeness of the reals. If we used the rationals instead, then we could have nested intervals about √2, and the intersection would not include any rational numbers.
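As an illustration (in floating point, so only approximately), here is a sketch of nested intervals about √2 produced by bisection; the left endpoints increase toward the intersection point, which is their least upper bound.

```python
# Nested closed intervals shrinking toward sqrt(2): each [lo, hi] has
# lo^2 <= 2 <= hi^2, and each interval contains the next.  The
# intersection point is the least upper bound of the left endpoints.
lo, hi = 1.0, 2.0
lefts = []
for _ in range(40):
    lefts.append(lo)
    mid = (lo + hi) / 2
    if mid * mid <= 2:
        lo = mid          # sqrt(2) lies in the right half
    else:
        hi = mid          # sqrt(2) lies in the left half

sup_of_lefts = max(lefts + [lo])   # the lefts are nondecreasing
```

No rational number is ever the intersection point here, which is exactly why the rationals fail the theorem.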

To prove uncountability, suppose we have an enumeration { xk } of R, and we seek a contradiction. We define a collection { Ak } with xk not in Ak, and apply the theorem to get a real number in all the sets, and hence not in the enumeration.

Start with [0,1]. Let A1 be a closed subinterval excluding x1. E.g., if x1 = 0.4, then A1 could be [0, 0.3] or [0.5, 1]. Likewise, let A2 be a closed subinterval of A1 that excludes x2, and so on, so that Ak excludes x1, ..., xk. Applying the theorem gives at least one real in all of the intervals, and hence not part of the enumeration.
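Here is a sketch of that construction in Python, run on a finite prefix of a hypothetical enumeration. At each step we split the current interval into thirds and keep a closed third that misses xk, so the final interval excludes every listed number.

```python
def avoid(lo, hi, x):
    # Split [lo, hi] into thirds and return a closed third not containing x.
    a = lo + (hi - lo) / 3
    b = lo + 2 * (hi - lo) / 3
    if x >= b:
        return lo, a      # x is at or past the right third; keep the left third
    return b, hi          # otherwise x < b, so the right third avoids x

def missed_interval(xs, lo=0.0, hi=1.0):
    # Shrink [lo, hi] so the k-th interval excludes xs[0], ..., xs[k].
    for x in xs:
        lo, hi = avoid(lo, hi, x)
    return lo, hi

xs = [0.4, 0.1, 0.9, 0.5, 0.5]    # a made-up enumeration prefix
lo, hi = missed_interval(xs)       # a nonempty interval missing every xs[k]
```

Since the intervals are nested, excluding xk at step k excludes it from every later interval as well.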

This is really a diagonalization argument, and maybe less intuitive, so what's the point?

I like it better. The usual diagonal argument is deceptively simple, as it assumes facts about the reals, such as that every decimal expansion converges to a real, and that reals have unique decimal expansions, with certain exceptions. The nested-interval proof is still a diagonal argument, but it gets more directly to the heart of the matter, using the completeness of the reals explicitly and without the distraction of decimal expansions.
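For comparison, here is a sketch of the usual diagonal construction on decimal digit strings. Avoiding the digits 0 and 9 in the anti-diagonal sidesteps the dual expansions such as 0.0999... = 0.1000....

```python
def anti_diagonal(rows):
    # rows[k] is the decimal-digit string of the k-th listed real in [0, 1].
    # Change the k-th digit of the k-th row, never using 0 or 9, so the
    # result has a unique expansion and differs from every row.
    return "".join("5" if row[k] != "5" else "4" for k, row in enumerate(rows))

rows = ["1415", "0000", "9999", "5555"]   # made-up digit strings
d = anti_diagonal(rows)                    # differs from rows[k] at place k
```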

Cantor gave a similar diagonal argument to show that the cardinality of a set A is less than that of the set P(A) of all its subsets.

Suppose not, so that there is some onto function f: A -> P(A); that is, the image of f is all of P(A). Then let B = { a in A : a not in f(a) }. If B = f(b) for some b in A, then b is in f(b) if and only if b is not in f(b), a contradiction. Thus the image of f cannot be all of P(A).

This shows that taking all subsets always gives a set of larger cardinality.
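For a small finite A this can even be checked exhaustively. The sketch below tries every possible function f: A -> P(A) for a three-element A and confirms that the diagonal set B is never in the image.

```python
from itertools import combinations, product

A = (0, 1, 2)
# Every subset of A, as frozensets: this is P(A), with 2**3 = 8 members.
power = [frozenset(c) for r in range(len(A) + 1) for c in combinations(A, r)]

def diagonal_set(f):
    # B = { a in A : a not in f(a) }, the set from Cantor's proof.
    return frozenset(a for a in A if a not in f[a])

# Check every possible f: A -> P(A) (8**3 = 512 of them): the diagonal
# set always lies outside the image, so no f is onto.
misses = [diagonal_set(dict(zip(A, images))) not in images
          for images in product(power, repeat=len(A))]
```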

Scott Aaronson recommends a video on Cantor's diagonal argument.
