this post was submitted on 12 Aug 2023
[–] [email protected] 2 points 1 year ago (1 children)

I don't understand what you think the problem is. What do you mean infinities can't be differentiated from each other? Infinite cardinals are by definition equivalence classes of sets that can be placed in bijection with one another. You can compare one cardinal to another by showing there exists an injection from a representative set of the first into a representative set of the other. You can show equality by showing there is an injection both ways (the Cantor–Schröder–Bernstein theorem) or otherwise producing a bijection explicitly. Infinite ordinals may as well be copies of the natural numbers indexed by infinite cardinals, so of course we can distinguish them too.
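To make the "bijection witnesses equal cardinality" point concrete, here's a toy Python sketch (my own illustration, not anything from the thread): the map n ↦ 2n pairs the naturals with the even naturals, so the two sets have the same cardinality even though one is a proper subset of the other. We can only check a finite window by computer, of course.

```python
# Finite-window check that n -> 2n is a bijection between the
# naturals and the even naturals (a toy illustration only).
def double(n):
    return 2 * n

window = range(10_000)
image = [double(n) for n in window]

# Injective: no two inputs in the window collide.
assert len(set(image)) == len(image)

# Surjective onto the evens below 2 * 10_000: every even number
# in that range is hit.
assert set(image) == {m for m in range(20_000) if m % 2 == 0}
```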

[–] [email protected] 1 points 1 year ago (1 children)

So far, AFAIK, we have two kinds of infinity: those that can be accommodated at Hilbert's Grand Hotel (e.g. the integers, the fractions, etc.) and those that cannot (the set of irrational numbers, the set of curves, the set of polytopes, etc.). This is why we had to differentiate orders of infinity, e.g. ℵ₀ (the Grand Hotel set), ℵ₁ (the irrational set, the real set), ℵ₂ (???), ℵ₃ (?????), ℵₙ (??!??????????)

For infinities of higher order than ℵ₀, we can only tell whether they equal ℵ₁ or are undetermined, which means their infinite size is ℵ₁ or greater, but still unknown.

Unless someone has done some Nobel-prize-worthy work in mathematics that I haven't heard about, which is quite possible.

[–] [email protected] 1 points 1 year ago (1 children)

No, that's definitely not true. As I said, infinite cardinals (like the cardinality ℵ₀ of the naturals) are defined to be equivalence classes of sets that can be placed in bijection with one another. Whenever you have infinite sets that can't be placed in bijection, they represent different cardinals. The set of functions f : X → X also has cardinality 2^(|X|) when X is infinite, so e.g. there are more real-valued functions of real numbers than there are real numbers. You can use this technique to get an infinite sequence of distinct cardinals (via Cantor's theorem, which has a simple constructive proof). And once you have all of those, you can take their (infinite) union to get yet another, greater cardinal, and continue that way. There are in fact more cardinalities obtainable this way than could fit into a set: the (infinite) number of infinite cardinals is too big to be an infinite cardinal.
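A finite analogue of that "sequence of distinct cardinals" idea (my own sketch, just to show the mechanism): iterating the power set strictly increases size at every step, since |2^X| = 2^(|X|) > |X| for finite sets too.

```python
# Finite analogue of iterating the power set: sizes grow strictly,
# mirroring how repeated power sets yield ever-larger cardinals.
sizes = [3]  # start with a 3-element set
for _ in range(3):
    sizes.append(2 ** sizes[-1])  # |2^X| = 2^|X|

# sizes is now [3, 8, 256, 2**256], strictly increasing.
assert all(a < b for a, b in zip(sizes, sizes[1:]))
```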

You might be thinking of the generalized continuum hypothesis that says that there are no more cardinal numbers in between the cardinalities of power sets, i.e. that ℵ₁ = 2^(ℵ₀), ℵ₂ = 2^(ℵ₁), and so on.

[–] [email protected] 1 points 1 year ago (1 children)

It's quite possible that what I'm encountering is a momentary failure to understand Cantor's theorem, or rather the mechanism it uses to enumerate the innumerable. So my math may just be old.

[–] [email protected] 1 points 1 year ago

Cantor's theorem says the power set of X has a strictly larger cardinality than X.

When |X| is a natural number, the power set of X has cardinality 2^(|X|), since you can think of an element of the power set as a choice, for each element of X, of "in the subset" vs "not in the subset." Hence the notation 2^X for the power set of X.
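That "in the subset / not in the subset" choice can be spelled out in a few lines of Python (a toy sketch of mine, for finite X only): each element doubles the number of subsets, which is exactly why |2^X| = 2^(|X|).

```python
def power_set(xs):
    """Build the power set by an in/out choice for each element."""
    subsets = [set()]
    for x in xs:
        # Each new element doubles the count: every existing subset
        # either gains x ("in") or stays as it is ("out").
        subsets += [s | {x} for s in subsets]
    return subsets

# |2^X| = 2^|X| for a finite X
assert len(power_set([1, 2, 3, 4])) == 2 ** 4
```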

Cantor's theorem applies to all sets, not just finite ones. You can show this with a simple argument. Let X be a set and suppose there is a bijection f : X -> 2^(X). Let D be the set { x in X : x is not in f(x) }. (This is well defined by the axiom schema of separation in ZFC, since we only select elements from the existing set X, so we aren't running into a Russell's paradox issue.) Since f is a bijection (in fact, a surjection suffices), there is an element y of X so that f(y) = D. Now either:

  • y is in D. But then by definition y is not in f(y) = D, a contradiction.

  • y is not in D. But then by definition, since y is not in f(y), y is in D.

Thus, there cannot exist such a bijection f, and |2^(X)| != |X|. Moreover, x ↦ {x} is an injection from X into 2^(X), so |X| <= |2^(X)|; combining the two gives |2^(X)| > |X|.
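The diagonal set D from the proof can even be checked exhaustively for a small finite X (my own sketch): for every one of the 8³ = 512 functions f : X -> 2^(X), the set D = { x : x not in f(x) } is never in the image of f, so no f is surjective.

```python
from itertools import combinations, product

def subsets(X):
    """All subsets of X, as frozensets."""
    X = list(X)
    return [frozenset(c)
            for r in range(len(X) + 1)
            for c in combinations(X, r)]

def diagonal_set(X, f):
    """D = { x in X : x not in f(x) }, as in the proof above."""
    return frozenset(x for x in X if x not in f(x))

X = [0, 1, 2]
P = subsets(X)  # 2**3 = 8 subsets

# Exhaustively try every function f : X -> 2^X (8**3 = 512 of them):
# the diagonal set always falls outside the image of f.
for images in product(P, repeat=len(X)):
    f = dict(zip(X, images))
    D = diagonal_set(X, f.__getitem__)
    assert D not in f.values()  # f misses D, so f is not surjective
```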