this post was submitted on 03 Jul 2023
25 points (100.0% liked)

Asklemmy


The Singularity is a hypothetical future event where technology growth becomes uncontrollable and irreversible, leading to unpredictable transformations in our reality [1]. It's often associated with the point at which artificial intelligence surpasses human intelligence, potentially causing radical changes in society. I'd like to know your thoughts on what the Singularity's endgame will be: Utopia, Dystopia, Collapse, or Extinction, and why?

Citations:

  1. Singularity Endgame: Utopia, Dystopia, Collapse, or Extinction? (It's actually up to you!)
[–] axtualdave 6 points 1 year ago (1 children)

In the short term, a series of collapses as we reach ever closer toward that singularity. There's a great many constraints on our ability to grow while on Earth, and it's proving difficult to get off the planet in any reasonable method with our current technology. I suspect we'll need to fall down and rebuild a couple times before we can reliably spread to other planets, or even simply exist in orbit.

Once we get up there, though, and we're no longer constrained by Earth's resource limits, we'll grow significantly. I suspect we'll move toward a machine-based society, both in automation and robotics, but also in integrating technology into our bodies.

At some point, someone is going to figure out how to do that mind to machine transfer, and we'll diverge as a species. The organic humans and the composite AI / machine-based humanity.

Knowing how stupid we are, though, we'll probably end up becoming the Borg.

[–] whileloop 1 points 1 year ago

Resistance is futile

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

Well, let me put it this way... Enjoy your days now, not later. :)

And prepare to move to a country where tech is not very widespread. Try to gather money so you can move if you want to.

[–] [email protected] 3 points 1 year ago

All of the above.

Humanity is, at its core, motivated by self-interest. The singularity will be harnessed by those with the power and means to do so, while those without will either suffer or die.

The powerful few will adapt to the singularity, using it to craft their own utopia. The masses, without access to the power the upper class enjoys, will fall into a dystopia, while even more marginalized strata of society go extinct, completely unnoticed.

[–] Candelestine 2 points 1 year ago (1 children)

Utopia or extinction, depending on the perspective of the person asking. Homo sapiens cannot exist forever, that would require a halting of DNA mutation and biological adaptation. Will "we" still be here even after we've begun to require a different classification term for ourselves, just for scientific clarity?

[–] [email protected] 1 points 1 year ago

I think for the purposes of OP's question, we can ignore genetic evolution. That takes place over hundreds or thousands of generations, and recorded history hasn't been around that long.

[–] benjithedog 2 points 1 year ago

I believe collapse is inevitable. More interesting is what comes after. If we reach true AI before the collapse, it could go either way afterwards but I’m hoping people will create a better society from the ashes.

At least for the time we’ll have left, because AI or no AI, climate won’t be getting fixed any time soon.

[–] [email protected] 2 points 1 year ago

According to Connor Leahy, companies are currently engaged in a race to be the first to achieve AGI, prioritizing speed over safety, as mentioned in his video (source). I firmly believe that unless significant changes occur, we are headed toward extinction. We may succeed in creating a highly powerful AGI, but it might disregard our existence and eventually destroy us, not out of malicious intent, but simply because we would be in its way, in the same way humans don't consider ants when constructing a road.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

There will not be a singularity. Global capitalism will absolutely collapse, and on its way down will become more dystopian. Humanity isn't going extinct.

E: the cause of this process is not human nature. Anyone who tells you it is has simply failed to study history. We can have a utopia but global capital has to collapse first to make space for it.

[–] [email protected] 1 points 1 year ago

I'll do you one step better. What about when our ai meets another ai?

Our existence is based on death and war. There is a lot of evidence to suggest we killed off all the other human-like species, such as the Neanderthals.

And that is the reason we progressed to a state where we have developed our world and society we know today, and all the other species are just fossils.

We were the most aggressive and bloodthirsty species among all the aggressive and bloodthirsty alternatives, and even though we have domesticated our world, we have only begun to domesticate ourselves.

Think about how we still have seen genocides in our own time.

Our AI will hopefully pacify these instincts. Most likely not without a fight from certain parties that will consider their right to war absolute.

Like the One Ring, how much of that aggressiveness will get poured into our AI?

What if our AI, in the exploration of space, encounters another AI? Will it be like the early humanoid species, where we either wipe out or get wiped out ourselves?

Will our AIs have completely abstracted away all the senseless violence?

If you want a really depressing answer, read the second book of the Three-Body Problem trilogy: The Dark Forest.

[–] [email protected] 1 points 1 year ago (1 children)

AI doesn’t think like that


Almost every comment I've seen treats the future as hopeless, and I'm going to largely chalk that up to the postmodernist/realist consciousness of our society in this time period.

I think the future will be a utopia, and there isn't a long-term reason (I mean over centuries- or millennia-long developments) to think otherwise. The idea of utopia has pushed civilization to confront power structures and create new ones, and to rethink what was impossible or too difficult to accomplish. The many rights, freedoms, and ideas that people around the world take for granted today began with people envisioning a utopia and trying to make it happen. These ideas can't be done away with, as Alexis de Tocqueville observed.

Right now there are problems for sure, and I personally think liberty and equality are only a parody of utopia at this point, but that will change over a long time.

Human civilization is only 6,000 years old! We're still working with the brains of primitive humans, and we aren't even toddlers yet in the grand lifespan of Earth. I think people tend to forget that sometimes.

We'll get to a better place, and our consciousness is always changing to confront the problems we face today (biosphere collapse, resource hoarding, infighting, etc).

Democracy took centuries to develop coherently, and even then it failed MANY times at first. But look at it now.

[–] [email protected] 1 points 1 year ago

I think the Fermi paradox would suggest otherwise. If all civilizations succeeded in the long term, we would have seen evidence of one by now.

[–] [email protected] 1 points 1 year ago (1 children)

It really depends on what AI we raise.

[–] [email protected] 2 points 1 year ago

So we're fucked then.....

[–] queermunist 1 points 1 year ago* (last edited 1 year ago)

There are too many structural problems with the extractive economy for our current society to survive. As resources dwindle and climate change gets worse the smaller countries will start to collapse and entire regions will go to war over resources. Billions of humans will be forced to migrate out of uninhabitable zones around the globe and they'll do anything to escape. The ones that can't escape will eat each other (metaphorically and literally).

There won't be a singularity. There probably won't even be a global internet in 30 years.

[–] [email protected] -1 points 1 year ago

The singularity already happened. We have corporations that are unregulatable. They create their own rules and use those rules to grow further, at the cost of all our resources. AI will be used by those corporations to grow further, but it won't be the game changer: the dystopia is one we're already living in, and it's still expanding.