this post was submitted on 30 Sep 2023
56 points (95.2% liked)
Futurism
430 readers
1 users here now
A place to discuss the ideas, developments, and technology that can and will shape the future of civilization.
Tenets:
(1) Concepts are often better treated in isolation -- eg: "what if energy became near zero cost?"
(2) Consider the law of unintended consequences -- eg: "if this happens, then these other systems fail"
(3) Pseudoscience and speculative physics are not welcome. Keep it grounded in reality.
(4) We are here to explore the parameter spaces of the future -- this includes political system changes that advances may trigger. Keep political discussions abstract and not about current affairs.
(5) No pumping of vapourware -- eg: battery tech announcements.
See also: [email protected] and [email protected]
founded 1 year ago
you are viewing a single comment's thread
The "war" with LLMs is already over, that is to say there was no war. We embraced it.
Given how easily we accepted it and how slow legislation has been to have any meaningful impact on the development of the models if we achieve true AGI we'll see a similar path to integration.
My opinion is different from some: in my eyes, AGI is a kind of natural evolution of us. If we create it and foster its early development, we will effectively be its parents. As our progeny, it will outlast humanity no matter its relationship with us.
In my eyes that's the only possible way to truly preserve a piece of us forever, and it may be the only way to get information out beyond the Fermi paradox, if the key tenet of the paradox that proves true is the short lifespan/implosion of advanced societies.
Creating AGI is what religious people would call judgement day: we'll let an advanced and evolving entity make the judgements about our societies that we can't.
Maybe they'll let us live the same way we let mosquitoes live: we don't wage wars where every human goes out and kills mosquitoes. We find them completely useless, even a leech on each of us, and if we found an easy way to eradicate them, we probably would.
Whatever happens, we deserve it. I don't say this to mean "the people researching and releasing LLMs are making mistakes"; in fact, I think those people are some of the best and brightest among us. Rather, I'm saying humanity on the whole is pretty fucked up. Genocide, torture, hatred, religious bigotry, nationalism, tribalism: humans are largely just kinda smart and honestly fairly evil primates.
We literally raise sentient creatures from birth in cages to murder and devour them, and the vast majority of society accepts that?!
To be more concrete, I don't think LLMs will be the path to AGI, but I do think they are an essential component of a consciousness: the "stream" of consciousness specifically.
5 years: nothing much changes. LLMs are treated like Google is today, an incredible information source and partner for creation.
50 years: hard to conceive of this period arriving without AGI. I think humans will suffer as they weakly combat the ideology of a more advanced race being in charge. Not physical combat, but literally "back in my day" style combat.
500 years: I'd be surprised if we last that long with our current structure and march toward doom. Perhaps the new AGI race strips what it needs to leave the planet and we revert to 1800s technology. So hard to predict.