Are you assuming LLMs are the only way humans could ever try making an AGI? If so, why do you assume that?
If people start developing a new, more promising kind of "AI", we can talk about it then. For now, the thing we call "AI" sucks and just steals.
I agree that AGI is dangerous, but I don't see LLMs as evidence that we're close to AGI; I think they should be treated as separate issues.
Given what I think I know about LLMs, I agree. I don’t think they’re the path to AGI.
The person I replied to said AGI was never going to emerge.
I had meant to say AGI would never emerge from our current attempts at creating it.
There's more important shit to worry about than whether an unproven sci-fi concept will come into being any time soon.
Yeah, agreed. That’s not what I asked though.
This response is a bit of a misdirection, since we all discuss shit that isn't the most important thing all the time.