this post was submitted on 21 Jun 2024
230 points (94.9% liked)

Ask Lemmy

27256 readers
2420 users here now

A Fediverse community for open-ended, thought provoking questions


Rules: (interactive)


1) Be nice and; have funDoxxing, trolling, sealioning, racism, and toxicity are not welcomed in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them


2) All posts must end with a '?'This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with ?


3) No spamPlease do not flood the community with nonsense. Actual suspected spammers will be banned on site. No astroturfing.


4) NSFW is okay, within reasonJust remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed, please direct them to either [email protected] or [email protected]. NSFW comments should be restricted to posts tagged [NSFW].


5) This is not a support community.
It is not a place for 'how do I?', type questions. If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email [email protected]. For other questions check our partnered communities list, or use the search function.


6) No US Politics.
Please don't post about current US Politics. If you need to do this, try [email protected] or [email protected]


Reminder: The terms of service apply here too.

Partnered Communities:

Tech Support

No Stupid Questions

You Should Know

Reddit

Jokes

Ask Ouija


Logo design credit goes to: tubbadu


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] Jimmyeatsausage 59 points 6 months ago (3 children)

LLMs are not general AI. They are not intelligent. They aren't sentient. They don't even really understand what they're spitting out. They can't even reliably do the one thing computers are typically very good at (computational math), because they're just putting sequences of (to them) nonsense characters together in the most likely order based on their training data.
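To make the "most likely order" point concrete, here's a toy sketch (nothing like a real transformer; just bigram frequency counting, with names and training text made up for illustration) of a model that always emits whichever word most often followed the current one. It produces fluent-looking output without ever doing arithmetic:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows which in the training text."""
    counts = defaultdict(Counter)
    words = text.split()
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, n=5):
    """Greedily emit the most frequent next word, n times."""
    out = [start]
    for _ in range(n):
        followers = counts.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

model = train_bigrams("two plus two is four two plus two is four")
print(generate(model, "two"))  # fluent-looking, but it never computed anything
```

The "model" happily chains words that look statistically right; the arithmetic is pure coincidence of the training text.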

When LLMs feel sentient or intelligent, that's your brain playing a trick on you. We're hard-wired to look for patterns and group things together based on those patterns. LLMs are human-speech prediction engines, so it's tempting and natural to group them with the thing they're emulating.

[–] [email protected] 10 points 6 months ago (1 children)

Yup, 100%. These "AIs" have trouble filtering out misinformation and finding trusted sources, and they're vulnerable to other forms of manipulation. Jokes and memes probably have an impact too; because it's not human, it's not going to think the same way we do and realize something is a joke, or just stupid people posting "3 + 4 × 8 = 56" B.S.

And please for the love of god, don't start the stupid math debates again, thank you.

[–] maniclucky 1 points 6 months ago

To pile on: they don't filter anything or search anything. They're clever parrots built out of huge stacks of linear algebra. An LLM has no understanding of anything, and no interest in doing more than generating sentences that look right given a prompt. Even saying it has 'no understanding' or 'no interest' gives it too much credit, since that implies intelligence or decision-making capability. It's just ridiculously vast math.
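A flavor of that "vast math": at every step, the model reduces to turning a list of raw scores (logits) into a probability distribution over candidate next tokens, typically via a softmax. A minimal pure-Python sketch (the toy numbers are made up for illustration):

```python
import math

def softmax(logits):
    """Turn raw scores into probabilities that sum to 1."""
    mx = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - mx) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for three candidate next tokens.
probs = softmax([2.0, 1.0, 0.1])
print(probs)  # highest score -> highest probability; no meaning involved
```

That's the whole decision: arithmetic over scores, repeated billions of times. Nothing in it "knows" what the tokens mean.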

[–] cley_faye 2 points 6 months ago

Humans love to see patterns in everything.

[–] [email protected] 1 points 6 months ago

When LLMs feel sentient or intelligent, that's your brain playing a trick on you.

Sentient = prob a trick

Intelligent? Maybe a broken clock is right twice a day?

You write a sentence that doesn't sound quite right. You pretty much know how an author you respect would write it, but you can't remember the exact syntax and word choice. So you ask a model for a dozen revisions of the sentence in disparate styles, and one of them clicks: "ooh! That's what I meant!"

Am I being pedantic to say the LLM can feel intelligent when it nails the exact word choice you were looking for, better than half your social circle could have managed? Half your friends aren't dumb, but the LLM can sometimes sound better than them, so you think: "yeah, sounds intelligent!"

Of course…

Later it totally misunderstands some context, needs unbelievable hand-holding and still doesn’t get it, confabulates moronically… and it’s back to stupid! Mmmm glue pizza