this post was submitted on 03 Nov 2024
1274 points (99.4% liked)

Fuck AI

1449 readers
314 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 8 months ago
[–] Buddahriffic 2 points 3 weeks ago (1 children)

Why not both? Plus, choosing not to trust LLMs is something any of us can decide to do on our own.

[–] [email protected] 1 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Because the average person doesn't even know what an LLM is or what the acronym even stands for, and putting a misinformation generator at the top of search pages is irresponsible.

Like, if something is so unreliable that you have to say "don't trust what this thing says," but you still put it at the top of the page? Come on... It's like putting a self-destruct button in a car and telling people, "well, the label says not to push it!"

[–] Buddahriffic 0 points 3 weeks ago

We don't control what Google puts on their search page. Ideally, yeah, they wouldn't be pushing their LLM out to where it's the first thing that responds to people who don't understand that it isn't reliable. But we live in a reality where they did put it at the top of their search page, and where they likely don't even care what we think of that. Their interests and everyone else's don't necessarily align.

That comment was advice for people who read it and haven't yet realized how unreliable it is; it has nothing to do with the average person. I'm still confused as to why you have such an issue with it being said at all. Based on what you've been saying, I think you'd agree that Google is being either negligent or malicious by doing so. So saying they shouldn't be trusted seems like common sense, but your first comment acts like it's just being mean to anyone who has trusted it or something.