this post was submitted on 23 May 2024
220 points (98.2% liked)
TechTakes
1490 readers
32 users here now
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
founded 2 years ago
you are viewing a single comment's thread
view the rest of the comments
LLMs don't understand any words.
yes. and you wouldn't believe¹ what's in the replies when you make this simple and obvious statement.
¹ who i am kidding. of course you know.
I both agree and disagree. I think of them as golems. They do understand how to respond, but that's as deep as it goes. It's simulated understanding, but a very very good simulation... Okay maybe I do agree.
I think that at best you could say that they understand the relationship between tokens. But even that requires a really generous definition of the word "understand".
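To make the "relationship between tokens" point concrete: here's a deliberately toy sketch using bigram counts, a stand-in for what an LLM learns at vastly larger scale and with far richer statistics (the corpus and variable names are made up for illustration; real models learn continuous representations, not count tables):

```python
from collections import Counter, defaultdict

# Toy corpus; real LLMs train on trillions of tokens, not a dozen.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which token tends to follow which: the crudest possible
# notion of a "relationship between tokens" (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

# The model "knows" that "the" is usually followed by "cat",
# without any notion of what a cat is.
print(following["the"].most_common(1))  # → [('cat', 2)]
```

The point of the sketch: the table captures statistical co-occurrence and nothing else, which is why "understand" needs such a generous definition here.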
There's a saying..."Knowledge is knowing a tomato is a fruit. Wisdom is knowing not to put it in fruit salad."
Meanwhile, LLMs are telling us to put glue on pizza so the cheese sticks. Even if the technology could eventually deliver on the promise, by the time we get there, nobody intelligent will trust it, because the tech bros are, once again, throwing half-baked garbage out into the world trying to be first to market.
I didn't trust it from the very moment of the announcement.
Well, so are humans. At least one human, 11 years ago, on reddit.
Yes, but the general population doesn't expect shitposts from their Google search. When I'm reading a meme community, I want shitposts. When I'm googling recipes, I'm looking for reliable instructions on how to make dinner. It's all part of the whole "LLMs don't know what they're saying" issue.
Yeah, fair.