this post was submitted on 03 Dec 2024
264 points (97.8% liked)
Yeah, no. LLMs predict what comes next, not what someone wants to hear.
Not so much what someone wants as what they expect, but that's what AI is designed to do.
What you're saying isn't factual. LLMs predict what comes next based on the parameters set during training. They might at times say what you're expecting, but try contradicting information the model holds as factual and see how far that gets you.
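As a toy sketch of what "predicting what comes next" means here: a trained model's parameters produce scores (logits) over the vocabulary, those scores become a probability distribution, and decoding picks from that distribution regardless of what the reader would prefer to hear. The prompt, tokens, and logit values below are all made up for illustration, not taken from any real model.

```python
import math

def softmax(logits):
    # Convert raw scores into a probability distribution over candidate tokens.
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical logits a trained model might assign to the next token
# after the prompt "The sky is" -- values invented for this example.
logits = {"blue": 5.1, "falling": 2.3, "green": 0.4}
probs = softmax(logits)

# Greedy decoding picks the highest-probability token; the model's
# learned parameters, not the user's preference, decide the outcome.
next_token = max(probs, key=probs.get)
print(next_token)  # -> blue
```

The point of the sketch: the distribution is fixed by training, so a user who wants the model to say "green" doesn't move the prediction unless the prompt itself shifts the logits.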
I think you're confusing agreeableness with being a validation buddy. For a product like this to work, it has to be inviting.
Now you’re just splitting hairs.