this post was submitted on 17 May 2024
503 points (94.8% liked)
Technology
"Hallucination" refers to a specific bug (the AI confidently BSing), not to all bugs as a whole.
Honestly, it's the most human you'll ever see it act.
It's got upper management written all over it.
Isn't it more accurate to say it's outputting incorrect information from a poorly processed prompt/query?
No, because it's not poorly processing anything. It's not even really a bug. It's doing exactly what it's supposed to do: spit out words in the "shape" of an appropriate response to whatever was just said.
When I wrote "processing", I meant it in the sense of getting to that "shape" of an appropriate response you describe. If I'd meant it in a conscious sense, I would have written "poorly understood prompt/query". But I see where you were coming from.