this post was submitted on 08 Mar 2025
14 points (81.8% liked)
Apple
you are viewing a single comment's thread
view the rest of the comments
I don’t even know if it’s theoretically possible to make a hallucination-free LLM. Hallucinating is kind of its basic operating principle.
People are misled by the name. It’s not making stuff up; it’s just less accurate.
Less accurate, as in misleading and outright false.
It always predicts the next word based on its tokenisation, its training data, and the context it’s handling. So accuracy is all there is.
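To make the "it only predicts the next word" point concrete, here's a minimal sketch using GPT-2 through Hugging Face transformers (the model and prompt are just illustrative choices, not anything from this thread). The model only returns a probability distribution over possible next tokens; a fluent but false continuation is scored exactly the same way as a true one, which is why "hallucination" is really just low accuracy.

```python
# Minimal sketch: inspect the next-token distribution of a small LLM.
# Assumes torch and transformers are installed; GPT-2 is an arbitrary example model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of Australia is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Probability distribution over the *next* token only.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)

# The model has no notion of "true" vs "made up" here, only relative likelihood.
top = torch.topk(next_token_probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(idx)])!r:>12}  p={p:.3f}")
```

Whether the highest-probability continuation happens to be factually correct depends entirely on what the training data made likely, not on any fact-checking step.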