top 6 comments
[–] [email protected] 2 points 2 days ago

It's a travesty. The whole LLM "AI" push is a fraud. There's nothing approaching actual intelligence. It's simply statistical word strings.

[–] paraphrand 4 points 3 days ago* (last edited 3 days ago) (1 children)

I frankly think Anthropic and OpenAI would struggle to make a hallucination-free AI too. I don’t understand why Apple thinks it is going to be able to fix hallucinations.

[–] [email protected] 9 points 3 days ago (1 children)

I don’t even know if it’s theoretically possible to make a hallucination-free LLM. Hallucination is kind of its basic operating principle.

[–] [email protected] 2 points 3 days ago (1 children)

People are misled by the name. It’s not making stuff up, it’s just less accurate.

[–] [email protected] 1 points 2 days ago (1 children)

Less accurate as in misleading and outright false.

[–] [email protected] 1 points 2 days ago

It always predicts the next word from its tokenisation, its training data, and the context it’s handling. So accuracy is all there is.
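To make that concrete, here’s a minimal toy sketch in Python of the loop being described: score plausible next words given the current context, then sample one. The bigram table and the words in it are made up purely for illustration (this isn’t any real model or vendor’s code); the point is that nothing in the loop ever checks whether the output is true, only whether it’s statistically plausible.

```python
import math
import random

# Toy "training data" statistics: counts of which word follows which.
# Entirely made up for illustration.
BIGRAM_COUNTS = {
    "apple": {"intelligence": 4, "watch": 3, "invented": 1},
    "intelligence": {"is": 5, "hallucinates": 2},
    "invented": {"the": 6},
    "the": {"iphone": 4, "telephone": 2},  # plausible continuations, not facts
}

def next_token(context_word: str, temperature: float = 1.0) -> str:
    """Pick the next word by softmax-sampling over observed follow-up counts."""
    counts = BIGRAM_COUNTS.get(context_word, {"<eos>": 1})
    words = list(counts)
    # Log-counts act as logits; temperature scales how peaked the sampling is.
    logits = [math.log(counts[w]) / temperature for w in words]
    m = max(logits)
    weights = [math.exp(l - m) for l in logits]
    return random.choices(words, weights=weights, k=1)[0]

def generate(start: str, max_len: int = 5) -> str:
    out = [start]
    for _ in range(max_len):
        tok = next_token(out[-1])
        if tok == "<eos>":
            break
        out.append(tok)
    return " ".join(out)

if __name__ == "__main__":
    # One possible output is "apple invented the telephone":
    # perfectly fluent, perfectly wrong.
    print(generate("apple"))
```

A real LLM replaces the bigram table with a neural network over a huge context window, but the loop is the same shape: predict, sample, repeat.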