this post was submitted on 15 Aug 2024
0 points (50.0% liked)
Apple
They could offload the processing to a device that does. They currently do this for personal requests. Putting 8 gigabytes of RAM in a smart speaker is a bit silly.
8 GB of RAM may be a lot for the average smart speaker, but not for one that costs $300. Then again, we know Apple prices RAM like it’s made from diamonds and unicorn queefs.
Possibly, but I’m not sure that’s what OP was looking for in an answer.
It could be an answer. So far, what your comments have taught me is this: the HomePod hardware is too weak to host Apple Intelligence locally, but there may be a workaround in outsourcing the processing to a server. On a larger scale, it would make sense for Apple to open AI servers for older devices and charge a monthly fee for access. It would likely be slower than local processing (depending on internet speeds), but it would make AI available to more people, while generating revenue and giving a preview of what you could get with a newer device that runs it locally.
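The offloading idea above boils down to a simple routing decision: handle the request on-device when the hardware is capable enough, otherwise forward it to a server. Here is a minimal sketch of that decision logic; the 8 GB threshold, function names, and labels are all hypothetical, chosen only to illustrate the tradeoff, not how Apple actually implements it:

```python
# Hypothetical sketch: decide where an assistant request would run,
# based on how much RAM the device has. Threshold is an assumption.

LOCAL_RAM_REQUIRED_GB = 8  # assumed minimum for on-device models

def route_request(prompt: str, device_ram_gb: float) -> str:
    """Return where the request would be handled: 'local' or 'server'."""
    if device_ram_gb >= LOCAL_RAM_REQUIRED_GB:
        return "local"   # fast and private, but needs capable hardware
    return "server"      # works on weak hardware, adds network latency

# A smart speaker with little RAM would offload; a newer Mac would not.
print(route_request("turn off the lights", device_ram_gb=1))   # server
print(route_request("summarize my notes", device_ram_gb=16))   # local
```

The point of the sketch is that the weak device only needs enough resources to capture the request and talk to the network; the heavy model inference happens wherever the routing decision sends it.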
I’m not convinced Apple will do that, because they want to run as much AI on device as possible for privacy reasons. My guess is they’ll save Apple Intelligence for the rumored new home hub with a robotic-arm-mounted display, whatever that turns out to be.