this post was submitted on 31 Jul 2023
18 points (87.5% liked)
Apple
Tasks the Apple Neural Engine Takes Responsibility For
It’s time to dive into just what sort of jobs the Neural Engine takes care of. As previously mentioned, every time you use Face ID to unlock your iPhone or iPad, your device uses the Neural Engine. When you send an animated Memoji message, the Neural Engine is interpreting your facial expressions.
That’s just the beginning, though. Cupertino also employs the Neural Engine to help Siri better understand your voice. In the Photos app, when you search for images of a dog, your iPhone does so with ML (hence the Neural Engine).
Initially, the Neural Engine was off-limits to third-party developers. It couldn’t be used outside of Apple’s own software. In 2017, though, Cupertino released the Core ML API to developers with iOS 11. That’s when things got interesting.
The Core ML API allowed developers to start taking advantage of the Neural Engine. Today, developers can use Core ML to analyze video or classify images and sounds. It can even analyze and classify objects, actions, and drawings.
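To make that concrete, here is a minimal sketch of how a third-party app might run an image classifier through Core ML and the Vision framework. The `MobileNetV2` model class here is an assumption for illustration (Xcode generates a class like it for any `.mlmodel` bundled with the app); Core ML itself decides at runtime whether to dispatch the work to the CPU, GPU, or Neural Engine.

```swift
import CoreML
import Vision

// Sketch: classify the contents of an image file with a bundled Core ML model.
// "MobileNetV2" is an example model class, not something the article names.
func classifyImage(at url: URL) throws {
    // Wrap the Core ML model so Vision can drive it.
    let coreMLModel = try MobileNetV2(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    // The request runs the model and hands back ranked classification labels.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        for observation in results.prefix(3) {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }

    // Vision handles scaling and pixel-format conversion for the input image.
    let handler = VNImageRequestHandler(url: url)
    try handler.perform([request])
}
```

Note that the app never addresses the Neural Engine directly; it only talks to Core ML, which schedules the model on the best available compute unit.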
https://www.macobserver.com/tips/deep-dive/what-is-apple-neural-engine/