this post was submitted on 24 May 2024
369 points (81.8% liked)
Linux
you are viewing a single comment's thread
it's currently opt-in rather than opt-out, fully on-device, and won't work on devices with weak NPUs (or on any that lack one entirely)
unless that changes in the future, it's not that bad at the moment tbh
Of course it's on-device. Microsoft is doing all the processing on people's PCs, rather than their own servers, where they'd have to pay for that computation.
Data still gets reported to MS afterwards.
It must communicate with Microsoft in some way, if only because the "AI" must not "hallucinate" by suggesting the user jump off a bridge or put glue on their pizza...
What is the bridge jumping bit about?
"On-device" has to be a half-true at best. I'm having a hard time believing that the NPUs on these new ARM chips are powerful enough for it to be fully on-device. Even more-so with "approved" x86 chips. There has to be some data sharing between the client and server, similar to how Rabbit does their shit.
Look up TPUs, like the Coral Edge TPU. They're extremely efficient at machine learning (and nothing else), and cheap. If NPUs work anything like a TPU, then they absolutely can do local "AI." Once the heavy lifting is done, I'd imagine all that data is uploaded.
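For context, this is roughly what fully local inference on a Coral Edge TPU looks like with `tflite_runtime` — a minimal sketch, not anything to do with Recall's actual code, and `model_edgetpu.tflite` is just a placeholder for any Edge-TPU-compiled model:

```python
# Minimal sketch: running a quantized model entirely on-device via a Coral Edge TPU.
# Nothing leaves the machine unless you explicitly send the (small) result somewhere.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="model_edgetpu.tflite",                           # placeholder model file
    experimental_delegates=[load_delegate("libedgetpu.so.1")],   # offload ops to the TPU
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy input with the shape/dtype the model expects (e.g. a screenshot crop).
frame = np.zeros(inp["shape"], dtype=inp["dtype"])

interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()                                # inference runs locally on the TPU
scores = interpreter.get_tensor(out["index"])       # only this small output could ever be uploaded
print(scores.shape)
```

The point of the sketch is the split the comment describes: the heavy compute stays local, and only the compact results would be candidates for upload afterwards.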
...until a botched update or a bug sends everything to the cloud, MS makes an about-face saying "oops, my bad," and then says it was fixed.