this post was submitted on 27 Feb 2025
991 points (96.8% liked)

Technology


Per one tech forum this week: “Google has quietly installed an app on all Android devices called ‘Android System SafetyCore’. It claims to be a ‘security’ application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more making this application ‘spyware’ and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to 'Settings’ > 'Apps’, then delete the application.”

(page 3) 50 comments
[–] Armand1 65 points 1 day ago* (last edited 1 day ago) (22 children)

For people who have not read the article:

Forbes states that there is no indication that this app can or will "phone home".

Its stated use is to let other apps scan images they already have access to and find out what kind of thing each one is (known as "classification"). For example, to find out whether a picture you've been sent is a dick pic so the app can blur it.

My understanding is that, if this is implemented correctly (a big 'if'), this can be completely safe.

Apps requesting classification could be limited to classifying only files they already have access to. Remember that Android nowadays has a concept of "scoped storage" that lets you restrict folder access. If that's the case, it's no less safe than not having SafetyCore at all. It just saves you space, since companies like Signal, WhatsApp etc. no longer need to train and ship their own machine learning models inside their apps; classification becomes a common library / API any app can use.

It could, of course, if implemented incorrectly, allow apps to snoop without asking for file access. I don't know enough to say.
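The "classify only what the app can already read" model described above can be sketched as a toy permission check. Everything here is hypothetical illustration, not the real SafetyCore API:

```python
# Toy sketch of a permission-gated, on-device classification service.
# None of these names are the real SafetyCore API; the point is only that
# a shared classifier never grants a calling app new file access.

class ClassificationService:
    """Shared on-device classifier: apps pass in content they can already read."""

    def classify(self, app_readable_files: dict, path: str) -> str:
        # The service only sees content the calling app could open itself;
        # it never walks the filesystem on the app's behalf.
        if path not in app_readable_files:
            raise PermissionError(f"app has no access to {path}")
        data = app_readable_files[path]
        # Stand-in for a real ML model: trivially label the content.
        return "image" if data.startswith(b"\x89PNG") else "unknown"

# An app whose scoped storage grants it one folder:
app_files = {"/sdcard/Pictures/cat.png": b"\x89PNG...fake..."}
svc = ClassificationService()
print(svc.classify(app_files, "/sdcard/Pictures/cat.png"))  # prints "image"
```

The safety property in this sketch is that `classify` takes the app's own readable files as input, so the shared service cannot be used to snoop on anything the app couldn't already open.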

Besides, you think that Google isn't already scanning for things like CSAM? It's been confirmed to be done on platforms like Google Photos well before SafetyCore was introduced, though I've not seen anything about it being done on devices yet (correct me if I'm wrong).

[–] ZILtoid1991 11 points 1 day ago (1 children)

Issue is, a certain cult (christian dominionists), with the help of many billionaires (including Muskrat), has installed a fucking dictator in the USA and is acting on its vow to "save every soul on Earth from hell". If porn gets banned, this will phone not only home, but directly to the FBI's new "moral police" unit.

[–] [email protected] 16 points 1 day ago

Doing the scanning on-device doesn't mean the findings can't be reported onward. I don't want others going through my private stuff without asking - not even machine learning.

[–] SavageCoconut 118 points 1 day ago (1 children)

Google says that SafetyCore “provides on-device infrastructure for securely and privately performing classification to help users detect unwanted content. Users control SafetyCore, and SafetyCore only classifies specific content when an app requests it through an optionally enabled feature.”

GrapheneOS — an Android security developer — provides some comfort, noting that SafetyCore “doesn’t provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.”

But GrapheneOS also points out that “it’s unfortunate that it’s not open source and released as part of the Android Open Source Project and the models also aren’t open let alone open source… We’d have no problem with having local neural network features for users, but they’d have to be open source.” Which gets to transparency again.

[–] FauxLiving 10 points 1 day ago

Graphene could easily allow open source solutions to emulate the SafetyCore interface, much like it already does for Google's location services.

There are plenty of open source libraries and models for running local AI; this seems like something the FOSS world could easily replicate.

[–] [email protected] 13 points 1 day ago (3 children)

Is there any indication that Apple is truly more secure and privacy conscious than Android? I'm kinda tired of Google and their oversteps.

[–] [email protected] 3 points 21 hours ago

And interestingly enough, my phone crapped out on this post. But at least I was still able to read the post.

[–] chilicheeselies 4 points 23 hours ago

Jesus thanks for posting this. Found it on my LG ThinQ.

Anyone have a fairphone? Thoughts about it?

[–] [email protected] 10 points 1 day ago

For those having issues on Samsung devices: see here if you're getting the "App not installed as package conflicts with an existing package" error:

If you have a Samsung device, also uninstall the app from the Knox Secure Folder: go to Secure Folder > Settings > Apps.
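If the Settings route fails, the package can also be removed over adb from a computer with USB debugging enabled. The package name below is the one commonly reported for SafetyCore; verify it on your own device first:

```shell
# List any matching package to confirm the name before removing it.
adb shell pm list packages safetycore

# Remove SafetyCore for the current user (does not require root;
# package name as commonly reported, verify with the command above).
adb shell pm uninstall --user 0 com.google.android.safetycore
```

Note that `pm uninstall --user 0` only removes the app for the current user profile; a later system update may reinstall it.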
