this post was submitted on 27 Feb 2025
1007 points (96.7% liked)

Per one tech forum this week: “Google has quietly installed an app on all Android devices called ‘Android System SafetyCore’. It claims to be a ‘security’ application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more, making this application ‘spyware’ and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to ‘Settings’ > ‘Apps’, then delete the application.”
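If the app doesn't show up under Settings > Apps, it can usually still be removed for the current user over ADB. A minimal sketch in Python, assuming the package name is com.google.android.safetycore (not confirmed by the quoted post, so verify it first) and that adb is installed with USB debugging enabled on the phone:

```python
# Minimal sketch of removing the app over ADB if it doesn't appear under
# Settings > Apps. The package name is an assumption -- verify it first with
# `adb shell pm list packages safetycore`.

import subprocess

PACKAGE = "com.google.android.safetycore"  # assumed package name, verify on your device


def uninstall_for_current_user(package: str = PACKAGE) -> None:
    # "pm uninstall --user 0" removes the app for the primary user without
    # root; a later Play/system update may silently reinstall it.
    subprocess.run(["adb", "shell", "pm", "uninstall", "--user", "0", package], check=True)


if __name__ == "__main__":
    uninstall_for_current_user()
```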

[–] [email protected] 15 points 1 day ago* (last edited 1 day ago) (2 children)

Apple had it report suspected matches, rather than warning locally

It got canceled because the fuzzy hashing algorithms turned out to be so insecure as to be unfixable (it was easy to plant false positives)
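For context, here is a minimal sketch of what a fuzzy (perceptual) image hash looks like, using a simple average-hash rather than Apple's actual NeuralHash; the file names and the 5-bit match threshold are made up for the example. The point is that enormous numbers of distinct images collapse onto each short hash value, which is what makes planted false positives feasible.

```python
# A minimal "fuzzy" perceptual hash in the aHash family -- NOT Apple's
# NeuralHash, just an illustration of how near-duplicate matching works.
# Requires Pillow (pip install Pillow). File names are placeholders.

from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale and threshold each pixel on the mean."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # Two images "match" if their 64-bit hashes differ in only a few bits.
    # Because vastly many distinct images collapse onto each 64-bit value,
    # an attacker who knows a target hash can craft an unrelated image that
    # collides with it -- the "planted false positive" problem.
    h1, h2 = average_hash("original.jpg"), average_hash("slightly_edited.jpg")
    print("distance:", hamming(h1, h2), "match" if hamming(h1, h2) <= 5 else "no match")
```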

[–] lepinkainen 0 points 1 day ago (1 children)

They were not “suspected”; they had to be matches to actual CSAM.

And after that, a reduced-quality copy was shown to an actual human, not an AI as in Google's case.

So a false positive would slightly inconvenience a human checker for 15 seconds, not get you swatted or your account closed
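To make the flow concrete, here is a toy sketch of "flag exact matches against a known-hash list, escalate to a human only past a threshold". The names, data structures, and placeholder hash values are invented, not Apple's protocol, which wrapped the matching in private set intersection and threshold secret sharing; the threshold of roughly 30 matches is the figure Apple gave publicly.

```python
# Toy sketch of "flag exact matches against a known-hash list, escalate to a
# human reviewer only past a threshold". All names and the placeholder hash
# values are invented; Apple's real design wrapped this in private set
# intersection and threshold secret sharing, which this does not reproduce.

from dataclasses import dataclass, field

KNOWN_HASHES: set[int] = {0x9F3A5C11D0E47B22, 0x1122334455667788}  # placeholders
REVIEW_THRESHOLD = 30  # roughly the match count Apple said would trigger review


@dataclass
class Account:
    flagged: list[str] = field(default_factory=list)


def send_to_human_review(paths: list[str]) -> None:
    # Only low-resolution "visual derivatives" would be decrypted and shown
    # to a reviewer at this point -- no police report, no account closure,
    # unless the reviewer confirms the matches.
    print(f"escalating {len(paths)} low-res derivatives to a reviewer")


def check_photo(account: Account, photo_path: str, photo_hash: int) -> None:
    if photo_hash in KNOWN_HASHES:  # exact match against the known list
        account.flagged.append(photo_path)
        if len(account.flagged) >= REVIEW_THRESHOLD:
            send_to_human_review(account.flagged)
```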

[–] [email protected] 2 points 21 hours ago* (last edited 21 hours ago) (1 children)

Yeah, so here's the next problem - downscaling attacks exist against those algorithms too.

https://scaling-attacks.net/

Also, even if those attacks were prevented, they're still going to look through basically your whole album if you trigger the alert
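To show how that works, here is a toy version of the trick documented at the link above: a nearest-neighbour downscaler only samples a sparse grid of pixels, so an attacker can overwrite exactly those pixels with a second image. The file names, the 8x factor, and the simplistic every-factor-th-pixel scaler are assumptions for the example.

```python
# Toy version of an image-scaling attack: a nearest-neighbour downscaler
# only keeps a sparse grid of pixels, so an attacker can write a second
# image into exactly those pixels. File names, the 8x factor, and the
# simplistic every-factor-th-pixel scaler are assumptions for the example.
# Requires Pillow and NumPy.

import numpy as np
from PIL import Image

FACTOR = 8  # downscale factor the attacker assumes the pipeline uses


def nearest_downscale(img: np.ndarray, factor: int = FACTOR) -> np.ndarray:
    """The 'victim' scaler this toy targets: keep every factor-th pixel."""
    return img[::factor, ::factor]


def embed(cover_path: str, hidden_path: str, out_path: str, factor: int = FACTOR) -> None:
    cover = np.array(Image.open(cover_path).convert("RGB"))
    h = (cover.shape[0] + factor - 1) // factor
    w = (cover.shape[1] + factor - 1) // factor
    hidden = np.array(Image.open(hidden_path).convert("RGB").resize((w, h)))

    attack = cover.copy()
    attack[::factor, ::factor] = hidden  # touch only the pixels the scaler samples
    Image.fromarray(attack).save(out_path)  # PNG keeps the pixels exact


if __name__ == "__main__":
    # Full-size "attack.png" still looks like the cover (only ~1/64 of its
    # pixels changed); downscaled, it becomes the hidden image.
    embed("innocent_cat.png", "hidden_payload.png", "attack.png")
    small = nearest_downscale(np.array(Image.open("attack.png").convert("RGB")))
    Image.fromarray(small).save("what_the_reviewer_sees.png")
```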

[–] lepinkainen 1 points 14 hours ago (1 children)

And you’ll again inconvenience a human slightly as they look at a pixelated copy of a picture of a cat or some noise.

No cops are called, no accounts closed

[–] [email protected] 1 points 12 hours ago

The scaling attack specifically can make a photo sent to you look innocent to you and malicious to the reviewer; see the link above

[–] [email protected] 0 points 1 day ago

The official reason they dropped it was security concerns. The more likely reason was the massive outcry that occurs whenever Apple does these questionable things. Crickets when it's Google.

The feature was re-added as a child-safety feature called "Communication Safety", which is optional on child accounts and automatically blocks nudity sent to children.