this post was submitted on 01 Sep 2023
116 points (97.5% liked)

Apple

[email protected] 70 points 1 year ago

I think Apple’s ultimate decision on this is the correct one. The world is an ugly place, and there’s no silver bullet that solves a problem like CSAM while guaranteeing the mechanism can’t be abused.

I wish it weren’t, since this likely would have made a huge impact against child abusers. Thankfully, though, degrading every Apple user’s privacy isn’t the only effective way to fight it.

Nogami 25 points 1 year ago

Exactly. As serious a problem as CSAM is, building in on-device scanning is precisely the capability untrustworthy governments would sell their souls for.

The potential to scan everyone’s devices for any content a government deems problematic could shift the balance of power in the world permanently. You can see why they want it so much.

FMT99 12 points 1 year ago

Not just governments either.