this post was submitted on 28 Jun 2023
145 points (99.3% liked)

Apple

[–] [email protected] 6 points 1 year ago (1 children)

To me, this seems like such a transparent attempt to force tech companies to build in a backdoor. If they can scan for CSAM, they can scan for (or copy) anything else the government wants.

[–] [email protected] 2 points 1 year ago

That's very likely the actual goal. Stopping child abuse is only an excuse, one that governments keep pulling out whenever they want to push anti-privacy legislation. And it's clear this would do nothing to stop abuse either, because abusers would simply avoid the compromised services run by big companies.