this post was submitted on 02 Sep 2023
31 points (63.2% liked)
Apple
you are viewing a single comment's thread
All big tech analyzes our data. I'd rather they not analyze anything, but since we'll never see that day, they can at least use their privacy-invading power for good.
The thing is that while many companies have access to your data across various services, Apple has designed their systems such that they can't access most user data. It can't be both ways: your data is either private or it isn't, and many would prefer it stay private.
As I understand it, the actual situation with iCloud and CSAM scanning is that Apple does scan iCloud photos (the ones that users choose to upload to iCloud) when they can. A few years ago they tried to design a privacy-focused version of that scanning that would allow them to detect that kind of content for the purposes of reporting it, while preserving the user's privacy. It was supposed to happen on device (while most companies only scan photos on their servers) before the photos were uploaded, and use hashes to compare user photos to known CSAM material.

This seemed an odd thing at the time, but a while after that Apple released end-to-end encryption for iCloud Photos, which means they can't scan the uploaded photos anymore because they no longer have that access. Some theorize that the big tech companies have regular contact with various government/law enforcement/etc. agencies, and that the on-device scanning was negotiated as a response to Apple's plans to add E2E encryption to iCloud Photos, among other previously less secure services.
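To make the hash-matching idea concrete, here is a deliberately simplified sketch. Apple's actual proposal used a perceptual hash (NeuralHash) combined with private set intersection and a threshold scheme; the snippet below is a hypothetical stand-in that uses an ordinary cryptographic hash against a set of known digests, which only catches byte-identical files.

```python
import hashlib

# Hypothetical database of known digests (placeholder values only).
# Apple's real design used perceptual hashes, which also match
# visually similar images, not just byte-identical ones.
KNOWN_HASHES = {
    hashlib.sha256(b"example known image bytes").hexdigest(),
}

def flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches a known digest (simplified sketch)."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(flag_before_upload(b"example known image bytes"))  # True
print(flag_before_upload(b"some ordinary photo bytes"))  # False
```

The key property being debated in this thread is *where* this check runs: on Apple's servers after upload, or on your own device before upload.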
Some nits: Apple could access many classes of data stored in iCloud by default (including any photos), even now, but you can make almost every class end-to-end encrypted if you explicitly choose to. Previously, and by default now, it's Apple's policy and internal controls over the keys your data is encrypted with that protect that data, not the encryption itself (though you can opt in to having the encryption itself protect you from Apple). From what I understand, Apple is only known to regularly scan iCloud mailboxes, with the on-device scanning having never been implemented. Outside of nits: considering the delay between the proposed scanning and the offering of a wider E2EE program for iCloud, I doubt the two are actually related myself.
There's also no way to validate that Apple's E2EE operates as stated. They could have added a backdoor for themselves or "intelligence" agencies, and we have no way of knowing other than "trust us". Even if the source code is ever leaked (or a backdoor exploited by hackers), it could be written with plausible deniability — in such a way that it could be interpreted as unintentional (a bug/error).
This is why you should never trust closed source code with your sensitive data, and encrypt it yourself using open source, widespread/trusted, audited tools before uploading it to someone else's computer.
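As a sketch of that approach: encrypt locally with an audited open-source library before anything leaves your machine, and keep the key to yourself. The example below uses the widely used Python `cryptography` package's Fernet recipe purely as an illustration; any comparable audited tool (gpg, age, etc.) works the same way in principle.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate a key and store it locally; never upload it with the data.
key = Fernet.generate_key()
f = Fernet(key)

plaintext = b"sensitive document contents"
ciphertext = f.encrypt(plaintext)  # this is what goes to the cloud

# The cloud provider sees only ciphertext; only the key holder
# can recover the original.
assert f.decrypt(ciphertext) == plaintext
```

With this setup, even a backdoored server-side "E2EE" implementation only ever sees data you already encrypted yourself.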
This is exactly what Apple wanted to do, and lots of people (myself included) were against it because it would involve Apple scanning data on your phone. Sure, it only happened at the point of deciding to upload photos to the cloud, but it was still unacceptable to scan our phones for data that hadn't been uploaded yet.
Something that Apple themselves has done.