[–] Lutra 1 points 6 days ago

I just read up on this, and I didn't realize it's not so much about stopping new images as it is about restitution for continued damages.

The plaintiffs are "victims of the Misty Series and Jessica of the Jessica Series" (be careful with your googling): https://www.casemine.com/judgement/us/5914e81dadd7b0493491c7d7

Correct me if I'm wrong, but the plaintiffs' logic is: "The existence of these files is damaging to us. Anyone ever found in possession of one of these files is required by law to pay damages. Any company that stores files for others must search every file for one of these 100 files, and report that file's owner to the court."

I thought it was more about protecting the innocent, present and future, but it seems to be more about compensating those already hurt.

Am I missing something?

[–] [email protected] 111 points 1 week ago (3 children)

They'd get sued whether they do it or not, really. If they don't, they get sued by those who want privacy-invasive scanning. If they do, they're gonna get sued when they inevitably land someone in hot water because they took pictures of their naked child for the doctor.

Protecting children is important, but it can't come at the cost of violating everyone's privacy and making you guilty until proven innocent.

Meanwhile, children just keep getting shot at school and nobody wants to do anything about it, but oh no, we can't do anything about that because muh gun rights.

[–] [email protected] 14 points 1 week ago

Makes me wonder if the lawsuit is legit or if it's some But think of the children™ institution using some rando as cover.

because muh gun rights.

I think it's a bit more complicated. These are worth a watch at least once:
Let's talk about guns, gun control, school shooting, and "law abiding gun owners" (Part 1)
Let's talk about guns, gun control, school shooting, and "law abiding gun owners" (Part 2)
Let's talk about guns, gun control, school shooting, and "law abiding gun owners" (Part 3)

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago)

If people really care about protecting the children, we can always raise taxes on the wealthy/cut military spending to fund new task forces to combat the production and spread of child pornography!

Heck, the money spent on this lawsuit could be spent catching people producing CSAM instead.

[–] [email protected] 108 points 1 week ago* (last edited 1 week ago) (2 children)

First: I'm not in any way intending to cast any negative light on the horrible shit the people suing went through.

But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.

If you really were serious about suing to force change, you've literally got:

  1. X, who has reinstated the accounts of people posting CSAM
  2. Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely
  3. Instagram/Facebook, which have much the same problem as X with slow or limited action on reported content

Apple, at least, will take immediate action if you report a user to them. So, uh, maybe they should reconsider their target if their intent really is to remove content, and spend some time on all the other giant corpos that are either actively doing the wrong thing, doing nothing, or sitting there going 'well, akshually' at reports.

[–] [email protected] 44 points 1 week ago (2 children)

Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely

I used to share an office with YouTube's content review team at a previous job and have chatted with a bunch of them, so I can give a little insight on this side. For what it's worth, YT does take action on CSAM and other abusive materials. The problem is that it's just a numbers game. Those types of reports are human-reviewed. And for obvious reasons, it's not exactly easy to keep a department like that staffed (turns out you really can't pay people enough to watch child abuse for 8 hours a day), so the content quickly outnumbers the reviewers. Different types of infractions will have different priority levels, and there's pretty much always a consistent backlog of content to review.

While this article talks about Facebook, specifically, it's very similar to what I saw with YouTube's team, as well: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

[–] [email protected] 11 points 1 week ago (1 children)

you really can’t pay people enough to watch child abuse

I wonder what the package was, besides the salary. And the hiring requirements.

[–] [email protected] 15 points 1 week ago

I don't know all the details, but I know they had basically unlimited break time, as well as free therapy/counseling. The pay was also pretty decent, especially for a job that didn't require physical labor or a specialized background.

They did have a pretty strict vetting process, because it was apparently not uncommon at all for people to apply to the job because they were either eager to see abusive content directly, or had an agenda and might try to improperly influence what content gets seen. Apparently they did social media deep dives that you had to consent to in order to apply.

[–] [email protected] 0 points 1 week ago

For YouTube I was very much talking specifically about how long it took them to act, and how little action they took, on the kids-doing-gymnastics videos, even when it became abundantly clear that the target market was pedophiles and that the parents who kept posting these videos were, at the very least, complicit if not explicitly pimping their children out.

(If you have not seen and/or read up on this, save yourself the misery and skip it: it's gross.)

It took them a VERY long time to take any meaningful action, even after it was clear that neither the intent nor the audience it was being shown to was people interested in gymnastics, and it stayed up for literal years.

Like, I have done anti-CSAM work and have lots and lots of sympathy for it because it's fucking awful, but if you've got videos of children - clothed or not - and the comment section is entirely creeps and perverts and you just kinda do nothing, I have shockingly limited sympathy.

Seriously - the comment section should have been used by the FBI to launch raids, because I 100% guarantee you every single person involved has piles and piles of CSAM sitting around, and they were just ignored because the videos weren't explicit CSAM.

Just... gross, and poorly handled.

[–] [email protected] 2 points 1 week ago* (last edited 1 week ago)

But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.

Yep. All the money being wasted on this lawsuit could be spent catching actual producers and distributors of child porn.

Always follow the money. It shows what people's true intentions are.

[–] [email protected] 39 points 1 week ago

I thought the way they intended to handle it was pretty reasonable, but the idea that there is an actual obligation to scan content is disgusting.

[–] paraphrand 36 points 1 week ago* (last edited 1 week ago) (1 children)

“People like to joke about how we don’t listen to users/feedback. About how we just assert our vision and do things how we wish. Like our mouse. It drives people absolutely bonkers! But this time we listened to the pushback. And now they sue us?”

[–] [email protected] 34 points 1 week ago (1 children)

I'd posit that the people who don't want their files scanned, and the people suing Apple are not the same people.

[–] chemical_cutthroat 5 points 1 week ago (1 children)

If I've learned one thing from my time on earth, it's that all humans are the same, and all of the opinions of one are shared by the majority.

[–] [email protected] 12 points 1 week ago (1 children)

I have the exact opposite experience.

[–] chemical_cutthroat -1 points 1 week ago* (last edited 1 week ago) (1 children)

The exception that proves the rule.

I won't be answering any further questions.

[–] [email protected] 7 points 1 week ago

All people are stubborn as fuck though.

[–] lurklurk 23 points 1 week ago (1 children)

Is iCloud a file sharing service or social network in some way? If it isn't, comparing them with such services makes no sense

[–] [email protected] 4 points 1 week ago

file sharing service

Yes

[–] [email protected] 10 points 1 week ago

Children should be made illegal, this is a self resolving problem.

[–] [email protected] 2 points 1 week ago

Is this a free system, by the way?

Is Apple essentially getting sued for not giving another company money?

[–] lepinkainen -2 points 1 week ago* (last edited 1 week ago) (3 children)

The irony is that the Apple CSAM detection system was as good as we could make it at the time, with multiple steps to protect people from accidental positives.

But, as usual, I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.

[–] lurklurk 28 points 1 week ago (1 children)

You should have, though. This type of scanning is the thin end of the wedge toward complete surveillance. If it's added, next year it's extended to cover terrorism. Then to look for missing people. Then "illegal content" in general.

The reason most people seem to disagree with you in this case is that you're wrong.

[–] lepinkainen -2 points 1 week ago (1 children)

We could've burned that bridge when we got to it. If Apple had been allowed to implement on-device scanning, they could've done proper E2E "we don't have the keys, officer, we can't unlock it" encryption for iCloud.

Instead, what we have now is what EVERY SINGLE other cloud provider does: they scan your shit in the cloud all the time, unless you specifically upload only locally-encrypted content, which 99.9999% of people will never be bothered to do.

[–] AlphaAutist 3 points 1 week ago (1 children)
[–] lepinkainen 0 points 1 week ago

It does now, it didn’t at the time

[–] [email protected] 4 points 1 week ago (2 children)

I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.

Did you also read the difference in how Apple was trying to go about it and how literally everyone else was going about it?

Apple wanted to scan your files on your device, which is a huge privacy issue and a huge slippery slope (and a backdoor built in).

The entire industry scans files once they are off your private device and on the companies' own servers. So your privacy is protected here, and no backdoor is built in.

Apple just had a fit and declared that if they can't backdoor and scan your files on your own device, then they won't try anything, not even the basics. They could just follow everyone else's lead and scan iCloud files, but they refuse to do that. That was the difference.

[–] meejle 2 points 1 week ago

I'm amazed it's taken so long... I think I'm on my third Android phone since they first announced it and I said "fuck no".

[–] lepinkainen 2 points 1 week ago* (last edited 1 week ago) (2 children)

There was no "huge privacy issue".

First of all: you could turn off the local scanning by turning off iCloud sync - and with sync on, the images would've been sent to the cloud for scanning anyway. That's it, nothing else; nobody at Apple would've touched a single super-private file on your device.

The local scanning required MULTIPLE matches (where n > 3; they didn't say the exact number for obvious reasons) against known, human-verified CSAM. That database is the one that would've been loaded from iCloud if you had it turned on, and it's the exact same database all cloud providers use for legal reasons. Some have other algorithms on top - at least Microsoft had an is_penis algorithm that shut down a German dude's whole Live account because his kid's pics were on OneDrive.

After the MULTIPLE matches (you can't get flagged by "accidentally" having one on your phone, nor would pics of your kids in the pool trigger anything), a human checker would have had enough data to decrypt just those images and see a "reduced resolution facsimile" (can't remember the exact term) of the offending photos. This is where all the brainpower spent creating false matches would've ended up: you would've had to create multiple matches of known CP images that look enough like actual CP for the human to make an erroneous call, multiple times, to trigger anything.

If after that the human decided that yep, that's some fucked up shit, the authorities would've been contacted.
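
To make the flow above concrete, here's a minimal sketch of the threshold-before-human-review idea in Python. Everything in it is a placeholder for illustration - the hash function, the threshold value, and the names KNOWN_HASHES / needs_human_review are assumptions, not Apple's actual NeuralHash or private-set-intersection protocol:

```python
import hashlib

# Hypothetical placeholders -- not Apple's real implementation or parameters.
MATCH_THRESHOLD = 30                  # Apple never published the real n, only that it's "multiple"
KNOWN_HASHES: set[str] = set()        # hashes of known, human-verified CSAM

def perceptual_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real one maps visually similar images to
    the same value; a cryptographic hash does not - it just keeps this sketch runnable."""
    return hashlib.sha256(image_bytes).hexdigest()

def needs_human_review(images: list[bytes]) -> bool:
    """True only once enough independent matches against the database accumulate."""
    matches = sum(1 for img in images if perceptual_hash(img) in KNOWN_HASHES)
    # A single accidental or adversarially crafted match never escalates anything;
    # only crossing the threshold exposes reduced-resolution copies of the flagged
    # images to a human reviewer, who then decides whether to contact authorities.
    return matches >= MATCH_THRESHOLD
```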

Yes, a Bad Government could've forced Apple to add other stuff to the database. (They can do that right now for ALL major cloud storage providers, BTW.) But do you really think people wouldn't have been watching for changes in the cloud-downloaded database and noticed any suspicious stuff immediately?

Also, according to the paper, the probability of a false match was 1 in 1 trillion accounts - and this wasn't disputed even by the most hardcore activists, btw.
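
For a rough sense of how a threshold can produce numbers that small (a generic back-of-the-envelope sketch, not Apple's actual analysis or parameters): if each image false-matches independently with probability $p$ and an account holds $m$ images, the chance of ever reaching a threshold of $t$ matches is the binomial tail

$$P(X \ge t) = \sum_{k=t}^{m} \binom{m}{k} p^{k} (1-p)^{m-k},$$

which collapses extremely fast as $t$ grows, so even a noticeable per-image error rate can translate into a vanishingly small per-account rate.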

tl;dr If you already upload your stuff to the cloud (like iOS does automatically) the only thing that would've changed is that nobody would've had a legit reason to peep at your photos in the cloud "for the children". But if you've got cloud upload off anyway, nothing would've changed. So I still don't understand the fervour people had over this - the only reason I can think of is not understanding how it worked.

[–] [email protected] 6 points 1 week ago (1 children)

So I still don't understand the fervour people had over this - the only reason I can think of is not understanding how it worked.

Or that it was a built-in backdoor running on your device.

The difference is that what happens on your own device should be in your control. Once it leaves your device, it's not in your control. That's where the entire issue was. It doesn't matter if I toggle a switch to allow upload or not; the fact that it was happening on my device was the issue.

[–] lepinkainen 4 points 1 week ago (1 children)

It's not a very good back door if you have an explicit, easy-to-use switch to turn it off.

And even without this feature on your device, they don't need to use a "back door". They'll just go through your front door that's wide open and can't be closed because of "the children"

If you want to "own" your phone, there are other manufacturers than Apple that allow you to lock it down like Fort Knox (or whatever you deem secure)

[–] [email protected] 3 points 1 week ago

Which one is it? This isn't Schrodinger's iPhone.

[–] [email protected] 0 points 1 week ago (1 children)

You don't understand, or you refuse to acknowledge, that this is a back door into your device and that Apple is actively scanning your files, meaning your device is now compromised.

Or are you shilling for anti-privacy?

My device, my files. I don't want your scanning.

What's so hard to grok about that unless you are anti-privacy?

[–] lepinkainen 1 points 1 week ago (2 children)

The files WILL be scanned the second they leave your device to any major cloud.

If they don't leave your device, then turning off iCloud (and thus the "back door") wouldn't have had any impact on you.

[–] Lutra 2 points 6 days ago

Just clearing up the argument.

  1. The files will be scanned
  2. They've been doing it for decades

There's a difference here in principle, exemplified by the answer to this question: "Do you expect that things you store somewhere are kept private?" Where "private" means: "No one looks at your things." And where "no one" means: not a single person or machine.

This is the core argument. In the world, things stored somewhere are often still considered private (a safe deposit box, say). People take this expectation into the cloud. Apple, Google, Microsoft, Box, Dropbox, etc. only made their scanning known publicly _after_ they were called out. They allowed their customers to _assume_ their files were private.

Second issue: does a machine merely looking at your files count as unprivate? And what if we Pinky Promise to make the machine not really, really look at your files, only sort of squinty-eyed? For many, yes, this also counts as unprivate. It's the process that is problematic. There is a difference between living in a free society and living in one where citizens have to produce papers when asked. A substantial difference. Having files unexamined and having them examined by an "innocuous" machine are substantially different. The difference _is_ privacy. In one, you have a right to privacy. In the other, you don't.


an aside...

In our small village, a team sweeps every house during the day while people are out at work. In the afternoon you are informed that the team found illegal paraphernalia in your house. You know you had none. What defense do you have?

[–] [email protected] 3 points 1 week ago

The files WILL be scanned the second they leave your device to any major cloud.

There are services with E2E encryption, and for those that don't have it, you can encrypt files yourself before uploading.
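
For anyone who wants that do-it-yourself route, here's a minimal client-side encryption sketch in Python using the cryptography package's Fernet. The file names are made up, and key management (keeping the key off the cloud) is left to you:

```python
from cryptography.fernet import Fernet

# Generate once and store somewhere the cloud provider never sees.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt locally before the file ever touches the upload client.
with open("photo.jpg", "rb") as f:        # placeholder file name
    ciphertext = fernet.encrypt(f.read())

with open("photo.jpg.enc", "wb") as f:    # upload this blob, not the original
    f.write(ciphertext)

# Later, on any device that holds the key:
# original = Fernet(key).decrypt(ciphertext)
```

The provider only ever sees ciphertext, so server-side scanning (CSAM or otherwise) has nothing to look at.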

Realistically speaking, if this were implemented, anybody with CSAM would just not use iPhones, and all the scanning would be done on everyone else.

Then, once it's implemented and with less fanfare, some authoritarian regimes (won't say which, to not upset the tankies) could ask Apple to scan for other material too... And as it's closed source, we wouldn't even know that the models are different by country.

[–] [email protected] 2 points 1 week ago (1 children)

😆 Yeah, especially after I learned that most cloud services (Amazon, Google, Dropbox) were already doing CSAM scans on their servers 🤭

[–] lepinkainen 3 points 1 week ago (1 children)

Yep, it's a legal "think of the children" requirement. They've been doing CSAM scanning for decades already and nobody cared.

When Apple did a system that required MULTIPLE HUMAN-VERIFIED matches of actual CP before even a hint would be sent to the authorities, it was somehow the slippery slope to a surveillance state.

The stupidest ones were the ones who went "a-ha! I can create a false match with this utter gibberish image!" Yes, you can do that. Now you've inconvenienced a human checker for 3 seconds, after the threshold of locally matching images has been reached. Nobody would EVER have gotten swatted by your false matches.

Can people say the same for Google stuff? People get accounts taken down by "AI" or "Machine learning" crap with zero recourse, and that's not a surveillance state?

[–] [email protected] 3 points 1 week ago

😅why do we get downvoted?

I guess somebody doesn’t like reality 💁🏻