this post was submitted on 28 Aug 2023
711 points (99.4% liked)


Sorry for the short post, I'm not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.

It will not be possible to upload any new avatars or banners while this limit is in effect.

I'm really sorry for the disruption, it's a necessary trade-off for now until we figure out the way forward.

[–] [email protected] 7 points 10 months ago (1 children)

I kinda wonder though, how would you go about making a law against CP that doesn't hurt small sites like lemm.ee?

[–] PM_Your_Nudes_Please 22 points 10 months ago* (last edited 10 months ago) (2 children)

The issue is that you really can’t. The laws are written specifically to prevent plausible deniability, because otherwise pedos would be able to go “lol, a troll sent it to me” and create some doubt in a jury’s mind. Remember that (at least in America) the threshold for conviction is supposed to be “beyond a reasonable doubt.” So if laws were focused on intent, all the pedos would need to do is create reasonable doubt by arguing that they never intended to view/own the CSAM.

This was particularly popular in the Napster/Limewire days, when trolls would upload CSAM under innocuous titles, so people looking for the newest episode of their favorite show would find CSAM instead. You could literally find CSAM titled things like “Friends S10E9” because trolls were going for the shock factor of an innocent person opening a video only for it to end up being hardcore CSAM. Lots of actual pedos tried using the “I downloaded it by accident” defense.

So instead, the laws are written to close that loophole. It doesn’t matter why you have the CSAM. All that matters is you have it. The feds/courts won’t give a fuck if it was due to you seeking it out or if it was due to a bad actor sending it to you.

[–] [email protected] 6 points 10 months ago (2 children)

How is that not extremely problematic? What stops someone from using Tor and a bunch of dummy accounts to send CSAM to someone else and get them arrested?

[–] PM_Your_Nudes_Please 6 points 10 months ago

And that’s pretty much where we are now. Bad actors creating bot accounts on multiple instances, to spam the larger (most popular) instances with CSAM.

[–] [email protected] 4 points 10 months ago* (last edited 10 months ago) (1 children)

I think they have oversimplified the situation to the point that it is wrong.

  1. Arguably, Lemmy instance providers (depending on where they live) are protected in the same way Facebook or other content hosts are. So long as you are acting in good faith, you are protected against liability for illegal content your users upload. This does mean you need to remove illegal content as you become aware of it - you can't just ignore what your users are doing.

  2. There have been cases where, although a user technically 'possessed' CSAM, it was shown that they did so unknowingly via thumbnails or caching. The police do investigate where it came from. It's not as simple as sending it to someone and having them convicted.

[–] [email protected] 2 points 10 months ago (1 children)

Oh okay, that's good. So if you could show that you were trying to block it, you'd be safe.

[–] [email protected] 1 points 10 months ago

Yes, you'd just need to show that you actively moderate/apply content policies.

This will vary by jurisdiction, but I believe most of the West has laws along these lines.

[–] [email protected] 3 points 10 months ago

Lemmy instances are likely already legally protected in many countries, so long as they act in good faith, i.e. actively moderate.