this post was submitted on 29 Aug 2023
470 points (98.0% liked)

Meta (lemm.ee)


Hey folks!

I made a short post last night explaining why image uploads had been disabled. That was in the middle of the night for me, so I did not have time to go into much detail, but I'm now writing a more detailed post to clarify where we are and where we plan to go.

What's the problem?

As shared by the lemmy.world team, over the past few days, some people have been spamming one of their communities with CSAM images. Lemmy has been attacked in various ways before, but this is clearly on a whole new level of depravity, as it's first and foremost an attack on actual victims of child abuse, in addition to being an attack on the users and admins on Lemmy.

What's the solution?

I am putting together a plan, both for the short term and for the longer term, to combat and prevent such content from ever reaching lemm.ee servers.

For the immediate future, I am taking the following steps:

1) Image uploads are completely disabled for all users

This is a drastic measure, and I am aware that it's the opposite of what many of our users have been hoping for, but at the moment, we simply don't have the necessary tools to safely handle uploaded images.

2) All images which have federated in from other instances will be deleted from our servers, without any exception

At this point, we have millions of such images, and I am planning to just indiscriminately purge all of them. Posts from other instances will not be broken after the deletion; the deleted images will simply be loaded directly from their original instances.

3) I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

Lemmy has always loaded some images directly from other servers, while saving other images locally to serve directly. I am eliminating the second option for the time being, forcing all images uploaded on external instances to always be loaded from those servers. This will somewhat increase the number of servers which users will fetch images from when opening lemm.ee, which certainly has downsides, but I believe this is preferable to opening up our servers to potentially illegal content.
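As a rough illustration only (this is not the actual patch, and the names below are made up), the change effectively means always handing clients the original remote URL instead of a locally cached copy:

```python
from urllib.parse import urlparse

LOCAL_DOMAIN = "lemm.ee"       # hypothetical config value
CACHE_REMOTE_IMAGES = False    # the behaviour being disabled by the patch

def image_url_for_client(original_url: str, cached_local_path: str | None) -> str:
    """Pick which URL a client should load an image from.

    With remote caching disabled, anything hosted on another instance is
    always served straight from that instance, never from local storage.
    """
    if urlparse(original_url).netloc == LOCAL_DOMAIN:
        return original_url  # our own uploads are still served locally
    if CACHE_REMOTE_IMAGES and cached_local_path:
        # Old behaviour: serve the copy cached in local storage.
        return f"https://{LOCAL_DOMAIN}{cached_local_path}"
    # Current behaviour: let clients load the image from the origin instance.
    return original_url
```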

For the longer term, I have some further ideas:

4) Invite-based registrations

I believe that one of the best ways to effectively combat spam and malicious users is to implement an invite system on Lemmy. I have wanted to work on such a system ever since I first set up this instance, but real life and other things have been getting in the way, so I haven't had a chance. However, with the current situation, I believe this feature is more important than ever, and I'm very hopeful I will be able to make time to work on it very soon.

My idea would be to grant our users a few invites, which would replenish every month if used. An invite will be required to sign up on lemm.ee after that point. The system will keep track of the invite hierarchy, and in extreme cases (such as spambot sign-ups), inviters may be held responsible for rule-breaking users they have invited.
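To make the idea concrete, here is a minimal sketch of how invite tracking could work; the schema, monthly allowance, and function names are purely illustrative, not a committed design:

```python
import secrets
import sqlite3
from datetime import datetime, timezone

# Hypothetical schema for tracking invites and the inviter hierarchy.
SCHEMA = """
CREATE TABLE IF NOT EXISTS invites (
    code        TEXT PRIMARY KEY,
    inviter_id  INTEGER NOT NULL,
    invitee_id  INTEGER,          -- filled in once the invite is redeemed
    created_at  TEXT NOT NULL,
    redeemed_at TEXT
);
"""

MONTHLY_INVITES = 3  # assumed allowance, replenished each month

def invites_left(db: sqlite3.Connection, user_id: int) -> int:
    """How many invites a user may still create this month."""
    month_start = datetime.now(timezone.utc).strftime("%Y-%m-01")
    used = db.execute(
        "SELECT COUNT(*) FROM invites WHERE inviter_id = ? AND created_at >= ?",
        (user_id, month_start),
    ).fetchone()[0]
    return max(0, MONTHLY_INVITES - used)

def create_invite(db: sqlite3.Connection, inviter_id: int) -> str | None:
    """Issue a new invite code if the user has any left this month."""
    if invites_left(db, inviter_id) == 0:
        return None
    code = secrets.token_urlsafe(16)
    db.execute(
        "INSERT INTO invites (code, inviter_id, created_at) VALUES (?, ?, ?)",
        (code, inviter_id, datetime.now(timezone.utc).isoformat()),
    )
    db.commit()
    return code

def inviter_chain(db: sqlite3.Connection, user_id: int) -> list[int]:
    """Walk up the invite hierarchy, e.g. to review who invited a spambot."""
    chain = []
    current = user_id
    while True:
        row = db.execute(
            "SELECT inviter_id FROM invites WHERE invitee_id = ?", (current,)
        ).fetchone()
        if row is None:
            break
        chain.append(row[0])
        current = row[0]
    return chain
```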

While this will certainly create a barrier to entry for signing up on lemm.ee, we are already one of the biggest instances, and I think at this point such a barrier will do more good than harm.

5) Account requirements for specific activities

This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (such as account age, number of comments, etc.). These activities might include things like image uploads, community creation, and perhaps even private messages.

This could, in theory, deter people from creating new accounts just to break rules (or laws).
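To illustrate the idea, a minimal sketch of such a requirements check might look like the following; the activities and thresholds are placeholder examples, not decided values:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Account:
    created_at: datetime  # timezone-aware creation timestamp
    comment_count: int

# Assumed example thresholds; the real values would be decided by the admins.
REQUIREMENTS = {
    "image_upload":       {"min_age": timedelta(days=14), "min_comments": 20},
    "community_creation": {"min_age": timedelta(days=30), "min_comments": 50},
    "private_message":    {"min_age": timedelta(days=2),  "min_comments": 0},
}

def may_perform(account: Account, activity: str) -> bool:
    """Check whether an account meets the requirements for an activity."""
    req = REQUIREMENTS[activity]
    age = datetime.now(timezone.utc) - account.created_at
    return age >= req["min_age"] and account.comment_count >= req["min_comments"]
```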

6) Automated ML-based NSFW scanning for all uploaded images

I think it makes sense to apply automatic scanning on all images before we save them on our servers, and if it's flagged as NSFW, then we don't accept the upload. While machine learning is not 100% accurate and will produce false positives, I believe this is a trade-off that we simply need to accept at this point. Not only will this help against any potential CSAM, it will also help us better enforce our "no pornography" rule.

This would potentially also allow us to resume caching images from other instances, which will improve both performance and privacy on lemm.ee.
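As an illustration of how such a pre-upload check could work, here is a minimal sketch using an off-the-shelf image classifier; the specific model and threshold below are just examples of publicly available options, not a final choice:

```python
from transformers import pipeline
from PIL import Image

# One publicly available NSFW classifier on the Hugging Face hub; any
# comparable model could be swapped in. The threshold is an assumed value.
classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")
NSFW_THRESHOLD = 0.5

def accept_upload(path: str) -> bool:
    """Return True if the image looks safe enough to store on the server."""
    results = classifier(Image.open(path))
    nsfw_score = next(
        (r["score"] for r in results if r["label"].lower() == "nsfw"), 0.0
    )
    return nsfw_score < NSFW_THRESHOLD
```

Anything rejected here would simply never reach local storage; whether to queue such uploads for manual review is a separate decision.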


With all of the above in place, I believe we will be able to re-enable image uploads with a much higher degree of safety. Of course, most of these ideas come with some significant downsides, but please keep in mind that users posting CSAM present an existential threat to Lemmy (in addition to just being absolutely morally disgusting and actively harmful to the victims of the abuse). If the choice is between having a Lemmy instance with some restrictions, or not having a Lemmy instance at all, then I think the restrictions are the better option.

I would also appreciate your patience in this matter, as all of the long-term plans require additional development, and while this is currently a high-priority issue for all Lemmy admins, we are all still volunteers and do not have the freedom to dedicate huge amounts of time to working on new features.


As always, your feedback and thoughts are appreciated, so please feel free to leave a comment if you disagree with any of the plans or if you have any suggestions on how to improve them.

(page 2) 50 comments
[–] [email protected] 7 points 1 year ago (5 children)

I understand that admins need to take whatever measures are needed to protect themselves from legal action.

At the same time, I hate to see the promised federated network revert to what commercial platforms have become: karma and account age requirements, phone and identity verification, forced 2FA, and whatnot.

Maybe Lemmy should implement a shared database whereby, if an admin of an instance marks a post as potentially illegal, the flag gets replicated to other instances automatically and the post gets queued for deletion.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

I think desperation of devs, admins and users is exactly the sentiment the trolls were trying to elicit. Lemmy is a young project, and this is one of many hurdles it'll need to overcome on its path.

I like the idea of removal flags propagating through the network, at least as an additional signal. Forcing removal everywhere based on a single removal on a single instance would probably be too jumpy (e.g. an SFW instance might prevent any instance from hosting NSFW content), but some configurable rules and thresholds, paired with removal reason context, might significantly automate the process.

The reason I especially like this suggestion is that smaller instances can benefit from any automation the larger ones can afford to build.
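As a sketch of what such threshold-based automation could look like (the reasons and thresholds below are made-up examples, not anything Lemmy implements today):

```python
from collections import defaultdict

# Assumed policy: how many independent instances must flag an image, per
# removal reason, before we hide it locally and queue it for admin review.
THRESHOLDS = {"csam": 1, "spam": 3, "nsfw": 5}
DEFAULT_THRESHOLD = 3

# image id -> set of (flagging instance, reason)
flags: dict[str, set[tuple[str, str]]] = defaultdict(set)

def record_flag(image_id: str, instance: str, reason: str) -> bool:
    """Record a federated removal flag; return True if the local threshold is met."""
    flags[image_id].add((instance, reason))
    count = sum(1 for _, r in flags[image_id] if r == reason)
    return count >= THRESHOLDS.get(reason, DEFAULT_THRESHOLD)
```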

[–] [email protected] 7 points 1 year ago

That sucks, but hopefully something good can come out of it eventually. Like better mod tools...

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

It is now impossible to add an avatar or banner to profiles because the only way to do so through the UI is uploading to the instance. There’s no way to add an external URL. Just wanted to point that out in case it wasn’t intentional. Very understandable if that’s something we have to sacrifice for the time being.

Edit: I noticed that images upload to the account's home instance instead of the community's home instance. One workaround for changing your lemm.ee community's icon and banner, for the time being, is to create an account on another instance and add that account as a moderator of your lemm.ee community. You can then use that external account to change the icon and banner, since the images will be uploaded to that account's instance instead of lemm.ee.

[–] [email protected] 4 points 1 year ago (3 children)

Not sure about links, but avatars and banners were intentional because those would be images, and all image uploads are suspended for now.

[–] [email protected] 7 points 1 year ago (1 children)

These are great ideas, especially the ability for users to invite others. I think it's also a good way to get new people into the fediverse, since being invited means they immediately know which instance to join.

Will you submit all these features to the official Lemmy backend too?

[–] [email protected] 7 points 1 year ago (1 children)

I've searched what CSAM means, and I'm flabbergasted! Don't the authorities catch these criminals?!

[–] [email protected] 6 points 1 year ago (1 children)

They sure try, which is why it is such a difficult issue. If an instance were to hold onto it for too long, it would likely get caught in their crosshairs and taken down.

[–] [email protected] 6 points 1 year ago (1 children)

Hey there! Why not talk with the main lemmy developer to try and integrate such a content blocker directly into the lemmy stack so that it’s easier to implement for smaller instances? Thanks for keeping this instance up and runnin’! Cheers!

[–] [email protected] 6 points 1 year ago (7 children)

A karma system is sounding pretty good right now... /me lifts shield and ducks

Even if it's just a limited tiered system with numbers to obsess over:

Level 1 - browsing rights. Graduate to level 2 after 5 days and more than 30 minutes of logged-in activity.

Level 2 - commenting rights, limited to 10 comments daily for 5 days. Graduate after at least 3 comments, a total upvote count > +3, and 5 days.

Level 3 - posting rights, limited to 3 posts daily for 5 days; unlimited commenting. Graduate after 5 days and a total upvote count > 50.

Level 4 - image posting rights, 10 images per day max. Graduate after 2 weeks and a total upvote count > 100.

Level 5 - you've made it; everyone is equal here. Entry-level users are still enjoying and growing into the community. No need to be a tool about chasing karma/points, and the number of bots, temp accounts, and total losers should be minimal with this screening in place.
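A rough translation of this tier scheme into code (purely illustrative; the cumulative day thresholds are assumed from the per-level requirements above) could look something like this:

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class UserStats:
    account_age: timedelta
    active_minutes: int   # total logged-in activity
    comment_count: int
    total_upvotes: int

def user_level(u: UserStats) -> int:
    """Map a user's history onto the tier scheme sketched above."""
    level = 1  # browsing only
    if u.account_age >= timedelta(days=5) and u.active_minutes > 30:
        level = 2  # limited commenting
    if level >= 2 and u.comment_count >= 3 and u.total_upvotes > 3 \
            and u.account_age >= timedelta(days=10):
        level = 3  # posting rights, unlimited commenting
    if level >= 3 and u.total_upvotes > 50 and u.account_age >= timedelta(days=15):
        level = 4  # image posting (capped per day)
    if level >= 4 and u.total_upvotes > 100 and u.account_age >= timedelta(days=29):
        level = 5  # full member
    return level
```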

[–] [email protected] 3 points 1 year ago

A karma system is sounding pretty good right now

Lemmy's code already does it: person_aggregates keeps track of post_score and comment_score. It just isn't displayed in lemmy-ui. A bot or new code can look at these values.
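For illustration only, a small script could read those columns directly; this assumes read access to the instance's Postgres database, and the exact schema may vary between Lemmy versions:

```python
import psycopg2  # assumes direct read access to the Lemmy Postgres database

def karma(conn, person_id: int) -> int:
    """Sum the score columns Lemmy already keeps in person_aggregates."""
    with conn.cursor() as cur:
        cur.execute(
            "SELECT post_score + comment_score FROM person_aggregates "
            "WHERE person_id = %s",
            (person_id,),
        )
        row = cur.fetchone()
        return row[0] if row else 0

# Example usage (connection details are placeholders):
# conn = psycopg2.connect(dbname="lemmy", user="lemmy", host="localhost")
# print(karma(conn, 42))
```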

[–] [email protected] 6 points 1 year ago

Thank you for the update. I appreciate your efforts in moderating and keeping us informed.

[–] [email protected] 6 points 1 year ago

Great plan!

[–] [email protected] 6 points 1 year ago

I think the only sustainable option here is to keep media on the instance it was first posted to, with every instance managing its own content.

If an instance gets too crowded, it can close registrations and another instance will grow.

[–] [email protected] 5 points 1 year ago (1 children)

I didn’t even know there was an option to load images directly from the source instance instead of caching the content locally. I know it’s a resource issue and it can slow things down a bit for users, but I think ultimately it should be done that way by default, to mitigate exponential propagation of illegal content. Wasn’t caching the main reason why lemmy.world preemptively blocked piracy communities?

That, or admins should be able to selectively choose what communities to cache content from, like maybe the ones where they can confirm there is active moderation.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

Privacy-minded users want caching because otherwise they end up connecting to multiple (possibly malicious) websites instead of just lemm.ee (someone made a post that would grab your IP and show it to you, for example). It's a difficult trade-off.

[–] [email protected] 4 points 1 year ago

Good point. I was imagining users grabbing content from the source instance via their local instance as a proxy, which would hide their info. Obviously I don't know how the backend works, so if the alternative is direct connectivity exposing your info, then yeah that's definitely something to think about.

[–] [email protected] 4 points 1 year ago (1 children)

A way to deal with false positives from an ML NSFW scanner would be: once per day, each user can override the scanner. If a user is caught abusing this, they get banned.

[–] [email protected] 6 points 1 year ago (1 children)

This is an interesting idea. So if I'm understanding you correctly, the workflow would be like this:

  1. A user uploads 4 images; 2 are flagged as CSAM.

  2. The user overrides the flag on one image, asserting that "no, this isn't CSAM".

  3. On other sites, I've seen this work by the content remaining hidden, except to the user, until a team reviews it. If the team agrees, it's allowed on the site. I think this is different from what you are describing, though. I think you're suggesting that the content stay online after the user overrides the flagging, and a mod later double-checks whether the user was indeed trustworthy.

I only worry that an untrustworthy user could keep the content online until a mod reviews it, increasing the time the material is up and increasing the risk. It would be difficult to argue that "this was done in the interest of user satisfaction, even though it means more CSAM got out". Ultimately, I don't know how many people would want to argue that to a judge.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

From the OP, it seems the filter doesn't flag CSAM specifically; it flags anything NSFW. That said, keep in mind that the filter would also have false negatives, so people who want to slip NSFW through might manage it even without such an option.

But I don't mind the content staying hidden until a mod has reviewed it in such cases. The false positive rate of the filter would likely be small, so there wouldn't be too many things needing review.

[–] [email protected] 4 points 1 year ago (1 children)

I wonder how hard it would be to fund full-time staff to review content. That's how other platforms do it.

[–] [email protected] 6 points 1 year ago

Other platforms also use armies of unpaid volunteers to do it. There are various methods, but with this being an entirely volunteer-run and volunteer-financed platform, I really doubt that is feasible. In the long term I like the idea of using technology to improve detection and moderation, even if that requires some development commitment.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

I think images from other instances should never be cached in the first place; that is a huge oversight in pictrs. Not only does it have the potential to cache unwanted content, it also causes the hosted images to rapidly accumulate, which increases storage requirements and is unfair to people who want to self-host a personal instance. Hosting a personal instance should not come with monstrous storage requirements or serious liability risk from automatically caching all images; it should only cache what is uploaded to the instance itself, like profiles, banners, and posts that include local images.


I have reservations about fully invite-based registrations on Lemmy instances. While it might be good to have invites as a way for users to skip filling out an application, I don't really like the idea of requiring them the way Tildes does; it makes the instance feel like an elitist, exclusive club where you have to beg users for an invite. Invites shouldn't be an alternative to application-based registration, but rather a supplement to it: if someone can get an invite from a user, great, but if not, they should still be able to write an application to join. The application could be more extensive and lower priority, since invites exist, but it should still be an option.


Account requirements really depend on what they are and what they restrict (and also on who on the instance is allowed to impose restrictions). For example, on instances with downvotes enabled, I think score/upvote requirements are a bad idea, since they essentially mean that people who disagree get locked out, like karma restrictions on Reddit. I do not support this; it creates an echo chamber where unpopular opinions are pushed out. It would also lead to upvote farming if there are penalties for having a lower score.

Comment or post requirements would just lead to post or comment farming, similar to vote farming, though that's not as bad as score requirements: posts and comments made naturally (whether they are liked or not) can't be taken away by other people based on opinion, only by breaking the rules and having them removed, which isn't remotely the same thing.

Limiting image uploading is a fair requirement in my opinion, since uploads can be particularly harmful when malicious, and they aren't strictly needed: people can host almost all of their images externally.

When it comes to DMs and restrictions around them, I feel that should be up to individual users to decide: whether to allow private messages from certain users, or to allow DMs at all. It shouldn't be applied globally. Maybe it could be a default in user settings, with a requirement set by the admins, but people should be able to turn it off if they don't mind accepting messages from new users. I know I certainly will; I hate being nannied about who is allowed to send me messages. Annoying or uncomfortable DMs are a fact of life, and I prefer to deal with issues when they happen rather than block every new user who might want to talk to me. It's one of the things I hated about Reddit, which doesn't give me the option to opt out and receive messages from everyone.


I think having a machine-learning based system to identify malicious images is actually a pretty good idea going forward. I know how some people feel about AI and machine learning, but I think it's probably our best defense, considering that none of us want to see this content. It might have false positives, but I'd rather have that than allow CSAM to live here. Ultimately the choice is between ML scanning and disabling pictrs entirely, and I think ML is the better option, because people are going to want avatars, and without pictrs that isn't possible (unless Lemmy adds UI support for externally hosted avatars and banners).


[–] Anonymousllama 4 points 1 year ago (1 children)

Overall it's a tough situation to be in. I feel a combination of account restrictions would be a way to mitigate the majority of these low-quality troll accounts who get verified and then immediately start spamming.

Having image uploads tied to user metrics such as interactions, time since creation, upvote/downvote count, etc. feels like a good indicator of a "real" user. You'll always have bad actors coming in and causing issues, but at least making new users jump through hoops will slow that process down.

Closing registrations temporarily to add extra mod features is fine, but leaving them closed and switching to an invite-only system feels like it will slow adoption (unless the invite-request form is explicit that requests will be processed quickly; people will just move on otherwise).

[–] [email protected] 4 points 1 year ago (2 children)

Can we get image upload back?

[–] [email protected] 3 points 1 year ago
  1. I understand that this would be a temporary measure, and I hope this gets revisited in the near future.
  2. Got to do what you have to do.
  3. same as 2
  4. I do not agree with invite-based registrations and would prefer other ways to limit sign ups such as what others have already suggested in this thread.
  5. This will be tricky, but if done correctly would be something I can support.
  6. Agreed.

Once again, thank you for this wonderful instance and I'm glad this is my home.

[–] [email protected] 3 points 1 year ago

Have there been any developments on GitHub in regard to all this? Really, the only things that will solve this long term are proper, granular moderation tools.
