this post was submitted on 28 Aug 2023
1462 points (97.7% liked)
I'm guessing they're not even flagging that shit as NSFW? I've been using Liftoff and have the NSFW stuff hidden. I haven't run into any of it yet, but that's fucked up; hopefully this gets it under control.
Maybe mods of each section can turn on manual approvals of submissions?
Isn't there a tool (possibly free) from Google that detects abusive material like this?
https://protectingchildren.google/intl/en_uk/#introduction
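For context, tools like this generally work by hashing uploads and comparing them against a database of hashes of known abusive material; Google's CSAI Match and Microsoft's PhotoDNA use perceptual hashes so that resized or re-encoded copies still match. A minimal sketch of the simpler exact-match version (the hash list here is hypothetical, not a real API):

```python
import hashlib

# Hypothetical block list of SHA-256 digests of known-bad files.
# Real systems use perceptual hashes rather than exact digests,
# so trivially modified copies still match.
KNOWN_BAD_HASHES = {
    # This entry happens to be the SHA-256 of zero bytes, used
    # here only so the example is self-contained and checkable.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_bad(data: bytes) -> bool:
    """Return True if the upload's digest appears on the block list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

print(is_known_bad(b""))            # True (matches the entry above)
print(is_known_bad(b"cat photo"))   # False
```

The exact-match version is easy to defeat by changing a single byte, which is exactly why the production tools are perceptual-hash based.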
I agree, everything on Lemmy is public for all to see; that's the nature of the Fediverse. Nothing here is really private, even vote counts, since admins of any self-hosted server can see them, and Kbin reveals them publicly for all.
Even DMs aren't private, which is why Lemmy nags you to use Matrix for secure DMs.
Until there's something in place to automate blocking it, manual approval might be the only way to deal with it for now. Communities can also add more moderators.
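The approval flow being suggested is straightforward to model: submissions sit in a hidden queue and only go live once a moderator approves them. A minimal sketch, with hypothetical names (this is not Lemmy's actual moderation API):

```python
from dataclasses import dataclass, field

@dataclass
class ApprovalQueue:
    """Hypothetical pre-moderation queue: posts stay hidden until approved."""
    pending: list = field(default_factory=list)
    visible: list = field(default_factory=list)

    def submit(self, post: str) -> None:
        # New submissions are held for review instead of going live.
        self.pending.append(post)

    def approve(self, post: str) -> None:
        self.pending.remove(post)
        self.visible.append(post)

    def reject(self, post: str) -> None:
        # Rejected posts are dropped without ever being shown publicly.
        self.pending.remove(post)

q = ApprovalQueue()
q.submit("cool photo")
q.submit("spam")
q.approve("cool photo")
q.reject("spam")
print(q.visible)  # ['cool photo']
print(q.pending)  # []
```

The trade-off the next comments raise is visible in the sketch: someone still has to look at everything in `pending`, the work just moves to the moderators.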
Manual approval would mean that mods have to see all that shit to block it... That's not the right solution imo
They'll end up having to see it anyway to remove it, and by that point more than just the mods would have seen it...