This post was submitted on 17 Nov 2023
593 points (95.1% liked)
If a Fediverse instance grew so big that it couldn't moderate itself and had a lot of spam/Nazis, presumably other instances would just defederate, yeah? Unless an instance is ad-supported, what's the incentive to grow beyond one's ability to stay under control?
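Mechanically, defederation amounts to a domain-level blocklist: an instance simply refuses activities coming from domains its admins have blocked. Below is a minimal sketch of that check, assuming hypothetical names (`Activity`, `BLOCKED_DOMAINS`, `handleInbox`) rather than any real server's code:

```typescript
// Minimal sketch of instance-level defederation: an inbox handler that drops
// activities from domains on a local blocklist. All names here are illustrative.

interface Activity {
  actor: string;   // e.g. "https://spammy.example/users/bot123"
  type: string;    // "Create", "Announce", ...
  object?: unknown;
}

// Domains this instance has chosen to defederate from (hypothetical list).
const BLOCKED_DOMAINS = new Set(["spammy.example", "unmoderated.example"]);

function actorDomain(actorUrl: string): string {
  return new URL(actorUrl).hostname;
}

function handleInbox(activity: Activity): void {
  if (BLOCKED_DOMAINS.has(actorDomain(activity.actor))) {
    // Defederation in effect: the activity is rejected, so the offending
    // instance's content never reaches local users.
    return;
  }
  // ...otherwise process the activity normally (store post, notify followers, etc.)
}
```

In practice, Mastodon and Lemmy expose this as an admin-level domain block or instance block setting rather than code you write yourself, but the effect is the same: blocked instances' content stops flowing in.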
deleted
We need to keep distinguishing "actual, real-life child-abuse material" from "weird/icky porn". Fediverse services have been used to distribute both, but they represent really different classes of problem.
Real-life CSAM is illegal to possess. If someone posts it on an instance you own, you have a legal problem. It is an actual real-life threat to your freedom and the freedom of your other users.
Weird/icky porn is not typically illegal, but it's something many people don't want to support or be associated with. Instance owners have a right to say "I don't want my instance used to host weird/icky porn." Other instance owners can say "I quite like the porn that you find weird/icky, please post it over here!"
Real-life CSAM is not just extremely weird/icky porn. It is a whole different level of problem, because it is a live threat to anyone who gets it on their computer.
You'd be surprised by how much of the Internet was built by furries, BDSM folk, and other people whose porn a lot of folks think is weird and icky.
Also, you seem to have misunderstood the gist of my comment, or I wasn't clear enough. The tools to deal with CSAM will of necessity be a lot stronger than content moderation that's driven by users' preferences of what they'd like not to see.
I'm talking about the necessities of moderation policy.
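To make the "stronger tools" point concrete, here is a sketch of the difference between an instance-level screen for known illegal material (matched by hash, rejected and reported unconditionally) and a user-preference filter (content merely hidden from users who opted out). All names are hypothetical, and real deployments match against vetted hash databases (e.g. PhotoDNA-style services) rather than a local set:

```typescript
import { createHash } from "node:crypto";

// Hypothetical list of hashes of known illegal material. Real systems use
// vetted external databases and perceptual hashing, not a local SHA-256 set.
const knownIllegalHashes = new Set<string>([/* ... */]);

// Per-user preference filter: cosmetic, applied only at display time.
const userMutedTags = new Map<string, Set<string>>([
  ["alice", new Set(["gore", "furry"])],
]);

function sha256(bytes: Buffer): string {
  return createHash("sha256").update(bytes).digest("hex");
}

// Instance-level policy: known illegal material is rejected and reported
// regardless of who uploaded it or who might want to see it.
function screenUpload(bytes: Buffer): "reject-and-report" | "accept" {
  return knownIllegalHashes.has(sha256(bytes)) ? "reject-and-report" : "accept";
}

// User-level moderation: the content stays on the server; it is only hidden
// from users who asked not to see it.
function isHiddenFor(user: string, postTags: string[]): boolean {
  const muted = userMutedTags.get(user) ?? new Set<string>();
  return postTags.some((t) => muted.has(t));
}
```

The asymmetry is the point: the first check runs server-side no matter what anyone's preferences are, while the second only changes what an individual user sees.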
The things you find it "suspect" that I'm not saying? Those are things I think are obviously true and don't need to be restated. Yes, child abuse is very bad. We know that. I don't need to keep saying it, because everyone already knows it. I'm talking specifically about the needs of moderation here.
I'm pointing at the necessary distinction between "you personally morally object to that material" and "that material will cause the law to come down on you and your users and anyone who peers with you".
You should have the ability to keep both of those off your server, but the latter is way more critical.
"White knighting"? Delete your account.