this post was submitted on 24 Jul 2023
211 points (81.7% liked)


Not a good look for Mastodon - what can be done to automate the removal of CSAM?

[–] whatsarefoogee 139 points 1 year ago (3 children)

Mastodon is a piece of software. I don't see anyone saying "phpBB" or "WordPress" has a massive child abuse material problem.

Has anyone in history ever said "Not a good look for phpBB"? No. Why not? Because it would make no sense whatsoever.

I'm almost at a loss for words because of how obvious it should be. It's like saying "paper is being used for illegal material. Not a good look for paper."

What is the solution to someone hosting illegal material on an nginx server? You report it to the authorities. You want to automate it? Go ahead and crawl the web for illegal material and generate automated reports. Though you'll probably be the first to end up in prison.

[–] [email protected] 31 points 1 year ago (1 children)

I get what you're saying, but because of the federated nature of the platform, CSAM can easily spread to many instances without their admins noticing. Having even a single piece of CSAM on your server is a huge risk for the server owner.

[–] [email protected] 26 points 1 year ago (1 children)

I don't see what a server admin can do about it other than defederate the instant they get reports. Otherwise how can they possibly know?
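If an admin does want to automate that step, one option is a small script against Mastodon's admin domain-blocks endpoint (`POST /api/v1/admin/domain_blocks`, available in recent Mastodon versions with an admin-scoped token), so a confirmed report can trigger defederation immediately. A minimal sketch; the instance URL, token, and domain below are placeholders:

```python
import requests

INSTANCE = "https://example.social"  # placeholder: your instance URL
TOKEN = "ADMIN_ACCESS_TOKEN"         # placeholder: token with admin domain-block scope

def defederate(domain: str, severity: str = "suspend") -> None:
    """Create a domain block so this instance stops federating with `domain`."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"domain": domain, "severity": severity},
        timeout=10,
    )
    resp.raise_for_status()

# e.g. act on a confirmed report the moment it comes in:
defederate("bad-instance.example")
```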

[–] [email protected] -4 points 1 year ago (2 children)

This could be a really big issue though. People can make instances for really hateful and disgusting crap, and even if everyone defederates from them, it still gives them a platform: a tiny corner of the internet to talk about truly horrible topics.

[–] [email protected] 13 points 1 year ago

Those corners will exist no matter what software they use, and there is nothing Mastodon can do to stop this. There's a reason there are public lists of instances to defederate. This content can only be prevented by domain providers and governments.

[–] [email protected] 11 points 1 year ago

Again, if illegal content is publicly available, officials can charge those site admins with the crime of hosting it. Everyone else just has a duty to defederate.

[–] [email protected] 1 points 1 year ago

I've thought about building a truly decentralized app similar to Lemmy, but the question of how to prevent things like CSAM from ending up on unwitting users' devices is the main thing stopping me.

Lemmy has exactly the same problem, and the solution seems to be to defederate from instances that host that kind of content. That works, but it's a lot of work for an admin, so we absolutely need better moderation tools to help detect unwanted content and block its source (a rough sketch of the detection side is below).

I just wish people wouldn't post such nonsense.
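To make "better moderation tools" concrete: the usual approach is to compare each incoming upload against a database of hashes of known material and quarantine matches before they are published or federated. A minimal sketch using exact SHA-256 hashes; the hash-list file and function names here are hypothetical, and production systems (PhotoDNA, Meta's PDQ) use perceptual hashes instead so re-encoded copies still match:

```python
import hashlib
from pathlib import Path

# Hypothetical file of hex SHA-256 digests of known abuse material,
# e.g. synced from an industry hash-sharing programme.
KNOWN_BAD = set(Path("known_bad_hashes.txt").read_text().split())

def sha256_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in 64 KiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def should_quarantine(upload: Path) -> bool:
    """True if the upload matches the known-bad list and should be
    held back and reported instead of being published or federated."""
    return sha256_file(upload) in KNOWN_BAD
```

Exact hashing is trivially defeated by recompression, which is why perceptual hashing matters in practice, but the plumbing is the same: hash on upload, check a shared list, quarantine and report on a match.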

[–] [email protected] -2 points 1 year ago (1 children)

That's a dumb argument, though.

phpBB is not the host or the provider. It's just something you download and install on your server, with the actual service provider (you, the owner of the server and operator of the phpBB forum) being responsible for its content and curation.

Mastodon/Twitter/social media is the host/provider/moderator.