this post was submitted on 11 Sep 2023
1133 points (96.0% liked)
Technology
If you were head of a psychiatric ward and had an employee you knew was telling patients "Boy, I sure wish someone would kill as many black people as they could", you would absolutely share responsibility when one of them did exactly that.
If you were deliberately pairing that employee with patients who had shown violent behaviour on the basis of "they both seem to like violence", you would absolutely share responsibility for that violence.
This isn't a matter of "there's just so much content, however can we check it all?".
Reddit has hosted multiple extremist and dangerous communities, claiming "we're just the platform!" while handing over the very predictable post histories of mass shooters, week after week.
YouTube has built an algorithm and monetisation system deliberately designed to lure people down rabbit holes, then done nothing to stop it luring them towards domestic terrorism.
It's a lawsuit against companies worth billions; they're not being executed. There are grounds to accuse them of knowingly profiting from the grooming of terrorists, and if they want to prove that's not the case, they can do it in court.