this post was submitted on 16 Jun 2023
16 points (100.0% liked)

Anonymity and privacy seem to be at odds with a social platform's ability to moderate content and control spam.

If users have sufficient privacy and anonymity, they can simply come back under another identity, or use multiple identities at once.

Are there ways around this? It seems that any method of ensuring a banned user stays off the platform would require the platform to know something about the user and their identity.

[–] [email protected] 4 points 1 year ago (1 children)

Great points, thanks! On that note, there are a couple of ways I could see content moderation allowing more personal freedom and choice:

  • give users more general content filters. For example, "block all content containing ____ racial slur". This could be made more sophisticated as well, especially with how open-source language models are coming along

  • give users the ability to follow another user's self-moderation choices. A group of users could then act collectively: if one user flags a piece or type of content, the flag applies to the others. The nice part is that this would be extremely fluid, and you could opt out with a single button (see the sketch after this list).
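To make those two bullets concrete, here's a rough sketch. All the names (`UserModeration`, `followed`, `hides`) are hypothetical and not from any existing fediverse software; it just shows a per-user term filter plus mirroring someone else's flags:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    id: int
    body: str

@dataclass
class UserModeration:
    blocked_terms: set[str] = field(default_factory=set)            # terms this user never wants to see
    followed: list["UserModeration"] = field(default_factory=list)  # other users whose flags we mirror
    flagged: set[int] = field(default_factory=set)                  # post ids this user flagged personally

    def flag(self, post_id: int) -> None:
        self.flagged.add(post_id)

    def hides(self, post: Post) -> bool:
        # Rule-based filter: hide anything containing a blocked term.
        if any(term in post.body.lower() for term in self.blocked_terms):
            return True
        # Shared flags: hide anything flagged by me or by anyone I follow.
        if post.id in self.flagged:
            return True
        return any(post.id in other.flagged for other in self.followed)

# Usage: Bob mirrors Alice's flags, and can opt out with one line.
alice, bob = UserModeration(), UserModeration()
bob.followed.append(alice)
alice.flag(42)                 # Bob's client now hides post 42 too
bob.followed.remove(alice)     # one-click opt-out; nothing else changes
```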

This could lead to better moderation in my opinion, and less disconnect between moderators and users.

This doesn't solve the anonymity issue, but that's for another comment.

[–] [email protected] 0 points 1 year ago (1 children)

Those are reasonable options - though I'm pessimistic enough to believe that trolls will outsmart every automated system, so we'd probably want some manual options too. I wouldn't say it's impossible - it would just require quite a bit of work, and would likely be an ongoing battle to improve your auto-moderator.

It feels like I'm moving the goalposts, so apologies, but your response got me thinking further. The other big advantage I can see for central moderation is that it can actually prevent content from being hosted at all - which has two benefits:

  • legal concerns - many countries will require the removal of some content - extreme stuff of all the usual sorts. Some jurisdictions will also require that minors be prevented from accessing certain content, at least to a reasonable degree - refusing to host that kind of content is an easy solution.
  • community unity and protection - this is a lot more abstract, and debatable - but I'd contend that central moderation can send a certain "this content isn't wanted in our community" message that individual censorship won't. Really difficult to define, though.
[–] [email protected] 1 points 1 year ago

First, just to clarify: I am not saying all moderation should be automatic. That's what my first point described, but in my second point, moderation is still manual and delegated to another person. The only difference is that you can very easily opt out of it without losing anything else, or override it.

So, instead of moderation being something tightly coupled to the community or space where people post, it becomes something separate. You can "subscribe" to a moderation policy managed by a person or group, and anything they ban (automatically or manually) applies to you without extra effort. The benefit is that if you ever regret the "subscription", you don't lose the entire community - you can simply switch to a different moderation policy.
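Here's a minimal sketch of that separation, assuming a protocol where moderation policies are objects independent of communities (the names `ModerationPolicy` and `Reader` are made up for illustration, not an existing Lemmy/ActivityPub feature):

```python
from dataclasses import dataclass, field

@dataclass
class ModerationPolicy:
    name: str
    removed: set[int] = field(default_factory=set)   # post ids removed by this policy's mods

    def remove(self, post_id: int) -> None:
        self.removed.add(post_id)

@dataclass
class Reader:
    policies: list[ModerationPolicy] = field(default_factory=list)

    def visible(self, post_ids: list[int]) -> list[int]:
        # Hide everything removed by any subscribed policy.
        hidden = set().union(*(p.removed for p in self.policies))
        return [pid for pid in post_ids if pid not in hidden]

# Switching policies never touches community membership itself.
strict = ModerationPolicy("strict")
strict.remove(7)

me = Reader(policies=[strict])
print(me.visible([5, 6, 7]))   # [5, 6]    while subscribed
me.policies = []               # "unsubscribe" from the policy only
print(me.visible([5, 6, 7]))   # [5, 6, 7] the community is still all there
```

The point of the design is that the reader's relationship to the community and to the moderation policy are two separate subscriptions, so dropping one never costs you the other.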

To answer your other points:

  • legal concerns: I think it will always be hard to please every lawmaker, but this approach would be coupled with a censorship-resistant model. It's a protocol that is hard to ban outright, since another instance can spring up anywhere to provide a gateway to the rest.
  • this is still possible. The "community" in this case is the group of people subscribing to a particular moderation policy. The key is that unsubscribing from this policy is extremely easy without much loss, so user freedom is preserved.