this post was submitted on 31 Jul 2023
941 points (98.9% liked)

As it says in the title, the BBC is starting its own Mastodon instance. I think the CBC (and other news networks) should do the same. Particularly with the recent passage of Bill C-18, a world where the links we share are crossposts of news organizations' own content seems like the perfect resolution to that whole issue.

[–] voluble 9 points 1 year ago* (last edited 1 year ago) (4 children)

I didn't know about the BBC thing; that's a pretty big deal for the fediverse. A question, though: in the linked BBC article, it seems like they're relying heavily on moderation coming from the home instance of anyone who replies to a BBC post. If a self-hosted troll server decides to start aggressively spamming these media accounts, or posts illegal material as replies to their posts, what could a media organization do to stop it? Is there any protection against, say, a wide network of troll servers working together?

Traditional social media platforms at least theoretically have a better ability to shut this sort of activity down, because they can see the whole picture of user activity and use algorithms to discover and ban bots. I worry that decentralization itself will become an attack vector for malicious activity.

[–] [email protected] 8 points 1 year ago (3 children)

Isn't this what defederation is for? If it became enough of an issue, media companies could even work together to maintain a shared blacklist to reduce the individual burden.
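
As a rough sketch of how that shared list could actually be applied: Mastodon 4.x exposes admin endpoints for domain blocks, so each participating instance could periodically sync against a published list. Everything below (the instance URL, the blocklist URL, the token) is a made-up placeholder, not anyone's real setup:

```python
# Sketch: sync a shared, community-maintained blocklist to a Mastodon
# instance via its admin API (Mastodon >= 4.0, token with the
# admin:write:domain_blocks scope). Blocklist format assumed to be
# one domain per line, '#' for comments.
import requests

INSTANCE = "https://news.example.ca"                                # hypothetical media instance
ADMIN_TOKEN = "REPLACE_WITH_ADMIN_TOKEN"
SHARED_BLOCKLIST_URL = "https://example.org/shared-blocklist.txt"   # hypothetical shared list

headers = {"Authorization": f"Bearer {ADMIN_TOKEN}"}

# Fetch the shared list of troll domains.
resp = requests.get(SHARED_BLOCKLIST_URL, timeout=10)
resp.raise_for_status()
domains = [line.strip() for line in resp.text.splitlines()
           if line.strip() and not line.startswith("#")]

# Domains this instance has already blocked (first page only; real code would paginate).
existing = requests.get(f"{INSTANCE}/api/v1/admin/domain_blocks",
                        headers=headers, timeout=10).json()
already_blocked = {block["domain"] for block in existing}

# Suspend anything on the shared list that isn't blocked yet.
for domain in domains:
    if domain in already_blocked:
        continue
    requests.post(f"{INSTANCE}/api/v1/admin/domain_blocks",
                  headers=headers, timeout=10,
                  data={"domain": domain, "severity": "suspend",
                        "public_comment": "shared blocklist"})
    print(f"suspended {domain}")
```

Each instance still decides whether to run something like this at all, which is where the governance questions below come in.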

[–] voluble 4 points 1 year ago* (last edited 1 year ago) (2 children)

Good point & thanks for the post.

Won't different media companies have different red lines for what gets blacklisted and what doesn't, and wouldn't that be confusing at best and a political quagmire at worst? Let's say (after the fediverse gains some momentum) an influential politician uses a self-hosted instance exclusively to communicate their policies and as a home for their political base, but deliberately leaves the server moderated well below an acceptable level. Are media companies obligated to defederate it? Will they? It seems like there's a whole new world of trade-offs and grey areas here.

Even if we assume troll instances are easily and effectively defederated, and can't be spun up faster than they can be collectively blocked: other than volunteer moderation, what stops an ocean of trolls from flooding better-known, federated instances?

Just to be clear: I'm 100% an advocate for the fediverse, and I'm here because I think it's awesome right now. I just worry about the chances of it getting drowned in troll/malicious/corporate material as it grows in popularity, and I'm trying to think of ways to stem that tide. It seems reasonable to expect that it will start coming.

[–] [email protected] 4 points 1 year ago

At some point, a lot of servers will end up defederated and some big players will start to be seen as more trustworthy, just like email servers.

You can't just start sending email from your own server without first proving it's reliable.

I hope that we, as a society (i.e. governments), create a process for becoming a trusted server instead of relying on the free market for this, because right now it's very hard to send email without getting blacklisted by every major email provider.
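
For context, "proving it's reliable" on the email side today mostly means publishing SPF/DKIM/DMARC DNS records and slowly building IP reputation. A quick sketch of checking the DNS part with dnspython (the domain is just a placeholder):

```python
# Sketch: check whether a domain publishes the SPF and DMARC TXT records
# that major providers expect before trusting mail from it.
# Requires dnspython (pip install dnspython); example.com is a placeholder.
import dns.resolver

def txt_records(name: str) -> list[str]:
    """Return the TXT records for a DNS name, or [] if none exist."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return []
    return [b"".join(r.strings).decode() for r in answers]

domain = "example.com"

spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]

print("SPF:  ", spf or "missing")
print("DMARC:", dmarc or "missing")
```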

[–] matt 2 points 1 year ago

In your example, people whose instances have the "bad instances" blocked won't see the replies under the posts in question, since their instance won't fetch replies from that source.

Because of how Mastodon works, it also won't fetch replies from instances until they're known, so brand-new instances aren't going to flood popular comment sections. That's a bit of a downside in its own way, though, since it degrades the user experience when reading threads and causes people to repeatedly post the same things because they can't see all the replies.
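
The usual workaround for that is to paste the post's URL into your own instance's search, which forces it to fetch the remote status. A sketch of the same thing via the API, with the instance URL, token, and status URL all being placeholders I made up:

```python
# Sketch: ask your home instance to fetch a remote post it hasn't seen yet,
# using Mastodon's search endpoint with resolve=true.
import requests

INSTANCE = "https://mastodon.example"      # your home instance (hypothetical)
TOKEN = "REPLACE_WITH_ACCESS_TOKEN"        # needs the read:search scope
REMOTE_STATUS = "https://social.example/@SomeAccount/123456789"  # hypothetical post URL

resp = requests.get(
    f"{INSTANCE}/api/v2/search",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"q": REMOTE_STATUS, "resolve": "true", "type": "statuses"},
    timeout=10,
)
resp.raise_for_status()

# If the remote instance is reachable (and not blocked), the status is now
# known locally and appears in the results.
for status in resp.json().get("statuses", []):
    print(status["url"])
```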