this post was submitted on 01 Aug 2023
5 points (100.0% liked)

theATL.social Discussion

News, info, and announcements pertaining to theATL.social. Community is open to all, but official announcements are only posted by @[email protected]

founded 1 year ago

Hi theATL.social (Mastodon) and yall.theATL.social (Lemmy) friends. Your friendly admin, @michael, here.

Currently, theATL.social blocks two domains from federation but does not utilize any block lists. The Lemmy instance, yall.theATL.social, does not block any domains.

My general admin philosophy is to let users decide what content they want to see, or not see. However, the Mastodon UI can make adding and removing domain block lists a bit tedious. (There are some tech/UI-related options to make this easier; a rough sketch follows.)
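For those curious about the non-UI route: recent Mastodon versions expose an admin API for domain blocks, which avoids clicking through the web interface one domain at a time. Below is a minimal sketch of adding a single block that way. This assumes Mastodon 4.x's /api/v1/admin/domain_blocks endpoint and an access token with the admin:write:domain_blocks scope; the token, example domain, and comment text are placeholders of mine, not anything already configured on the server.

```python
# Minimal sketch: add one domain block via Mastodon's admin API.
# Assumes Mastodon 4.x and a token with the admin:write:domain_blocks scope.
# The domain, token, and comment below are placeholders.
import requests

INSTANCE = "https://theATL.social"
TOKEN = "REPLACE_WITH_ADMIN_TOKEN"

resp = requests.post(
    f"{INSTANCE}/api/v1/admin/domain_blocks",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data={
        "domain": "example-bad-instance.tld",  # placeholder domain
        "severity": "suspend",                 # "silence" is the softer option
        "private_comment": "added per community discussion",
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```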

On the other hand, I am personally not a free speech absolutist, and there are limits to what content could or should be relayed through theATL.social's servers: illegal content, instances dedicated solely to hate speech or harassment, and the like.

To that end, the Oliphant Tier 0 block list offers a "floor" that removes the very worst instances operating on the Fediverse: https://codeberg.org/oliphant/blocklists/src/branch/main/blocklists
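To make that "floor" concrete, here is a rough sketch of how a Tier 0 list could be pulled from that repository and imported in bulk through the same admin API. The raw CSV URL and the column name are placeholders you would need to confirm against the repo; nothing like this is running on the server today.

```python
# Rough sketch: bulk-import a block list CSV via the Mastodon admin API.
# Assumptions: the CSV URL must be replaced with the actual raw Tier 0 file
# from the Oliphant repo, the CSV has a "#domain" or "domain" column, and
# the token has the admin:write:domain_blocks scope.
import csv
import io
import requests

INSTANCE = "https://theATL.social"
TOKEN = "REPLACE_WITH_ADMIN_TOKEN"
BLOCKLIST_CSV_URL = "REPLACE_WITH_RAW_TIER0_CSV_URL"  # from the repo linked above

resp = requests.get(BLOCKLIST_CSV_URL, timeout=30)
resp.raise_for_status()
rows = csv.DictReader(io.StringIO(resp.text))

for row in rows:
    domain = (row.get("#domain") or row.get("domain") or "").strip()
    if not domain:
        continue
    # One admin API call per listed domain; "suspend" fully defederates.
    r = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={"domain": domain, "severity": "suspend"},
        timeout=30,
    )
    print(domain, r.status_code)
```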

As your admin, I don't want to make any unilateral decisions; rather, I'd prefer a user/stakeholder conversation, with as much Q&A as is helpful.

With that intro, let me know your thoughts:

[–] [email protected] 2 points 1 year ago (2 children)

I completely agree with utilizing the Tier 0 block list at a minimum. If things got bad and users were being harassed, then going further would be justifiable to me. I prioritize safe communities over free speech, if a trade-off has to be made.

[–] [email protected] 3 points 1 year ago (1 children)

Also, I believe the research on Mastodon CSAM that made all that noise last week pointed out that utilizing the Tier 0 block list would have filtered out roughly 80% of the stuff they found.

[–] [email protected] 1 points 1 year ago

Agreed; that report is what drove the urgency to implement this. It is one thing if "bad stuff" ended up on the server after reasonable steps were taken to prevent it. However, if reasonable steps were not taken and bad stuff showed up anyway...well, I don't want to be in that position!

[–] [email protected] 1 points 1 year ago

Duly noted! Thank you