There is a big difference between mild NSFW and full-on porn. Suppose there is a news story with a photo/video that's a little graphic or violent. Nobody is jacking off to that. Maybe you shouldn't view it at work, but in a library it's fine.
Maybe it's a funny meme pic but there's a nip slip situation going on. No biggie; it should probably be tagged NSFW, and you probably don't want it showing up at actual work. But I want to enable this kind of content away from work without a bunch of actual porn showing up in my feed.
There should be a porn tag. It's not the same as NSFW.
EDIT: The two main devs have done some amazing work here, but as I understand it they are totally booked for the foreseeable future. My Rust chops aren't quite up to snuff (yet) and my frontend chops are non-existent, so it might be quite a while before I'm up to speed enough to make a meaningful contribution. In the meantime I just thought I'd point out the issue.
I'm very defensive when it comes to NSFW. But I think "NSFW flag with a mandatory reason from a drop-down" (e.g. nudity, sex, violence, gore, explicit-language) is the best solution that satisfies both concerns. That is actually a great idea.
Just a UX nitpick: a drop-down is usually for single-choice questions, but this should be multiple choice to avoid unintentional mislabeling.
I agree. A thing can be sensitive content for more than one reason at the same time.
This kind of solves the issue of crossover content as well. I may be okay with seeing porn, but I don't want to see, like, gore porn. So something tagged both gore and nudity can just be hidden from me, thanks to me blocking anything tagged gore.
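The filtering rule being described is simple: hide a post if any of its reason tags is on the user's block list. Since the backend is Rust, here's a minimal sketch (the `is_hidden` function and tag names are hypothetical, not the project's actual API):

```rust
use std::collections::HashSet;

// Hypothetical helper: a post is hidden if it carries ANY tag the user blocks,
// even if its other tags would be acceptable on their own.
fn is_hidden(post_tags: &[&str], blocked: &HashSet<&str>) -> bool {
    post_tags.iter().any(|tag| blocked.contains(tag))
}

fn main() {
    // User blocks "gore" but not "nudity".
    let blocked: HashSet<&str> = ["gore"].into_iter().collect();

    // Tagged both gore and nudity: hidden, because gore is blocked.
    assert!(is_hidden(&["gore", "nudity"], &blocked));
    // Tagged only nudity: still shown.
    assert!(!is_hidden(&["nudity"], &blocked));

    println!("ok");
}
```

The "any match hides it" semantics is what makes the multi-select tagging above work for crossover content: one blocked tag is enough.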
Someone should write up the feature as described and explain, in a pull request on the repo, why this particular implementation solves the issues people are having. It would be much easier to implement than the custom tagging feature others are asking for. Like yeah, a custom tagging feature would be cool, but that's a ton of work and wouldn't show up for quite a while.