The article points out that the strength of the Fediverse is also its downside: federated moderation makes it hard to moderate CSAM consistently.
We have seen this even here with the challenges of Lemmynsfw. In fact, they have taken the stance that CSAM-like images of of-age models made to look underage are fine as long as there is some dodgy 'age verification'.
The idea is that abusive instances would get defederated, but I think we are going to find that inadequate without some sort of centralized reporting escalation and AI auto-screening.