It's hard to have a nuanced discussion because the article is so vague. It's not clear what he's specifically been charged with (generic "obscenity," rather than a specific child abuse statute?), and as far as I know, laws targeting simulated CSAM have generally been struck down when challenged.
I completely get the "lock them all up and throw away the key" visceral reaction - I feel it too, for sure - but this is a much harder question than it first appears. There are porn actors over 18 who look younger; would such laws bar them from work that would be legal for performers who merely look older? And if an AI model were trained exclusively on those over-18 performers, would its outputs not be CSAM even when the generated images have features that look under 18?
I'm at least all for a "fruit of the poisoned tree" theory: if an AI model's training data includes actual CSAM, its outputs can and should be made illegal. Intentionally deepfaking real people under 18 is also not black and white (looking again at the harm factor), but I think it can be justifiably prohibited. I also think distribution of completely fake CSAM can arguably be outlawed (the situation here), since it will soon be impossible to distinguish AI imagery from real imagery, and allowing it would undermine enforcement of the vital laws against the real thing.
The really hard case is producing and retaining imagery of fully fake people, with no real CSAM in the training data, solely for local use (possession crimes). That's genuinely tough. Not only does its creation not directly hurt anyone, there's a possible benefit: it could diminish the market for real CSAM (potentially sparing unrelated children from the abuse that demand drives), and it could divert the producer's impulses away from preying on children around them out of unfulfilled desire.
"Could," because I don't think there are studies that answer whether either of those effects is real.