this post was submitted on 19 Sep 2023
623 points (98.1% liked)
Europe
This was just a matter of time - and there isn't really much that those affected can do (and in some cases, should do). Shutting down that service is the correct thing - but that'll only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.
So in the long term we'll see this shift to images generated at home, by kids often too young to be prosecuted - and you won't be able to stop that unless you start outlawing most AI image-generation tools.
At least in Germany, the handling of child/youth pornography was badly botched by incompetent populists in the government - the current law would send any of those parents to jail for at least a year if they take possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be enough to start a prosecution against them.
There's one blessing coming out of that mess, though: for girls who did take real pictures and had them leaked, saying "they're AI generated" is becoming a plausible way out.
Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.
Ironically, in a sense we will revert to the era before photography existed: to verify whether something is real, we might have to rely on witness testimony.
This just isn't true. These images will still be used to sexualise people, mostly girls and women, against their consent. It's no different from AI-generated child pornography: it does harm even if no 'real' people appear in the images.
Fucking horrible world we're forced to live in. Where's the fucking exit?
Sauce that allowing computer generated cp causes more harm?
How is this place infested with so many fucking nonces?
I made no claims about "more harm", so what imaginary claim are you referring to in your attempt to justify CSAM?
Oh, so you want more harm. Curious.