Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

[–] [email protected] 25 points 1 month ago (5 children)

Most AI porn images look quite underage to me, to be completely frank. :/

[–] [email protected] 15 points 1 month ago (4 children)

At least with a human being, it’s a verifiable matter of fact whether or not they’re over 18. But with AI it’s unverifiable, especially considering some models have already been trained on CSEM.

Once someone has that model locally, do they technically possess CSEM, even unknowingly? Do they only possess it if they try to make the AI make it? Seems like something someone in charge should have thought about in a legally binding way before dumping the internet into an image generator!

[–] [email protected] 26 points 1 month ago* (last edited 1 month ago) (1 children)

In this case he used pictures of actual children and transformed them into CSAM using AI. So there's no question about the age, and there are real victims, too.

[–] [email protected] 9 points 1 month ago

Oh yeah, this dude is without question guilty and a pedo. I meant more that ‘out of the box’ models may still produce material that looks really CSEM-adjacent, and you have no way of telling whether or not CSEM was used to generate the image if the whole dataset is poisoned by actual CSEM being included.

[–] FourPacketsOfPeanuts 5 points 1 month ago

Bound to be tested in court sooner or later. As far as I understand it, one is "in possession" if they have access to a set of steps or procedures that would recover an image. This prevents offenders from hiding behind the fact that their images were compressed in a zip file or something: they don't have a literal offending image, but they possess it in a form they can transform back.
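
To make the "recoverable form" idea concrete, here's a toy sketch (Python's zlib standing in for any archive or encoding; nothing here is specific to the actual case law):

```python
import zlib

original = b"stand-in for the bytes of an image file"

archived = zlib.compress(original)      # what actually sits on disk
recovered = zlib.decompress(archived)   # the "set of steps" that recovers it

# The stored bytes differ from the original, but a deterministic
# procedure recovers the exact original on demand.
assert archived != original
assert recovered == original
```

The legal question for AI would then be whether generation counts as such a recovery procedure, or as creating something new.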

What would need to be tested is whether AI generators are coming up with novel images rather than retrieving existing ones. It seems like common sense, but the law is quite pedantic. The more significant issue is that generators don't need to be trained on CSEM to come up with it, so proving someone possessed a model with the intent of producing it would always be hard. Even with generators trained on illegal material, I'm not sure it would be straightforward to prove that someone knew what the model was capable of.

[–] [email protected] 3 points 1 month ago (1 children)

I assume any CSEM ingested into these models is absolutely swamped by the massive amount of adult porn that's much more easily available. A handful of images isn't going to drive model output in datasets of the scale of the image-generation models. Maybe there are keywords that could drill down to be more associated with the child porn, but a lot of "young"-type keywords are already plentifully applied to adults, and I imagine accidental child porn ingests are much less likely to be as conveniently labeled.

So maybe you can figure out how to get it to produce child porn, but it probably won't just randomly produce it for an innocent porn prompt.

[–] [email protected] 3 points 1 month ago (1 children)

The actual issue is that models that are trained on porn, or even just nudity, and simultaneously trained on perfectly innocent pictures of children will be able to produce at least an approximation of CSAM if you know what you're doing.

More recent commercial foundation models are absolutely neutered when it comes to nudity, and (or at least that's the hypothesis) that's why they're so godawful at anatomy. Which is why upcoming community models are going to include the porn but leave out any pictures of any child in any situation.

[–] [email protected] 1 points 1 month ago

Absolutely agree. My comment above was focused on whether some minimal amount of CSEM would itself make similar images happen when just prompting for porn, but there are a few mechanics that likely bias a model toward creating young-looking faces in porn, and with intentional prompt crafting I have no doubt you could at least get an approximation of it.

I'm glad to hear about the models that are intentionally separating adult content from children. That's a good idea. There's not really much reason an adult-focused model needs to be mixed with much other data; there's already so much porn out there. Maybe if you want to tune something unrelated to the naked parts (like the background), or you want some mundane activity, but naked; neither of those things needs kids in them.

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago) (2 children)

I'm not the person to clear up this legal grey area. I just think that AI porn often has these incredibly young faces, which makes the enjoyers of that porn extra creepy.

[–] [email protected] 2 points 1 month ago

Oh yeah, I was agreeing with you, sorry if I was unclear. It pisses me off this situation exists to begin with.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

Creeps, or young themselves, and/or they have shit-all for an artistic eye. AI models have given us a whole new uncanny valley: photorealistic textures on comic/anime abstraction.

[–] [email protected] 10 points 1 month ago

I have not personally explored AI porn, but as someone with experience in machine learning and its accidental biases, that's not very surprising to me.

On top of the general societal bias towards youth for "beauty"-related roles, smoother and less-featured faces (which in general look younger) are closer to an average face, so defaulting to that gets a bit of a training boost (when in doubt, target the mean). It's probably also not helped by youth-related porn keywords (teen, daughter, young) that further associate other porn prompts (even ones not about youth) with non-porn images of underage girls that also have those keywords.
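
As a toy illustration of that "target the mean" effect (purely synthetic arrays standing in for face images, not a real model or dataset), averaging many samples wipes out per-sample detail, so the mean ends up smoother than any individual example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for faces: a shared base pattern plus per-image detail.
base = rng.normal(size=(64, 64))
faces = [base + rng.normal(scale=1.0, size=(64, 64)) for _ in range(100)]

mean_face = np.mean(faces, axis=0)

# Per-image detail averages out: deviation from the shared base shrinks
# by roughly a factor of N when you average N samples.
def detail(img):
    return np.var(img - base)

print(f"avg detail of individual samples: {np.mean([detail(f) for f in faces]):.3f}")
print(f"detail of the mean sample:        {detail(mean_face):.4f}")
```

A model that hedges toward that low-detail mean gets penalized less on average, which is one plausible mechanism behind the smooth, young-looking default faces.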

[–] FlyingSquid 8 points 1 month ago (1 children)

Most real porn has women who look like kids to me.

Even the so-called MILFs look about 15 years younger than me, and I'm 47.

You have to get into "mature" and shit to see women my age.

I'm not into young women. I'm just not. It looks like they're fucking a high schooler, and it's icky to me.

And then there's all the schoolgirl and incest or incest-adjacent shit. "Playing with my stepdad." No. Just no.

[–] Zorque -5 points 1 month ago (2 children)

So... anyone who's not your age looks like a child to you? That's kind of fucked up.

[–] FlyingSquid 14 points 1 month ago (1 children)

No?

The majority of the women in porn, who can't be more than their very early 20s, look like children to me. And they infantilize them, too. I'm not sure where you got "anyone not my age" from.

[–] Zorque -1 points 1 month ago (1 children)

Fair, you didn't explicitly state it. You just implied it with statements about how most people in porn (who should all be adults, unless you're looking at questionable material) look like children to you, then made comments about how even the "milfs" are too young.

Maybe it's not about them being too young; maybe it's time you accept that you're old. You're putting a lot of your own biases into your judgment instead of looking at it objectively.

[–] FlyingSquid 6 points 1 month ago* (last edited 1 month ago)

Sorry... why should I look objectively at what I personally want out of the porn I watch? It's entirely subjective.

I mean, I'm not sure how I could have been clearer that this was about my personal preferences. I said "to me" twice.

[–] [email protected] 6 points 1 month ago (1 children)

You gotta give me your yoga instructor because that's a big stretch!

[–] Cryophilia 1 points 1 month ago

He literally said that 32-year-old women look too young for him.

[–] [email protected] 3 points 1 month ago (1 children)

Most AI porn I've seen looks like eldritch abominations, but then I haven't seen any for a while, so maybe it's improved.

[–] [email protected] 0 points 1 month ago

I don't want to kinkshame freaks who like eldritch abominations. I wanna kinkshame pdf file creeps.

[–] [email protected] 2 points 1 month ago* (last edited 1 month ago)

I've seen some grandmas.