this post was submitted on 28 Oct 2024

World News


Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

[–] [email protected] 53 points 3 months ago (2 children)

Because they are using images of real children.

[–] FlyingSquid 25 points 3 months ago (4 children)

I agree, but if there were some way to create CSAM without using real children (I'm not sure how you would train such an AI model), it would probably be worth seeing if that did anything to make pedophiles less likely to act out on their desires.

Because my god, we need to figure out something.

[–] Zorque 20 points 3 months ago (1 children)

I mean trying to help them get treatment instead of going all pod-people on anyone showing even the possibility of being attracted to kids would be helpful.

[–] FlyingSquid 22 points 3 months ago

I've been saying that for ages. Obviously we don't want to enable any pedophiles to do anything horrific to children, but we're at a state right now where if you have those urges to begin with, you're basically already told to accept that you're an incurable monster. So why not act on the urges?

Somehow we need to get through to such people that they need to get help before they do anything terrible. I'm not sure how to do that in the current climate though.

[–] [email protected] 7 points 3 months ago* (last edited 3 months ago) (1 children)

The way AI models work, you don't have to train them on the thing you want them to do; you can ask them to combine the things they already know about. Take any of the meme LoRAs, for example, like Pepe Punch or Patcha.

So literally any model that can generate pictures of naked adults and clothed children - which is to say almost all of them - is going to be at least somewhat competent in creating CP unless those prompts are being actively censored and blocked.

[–] [email protected] 2 points 3 months ago (1 children)

Wouldn't that generate images of children with small-sized adult bodies?

If it doesn't know what a child's body looks like, it can't just figure it out.

[–] [email protected] 1 points 3 months ago

The datasets will have enough images of kids in bikinis and underwear from stock photos and clothes shop listings etc to figure that part out rather easily.

[–] [email protected] 5 points 3 months ago

Train it to depict humans that look like anime characters that are ~~definitely 18 or older~~ immortal dragons that are taking on the bodies of young human beings

Disclaimer: I am not condoning, endorsing, or suggesting this

[–] Mango -4 points 3 months ago (1 children)
[–] [email protected] 7 points 3 months ago (1 children)

It's a form of stalking, and it probably makes it more likely for them to rape that child. Even if they don't wind up doing that, it would still qualify as a form of revenge porn.

[–] Mango 2 points 3 months ago (2 children)

It's not stalking and "probably" shouldn't rouse a courtroom.

[–] [email protected] 4 points 3 months ago* (last edited 3 months ago) (1 children)

It is when they are commissioning these "works".

Edit: To be clear, that's what happened here.

[–] Mango 3 points 3 months ago (2 children)

Commissioning as in buying? I'm not sure how that changes it to stalking.

IMO, the worst part about it is that there's someone else out there who thinks less of me because there's some naked imagery of me.

[–] [email protected] 2 points 3 months ago (1 children)

Commissioning as in a buyer has an interest in a particular child. They ask the guy using AI to make a custom bit of CSAM, so the buyer can have CSAM of that specific child.

That kind of commissioning.

[–] Mango 0 points 3 months ago (1 children)

Okay, but if I ask someone to draw me a picture of Nicolas Cage naked, is that stalking him? What if I have Nic Cage pictures all over my walls and even ceiling and my phone wallpaper? Is that stalking? Does it help if I'm really horny for him? And I touch myself?

[–] [email protected] 2 points 3 months ago* (last edited 3 months ago) (1 children)

We aren't talking about a famous person.

We are talking about someone taking pictures of kids they know to have someone else turn it into CSAM.

The comparison you are trying to make is completely irrelevant. The fact that you see it as a comparison makes it even worse.

[–] Mango 0 points 3 months ago (1 children)

Do famous people have certain exemptions? Fewer rights?

You can definitely say that them going around trying to get the pictures to begin with is stalking though. I pretty well didn't consider that step and was focused on the AI bit.

[–] [email protected] 1 points 3 months ago

The AI part is a continuation.

And this...

Do famous people have certain exemptions? Fewer rights?

Is completely irrelevant and ridiculous as a comment.

You are comparing a household name and likeness to a child that someone wants to sexually abuse, is near, and able to get pictures of.

Stop talking about celebrities and come on down to reality please.

[–] [email protected] 1 points 3 months ago (1 children)

Oh I didn't realize I needed to be a lawyer.

[–] Mango -3 points 3 months ago

I can buy photos of Robert Downey Junior from Marvel Studios and that's not stalking.