this post was submitted on 28 Oct 2024
280 points (97.9% liked)

World News


A community for discussing events around the World

Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images from photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

[–] Mango 1 points 1 month ago (1 children)

So your point is that because he's fast with this tool, it's bad? Guess we gotta institute fake CP data rate limits.

[–] [email protected] 1 points 1 month ago (1 children)

A tool that allows anyone to generate countless images of CSAM in minutes (based on real images as input) is definitely worse than someone needing to spend years honing an art and using hours to produce one image of CSAM. I'm not really sure how someone could argue against that.

[–] Mango 1 points 1 month ago (1 children)

Why? It's pictures. Sticks and stones yo.

[–] [email protected] 1 points 1 month ago (1 children)

So again...you wouldn't do it with your children's pictures, right?

[–] Mango 1 points 1 month ago (1 children)

I'm an antinatalist. I think the shit kids gotta go through regularly is worse than all that.

[–] [email protected] 1 points 1 month ago

So if you were to get 5 cents per image, would you do it? Lol