this post was submitted on 21 May 2024
509 points (95.4% liked)

Technology

[–] [email protected] 40 points 1 month ago (22 children)

OMG. Every other post is saying they're disgusted about the images part, but that's a grey area; he's definitely in trouble for contacting a minor, though.

Cartoon CSAM is illegal in the United States. AI images of CSAM fall into that category. It was illegal for him to make the images in the first place BEFORE he started sending them to a minor.

https://www.thefederalcriminalattorneys.com/possession-of-lolicon

https://en.wikipedia.org/wiki/PROTECT_Act_of_2003

[–] Madison420 23 points 1 month ago (6 children)

Yeah, that's toothless. They decided there is no reliable way to age a cartoon; the characters could be from another planet and simply look younger while actually being older.

It's bunk. Let them draw or generate whatever they want; totally fictional events and people are fair game, and quite honestly I'd rather they stay busy doing that than get active actually abusing children.

Outlaw shibari and I guarantee you'd have multiple serial killers btk-ing some unlucky souls.

[–] [email protected] 21 points 1 month ago (39 children)

Exactly. If you can't name a victim, it shouldn't be illegal.

[–] ZILtoid1991 0 points 1 month ago (1 children)

My main issue with generation is the ability to make it close enough to reality. Even with the more realistic art, some outright referenced or even traced CSAM. The other issue is the lack of easy differentiation between reality and fiction, which muddies the water. "I swear officer, I thought it was AI" would become the new "I swear officer, she said she was 18".

[–] surewhynotlem 17 points 1 month ago (6 children)

Would Lisa Simpson be 8 years old, or 43 because the Simpsons started in 1989?

[–] Clbull 8 points 1 month ago (2 children)

I thought cartoons/illustrations of that nature were only illegal in the UK (Coroners and Justices Act 2008) and Switzerland. TIL about the PROTECT Act.

[–] ZILtoid1991 3 points 1 month ago

The thing about the PROTECT Act is that it relies on the Miller test, which has obvious holes, and its outcome depends on who is reviewing the material. I've heard even the UK law has holes which can be exploited.

[–] [email protected] 3 points 1 month ago (1 children)

Several countries prohibit any fictional depictions of child porn, whether drawn, written or otherwise. Wikipedia has an interesting list on that - https://en.wikipedia.org/wiki/Legality_of_child_pornography

[–] Rayspekt 1 points 1 month ago (2 children)

I wonder if there is significant migration happening into those countries where CSAM is legal.

[–] ZILtoid1991 2 points 1 month ago

Most people instead take a trip to a place where underage sex workers are common; one can just keep an external hard drive and/or a USB stick for that material, which they hide. "An"caps are actively trying to form their own countries, partly to legalize "recordings of crimes", as they like to call them, if not outright to legalize child rape and child sex trafficking.

[–] [email protected] 2 points 1 month ago

Unlikely. Tourism, on the other hand...
