this post was submitted on 21 May 2024
509 points (95.4% liked)

[–] helpImTrappedOnline 150 points 1 month ago* (last edited 1 month ago) (4 children)

The headline/title needs to be extended to include the rest of the sentence

"and then sent them to a minor"

Yes, this sicko needs to be punished. Any attempt to make him the victim of "the big bad government" is manipulative at best.

Edit: made the quote bigger for better visibility.

[–] cley_faye 49 points 1 month ago

That's a very important distinction. While the first part is, to put it lightly, bad, I don't really care what people do on their own. Getting real people involved, and a minor at that? Big no-no.

[–] [email protected] 26 points 1 month ago (1 children)

All LLM headlines are like this, to fuel the ongoing hysteria about the tech. It's really annoying.

[–] helpImTrappedOnline 8 points 1 month ago* (last edited 1 month ago)

Sure is. I report the ones I come across as clickbait or misleading titles, explaining the parts left out... such as this one, where those 7 words change the story completely.

Whoever made that headline should feel ashamed for victimizing a groomer.

[–] MeanEYE 7 points 1 month ago

I'd be torn on the idea of AI-generated CP, if it were only that. On one hand, if it helps them calm the urges while no one is getting hurt, all the better. But on the other hand it might cause them not to seek help, though the problem is already stigmatized severely enough that they are most likely not seeking help anyway.

But sending that stuff to a minor? Big problem.

[–] [email protected] -1 points 1 month ago (2 children)

Cartoon CSAM is illegal in the United States. Pretty sure the judges will treat his images under the same ruling.

https://en.wikipedia.org/wiki/PROTECT_Act_of_2003

https://www.thefederalcriminalattorneys.com/possession-of-lolicon

[–] Madison420 9 points 1 month ago

It won't. They'll get him for the actual crime, not the thought crime that's been nerfed to oblivion.

[–] ameancow 3 points 1 month ago

Based on the blacklists one has to fire up before browsing just about any large anime/erotica site, I'm guessing these "laws" are not enforced, because they are flimsy laws to begin with. Reading the stipulations for what constitutes a crime is just a recipe for getting an entire case tossed out of court. I doubt any prosecutors would lean hard on possession of art unless it was being used in another crime.