[–] [email protected] -1 points 6 months ago (2 children)

I think the challenge with generative AI CSAM is the question of where the training data originated. There has to be some questionable data in there.

[–] [email protected] 17 points 6 months ago

That would mean you need to enforce the law against whoever built the model. If the original creator has 100TB of cheese pizza, then they should be the one who gets arrested.

Otherwise you're busting random customers at a pizza shop for possession of the meth the cook smoked before his shift.

[–] [email protected] -4 points 6 months ago (1 children)

There is also the issue of determining whether a given image is real or AI-generated. If AI-generated images were legal, prosecutors would need to prove that an image is real and not AI, with the risk of letting real offenders go.

The case for banning AI CSAM is even clearer than the case for banning cartoon CSAM.

[–] Madison420 3 points 6 months ago* (last edited 6 months ago)

And in the process force non-abusers to seek their thrill through actual abuse. Good job, I'm sure the next generation of children will appreciate your prudish, factually inept effort. We've tried this with so much stuff; prohibition doesn't stop anything, it just creates a black market and an abusive power system to go with it.