this post was submitted on 03 Mar 2024
200 points (86.5% liked)

A.I. Is Making the Sexual Exploitation of Girls Even Worse: Parents, schools and our laws need to catch up to technology, fast.

[–] [email protected] 3 points 8 months ago* (last edited 8 months ago) (1 children)

If somebody wanted to draw animated kiddie porn they could still do that. How far would you go until you ban crayons?

It's genuinely impressive how completely you missed my point.

How about another analogy: US federal law allows people to own individual firearms, but not grenades.

But they're both things that kill people, right? Why would they be treated differently?

Hint: it's about scale.

The same is true of pipe bombs. Anyone can make a pipe bomb, so the genie is out of the bottle, right? So why are there laws regulating their manufacture and ownership? Hmm...

[–] [email protected] 3 points 8 months ago* (last edited 8 months ago)

I guess we kind of agree, then. A.I. is a pipe bomb for kiddie porn, and the genie is out of the bottle. There's not much we can do about it. We will still have laws, but they won't stop anything. As I already stated, the easy-to-access A.I. services already try to prevent this, and there will be more regulation, though I don't see what more laws and regulations can bring to the table.