this post was submitted on 18 Jan 2024
429 points (95.2% liked)


Rep. Joe Morelle, D.-N.Y., appeared with a New Jersey high school victim of nonconsensual sexually explicit deepfakes to discuss a bill stalled in the House.

[–] [email protected] 58 points 10 months ago (3 children)

Creating fake child porn of real people using tools like Photoshop is already illegal in the US, so I don't see why new laws are required.

[–] Bgugi 37 points 10 months ago* (last edited 10 months ago) (2 children)

Well, those laws clearly don't work. So we should make new laws! Ones that DEFINITELY WILL work! And if those don't, well, I guess we just need more laws until we find some that do.

[–] NotMyOldRedditName 13 points 10 months ago

Since we apparently need a rule explicitly for AI-related cases, even though they're already covered by existing law, let's make sure we also write a 100-page law for material made in Photoshop, and another 80 pages if it was made in GIMP. If you use MS Paint, we need a special 200-page law with an even harsher punishment, because damn, you got skillz and need to be punished more.

[–] [email protected] 5 points 10 months ago* (last edited 10 months ago) (1 children)
[–] Bgugi 9 points 10 months ago (1 children)

No, I'm not criticizing the bill's content. If existing laws aren't enforced, new ones won't work either. At best, the new ones are an opportunity for people to huff and puff and pat themselves on the back at the cost of actual victims. At worst, they're smoke and mirrors for what the new law actually does.

[–] [email protected] -2 points 10 months ago* (last edited 10 months ago)
[–] General_Effort 12 points 10 months ago

This is not at all about protecting children. That's just manipulation. In truth, kids are more likely to be prosecuted than protected by this bill.

There are already laws that could be used against teen bullies, but it's rarely done. (IMHO it would create more harm than good, anyway.)

This is part of an effort to turn the likenesses of people into intellectual property. Basically, it is about more money for the rich and famous.

This bill would even apply to anyone who shares a movie with a sex scene in it. It's enough that the "depiction" is "realistic" and "created or altered using digital manipulation". Pretty much any photo nowadays, and certainly any movie, can be said to be "altered using digital manipulation". There's no mention of age, deception, AI, or anything else the PR bullshit suggests.

[–] rabiddolphin 9 points 10 months ago

Regulatory capture. OpenAI wants to kick down the ladder.