this post was submitted on 31 Jan 2024
68 points (92.5% liked)

Technology


Taylor Swift AI images prompt US bill to tackle nonconsensual, sexual deepfakes::Bipartisan measure introduced in the US Senate will allow victims of ‘digital forgeries’ to seek civil penalties against perpetrators

top 9 comments
[–] TheGrandNagus 44 points 11 months ago (3 children)

Unfortunate that it took a famous person being a victim of this before action started to be taken.

That said, will this do anything to help? People are still going to do it regardless.

In a weird way, could this tech not be a good thing? There are unfortunately plenty of stories of young people in particular becoming extremely stressed to the point of harming themselves over leaks of their nudes.

If in the future, they can be trivially dismissed as being easily-generated deepfakes (regardless of whether they actually are), would that not lessen the harm considerably? Would it not recalibrate our brains to care about it less?

I don't know the answers to any of this. We're going into uncharted territory.

[–] [email protected] 13 points 11 months ago

https://lemmy.zip/comment/6691303

I actually argued similarly over here. This fake outrage because a celebrity was targeted serves nobody. Fakes will just get better, and legislation will always be on the back foot.

[–] General_Effort 3 points 11 months ago

I wonder what it will do to the stress levels of parents, knowing that their kid can be on the hook for $150k if they share some fake. Would any liability insurance cover this?

[–] abhibeckert -3 points 11 months ago* (last edited 11 months ago) (1 children)

People are still going to do it regardless.

Would they? Last year a woman was awarded $1.2b in damages after her ex-boyfriend distributed revenge porn.

How many people would hit that retweet button, if they knew it might lead to damages on that scale? Presumably her ex-boyfriend went bankrupt and lost everything he owned, having to give all of it to her (and her lawyers).

Sure, some people would still take that risk but not very many. And at least the victims would get a nice pay day out of it.

[–] TheGrandNagus 10 points 11 months ago* (last edited 11 months ago)

Would they?

Yes.

People break copyright and other IP laws all the time, for example.

Shit, torrenting a film carries a 10 year max prison sentence where I am. It doesn't stop anybody.

Speeding fines can be absolutely huge. People still speed. Etc.

A law like this is virtually impossible to enforce, and the crime in question is getting easier and easier to trivially commit, so the law likely won't do much.

And btw that case you linked is a hell of a lot more than someone retweeting or upvoting a deepfake.

It covers someone constantly uploading porn of a partner and blackmailing them (even days before the court case), impersonating her online, doxxing her, and sending porn of her to her family members.

It also covers him illegally using her bank account to pay his bills and using her name and information to apply for loans in her name.

That case is a very, very, very, very different situation to someone making a Taylor Swift deepfake.

So different that it calls into question whether you even read past the headline.

[–] danjay 15 points 11 months ago

Shit, I better delete some of my Danny DeVito folders...

[–] [email protected] 11 points 11 months ago

Can't wait to see this do literally nothing to curb the problem while also ensuring people using AI for legitimate research get fucked by morons that don't understand the technology.

There are already laws against slander and revenge porn, and these images would fall under both.

[–] General_Effort 6 points 11 months ago

This appears to be the bill: https://www.documentcloud.org/documents/24397944-defiance-act

For reference, this is the section getting altered: https://www.law.cornell.edu/uscode/text/15/6851

This looks to be an absolute shitshow. I fear it'll be made even worse before it passes. Maybe they'll curtail the abuse potential. Then again, maybe not. It may be seen as hurting the right people.

[–] [email protected] 3 points 11 months ago

This is the best summary I could come up with:


A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence.

“This month, fake, sexually-explicit images of Taylor Swift that were generated by artificial intelligence swept across social media platforms.

Sexualized, exaggerated images of Swift at football games went viral over the weekend on X, racking up tens of millions of views, according to Twitter’s metrics.

Swifties, as the artist’s fans are known, began flooding X with tweets of the phrase “Taylor Swift AI” accompanied by clips of her performing to stymie searches for the images.

Later, as bad press mounted, Elon Musk’s X took the drastic step of prohibiting all searches for Swift to contain the spread of the images.

Musk laid off the majority of the employees responsible for curbing his social network’s worst impulses after he purchased the company for $44bn.


The original article contains 415 words, the summary contains 145 words. Saved 65%. I'm a bot and I'm open source!