this post was submitted on 27 Jan 2024
249 points (88.8% liked)

Not The Onion


Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Comments must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!


cross-posted from: https://mbin.grits.dev/m/mews/t/22301

White House calls for legislation to stop Taylor Swift AI fakes

[–] kescusay -2 points 10 months ago (2 children)

Guys? I'm gonna go ahead and say this, and you get one warning: do not post porn here, especially links to non-consensual AI-generated porn of real people. That's gross, it's inappropriate for the community, it's almost certainly against Lemmy.world's TOS, and anyone who posts it after this warning gets a perma-ban.

Seriously. Ew.

[–] BetaDoggo_ 146 points 10 months ago (6 children)

Nobody cares until someone rich is impacted. Revenge porn has been circulating on platforms uninhibited for many years, but the second it happens to a major celebrity suddenly there's a rush to do something about it.

[–] givesomefucks 85 points 10 months ago

What?

This isn't revenge porn; it's fakes of celebrities.

That's something that's been done for decades, and it was one of the biggest parts of early Reddit. So it's not "the second" either.

The only thing that's changed is that people are generating it with AI.

The ones made without AI (which have been made for decades) are a lot more realistic and a lot more explicit. It just takes skill and time, which is why people were only doing it for celebrities.

The danger of AI is that any random person could take some pictures off social media and make explicit images. The technology isn't there yet, but it won't take much longer.

[–] [email protected] 21 points 10 months ago

I think it's more about the abject danger that unregulated AI replication of noteworthy figures represents to basically everything.

Also, revenge porn is illegal in, I think, every state but South Carolina, and even there it might have been banned since I saw that stat.

[–] [email protected] 19 points 10 months ago

While I agree with the sentiment that rich people's issues have more influence, here's the actual state of the law:

How Many States Have Revenge Porn Laws?

All states, excluding Massachusetts and South Carolina, have separate statutes specifically related to revenge porn. It's important to note, however, that a person may still be prosecuted for revenge porn under other statutes in those two states.

https://www.findlaw.com/criminal/criminal-charges/revenge-porn-laws-by-state.html

[–] Mango 6 points 10 months ago

You think it wasn't celebrities first? The issue here is specifically Taylor Swift.

[–] [email protected] 51 points 10 months ago* (last edited 10 months ago) (3 children)

This wasn't a problem until the rich white girl got it. Now we must do... something. Let's try panic!

-The White House, probably.

[–] frickineh 16 points 10 months ago (2 children)

Honestly, I kind of don't even care. If that's what it takes to get people to realize that it's a serious problem, cool. I mean, it's aggravating, but at least now something might actually happen that helps protect people who aren't megastars.

[–] [email protected] 7 points 10 months ago (6 children)

You must be new to capitalism, lol

[–] voluble 6 points 10 months ago

White House used Panic!

It hurt itself in its confusion!

[–] [email protected] 32 points 10 months ago (3 children)

Do you want more AI gens of nude Taylor Swift? Because that's how you get more AI gens of nude Taylor Swift.

[–] [email protected] 26 points 10 months ago (4 children)

This will be interesting.

How do you write legislation that stops AI nudes but not Photoshopping or art? I am not at all sure it can be done. And even if it can, will it withstand a free-speech test in a courtroom?

[–] macrocarpa 12 points 10 months ago

I think it's not feasible to stop or control it, for several reasons:

  1. People are motivated to consume ai porn
  2. There is no barrier to creating it
  3. There is no cost to create it
  4. There are multiple generations of people who have shared the source material needed to create it.

We joke about Rule 34, right: if you can think of it, there is porn of it. It's now pretty straightforward to fulfil the second part of that, irrespective of the thing you thought of. Those pics of your granddad in his 20s in a navy uniform? Your high school yearbook picture? Six shots of your younger sister shared by an aunt on Facebook? Those are just as consumable by AI as Tay Tay is.

[–] badbytes 23 points 10 months ago (2 children)

Surely this should be a priority.

[–] [email protected] 16 points 10 months ago* (last edited 10 months ago) (1 children)

Well, it's not really just about Swift. There are probably many other people going through this. Not every person who generates nudes of someone else is going to make the news, after all.

I could see this being a problem in high schools as a really mean prank. That is not good. There are a million other ways I could see fake nudes being used against someone.

If someone spread pictures of me naked: 1. I would be flattered, and 2. I'd really ask why anyone wants to see me naked in the first place.

If anything, just an extension of any slander(?) laws would work. It's going to be extremely hard to enforce any law though, so there is that.

However, how long have revenge porn laws been a thing? Were they ever really a thing?

[–] [email protected] 17 points 10 months ago* (last edited 10 months ago)

I remember a headline from a few weeks back; this is already happening in schools. It's really not about Swift.

[–] Bonesy91 18 points 10 months ago (2 children)

This is what the White House is concerned about... Fuck them. There's so much worse going on in America, but oh no, one person has AI fake porn images, heaven forbid!

[–] MirthfulAlembic 17 points 10 months ago (1 children)

The White House is capable of having a position on more than one issue at a time. There also doesn't seem to be a particular bill they are touting, so this seems to be more of a "This is messed up. Congress should do something about it" situation than "We're dropping everything to deal with this" one.

[–] XeroxCool 5 points 10 months ago

Nice job reading the article (any one of these articles) to actually get context instead of just reacting to headlines.

People are asking about Swift. The government isn't buddying up to her specifically. Swift is just the most famous face of an issue that's growing very fast.

[–] cosmicrookie 18 points 10 months ago (3 children)

Wait... They want to stop only Taylor Swift AI fakes? Not every AI fake representing a real person???

[–] ehrik 12 points 10 months ago

Y'all need to read the article and stop rage baiting. It's literally a click away.

"Legislation needs to be passed to protect people from fake sexual images generated by AI, the White House said this afternoon."

[–] AngryCommieKender 12 points 10 months ago (1 children)

Only AI fakes of billionaires. They're just admitting that there's a two-tiered legal system, and if you're below a certain "value," you will not be protected.

[–] cosmicrookie 7 points 10 months ago (2 children)

If the value level is Taylor Swift, we're all doomed.

[–] [email protected] 5 points 10 months ago

Quick! Stock up on Katy Perry fakes before those get banned as well.

[–] [email protected] 17 points 10 months ago* (last edited 10 months ago) (1 children)

U.S. government be like:

Thousands of deep fakes of poor people: I sleep.

Some deep fakes of some privileged Hollywood elite: R E A L S H I T.

[–] thantik 17 points 10 months ago (29 children)

I'd much rather we do nothing and let it proliferate to the point where nobody trusts nudes at all anymore.

[–] [email protected] 16 points 10 months ago

Taylor is just trying to distract us from her jet emissions again, just like her new PR relationship with that Kelce guy was almost certainly meant to distract us from her dating that Matty Healy dude who openly said he enjoys porn that brutalizes black women (and also from her jet emissions).

She's not stupid. She's a billionaire very aware of how news cycles work.

[–] [email protected] 11 points 10 months ago

Oh look, we've got this generation's moral panic figured out.

[–] Anticorp 9 points 10 months ago (3 children)

Is the law going to explicitly protect her and no one else?
