this post was submitted on 23 Apr 2024
908 points (97.1% liked)


Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has previously taken down several of these ads. Even so, many ads that explicitly invited users to create nudes, and some of the ad buyers behind them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

[–] Zoomboingding 15 points 8 months ago (1 children)

I think the only thing we can do is help by calling this out. AI fakes are just advanced gossip, and people need to realize that.

[–] TwilightVulpine 3 points 8 months ago (1 children)

But it doesn't help. Nobody who is harassed or has their prospects undermined because of AI fakes is helped by repeating that. Especially because, as the technology advances, the only way to verify an image's legitimacy will be to compare it with real intimate pictures, which the person cannot show without being exposed to the exact same treatment.

It also doesn't help that gossip can do all that harm as well, so the point is moot.

Trying to point out that this is illogical and that nudes shouldn't even be such a big deal is an uphill battle against human emotional, social and cultural tendencies. It would take much more than some offhand comments to affect it at all, and I wouldn't count on that shift happening before the harms of AI fakes spread.

[–] Zoomboingding 5 points 8 months ago (1 children)

The ubiquity of AI fakes will necessitate a cultural shift. Honestly, the world is going to be a nightmare of misinformation soon and nudes may very well be the least of our worries.

What other options do we have? An ironclad verification system for all fabricated content? Wildly harsh penalties for anyone caught creating it? The ship has sailed - we won't be able to prevent it from happening.

I'd argue that overexposure will quickly make people accustomed to, and unfazed by, information they don't believe to be true, and used to verifying it with the source. Look at how we treat other fabricated content: if I showed you a screencap of the Pope saying "Fuck", you'd want to verify it with a source directly.

[–] TwilightVulpine 3 points 8 months ago (1 children)

Does it seem to you that people are becoming more likely to verify sources?

Never mind that, like I just said, how exactly do you verify fake porn with the source? Who is going to volunteer their intimate pictures as reference? Or do you really think all it takes to avoid these issues is for the victim to say "that's fake, it's not me"?

Frankly, that sounds like pure wishful thinking to me.

[–] Zoomboingding 4 points 8 months ago

In most cases, the answer should really be "it's none of my business", but yes, it'd involve asking the person whether the images are authentic if you needed to verify for some reason.

But yes, it really is wishful thinking, because it's honestly about to be a shitshow. People are going to start getting credibly framed for things like child porn.