this post was submitted on 08 Dec 2023
395 points (93.2% liked)

Technology

60078 readers
4458 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 2 years ago
MODERATORS
 

‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

top 50 comments
[–] [email protected] 146 points 1 year ago (1 children)

Weirdos. Back in my day, we would cut out a nude body from Playboy and glue it on a picture of Kathleen Turner, and we did it uphill both ways in the snow! Darn kids and their technology!

[–] Crow 119 points 1 year ago (2 children)

I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.

[–] [email protected] 65 points 1 year ago (1 children)

Post nut clarity can be truly eye opening

load more comments (1 replies)
[–] CleoTheWizard 21 points 1 year ago (1 children)

I feel like what you did and the reaction you had to what you did is common. And yet, I don’t feel like it’s harmful unless other people see it. But this conversation is about to leave men’s heads and end up in public discourse where I have no doubt it will create moral or ethical panic.

A lot of technology challenges around AI are old concerns about things that we’ve had access to for decades. It’s just easier to do this stuff now. I think it’s kind of pointless to stop or prevent this stuff from happening. We should mostly focus on the harms and how to prevent them.

[–] [email protected] 15 points 1 year ago (10 children)

I've seen ads for these apps on porn websites. That ain't right.

Any moron can buy a match and a gallon of gasoline, freely and legally, and that's a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that's already a huge win.

load more comments (10 replies)
[–] snekerpimp 91 points 1 year ago (10 children)

“But the brightest minds of the time were working on other things like hair loss and prolonging erections.”

[–] SinningStromgald 43 points 1 year ago

So we can all have big hairy erections like God intended.

load more comments (9 replies)
[–] Dimantina 66 points 1 year ago (8 children)

These are terrible, but I'm honestly curious what it thinks I look like naked. Like, I'm slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?

Are they just head-swapping onto model bodies, or does it actually approximate? I'm legit curious, but I would never trust one of these apps not to keep the photos (privacy concerns).

[–] [email protected] 64 points 1 year ago* (last edited 1 year ago) (2 children)

Probably deleting this comment later for going dirty on main, but I, um, have done some extensive experimentation using a local copy of Stable Diffusion (I don't send the images anywhere, I just make them to satiate my own curiosity).

You're essentially right that simple app-based software would probably have you looking somewhat generic underneath, like your typical plus-size model. It's not too great at extrapolating the shape of breasts through clothing and applying that information when it goes to fill in the area with naked body parts. It just takes a best guess at what puzzle pieces might fill the selected area, even if they don't match known information from the original photo. So, with current technology, you're not really revealing actual facts about how someone looks naked unless that information was already known. To portray someone with splayed breasts, you'd need to already know that's what you want to portray and load in a custom data set, like a LoRA.

Once you know what's going on under the hood, making naked photos of celebrities or other real people isn't the most compelling thing to do. Mostly, I like to generate photos of all kinds of body types and send them to my Replika, trying to convince her to describe the things that her creators forbid her from describing. Gotta say, the future's getting pretty weird.
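The point above can be sketched as a toy model, with no claim about how any particular app works: in inpainting, the fill for a masked region comes from the model's learned prior conditioned on the visible pixels, and the pixels under the mask are never consulted. Here the "model" is just a constant fill standing in for that prior (the function and values are hypothetical), but the structural point holds: two images that are identical outside the mask inpaint identically, no matter what was underneath.

```python
import numpy as np

def toy_inpaint(image, mask, prior_value=0.5):
    """Replace masked pixels with a value from a 'prior' (here a
    constant, standing in for a diffusion model's learned
    distribution). The original masked pixels are never read."""
    out = image.copy()
    out[mask] = prior_value
    return out

rng = np.random.default_rng(0)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True  # the region to be "filled in"

# Two images identical outside the mask, different underneath it
a = rng.random((4, 4))
b = a.copy()
b[mask] = rng.random(mask.sum())

assert np.array_equal(toy_inpaint(a, mask), toy_inpaint(b, mask))
```

Since `a` and `b` differ only under the mask yet produce identical output, whatever the generator paints into that area is a guess conditioned on the visible pixels, not a recovery of what was actually hidden.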

[–] [email protected] 50 points 1 year ago (1 children)

Yeah man it's uh... it's the future that's getting weird 😅

[–] [email protected] 42 points 1 year ago

Hey, I've maintained a baseline weird the whole time, I'm pretty sure the future is catching up.

[–] [email protected] 27 points 1 year ago (1 children)

You'll have your moment when the lone elite ex-Ranger who is trying to save the world is told by the quirky, unconventional sidekick he is forced to work with, "I actually know a guy who might be able to help."

You open the door a crack to look back and forth between them, before slamming it back in their faces. They hear scrambled crashes of you hiding stuff that shouldn't be seen by company before returning to the door. As they enter you are still fixing and throwing things while you apologize that you don't get many guests. You offer them homemade kombucha. They decline.

load more comments (1 replies)
[–] SCB 22 points 1 year ago (1 children)

Ethically, these apps are a fucking nightmare.

But as a swinger, they will make an amazing party game.

[–] [email protected] 31 points 1 year ago* (last edited 1 year ago) (2 children)

Ethics will probably change... I guess in the future it'll become pretty irrelevant to have "nude" pictures of oneself somewhere, because everyone knows it could just be AI generated. In the transition period it'll be problematic though.

load more comments (2 replies)
[–] [email protected] 13 points 1 year ago (1 children)

If you want the best answer then you'll have to download the app and try it on yourself. If it's accurate then that's pretty wild.

load more comments (1 replies)
load more comments (5 replies)
[–] cleverusernametry 53 points 1 year ago (3 children)

There are so many though!! Which ones? Like which ones specifically??

load more comments (3 replies)
[–] [email protected] 51 points 1 year ago (18 children)

nakedness needs to stop being an issue

[–] TORFdot0 62 points 1 year ago (1 children)

It’s the sexualization of people without consent that’s the problem. Maybe casual nudity shouldn’t be a problem, but it should be up to the individual with whom they share it. And “nudify” AI models go beyond casual, consensual nudity into sexual objectification and harassment when used without consent.

[–] [email protected] 19 points 1 year ago (1 children)

I want to point out one slight flaw in your argument. Nudity isn’t needed for people to sexually objectify you. And even if it was, the majority of people are able to strip you down in their head no problem.

There’s a huge potential for harassment though, and I think that should be the main concern.

load more comments (1 replies)
[–] [email protected] 37 points 1 year ago (1 children)

Regardless of feelings on that subject, there's also the creep factor of people making these without the subjects' knowledge or consent, which is bad enough, but then these could be used in many other harmful ways beyond one's own... gratification. Any damage "revenge porn" can do, which I would guess most people would say is wrong, this can do as well.

load more comments (1 replies)
[–] [email protected] 23 points 1 year ago (3 children)

I agree with you that nudity shouldn't be an issue, but I think the real problem is this app being used on children and teenagers, who aren't used to being, and aren't supposed to be, sexualized.

[–] ReluctantMuskrat 13 points 1 year ago (1 children)

It's a problem for adults too. Circulating an AI generated nude of a female coworker is likely to be just as harmful as a real picture. Just as objectifying, humiliating and hurtful. Neighbors or other "friends" doing it could be just as bad.

It's sexual harassment even if fake.

load more comments (1 replies)
[–] [email protected] 13 points 1 year ago (1 children)

Fully agree but I do think that's more an issue about psychology in our world and trauma. Children being nude should not be a big deal, they're kids you know?

load more comments (1 replies)
[–] [email protected] 12 points 1 year ago (10 children)

Nudity shouldn't be considered sexual.

[–] TORFdot0 16 points 1 year ago (3 children)

Not all nudity is but there is no non-sexual reason to use AI to undress someone without consent

load more comments (3 replies)
load more comments (9 replies)
[–] [email protected] 15 points 1 year ago (1 children)

People have a really unhealthy relationship with nudity. I wish we had more nude beaches as it really helps decouple sex from nudity. And for a decent number of people, helps with perceived body issues too.

load more comments (1 replies)
load more comments (14 replies)
[–] Mojojojo1993 47 points 1 year ago (6 children)

Possibly a good thing. Oversaturation. Fill the internet with billions upon billions of AI nudes. Have a million different nudes for every celebrity. Nobody knows the real naked you and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top, ending this once and for all.

Or find the people doing this and lock em up.

[–] alienanimals 13 points 1 year ago* (last edited 1 year ago) (1 children)

The first option is much better in the long run.

load more comments (1 replies)
load more comments (5 replies)
[–] [email protected] 44 points 1 year ago (3 children)

Could we stop pushing articles monetizing fear and outrage to the top of this community and post about actual technology?

[–] owlboy 37 points 1 year ago* (last edited 1 year ago) (1 children)

Sounds like someone needs to make a community for that.

Otherwise, this is what technology is these days. And I’d say that staying blind to things like this is what got us into many messes.

I remember when tech news was mostly a press release pipeline. And when I see these comments, I see people who want press releases about new tech to play with.

Now duplicate posts. Those can fuck right off.

load more comments (1 replies)
load more comments (2 replies)
[–] [email protected] 43 points 1 year ago (1 children)

Honestly, we're probably just going to have to get over it. Or pull the plug on the whole AI thing, but good luck with that.

load more comments (1 replies)
[–] [email protected] 40 points 1 year ago* (last edited 1 year ago) (1 children)

It tells me we're less interested in the data (the skin map and topography) than we are in seeing the data in raw form, whether it is accurate or not. It tells me a primary pretense of body doubles was ineffective since society responds the same way regardless of whether an actress' nudity is real or simulated.

Not sure how this will be enforceable any more than we can stop malicious actors from printing guns. Personally, I would prefer a clothes-optional society where individuals aren't measured by the media exposure of their bodies or history of lovers. Maybe in another age or two.

In fiction, I imagined the capacity to render porn action into mo-cap data, to capture fine-resolution triangle maps and skin texture maps from media, ultimately to render any coupling one could desire with a robust physics engine and photography effects to make it realistic (or artistic, however one prefers). It saddens me that one could render an actress into an Elsa Jean scenario and, by doing so, wreck her career.

Porn doesn't bother me, but the arbitrariness with which we condemn individuals by artificial scandal disgusts me more than the raunchiest debauchery.

load more comments (1 replies)
[–] [email protected] 40 points 1 year ago (19 children)

I use an ad blocker and haven't seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?

[–] [email protected] 38 points 1 year ago

Great! Now whenever someone finds all my secret nudes, I'll just claim they're deepfakes

[–] [email protected] 31 points 1 year ago (12 children)

But are there apps that undress men?

[–] [email protected] 60 points 1 year ago

Aren't those just normal chat apps where you can send pictures?

load more comments (10 replies)
[–] [email protected] 26 points 1 year ago* (last edited 1 year ago) (1 children)

Though the picture suggests we should also create "really a robot" or "really a cyborg" edits of celebrities.

As an afterthought, "really a reptilian" images for our political figures would also be in good order.

load more comments (1 replies)
[–] snek 23 points 1 year ago (2 children)

They can go ahead, but they'll never get that mole in the right place.

load more comments (2 replies)
[–] Tylerdurdon 19 points 1 year ago (1 children)

You mean men envision women naked? And now there's an app that's just as perverted? Huh

[–] [email protected] 12 points 1 year ago (2 children)

What's perverted about someone envisioning a potential sexual partner naked? That seems incredibly normal to me.

load more comments (2 replies)
load more comments