this post was submitted on 08 Dec 2023
395 points (93.2% liked)


‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[–] [email protected] 51 points 11 months ago (7 children)

nakedness needs to stop being an issue

[–] TORFdot0 62 points 11 months ago (1 children)

It’s the sexualization of people without consent that’s the problem. Maybe casual nudity shouldn’t be a problem, but it should be up to the individual whom they share that with. And “nudify” AI models go beyond casual, consensual nudity into sexual objectification and harassment when used without consent.

[–] [email protected] 19 points 11 months ago (1 children)

I want to point out one slight flaw in your argument. Nudity isn’t needed for people to sexually objectify you. And even if it was, the majority of people are able to strip you down in their head no problem.

There’s a huge potential for harassment though, and I think that should be the main concern.

[–] [email protected] -1 points 11 months ago

first, relevant xkcd https://xkcd.com/1432/

second,

Nudity isn’t needed for people to sexually objectify you.

do you really think that makes it less bad? that it's opt-in?

And even if it was, the majority of people are able to strip you down in their head no problem

apparently this app helps them too

[–] [email protected] 37 points 11 months ago (1 children)

Regardless of feelings on that subject, there's also the creep factor of people making these without the subjects' knowledge or consent, which is bad enough, but then these could be used in many other harmful ways beyond one's own... gratification. Any damage "revenge porn" can do, which I would guess most people would say is wrong, this can do as well.

[–] ByteJunk 4 points 11 months ago

I don't think they're really comparable?

These AI pictures are "make believe". They're just a guess at what someone might look like nude, based on what human bodies look like. While apparently they look realistic, it's still a "generic" nude, kind of how someone would fantasize about someone they're attracted to.

Of course it's creepy, and sharing them is clearly unacceptable as it's certainly bullying and harassment. These AI nudes say more about those who share them than they do about who's portrayed in them.

However, sharing intimate videos without consent and especially as revenge? That's a whole other level of fucked up. The AI nudes are ultimately "lies" about someone, they're fakes. Sharing an intimate video, that is betraying someone's trust, it's exposing something that is private but very real.

[–] [email protected] 23 points 11 months ago (3 children)

I agree with you about nudity being an issue, but I think the real problem is this app being used on children and teenagers, who aren't used to, or supposed to be, sexualized.

[–] [email protected] 13 points 11 months ago (1 children)

Fully agree, but I do think that's more an issue of psychology and trauma in our world. Children being nude should not be a big deal; they're kids, you know?

[–] [email protected] 7 points 11 months ago

It shouldn't be a big deal if they choose to be nude someplace that is private for them and where they're comfortable. The people who are using this app to make someone nude aren't really asking for consent. And that brings up another issue: consent. If you have images of yourself posted in public, is consent needed to alter those images? I don't know, but I don't think there is, since it's public domain.

[–] ReluctantMuskrat 13 points 11 months ago (1 children)

It's a problem for adults too. Circulating an AI generated nude of a female coworker is likely to be just as harmful as a real picture. Just as objectifying, humiliating and hurtful. Neighbors or other "friends" doing it could be just as bad.

It's sexual harassment even if fake.

[–] [email protected] 7 points 11 months ago

I think it should officially be considered sexual harassment. Obtain a picture of someone, generate nudes from that picture, it seems pretty obvious. Maybe it should include intent to harm, harass, exploit, or intimidate to make it official.

[–] [email protected] 12 points 11 months ago (4 children)

Nudity shouldn't be considered sexual.

[–] TORFdot0 16 points 11 months ago (1 children)

Not all nudity is, but there is no non-sexual reason to use AI to undress someone without consent.

[–] [email protected] 7 points 11 months ago (1 children)

The question on consent is something I'm trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

[–] TORFdot0 7 points 11 months ago (1 children)

Keep in mind there is a difference between ethical and legal standards. Legally, you may not need consent to alter a photo of someone, except possibly if it was a copyrighted work. But ethically it definitely requires consent, especially in this context.

[–] [email protected] 3 points 11 months ago

The difference between legal and ethical is one could get you fined or imprisoned and the other would make a group of people not like you.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

Just because something shouldn't be doesn't mean it won't be. This is reality and we can't just wish something to be true. Saying it doesn't really help anything.

[–] [email protected] -1 points 11 months ago* (last edited 11 months ago) (1 children)

Whoooooosh.

In societies that have a healthy relationship with the human body, nudity is not considered sexual. I'm not just making up fantasy scenarios.

[–] mossy_ 4 points 11 months ago (1 children)

so because it's not a problem in your culture it's not a problem?

[–] [email protected] 0 points 11 months ago (1 children)

You're just really looking for an excuse to attack someone, aren't you?

[–] mossy_ -1 points 11 months ago

You caught me, I'm an evil villain who preys on innocent lemmings for no reason at all

[–] [email protected] -2 points 11 months ago (1 children)

Take it up with God or evolution then

[–] [email protected] 0 points 11 months ago (1 children)

You can't really be that stupid.

[–] [email protected] 15 points 11 months ago (1 children)

People have a really unhealthy relationship with nudity. I wish we had more nude beaches as it really helps decouple sex from nudity. And for a decent number of people, helps with perceived body issues too.

[–] [email protected] 8 points 11 months ago

Also better education, not just the sex part but overall: critical thinking, reasoning, asking questions, and yes, of course, sex ed.

[–] [email protected] 8 points 11 months ago (3 children)

so you'd be fine with fake nudes of you floating around the internet?

[–] [email protected] 4 points 11 months ago (1 children)

I actually would, but I'm a guy, so I think it is different

[–] [email protected] 4 points 11 months ago (1 children)

I think the nude isn't really the actual issue; it's people gossiping about it and saying you're a slut for doing things you didn't do

[–] omfgnuts -5 points 11 months ago (1 children)

are you 15? people gossiping still bothers you?

[–] [email protected] 1 points 11 months ago

And they've been gossiping and calling each other sluts forever. Depending on the social group, just the accusation alone is enough to harass someone, because kids are idiots, and because it's not even about people believing the accusation is true. The accusation is just a way for a bully to signal to their followers that the target is one of the group's designated scapegoats.

I can't believe I'm about to recommend a teen comedy as a source of educational material, but you should check out the movie Mean Girls if you want to see an illustration of how this kind of bullying works. It's also pretty funny.

[–] omfgnuts -2 points 11 months ago (1 children)

let's see the rules, can a 15y.o use lemmy.world.

ouchiiieee

[–] [email protected] 3 points 11 months ago (1 children)

the world outside, touch grass

[–] [email protected] 3 points 11 months ago

There are real nudes of me floating around the internet, and I'm fine with it.

[–] [email protected] 1 points 11 months ago

I'm pretty squeamish about nudity when it comes to my own body, but fake nudes would not be pictures of my body, so I don't see what there would be for me to be upset about. It might be different if everyone thought they were real, but if people haven't figured out yet any nudes they encounter of someone they know are probably fake, they will soon.

Here's a thought experiment: imagine a world where fake nudes of everyone are available all the time. Would everyone just be devastated all the time? Would everyone be a target of ridicule over it? Would everyone be getting blackmailed? We're probably going to be in that world very soon, and I predict everyone will just get over it and move on. Sharing fake nudes will reflect badly on the person doing it and no one else, and people who make them for their own enjoyment will do so in secret because they don't want to be seen as a creepy loser.

[–] criticalthreshold 1 points 11 months ago* (last edited 11 months ago)

But some people don't agree with you. They're not comfortable with tech that can nudify them for millions to see. So if, and it's possibly an impossible task, there were a way to punish services that facilitate or turn a blind eye to these things, then you bet your ass many, many people would be for criminalizing it.

[–] creditCrazy 0 points 11 months ago (1 children)

Honestly, I don't think that's the problem here. The problem is that we have creeps trying to get a photo of someone nude for wank material.

[–] [email protected] 1 points 11 months ago

I'm genuinely curious: why do you consider this harmful? They might as well be drawing tits by hand on a picture of the "victim".

I mean, sure, I wouldn't want to be a teenage girl in high school right now, but I don't think it's the technology's fault but rather our culture as a society