[–] RickRussell_CA 30 points 1 month ago (3 children)

I suppose the only thing I disagree with is that the law can do anything about it. Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It's the wild west.

[–] [email protected] 14 points 1 month ago (4 children)

I think it's best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.

You can't target the technology, or stop people from using AI to do perverted things, but if they get caught, we should at least respond to the problem.

I don't know what a proactive response to this issue looks like. Maybe better public education and a culture that encourages more respect for others?

[–] [email protected] 17 points 1 month ago

I think it's best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.

So... where do you draw the line, exactly? Does this include classic photo manipulation too? Written stories (fanfic)? Sketching or doodling a nude figure with a name pointing at it? Dirty thoughts that someone has about someone else? I find this response highly questionable and authoritarian. Calling it abuse also trivializes actual abuse, which I, as an abuse victim, find pretty objectionable. If I could swap what was done to me for someone making porn of "me" and getting their rocks off to that, I'd gladly make that exchange.

[–] RickRussell_CA 1 points 4 weeks ago

I feel I was misconstrued. 1. a law will probably happen, and 2. it will do fuck all because the tool chain and posting/sharing process are going to be completely anonymous.

Yeah, in specific cases where you can determine deepfake revenge porn of Person A was posted by Person B who had an axe to grind, you might get a prosecution. I just don't think the dudes making porn on their Nvidia GPUs of Gal Gadot f*ckin Boba Fett are ever gonna get caught, and the celebrity cat will stay forever out of the bag.

[–] foggy 5 points 1 month ago

Yeah, it's a hydra.

Cut off the head, 3 more grow back.

[–] Grimy 3 points 1 month ago* (last edited 1 month ago) (1 children)

You can't ban the tech, but you can ban the act so it's easier to prosecute people who upload deepfakes of their co-workers.

[–] [email protected] 0 points 1 month ago (1 children)

That's already illegal in most countries, regardless of how it was made. It also has nothing to do with "AI".

[–] Grimy 4 points 1 month ago* (last edited 1 month ago)

Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It's the wild west.

I was referring to that part of his comment. It is also not at all illegal in most countries. It's only illegal at the state level in the US, for example, and not in all of them either. Canada only has 8 provinces with legislation against it.

I do agree, though, that it's not the software's fault. Bad actors should be punished and nothing more.

[–] Veraxus 26 points 1 month ago (3 children)

I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real or in a way that it could be construed as true/real.

Anything other than that narrow application is an infringement on the First Amendment.

[–] [email protected] 12 points 1 month ago (1 children)

I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real or in a way that it could be construed as true/real.

I would love that solution, but it definitely wouldn't have bipartisan support.

[–] Veraxus 3 points 1 month ago (1 children)

There are certain political groups that have a vested interest in lying, deceiving, manipulating, and fabricating to get what they want. So… yeah. 😞

[–] [email protected] 0 points 1 month ago

I feel that's just most political groups nowadays. Not implying both sides are the same, just that everyone likes their lies.

[–] [email protected] 12 points 1 month ago (1 children)

The majority of "AI" generated / altered porn is already very much labeled as such though.

[–] Veraxus 8 points 1 month ago

Exactly. Photoshop has been around for decades. AI is just more of the same. I find it weird how, as technology evolves, people keep fixating on the technologies themselves rather than the universal (sometimes institutional) patterns of abuse.

[–] Sorgan71 2 points 1 month ago (1 children)

But that just is illegal already.

[–] Veraxus 3 points 1 month ago* (last edited 1 month ago)

It’s not, though. Not remotely. At least not in the US.

Defamation of an individual (including “individual” entities like an org or business) is purely a civil matter, and defamation in a broader sense, such as against “antifa” or “leftists” or “Jews” or “gays” et al, has no remedy whatsoever, civil or criminal.

[–] Plague_Doctor 7 points 1 month ago (1 children)

Just another reason why we can't ethically introduce AI.

[–] foggy 17 points 1 month ago

Pandora's box has already been cracked way open. Shit is already in military applications.

[–] TheBananaKing 1 points 1 month ago (1 children)

Can AIs really consent, though?

[–] [email protected] 4 points 1 month ago (2 children)

The issue is not with all forms of pornographic AI, but more about deepfakes and nudifying apps that create nonconsensual pornography of real people. It is those people's consent that is being violated.

[–] [email protected] 1 points 1 month ago (1 children)

No one cares if you consent to being drawn. The problem here isn't the consent of the depicted person; it's that the viewer is being misled. That's why the moral quandary goes away entirely if the AI porn is clearly labeled as such.

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago)

I don't think that this is really true. I strongly suspect that most people I know would consider someone drawing porn of them without consent a majorly icky thing to do, and would probably consider anyone doing that to someone else a creep. The reason such drawings are less of an issue is at least partly that their barrier to entry is higher: it takes a certain amount of skill and time investment to draw something like that well enough to be clearly recognizable as a specific real person, whereas AI lowers that barrier.

[–] [email protected] -1 points 1 month ago* (last edited 1 month ago) (1 children)

I still don't understand why this is an issue now when decades of photo editing didn't bother anyone at all.

[–] [email protected] 4 points 1 month ago (1 children)

I mean, it did bother people; it just took enough skill and time with photo-manipulation software to make it look convincing that it was rare for someone to both have the expertise and be willing to put in the time, so it didn't come up often enough to be a point of discussion. AI just makes it quick and easy enough to become more common.

[–] [email protected] -1 points 1 month ago (1 children)

Regular editing is much easier and quicker than installing, configuring, and using Stable Diffusion. People acting like "AI" is a one-click solution that gets you convincing-looking images have probably never used it.

[–] [email protected] 2 points 1 month ago (1 children)

It literally is a one-click solution. People are running nudifying sites that use CLIP, GroundingDINO, SegmentAnything, and Stable Diffusion to autonomously nudify people's pictures.

These sites (which I won't even mention the names of), just ask for a decent quality photo of a woman wearing a crop top or bikini for best results.

The people who have the know-how to set up Stable Diffusion and all these other AI photomanipulation tools are using those skills to monetize sexual exploitation services. They're making it so you don't need to know what you're doing to participate.

And sites like Instagram, which are filled with millions of exploitable images of women and girls, have allowed these perverted services to advertise their warez to their users.

It is now many orders of magnitude easier than it ever has been in history to sexually exploit people's photographs. That's a big deal.

[–] [email protected] -1 points 1 month ago (1 children)

If you wanna pay for that then you do you. lol But at that point you could've also paid a shady artist to do the work for you too.

Also, maybe don't pose half naked on the internet already if you don't want people to see you in a sexual way. That's just weird, just like this whole IG attention whoring of people nowadays. And no, this isn't even just a women thing. Just look how thirsty women get under the images of good looking dudes that pose topless, or just your ordinary celeb doing ordinary things (Pedro Pascal = daddy, and yes, that includes more explicit comments too).

This hypocritical fake outrage is just embarrassing.

[–] [email protected] -2 points 1 month ago (2 children)

Anyone could run it on their own computer these days, fully local. What could the government do about that even if they wanted to?

[–] Carrolade 8 points 1 month ago

The government's job is not to prevent crime from happening; that's dystopian-tier stuff. Their job is to determine what the law is and apply consequences to people after they are caught breaking it.

The job of preventing crime from happening in the first place mainly belongs to lower-level community institutions, starting with parents and teachers.

[–] jeffw 6 points 1 month ago (1 children)

Anyone can make CSAM in their basement, what could the government do about that even if they wanted to?

Anyone can buy a pet from a pet store, take it home and abuse it, why is animal abuse even illegal?

Should I keep going with more examples?

[–] [email protected] -4 points 1 month ago (1 children)

What do you want them to do, have constant monitoring on your computer to see what applications you open? Flag suspicious GPU or power usage and send police to knock on your door? Abusing people or animals requires real outside involvement. You are equating something that a computer generates with real life, when the two have nothing to do with each other.

[–] jeffw 7 points 1 month ago* (last edited 1 month ago) (1 children)

Who is suggesting that?

Murder is illegal, do we surveil everyone who owns a gun or knife?

CSAM is illegal, do cameras all report to the government?

Again, that’s just 2 examples. Lmk if you want more

[–] [email protected] 1 points 1 month ago (1 children)

Maybe my wording is unclear. I am wondering how they should be expected to detect it in the first place. Murder leaves a body. Abuse leaves a victim. Generating files on a computer? Nothing of the sort, unless it is shared online. What would a new regulation achieve that isn't already covered by the illegality of 'revenge porn'? Furthermore, how could they possibly detect anything beyond that without the massive privacy breaches I wrote about before?