this post was submitted on 21 May 2024
72 points (96.2% liked)

Technology


cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

top 50 comments
[–] [email protected] 29 points 1 month ago (17 children)

How are they abuse images if no abuse took place to create them?

[–] PoliticalAgitator 4 points 1 month ago* (last edited 1 month ago) (2 children)

Because they are images of children being graphically raped, a form of abuse. Is an AI generated picture of a tree not a picture of a tree?

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago) (14 children)

No it isn't, not any more than a drawing of a car is a real car, or drawings of money are real money.

[–] PoliticalAgitator 4 points 1 month ago (13 children)

Material showing a child being sexually abused is child sexual abuse material.

[–] Leg 2 points 1 month ago (4 children)

It's a picture of a hallucination of a tree. Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.

[–] sxt 4 points 1 month ago (3 children)

If the model was trained on CSAM, then it is dependent on abuse.

[–] Darrell_Winfield 25 points 1 month ago (1 children)

That's a heck of a slippery slope I just fell down.

If AI-generated responses can carry criminal liability because of crimes in their training data, then we could all be held liable for any text GPT produces, since it's trained on Reddit data and has likely ingested multiple instances of brigading, swatting, manhunts, etc.

[–] laughterlaughter 2 points 1 month ago

You just summarized the ongoing ethical concerns experts and common folk alike have been talking about in the past few years.

[–] [email protected] 19 points 1 month ago

As I said in my other comment, the model does not have to be trained on CSAM to create images like this.

[–] Jimmyeatsausage 1 points 1 month ago (1 children)

That's irrelevant; any realistic depiction of children engaged in sexual activity meets the legal definition of CSAM. Even using filters on images of consenting adults could qualify as CSAM if the intent was to make the actors appear underage.

[–] laughterlaughter 2 points 1 month ago (1 children)

I mean... regardless of your moral point of view, you should be able to answer that yourself. Here's an analogy: suppose I draw a picture of a man murdering a dog. It's an animal abuse image, even though no actual animal abuse took place.

[–] [email protected] 2 points 1 month ago (1 children)

It's not though, it's just a drawing.

[–] laughterlaughter 2 points 1 month ago (1 children)

Except that it is an animal abuse image, drawing, painting, fiddle, whatever you want to call it. It's still the depiction of animal abuse.

Same with child abuse, rape, torture, killing or beating.

Now, I know what you mean by your question. You're trying to establish that the image/drawing/painting/scribble is harmless because no actual living being suffered. But that doesn't mean that they don't depict it.

Again, I'm seeing this from a very practical point of view. However you see these images through the lens of your own morals or points of view, that's a totally different thing.

[–] [email protected] 2 points 1 month ago (1 children)

And when characters are killed on screen in movies, are those snuff films?

[–] laughterlaughter 2 points 1 month ago

No, they're violent films.

Snuff is a different thing, because it's supposed to be real. Snuff films depict violence in a very real sense, so they're violent. Fiction films also depict violence, and so they're violent too. It's just that they're not about real violence.

I guess what you're really trying to say is that "Generated abuse images are not real abuse images." I agree with that.

But at face value, "Generated abuse images are not abuse images" is incorrect.

[–] [email protected] 20 points 1 month ago (4 children)

13,000 images can be generated relatively fast. My PC needs about 5 seconds per picture with SD (depending on settings, of course). So, not even a day.
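
For what it's worth, the arithmetic checks out; a rough back-of-the-envelope sketch in Python, using the commenter's own 5-seconds-per-image estimate (which obviously varies with hardware and settings):

```python
# Rough estimate only: the 5 s/image figure is the commenter's own and
# depends heavily on GPU, model, resolution, and sampler settings.
images = 13_000
seconds_per_image = 5

total_seconds = images * seconds_per_image
total_hours = total_seconds / 3600

print(f"{total_hours:.1f} hours")  # ~18.1 hours, i.e. well under a day
```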

Also, if pedos would only create their own shit to fap to, I would consider this a win.

[–] [email protected] 20 points 1 month ago (1 children)

Sensitive topic - obviously.

However, these guard-rail laws and “won’t someone think about the children” cases are a reeeeally easy way for the government to take more power away from the people.

That said, I believe that, if handled correctly, banning this sort of thing is absolutely necessary to combat the mental illness that is pedophilia.

[–] laughterlaughter 8 points 1 month ago (2 children)

I don't condone child sexual abuse, and I'm definitely not a pedo (gosh, I can't believe I have to state this.)

But how does banning AI-generated material help combat a mental illness? The mental illness will still be there, with or without images...

[–] Leg 4 points 1 month ago (1 children)

There's something to be said for making it as difficult as possible to enable the behavior. Though that does run the risk of a particularly frustrated individual doing something despicable to an actual child. I don't exactly have the data on how all this plays out, and frankly I don't want to be the one to look into it. Society isn't particularly equipped to handle an issue like this, though, focusing on stigma alone to kinda try to shove it under the rug.

[–] laughterlaughter 3 points 1 month ago

Your second sentence is exactly what I was thinking of. The big issue with pedophilia is that kids can be easily manipulated (or forced!) into heinous acts. Otherwise, what's the difference from regular porn about prisoners, slavery, necrophilia, etc.? Would we say that people who consume rape-fantasy porn will go out and rape? A dude who is sexually attracted to women isn't raping women left and right all year round, because he knows it's wrong, and we don't label every heterosexual male a creep, so why would this be different with other kinds of attraction?

But anyway, I'm not saying anything that hasn't been discussed before, I'm sure. I'm just glad I don't have that condition (or anything similar, like being attracted to volcanoes), otherwise life would definitely suck.

[–] [email protected] 2 points 1 month ago* (last edited 1 month ago) (7 children)

Mainly, it's a problem of enabling the behavior, as others have mentioned.

It's not a solution, per se. It doesn't solve anything specifically, but it doesn't have to. It's about making the material less accessible, imposing harsher consequences, and so on, to put more pressure on not continuing to participate in the activity. Ultimately it boils down to mental health and trauma. Pedophilia is a paraphilic disorder at the end of the day.

[–] [email protected] 19 points 1 month ago

70 years for... Generating AI CSAM? So that's apparently worse than actually raping multiple children?

[–] [email protected] 10 points 1 month ago (4 children)

It seems weird that the AI companies aren't being held responsible too.

[–] [email protected] 17 points 1 month ago

It's open source code that someone ran on their own computer; it's not like he used paid OpenAI credits to generate the image.

It would also set a bad precedent - it would be like charging Solomons & Fryhle because someone used their (absolutely ubiquitous) organic chemistry textbook to create methamphetamine.
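
As an aside, the "open source code run on your own computer" point is easy to see concretely. A minimal sketch, assuming the Hugging Face diffusers library and a publicly released Stable Diffusion checkpoint; the model name and prompt are purely illustrative:

```python
# Minimal local Stable Diffusion sketch (assumes the "diffusers" and "torch"
# packages plus a public SD checkpoint; nothing here calls a paid hosted API).
import torch
from diffusers import StableDiffusionPipeline

# Illustrative checkpoint name; any locally available SD weights would do.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # runs entirely on the user's own GPU

image = pipe("a watercolor painting of a tree").images[0]
image.save("tree.png")
```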

[–] [email protected] 7 points 1 month ago (1 children)

Well, the American way is not to hold the company accountable (e.g., school shootings), so yeah.

[–] [email protected] 2 points 1 month ago (1 children)

I'm pretty sure you can't hold a school liable for a school shooting

[–] stanleytweedle 13 points 1 month ago* (last edited 1 month ago) (8 children)

I think they were referring to the firearm manufacturer and/or seller.

[–] Reddfugee42 5 points 1 month ago

Was Kodak ever held responsible for original CSAM?

[–] [email protected] 4 points 1 month ago

I think Stable Diffusion is an open source AI model you can run on your own computer, so I don't see why the developers should be held responsible for that.

[–] [email protected] 8 points 1 month ago (1 children)

The basis of making CSAM illegal was that minors are harmed during the production of the material. Prior to CG, the only way to produce pornographic images involving minors was to use real, flesh-and-blood minors. But if no minors are harmed to create CSAM, then what is the basis for making that CSAM illegal?

Think of it this way: if I make a pencil drawing of a minor being sexually abused, should that be treated as though it is a criminal act? What if it's just stick figures, and I've labeled one as being a minor, and the others as being adults? What if I produce real pornography using real adults, but used actors that appear to be underage, and I tell everyone that the actors were all underage so that people believe it's CSAM?

It seems to me that, rationally, things like this should only be illegal when real people are being harmed, and that when there is no harm, they should not be illegal. You can make an entirely reasonable argument that pornographic images created using a real person as the basis do cause harm to the person being so depicted. But if it's not any real person?

This seems like a very bad path to head down.

[–] [email protected] 1 points 1 month ago

Of course he did. That's the world we live in.
