this post was submitted on 21 May 2024
72 points (96.2% liked)


cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

[–] Leg 2 points 1 month ago (1 children)

It's a picture of a hallucination of a tree. Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.

[–] PoliticalAgitator 0 points 1 month ago (1 children)

It's a picture of a hallucination of a tree

So yes, it's a tree. It's a tree that might not exist, but it's still a picture of a tree.

You can't have an image of a child being raped -- regardless of whether that child exists -- that is not CSAM, because it is an image of a child being sexually abused.

Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.

Okay, so who are you volunteering to go through an endless stream of images and videos of children being raped to verify that each one was generated by an AI and not a scumbag with a camera? Paedos?

Why are neckbeards so enthusiastic about dying on this hill? They seem more upset that there's something they're not allowed to jerk off to than by the actual abuse of children.

Functionally, legalising AI-generated CSAM means legalising "genuine" CSAM, because it will be impossible to distinguish the two, especially as paedophiles dump their pre-AI collections or feed them in as training data.

People who do this are reprehensible, no matter what hair splitting and semantic gymnastics they employ.

[–] Leg 0 points 1 month ago (1 children)

Hey man, I'm not the one. I'm literally just saying that the images that AI creates are not real. If you're going to argue that they are, you're simply wrong. Should these ones be generated? Obviously I'd prefer that they not be. But they're still effectively fabrications that I'm better off simply not knowing about.

If you want to get into the weeds and discuss the logistics of enforcing what is essentially thought crime, that is a different discussion I'm frankly not savvy enough to have here. I have no control over the ultimate outcome, but for what it's worth, my money says thought crime will in fact become a punishable offense within our lifetimes, and this may well be an easy catalyst to use to that end. This should put your mind at ease.

[–] PoliticalAgitator 0 points 1 month ago (1 children)

The thread is about "how are they abuse images if no abuse took place" and the answer is "because they're images of abuse". I haven't claimed they're real at any point.

It's not a thought crime because it's not a thought. Nobody is being charged for thinking about raping children, they're being charged for creating images of children being raped.

[–] Leg 1 points 1 month ago

If the images are generated and held by a single person, it may as well be a thought crime. If I draw a picture of a man killing an animal -- an image depicting a heinous crime, spawned by my imagination -- and I go to prison over it, I would consider that a crime of incorrect thought. There are no victims and no animals are harmed, but my will produced an image of a harmed animal, the authorities dictated that I am not allowed to imagine this scenario, and I am punished for it. I understand that it's the expression of the thought that's being punished, but that is very literally the only way to punish a thought to begin with (for now), hence freedom of expression being a protected right.

The reason this is a hard issue to discuss in this context is because the topic at hand is visceral and charged. No one wants to be caught dead defending the rights of a monster, lest they be labeled a monster themselves. I see this as a failure of society to know what to do about people like this, opting instead to throw them into a box and hope they die there. If our justice system wasn't so broken, I might give less of a shit, but as it stands I see this response as shortsighted and inhumane.