this post was submitted on 21 May 2024

cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

[–] [email protected] 3 points 4 months ago* (last edited 4 months ago) (3 children)

No it isn't, any more than a drawing of a car is a real car, or drawings of money are real money.

[–] PoliticalAgitator 4 points 4 months ago (1 children)

Material showing a child being sexually abused is child sexual abuse material.

[–] [email protected] 0 points 4 months ago (1 children)

And an AI-generated image does not show a child being abused.

[–] PoliticalAgitator 3 points 4 months ago (1 children)
[–] [email protected] -1 points 4 months ago (1 children)

There is no child being abused in a generated image or a drawing.

[–] PoliticalAgitator 4 points 4 months ago* (last edited 4 months ago) (1 children)

If Paedophile Hill is the hill you want to die on, it's no loss to me, so I've got zero interest in your "Ceci n'est pas une child rape" defense.

[–] [email protected] 0 points 4 months ago* (last edited 4 months ago) (2 children)

And yet you still engaged with it. If we're gonna classify every picture/drawing/gen that makes people uncomfortable as CSAM, it distracts from the actual CSAM that is running rampant.

[–] [email protected] 2 points 4 months ago (1 children)

Seems a lot like you're just promoting CSAM at that point.

I'm sure that will have absolutely no effect on the pedophiles who are attracted to children acting on their desires. /s

[–] [email protected] -1 points 4 months ago* (last edited 4 months ago) (1 children)

It distracts from the very real issue of actual CSAM being produced and spread by pieces of shit like the guy in the article. Instead, people are going to focus on the guy using Stable Diffusion to generate images that are not the result of actual abuse. It's exactly what companies like OpenAI and Microsoft want: demonization of open-source AI projects. And you and many others in this thread are falling for it.

[–] PoliticalAgitator 1 points 4 months ago (1 children)

Oh yeah, opposition to videos showing the graphic rape of children is all just a big tech conspiracy. Fuck off scumbag.

[–] [email protected] 1 points 4 months ago (1 children)

You're literally just making shit up to be offended about now.

[–] PoliticalAgitator 1 points 4 months ago

Take your pills.

[–] PoliticalAgitator 1 points 4 months ago* (last edited 4 months ago) (1 children)

I'm not engaging for your benefit, which is why I've got no interest in repeating the same point in 500 ways in the hope it sinks in. But the reality is that a lot of people get their opinions from social media and they sure as fuck shouldn't imitate your views on CSAM so it's important that nobody mistakes contrarianism and apologism for actual wisdom.

But yes, it is hard to stand by while you lie your little heart out in a way that helps paedophiles. I'm not ashamed or embarrassed about that.

So here's how it will play out: your bullshit apologism and enabling will result in the creation of platforms for circulating child pornography. These platforms will immediately be flooded with pictures and videos of children being raped that are indistinguishable from "genuine" child pornography, thanks to models being trained on paedophiles' back catalogues.

As the amount of content grows, more and more videos of actual children being raped will enter circulation, with moderators and paedophiles wriggling out of it by claiming "I thought it was AI generated".

New videos featuring the rape of actual children will be created and posted to these communities as child pornography normalises the abuse of children for the members. Detection and prosecution of the people responsible will be functionally impossible because they've been buried and obfuscated by the AI-generated content you insist doesn't count.

But hey, at least your bullshit semantic sensibilities haven't been offended, right? That seems way more important to you than the abuse of children anyway. You're basically a hero for selflessly safeguarding paedophiles' jerk-off material.

We're not talking about "drawings of children being raped that make people uncomfortable". We're talking about pictures and videos that are indistinguishable from reality, featuring children being coerced or forced into performing every act and fetish known to pornography.

And you fucking know it.

[–] [email protected] 1 points 4 months ago (1 children)

There are no pictures or videos generated by AI that are indistinguishable from real CSAM, because real CSAM requires a child to be abused to create it.

All of those things you mention already happen all over social media.

[–] PoliticalAgitator 1 points 4 months ago

You're getting even less plausible but still desperately clinging to flawed rhetoric that only benefits paedophiles.

[–] laughterlaughter 1 points 4 months ago (1 children)

Nobody is saying they're real, and I now see what you're saying.

By your answers, your question is more "at-face-value" than people assume:

You are asking:

"Did violence occur in real life in order to produce this violent picture?"

The answer is, of course, no.

But people are interpreting it as:

"This is a picture of a man being stoned to death. Is this picture violent, if no violence took place in real life?"

To which the answer is, yes.

[–] [email protected] 0 points 4 months ago (2 children)

It can be abhorrent and unlikable; it's still not abuse.

[–] laughterlaughter 2 points 4 months ago (1 children)

We're not disagreeing.

The question was:

"Is this an abuse image if it was generated?"

Yes, it is an abuse image.

Is it actual abuse? Of course not.

[–] [email protected] 0 points 4 months ago (2 children)

And yet it's being treated as though it is.

[–] PoliticalAgitator 0 points 4 months ago* (last edited 4 months ago)

Images of children being raped are being treated as images of children being raped. Nobody has ever been caught with child pornography and charged as if they abused the children themselves, nor is anybody advocating that people generating AI child pornography be charged as if they sexually abused a child.

Everything is being treated as it always has been, but you're here arguing that it's moral and harmless as long as an AI does it, using every semantic trick and shifted goalpost you possibly can.

It's been gross as fuck to watch. I know you're aiming for a kind of "king of rationality, capable of transcending even your disgust of child abuse" thing, but every argument you make is so trivial and unimportant that you're coming across as someone hoping CSAM becomes more accessible.

[–] laughterlaughter 0 points 4 months ago

Well, that's another story. I just answered your question. "Are these images about abuse even if they're generated?" Yup, they are.

"Should people be prosecuted because of them?" Welp, someone with more expertise should answer this. Not me.

[–] [email protected] 0 points 4 months ago (1 children)

No, genius, it's just promoting abuse. Have a good day.

[–] [email protected] 1 points 4 months ago (1 children)

Just like violent video games produce school shooters.

[–] PoliticalAgitator 1 points 4 months ago (1 children)

You've already fucked up your own argument. You're supposed to be insisting there's no such thing as a "violent video game", because representations of violence don't count, only violence done to actual people.

[–] [email protected] 0 points 4 months ago (1 children)

If you can't follow a simple line of logic to explain a counterpoint, that's on you.

[–] PoliticalAgitator 1 points 4 months ago

I understood it just fine.

[–] [email protected] 0 points 4 months ago (1 children)

Oops, you forgot to use logic. As per the comment you're replying to, the more apt analogy would be: is an AI-generated picture of a car still a picture of a car?

[–] [email protected] 1 points 4 months ago

That has nothing to do with logic? It's pointing out that both drawings and AI gens are not really the things they might depict.