this post was submitted on 05 Oct 2023
36 points (69.6% liked)

Unpopular Opinion


Welcome to the Unpopular Opinion community!


How voting works:

Vote the opposite of the norm: if you agree that the opinion is unpopular, give it an arrow up. If it's something that's widely accepted, give it an arrow down.



Guidelines:

Tag your post, if possible (not required):

  • If your post is a "General" unpopular opinion, start the subject with [GENERAL].
  • If it is a Lemmy-specific unpopular opinion, start it with [LEMMY].


Rules:

1. NO POLITICS


Politics is everywhere. Let's keep this community about [general]- and [lemmy]-specific topics, and keep politics out of it.


2. Be civil.


Disagreements happen, but that doesn't give anyone the right to personally attack others. No racism/sexism/bigotry. Please also refrain from gatekeeping others' opinions.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Shitposts and memes are allowed but...


Only until they prove to be a problem. They can and will be removed at moderator discretion.


5. No trolling.


This shouldn't need an explanation. If your post or comment is made just to get a rise out of people, with no real value, it will be removed. If you do this too often, you will get a vacation away from this community for one or more days to touch grass. Repeat offenses will result in a permanent ban.



Instance-wide rules always apply. https://legal.lemmy.world/tos/

founded 1 year ago

It seems crazy to me, but I've seen this concept floated on several different posts. There seem to be a number of users here who think there is some way AI-generated CSAM will reduce real-life child victims.

Like the comments on this post here.

https://sh.itjust.works/post/6220815

I find this argument crazy. I don't even know where to begin describing how many ways this will go wrong.

My views (which are apparently not based in fact) are that AI CSAM is not really that different from "actual" CSAM. It will still cause harm when viewed, and it is still based on the further victimization of the children involved.

Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real, living victims completely ignores the reality of how AI content is created.

Some have compared pedophilia and child sexual assault to a drug addiction, which is dubious at best, and pretty offensive in my opinion.

Using drugs has no inherent victim, and it is not predatory.

I could go on, but I'm not an expert or a social worker of any kind.

Can anyone link me articles talking about this?

[–] PM_Your_Nudes_Please 6 points 1 year ago* (last edited 1 year ago)

While I agree that studies would help, actually performing those studies has historically been very difficult. Because the first step to doing a study on pedophilia is actually finding a significant enough number of pedophiles who are willing and able to join the study. And that by itself is a tall order.

Then you ask these pedophiles (who are for some reason okay with admitting to the researchers that they are, in fact, pedophiles) to self-report their crimes. And you expect them to be honest? Any statistician will tell you that self-reported data is consistently the least reliable data, and that’s doubly unreliable when you’re basically asking them to give you a confession that could send them to federal prison.

Or maybe you try going the court records/police FOIA request route? Figure out which court cases deal with pedos, then figure out if AI images were part of the evidence? But that has issues of its own, because you're specifically excluding all the pedos who haven't offended or been caught; you're only selecting the ones who have been taken to court, so your entire sample pool is biased. You're also missing any pedos who have sealed records or sealed evidence, which is fairly common.

Maybe you go the anonymous route. Let people self report via a QR code or anonymous mail. But a single 4chan post could ruin your entire sample pool, and there’s nothing to stop bad actors from intentionally tainting your study. Because there are plenty of people who would jump at a chance to make pedos look even worse than they already do, to try and get AI CSAM banned.

The harsh reality is that studies haven't been done because there simply isn't a reliable way to gather data while controlling for bias. With pedophilia being taboo, any pedophile will be dissuaded from participating, because it means potentially outing yourself as a pedophile. And at that point, your best-case scenario is having enough money to ghost your entire life.