this post was submitted on 07 Sep 2023
Technology
It's obviously very distasteful but those needs don't just go away. If people with that inclination can't satisfy their sexual urges at home just looking at porn, it seems more likely they're going to go out into the world and try to find some other way to do it.
Also, controlling what people do at home when it isn't affecting anyone else, even in a case like this, isn't likely to target exactly just those people, and it's very likely not to stop there either. I'd personally be very hesitant to ban/persecute stuff like that unless there was actual evidence that it was harmful and that the cure wasn't going to be worse than the disease.
Even if the imagery was 100% computer-generated, without a training model based on abuse, at some point it becomes less a question of "is somebody being hurt" and more of "what is acceptable in civilized society". If we accept CG CSAM, then what else? Gore porn? Snuff? Bestiality? How about a child being sexually assaulted and mutilated by animals at the same time? There is always stuff that's going to push the envelope further and further, and how do you even tell if real individuals are involved, if it's CG, or some combination of the two?
If they're all CG, then nobody real is being hurt BUT there's still gotta be a line between acceptable and unacceptable. Society - at least western society- by and large has decided that the majority of those categories are not acceptable by law, so regardless of how it's made it's still illegal, and the dissemination of such can still have a harmful effect on society in general.
There's a very real argument that it does mess with the consumer's head. The person being hurt is the consumer of the media, and while some unhealthy behaviors can be okay within reason, there are also risks in terms of how it makes the consumer view the world and the people in it. Violent video games don't make people more violent, but imagery in porn does affect how consumers of porn view people and relationships.
I think this is a valid point, and there may indeed be a line in terms of what kind of graphic AI-gen image can or cannot be freely distributed. This should be based on evidence-supported research with the goal of reducing harmful behavior, even if our gut instinct is that banning these things outright is the best way to reduce harm. We need to follow the science.
My personal opinion is stronger when it comes to AI-generated nude images of minors in non-sexual situations. Again, any decisions we make should be evidence-based, but I suspect we will find that the prohibition of nude imagery of children makes the risks of pedophilia and sexual assault against children greater, not reduced.
Imagery of nude bodies in non-sexual situations tends to be considered sexual precisely because of its rarity. In societies where nudity is normalized, nude bodies are not considered sexual in non-sexual contexts, and indeed, countries with this kind of culture tend to exhibit lower levels of sexual deviancy. In America in particular, nudity is very taboo and child nudity even more so. This Puritanical belief that hiding bodies protects people from sexual urges is, I believe, misplaced. To the contrary, it tends to fetishize the nude form, especially that of minors developing secondary sexual characteristics.
This ratcheting effect of hiding child nudity more and more has led to a reality where our society as a whole cannot break free of the prohibition without putting our most vulnerable populace at severe risk. In other words, anyone attempting to photograph nude children and distribute those photographs is both committing an abusive act against those children and likely causing harmful fetishes to emerge in themselves as well as those who are on the other end of the distribution chain. Simply put, none of the adults in that situation are likely to be taking part in an effort to decrease the sexualization of minors.
Now we have AI, where as others have mentioned, images can be generated basically through guesswork, combining known information about what humans look like at various ages, using drawn images to fill in the gaps. Nude imagery of underage people can be made without anyone being harmed and without any sexualization of the image itself. People who grow up in oversexualized cultures will inevitably project their sexuality onto those images, but by having access to sufficiently realistic simulated nudity, the idea is that over time they would become desensitized. The playing field between adults and children would be leveled, hopefully making underage nude imagery no longer a thing that any significant portion of society covets.
And in an ideal situation, the next generation would grow up already having access to this kind of imagery, never developing a fetish around pubescent or prepubescent nudity in the first place. And hopefully this added comfort around nudity would serve to curb some of their overstimulation when they, as adults, fantasize about seeing the naked form. It would be more about finding a partner that suits them, not about fulfilling a desire to see the rarest most prohibited sights. And we can finally start to move beyond base objectification of women. Even if that seems virtually impossible from where we currently stand.
So even if talking about this touches a nerve for most of us, I think we have to discuss it for our own good. We may finally have a silver bullet for a problem that's been plaguing society for countless generations, and a puritanical knee-jerk reaction may be about to cause us to throw it away. Before we do, we need to think really carefully to make sure we're not acting for all the wrong reasons.
We already have that line, though. Beheading photos, for example, aren't illegal, but they are banned from most websites.
My brother in Christ they have to train the fucking models on real CSAM. It's not like AI generated CSAM is suddenly a victimless crime.
Get psychological help
Feeding pedophilia is directly harmful to children who grow more at risk
I'd personally be very hesitant to say "it's okay to beat off to children" unless there was an actual clinical psychologist involved with the person I'm speaking to saying as such.
How about addressing my points instead of the ad hominem attacks?
Like I said: "I’d personally be very hesitant to ban/persecute stuff like that unless there was actual evidence that it was harmful"
If what you're saying here is actually true then the type of evidence I mentioned would exist. I kind of doubt it works that way though. If you stop "feeding" being straight, gay, whatever, does it just go away and you no longer have those sexual desires? I doubt it.
Much as we might hate it that some people do have those urges, it's the reality. Pretending reality doesn't exist usually doesn't work out well.
I never said any such thing. Also, in this case, we're also talking about images that resemble children, not actual children.
It should be very clear to anyone reading that I'm not defending any kind of abuse. A knee-jerk emotional response here could easily increase the chances that children are abused. Or we could give up our rights "for the children" in a way that doesn't actually help them at all. Those are the things I'm not in favor of.
I'm not the guy you're replying to, but I will say this is a topic that is never going to see a good consensus, because there are two questions of morality at play, which under normal circumstances are completely agreeable. However, when placed into this context, they collide.
1. Pornography depicting underage persons is reprehensible and should not exist.
2. The production of such material and the related abuse of children should absolutely be stopped.
To allow AI child porn is to say that, to some extent, we allow the material to exist, even if it depicts an approximation of a person, real or not, but with the potential gain of undercutting the industry producing the real thing. To make it illegal is to agree with the consensus that it shouldn't exist, but will maintain the status quo for issue #2 and, in theory, cause more real children to be harmed.
Of course, the argument here goes much deeper than that. If you try to dig into it mentally, you end up going into recursive branches that lead in both directions. I'm not trying to dive into that rabbit hole here, but I simply wanted to illustrate the moral dilemma of it.
So should we ban books like Lolita, since they can be interpreted as porn, or is it only visual material that should be banned? If books are okay, is an image of stick figures with a sign reading "child" okay? How much detail does the visual image need before it gets banned?
How about 1000 year old dragons in a child's body? How about images of porn stars with very petite bodies?
That is addressing your point. These people need to get psychological help.
The harms brought by conversion therapy to gay and straight people outweigh the harms brought about by allowing them to exist as they are. The same is not true of pedophilia. Though it is interesting, if you do see these as the same: are you for persecuting gay or straight people the way you are pedophiles, or are you in favour of pedophiles being able to act on their desires?
It is the reality, and pretending people will just safely keep their desires to themselves has proven to not work.
I never said you said it, but it is the result of what you're saying.
Since you're drawing this distinction from words you decided were put in your mouth (they weren't): would you say "it's okay to beat off to children who may not exist"?
You're outwardly expressing pedophile apologia.
What rights are you giving up?
Psychologists.
There's no evidence that CSAM, real or virtual, helps reduce rates of child predation.
I'd love if you could cite your evidence.
I assume it increases it then since you're so opposed to it
I never said I had evidence. I specifically said there was no evidence. The claim presented is that it's beneficial, and the burden of proof lies with the claim.
That's not claiming it is beneficial. It's entertaining the idea of what would follow if it were.
Then that can be decided by psychologists. It's funny you keep insisting on calling it "CG porn" though when it's abjectly and legally child pornography.
I have not once called it CG porn.
There’s a disgusting number of people on this site that I’ve seen defending ai pedos. I honestly don’t understand where it comes from. Some people cannot and should not be helped as their views are incompatible with society.
Not to mention that AI pedophilia could simply be creating a massive stepping stone to the real thing. I've also seen a number of people on Lemmy defend possessing CSAM, saying that since they didn't produce it, they aren't the criminal. It's pure insanity. I'm incredibly liberal and progressive, and even I know that's a slope I don't wish society to slip down. It's not worth the risk to innocent children caught in the crossfire.
Half the answers in this cursed thread. Like wtf is this thread.
Pedophilia IS AN ADDICTION!!! Fueling it with anything, even AI, will worsen the ADDICTION!
What, you don't have any more arguments, so you resort to calling it stupid and to insults?
Yours seems to be the one.
I can't say I entirely agree. I do think that they should be helped, but in a measured and rigorous way. None of this "let them find shit online that quells their needs". Pedophilia, in the psychological profession, is viewed in a similar light to sexual orientations; on that, the person I'm responding to is correct. It's simply that they seem blind to any nuance beyond that stance.
AI pedophilia is certainly a very risky move for us to simply accept, when we don't even have any data on how consumption of real or virtual CSAM impacts those who indulge in it, and to get that data would require us to do very unethical and likely illegal research as far as I can tell. The approach [email protected] is suggesting is one that is naive and myopic in the most generous light; which is how I'm choosing to take it so as to not accuse them of something they may not be guilty of.
I'm also someone who's extremely progressive, and while I can sympathize with people who have these urges and no true wish to act on them, I think it's outright malicious to say that the solution is to simply allow them to exist with informal self-treatments based on online "common sense" idealism. Mental health support should absolutely be available and encouraged; part of that is making sure people are safe to disclose this stuff to medical professionals, but no part of that is just having this shit freely spread online.
I appreciate your measured and metered response. I think these are extremely tricky conversations to have, but important, especially with how technology is progressing.
The problem is that the technology is progressing so rapidly without any checks or balances that our reaction for the time being should simply be one that enables further research without allowing others to create. This isn’t to say we should be stopping advancements, but we should be taking measured responses and using the input of psychologists to help us better understand the repercussions. It’s the same as if someone could generate AI gore that allows them to make generated videos of them killing someone they have always wanted to kill. It’s something that needs to be evaluated before we just release this stuff into the world. Specifically before this technology gets even better and more realistic. That blending of reality from fiction could be a path we as a society are not prepared for.
My worry is that people with backgrounds in computers are making decisions around things that impact human brains.
I agree completely. Unfortunately techbros have been making important world-changing decisions for two decades now and our legislators seem mostly fine to let them continue unabated.
I'm not a psychologist, so I can't say whether the other person's argument is correct or not, but just saying "get psychological help" isn't a very effective counter argument. CP in any form is super disturbing for a healthy person to see, so by definition the people who want to see it aren't mentally healthy. So what does a person do who has those urges, but has never acted on them and never wants to hurt a child? I have no idea if providing such a person with AI generated CP would make things worse or help them satisfy the urges in a "safe" fashion. How about you, do you know for sure or are you just calling the other person sick in the head because the idea is repulsive?
Ultimately, I think we'll make more progress and keep kids safer if we can provide mental help to folks like that, and that's not going to happen if they're terrified of admitting that they have the urges in the first place.
Unfortunately, I feel like many people opposed to pedophilia in this way don't understand the mazes of the human brain. It may be repulsive, it may be something you could never imagine yourself doing, but someone somewhere can't stop these urges, and most pedophiles do not offend, instead dealing with this problem alone on a daily basis, stigmatized for something no other person in the world would wish upon themselves. These people need help and support; at the same time, victims of child abuse deserve all the support they can get. Balancing this with AI is difficult. Maybe, if we have the technology, we could regulate it? Allow non-offenders access to it, perhaaps through a therapist if they deem it helpful for managing their condition.
It would probably take days of reading to even familiarize with studies on pedophiles and to correct my misconceptions.
Agreed, I think you have the right attitude/perspective. It could very well be that someone with those urges being given simulated CP would make things worse, I just don't know enough to say, but I don't think we should rule it out just because the idea of someone looking at it is offensive.