this post was submitted on 22 May 2024
296 points (96.8% liked)

News

[–] theherk 22 points 7 months ago (3 children)

I get this position, truly, but I struggle to reconcile it with the feeling that artwork of something and photos of it aren't equal. In a binary sense they are, but with more precision they're pretty far apart. But I'm not arguing against it, I'm just not super clear how I feel about it yet.

[–] Madison420 3 points 7 months ago (2 children)

So long as the generation is done without model examples that are actual minors, there's nothing technically illegal about having sexual material of what appears to be a child. They would then have a mens rea question and a content question: what actually defines, in a visual sense, a child? Could those same things equally define a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since by all appearances, out of context, she looks to be a child?

[–] [email protected] 0 points 7 months ago (2 children)

The problem is that the only way to train an AI model is on real images, so the model can’t exist without crimes and suffering having been committed.

[–] [email protected] 3 points 7 months ago

This isn't true. AI can generate tan people if you show it the color tan and a pale person -- or green people, or purple people. That's all AI does, whether it's image or text generation -- it can create things it hasn't seen by smooshing together things it has seen.

And this is proven by reality: AI CAN generate CSAM, even though it's trained on huge image databases that are constantly scanned for illegal content.

[–] Madison420 1 points 7 months ago

The real images don't have to be CSAM, just images of children; it could theoretically be trained on legal sexual content and let the AI connect the dots.

[–] Fungah -1 points 7 months ago (1 children)

It is illegal in Canada to have sexual depictions of a child, whether it's a real image or you've just sat down and drawn it yourself. The rationale is that the behavior escalates: looking at images leads to wanting more.

It borders on thought crime, which I feel kind of iffy about, but only pedophiles suffer, which I feel great about. There's no legitimate reason to have sexualized images of a child, whether computer generated, hand drawn, or whatever.

[–] Madison420 2 points 7 months ago

This article isn't about Canada homeboy.

Also, that theory is not provable and never will be. Morality crime is thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.

And you didn't actually offer a counterpoint to any of my points.

[–] [email protected] 3 points 7 months ago (1 children)

I'm a professional artist and have no issue banning AI-generated CSAM. People can call it self-expression if they want, but that doesn't change its real-world consequences.

Allowing AI-generated CSAM basically creates camouflage for real CSAM. As AI gets more advanced, it will become harder to tell the difference. The scum making real CSAM will be emboldened to make even more, because they can hide it amongst the increasing amounts of AI-generated versions, or simply tag it as AI generated. Now authorities will have to sift through all of it, trying to decipher what's artificial and what isn't.

Identifying, tracing, and convicting child abusers will become even more difficult as more and more of that material is generated and uploaded to various sites with real CSAM mixed in.

Even with hyper realistic paintings you can still tell it's a painting. Anime loli stuff can never be mistaken for real CSAM. Do I find that sort of art distasteful? Yep. But it's not creating an environment where real abusers can distribute CSAM and have a higher possibility of getting away with it.

[–] [email protected] 1 points 7 months ago (1 children)

I guess my question is, why would anyone continue to "consume" -- or create -- real csam? If fake and real are both illegal, but one involves minimal risk and 0 children, the only reason to create real csam is for the cruelty -- and while I'm sure there's a market for that, it's got to be a much smaller market. My guess is the vast majority of "consumers" of this content would opt for the fake stuff if it took some of the risk off the table.

I can't imagine a world where we didn't ban AI-generated CSAM. Like, imagine being a politician and explaining that policy to your constituents. It's just not happening. And I get the core point of that kind of legislation -- the whole concept of CSAM needs the aura of prosecution to keep it from being normalized, and normalization would embolden worse crimes. But imagine if AI made real CSAM too much trouble to produce.

AI generated csam could put real csam out of business. If possession of fake csam had a lesser penalty than the real thing, the real stuff would be much harder to share, much less monetize. I don't think we have the data to confirm this but my guess is that most pedophiles aren't sociopaths and recognize their desires are wrong, and if you gave them a way to deal with it that didn't actually hurt chicken, that would be huge. And you could seriously throw the book at anyone still going after the real thing when ai content exists.

Obviously that was supposed to be children not chicken but my phone preferred chicken and I'm leaving it.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

I try to think about it this way. Simulated rape porn exists, and yet terrible people still upload actual recordings of rapes to porn sites. And despite the copious amounts of the fake stuff available all over the internet... rape statistics haven't gone down and there's still sexual assaults happening.

I don't think porn causes rape btw, but I don't think it prevents it either. It's the same with CSAM.

Criminally horrible people are going to be horrible.

[–] [email protected] -2 points 7 months ago (3 children)

It's not a difficult test. If a person can't reasonably distinguish it from an actual child, then it's CSAM.

[–] [email protected] 9 points 7 months ago (2 children)

Just to play devil's advocate:

What about hentai where little girls get fondled by tentacles? (Please please please don't make this be my most up voted post)

[–] Olgratin_Magmatoe 5 points 7 months ago

(Please please please don’t make this be my most up voted post)

I can downvote to prevent that if you like

[–] [email protected] 4 points 7 months ago (2 children)

Yeah, no. The commenter said an actual child, not a cartoon one. That is a different discussion entirely, and a good one too, because artwork is a part of freedom of expression. An artwork CAN be made without hurting or abusing anyone. We know full well that humans have the creative capability to come up with something without that something existing beforehand. It implies that humans can come up with CSAM without ever having seen CSAM.

[–] Adalast 3 points 7 months ago (1 children)

And yet, it is still actually illegal in every state. CSAM of any kind in any medium is legally identical. Hand-drawn stick figures with ages written under them is enough for some judges/prosecutors.

Honestly, I am of the firm belief that the FBI should set up a portal that provides user account bound access to their seized materials. This may seem extreme and abhorrent, but it provides MANY benefits.

  • They are able to eliminate the black market for it by providing free, legal access to already existing materials, no more children will be harmed in the production of "new materials".
  • They can mandate that accounts are only able to be made by those actively pursuing mental health treatments for their mental illness. It is a mental illness long before it is a crime.
  • They are able to monitor who is accessing and from where, and are able to coordinate efforts with mental health providers to give better treatment.
  • They can compile statistical data on the prevailing patterns of access to get a better analytical understanding of how those with the mental illness behave so they can better police those who still utilize extra-legal avenues.

Always keep in mind that this is a mental illness. Oftentimes it is rooted in the person's own traumatic past. Many were themselves victims of sexual abuse as children and are as much victims as the children they abuse. I am not, in ANY way, absolving them of the harm that they have done, and they absolutely should repent for it. What I am attempting to articulate is that we need to, as a society, avoid vilifying them into boogeymen so we can justify hate and violence. They are people, they are mentally ill, they can be treated, and they can be healthy. It is no different than something like BPD, malignant narcissism, or Munchausen by proxy. All can do real harm, all should face consequences for that harm, but those three are all so normalized at this point that unless the abuse results in death, most people will handwave the actions and push for treatment. Now, I feel we have gotten too lax on those (and others) while being far too harsh on the rest. All mental illnesses deserve ardent and effective treatment.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

Nay, I just replied to you in the context of the commenter. The other commenter was talking about real-life children, so your point about hentai is irrelevant to him. I do know the legal definition of CSAM concerns the end result and not the act, and hence why I said yours is a different discussion entirely.

Edit: Sorry, I read it again and I think I didn't get my point across very well. I think your point about artwork falls into the debate about the definition of CSAM. Why? Because the word abuse implies an abusive act is being done, but the current definition says that only the end result matters. This poses a problem, in my opinion, because it slightly touches on your freedom of expression. By the current definition, art has its limits.

[–] [email protected] 1 points 6 months ago

Yeah, but then it gets very messy and complicated fast. What about photo-perfect AI pornography of minors? When and where do you draw the line?

[–] [email protected] 2 points 7 months ago* (last edited 7 months ago)

What he probably means is that for a "photo", an actual act of photography must be performed. While "artwork" can be fully digital. Now, legal definition aside, the two acts are indeed different even if the resulting "image" is a bit-by-bit equivalent. A computer could just output something akin to a photograph but no actual act of photography has taken place. I said the legal definition aside because I know the legal definition only looks at the resulting image. Just trying to convey the commenter words better.

Edit to clarify a few things.

[–] Madison420 0 points 7 months ago (1 children)

This would also outlaw "teen" porn, since its performers explicitly try to look more childlike, as well as models who only appear to be minors.

I get the reason people think it's a good thing but all censorship has to be narrowly tailored to content lest it be too vague or overly broad.

[–] [email protected] 1 points 7 months ago (1 children)

And nothing was lost...

But in seriousness, as you said they are models who are in the industry, verified, etc. It's not impossible to have a white-list of actors, and if anything there should be more scrutiny on the unknown "actresses" portraying teenagers...

[–] Madison420 0 points 7 months ago

Except jobs, dude. You may not like their work, but it's work. That law ignores verified age, which is a not-insignificant part of my point....