Technology
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related content.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below are allowed; to ask if your bot can be added, please contact us.
- Check for duplicates before posting; duplicates may be removed.
I actually don't think this is shocking or something that needs to be "investigated." Other than the sketchy website that doesn't secure users' data, that is.
Actual child abuse / grooming happens on social media, chat services, and in local churches. Not in a one-on-one between a user and an LLM.
It's the "burn that witch" reaction.
See how they hate pedophiles and not child rapists.
The crowd wants to feel its power by condemning (and lynching if possible) someone.
I'd rather investigate those calling for "investigation" and further violation of the privacy of people who, for all we know, have committed no crime.
That's the freedom-of-speech argument about yelling "fire" in a crowded theater and Thousand Hills Radio, you know the one.
insert surprised pikachu face here
Wait… so you mean to tell me that predatory simps are using AI incorrectly? Man… if only someone could have called this years ago, something could have been done to minimize it!
Who knew that unchecked growth could lead to negative results?!
But they did; AI Dungeon got nerfed so badly you could only have happy adventures.
Not at all surprising, but also, it is an AI.
Ain't that what the tools are there for? I mean, I don't like CP and I don't want to engage in any way with people who like it. But I use those LLMs to describe fantasies that I wouldn't even talk about with other humans.
As long as they don't do it to real humans, nobody is hurt.
The problem with AI-generated CP is that if it's legal, it opens a new line of defense for actual CP. You would need to prove the content is not AI-generated in order to convict real abusers. This is why it can't be made legal; it needs to be prosecuted like real CP to make sure actual abusers are convicted.
This is an incredibly touchy and complicated topic, so I will try not to go much further into it.
But prosecuting what is essentially a work of fiction seems bad.
This is not even a topic new to AI. CP has been widely represented in both written and graphical media. And the consensus in most free countries is not to prosecute those, as they are works of fiction.
I cannot think why AI-written CP fiction is different from human-written CP fiction.
I suppose "AI big bad" justifies it for some. But for me, if we began to prosecute works of fiction, there should be a logical explanation of why some are prosecuted and others are not. Especially when the one being prosecuted is just regurgitating the human-written stories about CP that are not being prosecuted today.
I essentially think that a work of fiction should never be prosecuted to begin with, no matter the topic. And I also think that an AI writing about CP is no worse than an actual human doing the same thing.
A bit off topic... But from my understanding, the US currently doesn't have a single federal agency that is responsible for AI regulation... However, there is an agency for child abuse protection: the National Center on Child Abuse and Neglect within the Department of Health and Human Services.
If AI girlfriends generating CSAM is how we get AI regulation in the US, I'd be equally surprised and appalled