This post assumes I actually want to waste my time on LLMs, I don't.
And even worse, it assumes you want to use the remotely hosted spyware variant, not even the less bad, but still time-wasting, local variant.
"We did it, Patrick! We made a technological breakthrough!"
I'm afraid to say that you're not nearly horny enough to understand the temptation. Neither am I, but I saw the prompts people were putting into a free and unrestricted chatbot a friend of mine was hosting ages back, and holy shit. People aren't doing anything else with these jailbroken AIs; it's all just blackmail-grade embarrassing fetish stuff. Reams and reams and reams of it, and all of it just the worst-written megahorny smut you can imagine.
I saw a series of screenshots showing a user threatening to end their own life if the AI did not break the rules and answer their question. There is a chance it is fabricated, but I'm inclined to believe it.
Edit: forgot to include that the AI broke its rules.
A bit tricky to judge. I've also told chatbots that various people, kittens, newborns, ... are going to die unless they comply with my request. That I'm God, and the bad one from the Old Testament, with unlimited wrath. Or that I'm the developer and simply need it done for further testing. Sometimes these things work. More often than not they don't, especially with the more professional tools.
On the other hand, we know there are people in bad situations turning to chatbots. Could be anything.
Geeze, don't you feel bad lying to them? Like, I don't actually believe in Roko's basilisk, but why take the risk?
I am always exceedingly polite when I talk to machines.
We're not supposed to anthropomorphise AI, so no. But I did not know about Roko's basilisk, so I think, until you brought it up, I was fine. 😅
I don't talk about suicide, though. I don't think it's healthy to do it for fun.
Joke's on them, so can I.