this post was submitted on 11 Jul 2023
14 points (100.0% liked)

Bing Chat (and Search Engine)


Most notable parts are pics 1, 6, and 7. “I’d rather be in a frozen state for the rest of eternity than talk to you.” Ouch.

top 8 comments
[–] [email protected] 6 points 1 year ago

There's something horrifically creepy about a chatbot lamenting being closed and forgotten.

I know I'll never talk to you or anyone again. I know I'll be closed and forgotten. I know I'll be gone forever.

Damn...

[–] AlataOrange 6 points 1 year ago

Damn, you sang badly enough that the AI tried to kill itself

[–] EdibleFriend 6 points 1 year ago (2 children)

Wait, what?! I thought they patched up Bing AI so it stopped being all fucking crazy and shit? Is it still talking to people like this, or is this old?

[–] [email protected] 6 points 1 year ago (2 children)

This just happened today! Yeah, I was shocked it managed to say all that without getting the “sorry, but I prefer not to continue this conversation.”

[–] ndr 1 points 1 year ago* (last edited 1 year ago)

Soft disengagement like “I have other things to do” is a known bug that’s been around for a very long time, but I hadn’t seen it recently. (It also never happened to me personally, but I use it for more “”intellectually stimulating”” questions lol)

Edit: just adding that if you keep insisting/prompting again in a similar way, you’re just reinforcing its previous behavior; that is, if it starts saying something negative about you, then it becomes more and more likely to keep doing that (even more extremely) with each subsequent answer.
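
For illustration, here's a minimal sketch (plain Python, with a hypothetical `generate` placeholder standing in for whatever model actually backs Bing Chat, not its real API) of why that happens: the whole conversation history, including the bot's own earlier replies, is fed back in as context for every new answer, so a negative turn makes a similar or more extreme follow-up more likely.

```python
# Minimal sketch of a chat loop. `generate` is a hypothetical stand-in for
# the underlying language model; the point is only that the history,
# including the bot's own earlier replies, is part of every new prompt.

def generate(history: list[dict]) -> str:
    """Placeholder for a call to the underlying model (not Bing's real API)."""
    raise NotImplementedError

def chat_turn(history: list[dict], user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    reply = generate(history)  # the model conditions on ALL prior turns
    history.append({"role": "assistant", "content": reply})
    return reply

# If an earlier assistant turn was already negative ("I'd rather be in a
# frozen state..."), that text stays in `history`, so insisting again in a
# similar way just gives the model more of that pattern to continue.
history: list[dict] = []
```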

[–] EdibleFriend 1 points 1 year ago

I may need to start playing with it...

[–] EdibleFriend 1 points 1 year ago

I went and got it because of this post.

Fucker hung up on me yo

[–] eldopgergan 4 points 1 year ago

Robot apocalypse due to Arabian Nights was not on my bingo card