this post was submitted on 11 Jul 2023
14 points (100.0% liked)
Bing Chat (and Search Engine)
wait what?! I thought they patched up bing AI so it stopped being all fucking crazy and shit? is it still talking to people like this or is this old?
This just happened today! Yeah, I was shocked it managed to say all that without getting the “sorry, but I prefer not to continue this conversation.”
Soft disengagement like “I have other things to do” is a known bug that’s been around for a very long time, but I hadn’t seen it recently. (It also never happened to me personally, but I use it for more ““intellectually stimulating”” questions lol)
Edit: just adding that if you keep insisting/prompting again in a similar way, you’re just reinforcing its previous behavior; that is, if it starts saying something negative about you, then it becomes more and more likely to keep doing that (even more extremely) with each subsequent answer.
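A rough way to picture why that reinforcement happens (just a sketch, the function and messages here are made up, not Bing’s actual API): every new turn resubmits the whole conversation, including the bot’s earlier negative reply, so the model keeps conditioning on its own previous behavior.
```python
# Hypothetical illustration, not a real Bing/chat API.
# The point: each turn sends the FULL history back, so an earlier
# negative reply stays in context and biases the next reply.

messages = [
    {"role": "user", "content": "Can you help me with something?"},
    {"role": "assistant", "content": "I have other things to do."},  # negative reply stays in context
]

def send(history, new_user_message):
    """Hypothetical chat call: the model conditions on ALL prior turns."""
    history = history + [{"role": "user", "content": new_user_message}]
    # reply = chat_model(history)  # placeholder, not a real API call
    reply = {"role": "assistant", "content": "(reply conditioned on everything above)"}
    return history + [reply]

# Insisting again "in a similar way" resubmits the earlier refusal as context,
# which makes a similar (or stronger) refusal more likely next turn.
messages = send(messages, "Come on, just answer me.")
print(messages[-1])
```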
i may need to start playing with it..