this post was submitted on 08 Feb 2024
921 points (95.0% liked)
Programmer Humor
you are viewing a single comment's thread
But where's the fun in it if I can't make it hallucinate?
I do feel bad when I have to tell it not to. Hallucinating is fun!
But does it work to tell it not to hallucinate? And does it work the other way around too?
It’s honestly a gamble in my experience. Instructions I’ve given ChatGPT have worked for a while, only to be mysteriously abandoned for no apparent reason. Telling an AI not to hallucinate is apparently common practice, from the research I’ve done.
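For anyone curious what "telling it not to hallucinate" looks like in practice: a minimal sketch of the common approach, prepending a system message to a chat request. The wording of the instruction and the helper name are illustrative, not anything from this thread.

```python
# Sketch: the "don't hallucinate" instruction is typically passed as a
# system message ahead of the user's question. build_messages is a
# hypothetical helper; the instruction wording is just one example.
def build_messages(question: str) -> list[dict]:
    """Prepend a system instruction discouraging fabricated answers."""
    return [
        {
            "role": "system",
            "content": (
                "Do not hallucinate. If you are unsure of a fact, "
                "say you don't know instead of inventing an answer."
            ),
        },
        {"role": "user", "content": question},
    ]

# Sending it would look roughly like this (requires an API key, so the
# call is shown for context only):
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(
#     model="gpt-4o",
#     messages=build_messages("Who maintains this community?"),
# )
```

Whether the model actually obeys is, as noted above, a gamble: system messages steer behavior but don't guarantee it.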
Makes me wonder: can I just ask it to hallucinate?
Yep. Tell it to lie and it will.