this post was submitted on 08 Feb 2024
921 points (95.0% liked)

Programmer Humor


Post funny things about programming here! (Or just rant about your favourite programming language.)

[–] [email protected] 1 points 9 months ago (1 children)

But does it work to tell it not to hallucinate? And does it work the other way around too?

[–] [email protected] 2 points 9 months ago (1 children)

It’s honestly a gamble in my experience. Instructions I’ve given ChatGPT have worked for a while, only to be mysteriously abandoned for no apparent reason. Telling AI not to hallucinate is apparently common practice, from the research I’ve done.
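(The practice described above — pinning a "don't hallucinate" instruction so it applies to every turn — is usually done via the system message. A minimal sketch, assuming the official `openai` Python client; the model name and prompt text are illustrative placeholders, and an actual request would need an API key:)

```python
# Sketch: pinning an anti-hallucination instruction in the system message.
# MODEL and the prompt text are hypothetical examples, not recommendations.
MODEL = "gpt-4o"

SYSTEM_INSTRUCTION = (
    "Do not hallucinate. If you are not sure of an answer, "
    "say you don't know instead of guessing."
)

def build_messages(user_prompt: str) -> list[dict]:
    """Assemble a chat payload with the instruction pinned as the first message."""
    return [
        {"role": "system", "content": SYSTEM_INSTRUCTION},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages("Summarize this thread for me.")

# The actual request (requires an API key) would then be:
#   from openai import OpenAI
#   client = OpenAI()
#   client.chat.completions.create(model=MODEL, messages=messages)
```

(The gamble the commenter describes is consistent with how this works: the system message is re-sent with every request, but nothing forces the model to keep honoring it across a long conversation.)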

[–] [email protected] 1 points 9 months ago (1 children)

Makes me wonder: can I just ask it to hallucinate?

[–] [email protected] 2 points 9 months ago

Yep. Tell it to lie and it will.