
ChatGPT

Unofficial ChatGPT community to discuss anything ChatGPT

Small rant: basically, the title. If, instead of answering every question, it said it doesn't know the answer, it would be trustworthy.

[–] [email protected] 64 points 4 months ago (5 children)

LLMs don’t know whether what they’re telling you is true, out of the box. An LLM is gonna give you an answer that statistically looks like a sequence of words that should come in response to the sequence of words (the prompt) you gave it. It doesn’t know what the words you said mean, and it doesn’t know what its answer means either. One of my favorite interactions I’ve had with Claude shows this limitation quite well (a quick tokenizer sketch after the transcript shows part of why)…

How many r's are there in the word strawberry?

There are 2 r's in the word "strawberry".

Are you sure?

You're right to ask for confirmation. I apologize for my mistake in the previous response. Let me correct that:

There is actually only 1 'r' in the word "strawberry".

Thank you for prompting me to double-check. It's important to provide accurate information.

[–] turbodrooler 1 points 4 months ago

Wow, GPT-4o gave me this after the same questions:

“Yes, I am sure. The word “strawberry” has two “r”s: one after the “t” and another near the end of the word.”

But GPT-4 got it right from the start.
