this post was submitted on 06 Aug 2023
1757 points (98.5% liked)
Programmer Humor
you are viewing a single comment's thread
It's a language model, I don't know why you would expect math. Tell it to output code to perform the math, that'll work just fine.
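For instance, the kind of snippet a model can reliably produce when asked for code instead of an answer (a hypothetical example, not actual GPT output):

```python
# Instead of asking the model to add the numbers itself,
# ask it to write code that does -- Python integer arithmetic is exact.
a = 123456789
b = 987654321
print(a + b)  # prints 1111111110
```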
Then it should say so instead of attempting and failing at the one thing computers are supposed to be better than us at
Well, if I try to use Photoshop to calculate a polynomial it's not gonna work all that well either, right tool for the job and all.
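To stretch the analogy: evaluating a polynomial is a few lines in an actual programming language (a sketch using Horner's method; the function name is made up):

```python
def poly(coeffs, x):
    """Evaluate a polynomial with coefficients listed from highest
    to lowest degree, using Horner's method."""
    result = 0
    for c in coeffs:
        result = result * x + c
    return result

# 2x^2 + 3x + 1 at x = 4  ->  2*16 + 12 + 1 = 45
print(poly([2, 3, 1], 4))  # prints 45
```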
The fact that LLMs are terrible at knowing what they don't know should be well known by now (ironically).
And if Photoshop had a way to ask it for such, it'd be a mistake.
GPT thinking it knows something and hallucinating is ultimately a bug, not a feature, no matter what the apologists say.
I know. It's still baffling how much it messes up when adding two numbers.
I just asked GPT-4:
Its reply:
It's pretty hit or miss though... I've had lots of good calculations with the odd wrong one sprinkled in, which makes it unreliable for doing maths. Mostly because it presents every result with absolute certainty.
It's not baffling at all... It's a language model, not a math robot. It's designed to write English sentences, not to solve math problems.