This post was submitted on 25 Sep 2024
-53 points (7.9% liked)

Have you ever performed a simple arithmetic operation like 0.1 + 0.2? You might have gotten a strange result: 0.1 + 0.2 = 0.30000000000000004.
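
A quick sketch you can paste into any JavaScript or TypeScript console to see it yourself:

```typescript
// The classic surprise: binary floating point cannot represent 0.1 or 0.2 exactly.
console.log(0.1 + 0.2);               // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);       // false
console.log((0.1 + 0.2).toFixed(2));  // "0.30": round for display instead of comparing
```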

all 14 comments
[–] [email protected] 9 points 2 days ago

Ugh, I thought this was a question, not a link. So I spent time googling for a good tutorial on floats (because I didn't click the link)...

Now I hate myself, and this post.

[–] [email protected] 5 points 2 days ago

It's how CPUs do floating-point calculations; it's not just JavaScript. Long story short, a float is stored as one bit for the sign (+/-), some bits for the exponent, and some bits for the fractional value (the mantissa, or significand). As a result, some numbers aren't exactly representable.
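
A minimal sketch of that layout, assuming the standard IEEE 754 binary64 format that JavaScript's Number uses:

```typescript
// Pull apart the raw bits of the 64-bit float 0.1.
const buf = new ArrayBuffer(8);
const view = new DataView(buf);
view.setFloat64(0, 0.1);

const bits = view.getBigUint64(0);
const sign = bits >> 63n;                  // 1 sign bit
const exponent = (bits >> 52n) & 0x7ffn;   // 11 exponent bits (biased by 1023)
const mantissa = bits & 0xfffffffffffffn;  // 52 fraction (mantissa) bits

console.log({ sign, exponent, mantissa });
// The stored value is the nearest representable double, not 0.1 exactly:
// 0.1000000000000000055511151231257827021181583404541015625
```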

[–] [email protected] 4 points 2 days ago (1 children)

A good way to think of it is to compare with something similar in decimal. 0.1 and 0.2 are exact values in decimal, but they can't be represented exactly in binary. 1/3 is a decent analogue: with limited precision it becomes 0.33333333, so the expression 1/3 + 1/3 + 1/3 gives you 0.99999999 instead of the correct answer of 1.
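
The binary-side version of the same analogy, as a sketch: repeatedly adding 0.1 accumulates the rounding error, just like summing 0.33333333 three times in decimal.

```typescript
// 0.1 has no exact binary representation, so the error accumulates.
let sum = 0;
for (let i = 0; i < 10; i++) {
  sum += 0.1;
}
console.log(sum);        // 0.9999999999999999
console.log(sum === 1);  // false
```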

[–] [email protected] 2 points 2 days ago* (last edited 2 days ago)

JavaScript is truly a bizarre language: we don't need to go as far as arbitrary-precision decimals; it does not even feature integers (every Number is a 64-bit float).

I have to wonder why it ever made the cut as a backend language.
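
A sketch illustrating the integer point; note that BigInt (added in ES2020) is the one caveat to "no integers":

```typescript
// Every JavaScript Number is a 64-bit double, so integers are only
// exact up to 2^53 (Number.MAX_SAFE_INTEGER).
console.log(Number.isInteger(42));     // true, but 42 is still stored as a double
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
console.log(9007199254740992 === 9007199254740993);   // true: precision already lost
console.log(9007199254740992n === 9007199254740993n); // false: BigInt stays exact
```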