Why does 0.3 + 0 == 0.3 and 0.1 + 0 == 0.1 in JavaScript when 0.3 and 0.1 cannot be represented exactly in binary?

First question: In JavaScript, if we do:

0.2 + 0.1
=>0.30000000000000004

Because 0.2 and 0.1 are each rounded to a different number, and the sum is also rounded to a different number. This article explains what happens behind the scenes. I understand this, but when I do:

0.3 + 0
=>0.3  

It shows 0.3. Why? 0.3 cannot be represented exactly in binary, so it should be rounded to a different number, right?
So shouldn't the value of 0.3 + 0 be that rounded number, plus whatever further rounding the sum needs?

Second question:
When we do:

let x = 0.3

As 0.3 cannot be represented exactly in binary, it is rounded to another number. But when you access the variable x, why does it show 0.3 and not the actual rounded number?

x
=>0.3

But when you do:

0.2 + 0.1
=>0.30000000000000004

It shows the actual rounded number, not 0.3. Kindly explain.

asked Oct 25 '25 by runtimeerror


1 Answer

The value you get when you add 0.1 and 0.2 and the value you get from the literal 0.3 are different values; neither is exactly 3/10ths, but both are very near it. (Adding 0 doesn't matter; it doesn't change the value of the number it's added to.) The one you get from 0.1 + 0.2 is just barely over 3/10ths; the one you get from 0.3 is (if I recall correctly) just under 3/10ths.
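You can see both stored values for yourself by asking for more digits than the default conversion prints, for instance with the standard toFixed method (the results below are what an IEEE-754 double-precision engine produces):

(0.1 + 0.2).toFixed(20)
=>"0.30000000000000004441"

(0.3).toFixed(20)
=>"0.29999999999999998890"

(0.3 + 0).toFixed(20)
=>"0.29999999999999998890"

So the sum is a hair above 3/10ths, the literal a hair below it, and adding 0 leaves the literal's value untouched.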

When converting a floating point number to text, JavaScript follows the common practice of only including as many digits as are required to differentiate it from the next nearest value the format can represent (details in the spec and in the academic paper the spec references). 0.3 doesn't need any additional digits for that differentiation, but 0.30000000000000004 does (I'm guessing to differentiate it from 0.3, but I don't actually know that).
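You can check that round-trip rule in a console too; nothing here is engine-specific, just standard Number() parsing and === comparison:

0.1 + 0.2 === 0.3
=>false

Number("0.3") === 0.3
=>true

Number("0.30000000000000004") === 0.1 + 0.2
=>true

The short string "0.3" already parses back to the exact double the literal produces, so no extra digits get printed; the sum is a different double, so it needs the longer form to survive the round trip.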

answered Oct 27 '25 by T.J. Crowder