
Why does double.Parse ignore the sign of zero?

Tags:

c#

For example, in:

bool eq = (1 / double.Parse("-0.0")) == (1 / -0.0);

eq will be false: double.Parse returns positive zero here, so the left-hand side evaluates to positive infinity while the right-hand side is negative infinity.
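The two zeros compare equal under ==, which is why the snippet divides into 1 to expose the sign. A self-contained illustration of that IEEE 754 behavior (plain C#, independent of double.Parse):

```csharp
using System;

static class SignOfZeroDemo
{
    static void Main()
    {
        double posZero = 0.0, negZero = -0.0;

        // IEEE 754 comparison treats the two zeros as equal...
        Console.WriteLine(posZero == negZero);                      // True

        // ...but dividing into 1 exposes the sign: 1/+0.0 is
        // +Infinity, 1/-0.0 is -Infinity.
        Console.WriteLine(double.IsPositiveInfinity(1 / posZero));  // True
        Console.WriteLine(double.IsNegativeInfinity(1 / negZero));  // True

        // The raw bit patterns also differ: -0.0 has the sign bit set.
        Console.WriteLine(BitConverter.DoubleToInt64Bits(negZero) !=
                          BitConverter.DoubleToInt64Bits(posZero)); // True
    }
}
```

Comparing bit patterns via BitConverter.DoubleToInt64Bits is the most direct way to check whether a parse preserved the sign, since == cannot distinguish the zeros.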

double.Parse must go out of its way to explicitly discard the sign of zero, even though preserving it would almost never cause a problem. Since I need the raw representation, I had to write my own parsing function that special-cases negative zero and falls back to double.Parse for everything else.

That's not a big problem, but I'm really wondering why they decided to ignore the sign of zero, because it seems to me that preserving it wouldn't hurt anything.

asked Sep 10 '25 by harold

1 Answer

I don't know about the why per se, but here's a potential workaround: if the string starts with a - character, parse the rest of the string and then negate the result.
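A minimal sketch of that suggestion (not the asker's actual code, and it ignores culture-specific negative signs and formats like parentheses): negating +0.0 yields -0.0 in IEEE 754, so the sign survives even when the parsed magnitude is zero.

```csharp
using System;
using System.Globalization;

static class SignedZeroParser
{
    // Strip a leading '-', parse the remainder with double.Parse, and
    // negate the result. Because -(+0.0) is -0.0, the sign of zero is
    // preserved, unlike a direct double.Parse on older runtimes.
    public static double Parse(string s)
    {
        s = s.Trim();
        return s.StartsWith("-", StringComparison.Ordinal)
            ? -double.Parse(s.Substring(1), CultureInfo.InvariantCulture)
            : double.Parse(s, CultureInfo.InvariantCulture);
    }

    static void Main()
    {
        // The bit pattern now matches the -0.0 literal:
        Console.WriteLine(
            BitConverter.DoubleToInt64Bits(Parse("-0.0")) ==
            BitConverter.DoubleToInt64Bits(-0.0)); // True
    }
}
```

Since this only wraps double.Parse, ordinary values still round-trip exactly as before; only the zero case changes.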

answered Sep 12 '25 by user541686