I just don't understand why the following two conditions behave differently. The first evaluates to 0 (false), while the second evaluates to 1 (true):
if(1/10);
if(0.1);
Both 1 and 10 are int literals, so 1/10 is integer division: the result is truncated toward zero, giving 0, which is false. 0.1, by contrast, is a double literal with a nonzero value, so it is true.
If at least one operand is a floating-point value, the division is performed in floating point, so 1.0/10 yields 0.1 and is also true.
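For illustration, here is a minimal, self-contained C program (assuming the snippets are C; the same rules apply in C++) that makes the difference visible:

#include <stdio.h>

int main(void) {
    /* Both operands are int, so 1/10 is integer division and truncates to 0. */
    if (1 / 10)
        printf("1/10 is true\n");
    else
        printf("1/10 is 0, i.e. false\n");

    /* 0.1 is a double literal; any nonzero value is true in a condition. */
    if (0.1)
        printf("0.1 is nonzero, i.e. true\n");

    /* Making one operand a double forces floating-point division: 1.0/10 == 0.1. */
    if (1.0 / 10)
        printf("1.0/10 is nonzero, i.e. true\n");

    return 0;
}

Running this prints the second and third messages along with "1/10 is 0", confirming that only the integer division collapses to false.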