I am trying to get an expression to test if one field is null and treat it as false:
a=true
b=true
-------------
true
a=null
b=true
-------------
true
but when I execute:
var a=true;
var b=true;
alert((a+b) == true); => false
It returns false, I don't get it.
var a=null;
var b=true;
alert((a+b) == true); => true
The general solution for this in JavaScript is to use !! to coerce a value to a boolean: !! negates the truthiness twice, producing a boolean with the same truthiness as the original value.
You should then use && as the logical AND operation.
var a=null;
var b=true;
console.log(!!a && !!b); // false
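For the two cases from the question, a quick sketch of how this behaves (c and d are just fresh names for the second case):

var a = true;
var b = true;
console.log(!!a && !!b); // true

var c = null;
var d = true;
console.log(!!c && !!d); // false, because null is falsy and so is treated as false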
+ behaviour

The strangeness you're seeing when using + instead of && is because, in JavaScript, + coerces booleans to numbers, with true becoming 1 and false becoming 0.
Hence
true + true // 2
true + false // 1
And then when doing
true + true == true
the left-hand-side of the equality comparison resolves to 2; JavaScript then coerces the right-hand-side true to 1, and so the equality check fails.
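A quick way to see this in the console:

console.log(true + true);         // 2
console.log(true + true == true); // false, because 2 == 1 is false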
When doing
null + true == true
the left-hand-side becomes the number 1 (null coerces to 0, so null + true is 0 + 1), and the right-hand-side is coerced to 1 as well.
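And the corresponding check for the null case:

console.log(null + true);         // 1, since null coerces to 0
console.log(null + true == true); // true, because 1 == 1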
I'd recommend reading the MDN guide on Equality comparisons and sameness for more on JavaScript's value coercion and abstract equality checks.
var a = true;
var b = true;
console.log((a & b) === 1); // true: both operands coerce to 1, and 1 & 1 is 1
var c = null;
console.log((a & c) === 1); // false: null coerces to 0, and 1 & 0 is 0
true == 1
This is important: when you convert true to a number, it will be 1.
So true + true == true becomes 2 == 1,
which is false.
Similarly, null + true == true becomes 1 == 1, which is true, because null resolves to 0.
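You can verify the individual coercions directly:

console.log(Number(true)); // 1
console.log(Number(null)); // 0
console.log(2 == 1);       // false
console.log(1 == 1);       // true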
I think you are going about testing for null the wrong way.
Try
alert((a & b) === 1);
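Applied to the null case from the question, a sketch of how this plays out (using the question's variable names):

var a = true;
var b = null;
alert((a & b) === 1); // false: the bitwise & coerces null to 0, so 1 & 0 is 0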