What is the difference between if (!x) and if (x == null); that is, when can their results be different?
!x will return true for every "falsy" value (empty string, 0, null, false, undefined, NaN) whereas x == null will only return true if x is null (edit: or apparently undefined (see below)).
Try with x = 0; there is a difference.
You can say that the NOT operator ! converts a value into its opposite boolean equivalent. This is different from actually comparing two values.
In addition, if you compare values with ==, JavaScript does type conversion which can lead to unexpected behavior (like undefined == null). It is better to always use strict comparison === (value and type must be the same) and make use of type conversion only if you really know what you are doing.
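A quick sketch of the difference (illustrative values; paste into any console):
undefined == null;  // true (special case in the spec)
undefined === null; // false (different types)
0 == '';            // true ('' is coerced to 0)
0 === '';           // false (number vs. string)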
Something to read:
Data Type Conversion
Comparison Operators
Logical Operators
Update:
For more information about the non-strict comparison of null and undefined (or the comparison in general), it is worth having a look at the specification. The comparison algorithm is defined there (the comparison is x == y):
If Type(x) is the same as Type(y), then
(...)
If x is null and y is undefined, return true.
If x is undefined and y is null, return true.
(...)
(...)
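In practice, these two steps make undefined the only value that non-strictly equals null:
null == undefined;  // true (by the two steps quoted above)
null == 0;          // false (nothing else compares equal to null)
null === undefined; // false (strict comparison checks the type first)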
The results can be different if x is false, NaN, '' (empty string), or 0 (zero): all of these are falsy, but none of them is == null. (undefined also differs if you use the strict comparison x === null instead.)
See Felix Kling's answer for an excellent summary of type comparison.
if (!x)
coerces x using the internal ToBoolean function
if (x==null)
coerces both operands using the internal ToPrimitive function (which generally resolves each side to a number, occasionally a string, depending on the operands)
For a full explanation of ToBoolean vs ToPrimitive, see http://javascriptweblog.wordpress.com/2011/02/07/truth-equality-and-javascript/
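A rough sketch of the two paths (illustrative only):
!0;         // true (ToBoolean(0) is false)
!'';        // true (ToBoolean('') is false)
0 == null;  // false (null only matches undefined under ==)
'' == null; // false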
Consider a few different values of x.
x = undefined;
if(!x) {
alert("X is not a truthy value");
}
if(x == null) {
alert("X is null");
}
x = "";
if(!x) {
alert("X is not a truthy value");
}
if(x == null) {
alert("X is null");
}
x = null;
if(!x) {
alert("X is not a truthy value");
}
if(x == null) {
alert("X is null");
}
You'll notice that "X is not a truthy value" is shown in all three cases, but only in the case of X being undefined or null is "X is null" shown.
When x is a boolean value, (!x) will be true when x is false, but (x == null) will not be. For numbers, 0 and NaN are considered falsy values, so !x is true for them as well.
See it in action, including the difference between == (equality using type conversion) and === (strict equality).
!x tests for a falsy value. It is true for any value that coerces to false: false, 0, '' (empty string), null, undefined, and NaN.
x == null is different: with var x = 0, x == null is false even though x is falsy.
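For example:
var x = 0;
!x;        // true (0 is falsy)
x == null; // false (0 is neither null nor undefined)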
Related
I had to write a routine that increments the value of a variable by 1 if its type is number and assigns 0 to the variable if not, where the variable is initially null or undefined.
The first implementation was v >= 0 ? v += 1 : v = 0 because I thought anything not a number would make an arithmetic expression false, but it was wrong since null >= 0 is evaluated to true. Then I learned null behaves like 0 and the following expressions are all evaluated to true.
null >= 0 && null <= 0
!(null < 0 || null > 0)
null + 1 === 1
1 / null === Infinity
Math.pow(42, null) === 1
Of course, null is not 0. null == 0 is evaluated to false. This makes the seemingly tautological expression (v >= 0 && v <= 0) === (v == 0) false.
Why is null like 0, although it is not actually 0?
Your real question seems to be:
Why:
null >= 0; // true
But:
null == 0; // false
What really happens is that the Greater-than-or-equal operator (>=) performs type coercion (ToPrimitive) with a hint type of Number; in fact, all the relational operators have this behavior.
null is treated in a special way by the Equals operator (==). In brief, it only compares equal to undefined (and to itself):
null == null; // true
null == undefined; // true
Values such as false, '', '0', and [] are subject to numeric type coercion; all of them coerce to zero.
You can see the inner details of this process in The Abstract Equality Comparison Algorithm and The Abstract Relational Comparison Algorithm.
In Summary:
Relational Comparison: unless both values are strings, ToNumber is called on both. This is the same as adding a + in front, which for null coerces to 0.
Equality Comparison: only calls ToNumber on Strings, Numbers, and Booleans.
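A sketch of the asymmetry (runnable in any console):
+null;             // 0 (ToNumber(null) is 0, as used by the relational operators)
null >= 0;         // true (compares 0 >= 0)
null == 0;         // false (== never coerces null to a number)
null == undefined; // true (the only non-strict match)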
I'd like to extend the question to further improve visibility of the problem:
null >= 0; //true
null <= 0; //true
null == 0; //false
null > 0; //false
null < 0; //false
It just makes no sense. Like human languages, these things need to be learned by heart.
JavaScript has both strict and type-converting comparisons
null >= 0; is true
but
(null==0)||(null>0) is false
null <= 0; is true but (null==0)||(null<0) is false
"" >= 0 is also true
For relational abstract comparisons (<= , >=), the operands are first converted to primitives, then to the same type, before comparison.
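The empty string goes through the same numeric conversion (sketch):
+"";     // 0 (ToNumber('') is 0)
"" >= 0; // true (compares 0 >= 0)
"" == 0; // true as well, because == does call ToNumber on strings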
typeof null returns "object"
When the type is object, JavaScript tries to convert the object (i.e. null here) to a primitive.
The following steps are taken (ECMAScript 2015):
1. If PreferredType was not passed, let hint be "default".
2. Else if PreferredType is hint String, let hint be "string".
3. Else PreferredType is hint Number, let hint be "number".
4. Let exoticToPrim be GetMethod(input, @@toPrimitive).
5. ReturnIfAbrupt(exoticToPrim).
6. If exoticToPrim is not undefined, then
a) Let result be Call(exoticToPrim, input, «hint»).
b) ReturnIfAbrupt(result).
c) If Type(result) is not Object, return result.
d) Throw a TypeError exception.
7. If hint is "default", let hint be "number".
8. Return OrdinaryToPrimitive(input, hint).
The allowed values for hint are "default", "number", and "string". Date objects are unique among built-in ECMAScript objects in that they treat "default" as being equivalent to "string".
All other built-in ECMAScript objects treat "default" as being equivalent to "number" (ECMAScript 20.3.4.45).
So I think null converts to 0.
console.log( null > 0 ); // (1) false
console.log( null == 0 ); // (2) false
console.log( null >= 0 ); // (3) true
Mathematically, that’s strange. The last result states that "null is greater than or equal to zero", so in one of the comparisons above it must be true, but they are both false.
The reason is that an equality check == and comparisons > < >= <= work differently. Comparisons convert null to a number, treating it as 0. That’s why (3) null >= 0 is true and (1) null > 0 is false.
On the other hand, the equality check == for undefined and null is defined such that, without any conversions, they equal each other and don’t equal anything else. That’s why (2) null == 0 is false.
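One practical consequence (a sketch; the variable name v is just illustrative): v != null rules out both null and undefined at once, before any numeric comparison.
var v = null;
if (v != null && v >= 0) { } // condition is false: v != null filters out both null and undefined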
I had the same problem!
Currently my only solution is to separate the checks.
var a = null;
var b = undefined;
if (a===0||a>0){ } // condition is false, works!
if (b===0||b>0){ } // condition is false, works!
//but
if (a>=0){ } // condition is true!
It looks like x >= 0 behaves as !(x < 0); read that way, the result makes sense.
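For the routine in the question, an explicit type test avoids the coercion trap entirely. A minimal sketch (assuming plain numbers; note that typeof NaN is also 'number'):
var v = null;
v = typeof v === 'number' ? v + 1 : 0; // explicit type test, no coercion
// v is now 0; running the line again gives 1, then 2, ...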
Talk is cheap, I will show my code.
var a; // a = undefined
if(a == false){ // I typed == (not ===), so I expected a to be coerced to a boolean, but it isn't
return false;
}
else {
return true;
}
// true
This returns true, but I was sure it would return false, because I thought undefined is the same as false when using double equals.
Things got stranger when I tried to use
if(!a){..} else {..};
// false
Here I got my false, but until this moment I thought (!a) and (a == false) were absolutely equal.
The short answer:
!a converts a value to a Boolean.
a == false compares a value to a Boolean.
These are two different operations.
!a is equivalent to Boolean(a) ? false : true. Boolean(a) returns false if a is
undefined
null
0
''
NaN
false
In every other case it returns true.
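For example:
Boolean(undefined); // false
Boolean('');        // false
Boolean('0');       // true (non-empty string)
Boolean([]);        // true (objects are always truthy)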
What happens in a == false is a bit more involved, but not that complicated. The most important thing is that false is converted to a number, so you are actually comparing a == 0. But undefined is treated in a special way in the comparison algorithm: it's not converted to any other type, so the algorithm simply returns false.
I wrote an interactive tool for a JavaScript course which shows which steps of the algorithm are performed when comparing two values.
Similar questions:
In JavaScript, why is "0" equal to false, but when tested by 'if' it is not false by itself?
Why does ('0' ? 'a' : 'b') behave different than ('0' == true ? 'a' : 'b')
JavaScript: What is the difference between `if (!x)` and `if (x == null)`?
Why "" == "0" is false in javascript?
The only correct answer is "that's the way it is". The source of your confusion is a feature of JavaScript called type coercion, combined with the different kinds of equality comparisons (== and === in JavaScript).
There's an interesting table on the JavaScript Equality Table page which tells you which comparisons evaluate to true.
The only two values which will give true when ==-compared with null are null and undefined.
In other words, x == null will be true if and only if x is null or undefined.
You had a false assumption: x == false does not coerce x to a boolean. In fact, == has its own equality table.
If you don't believe random blogposts, here is the spec:
7.2.12 Abstract Equality Comparison
The comparison x == y, where x and y are values, produces true or
false. Such a comparison is performed as follows:
1. ReturnIfAbrupt(x).
2. ReturnIfAbrupt(y).
3. If Type(x) is the same as Type(y), then return the result of performing Strict Equality Comparison x === y.
4. If x is null and y is undefined, return true.
5. If x is undefined and y is null, return true.
6. If Type(x) is Number and Type(y) is String, return the result of the comparison x == ToNumber(y).
7. If Type(x) is String and Type(y) is Number, return the result of the comparison ToNumber(x) == y.
8. If Type(x) is Boolean, return the result of the comparison ToNumber(x) == y.
9. If Type(y) is Boolean, return the result of the comparison x == ToNumber(y).
10. If Type(x) is either String, Number, or Symbol and Type(y) is Object, then return the result of the comparison x == ToPrimitive(y).
11. If Type(x) is Object and Type(y) is either String, Number, or Symbol, then return the result of the comparison ToPrimitive(x) == y.
12. Return false.
So for undefined == false: first we hit #9, and then #12, which evaluates to false.
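Traced out (a sketch of the two passes):
undefined == false; // step 9 rewrites this as undefined == ToNumber(false)
undefined == 0;     // no step 3 through 11 matches, so step 12 returns false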
"(!a) and (a == false) are absolutely equal."
You used two different operators and assumed they are absolutely equal; never do this, as there is a reason two different operators exist instead of one.
Now, think of NaN (as one example; others may apply). By definition NaN is a falsy value but not the value false, so:
if(!NaN) {} // this will execute
if(NaN == false) {} // this will not execute
Why do you think this happens?
Because the == operator does type coercion per type/value: NaN doesn't compare equal to false, while other values, like 0, may; yet both are considered falsy and are converted to true by !.
Summed up:
the ! operator uses a defined set of falsy values (false, 0, '' or "" (empty string), null, undefined, NaN); everything else is truthy
the == operator uses per-type conversions without regard to whether the original value is considered falsy, so after conversion it may well compare as truthy
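A compact illustration of the divergence:
!0;           // true (0 is falsy)
0 == false;   // true (both coerce to the number 0)
!NaN;         // true (NaN is falsy)
NaN == false; // false (NaN equals nothing, not even itself)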
To the best of my knowledge, (x == false) should do the same thing as !x, as both of them try to interpret x as a boolean and then negate it.
However, when I tried to test this out, I started getting some extremely strange behavior.
For example:
false == [] and false == ![] both return true.
Additionally
false == undefined and true == undefined both return false, as do
false == Infinity and true == Infinity and
false == NaN and true == NaN.
What exactly is going on here?
http://jsfiddle.net/AA6np/1/
It's all here: http://es5.github.com/#x11.9.3
For the case of false == []:
false is converted to a number (0), because that is always done with booleans.
[] is converted to a primitive by calling [].valueOf().toString(), and that is an empty string.
0 == "" is then evaluated by converting the empty string to a number, and because the result of that is also 0, false == [] is true.
For the case of false == ![]:
The logical not operator ! is performed by returning the opposite of ToBoolean(GetValue(expr))
ToBoolean() is always true for any object, so ![] evaluates to false (because !true = false), and therefore false == ![] is also true.
(false == undefined) === false and (true == undefined) === false are even simpler:
false and true are again converted to numbers (0 and 1, respectively).
Because undefined cannot be compared to a number, the algorithm falls through to the default result, which is false.
The other two cases are evaluated the same way: First Boolean to Number, and then compare this to the other number. Since neither 0 nor 1 equals Infinity or is Not A Number, those expressions also evaluate to false.
The abstract equality algorithm is described in section 11.9.3 of the ES5 specification.
For x == y where x = false and y = []:
1. Nope, the types are not equal.
2. Nope, x is not null.
3. Nope, x is not undefined.
4. Nope, x is not a number.
5. Nope, x is not a string.
6. Yes, x is a boolean, so we compare ToNumber(x) and y.
Repeat the algorithm, x=0 and y=[].
We end at step 8: Type(x) is Number and Type(y) is Object.
So, let the result be x == ToPrimitive(y).
ToPrimitive([]) == ""
Now, repeat the algorithm again with x=0 and y="". We end at step 4: "return the result of the comparison x == ToNumber(y)."
ToNumber("") == 0
The last repetition of the algorithm ends at step 1 (types are equal). By 1.c.iii, 0 == 0, and true is returned.
The other results can be obtained in a similar manner, by using the algorithm.
false == []
Using ==, JavaScript is allowed to apply conversions. The object is converted into a primitive to match the type of the boolean, yielding an empty string. The false is converted into the number 0. That leaves comparing the empty string with the number 0; the string is converted to a number, which is 0, so the expression is true.
![]
JavaScript converts the object to the boolean true, so negating true ends up being false.
false == undefined and true == undefined
false == Infinity and true == Infinity
false == NaN and true == NaN
Again a bit of the same! false is converted to 0 and true to 1. undefined doesn't match any case in the algorithm, so the comparison falls through to the default result: false. Infinity and NaN are already numbers, and neither 0 nor 1 equals Infinity (and nothing equals NaN), so those comparisons are false in any case.
I would recommend using === and !== as much as you can to get "expected" results, unless you know very well what you are doing. Using something like Boolean(undefined) == false would also be nice.
Check the ECMAScript specification for all the details of these conversions.