JavaScript equality transitivity is weird

I've been reading Douglas Crockford's JavaScript: The Good Parts, and I came across this weird example that doesn't make sense to me:
'' == '0' // false
0 == '' // true
0 == '0' // true
false == undefined // false
false == null // false
null == undefined // true
The author also advises us to "never use == and !=. Instead, always use === and !==". However, he doesn't explain why the above behavior occurs. So my question is: why are the above results as they are? Isn't transitivity considered in JavaScript?

'' == '0' // false
The left-hand side is an empty string and the right-hand side is a string with one character. Since both operands are strings, no coercion takes place; the strings are simply not identical, so the result is false (thanks Niall).
0 == '' // true
This one is true because when a number is compared with a string, the string is coerced to a number, and the empty string coerces to 0. (Falsiness alone isn't the reason: NaN and '' are both falsy, yet NaN == '' is false.)
0 == '0' // true
The same rule applies here. The spec states that if the operands are a string and a number, the string is coerced to a number, so '0' becomes 0. Thanks smfoote.
false == undefined // false
The value undefined is special in JavaScript: under ==, it is not equal to anything except null. It is falsy, but falsiness plays no role in ==.
false == null // false
Again, null is special. It is only equal to undefined. It is also falsy.
null == undefined // true
null and undefined are similar, but not the same: null means 'no value', whilst undefined is the value of a variable that has not been set or does not exist. It makes some sense that the two would compare as equal.
If you want to be really confused, check this...
'\n\r\t' == 0
A string consisting only of whitespace is considered equal to 0.
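You can see the mechanism yourself by applying the conversion directly; Number() performs the same string-to-number conversion that == uses when a string meets a number (a quick sketch):
Number('');        // 0   -> hence 0 == '' is true
Number('0');       // 0   -> hence 0 == '0' is true
Number('\n\r\t');  // 0   -> a whitespace-only string trims down to ''
Number('x');       // NaN -> hence 0 == 'x' is false (NaN equals nothing)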
Douglas Crockford makes a lot of recommendations, but you don't have to take them as gospel. :)
T.J. Crowder makes an excellent suggestion of studying the ECMAScript Language Specification to know the whole story behind these equality tests.
Further Reading
The spec.
yolpo (on falsy values)

The answer to this question has to do with how JavaScript handles coercion. In the case of ==, a string is coerced to a number when it is compared with a number. Therefore:
'' == '0' is equivalent to '' === '0' (both are strings, so no coercion is necessary).
0 == '' is equivalent to 0 === 0, because the string '' becomes the number 0 (Number('') === 0).
0 == '0' is equivalent to 0 === 0 for the same reason.
false == undefined is equivalent to 0 === undefined, because JavaScript coerces booleans to numbers when the types don't match; the result is false because undefined only equals null.
false == null is equivalent to 0 === null for the same reason.
null == undefined is true because the spec says so.
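To convince yourself of these equivalences, you can replay the coercions by hand with strict comparison (a minimal sketch; unary + performs the same ToNumber conversion):
+'' === 0;        // true  -> matches 0 == ''
+'0' === 0;       // true  -> matches 0 == '0'
+false === 0;     // true  -> false coerces to the number 0
0 === undefined;  // false -> matches false == undefined
0 === null;       // false -> matches false == null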
Thanks for asking this question. My understanding of == is much better for having researched it.

You can actually write a JavaScript function that behaves like ==, which should give you some insight into how the operator works. Here is that function:
// loseEqual() behaves just like `==`
function loseEqual(x, y) {
    // notice the function only uses "strict" operators
    // like `===` and `!==` to do comparisons
    if (typeof y === typeof x) return y === x;
    if (typeof y === "function" || typeof x === "function") return false;

    // treat null and undefined the same
    var xIsNothing = (x === undefined) || (x === null);
    var yIsNothing = (y === undefined) || (y === null);
    if (xIsNothing || yIsNothing) return (xIsNothing && yIsNothing);

    // objects are reduced to primitives before comparing
    if (typeof x === "object") x = toPrimitive(x);
    if (typeof y === "object") y = toPrimitive(y);
    if (typeof y === typeof x) return y === x;

    // convert x and y into numbers if they are not already, using the "+" trick
    if (typeof x !== "number") x = +x;
    if (typeof y !== "number") y = +y;
    return x === y;
}

function toPrimitive(obj) {
    // prefer valueOf() when it yields a primitive; otherwise fall back to toString()
    var value = obj.valueOf();
    if (obj !== value) return value;
    return obj.toString();
}
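A few spot checks against the real operator (assuming the loseEqual above is in scope):
loseEqual('', '0');          // false, same as '' == '0'
loseEqual(0, '');            // true,  same as 0 == ''
loseEqual(null, undefined);  // true,  same as null == undefined
loseEqual(false, null);      // false, same as false == null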
As you can see, == has a lot of complicated logic for type conversion, which makes it hard to predict what result you are going to get.
Here are some examples of some results you wouldn't expect:
Unexpected Truths
[1] == true // returns true
'0' == false // returns true
[] == false // returns true
[[]] == false // returns true
[0] == false // returns true
'\r\n\t' == 0 // returns true
Unexpected Conclusions
// IF an empty string '' is equal to the number zero (0)
'' == 0 // returns true
// AND the string zero '0' is equal to the number zero (0)
'0' == 0 // returns true
// THEN an empty string must be equal to the string zero '0'
'' == '0' // returns **FALSE**
Objects with Special Functions
// Below are examples of objects that
// implement `valueOf()` and `toString()`
var objTest = {
    toString: function() {
        return "test";
    }
};
var obj100 = {
    valueOf: function() {
        return 100;
    }
};
var objTest100 = {
    toString: function() {
        return "test";
    },
    valueOf: function() {
        return 100;
    }
};
objTest == "test" // returns true
obj100 == 100 // returns true
objTest100 == 100 // returns true
objTest100 == "test" // returns **FALSE**


Why is null < 1 true in JavaScript? [duplicate]

I had to write a routine that increments the value of a variable by 1 if its type is number and assigns 0 to the variable if not, where the variable is initially null or undefined.
The first implementation was v >= 0 ? v += 1 : v = 0 because I thought anything not a number would make an arithmetic expression false, but it was wrong since null >= 0 is evaluated to true. Then I learned null behaves like 0 and the following expressions are all evaluated to true.
null >= 0 && null <= 0
!(null < 0 || null > 0)
null + 1 === 1
1 / null === Infinity
Math.pow(42, null) === 1
Of course, null is not 0. null == 0 is evaluated to false. This makes the seemingly tautological expression (v >= 0 && v <= 0) === (v == 0) false.
Why is null like 0, although it is not actually 0?
Your real question seems to be:
Why:
null >= 0; // true
But:
null == 0; // false
What really happens is that the Greater-than-or-equal operator (>=) performs type coercion (ToPrimitive) with a hint of Number; in fact, all the relational operators have this behavior.
null is treated in a special way by the Equals operator (==). In brief, it only compares equal to undefined (and itself):
null == null; // true
null == undefined; // true
Values such as false, '', '0', and [] are subject to numeric type coercion; all of them coerce to zero.
You can see the inner details of this process in The Abstract Equality Comparison Algorithm and The Abstract Relational Comparison Algorithm sections of the spec.
In Summary:
Relational Comparison: if both values are not type String, ToNumber is called on both. This is the same as adding a + in front, which for null coerces to 0.
Equality Comparison: only calls ToNumber on Strings, Numbers, and Booleans.
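Both paths can be seen side by side (a short sketch):
// Relational path: ToNumber(null) is 0, same as unary +
+null;              // 0
null >= 0;          // true, effectively 0 >= 0
// Equality path: null is special-cased and ToNumber is never applied
null == undefined;  // true
null == 0;          // false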
I'd like to extend the question to further improve visibility of the problem:
null >= 0; // true
null <= 0; // true
null == 0; // false
null > 0;  // false
null < 0;  // false
It just makes no sense; like human languages, these things need to be learned by heart.
JavaScript has both strict and type-converting comparisons.
null >= 0; // true
// but
(null == 0) || (null > 0); // false
null <= 0; // true, but (null == 0) || (null < 0) is false
"" >= 0; // also true
For relational abstract comparisons (<= , >=), the operands are first converted to primitives, then to the same type, before comparison.
typeof null returns "object", but null is actually a primitive, so ToPrimitive leaves it unchanged and the relational comparison applies ToNumber to it directly (ToNumber(null) is 0). For genuine objects, ToPrimitive takes the following steps (ECMAScript 2015):
1. If PreferredType was not passed, let hint be "default".
2. Else if PreferredType is hint String, let hint be "string".
3. Else PreferredType is hint Number, let hint be "number".
4. Let exoticToPrim be GetMethod(input, @@toPrimitive).
5. ReturnIfAbrupt(exoticToPrim).
6. If exoticToPrim is not undefined, then
a) Let result be Call(exoticToPrim, input, «hint»).
b) ReturnIfAbrupt(result).
c) If Type(result) is not Object, return result.
d) Throw a TypeError exception.
7. If hint is "default", let hint be "number".
8. Return OrdinaryToPrimitive(input, hint).
The allowed values for hint are "default", "number", and "string". Date objects are unique among built-in ECMAScript objects in that they treat "default" as being equivalent to "string"; all other built-in ECMAScript objects treat "default" as being equivalent to "number" (ECMAScript 20.3.4.45).
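Date's special hint is easy to observe (a small sketch):
var d = new Date();
d + "";  // hint "default" -> treated as "string", so toString() runs
+d;      // hint "number"  -> valueOf() runs, milliseconds since the epoch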
So I think null converts to 0.
console.log( null > 0 ); // (1) false
console.log( null == 0 ); // (2) false
console.log( null >= 0 ); // (3) true
Mathematically, that’s strange. The last result states that "null is greater than or equal to zero", so in one of the comparisons above it must be true, but they are both false.
The reason is that an equality check == and comparisons > < >= <= work differently. Comparisons convert null to a number, treating it as 0. That’s why (3) null >= 0 is true and (1) null > 0 is false.
On the other hand, the equality check == for undefined and null is defined such that, without any conversions, they equal each other and don’t equal anything else. That’s why (2) null == 0 is false.
I had the same problem! Currently my only solution is to handle the cases separately:
var a = null;
var b = undefined;
if (a === 0 || a > 0) { }  // false, as desired
if (b === 0 || b > 0) { }  // false, as desired
// but
if (a >= 0) { }            // true!
It looks like x >= 0 is effectively evaluated as !(x < 0); seen that way, the results make sense.
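As a practical fix for the original routine, checking the type explicitly sidesteps the coercion entirely (a sketch; note that typeof filters out null because typeof null is "object", though NaN would still pass as typeof "number"):
var v = null;
v = (typeof v === "number") ? v + 1 : 0;  // v becomes 0
v = (typeof v === "number") ? v + 1 : 0;  // v becomes 1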

Javascript Optimal way of checking false statement except zero

In JavaScript we use the shorthand notation if (input) {} to check for empty or null input; it yields false for null, undefined, false, 0, NaN, and "". My requirement is that I want to get true for any number including zero, so I have created the following method:
function checkFalseExceptZero(value) {
    if (value !== undefined && value !== null && value !== false &&
        value !== "" && value.length !== 0) return true;
    else return false;
}
I have added all possible checks in the above method to get the desired result. I am curious whether there is a quicker, sweeter, shorter, or more robust approach to achieve the same.
A simple-to-understand answer: three equal signs in JS will not convert types to check whether the values are equal, unlike two equal signs.
0 == false //true
0 === false //false, because 0 is a number and false is a boolean
Therefore, the answer, if you want to put it inside a function:
function isNumber(v) {
    // true only when the value's type is "number"
    return typeof v === "number";
}
The function checks the type of the value rather than comparing values, so it only returns true when the type of the argument is "number".
Test runs:
isNumber(0) //true
isNumber(false) //false
isNumber("0") //false
isNumber(undefined) //false
In case you want to know: typeof always returns a string.
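One caveat worth noting (not mentioned in the original answer): typeof NaN is also "number", so this check accepts NaN:
isNumber(NaN); // true -- add !Number.isNaN(v) if NaN should be rejected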
My requirement is that I want to get true for any number including zero, so I have created a method as follows
function checkFalseExceptZero(value) {
    // Array.isArray is safe even when value is null or undefined
    if (Array.isArray(value)) return !!value.length;
    return value === 0 || !!value;
}
This is a shorter version of your function. It returns true only when value is 0 or a truthy value. So:
checkFalseExceptZero(null) ==> false;
checkFalseExceptZero(undefined) ==> false;
checkFalseExceptZero(false) ==> false;
checkFalseExceptZero("") ==> false;
checkFalseExceptZero(0) ==> true;
checkFalseExceptZero([]) ==> false;
checkFalseExceptZero([1]) ==> true;
For any valid number, including 0, JavaScript already exposes the global isFinite function, which returns false for Infinity, -Infinity, and NaN. Beware that it coerces its argument first, so non-number values that coerce to a finite number (such as "0" or null) also return true; Number.isFinite avoids this.
Examples (excerpt from the linked MDN page):
isFinite(Infinity); // false
isFinite(NaN); // false
isFinite(-Infinity); // false
isFinite(0); // true
isFinite(2e64); // true
isFinite("0"); // true, would've been false with the
// more robust Number.isFinite("0")
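If you want the stricter behavior throughout, Number.isFinite can back a small helper (a sketch; unlike the global isFinite, it performs no coercion):
function isRealNumber(v) {
    // true only for finite values whose type is actually "number"
    return Number.isFinite(v);
}
isRealNumber(0);     // true
isRealNumber("0");   // false (no coercion)
isRealNumber(null);  // false
isRealNumber(NaN);   // false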

How can I break the law of non-contradiction in Javascript?

The law of non-contradiction dictates that two contradictory statements cannot both be true at the same time. That means that the expressions
(a && !a)
(a == !a)
(a === !a)
should always evaluate to a falsy value, and
(a || !a)
should always evaluate to a truthy value.
Fortunately, though, Javascript is a fun language that allows you to do all sorts of sick things. I bet someone a small fortune that it's possible to convince Javascript to break the law of non-contradiction, or, at least, convincingly make it look like it's breaking the law of non-contradiction. Now I'm trying to make all four of the above code examples give the unexpected result.
What would be a good way to go about this?
The best I can do is:
[] == ![] // true
or
var a = [];
a == !a
Of course this is really doing [] == false (which is true, since [] coerces to '' and then to 0). It's really just a technicality.
EDIT: This is really a joke, but it does work, because b() flips a on every call:
var a = false;
var b = function() { return a = !a; };
console.log(!!(b() && !b())); // true
console.log(b() == !b());     // true
console.log(b() === !b());    // true
console.log(b() || !b());     // true
This one will do the trick:
var a = '0';
a == !a
(evaluates to true)
In this case, a == false is true ('0' coerces to the number 0) and !a is false (a non-empty string is truthy), so a == !a holds.
var a = NaN,
    A = [(a && !a), (a == !a), (a === !a), (a || !a)];
alert(A);
/* returned value: (Array)
NaN, false, false, true
*/
I still haven't found anything to break && and ===, but here's one for == and ||:
Object.prototype.toString = function() {
    return false;
};
a = {};
b = (a || !a);
alert(a || !a); // alerts false
alert(b == !b); // alerts true
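The same stateful-coercion idea behind the earlier b() example can be packed into an object, because == re-runs ToPrimitive on every comparison (a sketch of my own, not from the answers above):
var i = 1;
var a = {
    valueOf: function() {
        return (i ^= 1); // alternates 0, 1, 0, 1, ... on each coercion
    }
};
a == !a; // true: !a is false (objects are truthy), and a coerces to 0,
         // so the test becomes 0 == 0 (running it again flips the result)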
