Is x >= y the same as x > y || x == y [duplicate] - javascript

I had to write a routine that increments the value of a variable by 1 if its type is number and assigns 0 to the variable if not, where the variable is initially null or undefined.
The first implementation was v >= 0 ? v += 1 : v = 0 because I thought anything that is not a number would make an arithmetic expression false, but it was wrong, since null >= 0 evaluates to true. Then I learned that null behaves like 0 and the following expressions all evaluate to true.
null >= 0 && null <= 0
!(null < 0 || null > 0)
null + 1 === 1
1 / null === Infinity
Math.pow(42, null) === 1
Of course, null is not 0. null == 0 evaluates to false. This makes the seemingly tautological expression (v >= 0 && v <= 0) === (v == 0) false.
Why is null like 0, although it is not actually 0?
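For reference, here is a minimal sketch of that routine written with an explicit type check instead of relying on coercion (my own illustration, not part of the question):
let v;                         // starts out undefined (or null)
// Increment v only when it is an actual number; otherwise reset it to 0.
// typeof rejects null and undefined; Number.isNaN guards against NaN.
v = (typeof v === 'number' && !Number.isNaN(v)) ? v + 1 : 0;  // v becomes 0
v = (typeof v === 'number' && !Number.isNaN(v)) ? v + 1 : 0;  // v becomes 1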

Your real question seems to be:
Why:
null >= 0; // true
But:
null == 0; // false
What really happens is that the Greater-than-or-equal operator (>=) performs type coercion (ToPrimitive) with a hint type of Number; in fact, all the relational operators have this behavior.
null is treated in a special way by the Equals operator (==). In brief, it compares equal only to undefined and to itself:
null == null; // true
null == undefined; // true
Values such as false, '', '0', and [] are subject to numeric type coercion; all of them coerce to zero.
You can see the inner details of this process in The Abstract Equality Comparison Algorithm and The Abstract Relational Comparison Algorithm.
In Summary:
Relational Comparison: if both values are not type String, ToNumber is called on both. This is the same as adding a + in front, which for null coerces to 0.
Equality Comparison: only calls ToNumber on Strings, Numbers, and Booleans.
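A quick sketch of that difference (my illustration, not from the answer):
+null === 0;        // true  - ToNumber(null) is 0, so relational operators see 0
null == undefined;  // true  - the only loose-equality match for null
null == 0;          // false - == never converts null to a number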

I'd like to extend the question to further improve visibility of the problem:
null >= 0; //true
null <= 0; //true
null == 0; //false
null > 0; //false
null < 0; //false
It just makes no sense. Like human languages, these things need to be learned by heart.

JavaScript has both strict and type-converting comparisons.
null >= 0; is true
but
(null==0)||(null>0) is false
null <= 0; is true but (null==0)||(null<0) is false
"" >= 0 is also true
For relational abstract comparisons (<= , >=), the operands are first converted to primitives, then to the same type, before comparison.
typeof null returns "object"
When the type is object, JavaScript tries to convert the operand to a primitive (i.e. null).
The following steps are taken (ECMAScript 2015):
1. If PreferredType was not passed, let hint be "default".
2. Else if PreferredType is hint String, let hint be "string".
3. Else PreferredType is hint Number, let hint be "number".
4. Let exoticToPrim be GetMethod(input, @@toPrimitive).
5. ReturnIfAbrupt(exoticToPrim).
6. If exoticToPrim is not undefined, then
   a) Let result be Call(exoticToPrim, input, «hint»).
   b) ReturnIfAbrupt(result).
   c) If Type(result) is not Object, return result.
   d) Throw a TypeError exception.
7. If hint is "default", let hint be "number".
8. Return OrdinaryToPrimitive(input, hint).
The allowed values for hint are "default", "number", and "string". Date objects are unique among built-in ECMAScript objects in that they treat "default" as being equivalent to "string".
All other built-in ECMAScript objects treat "default" as being equivalent to "number". (ECMAScript 20.3.4.45)
So I think null converts to 0.
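Here is a small sketch (my own, assuming an object that defines Symbol.toPrimitive) that logs which hint each operator passes to ToPrimitive:
const probe = {
  [Symbol.toPrimitive](hint) {
    console.log('hint:', hint);  // logs the hint the operator requested
    return 0;
  }
};
probe >= 0;   // hint: number  - relational comparison asks for a Number
probe == 0;   // hint: default - loose equality passes no preferred type
`${probe}`;   // hint: string  - template literals request a String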

console.log( null > 0 ); // (1) false
console.log( null == 0 ); // (2) false
console.log( null >= 0 ); // (3) true
Mathematically, that’s strange. The last result states that "null is greater than or equal to zero", so in one of the comparisons above it must be true, but they are both false.
The reason is that an equality check == and comparisons > < >= <= work differently. Comparisons convert null to a number, treating it as 0. That’s why (3) null >= 0 is true and (1) null > 0 is false.
On the other hand, the equality check == for undefined and null is defined such that, without any conversions, they equal each other and don’t equal anything else. That’s why (2) null == 0 is false.
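For contrast (not part of the answer above), undefined converts to NaN rather than 0, so every relational comparison with it is false:
console.log( undefined > 0 );   // false - NaN > 0 is false
console.log( undefined >= 0 );  // false - NaN >= 0 is false
console.log( undefined == 0 );  // false - undefined only loosely equals null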

I had the same problem!
Currently my only solution is to separate the checks.
var a = null;
var b = undefined;
if (a === 0 || a > 0) { } // false - works as intended
if (b === 0 || b > 0) { } // false - works as intended
// but
if (a >= 0) { } // true!

It looks like the way to check x >= 0 is !(x < 0). Read that way, the result makes sense.

Related

Why does this code run correctly with loose equality but not with strict equality?

I tried to code the snippets with both strict equality and loose equality to count the total number of truthy values in the given array.
The code runs correctly with loose equality:
let array = [0, '0', true, 100, false, NaN];
countTruthy(array);

function countTruthy(array) {
  let count = 0;
  for (let element of array) {
    if (element == (null || undefined || NaN || false || 0 || '')) { // comparing with loose equality
      continue;
    }
    console.log(element);
    count++;
  }
  console.log(`The total truthy in the array is : ${count}`);
}
While the code gives an incorrect count with strict equality:
let array = [0, '0', true, 100, false, NaN];
countTruthy(array);

function countTruthy(array) {
  let count = 0;
  for (let element of array) {
    if (element === (null || undefined || NaN || false || 0 || '')) { // using strict equality
      continue;
    }
    console.log(element);
    count++;
  }
  console.log(`The total truthy in the array is : ${count}`);
}
I also tried the following:
console.log(undefined === undefined);
Why am I getting the wrong count with strict equality but the correct count with loose equality?
I also know there is a more efficient way to write the same code, so please give suggestions only for the issue described above.
When you use a chain of ||s, the whole expression will evaluate to the first truthy value, if there is one; otherwise, it will evaluate to the final (falsy) value. So
(null || undefined || NaN || false || 0 || '')
is equivalent to
('')
With strict equality, none of the array items are the empty string, so none are skipped and all six get counted.
With loose equality, only 0 and false are == to the empty string.
console.log(
0 == '',
false == ''
);
With Abstract Equality Comparison:
For 0 == '': When a number is compared against a string on the right, the string gets converted to a number, and 0 == 0 is true
If Type(x) is Number and Type(y) is String, return the result of the comparison x == ToNumber(y).
For false == '': When a boolean is compared against a string on the right, the boolean is converted into a number first:
If Type(x) is Boolean, return the result of the comparison ToNumber(x) == y.
Then the string is converted into a number:
If Type(x) is Number and Type(y) is String, return the result of the comparison x == ToNumber(y).
and 0 == 0 is true.
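The same chain written out with explicit Number() calls (my own rough sketch of the steps, not spec text):
console.log( Number(false) );   // 0 - the boolean is converted first
console.log( Number('') );      // 0 - then the string is converted
console.log( 0 == 0 );          // true
console.log( false == '' );     // true - same result via the == algorithm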
I would just like to add that your code does not actually count the truthy values correctly - there are 3 truthy values in your array ("0", true, 100).
The problem comes from comparing NaN with both loose and strict equality. A comparison with NaN always yields false, even when comparing NaN with itself:
NaN === NaN; //false
NaN == NaN; //false
That's why your code counts 4 values instead of 3 using loose equality.
A better way to check if a value is truthy is just to have JavaScript convert it to a boolean:
function countTruthy(array) {
  let count = 0;
  for (let element of array) {
    if (!element) {
      continue;
    }
    count++;
  }
  return count;
}
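Called on the question's array it returns 3, matching the truthy values '0', true, and 100:
console.log( countTruthy([0, '0', true, 100, false, NaN]) ); // 3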

Why is null < 1 true in JavaScript? [duplicate]


Logical operators evaluation in javascript

Can you explain how the comparison operators work in JS:
"a" > "A" // => why true?
null == undefined; // and here as well?
and some others
null > 0;
null >= 0;
Strings are compared by their character codes, i.e. their positions in the Unicode table.
A is 65, a is 97. Therefore "a" > "A".
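For example (my illustration):
console.log( 'A'.charCodeAt(0) ); // 65
console.log( 'a'.charCodeAt(0) ); // 97
console.log( 'a' > 'A' );         // true - 97 > 65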
== is a loose comparison. null == undefined is a special case, since the abstract equality comparison algorithm explicitly states that true should be returned when comparing these two values:
2. If x is null and y is undefined, return true.
3. If x is undefined and y is null, return true.
null > 0 is false, and null >= 0 is true because null, when converted to a number, is zero.
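You can verify the conversion directly:
console.log( Number(null) ); // 0
console.log( null > 0 );     // false - 0 > 0
console.log( null >= 0 );    // true  - 0 >= 0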

JavaScript: What is the difference between `if (!x)` and `if (x == null)`?

What is the difference between if (!x) and if (x == null); that is, when can their results be different?
!x will return true for every "falsy" value (empty string, 0, null, false, undefined, NaN) whereas x == null will only return true if x is null (edit: or apparently undefined (see below)).
Try with x = 0, there is a difference.
You can say that the NOT operator ! converts a value into its opposite boolean equivalent. This is different than actually comparing two values.
In addition, if you compare values with ==, JavaScript does type conversion which can lead to unexpected behavior (like undefined == null). It is better to always use strict comparison === (value and type must be the same) and make use of type conversion only if you really know what you are doing.
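For instance (my illustration):
console.log( undefined == null );  // true  - loose equality treats them as equal
console.log( undefined === null ); // false - the types differ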
Something to read:
Data Type Conversion
Comparison Operators
Logical Operators
Update:
For more information about the non-strict comparison of null and undefined (or the comparison in general), it is worth having a look at the specification. The comparison algorithm is defined there (the comparison is x == y):
1. If Type(x) is the same as Type(y), then
   (...)
2. If x is null and y is undefined, return true.
3. If x is undefined and y is null, return true.
(...)
The results can be different if x is false, NaN, '' (empty string), undefined (using the strict comparison operator ===), or 0 (zero).
See Felix Kling's answer for an excellent summary of type comparison.
if (!x)
coerces x using the internal ToBoolean function
if (x==null)
coerces both operands using the internal ToPrimitive function (which generally resolves each side to a number, occasionally a string, depending on the operands)
For a full explanation of ToBoolean vs ToPrimitive see http://javascriptweblog.wordpress.com/2011/02/07/truth-equality-and-javascript/
Consider a few values of x.
x = undefined;
if (!x) {
  alert("X is not a truthy value");
}
if (x == null) {
  alert("X is null");
}

x = "";
if (!x) {
  alert("X is not a truthy value");
}
if (x == null) {
  alert("X is null");
}

x = null;
if (!x) {
  alert("X is not a truthy value");
}
if (x == null) {
  alert("X is null");
}
You'll notice that "X is not a truthy value" is shown in all three cases, but only in the case of X being undefined or null is "X is null" shown.
When x is a boolean value, !x will be true when x is false, but x == null will not be. For numbers, 0 and NaN are considered falsy values, so !x is true for them.
See it in action, including the difference between == (equality using type conversion) and === (strict equality)
!x tests for a falsy value. It will be true for any value that coerces to false: false, 0, '', null, undefined, and NaN.
x == null is different, because var x = 0 will NOT be == null... but WILL be falsy.
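A minimal illustration of that difference:
var x = 0;
console.log( !x );        // true  - 0 is falsy
console.log( x == null ); // false - 0 is neither null nor undefined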

Why `null >= 0 && null <= 0` but not `null == 0`?

