I am working on understanding a JavaScript library and I came across this statement:
const assetsManifest = process.env.webpackAssets && JSON.parse(process.env.webpackAssets)
Then later on in the library, it uses the assetsManifest like an object, e.g.
assetsManifest['/vendor.js']
I thought the && operator was only used to return boolean values in logical checks. Can someone explain to me what is going on here?
Many thanks,
Clement
This operator doesn't always return true or false; it doesn't work like it does in some other programming languages. In JavaScript, the && operator returns the first operand if it is falsy, and the second operand otherwise.
Examples:
null && 5 returns null because null is falsy.
"yeah yeah" && 0 returns 0 because every string is truthy.
Not so obvious :-)
Further reading:
Why don't logical operators (&& and ||) always return a boolean result?
&& returns first value converting to false or last value converting to true. It's because no need to calculate full logical condition with && if first value is falsy
console.log(55 && 66); // 66
console.log(0 && 77);  // 0
console.log(88 && 0);  // 0
You can also use && or || as a shorthand for an if statement:
if (itsSunny) takeSunglasses();
is equivalent to
itsSunny && takeSunglasses();
In that context it is checking whether process.env.webpackAssets is a truthy value. If it is, it evaluates and returns the next part, in this case JSON.parse(process.env.webpackAssets).
The logic is essentially
if (process.env.webpackAssets) {
  return JSON.parse(process.env.webpackAssets)
}
else {
  return process.env.webpackAssets // (null | undefined | 0 | '' | false)
}
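In practice, if you want assetsManifest to always be an object so that lookups like assetsManifest['/vendor.js'] can never throw, one option is to fall back to an empty object. This is just a sketch of a defensive pattern, not what the library itself does:
const assetsManifest = process.env.webpackAssets
  ? JSON.parse(process.env.webpackAssets)
  : {}; // fall back to an empty object instead of keeping the falsy value
// now this is undefined rather than a TypeError when the env var is unset
const vendor = assetsManifest['/vendor.js'];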
Both && and || are evaluting there arguments in lazy mode and return the last value, after witch the result is known.
123 && (0 || '' && 78) || 7 && 8 || [] && {} || 90 || 77 && 13
Step by step (remember that && binds tighter than ||):
'' && 78 returns '' because '' is falsy.
0 || '' returns '' because 0 is falsy, so the right operand is returned.
123 && '' returns '' because 123 is truthy, so the right operand is returned.
'' || (7 && 8) evaluates 7 && 8, which returns 8, because '' is falsy.
8 || ... returns 8 immediately; everything after a truthy || operand is never evaluated.
And the result is 8.
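You can verify the whole thing at once:
console.log(123 && (0 || '' && 78) || 7 && 8 || [] && {} || 90 || 77 && 13); // 8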
Related
I had to write a routine that increments the value of a variable by 1 if its type is number and assigns 0 to the variable if not, where the variable is initially null or undefined.
The first implementation was v >= 0 ? v += 1 : v = 0 because I thought anything not a number would make an arithmetic expression false, but it was wrong since null >= 0 is evaluated to true. Then I learned null behaves like 0 and the following expressions are all evaluated to true.
null >= 0 && null <= 0
!(null < 0 || null > 0)
null + 1 === 1
1 / null === Infinity
Math.pow(42, null) === 1
Of course, null is not 0. null == 0 is evaluated to false. This makes the seemingly tautological expression (v >= 0 && v <= 0) === (v == 0) false.
Why is null like 0, although it is not actually 0?
Your real question seems to be:
Why:
null >= 0; // true
But:
null == 0; // false
What really happens is that the Greater-than-or-equal Operator (>=) performs type coercion (ToPrimitive) with a hint type of Number. In fact, all the relational operators have this behavior.
null is treated in a special way by the Equals Operator (==). In brief, it only loosely equals undefined (and itself):
null == null; // true
null == undefined; // true
Values such as false, '', '0', and [] are subject to numeric type coercion; all of them coerce to zero.
You can see the inner details of this process in The Abstract Equality Comparison Algorithm and The Abstract Relational Comparison Algorithm.
In Summary:
Relational Comparison: if both values are not type String, ToNumber is called on both. This is the same as adding a + in front, which for null coerces to 0.
Equality Comparison: only calls ToNumber on Strings, Numbers, and Booleans.
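To see the two algorithms side by side:
+null === 0;       // true: ToNumber(null) is 0
null >= 0;         // true: relational comparison applies ToNumber
null == 0;         // false: equality never coerces null to a number
null == undefined; // true: null loosely equals only undefined and itself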
I'd like to extend the question to further improve visibility of the problem:
null >= 0; //true
null <= 0; //true
null == 0; //false
null > 0; //false
null < 0; //false
It just makes no sense. Like human languages, these things need to be learned by heart.
JavaScript has both strict and type-converting comparisons.
null >= 0; is true
but
(null==0)||(null>0) is false
null <= 0; is true but (null==0)||(null<0) is false
"" >= 0 is also true
For relational abstract comparisons (<= , >=), the operands are first converted to primitives, then to the same type, before comparison.
typeof null returns "object".
When the type is object, JavaScript tries to convert the operand to a primitive (here, null). The following steps are taken (ECMAScript 2015):
1. If PreferredType was not passed, let hint be "default".
2. Else if PreferredType is hint String, let hint be "string".
3. Else PreferredType is hint Number, let hint be "number".
4. Let exoticToPrim be GetMethod(input, @@toPrimitive).
5. ReturnIfAbrupt(exoticToPrim).
6. If exoticToPrim is not undefined, then
   a) Let result be Call(exoticToPrim, input, «hint»).
   b) ReturnIfAbrupt(result).
   c) If Type(result) is not Object, return result.
   d) Throw a TypeError exception.
7. If hint is "default", let hint be "number".
8. Return OrdinaryToPrimitive(input, hint).
The allowed values for hint are "default", "number", and "string". Date objects are unique among built-in ECMAScript objects in that they treat "default" as being equivalent to "string".
All other built-in ECMAScript objects treat "default" as being equivalent to "number". (ECMAScript 20.3.4.45)
So I think null converts to 0.
console.log( null > 0 ); // (1) false
console.log( null == 0 ); // (2) false
console.log( null >= 0 ); // (3) true
Mathematically, that’s strange. The last result states that "null is greater than or equal to zero", so in one of the comparisons above it must be true, but they are both false.
The reason is that an equality check == and comparisons > < >= <= work differently. Comparisons convert null to a number, treating it as 0. That’s why (3) null >= 0 is true and (1) null > 0 is false.
On the other hand, the equality check == for undefined and null is defined such that, without any conversions, they equal each other and don’t equal anything else. That’s why (2) null == 0 is false.
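Practically, the routine from the question can sidestep all of this coercion by checking the type explicitly; a minimal sketch:
var v = null;
v = (typeof v === 'number') ? v + 1 : 0; // v is 0: null is not a number
v = (typeof v === 'number') ? v + 1 : 0; // v is 1: now it increments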
I had the same problem!
Currently my only solution is to separate the checks:
var a = null;
var b = undefined;
if (a===0||a>0){ } //return false !work!
if (b===0||b>0){ } //return false !work!
//but
if (a>=0){ } //return true !
It looks like x >= 0 is evaluated as !(x < 0); seen that way, the result makes sense.
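You can check that equivalence directly:
null < 0;    // false
!(null < 0); // true
null >= 0;   // true, the same result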
Why is the middle number the result of this statement?
(0 || 0.571428571428571 || 1) == 0.571428571428571
How does this comparison work?
Well, it may look like a stupid question, but for those who work in PHP, where this statement gives a different result, it could be helpful to know that JS returns a number here instead of a bool.
|| is a short-circuiting operator. It evaluates its left-hand operand and returns that if it's truthy; otherwise it evaluates and returns its right-hand operand. When you have a sequence of || operators, it performs this left-to-right, returning the first truthy value (or the last value if none are truthy).
In your example, 0 is falsey, so 0.571428571428571 is the first truthy value, and it's returned.
Your expression is interpreted as
((0 || 0.571428571428571) || 1)
The result of the first || is truthy, so that's the result of the overall expression. That's how || works: the result is the left-hand operand if it's truthy, or else the right-hand operand.
The answer to this lies in the fact that 0 is falsy while non-zero numbers are truthy.
|| and && short-circuit and return as soon as the outcome is known. For OR expressions this means the first truthy value is returned; for AND expressions it means the first falsy value is returned:
(true && true && 0 && true) === 0
(false && false && 1) === 1
(false || false || 1 || false) === 1
(false || false || 0) === 0
Since the first value in your expression is falsy, it is not returned; the floating-point number is truthy, so it is returned.
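This behavior is also what makes || useful as a default-value idiom. A small sketch, using a hypothetical config object:
var config = {};                // hypothetical object with no port property
var port = config.port || 3000; // config.port is undefined (falsy), so 3000 wins
console.log(port);              // 3000
Note that a legitimate falsy value such as 0 or '' would also be replaced, which is why modern code often prefers the nullish coalescing operator (??) for defaults.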
I've been reading Douglas Crockford's JavaScript: The Good Parts, and I came across this weird example that doesn't make sense to me:
'' == '0' // false
0 == '' // true
0 == '0' // true
false == undefined // false
false == null // false
null == undefined // true
The author also goes on to recommend "never use == and !=. Instead, always use === and !==". However, he doesn't explain why the above behavior is exhibited. So my question is: why are the above results as they are? Isn't transitivity considered in JavaScript?
'' == '0' // false
The left-hand side is an empty string and the right-hand side is a string with one character. Both operands are strings, so no coercion happens, and two non-identical strings compare as false (thanks Niall).
0 == '' // true
This one is true because when a number is compared with a string, the string is coerced to a number, and the empty string becomes 0.
0 == '0' // true
This one is a bit trickier. The spec states that if the operands are a string and a number, then coerce the string to number. '0' becomes 0. Thanks smfoote.
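Equivalently:
Number('0') === 0; // true, so 0 == '0' reduces to 0 == 0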
false == undefined // false
The value undefined is special in JavaScript and is not equal to anything else except null. However, it is falsy.
false == null // false
Again, null is special. It is only equal to undefined. It is also falsy.
null == undefined // true
null and undefined are similar, but not the same. null means nothing, whilst undefined is the value for a variable not set or not existing. It would kind of make sense that their values would be considered equal.
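A quick way to see both the similarity and the difference:
null == undefined;  // true: loose equality treats them as equal
null === undefined; // false: strict equality distinguishes them
typeof null;        // "object"
typeof undefined;   // "undefined"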
If you want to be really confused, check this...
'\n\r\t' == 0
A string consisting only of whitespace is considered equal to 0.
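That happens because ToNumber trims whitespace from a string before converting it, and an empty string converts to 0:
Number('\n\r\t'); // 0
'\n\r\t' == 0;    // true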
Douglas Crockford makes a lot of recommendations, but you don't have to take them as gospel. :)
T.J. Crowder makes an excellent suggestion of studying the ECMAScript Language Specification to know the whole story behind these equality tests.
Further reading:
The spec.
yolpo (on falsy values)
The answer to this question has to do with how JavaScript handles coercion. In the case of ==, strings are coerced to be numbers. Therefore:
'' == '0' is equivalent to '' === '0' (both are strings, so no coercion is necessary).
0 == '' is equivalent to 0 === 0 because the string '' becomes the number 0 (Number('') === 0).
0 == '0' is equivalent to 0 === 0 for the same reason.
false == undefined is equivalent to 0 === undefined because JavaScript coerces booleans to numbers when the types don't match.
false == null is equivalent to 0 === null for the same reason.
null == undefined is true because the spec says so.
Thanks for asking this question. My understanding of == is much better for having researched it.
You can actually write a JavaScript function that behaves exactly like ==, which should give you some insight into how == works.
To show you what I mean here is that function:
// loseEqual() behaves just like `==`
function loseEqual(x, y) {
    // notice the function only uses "strict" operators
    // like `===` and `!==` to do comparisons

    if (typeof y === typeof x) return y === x;

    if (typeof y === "function" || typeof x === "function") return false;

    // treat null and undefined the same
    var xIsNothing = (x === undefined) || (x === null);
    var yIsNothing = (y === undefined) || (y === null);

    if (xIsNothing || yIsNothing) return (xIsNothing && yIsNothing);

    if (typeof x === "object") x = toPrimitive(x);
    if (typeof y === "object") y = toPrimitive(y);

    if (typeof y === typeof x) return y === x;

    // convert x and y into numbers if they are not already, using the "+" trick
    if (typeof x !== "number") x = +x;
    if (typeof y !== "number") y = +y;

    return x === y;
}

function toPrimitive(obj) {
    var value = obj.valueOf();
    if (obj !== value) return value;
    return obj.toString();
}
As you can see == has a lot of complicated logic for type conversion. Because of that it's hard to predict what result you are going to get.
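You can spot-check it against the real operator:
console.log(loseEqual('', 0), '' == 0);                     // true true
console.log(loseEqual('', '0'), '' == '0');                 // false false
console.log(loseEqual(null, undefined), null == undefined); // true true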
Here are some results you might not expect:
Unexpected Truths
[1] == true // returns true
'0' == false // returns true
[] == false // returns true
[[]] == false // returns true
[0] == false // returns true
'\r\n\t' == 0 // returns true
Unexpected Conclusions
// IF an empty string '' is equal to the number zero (0)
'' == 0 // return true
// AND the string zero '0' is equal to the number zero (0)
'0' == 0 // return true
// THEN an empty string must be equal to the string zero '0'
'' == '0' // returns **FALSE**
Objects with Special Functions
// Below are examples of objects that
// implement `valueOf()` and `toString()`
var objTest = {
toString: function() {
return "test";
}
};
var obj100 = {
valueOf: function() {
return 100;
}
};
var objTest100 = {
toString: function() {
return "test";
},
valueOf: function() {
return 100;
}
};
objTest == "test" // returns true
obj100 == 100 // returns true
objTest100 == 100 // returns true
objTest100 == "test" // returns **FALSE**