How come this is true? [duplicate] - javascript

How come this is true?:
console.log(100 < 210 < 200); // outputs true

That's equivalent to:
console.log((100 < 210) < 200);
which is equivalent to:
console.log(true < 200);
This evaluates to true because, with relational operators like <, true is treated as the number 1.
Consequently, the following will evaluate to false:
console.log(true < 0)

100 < 210 < 200 is equivalent to (100 < 210) < 200 which is true < 200 which is 1 < 200 which is true.
That last bit (true becoming 1) may be a bit surprising. It's the result of how JavaScript does relational operations (Section 11.8.5 of the spec), which says amongst other things that if the relation is between a non-number (other than null or undefined) and a number, you convert the non-number to a number, and the conversion of true to a number results in 1 (Section 9.3 of the spec.)
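That ToNumber conversion is easy to observe directly in the console; Number() applies the same conversion the relational operators use:

```javascript
// ToNumber conversion of booleans, per Section 9.3 of the spec
console.log(Number(true));  // 1
console.log(Number(false)); // 0

// So these two comparisons are equivalent:
console.log(true < 200); // true
console.log(1 < 200);    // true
```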

Related

Why is 1<x<3 always true?

I'm learning TypeScript, and on the getting-started page they talk about how unexpected JavaScript can be.
Source: https://www.typescriptlang.org/docs/handbook/typescript-from-scratch.html
if ("" == 0) {
// It is! But why??
}
if (1 < x < 3) {
// True for *any* value of x!
}
But I still don't understand why 1 < x < 3 is always true. For example, if I let x = 10, it should not be true logically, so why do they say it is always true?
1 < x < 3 actually is doing this:
(1 < x) < 3
Or even more long form:
const tempVarA = 1 < x
const tempVarB = tempVarA < 3
So 1 < x is either true or false. The next step is then true < 3 or false < 3. Those don't make much sense as comparisons, but let's see what JavaScript does with them:
console.log(true < 3) // true
console.log(false < 3) // true
Weird, but let's dig deeper:
console.log(true >= 0) // true
console.log(true >= 1) // true
console.log(true >= 2) // false
console.log(false >= 0) // true
console.log(false >= 1) // false
console.log(false >= 2) // false
It seems that true is being treated as 1 and false as 0. To verify that, let's compare with == (instead of ===) so that it coerces the types for us.
console.log(true == 1) // true
console.log(true == 0) // false
console.log(false == 1) // false
console.log(false == 0) // true
So 1 < x < 3 is always true because false becomes 0 and true becomes 1, and both 0 and 1 are always less than 3.
Explanation:
In JavaScript, the comparison operators <, <=, >, >=, ==, and != coerce their operands to make them comparable when they are of different types. So when comparing a boolean to a number, it converts the boolean to a number: 0 or 1.
This is why you should almost always use === instead of ==, and why this is a type error in typescript:
const a = true < 3
// Operator '<' cannot be applied to types 'boolean' and 'number'.(2365)
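The same coercion is behind the "" == 0 example from the handbook quoted earlier: loose equality converts the string to a number before comparing. A quick sketch:

```javascript
// Number("") is 0, so loose equality sees 0 == 0
console.log(Number("")); // 0
console.log("" == 0);    // true  (coerces, then compares)
console.log("" === 0);   // false (different types, no coercion)
```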
Short version
JavaScript and TypeScript lack a chainable comparison operator.
Did you mean to do this?
1 < x && x < 3
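If you need that check in more than one place, you could wrap it in a small helper; inRange is a hypothetical name here, not a built-in:

```javascript
// Hypothetical helper: is x strictly between min and max?
function inRange(x, min, max) {
  return min < x && x < max;
}

console.log(inRange(2, 1, 3));  // true
console.log(inRange(10, 1, 3)); // false
```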
In other words: true is 1 and false is 0. So 1 < x < 3 with x = 5 executes as (1 < 5) < 3: first 1 < 5 evaluates to true (1), then 1 < 3 evaluates to true (1). What if x were 0 instead? Then 1 < 0 is false (0), and consequently 0 < 3 is still true (1).

Are multiple comparison operators possible in javascript? [duplicate]

So I was typing in the browser console:
4 < 5
true
4 < 5 < 10
true
7 < 4 < 10
true
The first two statements look OK, but why does the last statement also evaluate to true? I would have thought (as in mathematics) the expression would be 'AND'ed. Can anyone point me to MDN or a similar resource that explains this and the rules?
What JavaScript does is evaluate the left side first:
4 < 5 => true
It then continues on the rest:
true < 10 => true
because true on the left side is coerced to a number (1), and 1 < 10 === true. (false in comparisons with numbers is coerced to 0).
Check these funny-looking "goofs":
console.log(3 < 2 < 1)
console.log(1 < 1 < 1)
Instead of the non-working approach of chaining more than one comparison operator (where each subsequent comparison operates on the result of the previous one), you could take an array and a comparison function, iterate over every adjacent pair of items, and return the result.
const compare = (values, fn) => values.slice(1).every((b, i) => fn(values[i], b));
console.log(compare([7, 4, 10], (a, b) => a < b)); // false
console.log(compare([4, 7, 10], (a, b) => a < b)); // true
console.log(compare([2, 1, 0], (a, b) => a > b)); // true
console.log( false < 10) // true
console.log( true < 10) // true
because false (0) is less than 10,
and true (1) is also less than 10.
4 < 5 evaluates to true. Then true is compared with 10. JavaScript converts true to 1, and since 1 is smaller than 10, the result is true.
By the same logic, 4 < 5 < 3 will also return true, while 4 < 5 < 1 will return false.
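Written out with an explicit &&, as the mathematical notation implies, the results match intuition. A quick sketch contrasting the two forms:

```javascript
// Chained form: (7 < 4) is false, which coerces to 0, and 0 < 10 is true
console.log(7 < 4 < 10);      // true

// Explicit form: what the mathematical notation actually means
console.log(7 < 4 && 4 < 10); // false
console.log(4 < 5 && 5 < 10); // true
```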

Why does the statement `0 < 0.5 < 1` reduce to false?

I recently discovered the ECMAScript Spec, which I was able to use to answer a question I've had about Math.random() for a while; namely whether or not it would ever generate 0 without the help of Math.floor().
The ECMAScript Spec specifies that 0 <= Math.random() < 1, unlike literally anyone else for some reason. I hopped over to the console and ran a test before saving it in my notes but noticed that statement reduces to false.
Below is a function that tests everything about comparison statements that I thought might be causing this lie. I call the function twice to generate two arrays of boolean values, and the results seem to imply that the statement 0 <= Math.random() < 1, and this statement alone, returns FALSE where it should return TRUE. Especially when you consider the bonus round, where I test the exact same statement with an additional comparison tacked onto the end, and it also returns true.
function getTruths( a, b, c ) {
  return [
    a + 1 < b + 1,
    a + 1 < c + 1,
    b + 1 < c + 1,
    a + 1 < b + 1 < c + 1,
    a + 0 < b + 0,
    a + 0 < c + 0,
    b + 0 < c + 0,
    a + 0 < b + 0 < c + 0
  ];
}

function demonstrate() {
  // array of truth
  console.log( getTruths( 0, 1, 2 ) );
  // array of lies
  console.log( getTruths( 0, 0.5, 1 ) );
  // bonus round
  return [ 0 < 0.5 < 1 < 1.5 ];
}

demonstrate();
So I did some more plucking around and learned that it isn't just that: it seems that a and b can actually be any values lower than one and equal to or greater than zero, just as long as b is still bigger than a, and c is still equal to 1, of course... and given those parameters, no matter what, the return is still FALSE. If you add 1 to everything, though, suddenly you're in good shape again, just as in the function provided above.
Anyway can someone explain this to me? Thanks.
a < b < c
is interpreted as
(a < b) < c
The result value from a relational operator like < is a boolean, either true or false. If c is a number, then, that boolean will be converted to a number, either 1 (true) or 0 (false). Thus c is compared to either 0 or 1, not the values of either a or b.
The proper way to write "is b strictly between a and c" is
a < b && b < c
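Applied to the Math.random() question that started this, the guarantee 0 <= Math.random() < 1 has to be tested with && as well. A sketch, sampling once so both comparisons see the same value:

```javascript
// Sample once so both comparisons test the same number
const r = Math.random();

console.log(0 <= r && r < 1); // true for every value Math.random() can return

// The chained form compares (0 <= r), coerced to 1, against 1: always false
console.log(0 <= r < 1);      // false
```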

Javascript chained inequality gives bizarre results [duplicate]

(0 <= 0 <= 0) === false
(-1 < 0 <= 0 <= 0) === true
What's going on here? Does JavaScript actually have inequality chaining that's just wrong in some cases?
I typed up the question and then was struck by the answer: JavaScript does not have inequality chaining. Rather, 0 <= 0 <= 0 becomes true <= 0, which is evaluated as 1 <= 0. Indeed, 0 < 0 <= 0 evaluates to true.
There is no chaining of operators, only precedence and associativity. Here all the operators have the same precedence, so the operations are performed from left to right.
When a comparison involves a boolean, MDN explains how it works:
If one of the operands is Boolean, the Boolean operand is converted to
1 if it is true and +0 if it is false.
This means the first operation is decomposed according to priorities as
((0 <= 0) <= 0)
which is
true <= 0, that is 1 <= 0
which is
false
And the second one is
(true <= 0) <= 0
which is
false <= 0
which is true.

Javascript expression bug (0 <= 14 < 10)?

How can this be true?
0 <= 14 < 10
I need to evaluate that a number is in between 0 and 10.
But this breaks my logic.
Shouldn't that expression be false?
This expression:
0 <= 14 < 10
is the same as
(0 <= 14) < 10
which is
true < 10
which is
1 < 10
which is true.
What you can do instead is:
if (0 <= x && x < 10) { ...
Python is the only programming language I can think of right now where the expression x < y < z does what you expect it to do.
It's not a bug. It's the way the expression is parsed.
It goes left to right:
0 <= 14 < 10
true < 10
1 < 10
true
As you can see, 0 <= 14 is true, so that's what it's replaced with. true is equal to 1 in JavaScript, and most other C-derived languages.
What you should use is:
(0 <= 14) && (14 < 10)
0 <= 14 < 10
(0 <= 14) < 10
true < 10
1 < 10
true
This expression is not evaluating the way you expect it to. It's equivalent to
((0 <= 14) < 10)
The first comparison evaluates to true, so you then have true < 10, which evaluates to true in JavaScript.
No, that is correct: 0 <= 14 is true (coerced to 1), and 1 < 10 is true.
do this instead:
if (variable >= 0 && variable < 10) {
//do something
}
Also check out: https://developer.mozilla.org/en/JavaScript/Reference/Operators/Operator_Precedence
