Equality comparison between Date and number doesn't work - javascript

According to the ECMAScript standard, the following code should return true, but it doesn't:
d = new Date();
d.setTime(1436497200000);
alert( d == 1436497200000 );
Section 11.9.3 says:
If Type(x) is either String or Number and Type(y) is Object, return the result of the comparison x == ToPrimitive(y).
Then, section 8.12.8 says that ToPrimitive returns the result of the valueOf method, which means that the last line in my example above should be equivalent to:
alert( d.valueOf() == 1436497200000 );
Which does return true, indeed.
Why does the first case not return true?

If you look at the spec at section 8.12.8, you will find this text near the end of the section:
When the [[DefaultValue]] internal method of O is called with no hint, then it behaves as if the hint were Number, unless O is a Date object (see 15.9.6), in which case it behaves as if the hint were String.
(Emphasis mine)
Now, in steps 8 / 9 of The Abstract Equality Comparison Algorithm [11.9.3], ToPrimitive(x) and ToPrimitive(y) are called without the hint parameter.
The lack of this hint parameter, together with the text above, means that for Date objects the ToPrimitive method returns the toString() value.
As you're probably aware, (new Date()).toString() returns a string representation of the date in American English:
"Wed Jul 01 2015 22:08:41 GMT+0200 (W. Europe Daylight Time)"
That a string like that doesn't equal 1436497200000 shouldn't come as a big surprise. ;-)

ToPrimitive(A) attempts to convert its object argument to a primitive value, by attempting to invoke varying sequences of A.toString and A.valueOf methods on A.
So if the toString() call succeeds, it won't call valueOf().

Related

How does a JS Date object perform arithmetic (+, -, *, /)?

d = new Date();
d; // it is a string: Fri Apr 23 2021 10:30:34 GMT+0800(..)
d - 0; // it is a number: 1619145034656
d + 0; // it is a string: Fri Apr 23 2021 10:30:34 GMT+0800(..)0
d * 2; // it is a number: 3238290069312
d / 2; // it is a number: 809572517328
d.valueOf(); // it is a number: 1619145034656
d.toString(); // it is a string "Fri Apr 23 2021 10:30:34 GMT+0800 (...)"
I can accept all of the arithmetic results except addition (+). I know '+' can be used to concatenate two strings, but d.valueOf() is a number.
How does a Date object know when to use a string as its value and when to use a number?
When operators like these are used, the specification runs a particular sequence of operations. If the operator used is +:
2. If opText is +, then
a. Let lprim be ? ToPrimitive(lval).
b. Let rprim be ? ToPrimitive(rval).
And calling ToPrimitive on a Date results in the string representation of the Date being returned.
In contrast, all of the other operators like - * / result in the other branch of the specification running:
3. NOTE: At this point, it must be a numeric operation.
4. Let lnum be ? ToNumeric(lval).
5. Let rnum be ? ToNumeric(rval).
...
9. Return ? operation(lnum, rnum).
The main difference is that + can be used to either add or concatenate. When either side is an object, it will cast the objects to strings, then concatenate. When any other operator is used, it will cast the objects to numbers, then perform the appropriate arithmetic operation.
When ToPrimitive is called on a Date, the Date's @@toPrimitive method runs with no hint, which does:
If hint is string or default, @@toPrimitive tries to call the toString method. If the toString property does not exist, it tries to call the valueOf method, and if valueOf does not exist either, @@toPrimitive throws a TypeError.
As you can see, when called with no hint, the Date is cast to a string.

Why does JavaScript Date constructor fail on this number but works fine as a method

I'm honestly not sure how to phrase this question. Basically open a JavaScript console (node, your browser or wherever) and try this:
Date(564018060878018050) // 'Fri Nov 23 2018 06:22:20 GMT-0800 (Pacific Standard Time)'
new Date(564018060878018050) // <-- Invalid Date
I have no idea why the first one works and the second one doesn't. Is there another way to parse this number into a date? I'm trying to stay away from using a library for this.
The specs says that:
The actual range of times supported by ECMAScript Date objects is
[...] exactly –100,000,000 days to 100,000,000 days
measured relative to midnight at the beginning of 01 January, 1970
UTC. This gives a range of 8,640,000,000,000,000 milliseconds to
either side of 01 January, 1970 UTC.
The valid range is much smaller than the value you used (564,018,060,878,018,050).
And deep inside the Date(value) constructor we have:
If abs(time) > 8.64 × 10^15, return NaN.
This explains why new Date(564018060878018050) yields invalid date.
As for Date(564018060878018050) the specs say that:
... Invoking a constructor without using new has consequences that
depend on the constructor. For example, Date() produces a string
representation of the current date and time rather than an object.
So Date(value) is supposed to return current date as a string and not a date.
> Date(564018060878018050) === (new Date()).toString()
< true
> typeof Date(564018060878018050)
< "string"
You are calling Date as a function rather than as a constructor, and as the ECMAScript doc says:
"When Date is called as a function rather than as a constructor, it returns a String representing the current time (UTC)."
"NOTE The function call Date(…) is not equivalent to the object creation expression new Date(…) with the same arguments."
You can find more details here: https://www.ecma-international.org/ecma-262/5.1/#sec-15.9.2
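You can check the boundary yourself; 8.64e15 milliseconds is the documented limit:

```javascript
console.log(new Date(8.64e15).toString());     // the latest representable instant
console.log(new Date(8.64e15 + 1).toString()); // "Invalid Date": one ms past the limit

// The value from the question is far outside the range:
console.log(Number.isNaN(new Date(564018060878018050).getTime())); // true
```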

Why can I do math with Date objects? [duplicate]

When I subtract two Date-objects like this:
const startTime = new Date();
await someAsyncStuff();
const endTime = new Date();
const elapsedTime = endTime - startTime;
console.log(`The async stuff took ${elapsedTime} ms`);
Why do the Date objects end up being cast to milliseconds, which are then subtracted? I understand that they do, but I can't figure out the actual sequence of events that leads to this.
In general, JavaScript objects can define methods to convert the object to a String or a number (which you can customize by defining toString and valueOf methods). JavaScript will use those methods in numerical contexts (like 2 * a) or string contexts (like '' + a) to convert the object to the appropriate primitive.
In a context where it's ambiguous whether to use numerical or string conversion (like a + b), there's a default behavior, depending on the type of the object. Interestingly, Dates are singled out among the built-in ECMAScript objects to convert to a String instead of a number. Via the spec:
Date objects, are unique among built-in ECMAScript object in that they
treat "default" as being equivalent to "string", All other built-in
ECMAScript objects treat "default" as being equivalent to "number".
In the particular case of Date objects, the numerical conversion (the valueOf method) converts the time to epoch milliseconds, while the string conversion (the toString method) converts the object to a human-readable string. As @baao mentions in his answer, this can cause some issues when doing "arithmetic" with objects, due to automatic type conversion.
In summary, Date (unlike most other objects) defaults to string conversion, but since subtraction requires two numbers for it to make sense, it converts the dates to numbers.
It's generally a good idea to explicitly define the behavior, in this case using valueOf, getTime or toString to make the code less ambiguous.
For more information on how JavaScript chooses between toString and valueOf, see this question, the overall spec for addition (and subtraction), the specific spec for Dates (MDN link), and @baao's answer for a more in-depth look.
See Also:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/valueOf
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/valueOf
It's how JavaScript does automatic type conversion, just like in the arithmetic operations you are performing here. You were lucky that you subtracted; if you had added them, you'd have ended up with a string holding two date strings in a row, because of how toPrimitive (which gets called implicitly) works for Dates. Consider the following:
// automatic casting
console.log("1" - "1"); // 0
// but
console.log("1" + "1"); // 11
// now with dates
// automatic casting
console.log(new Date() - new Date()); // 0
console.log(new Date() + new Date()); // Mon Jun 11 2018 10:10:36 GMT+0200 (Mitteleuropäische Sommerzeit)Mon Jun 11 2018 10:10:36 GMT+0200 (Mitteleuropäische Sommerzeit)
The specification on the additional operator has the following hint that explains this further
All native ECMAScript objects except Date objects handle the absence of a hint as if the hint Number were given; Date objects handle the absence of a hint as if the hint String were given.
JavaScript converts your value to a primitive when you use arithmetic operators; the method that gets called here is
Date.prototype [ @@toPrimitive ] ( hint )
This function is called by ECMAScript language operators to convert a Date object to a primitive value. The allowed values for hint are "default", "number", and "string". Date objects, are unique among built-in ECMAScript object in that they treat "default" as being equivalent to "string", All other built-in ECMAScript objects treat "default" as being equivalent to "number".
That said, the reason your code works the way it does is the numeric coercion performed later by the subtraction, which passes the hint "number" to @@toPrimitive, so it returns a number.
It equals endTime.getTime() - startTime.getTime().
As you say, they are cast to milliseconds, and the subtraction then yields exactly the difference.
This happens because JS asks for the primitive of the Date object when performing such an operation, through the valueOf method. The Date object overrides valueOf, so the value used is essentially the same as the one getTime returns.
You can also try by yourself:
const o = { valueOf: () => 10 };
console.log(o + 1) // 11

Determine if string is Date or Number

I'm trying to determine if a string is a number or a date.
Here is my code:
this._getFieldFormat = (value) => {
  // check if it is a date
  let d = new Date(value);
  if (!isNaN(d.getTime())) {
    return 'date';
  }
  // check if it is a boolean
  // isNaN(false) === false; false is a number (0), true is a number (1)
  if (typeof value === 'boolean') {
    return 'boolean';
  }
  // check if the string is a number
  if (!isNaN(value)) {
    return 'number';
  }
  return typeof value;
};
It works for a date like: 2016-04-19T23:09:10.208092Z.
The problem is that 1 looks like a valid date (Wed Dec 31 1969 16:00:00 GMT-0800 (PST)), and isNaN(new Date()) returns false (a date is a number).
Any idea on how to get out of this loop?
So what is happening is called coercion. Since JavaScript is dynamically typed, when you give it two different types the JS engine tries to coerce one of them into something reasonable, i.e. what it thinks you meant.
For instance:
isNaN("37") is false, because "37" is converted to the number 37.
isNaN(new Date()) returns false as well:
it converted the Date to a number, so this is correct.
However, invalid values in date strings not recognized as ISO format as defined by ECMA-262 may or may not result in NaN, depending on the browser and values provided
So
new Date('23/25/2014'); // NON-ISO string with invalid date values
So this will return NaN in all browsers that comply with ES5 and later.
Also, note that Number.isNaN is stricter than the global isNaN: it does not coerce its argument, so Number.isNaN(new Date()) returns false (a Date object is not the NaN value). To detect an invalid date, check the time value instead:
Number.isNaN(new Date('23/25/2014').getTime()); // true
So, to recap: make sure the date conforms to the ISO standard or it will be NaN, and use the stricter check on getTime(). Hope this helps.
In general, and from a JavaScript design point of view, I don't think you can do it by design. Any number between -8640000000000000 and 8640000000000000 can be converted to a date (represented as time in milliseconds from 01 January 1970 UTC).
Therefore, any number not falling in this range cannot be a date, while any number falling in the range could be a valid date or a number, and you have to use context to determine whether to interpret it as a number or a date.
You could however do a naive implementation to test if number is a valid date or not, but that's the best you can do, without context.
Determining whether a string is a number or not can be a bit easier: isNaN() returns true for a human-readable date representation, which tells you it's definitely not a number. You would then go on to check whether that string is a date or not, which you already did in your function.

Why can a date object be subtracted from another date object in JavaScript?

Consider the following code:
var a = new Date(someDateString);
var b = new Date(someOtherDateString);
console.log(b - a); // Outputs for example 3572, number of millisecs between the dates
Why does this work? This is an arithmetic operation on two objects. It looks suspiciously like operator overloading, known from C++ and other languages, but as far as I know, JavaScript won't get that before ECMAScript 7.
One would think the JS engine would turn it into something like
console.log(b.toString() - a.toString());
but this prints "NaN", as toString on a Date object returns a string in the format
Mon Mar 23 2015 13:21:33 GMT+0100 (CET)
So, what magic makes this arithmetic possible? Can it be implemented on custom objects?
It converts them through .valueOf() not .toString(). Internally a date is stored as a number. The toString() method just formats it using a date formatting routine.
See The Subtraction Operator in the spec.
valueOf is called to fetch the numeric primitive of the date:
function o(i) {
  this.valueOf = function() { return i; };
}
var a = new o(100);
var b = new o(42);
alert(a - b); // 58
A JS Date Object is just "a number of milliseconds".
"Zero time" starts at 01 January 1970 00:00:00 UTC. Everything else is a simple conversion, and adding/subtracting become very easy!
