d = new Date();
d; // it is a string: Fri Apr 23 2021 10:30:34 GMT+0800(..)
d - 0; // it is a number: 1619145034656
d + 0; // it is a string: Fri Apr 23 2021 10:30:34 GMT+0800(..)0
d * 2; // it is a number: 3238290069312
d / 2; // it is a number: 809572517328
d.valueOf(); // it is a number: 1619145034656
d.toString(); // it is a string "Fri Apr 23 2021 10:30:34 GMT+0800 (...)"
I could accept this for all the arithmetic operators except +. I know + can also concatenate two strings, but d.valueOf() is a number.
How does a Date object know when to use a string as its value and when to use a number?
When operators like these are used, the specification runs a particular sequence of operations. If the operator used is +:
2. If opText is +, then
a. Let lprim be ? ToPrimitive(lval).
b. Let rprim be ? ToPrimitive(rval).
And calling ToPrimitive on a Date results in the string representation of the Date being returned.
In contrast, all of the other operators like - * / result in the other branch of the specification running:
3. NOTE: At this point, it must be a numeric operation.
4. Let lnum be ? ToNumeric(lval).
5. Let rnum be ? ToNumeric(rval).
...
9. Return ? operation(lnum, rnum).
The main difference is that + can be used to either add or concatenate. When either operand is a Date, ToPrimitive (called with no hint) produces its string representation, so + concatenates. When any other arithmetic operator is used, the operands are cast to numbers and the appropriate arithmetic operation is performed.
When ToPrimitive is called on a Date, the Date's @@toPrimitive method runs with no hint, which does:
If hint is string or default, @@toPrimitive tries to call the toString method. If the toString property does not exist, it tries to call the valueOf method, and if valueOf does not exist either, @@toPrimitive throws a TypeError.
As you can see, when called with no hint, the Date is cast to a string.
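To see this ordering in action, here is a small sketch of a made-up object that imitates Date's @@toPrimitive behavior (the object and the timestamp value are invented for illustration):

```javascript
// A made-up object that mimics Date's @@toPrimitive: it treats the
// "default" hint (the one + passes) the same as "string".
const dateLike = {
  [Symbol.toPrimitive](hint) {
    if (hint === "number") return 1619145034656;   // like valueOf()
    return "Fri Apr 23 2021 10:30:34 GMT+0800";    // "string" and "default"
  }
};

console.log(dateLike + 0); // string concatenation, like d + 0
console.log(dateLike - 0); // 1619145034656, like d - 0
```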
Related
Let's assume I have a proper Date object constructed from the string: "Tue Jan 12 21:33:28 +0000 2010".
var dateString = "Tue Jan 12 21:33:28 +0000 2010";
var twitterDate = new Date(dateString);
Then I use the < and > less than and greater than comparison operators to see if it's more or less recent than a similarly constructed Date. Is the algorithm for comparing dates using those operators specified, or is it specifically unspecified, like localeCompare? In other words, am I guaranteed to get a more recent date, this way?
var now = new Date();
if (now < twitterDate) {
// the date is in the future
}
Relational operations on objects in ECMAScript rely on the internal ToPrimitive function (with hint number), which you can access, when it is defined, through valueOf.
Try
var val = new Date().valueOf();
You'll get the internal value of the date which is, as in many languages, the number of milliseconds since midnight Jan 1, 1970 UTC (the same that you would get using getTime()).
This means that, by design, the date comparison is guaranteed to work correctly.
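For example (the dates here are arbitrary), the relational operator and the explicit valueOf() comparison always agree:

```javascript
const twitterDate = new Date("2010-01-12T21:33:28Z"); // arbitrary past date
const now = new Date();

console.log(now < twitterDate);                     // false: 2010 is in the past
console.log(now.valueOf() < twitterDate.valueOf()); // same result, made explicit
```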
This article will give you more details about toPrimitive (but nothing relative to comparison).
Date values in JavaScript are represented internally as numbers, as stated in the ECMAScript specification, so Dates are compared as numbers.
This is a demo of your code (I set twitterDate in the future).
(function(){
var dateString = "Tue Jan 12 21:33:28 +0000 2014";
var twitterDate = new Date(dateString);
var now = new Date();
if (now < twitterDate) {
document.write('twitterDate is in the future');
}
else
{
document.write('twitterDate is NOT in the future');
}
})()
I think yes. if (now < twitterDate) effectively evaluates as if (now.valueOf() < twitterDate.valueOf()). valueOf() delivers the number of milliseconds passed since 01/01/1970 00:00:00, so the comparison of those two numbers is valid.
check it like this
var then = new Date("Tue Jan 12 21:33:28 +0000 2010")
,now = new Date;
console.log(then.valueOf(),'::',now.valueOf(),'::',now<then);
//=> 1263332008000 :: 1352365105901 :: false
This question already has answers here:
Integer addition/subtraction issues with the Date object
(2 answers)
Closed 4 years ago.
When I subtract two Date-objects like this:
const startTime = new Date();
await someAsyncStuff();
const endTime = new Date();
const elapsedTime = endTime - startTime;
console.log(`The async stuff took ${elapsedTime} ms`);
Why do the Date objects end up being cast to milliseconds, which are then subtracted? I understand that they do, but I can't figure out the actual sequence of events that leads to this.
In general, JavaScript objects can define methods to convert the object to a String or a number (which you can customize by defining toString and valueOf methods). JavaScript will use those methods in numerical contexts (like 2 * a) or string contexts (like '' + a) to convert the object to the appropriate primitive.
In a context where it's ambiguous whether to use numerical or string conversion (like a + b), there's a default behavior that depends on the type of the object. Interestingly, Dates are singled out among the built-in ECMAScript objects to convert to a string, instead of a number. Via the spec:
Date objects, are unique among built-in ECMAScript object in that they
treat "default" as being equivalent to "string", All other built-in
ECMAScript objects treat "default" as being equivalent to "number".
In the particular case of Date objects, the numerical conversion (the valueOf method) converts the time to epoch milliseconds, while the string conversion (the toString method) converts the object to a human-readable string. As @baao mentions in his answer, this can cause some issues when doing "arithmetic" with objects, due to automatic type conversion.
In summary, Date (unlike most other objects) defaults to string conversion, but since subtraction requires two numbers for it to make sense, it converts the dates to numbers.
It's generally a good idea to explicitly define the behavior, in this case using valueOf, getTime or toString to make the code less ambiguous.
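For instance, a sketch of the snippet from the question with the coercion made explicit (the async work is replaced with a placeholder comment):

```javascript
// The same measurement with the conversion made explicit via getTime():
const startTime = new Date();
// ... do some work here ...
const endTime = new Date();

const elapsedTime = endTime.getTime() - startTime.getTime(); // plain number math
console.log(`The stuff took ${elapsedTime} ms`);
```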
For more information on whether JavaScript chooses toString vs valueOf, see this question, the overall spec for addition (and subtraction), the specific spec for Dates (mdn link), and @baao's answer for a more in-depth look.
See Also:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/valueOf
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/valueOf
It's how JavaScript does automatic type conversion for arithmetic operations (which is what you are performing here). You were lucky that you subtracted; if you had added them, you'd have ended up with a string holding two date strings in a row, because of how toPrimitive (which gets called implicitly) works for Dates. Consider the following:
// automatic casting
console.log("1" - "1"); // 0
// but
console.log("1" + "1"); // 11
// now with dates
// automatic casting
console.log(new Date() - new Date()); // 0
console.log(new Date() + new Date()); // Mon Jun 11 2018 10:10:36 GMT+0200 (Mitteleuropäische Sommerzeit)Mon Jun 11 2018 10:10:36 GMT+0200 (Mitteleuropäische Sommerzeit)
The specification for the addition operator has the following note that explains this further:
All native ECMAScript objects except Date objects handle the absence of a hint as if the hint Number were given; Date objects handle the absence of a hint as if the hint String were given.
JavaScript converts your value to a primitive when using arithmetic operators; the method that gets called here is
Date.prototype [ @@toPrimitive ] ( hint )
This function is called by ECMAScript language operators to convert a Date object to a primitive value. The allowed values for hint are "default", "number", and "string". Date objects, are unique among built-in ECMAScript object in that they treat "default" as being equivalent to "string", All other built-in ECMAScript objects treat "default" as being equivalent to "number".
That said, the reason your code works the way it does is the subsequent conversion performed by the subtraction, which calls ToPrimitive with hint "number", so a number is returned.
It equals endTime.getTime() - startTime.getTime()
As you say, they are cast to milliseconds, and the subtraction gives exactly the difference.
This happens because JS asks for the primitive value of the Date object when performing such an operation, through the valueOf method. The Date object overrides valueOf so that the value used is essentially the same as the one returned by getTime.
You can also try by yourself:
const o = { valueOf: () => 10 };
console.log(o + 1) // 11
According to the ECMA script standard, the following code should return true, but it doesn't:
d = new Date() ;
d.setTime(1436497200000) ;
alert( d == 1436497200000 ) ;
Section 11.9.3 says:
If Type(x) is either String or Number and Type(y) is Object, return the result of the comparison x == ToPrimitive(y).
Then, section 8.12.8 says that ToPrimitive returns the result of the valueOf method, which means that the last line in my example above should be equivalent to:
alert( d.valueOf() == 1436497200000 );
Which does return true, indeed.
Why does the first case not return true?
If you look at the spec at section 8.12.8, you will find this text near the end of the section:
When the [[DefaultValue]] internal method of O is called with no hint, then it behaves as if the hint were Number, unless O is a Date object (see 15.9.6), in which case it behaves as if the hint were String.
(Emphasis mine)
Now, in step 8 / 9 of the The Abstract Equality Comparison Algorithm [11.9.3], ToPrimitive(x) and ToPrimitive(y) are called without hint parameter.
The lack of this hint parameter, together with the text above, means that the ToPrimitive method returns the toString() value on Date objects.
As you're probably aware, (new Date()).toString() returns a string representation of the date in American English [source]:
"Wed Jul 01 2015 22:08:41 GMT+0200 (W. Europe Daylight Time)"
That a string like that doesn't equal 1436497200000 shouldn't come as a big surprise. ;-)
ToPrimitive(A) attempts to convert its object argument to a primitive value, by attempting to invoke varying sequences of A.toString and A.valueOf methods on A.
So if the toString() call succeeds, it won't call valueOf().
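You can watch that ordering with a made-up object that defines both conversion methods:

```javascript
// A made-up object defining both conversion methods. Which one runs
// depends on the hint ToPrimitive receives.
const obj = {
  toString() { return "from toString"; },
  valueOf()  { return 42; }
};

console.log(`${obj}`);  // "from toString": hint "string" tries toString first
console.log(obj - 0);   // 42: hint "number" tries valueOf first
console.log(obj == 42); // true: no hint, and plain objects treat "default" as "number"
```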
Consider the following code:
var a = new Date(someDateString);
var b = new Date(someOtherDateString);
console.log(b - a); // Outputs for example 3572, number of millisecs between the dates
Why does this work? This is an arithmetic operation on two objects. It looks suspiciously like operator overloading, known from C++ and other languages, but as far as I know, JavaScript won't get that before ECMAScript 7.
One would think the JS engine would turn it into something like
console.log(b.toString() - a.toString());
but this prints "NaN", as toString on a Date object returns a string in the format
Mon Mar 23 2015 13:21:33 GMT+0100 (CET)
So, what magic makes this arithmetic possible? Can it be implemented on custom objects?
It converts them through .valueOf() not .toString(). Internally a date is stored as a number. The toString() method just formats it using a date formatting routine.
See The Subtraction Operator in the spec.
valueOf is called to fetch the numeric primitive of the date:
function o(i) {
this.valueOf = function() { return i; }
}
var a = new o(100);
var b = new o(42);
alert(a - b); // 58
A JS Date Object is just "a number of milliseconds".
"Zero time" starts at 01 January 1970 00:00:00 UTC. Everything else is a simple conversion, and adding/subtracting become very easy!
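Which is why date arithmetic in milliseconds is straightforward; a small sketch:

```javascript
const DAY_MS = 24 * 60 * 60 * 1000; // one day in milliseconds

const now = new Date();
const tomorrow = new Date(now.getTime() + DAY_MS); // add one day

console.log(tomorrow - now); // 86400000: subtraction uses the ms values
```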
I have often seen the trick
after = +after;
to coerce the variable after to a number. Reading through the Node.JS source I found another method:
after *= 1; // coalesce to number or NaN
Are the two methods strictly equivalent in their behaviour?
Yes. Both Unary Operator + and Multiplicative Operators such as * (called from Compound Assignment op=) invoke the internal ToNumber algorithm.
You can even use a 3rd option by statically calling the Number constructor:
after = Number(after);
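All three coercions behave the same way (the values below are just examples):

```javascript
const after = "42";

console.log(+after);        // 42
console.log(after * 1);     // 42
console.log(Number(after)); // 42

// They also fail the same way on non-numeric strings:
console.log(+"abc", "abc" * 1, Number("abc")); // NaN NaN NaN
```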
After a quick google to make sure my suspicions were true, I came to this conclusion: using the + operator to convert to a number should be marginally faster, because no mathematical operation occurs after the type cast, whereas with the *= approach, after the value is converted it is still multiplied by 1.
Yes, however note that it is only the unary + operator that does this. I.e. 10 + "10" will give you "1010".
A perhaps less error prone option is to use what asm.js does:
10 + ("10"|0)
Although on the down-side it does require the brackets. It should be the fastest option in any case (probably equal to unary +).
Note: In some instances after = after-0 invokes different behaviour than after = after+0. I've noticed it with dates.
This is tested in Chrome v39 only:
var date = new Date(2000,0,1);
date += date; //"Sat Jan 01 2000 00:00:00 GMT+0000 (GMT Standard Time)Sat Jan 01 2000 00:00:00 GMT+0000 (GMT Standard Time)"
var date2 = new Date(2000,0,1);
date2 + 0; //"Sat Jan 01 2000 00:00:00 GMT+0000 (GMT Standard Time)0"
date2 - 0; //946684800000
date2 * 1; //946684800000
I don't know exactly what the JS spec defines, but with dates, because both the date and the number can be cast to a string, and the + operator works on strings, Chrome goes with string concatenation. Because the - operator has no string equivalent, it falls back to numeric conversion.
I've found this useful when coercing dates into numbers for comparisons
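A quick sketch of that coercion (using Date.UTC so the timestamp doesn't depend on the local timezone):

```javascript
const d = new Date(Date.UTC(2000, 0, 1));

console.log(+d);             // 946684800000: unary + hints "number"
console.log(d - 0);          // 946684800000: - also hints "number"
console.log(typeof (d + 0)); // "string": binary + uses the "default" hint,
                             // which Date treats as "string"
```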