Here is the code:
const dateStr = '1989-11-11T03:34';
let date = new Date(dateStr);
console.log(date.toUTCString());
console.log(+date);
And the output:
Fri, 10 Nov 1989 22:34:00 GMT
626740440000 <--- far future
Why is the Unix time in the far future?
Here is a Playcode link.
It's not wrong.
You seem to have two issues with what you're getting:
The time isn't the same in your toUTCString output.
The number you're getting from +date isn't what you expect.
But both are correct:
The string has no timezone indicator on it, so since it's a date+time string, it's parsed as local time. This is covered by the Date Time String Format section of the specification. But your output is showing GMT (because toUTCString uses GMT).
626740440000 is the number of milliseconds since The Epoch, not seconds as it was in the original Unix epoch scheme. This is covered by the Time Values and Time Range section of the specification.
If you want to parse your string as UTC (loosely, GMT), add a Z to the end of it, indicating that it is in GMT.
If you want the number of seconds since The Epoch, divide +date by 1000 (perhaps rounding or flooring the result).
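For example, both fixes together (a minimal sketch based on the code from the question):

const dateStr = '1989-11-11T03:34';
// Appending 'Z' makes the string parse as UTC rather than local time
const utcDate = new Date(dateStr + 'Z');
console.log(utcDate.toUTCString()); // Sat, 11 Nov 1989 03:34:00 GMT
// The time value is in milliseconds; divide by 1000 for Unix seconds
const unixSeconds = Math.floor(+utcDate / 1000);
console.log(unixSeconds); // 626758440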
Related
I have a server date-time, which is also CST, and a date based on the Chicago timezone. When I find the difference between the two dates, the value is different for different timezones.
I am not able to figure out the problem.
Date A is my Chicago time 2019-05-22T04:02:14-05:00
Date B is my server time 2019-05-20T01:39:34-04:00
The hours difference between them is 51 when my timezone is set to EST.
When I change my timezone to IST:
Date A is my Chicago time 2019-05-22T04:03:34-05:00
Date B is my server time 2019-05-20T01:39:34+05:30
The hours difference between them is 60 when my timezone is set to IST.
Why is there a difference in hours when the dates are same in both the cases?
getIntervalTime(dateA, dateB): ITimer {
  console.log("Date A is my Chicago time", dateA);
  console.log("Date B is my server time", dateB);
  console.log(moment.utc(dateA).diff(moment.utc(dateB), "hours"));
  // Difference in whole hours between the two timestamps
  const intervalHours = moment.utc(dateA).diff(moment.utc(dateB), "hours");
  // (construction and return of the ITimer omitted in the question)
}
In your question, you gave two very different server times. They are not referencing the same actual point in time. In each case, 01:39:34 is the local time in the time zone offset provided.
2019-05-20T01:39:34-04:00 (EDT) = 2019-05-20T05:39:34Z (UTC) = 2019-05-20T11:09:34+05:30 (IST)
2019-05-20T01:39:34+05:30 (IST) = 2019-05-19T20:09:34Z (UTC) = 2019-05-19T16:09:34-04:00 (EDT)
As you can see just by comparing the UTC times, there is a 9.5 hour difference between these two timestamps. This is also reflected in the difference between the two offsets (5.5 - -4 = 9.5).
This is a common source of confusion, as often people view the + or - sign as an operator, and thus think of it as an instruction ("Oh, I see a plus or minus so I must need to add or subtract this value to get to the local time"). But in reality it is not an operator, but the sign of the offset. Positive offset values are ahead of UTC, while negative offset values are behind UTC. (Alternatively one can think of positive offsets as being east of GMT, while negative offsets are west of GMT.)
In other words, the date and time portion of an ISO 8601 formatted timestamp are already converted to the context provided.
Also note that the time zone of your server won't really matter, nor should it. Now is now - time zones don't change that. Thus, in most cases you should simply use the UTC time.
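A quick check with the timestamps from the question (a sketch using moment, as in your code):

const a = moment.utc("2019-05-20T01:39:34-04:00");
const b = moment.utc("2019-05-20T01:39:34+05:30");
console.log(a.toISOString()); // 2019-05-20T05:39:34.000Z
console.log(b.toISOString()); // 2019-05-19T20:09:34.000Z
// The same wall-clock time with different offsets is 9.5 hours apart
console.log(a.diff(b, "hours", true)); // 9.5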
If I run var myDate = new Date('29-06-2016 10:00'), myDate will only contain one thing: a number, the number of milliseconds from 01-01-1970 00:00:00 GMT to 29-06-2016 10:00:00 XXX,
XXX being the timezone of the OS. In my case BST (because it is a summer date; in winter it would be GMT).
Now... What if I want the milliseconds from 01-01-1970... to 29-06-2016 10:00:00 GMT-7?
I have only found methods that tell me what time it is in the GMT-7 timezone when it is 29-06-2016 10:00:00 in the BST timezone, but that is not what I am looking for!
Also, changing an environment variable so that the timezone is GMT-7 is not an option.
I think you want the date string in the following format:
"2016-06-29T10:00:00-07:00"
That lets you set the timezone relative to GMT (not 100% sure on the exact behaviour, but it is parsed client side, so it does depend on their locale).
I had a similar issue where JS was changing the time on date objects, and the only way I found around it was to set the date up with this format.
Bonus info: to get this from a .NET DateTime, use the following format string.
"yyyy-MM-ddTHH:mm:sszzz"
I think I found a way of doing it, using moment.js as ErikS suggested:
// This code is running in a Node.js server configured to use UTC
// Incorrect date, as it is interpreted as UTC;
// however, we do this to get the utcOffset.
var auxDate = moment.tz(new Date('2016-6-23 10:15:0'), 'US/Mountain');
// Get the milliseconds since 1970 of the date as if it were interpreted
// as GMT-7 or GMT-6 (depends on the date because of Daylight Saving Time)
var milliseconds = auxDate.valueOf() - auxDate.utcOffset() * 60000;
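With moment-timezone, a simpler alternative (my suggestion, not part of the original answer) is to parse the wall-clock string directly in the target zone:

// Interpret '2016-06-23 10:15' as local time in US/Mountain (GMT-6 in June, due to DST)
var milliseconds = moment.tz('2016-06-23 10:15', 'US/Mountain').valueOf();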
My goal is to convert an integer expressing the Unix Epoch time (or the number of milliseconds since midnight of January 1, 1970) into a localized time for the UTC (or GMT) time-zone.
So I have this method:
function formatDateTimeFromTicks(nTicks)
{
    //'nTicks' = number of milliseconds since midnight of January 1, 1970
    //RETURN:
    //      = Formatted date/time
    return new Date(nTicks).toLocaleString();
}
As an example I'm using the value of 1442004135000, which should give me 9/11/2015 8:42:15 PM for my locale (here's where you can check), but my method:
alert(formatDateTimeFromTicks(1442004135000));
gives me 9/11/2015 1:42:15 PM.
Any idea why and how to fix it?
The native Date object won't be enough, as even in the best case it doesn't give you a UTC and locale string. I strongly suggest you use the excellent moment library to have reliable behavior across all platforms.
To display nice localized UTC in French:
moment(1442004135000).utc().locale('fr').format('LLLL')
=> "vendredi 11 septembre 2015 20:42"
I am doing some javascript date stuff, and I executed the following:
console.log(new Date(0));
I expected to see the *nix Epoch, but oddly I got back:
Wed Dec 31 1969 19:00:00 GMT-0500 (Eastern Standard Time)
What happened?
You are setting the internal time value, which is UTC, but seeing a string that is based on your system settings, which likely have an offset of UTC-05:00.
The ECMAScript specification explains how the Date constructor and instances work. Given:
new Date(0)
the Date constructor is called with one argument (§20.3.2.2 Date(value)), so it creates a Date instance with its internal time value set depending on the type of the argument. As the value is a number, the time value is set to that number.
The time value is an offset in milliseconds from 1970-01-01T00:00:00Z §20.3.1.1 Time Values and Time Range. Note that it's based on UTC.
The behaviour of console.log is entirely implementation dependent, so what you get from:
console.log(dateInstance);
depends on the host. However, most seem to call the object's toString method, which returns an implementation-dependent string based on the timezone setting of the host system (§20.3.4.41 Date.prototype.toString()). That is, a "local" time.
The timezone offset can be determined using getTimezoneOffset. It's in minutes and has the opposite sense to an ISO 8601 offset (e.g. UTC-05:00 will be +300). If you want to get a date string that represents the internal time value without an offset, use toUTCString.
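For example (a minimal sketch; the values shown assume a system set to UTC-05:00):

var d = new Date(0);
console.log(d.toString());          // local time, e.g. Wed Dec 31 1969 19:00:00 GMT-0500
console.log(d.toUTCString());       // Thu, 01 Jan 1970 00:00:00 GMT
console.log(d.getTimezoneOffset()); // 300 (minutes, opposite sign to the ISO 8601 offset)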
I was unable to find any resources explaining it, but as far as I can tell this 'error' is due to my timezone.
My timezone is GMT-0500, which is 5 hours behind UTC time. Add 5 hours to Wed Dec 31 1969 19:00:00 and you get the Epoch (Thurs Jan 1 1970 00:00:00)
As we know, all dates using the JavaScript Date constructor are calculated in milliseconds from 01 January, 1970 00:00:00 Universal Time (UTC), with a day containing 86,400,000 milliseconds. This implies that JS uses the UNIX timestamp. I set my system clock to a date beyond 2038 (say 14 Nov 2039) and ran the script:
<script>
var d = new Date();
alert(d.getFullYear()+" "+d.getMonth()+" "+d.getDate());
</script>
It alerts 2039 10 14 successfully (getMonth() is zero-based, so 10 is November), unlike PHP, which prints "9 Oct, 1903 07:45:59".
How does JS handle this? An explanation is appreciated, as I am confused!
32-bit PHP uses 32-bit integers, whose maximum value puts the last UNIX timestamp they can express in 2038. That's widely known as the Y2K38 problem, and it affects virtually all 32-bit software using UNIX timestamps. Moving to 64 bits, or to libraries which work with other timestamp representations (in the case of PHP, the DateTime class), solves this problem.
Javascript doesn't have integers but only floats, which don't have an inherent maximum value (but in return have less precision).
Javascript doesn't have integer numbers, only floating point numbers (details can be found in the standards document).
That means that you can represent some really large numbers, but at the cost of precision. A simple test is this:
i = 1384440291042
=> 1384440291042
i = 13844402910429
=> 13844402910429
i = 138444029104299
=> 138444029104299
i = 1384440291042999
=> 1384440291042999
i = 13844402910429999
=> 13844402910430000
i = 138444029104299999
=> 138444029104300000
i = 1384440291042999999
=> 1384440291043000000
i = 13844402910429999999
=> 13844402910430000000
As you can see, the number is not guaranteed to be kept exact. The outer limit of integer precision in javascript (where you will actually get back the same value you put in) is 9007199254740992. That would be good up until 285428751-11-12T07:36:32+00:00, according to my conversion test :)
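Modern JavaScript exposes this limit directly (a small sketch; strictly, the last integer with a unique representation is 2^53 - 1):

console.log(Number.MAX_SAFE_INTEGER);                // 9007199254740991
console.log(Number.isSafeInteger(9007199254740992)); // false: 2^53 collides with 2^53 + 1
console.log(9007199254740992 === 9007199254740993);  // true, precision is lost above 2^53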
The simple answer is that Javascript internally uses a larger data type than the long int (4 bytes, 32 bits) that is used for the C-style epoch ...
This implies that JS uses UNIX timestamp.
Just a sidenote: Unix timestamps are seconds since 1970, while JS time is milliseconds since 1970. So a JS timestamp stops fitting in a 32-bit int much earlier (but JS does not use a 32-bit int for this).
It can. Try out new Date(8640000000000000)
Sat Sep 13 275760 03:00:00 GMT+0300 (Eastern European Summer Time)
Year 275760 is a bit beyond 2038 :)
Read the spec, section 15.9.1.1:
http://ecma-international.org/ecma-262/5.1/#sec-15.9.1.1
A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number is called a time value. A time value may also be NaN, indicating that the Date object does not represent a specific instant of time.

Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.

The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.

The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
The year 2038 problem applies to signed 32-bit timestamps only, which PHP and some other systems use. A signed 32-bit timestamp runs out of range during January 2038.
From the Wikipedia article (emphasis mine):
The year 2038 problem may cause some computer software to fail at some point near the year 2038. The problem affects all software and systems that both store system time as a signed 32-bit integer, and interpret this number as the number of seconds since 00:00:00 UTC on Thursday, 1 January 1970. The furthest time that can be represented this way is 03:14:07 UTC on Tuesday, 19 January 2038. ... This is caused by integer overflow. The counter "runs out" of usable digits, "increments" the sign bit instead, and reports a maximally negative number (continuing to count up, toward zero). This is likely to cause problems for users of these systems due to erroneous calculations.
Storing a timestamp in a variable with a greater range solves the problem.
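A quick demonstration that the 32-bit boundary is a non-issue in JavaScript (a minimal sketch):

// One second past the signed 32-bit limit of 2^31 - 1 seconds is fine in JS,
// because the time value is a 64-bit float holding milliseconds
console.log(new Date(2147483648 * 1000).toUTCString());
// Tue, 19 Jan 2038 03:14:08 GMT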