Why can JavaScript handle timestamps beyond 2038? - javascript

As we know, all dates created with the JavaScript Date constructor are calculated in milliseconds from 01 January 1970 00:00:00 Universal Time (UTC), with a day containing 86,400,000 milliseconds. This implies that JS uses a UNIX timestamp. I set my system clock to a date beyond 2038 (say 14 Nov 2039) and ran this script:
<script>
var d = new Date();
alert(d.getFullYear()+" "+d.getMonth()+" "+d.getDate());
</script>
It alerts 2039 10 14 successfully, unlike PHP, which prints "9 Oct, 1903 07:45:59".
How does JS handle this? An explanation would be appreciated, as I am confused!

32-bit PHP uses 32-bit integers, whose maximum value puts the last UNIX timestamp they can express in January 2038. That's widely known as the Y2K38 problem, and it affects virtually all 32-bit software using UNIX timestamps. Moving to 64 bits, or to libraries that work with other timestamp representations (in PHP's case, the DateTime class), solves the problem.
JavaScript doesn't have integers, only floats, whose maximum value is vastly larger (but which, in return, give up exact integer precision beyond 2^53).
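For illustration, here is a quick console sketch of that boundary (output strings as current engines print them):
var maxInt32 = Math.pow(2, 31) - 1;             // 2147483647, the largest signed 32-bit value
new Date(maxInt32 * 1000).toUTCString();        // "Tue, 19 Jan 2038 03:14:07 GMT" -- the Y2K38 limit
new Date((maxInt32 + 1) * 1000).toUTCString();  // "Tue, 19 Jan 2038 03:14:08 GMT" -- no problem for a float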

JavaScript doesn't have integer numbers, only floating-point numbers (details can be found in the standards document).
That means you can represent some really large numbers, but at the cost of precision. A simple test is this:
i = 1384440291042
=> 1384440291042
i = 13844402910429
=> 13844402910429
i = 138444029104299
=> 138444029104299
i = 1384440291042999
=> 1384440291042999
i = 13844402910429999
=> 13844402910430000
i = 138444029104299999
=> 138444029104300000
i = 1384440291042999999
=> 1384440291043000000
i = 13844402910429999999
=> 13844402910430000000
As you can see, the number is not guaranteed to be kept exact. The outer limit of integer precision in JavaScript (up to which you will actually get back the same value you put in) is 9007199254740992, i.e. 2^53; beyond that, adjacent integers can no longer be distinguished, which is why Number.MAX_SAFE_INTEGER is 2^53 - 1. That would be good up until 285428751-11-12T07:36:32+00:00 according to my conversion test (which treated the value as seconds; as a JavaScript time value in milliseconds it reaches roughly the year 287396) :)
The simple answer is that JavaScript internally uses a larger data type than the long int (4 bytes, 32 bits) that is used for the C-style epoch ...
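A quick way to see that 2^53 boundary for yourself (Number.MAX_SAFE_INTEGER and Number.isSafeInteger are ES2015 additions):
Number.MAX_SAFE_INTEGER;                  // 9007199254740991 (2^53 - 1)
Math.pow(2, 53) === Math.pow(2, 53) + 1;  // true -- above 2^53, adjacent integers collapse into one value
Number.isSafeInteger(Date.now());         // true, and it stays true for roughly another 285,000 years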

This implies that JS uses UNIX timestamp.
Just a sidenote: Unix timestamps are seconds since 1970, while JS time values are milliseconds since 1970. So a JS timestamp stops fitting in a 32-bit int much earlier (but JS does not use a 32-bit int for this anyway).
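To make that concrete, a tiny sketch:
var ms = Date.now();                      // milliseconds since 1970, e.g. 1384440291042
var unixSeconds = Math.floor(ms / 1000);  // the classic Unix timestamp in seconds
ms > Math.pow(2, 31) - 1;                 // true -- the millisecond count outgrew 32 bits back in January 1970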

It can. Try out new Date(8640000000000000)
Sat Sep 13 275760 03:00:00 GMT+0300 (Eastern European Summer Time)
Year 275760 is a bit beyond 2038 :)
Read the spec section 15.9.1.1
http://ecma-international.org/ecma-262/5.1/#sec-15.9.1.1
A Date object contains a Number indicating a particular instant in
time to within a millisecond. Such a Number is called a time value. A
time value may also be NaN, indicating that the Date object does not
represent a specific instant of time.
Time is measured in ECMAScript in milliseconds since 01 January, 1970
UTC. In time values leap seconds are ignored. It is assumed that there
are exactly 86,400,000 milliseconds per day. ECMAScript Number values
can represent all integers from –9,007,199,254,740,992 to
9,007,199,254,740,992; this range suffices to measure times to
millisecond precision for any instant that is within approximately
285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is
slightly smaller: exactly –100,000,000 days to 100,000,000 days
measured relative to midnight at the beginning of 01 January, 1970
UTC. This gives a range of 8,640,000,000,000,000 milliseconds to
either side of 01 January, 1970 UTC.
The exact moment of midnight at the beginning of 01 January, 1970 UTC
is represented by the value +0.
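You can verify those limits directly in a console:
new Date(8640000000000000).toISOString();  // "+275760-09-13T00:00:00.000Z" -- the last valid instant
new Date(8640000000000001).getTime();      // NaN -- one millisecond out of range yields an Invalid Date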

The year 2038 problem applies only to signed 32-bit timestamps, which PHP and some other systems use. A signed 32-bit timestamp runs out of range with the number of seconds reached in January 2038.
From the Wikipedia article (emphasis mine):
The year 2038 problem may cause some computer software to fail at some point near the year 2038. The problem affects all software and systems that both store system time as a signed 32-bit integer, and interpret this number as the number of seconds since 00:00:00 UTC on Thursday, 1 January 1970.[1] The furthest time that can be represented this way is 03:14:07 UTC on Tuesday, 19 January 2038.[2] ... This is caused by integer overflow. The counter "runs out" of usable digits, "increments" the sign bit instead, and reports a maximally negative number (continuing to count up, toward zero). This is likely to cause problems for users of these systems due to erroneous calculations.
Storing a timestamp in a variable with a greater range solves the problem.
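The overflow described above is easy to reproduce in a JS console by forcing a value through the 32-bit conversion that the bitwise operators perform (purely an illustration; Date itself never does this):
2147483647 + 1;        // 2147483648 -- fine as a regular JS number
(2147483647 + 1) | 0;  // -2147483648 -- in 32-bit arithmetic the sign bit flips and the value wraps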

Related

Wrong unix time from parsed DateTime

Here is the code:
const dateStr = '1989-11-11T03:34';
let date = new Date(dateStr);
console.log(date.toUTCString());
console.log(+date);
And the output:
Fri, 10 Nov 1989 22:34:00 GMT
626740440000 <--- far future
Why is the Unix time in the far future?
Here is a Playcode link.
It's not wrong.
You seem to have two issues with what you're getting:
The time isn't the same in your toUTCString output.
The number you're getting from +date isn't what you expect.
But both are correct:
The string has no timezone indicator on it, so since it's a date+time string, it's parsed as local time. This is covered by the Date Time String Format section of the specification. But your output is showing GMT (because toUTCString uses GMT).
626740440000 is the number of milliseconds since The Epoch, not seconds as it was in the original Unix epoch scheme. This is covered by the Time Values and Time Range section of the specification.
If you want to parse your string as UTC (loosely, GMT), add a Z to the end of it indicating that it is in GMT.
If you want the number of seconds since The Epoch, divide +date by 1000 (perhaps rounding or flooring the result).
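Putting both suggestions together, a sketch based on the string from the question:
const dateStr = '1989-11-11T03:34';
const date = new Date(dateStr + 'Z');    // the trailing 'Z' forces UTC parsing
console.log(date.toUTCString());         // Sat, 11 Nov 1989 03:34:00 GMT
console.log(Math.floor(+date / 1000));   // 626758440 -- seconds since The Epoch, not milliseconds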

Is Date.now() okay to use, will it reach a number that JS can't handle?

Is there a possibility that Date.now() will start returning numbers in scientific notation?
Could this become a bug like Y2K? Is it safe to use, and will it ever cross the biggest
number possible in JavaScript?
The biggest integer JavaScript can handle safely (Number.MAX_SAFE_INTEGER) is 2^53 - 1.
Converting that to years:
console.log(Number.MAX_SAFE_INTEGER / 1000 / 60 / 60 / 24 / 365)
// outputs 285616.41472415626
So the answer is YES, it's safe to use it.
That's not going to be a concern of yours:
Number.MAX_SAFE_INTEGER converted to a date:
Wed 12 Oct 287396
Relative: roughly 285,400 years from now
On the other hand
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC.
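That slightly smaller range means Number.MAX_SAFE_INTEGER itself is already outside what a Date object accepts:
console.log(new Date(8640000000000000).getFullYear());     // 275760 -- the largest supported time value
console.log(new Date(Number.MAX_SAFE_INTEGER).getTime());  // NaN -- Invalid Date, past the 100,000,000-day limit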
Unrelated, but there is also the year 2038 problem, which may be what prompted the question; as mentioned, though, it does not apply to JS:
The Year 2038 problem (also called Y2038 or Y2k38 or Unix Y2K) relates to representing time in many digital systems as the number of seconds passed since 00:00:00 UTC on 1 January 1970 and storing it as a signed 32-bit integer. Such implementations cannot encode times after 03:14:07 UTC on 19 January 2038. Similar to the Y2K problem, the Year 2038 problem is caused by insufficient capacity used to represent time.
Date is funny :)

Date difference changing based on timezone

I have a server date-time, which is also CST, and a date based on the Chicago timezone. When I find the difference between the two dates, the value is different for different timezones.
I am not able to figure out the problem.
Date A is my Chicago time 2019-05-22T04:02:14-05:00
Date B is my server time 2019-05-20T01:39:34-04:00
Hours difference between them: 51, when my timezone is set to EST.
When I change my timezone to IST:
Date A is my Chicago time 2019-05-22T04:03:34-05:00
Date B is my server time 2019-05-20T01:39:34+05:30
Hours difference between them: 60, when my timezone is set to IST.
Why is there a difference in hours when the dates are the same in both cases?
getIntervalTime(dateA, dateB): ITimer {
  console.log("Date A is my Chicago time", dateA);
  console.log("Date B is my server time", dateB);
  console.log(moment.utc(dateA).diff(moment.utc(dateB), "hours"));
  intervalHours = moment.utc(dateA).diff(moment.utc(dateB), "hours");
}
In your question, you gave two very different server times. They are not referencing the same actual point in time. In each case, 01:39:34 is the local time in the time zone offset provided.
2019-05-20T01:39:34-04:00 (EDT) = 2019-05-20T05:39:34Z (UTC) = 2019-05-20T11:09:34+05:30 (IST)
2019-05-20T01:39:34+05:30 (IST) = 2019-05-19T20:09:34Z (UTC) = 2019-05-19T16:09:34-04:00 (EDT)
As you can see just by comparing the UTC times, there is a 9.5 hour difference between these two timestamps. This is also reflected in the difference between the two offsets (5.5 - -4 = 9.5).
This is a common source of confusion, as often people view the + or - sign as an operator, and thus think of it as an instruction ("Oh, I see a plus or minus so I must need to add or subtract this value to get to the local time"). But in reality it is not an operator, but the sign of the offset. Positive offset values are ahead of UTC, while negative offset values are behind UTC. (Alternatively one can think of positive offsets as being east of GMT, while negative offsets are west of GMT.)
In other words, the date and time portion of an ISO 8601 formatted timestamp are already converted to the context provided.
Also note that the time zone of your server won't really matter, nor should it. Now is now - time zones don't change that. Thus, in most cases you should simply use the UTC time.
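You can confirm this without moment at all, just by comparing the instants the two strings resolve to (using the values from the question):
const a = Date.parse('2019-05-20T01:39:34-04:00');  // 2019-05-20T05:39:34Z
const b = Date.parse('2019-05-20T01:39:34+05:30');  // 2019-05-19T20:09:34Z
console.log((a - b) / 3600000);                     // 9.5 -- the two "server times" are 9.5 hours apart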

momentjs toDate - different output on different clients/browsers

I use momentjs to parse a Date String and convert it to a native JavaScript Date:
let dateString = '1980-04-06';
console.log(moment().utcOffset());
console.log(moment(dateString, 'YYYY-MM-DD').toDate());
<script src="https://cdn.jsdelivr.net/npm/moment@2.22.2/moment.min.js"></script>
The output on client 1 (Firefox 62) is
120
Date 1980-04-05T23:00:00.000Z
and the output on client 2 (Firefox 52 ESR) is
120
Date 1980-04-05T22:00:00.000Z
Can somebody explain to me why the utcOffset is the same (new Date().getTimezoneOffset() also prints -120 on both clients), but the Date (hour) is different?
You're checking the current UTC offset, not the offset of your 1980 moment instance. My guess is that if you took moment(dateString, 'YYYY-MM-DD') and called utcOffset on that, you'd get different offsets on the different browsers.
I bet what's happening is that the rules for your zone have changed since 1980 (for example, perhaps the timing of the DST has changed, or DST has been added or eliminated, or perhaps the standard offset has even changed). Browsers vary in the degree to which they get historical zone data right, which leads to errors interpreting date strings. I suspect that Firefox fixed their historical zone database for your zone, leading to different behavior in newer versions of the browser.
The offsets you're showing are for the current date and time, not for the date provided. If you change the middle line to log moment(dateString, 'YYYY-MM-DD').utcOffset(), you should see that the result in the older Firefox 52 is 60 instead of 120.
Contributing factors explaining this difference are:
The daylight saving time rules for your time zone are not the same in 1980 as they are today. Assuming Vienna (from your user profile), in 1980 the start of DST was at 00:00 on April 6th (reference here) which is the first Sunday in April. The current (2018) rule for Vienna is the last Sunday in March, which would be March 25th 2018 (reference here).
ECMAScript 5.1 (section 15.9.1.8) and earlier required browsers to always assume the current DST rule was in effect for all time - even if this was not actually what happened. This was corrected in ECMAScript 6 / 2015 (section 20.3.1.8) .
ECMAScript 2015 was implemented in Firefox starting with version 54. Since you are testing version 52, you are seeing the old behavior.
Since this particular DST change is right at the stroke of midnight, and it's a spring-forward transition, 1980-04-06T00:00 is not a valid local time. The first moment of that day in that time zone is 1980-04-06T01:00. Moment takes care of this for you when you pass a date-only value. In the current browser (62, not 52), if you call .format() on the moment, you should see 1980-04-06T01:00:00+02:00. Note that this time is already in DST, with a UTC+02:00 offset. Converted to UTC, that is 1980-04-05T23:00:00Z, thus aligning with the correct data seen in your examples.
Long story short, there are many reasons to use up-to-date browsers. This is one of them.
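One way to see this is to inspect the offset and formatted value of the 1980 moment itself rather than the current one (assuming the Europe/Vienna zone discussed above; the exact numbers depend on the browser's historical zone data):
const m = moment('1980-04-06', 'YYYY-MM-DD');
console.log(m.utcOffset());             // 120 on an ES2015+ browser; 60 on Firefox 52, which applies today's rule
console.log(m.format());                // 1980-04-06T01:00:00+02:00 on a current browser -- midnight did not exist that day
console.log(m.toDate().toISOString());  // 1980-04-05T23:00:00.000Z (the 22:00:00 value comes from the older engine)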

Why is 275481/69/100089 the max date "new Date()" will parse? [duplicate]

Assuming a 32-bit OS/browser, could a Date object created in JavaScript roll over to 1970 if I set a date beyond 2038?
The Mozilla documentation says a year can be set to 9999, however I don't know if this is consistent across all JavaScript implementations, or if this is an accurate description of what the specification dictates.
I would think given the wording in the documentation, it seems like it's either using a 64-bit number for storing the time or storing the actual data in ISO date format.
Does anyone know how browsers implement this?
It shouldn't - according to the ECMAScript specification, section 15.9.1.1:
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. Leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript number values can represent all integers from –9,007,199,254,740,991 to 9,007,199,254,740,991; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC.
This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January,
1970 UTC. The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
Only the bitwise operators in JS are 32-bit. There is no version that changes this, and it makes no difference whether your OS is 64-bit. So if someone is using bitwise operators on timestamps, this could happen. For example, here I use bitwise OR because I want the side effect all bitwise operators have of converting to a 32-bit int, just so I lose the milliseconds of the date.
new Date('2038-01-01T01:01:01.345') / 1000 | 0; // 2145913261.
new Date('2039-01-01T01:01:01.345') / 1000 | 0; // -2117518035. Wraps around...
I could be using anything else, such as Math.round or parseInt, and there would be no problem, but if I use a bitwise operator, it's going to wrap around.
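For example, flooring instead of using a bitwise operator stays in regular float arithmetic and does not wrap (same timezone assumption as in the comments above):
Math.floor(new Date('2038-01-01T01:01:01.345') / 1000); // 2145913261 -- same as before
Math.floor(new Date('2039-01-01T01:01:01.345') / 1000); // 2177449261 -- no wraparound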
