Differences in dates between .NET and JavaScript

I had an issue where dates were out by one day when displayed on an ASP.NET web form. These dates are only used for display, so I can pass them as strings to resolve the issue, but I'm curious why I'm seeing this behaviour.
I'm in Ireland, and while Ireland is more or less in line with GMT, we use IST (Irish Standard Time) during summer instead of DST and then revert to GMT for winter. This has the same practical effect as GMT with DST, but is officially slightly different.
Because we're not officially on GMT, in the past IST and DST didn't always line up.
For example, in 1958, IST started on the 20th of April and ended on the 5th of October, whereas DST started on the 27th of April and ended on the 26th of October.
So if a date between the 5th and 26th of October 1958 is passed to JS, JS will display it as the previous day.
I wrote this code to try and understand what's going on:
DateTime date = new DateTime(1958, 10, 4);
while (date <= new DateTime(1958, 10, 30))
{
    Console.WriteLine($"normal   : {date} | isDst? : {date.IsDaylightSavingTime()}");
    Console.WriteLine($"universal: {date.ToUniversalTime()} | isDst? : {date.ToUniversalTime().IsDaylightSavingTime()}");
    Console.WriteLine($"local    : {date.ToLocalTime()} | isDst? : {date.ToLocalTime().IsDaylightSavingTime()}");
    Console.WriteLine("-------------------------");
    date = date.AddDays(1);
}
Which produced this output (truncated):
[screenshot of console output omitted]
So I can see there are a number of days being misidentified as DST days, but it doesn't seem like that would cause this. If both .NET and JS thought they were DST days, then surely the end result should be correct?
Additionally, why is there a 2 hour difference between the output of ToUniversalTime and ToLocalTime during DST?
Here's a screenshot of JS processing a few dates during this problematic window:
[screenshot omitted]
You can see that JS (or Chrome?) is aware that from the 5th to the 27th of October that year, Ireland is no longer on GMT+1 (even though it still says IST), so why is the date passed from VB an incorrect date? I thought they both got their date/time information from the same source, i.e. the host PC?

You appear to be running .NET on Windows, in which case .NET is using the Windows time zone data for your local time zone.
The Windows time zone data does not have the full history of time zone changes from all time. Microsoft's policy only guarantees historical data is present for dates from 2010 forward, though some time zones have historical data before then.
Conversely, Chrome is using time zone data from ICU, which uses IANA time zone data. IANA time zones have historical data since at least 1970, though many time zones have historical data before then.
With specific regard to Ireland, IANA has Irish time zone data going back to 1880. Windows has no history for Ireland at all, so it assumes the current rule has always been in effect. In reality, the current rule has been in effect since Oct 1968, so any dates before then will only have accurate time zone information in the IANA data.
If you run the same .NET code you showed above on Linux or macOS, you'll see that .NET uses IANA time zone data on those platforms, and your results will match up for 1958. Or, if you pick a more recent date, your results will match on Windows too.
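As a quick illustration, here's a small sketch of how to see which rules a JS engine applies (an assumption: it's run with the system time zone set to Europe/Dublin; an IANA-backed engine will show the 1958 transition on the 5th of October):
// Inspect the offsets the JS engine believes were in effect in October 1958.
for (let day = 4; day <= 28; day += 6) {
  const d = new Date(1958, 9, day); // month 9 = October
  console.log(d.toDateString(), '| minutes ahead of UTC:', -d.getTimezoneOffset());
}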
In short - don't run these sorts of tests on old dates and expect to get the same fidelity that you'll get with modern dates.
You also asked:
Additionally, why is there a 2 hour difference between the output of ToUniversalTime and ToLocalTime during DST?
Your date variables are all DateTime where .Kind is DateTimeKind.Unspecified. The IsDaylightSavingTime method will treat such values as if they belonged to the local time zone - as if they actually had DateTimeKind.Local. The same is true for the ToUniversalTime method. However, the ToLocalTime method will assume that DateTime values with DateTimeKind.Unspecified are actually in terms of UTC - as if they were DateTimeKind.Utc. Thus, when DST is in effect, date.ToUniversalTime() shifts an hour backward and date.ToLocalTime() shifts an hour forward.
You can avoid such ambiguities by using DateTimeOffset instead of DateTime.

Related

When formatting a Date object - does it matter that the T and the 000Z will be removed when storing to the DB?

Sorry if it's a very basic question, but I don't understand the following:
When I format the Date object (no matter what library I use), I get a string.
from this: 2022-11-28T16:55:44.000Z (new Date object)
I get this: 2022-11-28 16:55:44 (or other formats obviously depending how I format it)
Even if I turn it back into an object, the T and 000Z will never be there anymore. Do I just ignore that (it seems like any library or date method drops the T and the string ending when formatting), or do I add it 'back'? Isn't it a problem if the dates stored in my DB are different (for later queries etc.)?
The Z indicates UTC (Coordinated Universal Time, also known as Greenwich Mean Time); dropping it changes the meaning - unless your browser or server lives in the Greenwich time zone and it is winter (no daylight saving time).
You can convert back and forth between a Date object and a UTC string as follows (my browser lives in the Central European time zone):
> utc = '2022-11-28T16:55:44.000Z'
'2022-11-28T16:55:44.000Z'
> d = new Date(utc)
Mon Nov 28 2022 17:55:44 GMT+0100 (Central European Standard Time)
> d.toISOString()
'2022-11-28T16:55:44.000Z'
Alternatively, you can convert back and forth between a Date object and a formatted string in your browser's or server's time zone (the last line shows that my browser's format differs from yours):
> formatted = '2022-11-28 17:55:44'
'2022-11-28 17:55:44'
> d = new Date(formatted)
Mon Nov 28 2022 17:55:44 GMT+0100 (Central European Standard Time)
> d.toLocaleString()
'11/28/2022, 5:55:44 PM'
But you should not store the Date objects in this format in a database, unless you can guarantee that they are always read and written in the same time zone. For example, if you format a Date object with your browser (in CET) and store it, then someone else who reads it and converts it back to a Date object with their browser in the New Zealand time zone will see a wrong value. Also, dates like 9/11/2022 are ambiguous if the formatting rules are not clear (September 11th or November 9th?).
That's why I would prefer UTC strings when storing Date objects and use formatted strings only for outputting them to the user and for parsing user input.
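A minimal round-trip sketch of that approach (the persistence call is hypothetical; only the conversions matter):
// Store: serialize the Date to a UTC ISO string.
const stored = new Date().toISOString(); // e.g. '2022-11-28T16:55:44.000Z'
// saveToDb(stored); // hypothetical persistence call
// Read: parse the UTC string back, and format in the viewer's zone only for display.
const d = new Date(stored);
console.log(d.toLocaleString());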
I see it even stronger: you should never store dates as strings; it's a design flaw. Always store proper Date objects. Here on SO you can find hundreds of questions where people have problems because they stored date values as (localized) strings. This is not limited to MongoDB; it applies to any database.
Date objects in MongoDB are UTC times - always and only! Usually the client application is responsible to display the date/time in local time zone and local format.
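A minimal sketch of that pattern, assuming Node.js with the official mongodb driver and a local server (the database, collection, and field names here are made up):
const { MongoClient } = require('mongodb');

async function demo() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const events = client.db('demo').collection('events');
  // Store a real Date, not a string; the driver persists a BSON UTC datetime.
  await events.insertOne({ title: 'release', at: new Date('2022-11-28T16:55:44.000Z') });
  // Read it back as a Date; format for the user's zone only at display time.
  const doc = await events.findOne({ title: 'release' });
  console.log(doc.at.toLocaleString());
  await client.close();
}

demo();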
What do you mean by "turn it back", i.e. how do you do it?
You should not rely on new Date(<string>) without a time zone. Some browsers/environments may apply UTC, others may use the current local time zone; see Differences in assumed time zone.
Have a look at 3rd-party date libraries, e.g. Moment.js, Luxon, or Day.js. They usually provide better control over how strings and time zones are parsed.
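For instance, a sketch with Luxon (assuming the stored string should be read as UTC; the format string is spelled out rather than guessed):
const { DateTime } = require('luxon');
// Parse a bare 'yyyy-MM-dd HH:mm:ss' string, stating explicitly that it is UTC.
const dt = DateTime.fromFormat('2022-11-28 16:55:44', 'yyyy-MM-dd HH:mm:ss', { zone: 'utc' });
console.log(dt.toISO());    // back to an unambiguous ISO string
console.log(dt.toJSDate()); // or a plain JS Date for display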

Why does the difference between 2 timestamps deviate by one hour from the expected value?

Take the date '2022-04-01' and another date '2022-05-15', for example. When I calculated their difference in Chrome DevTools, what I got is:
The result is 3801600000. But when my friend did the same thing on another device, what he got is:
The result is 3798000000. The difference between 3801600000 and 3798000000 is exactly one hour. What may cause this result? How can I eliminate this difference?
Your input lacks the time zone information:
UTC datetime: new Date("2021-01-01T12:00:00Z");
UTC-4 datetime: new Date("2021-01-01T12:00:00-04:00");
The main issue you are experiencing is because your input string is being interpreted as assigned to the local time zone, and you have different time zones on the two machines you've been testing with. One of them has a DST transition between the two dates, and the other does not - which explains your one hour difference.
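A sketch of the mechanism, using a March window so a real transition falls between the dates (run once with TZ=Asia/Shanghai, which has no DST, and once with TZ=America/Chicago, which springs forward on 2022-03-13):
// Both strings are parsed as local wall-clock times.
const a = new Date('2022-03-01T00:00:00');
const b = new Date('2022-04-01T00:00:00');
console.log(b - a);
// Asia/Shanghai:   2678400000 (exactly 31 days)
// America/Chicago: 2674800000 (31 days minus the skipped hour)

// Appending Z pins both to UTC, which removes the discrepancy:
console.log(new Date('2022-04-01T00:00:00Z') - new Date('2022-03-01T00:00:00Z')); // 2678400000 everywhere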
The examples you show also reveal your possible time zones:
Your machine is showing 8 hours ahead of UTC (UTC+8) for both timestamps. There are several parts of the world that use UTC+8 without DST, including China, Western Australia, Irkutsk Russia, and many others. There are no places on earth that use UTC+8 in conjunction with DST.
Your friend's machine is a different story. The timestamps are showing 2 hours ahead of UTC (UTC+2) on 2022-04-01, and 3 hours ahead of UTC (UTC+3) on 2022-05-15. While many countries use those offsets (such as those that are in Eastern Europe that use EET/EEST), none of those areas have a DST transition between those two dates. See the bottom of this table of DST transitions. All of the +2/+3 areas of the world transitioned in March. I can only conclude that your friend's machine has some non-standard time zone setting, or they are significantly behind on time zone data updates, or both. (Please reply in comments if I am incorrect on this!)
Also, your input string format, 2022-04-01 00:00:00 is not in the standard date time string format defined by the ECMAScript specification. Thus, it is not guaranteed to be parsed consistently by all browsers. Current versions of Chrome, Edge, and Firefox will interpret it as a local date and time, but the current version of Safari will fail with "Invalid Date".
If you want it interpreted as local time correctly in all browsers, you would need to specify it as 2022-04-01T00:00:00.
If you want it interpreted as UTC, then specify it as 2022-04-01T00:00:00Z.
If you want it interpreted with a specific time zone offset, then append that instead, as in: 2022-04-01T00:00:00+08:00.
If you must parse the string in the original format, then don't do it using the Date object. Either use a library (such as Luxon or date-fns), or parse it yourself with regex and/or string manipulation techniques.
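For example, a minimal hand-rolled parser for that exact format (a sketch; it assumes well-formed input that should be treated as local time):
// Parse 'YYYY-MM-DD hh:mm:ss' without relying on Date's string parsing.
function parseLocal(s) {
  const m = /^(\d{4})-(\d{2})-(\d{2}) (\d{2}):(\d{2}):(\d{2})$/.exec(s);
  if (!m) throw new Error('Unexpected format: ' + s);
  const [, y, mo, d, h, mi, se] = m.map(Number);
  // The multi-argument Date constructor always uses local time; months are zero-based.
  return new Date(y, mo - 1, d, h, mi, se);
}
console.log(parseLocal('2022-04-01 00:00:00'));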

MomentJS Accounting for Timestamp's DST and not current DST

I am using Moment.js to account for time zone differences in an international application. I use it to parse ISO 8601 strings (e.g. '2021-12-17T18:40:02.389Z'), and expect the date and time to be local to the client. I am running into an issue where it displays both CST and CDT, for example, in the application. It seems to be using the time zone/daylight saving status of the timestamp, rather than converting it to the one we are currently in. For example, it's December as of this posting, so I would like all times to be displayed by Moment in CST, whereas in May I would like them displayed in CDT, regardless of when the timestamp was created. Is this possible with Moment? I haven't been able to find anything about this in their docs.

JavaScript date quirk

This is the strangest behavior I have witnessed in JavaScript.
Under a very specific set of circumstances, this code will yield a time that's 40 minutes and 28 seconds earlier:
var jsDate = new Date('01/01/1900 11:00:00');
jsDate.setSeconds(0);
var dateString = jsDate.toLocaleTimeString("en", {
    hour12: false
});
alert(dateString); // dateString will be 10:19:32
This happens for one site in the Netherlands but not for our developer there. It happens on Firefox and Chrome. The workstation is Windows 7.
Testing it, I found that the broken result happens for any year prior to 1942.
For 1943 and 1944, it adds an hour.
Every year after that works fine, regardless of the date format: 01/01/1900 or 1900-01-01.
Background for the curious:
We have a date time widget and are only interested in the time portion.
The fix is to set the dummy date to 2000. But we are boggled about the "why".
Link to jsFiddle
This issue is not related to UTC vs Localized time. The strangest feature is that it alters the time by minutes and seconds, not hours.
I'll stick to your Netherlands example that was returning 10:19:32, and address the why part of this question.
The time zone database entry for Europe/Amsterdam is here, and looks like this:
# Zone  NAME              GMTOFF   RULES  FORMAT       [UNTIL]
Zone    Europe/Amsterdam  0:19:32  -      LMT          1835
                          0:19:32  Neth   %s           1937 Jul  1
                          0:20     Neth   +0020/+0120  1940 May 16  0:00
                          1:00     C-Eur  CE%sT        1945 Apr  2  2:00
                          1:00     Neth   CE%sT        1977
                          1:00     EU     CE%sT
Since you passed a date in 1900, the second line applies, which has an offset from UTC of 0:19:32, which happens to coincide with the Local Mean Time (LMT) entry above it.
As Chris Pearce explains in The Great Daylight Saving Time Controversy:
... All this was in a country that hadn't yet adopted standard time. Controversy had raged in the Netherlands over whether it should use GMT and turn clocks back about 20 minutes or use GMT+1 and go forward 40 minutes. Agreement couldn't be reached, so they stayed on local time long after all other European countries had moved to standard time. Local mean time of capital city Amsterdam, GMT+0:19:32, had been used nationwide for decades and this became the legal time ...
With regard to other time zones, you're likely hitting on their own interesting bits of history, or just on the Local Mean Time entry in the TZDB, which is usually the earliest entry and is used when there is no other history known about timekeeping in the region.
The lesson here is that time zones are a relatively modern invention, and have often been a point of political controversy. Don't assume that because you know how they work today that your knowledge will apply to the past. Keep in mind also that as various tidbits of history are uncovered, the TZDB is updated accordingly, so indeed history can change!
In general, expect to encounter oddities for any dates before 1970.
As to why you don't get this same result in every browser, see my oldish blog post JavaScript Date type is horribly broken, which among other things describes that ECMAScript 5.1 and prior required TZ/DST rules to ignore the history of timekeeping and assume the current rule always existed. This was fixed in ECMAScript 6, and in the ECMA Internationalization API - so most modern browsers will give you the correct result.
However, it also depends on where they source their time zone data. Windows computers don't have the full history of the TZDB, unless the browser ships its own copy of it. Browsers that use ICU for this functionality do ship their own copy of the TZDB, so those will have the historical data, but not every browser does this.

Representing a date in JavaScript when timezone is irrelevant

I am working on a very small JavaScript library that allows users to retrieve content based on date. For this content, the date is just an identifier and timezones are completely irrelevant (think along the lines of a Dilbert flip calendar). The "May 14th" content is the same whether you are in the United States or Australia.
The function for retrieving data currently takes a Date object as a parameter. Within the function, the timezone is ignored. Does this approach make sense, or would it be better to take a timezone-independent identifier (like 2012-01-01) as a parameter instead? With the Date object approach, do I run the risk of returning the wrong data as a result of timezone adjustments browsers make?
How about using Date.getUTC*() functions? UTC time is the same for everyone.
After doing some research, it appears that simply ignoring the timezone information is the best approach. Why? This will always preserve the date and time that were provided to the Date constructor (which is my goal), whereas the getUTC* methods will return altered versions of the date and time. For example, take a look at this node REPL session I ran on a computer in the Eastern Time zone.
> d = new Date(2013, 03, 27, 23, 00, 00)
Sat Apr 27 2013 23:00:00 GMT-0400 (EDT)
> d.getDate() // The same date provided in the constructor. Woo!
27
> d.getUTCDate() // A different date. Boo!
28
Long story short, if you want to read the exact date and time that were provided in the Date constructor, using the normal get* methods (like getDate) will do that. If you use the getUTC* methods (like getUTCDate) modified versions of the date and time will be returned.
I know this may sound rudimentary to some of the more experienced programmers out there, but this really helped me make sense of things. I hope it helps others who come along.
The only problem with the approach in your own answer is that it doesn't account for ambiguous times. This happens during daylight savings fall-back transitions.
For example, set your computer's time zone to US Mountain Time ("Mountain Time (US & Canada)" on Windows, or "America/Denver" on Mac/Linux). Then restart your browser and run the following JavaScript:
var dt = new Date(2013,10,3,1,0);
alert(dt);
This is November 3rd, 2013 at 1:00 AM. But you don't know which 1:00 AM it is representing. Is it in Mountain Daylight Time (UTC-6) before the transition, or Mountain Standard Time (UTC-7) after? There's no way to tell, and JavaScript will just use the standard time.
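You can check which instant the engine picked (a quick sketch, assuming the machine is actually set to America/Denver):
var dt = new Date(2013, 10, 3, 1, 0); // November 3rd 2013, 1:00 AM local
// 360 means it chose MDT (UTC-6, before the fall-back); 420 means MST (UTC-7, after).
console.log(dt.getTimezoneOffset());
console.log(dt.toISOString()); // the single UTC instant that was chosen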
Now if all you need is 2013-11-03 01:00, then you are correct: you can just ignore the offset and be done with it. But if you are going to use that value for anything meaningful - such as recording a point in time, or subtracting it from another point in time to get the duration between them - then you have a problem that you can't resolve without the offset.
Unfortunately, there is no great solution for this problem in JavaScript. The closest thing is Moment.js, but it is still not perfect. Still, it is better than the Date object by itself, because it gets around browser inconsistencies and provides better parsing and formatting.
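For instance, a hedged Moment.js sketch of explicit parsing (the format string is supplied, so nothing is guessed):
const moment = require('moment');
// moment(input, format) parses with an explicit format, in local time.
const m = moment('2013-11-03 01:00', 'YYYY-MM-DD HH:mm');
console.log(m.format()); // ISO-8601 with the local offset, e.g. 2013-11-03T01:00:00-06:00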
