Does JavaScript's Date object automatically handle daylight savings?

I am investigating an issue involving the conversion of serialized UTC dates into JavaScript Date objects; I have read a few questions on this topic already, but I am still unclear.
Firstly let me say that I am in the UK. If I take for example the UTC epoch 1473805800000, which is Tue, 13 Sep 2016 22:30:00 GMT, then use that value to create a JavaScript date:
var date = new Date(1473805800000);
console.log(date);
The console logs:
Tue Sep 13 2016 23:30:00 GMT+0100 (GMT Summer Time)
I.e. the browser has recognised that an extra hour needs to be added for DST.
My question is, if I were to run this same code again after the 30th October when the clocks have gone back, would I still get the same result of 23:30, or would it be 22:30 as if it were GMT? In other words, has the browser added an hour because the subject date is DST or because we are currently in DST?
I'm prevented from altering my work station's system clock by group policy, otherwise I would skip it forward in time and test this myself.

JavaScript Date objects use a time value that is an offset in milliseconds since 1970-01-01T00:00:00Z. It is always UTC.
If the Date constructor is given a single number argument, it is treated as a UTC time value, so represents the same instant in time regardless of system time zone settings.
When you use console.log(date), the built-in toString method is called, which generates an implementation-dependent string, generally using the current time zone setting of the host system to create a convenient, human-readable string.
The system's daylight saving rules are used to determine the offset for "local" time, so if the date changes from a time when daylight saving applies to one when it doesn't, the time zone offset is adjusted accordingly (note that daylight saving offsets aren't always 1 hour). It does not matter what the current system offset is; the offset used is based on the rules for the date and time that the time value represents.
Also, Date objects are very simple: they're just a time value. The time zone offset comes from system settings; it's not a property of the Date itself.
So, given:
My question is, if I were to run this same code again after the 30th
October when the clocks have gone back, would I still get the same
result of 23:30, or would it be 22:30 as if it were GMT?
the answer is "yes": it will still be 23:30, since BST applies on 13 September. It doesn't matter when the code is run, only what the system's offset rules say for that date.
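If you can't change the system clock, you can still verify this by comparing the offsets the engine reports for a summer date and a winter date. A minimal sketch, assuming a UK (Europe/London) system time zone:
var summer = new Date(1473805800000);          // Tue 13 Sep 2016, during BST in the UK
var winter = new Date(Date.UTC(2016, 11, 13)); // Tue 13 Dec 2016, during GMT in the UK
console.log(summer.getTimezoneOffset()); // -60 on a UK machine, whenever this is run
console.log(winter.getTimezoneOffset()); // 0 on a UK machine, whenever this is run
The offsets depend on the date each object represents, not on the date you run the code.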

In your case, the Date was created from the epoch value 1473805800000 and converted to your time zone, GMT+0100. Epoch values are always UTC, so it was read as UTC and converted to your current time zone.
On September 13 2016, summer time was in effect for your zone (GMT+01), so it was taken into account in the calculation.
For comparison, in my time zone, logging a Date at the time of writing gives:
Thu Sep 15 2016 14:13:14 GMT-0300 (E. South America Standard Time)
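A small sketch that makes the distinction visible in any console: the time value and the UTC rendering never change, only the local rendering does.
var date = new Date(1473805800000);
console.log(date.getTime());     // 1473805800000 - identical in every time zone
console.log(date.toISOString()); // "2016-09-13T22:30:00.000Z" - always UTC
console.log(date.toString());    // local rendering, e.g. "Tue Sep 13 2016 23:30:00 GMT+0100 ..." in the UK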

Related

When formatting date object - does it matter that the T and the 000Z will be removed when storing to db?

Sorry if it's a very basic question, but I don't understand the following:
When I format the Date object (no matter what library I used), I get a string.
from this: 2022-11-28T16:55:44.000Z (new Date object)
I get this: 2022-11-28 16:55:44 (or other formats obviously depending how I format it)
Even if I turn it back into an object, the T and 000Z will never be there anymore. Do I just ignore that (it seems like it, as any library or date method ignores the T and the string ending when formatting), or do I add it 'back'? Isn't it a problem if the dates stored in my db are different (for later queries etc.)?
The Z indicates UTC (Coordinated Universal Time, which for practical purposes is the same as Greenwich Mean Time); dropping it changes the meaning - unless your browser or server lives in the Greenwich time zone and it is winter (no daylight saving time).
You can convert back and forth between a Date object and a UTC string as follows (my browser lives in the Central European time zone):
> utc = '2022-11-28T16:55:44.000Z'
'2022-11-28T16:55:44.000Z'
> d = new Date(utc)
Mon Nov 28 2022 17:55:44 GMT+0100 (Central European Standard Time)
> d.toISOString()
'2022-11-28T16:55:44.000Z'
Alternatively, you can convert back and forth between a Date object and a formatted string in your browser's or server's time zone (the last line shows that my browser's format differs from yours):
> formatted = '2022-11-28 17:55:44'
'2022-11-28 17:55:44'
> d = new Date(formatted)
Mon Nov 28 2022 17:55:44 GMT+0100 (Central European Standard Time)
> d.toLocaleString()
'11/28/2022, 5:55:44 PM'
But you should not store the Date objects in this format in a database, unless you can guarantee that they are always read and written in the same time zone. For example, if you format a Date object with your browser (in CET) and store it, then someone else who reads it and converts it back to a Date object with their browser in the New Zealand time zone will see a wrong value. Also, dates like 9/11/2022 are ambiguous if the formatting rules are not clear (September 11th or November 9th?).
That's why I would prefer UTC strings when storing Date objects and use formatted strings only for outputting them to the user and for parsing user input.
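As a minimal sketch of that pattern (saveToDb and readFromDb are hypothetical placeholders, not a real API):
// Write: always serialize as a UTC ISO string before storing.
var created = new Date();
saveToDb({ createdAt: created.toISOString() }); // saveToDb is a placeholder for your own persistence call

// Read: parse the stored UTC string, and format only for display.
var row = readFromDb();                          // readFromDb is a placeholder as well
var createdAt = new Date(row.createdAt);
console.log(createdAt.toLocaleString());         // rendered in the viewer's own time zone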
I would put it even more strongly: you should never store dates as strings; it's a design flaw. Always store proper Date objects. Here on SO you can find hundreds of questions where people have problems because they stored date values as (localized) strings. This is not limited to MongoDB; it applies to any database.
Date objects in MongoDB are UTC times - always and only! Usually the client application is responsible for displaying the date/time in the local time zone and local format.
What do you mean by "turn it back", i.e. how do you do it?
You should not rely on new Date(<string>) without a time zone designator. Some browsers/environments may assume UTC, others may use the current local time zone; see the sketch below and Differences in assumed time zone.
Have a look at 3rd-party date libraries, e.g. moment.js, Luxon, or Day.js. Usually they provide better control over how strings and time zones are parsed.
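To illustrate the point about designators, a small sketch of why being explicit matters:
new Date('2022-11-28 17:55:44');       // no designator: engines have not always agreed on UTC vs. local time
new Date('2022-11-28T17:55:44Z');      // explicit UTC: unambiguous everywhere
new Date('2022-11-28T17:55:44+01:00'); // explicit offset: unambiguous everywhere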

Differences in dates between .Net and Javascript

I had an issue where dates were out by one day when displayed on an ASP.NET web form. These dates are only used for display, so I can pass them as strings to resolve the issue, but I'm curious about why I'm seeing this behaviour.
I'm in Ireland, and while Ireland is more or less in line with GMT, we use IST (Irish Standard Time) during summer instead of DST and then revert to GMT for winter. This has the same practical effect as being on GMT with DST, but "officially" it is slightly different.
Because we're not simply on GMT with DST, in the past IST and DST didn't always line up.
For example, in 1958, IST started on the 20th of April and ended on the 5th of October, whereas DST started on the 27th of April and ended on the 26th of October.
So if a date between the 5th and 26th of October 1958 is passed to JS, JS will display it as the previous day.
I wrote this code to try and understand what's going on:
DateTime date = new DateTime(1958, 10, 4);
while (date <= new DateTime(1958, 10, 30))
{
    Console.WriteLine($"normal   : {date} | isDst? : {date.IsDaylightSavingTime()}");
    Console.WriteLine($"universal: {date.ToUniversalTime()} | isDst? : {date.ToUniversalTime().IsDaylightSavingTime()}");
    Console.WriteLine($"local    : {date.ToLocalTime()} | isDst? : {date.ToLocalTime().IsDaylightSavingTime()}");
    Console.WriteLine("-------------------------");
    date = date.AddDays(1);
}
Which produced this output (truncated):
So I can see there are a number of days being misidentified as DST days, but it doesn't seem like that would cause this. If both .NET and JS thought they were DST days, then surely the end result should be correct?
Additionally, why is there a 2 hour difference between the output of ToUniversalTime and ToLocalTime during DST?
Here's a screenshot of JS processing a few dates during this problematic window
You can see that JS (or Chrome?) is aware that between the 5th and the 27th of October that year, Ireland was no longer on GMT+1 (even though it still says IST), so why is the date passed from VB incorrect? I thought they both got their date/time information from the same source, i.e. the host PC?
You appear to be running .NET on Windows, in which case .NET is using the Windows time zone data for your local time zone.
The Windows time zone data does not have the full history of time zone changes from all time. Microsoft's policy only guarantees historical data is present for dates from 2010 forward, though some time zones have historical data before then.
Conversely, Chrome is using time zone data from ICU, which uses IANA time zone data. IANA time zones have historical data since at least 1970, though many time zones have historical data before then.
With specific regard to Ireland, IANA has Irish time zone data going back to 1880. Windows has no history for Ireland at all, so it assumes the current rule has always been in effect. In reality, the current rule has been in effect since Oct 1968, so any dates before then will only have accurate time zone information in the IANA data.
If you run the same .NET code you showed above on Linux or MacOS, you'll see that .NET will use IANA time zone data on those platforms and your results will match up for 1958. Or, if you pick a more recent date your results will match on Windows too.
In short - don't run these sorts of tests on old dates and expect to get the same fidelity that you'll get with modern dates.
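You can see the IANA-backed behaviour from JavaScript itself; a sketch, assuming a browser or a Node.js build that ships full ICU data:
// 10 Oct 1958: after Irish summer time ended (5 Oct) but before the 26 Oct DST end described in the question.
var d = new Date(Date.UTC(1958, 9, 10, 12, 0, 0));
console.log(d.toLocaleString('en-GB', { timeZone: 'Europe/Dublin', timeZoneName: 'short' }));
// Expect a GMT (+00:00) rendering here, because the IANA database carries the historical Irish rules.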
You also asked:
Additionally, why is there a 2 hour difference between the output of ToUniversalTime and ToLocalTime during DST?
Your date variables are all DateTime values whose .Kind is DateTimeKind.Unspecified. The IsDaylightSavingTime method treats such values as if they belonged to the local time zone, as if they actually had DateTimeKind.Local. The same is true for the ToUniversalTime method. However, the ToLocalTime method assumes that DateTime values with DateTimeKind.Unspecified are in terms of UTC, as if they were DateTimeKind.Utc. Thus, when DST is in effect, date.ToUniversalTime() shifts an hour backward and date.ToLocalTime() shifts an hour forward, which produces the 2-hour difference you observed.
You can avoid such ambiguities by using DateTimeOffset instead of DateTime.

convert local timezone timestamp to UTC timestamp

My server returns date data as local timezone timestamps.
On the client side, I want to display those dates as local date strings. If I do the following, I get the wrong date ("6/30/2014" instead of "7/01/2014"):
var ts = 1404172800;
new Date(ts * 1000).toLocaleDateString()
>>>"6/30/2014"
To prevent this problem, I suppose I have to convert the local-time-zone timestamp I receive from the server to a UTC timestamp before creating the new Date() object.
Am I right? What is the best way to achieve that so it works in most browsers?
Edit:
I confirm that the real date in the local time zone should be 7/01/2014. My local zone is US Eastern, UTC-5 (UTC-4 during DST), but the new Date() object treats the value as UTC when it isn't. I suppose that's because the date is returned as a timestamp without having been converted to UTC.
Isn't that right already? Timestamps are always in UTC.
You're seeing 30th June and not 1st of July because when that event happened, in the local time zone, it was still 30th of June. For example, for me it is showing as 1st of July in IST.
Also, this timestamp represents an event which occurred at exactly 00:00:00 GMT on the 1st of July 2014. India is GMT+05:30, as you can see in the screenshot - but in any local time zone that is even one minute behind GMT, it would still be the 30th of June.
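If what you actually want to display is the calendar date in UTC rather than in the viewer's zone, you can ask for it explicitly; a sketch that should work in any reasonably modern browser:
var ts = 1404172800;
var date = new Date(ts * 1000);
console.log(date.toISOString().slice(0, 10));                       // "2014-07-01" - the UTC calendar date
console.log(date.toLocaleDateString('en-US', { timeZone: 'UTC' })); // "7/1/2014" - formatted, but pinned to UTC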

Difference between UTC, GMT and Daylight saving time in JS

I have already read a lot of posts and am still a little confused about UTC, GMT and daylight saving time.
Can anyone explain the JavaScript Date() object with regard to UTC, GMT and daylight saving time?
The main point I want to know is whether we need to think about daylight saving time when working with dates.
Also, is the handling of UTC, GMT and daylight saving time the same across different programming languages?
UTC is a standard, GMT is a time zone. UTC uses the same offset as GMT, i.e. +00:00. They are essentially interchangeable when discussing offsets.
All JavaScript (ECMAScript) Date objects use a time value that is UTC milliseconds since 1970-01-01T00:00:00Z. When the Date constructor is called without any arguments, it gets the time and time zone offset from the host system and calculates the time value, so the accuracy of the generated date depends on the accuracy of those settings.
When outputting date values using the UTC methods (e.g. getUTCHours, getUTCMinutes, etc.), the values are UTC (GMT). When not using those methods (e.g. getHours, getMinutes, etc.), the host system's time zone offset is applied to the same time value to generate "local" values.
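For example (a sketch; the second logged value assumes a host system set to UTC+01:00):
var d = new Date(Date.UTC(2016, 8, 13, 22, 30)); // 2016-09-13T22:30:00Z
console.log(d.getUTCHours()); // 22 - read straight from the UTC time value
console.log(d.getHours());    // 23 on a UTC+01:00 system - the host offset applied to the same time value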
Whether daylight saving is applied or not depends on the host system settings. Traditionally, whatever the current daylight saving rules were for the host system's time zone were applied to all dates, regardless of the offset actually in force on that date (e.g. if DST currently starts on the first Sunday in October, it was assumed to have always started on the first Sunday in October); modern engines with full IANA time zone data generally apply the historical rules instead.
Date object behaviour is described in ECMA-262 §20.3.2 and a bit more clearly (for some parts) in MDN Date.

How do I get timezone codes such as BST / GMT / CET etc?

I want to show the time zone code (such as BST or GMT) based on the user's locale. However, each browser positions it differently in the output of new Date().toLocaleString(), and in some browsers (such as Opera) it is not available at all.
It looks like the solution would be to get a database or structure of time zone codes against time zone offsets, but there can be multiple matches (e.g. BST == GMT) and I can't even find such a list.
Does this mean it can't be done?
You have to manually define the time zone abbreviations in your code. It is complicated because "GMT" is often used loosely to mean UK local time, which switches to BST for daylight saving, whereas UTC (Zulu) never observes daylight saving; as pure offsets, GMT and UTC are the same (+00:00).
A further source of confusion is that not all political zones define their time zones as a whole number of hours. India is UTC+5:30 and Afghanistan UTC+4:30. Notice that the base is UTC and not GMT, as those countries do not observe daylight saving. Some small island territories in the southwest Pacific are even offset by a quarter of an hour.
Then some political time zone definitions have internal differentiation. Arizona, for example, sits in the US Mountain time zone (UTC-7) but does not observe daylight saving, so for half the year its clocks match US Pacific Daylight Time even though its offset never actually changes.
With all that confusion there are hundreds of abbreviations for time zones, and some of them change as political boundaries change. I just say screw it and either use the user's clock time or work in UTC, setting a constant in your JavaScript from the server that reflects the UTC time at which the user requested the page.
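For completeness: engines that support the Intl API (which post-dates this answer) can produce a zone name for you, sidestepping the need for your own table. A sketch, noting that the abbreviation returned varies by locale and zone:
var parts = new Intl.DateTimeFormat('en-GB', { timeZoneName: 'short' }).formatToParts(new Date());
var zone = parts.filter(function (p) { return p.type === 'timeZoneName'; })[0];
console.log(zone && zone.value); // e.g. "GMT" or "BST" on a UK system, depending on the date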
