I've run the following in the console on Firefox (version 21) and I'm getting results I don't expect.
new Date(1362891600000);                       // 2013-03-10T05:00:00Z -- midnight, March 10, 2013, US Eastern
var date = new Date(1362891600000);
var time = date.getHours();
new Date(date.setHours(date.getHours() + 24)); // the same instant plus 24 hours
The result really throws me for a loop.
The first date shows up as Eastern Daylight Time, while the second one shows up with Eastern Standard Time. It's totally backwards. This does not happen with IE or with Chrome.
What's going on here?
This is definitely a bug in Firefox. You should probably report it to them.
However, be aware that anything in the date string after the offset is non-standard, and support varies wildly across browsers and operating systems.
For example, some browsers display a time zone name, while others display an abbreviation or internal id. Also, some keep their own strings, and some use the values returned by the operating system. And on Windows, there is a different time zone database than on Linux or Mac. Also, some browsers may localize this string using language, locale, or culture settings.
You can display it to a user, if you know the value is in their own local time zone. But don't rely on it for anything critical.
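If you need something machine-readable, stick to the standardized parts. For example, getTimezoneOffset() is defined by the spec, unlike the trailing zone name (the value shown assumes a machine set to US Eastern time):
// Minutes behind UTC; note the sign is inverted from ISO 8601 conventions.
new Date(1362891600000).getTimezoneOffset(); // 300 on a US Eastern machine (EST, UTC-5)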
Related
I'm trying to access the native iOS date/time interface in a web app by using <input type="datetime-local"...>.
When using this style of <input>, the time value you set or get is a string in standard ISO format, like 2019-07-09T16:30.
You'd think that for whatever time you put in as a string value, the time you're going to see in the native interface (albeit reformatted and perhaps changed to 12-hour AM/PM form for some locales) is going to be the same time. And that's what I've seen so far using an Android tablet, and that's what I get most of the time using iOS.
But for dates over a century ago I'm finding that the time displayed by iOS is three minutes ahead of the ISO time string I put into the input. Likewise, if I edit the time via iOS, and stick to roughly the same antique time period, the time I get back out of the <input> is three minutes behind whatever I enter into the on-screen interface.
This offset is timezone dependent. My normal timezone is America/New_York. If I switch my iPhone or iPad to America/Chicago, the same sort of error occurs, but the difference is a full 9 minutes.
You can see the bug in action here: https://angular-ios-datetime.stackblitz.io (code here: https://stackblitz.com/edit/angular-ios-datetime)
I know what's going on actually...
On November 18, 1883, at 12:03:58 local mean solar time in New York City, the clocks were turned back three minutes and 58 seconds, back to exact noon. This was one of the many steps taken all over the country to switch to standardized time zones.
Chicago (and many other cities too) made a similar change, but Chicago's change from local mean time to standard time was closer to nine minutes.
What seems to be happening with <input type="datetime-local"...> is that there's a conflict between JavaScript and iOS, where one of these two (probably JavaScript) doesn't know about that long-ago timezone history, but the other (probably iOS) does know about it.
What I'd like to know then is this:
Is there a way to use <input type="datetime-local"...> that sticks with UTC instead? A non-standard feature of iOS, perhaps?
There used to be an option <input type="datetime"...> (without the -local), but it's been deprecated and isn't supported by most current web browsers, including Safari on iOS.
I agree that this is a bug in the iOS implementation of <input type="datetime-local" />. It impacts both Safari and Chrome on iOS. (I tested on iOS 12 with Safari Mobile 605.1 and Chrome 75.)
The interesting part is that it is not just a problem with historical dates, but also with any value that might be affected by the local time zone. For example, with the device set for New York time, try selecting 2020-03-08T02:00.
Note that I can get to 1 AM, but the 2 is grayed out. If you try to pick it, it moves to a different time. That's because 2:00 AM is invalid in New York on that day. The clock advances from 1:59 AM to 3:00 AM for the start of daylight saving time.
The problem arises if the application is picking a time for a different time zone than the device's time zone, such as Phoenix, Arizona, where DST doesn't apply, or perhaps a different country whose DST transitions fall on different dates.
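Incidentally, you can detect such a gap from script. A minimal sketch (the helper name is mine, and exactly which time an engine substitutes for a nonexistent local time is implementation-dependent, but the round-trip mismatch reveals the gap either way):
// True if the wall-clock time doesn't exist in the device's local zone
// (i.e., it falls in a "spring forward" gap). Month is 0-indexed.
function isInDstGap(year, month, day, hour, minute) {
    var d = new Date(year, month, day, hour, minute);
    return d.getHours() !== hour || d.getMinutes() !== minute;
}
isInDstGap(2020, 2, 8, 2, 0); // true with the device set to New York time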
This all boils down to interpretation of the word "local".
Consider the HTML specification for datetime-local, which says (emphasis mine):
The input element represents a control for setting the element's value to a string representing a local date and time, with no time-zone offset information.
Furthermore, it defines a "local date and time" as:
... consisting of a year, a month, and a day, and a time, consisting of an hour, a minute, a second, and a fraction of a second, but expressed without a time zone.
In other words, it is "a local date and time", not "the user's local date and time". By the way, this aligns perfectly with the terminology in the ISO 8601 specification.
Unfortunately, not only is the iOS implementation incorrect in this regard, but so was the Mozilla MDN documentation, which previously included the text "The user's local time zone is used." (I submitted an update to correct this.)
In short, datetime-local is supposed to be not associated with any time zone.
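One practical consequence: when reading the control's value, avoid new Date(value), because most current engines interpret a date-time string without an offset as device-local time. Here is a sketch that keeps the value zone-less by splitting it into plain numeric fields (the helper is mine, and it assumes the control's "YYYY-MM-DDTHH:mm" format):
// Parse "2019-07-09T16:30" into numbers with no time zone attached,
// so no local-zone interpretation ever happens.
function parseDateTimeLocal(value) {
    var m = /^(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2})/.exec(value);
    return {
        year: +m[1], month: +m[2], day: +m[3],
        hour: +m[4], minute: +m[5]
    };
}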
As a workaround, consider using two controls.
<input type="date" /> <input type="time" />
In iOS, they render as two separate fields, and you get a separate picker for each one.
It solves both the LMT issue and the DST-gap issue, because neither control does anything with the local time zone. As an added bonus, you get the year in the date picker, which is missing from the combined one.
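If the rest of your code expects the single combined value, you can reassemble it from the two controls. A sketch (the element IDs are mine):
// Recombine into the same "YYYY-MM-DDTHH:mm" string a datetime-local
// control would have produced.
var value = document.getElementById("date").value + "T" +
            document.getElementById("time").value;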
As of today, I've noticed a problem in one of the web apps running on my server: it reports the time zone wrong, but only in Firefox.
And indeed, checking in the console:
Intl.DateTimeFormat().resolvedOptions().timeZone
delivers
Etc/GMT-1
What's this supposed to mean? I am in a GMT+1 time zone (+2 counting the currently active daylight saving time).
On Chrome, the above command correctly returns Europe/Vienna.
OS: Windows 10, Firefox version: 59.0.2 (64 bit) with Add-Ons: NoScript, PrivacyBadger, uBlock Origin (tried with add-ons deactivated, no change), Chrome 65.0.3325.181 (64-Bit)
Is this a Firefox bug (though as far as I can tell Firefox was not updated in the past few days, and the problem only started today)?
Nothing changed in the web app either, as far as I can tell, so I suspect this is somehow a Firefox issue (though I have no idea how Firefox arrives at this "wrong" time zone info). Or is the time zone actually correctly referencing a time 2 hours ahead of UTC, and is the web app maybe wrong in not recognizing it properly? I'm fresh out of ideas.
Searching on Google hasn't brought up any recent hints regarding this (only some outdated ones, e.g. Incorrect timezone in Firefox, compared to Safari, using javascript Date(), https://support.mozilla.org/de/questions/1191823). I also can't find anything else pointing to unusual time zone readings in Firefox at the moment, so I'm really wondering where this is coming from!
UPDATE:
To be honest, I have no idea what time zone Firefox reported before. The actual problem I have is that the web app (ownCloud), when run in Firefox, reports not knowing the time zone specification; the message translates to "unknown timezone specification Etc/GMT-1. Falling back to UTC". The times are then 2 hours behind what I would expect them to be (which makes sense, as current Europe/Vienna or Europe/Berlin time is 2 hours ahead of UTC). Not knowing how to interpret Etc/GMT-1 might be an issue on ownCloud's part, but it worked up until a few days ago, and still continues to work in Chrome...
More info as requested by Matt Johnson below:
>tzutil /g
W. Europe Standard Time
>echo %TZ%
%TZ%
Registry:
I suppose I get Europe/Vienna instead of Europe/Berlin because I have configured my location as Austria?
Addendum: So far I only get this behavior on a single Windows 10 machine. On another machine running Firefox on Linux, I do not see it. I have yet to check another Windows 10 machine.
There are two separate questions here:
Etc/GMT-1
What's this supposed to mean? I am in a GMT+1 ...
The tz database identifiers of the form Etc/GMT±* deliberately have the sign inverted from the usual ISO 8601 convention. That is, positive values lie west of GMT, rather than east of it. This is covered both in the Wikipedia article on the tz database and in the commentary in the tz database itself.
Thus, it does indeed align with the current GMT+1 offset of your time zone. However, it doesn't reflect any DST of GMT+2.
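You can see the inversion directly in the console (exact output text varies by engine, but the offset will show as +1):
// "Etc/GMT-1" is a fixed offset one hour *east* of GMT, i.e. UTC+01:00,
// with no DST rules at all.
new Date(Date.UTC(2018, 3, 17, 12, 0)).toLocaleString("en-GB", {
    timeZone: "Etc/GMT-1",
    timeZoneName: "short"
}); // e.g. "17/04/2018, 13:00:00 GMT+1"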
... On Chrome, the above command correctly returns Europe/Vienna.
... Is this a Firefox bug?
That depends. What is your OS time zone set to? You said you are running Windows 10. Though Europe/Vienna maps to W. Europe Standard Time, the primary (001) mapping for that would be Europe/Berlin - so it's a bit odd that you would get Europe/Vienna unless there is something else influencing the result.
It is possible you have stumbled upon a bug, but it's also possible you have customized or corrupted time zone settings on your OS.
Please supply (via edit of your question) the values of:
The output of the tzutil /g command on the command line.
The value (if any) of a TZ environment variable (echo %TZ%)
All the values under this registry key (screenshot would be best):
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation
That will help identify the problem further. I will edit my answer accordingly. Thanks.
I have a code snippet in JavaScript, as shown below:
$(document).ready(function () {
    var day1 = new Date();
});
Now I open a browser (in this case FireFox), and the current time zone of my computer is UTC+10:00 Hobart. Then I inspect the day1 JavaScript variable in debug mode and look at the time zone name in its value.
The time zone name for UTC+10:00 Hobart is Tasmania Standard Time. FF gets the correct name, as I expected.
Next, I change the time zone on my computer to UTC+10:00 Canberra, Melbourne, Sydney while the FF browser is still open. Then I refresh FF by pressing Ctrl+F5 and view day1's value in debug mode again.
FF picks up the new correct time zone name, AUS Eastern Standard Time. It worked as I expected too.
MS Edge and IE worked the same way as FF.
BUT, Chrome worked in a different way.
The new time zone name that Chrome returns to me is Local Daylight Time, not AUS Eastern Standard Time as I expected and as the other browsers (FF, MS Edge, IE) returned.
Why doesn't Chrome return the "AUS Eastern Standard Time" name?
Is there any way to solve this for Chrome?
Unfortunately, the implementation of time zone names is not standardized. The ECMAScript spec allows them to be anything the vendor wants. They're not even just different across browsers, but also across different versions of the same browser, and across operating systems.
From ECMA-262 (5.1) Section 15.9.5.2 (emphasis mine)
Date.prototype.toString()
This function returns a String value. The contents of the String are implementation-dependent, but are intended to represent the Date in the current time zone in a convenient, human-readable form.
If what you're trying to do is detect the user's time zone, there is a newer standard for retrieving the time zone name as an IANA time zone identifier (e.g., Australia/Sydney), which you can do like this:
Intl.DateTimeFormat().resolvedOptions().timeZone
However, not all browsers have implemented this yet. There are libraries that can guess at it though, which you can read about here.
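A hedged sketch of that, with a graceful fallback for engines that haven't implemented it (some older implementations returned undefined for the resolved timeZone):
// Prefer the standard API; fall back to a guessing library otherwise.
var timeZone;
if (typeof Intl === "object" && typeof Intl.DateTimeFormat === "function") {
    timeZone = Intl.DateTimeFormat().resolvedOptions().timeZone;
}
if (!timeZone) {
    // fall back to a library-based guess (e.g., jsTimezoneDetect) here
}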
I have recently discovered that there is a new extension to JavaScript. This adds several features to the Date object in the toLocaleString, toLocaleDateString and toLocaleTimeString functions. Reference here.
I am particularly interested in the timeZone option, that supports IANA/Olson time zones, such as America/New_York or Europe/London. This is currently only supported in Google Chrome.
Previous advice was that to work in JavaScript with any other time zone than UTC or your own local time zone, one had to use a library. But now, it appears that this is starting to be incorporated directly into the browser. So now you can do this:
new Date().toLocaleString("en-US", {timeZone: "America/New_York"})
// output: "7/4/2013 5:15:45 PM"
Or:
new Date().toLocaleString("en-NZ", {timeZone: "Pacific/Chatham",
timeZoneName: "long"})
// output: "7/5/2013 9:59:52 AM GMT+12:45"
Or:
new Date().toLocaleString("en-GB", {timeZone: "Europe/London",
timeZoneName: "short"})
// output: "4/7/2013 22:18:57 United Kingdom Time"
// (strange time zone name, but ok)
This is very cool, but I have a few questions:
Is this part of a new standard? Perhaps buried somewhere in ECMAScript 6? Or is it just something custom to Chrome?
Why just Google Chrome? Is it supported anywhere else? Are there plans to support it anywhere else?
I checked node.js, which uses Chrome's JavaScript runtime, but it doesn't work there. Why not?
Is the time zone data accessible in any other way than the functions I listed? If only available when formatting strings, then doing any calculations based on the results may be difficult.
This is focused on output, but how would I use it for input? Is there a way to pass the time zone in the constructor to the Date object? I tried the following:
// parsing it with a date and time
new Date("2013-01-01 12:34:56 America/New_York")
// passing it as a named option
new Date(2013,0,1,12,34,56,{timeZone:"America/New_York"})
Neither worked. I couldn't find anything in the specs, so I don't think this exists (yet), but please tell me if I am wrong.
The issue described in this post, created by a flaw in the ECMAScript 5 spec, still affects the output, even when the correct data is in the TZDB. How is it that both old and new implementations are coexisting? One would think it would be all the old way, or all the new way. For example, with my computer's time zone set to US Eastern Time:
new Date(2004,10,7,0,0).toLocaleString("en-US",{timeZone:"America/New_York"})
returns "11/6/2004 11:00:00 PM". It should return midnight, since I started at midnight and my local time zone matches the output time zone. But it places the provided input date at the wrong UTC point due to the ES5 issue.
Can I expect that as IANA releases updates to the TZDB that Google will push Chrome updates that contain the changes?
update
There is a pretty extensive write-up about the API here.
Is this part of a new standard? Perhaps buried somewhere in ECMAScript 6? Or is it just something custom to Chrome?
Yes, these are part of the ECMAScript Internationalization API. It is implemented separately from ECMAScript, but a requirement of implementing the ECMAScript Internationalization API is to first have a correct implementation of ECMAScript 5.1.
Why just Google Chrome? Is it supported anywhere else? Are there plans to support it anywhere else?
In recent years, Google Chrome has mostly been first to implement new features. Mozilla is more conservative; for example, it is still discussing whether to implement the download attribute of <a> elements. The API is now available in the IE11 Beta and in Opera too, and it will be available in Firefox 25.
I checked node.js, which uses Chrome's JavaScript runtime, but it doesn't work there. Why not?
node.js just uses the same engine (V8), which is a separate project from the Google Chrome browser. The engine itself just implements ECMAScript 5.1; this is an extension node.js would have to implement separately right now. It will become available in V8 in Q3, so probably a bit after that you can use it in node.js.
This is focused on output, but how would I use it for input? Is there a way to pass the time zone in the constructor to the Date object? I tried the following:
There is nothing about inputting dates in the specification. I personally cannot see how this would be useful; you are doing it wrong if you are not transmitting UTC timestamps, because something like "2013-01-01 12:34:56 America/New_York" is ambiguous during transitions from DST to standard time.
The issue described in this post, created by a flaw in the ECMAScript 5 spec, still affects the output, even when the correct data is in the TZDB.
This is an input issue, not an output issue. Again, constructing a date with a local time zone that you cannot influence or detect is doing it wrong. Use the timestamp constructor overload or Date.UTC.
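A sketch of that approach, using the question's example date (midnight 2004-11-07 in New York is 05:00 UTC, since EST applies on that date):
// Construct the instant unambiguously in UTC, then format it into the
// target zone; no local-offset interpretation of the input is involved.
var d = new Date(Date.UTC(2004, 10, 7, 5, 0));
d.toLocaleString("en-US", {timeZone: "America/New_York"});
// "11/7/2004 12:00:00 AM"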
Can I expect that as IANA releases updates to the TZDB that Google will push Chrome updates that contain the changes?
Nothing in the spec, but I think it is reasonable to expect that the rules won't lag too far behind.
Is this part of a new standard? Perhaps buried somewhere in ECMAScript 6? Or is it just something custom to Chrome?
Indeed it is part of the new ECMA-402 standard. The standard is very difficult to read, but there is this friendly introduction.
Why just Google Chrome? Is it supported anywhere else? Are there plans to support it anywhere else?
MDN has a list of supporting browsers. According to Bug 853301 it will be available in Firefox 25.
I checked node.js, which uses Chrome's JavaScript runtime, but it doesn't work there. Why not?
Possible reasons are many: it is not in the current code base yet, or it would make node.js bigger and slower (the bug tracker entry from Mozilla mentioned above indicates that the time zone data increased the download size of Firefox by 10%, and caused I/O to increase substantially during browser start-up).
Is the time zone data accessible in any other way than the functions I listed? If only available when formatting strings, then doing any calculations based on the results may be difficult.
It seems that it is not. Furthermore, the Intl API primer says that only UTC and the local time zone are absolutely required to be supported.
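So if you depend on arbitrary IANA zones, a defensive feature test is in order. A sketch (per the spec, an unsupported timeZone value throws a RangeError):
// Returns true if this engine accepts arbitrary IANA time zone IDs,
// not just the spec-mandated minimum of UTC and the local zone.
function supportsIanaTimeZones() {
    try {
        new Intl.DateTimeFormat("en-US", {timeZone: "America/New_York"});
        return true;
    } catch (e) {
        return false;
    }
}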
This is focused on output, but how would I use it for input? Is there a way to pass the time zone in the constructor to the Date object? I tried the following:
The Intl API only speaks about date/time formatting, string collation, and number formatting; there is nothing about date input. The date/time formatting supports not only the Gregorian calendar but also many other kinds of calendars: lunar, lunisolar, and so forth.
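For instance (assuming the engine ships the relevant calendar data), you can request another calendar through the u-ca locale extension:
// Format the same instant using the Japanese era calendar instead of
// the Gregorian calendar, via the Unicode "u-ca" locale extension.
new Date(Date.UTC(2013, 6, 4)).toLocaleDateString("en-US-u-ca-japanese");
// e.g. "7/4/25" (year 25 of the Heisei era)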
The issue described in this post, created by a flaw in the ECMAScript 5 spec, still affects the output, even when the correct data is in the TZDB. How is it that both old and new implementations are coexisting? One would think it would be all the old way, or all the new way. For example, with my computer's time zone set to US Eastern Time:
new Date(2004,10,7,0,0).toLocaleString("en-US",{timeZone:"America/New_York"})
returns "11/6/2004 11:00:00 PM". It should return midnight, since I started at midnight and > my local time zone matches the output time zone. But it places the provided input date at the > wrong UTC point due to the ES5 issue.
The reason for that is that ES5 mandates that the input to new Date be calculated using the current DST and offset; that is, it is America/New_York but with the EDT offset, even though Nov 7, 2004 is not in EDT. Obviously, as this is so specified, it cannot be changed. However, since Chrome uses the TZDB to convert from the bare UTC point-in-time value to the America/New_York zone, it then correctly considers that instant as being in EST.
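To make the coexistence concrete, here is roughly what happens on an ES5-era engine with the machine set to US Eastern (exact outputs are implementation- and date-dependent):
var d = new Date(2004, 10, 7, 0, 0);  // input read with the *current* offset (EDT, UTC-4)
d.toISOString();                      // "2004-11-07T04:00:00.000Z" -- the wrong instant
d.toLocaleString("en-US", {timeZone: "America/New_York"});
// "11/6/2004 11:00:00 PM" -- the TZDB then formats that instant correctly, as EST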
Can I expect that as IANA releases updates to the TZDB that Google will push Chrome updates that contain the changes?
I'd believe so.
We have a simple function that works out a duration. It works fine in every browser apart from Safari on a Mac (it works in Chrome on the Mac, and in Safari on the PC).
For example,
new Date().toLocaleTimeString()
We expect this to give a time formatted like this:
11:59:25
However, on the Mac Safari we get this
11:59:25 GMT+01:00
Any calculations we do on these times are one hour out (it's adding the hour onto the calculation).
e.g.
11:59:25 - 11:59:25 = 01:00:00 (should be 00:00:00)
Any ideas?
Why is it adding the time zone to the string? This caused us a little issue with our database.
Why is it adding an hour to the string?
Why just in that one bloody browser!
Thanks for your time.
The toLocaleTimeString method relies on the underlying operating system in formatting dates. It converts the date to a string using the formatting convention of the operating system where the script is running. For example, in the United States, the month appears before the date (04/15/98), whereas in Germany the date appears before the month (15.04.98).
Methods such as getHours, getMinutes, and getSeconds give more consistent results than toLocaleTimeString. Use toLocaleTimeString when the intent is to display to the user a string formatted using the regional format chosen by the user. Be aware that this method, due to its nature, behaves differently depending on the operating system and on the user's settings.
Source: https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Date/toLocaleTimeString
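A minimal sketch of the safer approach: compute the duration from epoch milliseconds and format it yourself, so locale formatting (and any appended time zone suffix) never enters the arithmetic:
var start = new Date();
// ... the work being timed ...
var end = new Date();

// Difference in milliseconds; immune to locale/time-zone formatting.
var ms = end.getTime() - start.getTime();
var hh = Math.floor(ms / 3600000);
var mm = Math.floor(ms / 60000) % 60;
var ss = Math.floor(ms / 1000) % 60;
console.log(hh + ":" + mm + ":" + ss); // e.g. "0:0:0" for a zero-length interval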
In OS X, the time format can be fine-tuned via Apple menu > System Preferences > Language & Region > Advanced > Time. The format used by toLocaleTimeString() is the Long format. You can adjust the format as desired for your needs; however, keep in mind this change will be effective system-wide.
Source: Apple Support - Customize Formats