Javascript on Mac (Safari) toLocaleTimeString() oddities - javascript

We have a simple function that works out a duration. It works fine in every browser apart from Safari on a Mac (it works in Chrome on the Mac, and in Safari on the PC).
For example,
new Date().toLocaleTimeString()
We expect this to give a time formatted like this:
11:59:25
However, on the Mac Safari we get this
11:59:25 GMT+01:00
Any calculations we do on these times are one hour out (it's adding the hour onto the calculation)
e.g.
11:59:25 - 11:59:25 = 01:00:00 (should be 00:00:00)
Any ideas?
Why is it adding the time zone to the string? This caused us a little issue with our database.
Why is it adding an hour to the string?
Why just in that one bloody browser!
Thanks for your time.

The toLocaleTimeString method relies on the underlying operating system in formatting dates. It converts the date to a string using the formatting convention of the operating system where the script is running. For example, in the United States, the month appears before the date (04/15/98), whereas in Germany the date appears before the month (15.04.98).
Methods such as getHours, getMinutes, and getSeconds give more consistent results than toLocaleTimeString. Use toLocaleTimeString when the intent is to display to the user a string formatted using the regional format chosen by the user. Be aware that this method, due to its nature, behaves differently depending on the operating system and on the user's settings.
Source: https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Date/toLocaleTimeString
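As a rough sketch of that suggestion (the helper names below are my own, not from the question): build the display string and the duration from the numeric getters or from getTime(), instead of parsing the output of toLocaleTimeString(), so a trailing "GMT+01:00" can never leak into the arithmetic.

// Hypothetical helpers: HH:MM:SS display and a duration computed from timestamps.
const pad = (n) => String(n).padStart(2, '0');

function formatTime(date) {
  return pad(date.getHours()) + ':' + pad(date.getMinutes()) + ':' + pad(date.getSeconds());
}

function durationHms(startDate, endDate) {
  const totalSeconds = Math.floor((endDate.getTime() - startDate.getTime()) / 1000);
  const h = Math.floor(totalSeconds / 3600);
  const m = Math.floor((totalSeconds % 3600) / 60);
  const s = totalSeconds % 60;
  return pad(h) + ':' + pad(m) + ':' + pad(s);
}

const start = new Date();
console.log(formatTime(start));         // e.g. "11:59:25", with no time zone suffix
console.log(durationHms(start, start)); // "00:00:00", not "01:00:00"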

In OS X, the time format can be fine-tuned via Apple menu > System Preferences > Language & Region > Advanced > Time. The format used by toLocaleTimeString() is the Long format. You can adjust the format as desired for your needs; however, keep in mind that this change takes effect system-wide.
Source: Apple Support - Customize Formats

Related

How does Chrome determine how dates are formatted with method toLocaleString()?

I am displaying dates client-side via the toLocaleString() method (which is backed by Intl.DateTimeFormat).
The page is accessed via Chrome from a German time zone.
The OS (Ubuntu) language is set to English (US). The region is set to Germany.
The code for date display is like this:
const date = new Date();
const displayDateStr = date.toLocaleString();
console.log(displayDateStr);
The formatting is shown as "9/28/2022, 3:00:20 PM", which is probably due to the Chrome language being set to English (US) and hence expected.
If I go to Chrome DevTools -> Sensors pane (Cmd + Shift + P and type sensors), I can overwrite the "location".
Now if I select "Berlin" for location, both the time-zone and the display format are displayed in German (i.e. "28.9.2022, 15:00:20").
The change of location in Chrome DevTools (which affects both the time zone and the date formatting) behaves differently from a change of the OS's location setting (which affects the time zone, but not the formatting).
My confusion now is where the default language for date formatting is generally taken from. Is relying on the default format reliable, or is it best practice to specify it manually for users?
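If consistent output matters more than honoring whatever defaults the browser picks, one option (a sketch, not a prescription; the 'de-DE' locale and 'Europe/Berlin' zone are example values) is to pass an explicit locale and options to toLocaleString(), so the result no longer depends on the browser or OS settings:

const date = new Date();
// Explicit locale and time zone: stable across browsers and OS language settings.
const displayDateStr = date.toLocaleString('de-DE', {
  dateStyle: 'medium',
  timeStyle: 'medium',
  timeZone: 'Europe/Berlin',
});
console.log(displayDateStr); // e.g. "28.09.2022, 15:00:20"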

Why does the difference between two timestamps have a one-hour deviation from the expected value?

Take the date '2022-04-01' and another date '2022-05-15' for example. When I calculated their difference in Chrome DevTools, what I got is:
The result is 3801600000. But when my friend did the same thing on another device, what he got is:
The result is 3798000000. The difference between 3801600000 and 3798000000 is exactly one hour. What might cause this result? How can I eliminate this difference?
Your input lacks the time zone data:
UTC datetime: new Date("2021-01-01T12:00:00Z");
UTC-4 datetime: new Date("2021-01-01T12:00:00-04:00");
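To make that concrete (the values are just illustrations), the two forms resolve to different instants, and that difference shows up directly in getTime():

const utc = new Date('2021-01-01T12:00:00Z');
const minus4 = new Date('2021-01-01T12:00:00-04:00');
console.log(minus4.getTime() - utc.getTime()); // 14400000 ms, i.e. exactly 4 hours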
The main issue you are experiencing is because your input string is being interpreted as assigned to the local time zone, and you have different time zones on the two machines you've been testing with. One of them has a DST transition between the two dates, and the other does not - which explains your one hour difference.
The examples you show also reveal your possible time zones:
Your machine is showing 8 hours ahead of UTC (UTC+8) for both timestamps. There are several parts of the world that use UTC+8 without DST, including China, Western Australia, Irkutsk Russia, and many others. There are no places on earth that use UTC+8 in conjunction with DST.
Your friend's machine is a different story. The timestamps are showing 2 hours ahead of UTC (UTC+2) on 2022-04-01, and 3 hours ahead of UTC (UTC+3) on 2022-05-15. While many countries use those offsets (such as those that are in Eastern Europe that use EET/EEST), none of those areas have a DST transition between those two dates. See the bottom of this table of DST transitions. All of the +2/+3 areas of the world transitioned in March. I can only conclude that your friend's machine has some non-standard time zone setting, or they are significantly behind on time zone data updates, or both. (Please reply in comments if I am incorrect on this!)
Also, your input string format, 2022-04-01 00:00:00 is not in the standard date time string format defined by the ECMAScript specification. Thus, it is not guaranteed to be parsed consistently by all browsers. Current versions of Chrome, Edge, and Firefox will interpret it as a local date and time, but the current version of Safari will fail with "Invalid Date".
If you want it interpreted as local time correctly in all browsers, you would need to specify it as 2022-04-01T00:00:00.
If you want it interpreted as UTC, then specify it as 2022-04-01T00:00:00Z.
If you want it interpreted with a specific time zone offset, then append that instead, as in: 2022-04-01T00:00:00+08:00.
If you must parse the string in the original format, then don't do it using the Date object. Either use a library (such as Luxon or date-fns), or parse it yourself with regex and/or string manipulation techniques.
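As a sketch of the do-it-yourself option (the parseLocal helper is hypothetical, not from the answer): parse the pieces with a regex and hand them to the multi-argument Date constructor for local time, or use Date.UTC when you want a difference that is identical on every machine.

// Parse "YYYY-MM-DD" or "YYYY-MM-DD HH:mm:ss" explicitly as local time.
function parseLocal(str) {
  const m = /^(\d{4})-(\d{2})-(\d{2})(?:[ T](\d{2}):(\d{2}):(\d{2}))?$/.exec(str);
  if (!m) throw new Error('Unrecognized date string: ' + str);
  return new Date(+m[1], +m[2] - 1, +m[3], +(m[4] || 0), +(m[5] || 0), +(m[6] || 0));
}

// Local interpretation: can differ by an hour between machines whose time zones
// have a DST transition between the two dates.
console.log(parseLocal('2022-05-15') - parseLocal('2022-04-01'));

// UTC interpretation: identical everywhere (44 days = 3801600000 ms).
console.log(Date.UTC(2022, 4, 15) - Date.UTC(2022, 3, 1)); // 3801600000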

Is there a fix for this obscure off-by-a-few-minutes iOS datetime-local bug?

I'm trying to access the native iOS date/time interface in a web app by using <input type="datetime-local"...>.
When using this style of <input>, the time value you set or get is a string in standard ISO format, like 2019-07-09T16:30.
You'd think that for whatever time you put in as a string value, the time you're going to see in the native interface (albeit reformatted and perhaps changed to 12-hour AM/PM form for some locales) is going to be the same time. And that's what I've seen so far using an Android tablet, and that's what I get most of the time using iOS.
But for dates over a century ago I'm finding that the time displayed by iOS is three minutes ahead of the ISO time string I put into the input. Likewise, if I edit the time via iOS, and stick to roughly the same antique time period, the time I get back out of the <input> is three minutes behind whatever I enter into the on-screen interface.
This offset is timezone dependent. My normal timezone is America/New_York. If I switch my iPhone or iPad to America/Chicago, the same sort of error occurs, but the difference is a full 9 minutes.
You can see the bug in action here: https://angular-ios-datetime.stackblitz.io (code here: https://stackblitz.com/edit/angular-ios-datetime)
I know what's going on actually...
On November 18, 1883, at 12:03:58 local mean solar time in New York City, the clocks were turned back three minutes and 58 seconds, back to exact noon. This was one of the many steps taken all over the country to switch to standardized time zones.
Chicago (and many other cities too) made a similar change, but Chicago's change from local mean time to standard time was closer to nine minutes.
What seems to be happening with <input type="datetime-local"...> is that there's a conflict between JavaScript and iOS, where one of these two (probably JavaScript) doesn't know about that long-ago timezone history, but the other (probably iOS) does know about it.
What I'd like to know then is this:
Is there a way to use <input type="datetime-local"...> that sticks with UTC instead? A non-standard feature of iOS, perhaps?
There used to be an option <input type="datetime"...> (without the -local), but it has been deprecated and isn't supported by most current web browsers, including Safari on iOS.
I agree that this is a bug in the iOS implementation of <input type="datetime-local" />. It impacts both Safari and Chrome on iOS. (I tested on iOS 12 with Safari Mobile 605.1 and Chrome 75.)
The interesting part is that it is not just a problem with historical dates, but also with any value that might be affected by the local time zone. For example, with the device set for New York time, try selecting 2020-03-08T02:00.
Note that I can get to 1 AM, but the 2 is grayed out. If you try to pick it, it moves to a different time. That's because 2:00 AM is invalid in New York on that day. The clock advances from 1:59 AM to 3:00 AM for the start of daylight saving time.
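You can see the same gap outside the picker, in plain JavaScript (a sketch; it only demonstrates the point if the environment's time zone is actually set to America/New_York):

// 2:00 AM local time does not exist on 2020-03-08 in America/New_York.
const d = new Date(2020, 2, 8, 2, 0); // months are zero-based, so 2 = March
console.log(d.toString()); // typically reports 3:00 AM EDT, the instant the clock jumps to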
The problem arises if the application is picking a time for a different time zone than the device's time zone, such as Phoenix, Arizona, where DST doesn't apply, or perhaps a different country that has DST transitions on different dates.
This all boils down to interpretation of the word "local".
Consider the HTML specification for datetime-local, which says (emphasis mine):
The input element represents a control for setting the element's value to a string representing a local date and time, with no time-zone offset information.
Furthermore, it defines a "local date and time" as:
... consisting of a year, a month, and a day, and a time, consisting of an hour, a minute, a second, and a fraction of a second, but expressed without a time zone.
In other words, it is "a local date and time", not "the user's local date and time". By the way, this aligns perfectly with the terminology in the ISO 8601 specification.
Unfortunately, not only is the iOS implementation incorrect in this regard, but so was the Mozilla MDN documentation, which previously included the text "The user's local time zone is used." (I submitted an update to correct this.)
In short, datetime-local is supposed to be not associated with any time zone.
As a workaround, consider using two controls.
<input type="date" /> <input type="time" />
They will appear as two separate fields in iOS, with a separate native picker for each one.
It solves both the LMT issue and the DST-gap issue, because neither control does anything with the local time zone. As an added bonus, you get the year in the date picker, which is missing from the combined one.
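A minimal sketch of gluing the two controls back together (the element IDs and the 'T' join are my assumptions, not part of the original answer): read both values and concatenate them into the same ISO-like local string a single datetime-local input would have produced.

// Assumes markup like: <input type="date" id="datePart"> <input type="time" id="timePart">
const dateInput = document.getElementById('datePart');
const timeInput = document.getElementById('timePart');

function getCombinedValue() {
  if (!dateInput.value || !timeInput.value) return '';
  return dateInput.value + 'T' + timeInput.value; // e.g. "2019-07-09" + "T" + "16:30"
}

function setCombinedValue(isoLocal) {
  const [datePart, timePart] = isoLocal.split('T');
  dateInput.value = datePart;
  timeInput.value = timePart;
}

setCombinedValue('2019-07-09T16:30');
console.log(getCombinedValue()); // "2019-07-09T16:30"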

Chrome cannot get the TimeZone name correctly after changing the computer's TimeZone

I have a snippet of javascript code as shown below:
$(document).ready(function () {
var day1 = new Date();
});
Now I open a browser (in this case FireFox) while the current TimeZone of my computer is UTC+10:00 Hobart. When I view the day1 javascript variable in debug mode, it displays its value as shown in the image below.
The TimeZone name of UTC+10:00 Hobart is Tasmania Standard Time. FF gets the correct name, as I expected.
Next, I change the TimeZone on my computer to UTC+10:00 Canberra, Melbourne, Sydney while the FF browser is still open. Then I refresh FF by pressing Ctrl+F5 and view day1's value in debug mode again.
As you can see, FF picked up the new, correct TimeZone name, AUS Eastern Standard Time. It worked as I expected too.
MS Edge and IE worked the same way as FF.
BUT Chrome worked in a different way.
The new TimeZone name that Chrome returns to me is Local Daylight Time, not AUS Eastern Standard Time as I expected and as the other browsers (FF, MS Edge, IE) returned.
Why doesn't Chrome return the "AUS Eastern Standard Time" name?
Is there any way to solve this for Chrome?
Unfortunately, the implementation of time zone name is not standardized. The ECMAScript spec allows it to be anything the vendor wants it to be. It's not even just different across browsers, but also across different versions of the browser, and across operating systems.
From ECMA-262 (5.1) Section 15.9.5.2 (emphasis mine)
Date.prototype.toString()
This function returns a String value. The contents of the String are implementation-dependent, but are intended to represent the Date in the current time zone in a convenient, human-readable form.
If what you're trying to do is detect the user's time zone, there is a newer standard for retrieving the time zone name as an IANA time zone identifier (e.g., Australia/Sydney), which you can do like this:
Intl.DateTimeFormat().resolvedOptions().timeZone
However, not all browsers have implemented this yet. There are libraries that can guess at it though, which you can read about here.
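For example (a sketch; the try/catch fallback is my own addition for environments that don't report a zone):

// Returns an IANA identifier such as "Australia/Sydney", or null if unavailable.
function detectTimeZone() {
  try {
    const tz = Intl.DateTimeFormat().resolvedOptions().timeZone;
    if (tz) return tz;
  } catch (e) {
    // Intl missing or timeZone not exposed by this browser.
  }
  return null; // fall back to a library-based guess or an explicit user setting
}

console.log(detectTimeZone());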

javascript seems to be using time zones backwards with Firefox

I've run the following in the console on Firefox (version 21) and I'm getting results I don't expect.
new Date(1362891600000);                        // construct a Date from a fixed timestamp
var date = new Date(1362891600000);             // same timestamp again
var time = date.getHours();                     // hours in the local time zone
new Date(date.setHours(date.getHours() + 24));  // same wall-clock hour, one calendar day later
The result really throws me for a loop.
The first date shows up as Eastern Daylight Time, while the second one shows up with Eastern Standard Time. It's totally backwards. This does not happen with IE or with Chrome.
What's going on here?
This is definitely a bug in Firefox. You should probably report it to them.
However, be aware that anything after the offset is non-standard and support varies wildly across browsers and operating systems.
For example, some browsers display a time zone name, while others display an abbreviation or internal id. Also, some keep their own strings, and some use the values returned by the operating system. And on Windows, there is a different time zone database than on Linux or Mac. Also, some browsers may localize this string using language, locale, or culture settings.
You can display it to a user, if you know the value is in their own local time zone. But don't rely on it for anything critical.
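If you only need something machine-readable rather than a display name, a safer bet (sketch) is the numeric offset from getTimezoneOffset(), which is standardized, rather than parsing the trailing text of toString():

const date = new Date(1362891600000);
// Minutes behind UTC (positive west of UTC); consistent across browsers,
// unlike the human-readable zone name at the end of toString().
const offsetMinutes = date.getTimezoneOffset();
const sign = offsetMinutes > 0 ? '-' : '+';
const abs = Math.abs(offsetMinutes);
const pad = (n) => String(n).padStart(2, '0');
console.log('UTC' + sign + pad(Math.floor(abs / 60)) + ':' + pad(abs % 60)); // e.g. "UTC-04:00"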
