jQuery date picker shows different day on phone vs PC? - javascript

I am baffled by this situation.
My PC shows the jQuery date picker properly, whereas accessing the same webpage from the phone shows the date picker's days offset by 1.
I.e.
10/4/2012 = Tuesday on PC.
10/4/2012 = Monday on phone.
Things I have checked:
- Both say April 2012 along the top.
- The date and time are set correctly on the phone.
- Both are using GMT London time.
- Both are running from the same page, hence the same code.

This is because JavaScript runs client-side, not server-side, so it takes the date from the device.
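One common way such an off-by-one arises (a sketch of the mechanism, not necessarily the asker's exact case): parsing a date-only ISO string yields midnight UTC, and a device whose zone data resolves to an offset behind UTC then renders the previous calendar day.

```javascript
// A date-only ISO string parses as midnight UTC in ES5+ engines.
const d = new Date("2012-04-10");

// The UTC calendar date is unambiguous: Tuesday, 10 April 2012.
console.log(d.getUTCDay()); // 2 (0 = Sunday … 6 = Saturday)

// The *local* day depends on the device's zone settings: on a device
// resolving to UTC-5, this is still Monday the 9th — an off-by-one.
console.log(d.getDay(), d.getDate());
```

If the two devices disagree on their zone or its offset for that date, they disagree on the local day they display.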

Related

Differences in dates between .Net and Javascript

I had an issue where dates were out by one day when displayed on an ASP.NET web form. These dates are only used for display, so I can pass them as strings to resolve the issue, but I'm curious about why I'm seeing this behaviour.
I'm in Ireland, and while Ireland is more or less in line with GMT, we use IST (Irish Standard Time) during summer instead of DST and then revert to GMT for Winter. This has the same effect as being on GMT, but "officially" is slightly different.
Because we use IST rather than DST, in the past the two didn't always line up.
For example, in 1958, IST started on the 20th of April and ended on the 5th of October, whereas DST started on the 27th of April and ended on the 26th of October.
So if a date between the 5th and 26th of October 1958 is passed to JS, JS will display it as the previous day.
I wrote this code to try to understand what's going on:
DateTime date = new DateTime(1958, 10, 4);
while (date <= new DateTime(1958, 10, 30))
{
    Console.WriteLine($"normal   : {date} | isDst? : {date.IsDaylightSavingTime()}");
    Console.WriteLine($"universal: {date.ToUniversalTime()} | isDst? : {date.ToUniversalTime().IsDaylightSavingTime()}");
    Console.WriteLine($"local    : {date.ToLocalTime()} | isDst? : {date.ToLocalTime().IsDaylightSavingTime()}");
    Console.WriteLine("-------------------------");
    date = date.AddDays(1);
}
Which produced this output (truncated):
So I can see there are a number of days being misidentified as DST days, but it doesn't seem like that would cause this. If both .NET and JS thought they were DST days, then surely the end result should be correct?
Additionally, why is there a 2 hour difference between the output of ToUniversalTime and ToLocalTime during DST?
Here's a screenshot of JS processing a few dates during this problematic window.
You can see that JS (or Chrome?) is aware that during the 5th to the 27th of that year, Ireland was no longer on GMT+1 (even though it still says it's IST), so why is the date passed from VB incorrect? I thought they both got their date/time information from the same source, i.e. the host PC?
You appear to be running .NET on Windows, in which case .NET is using the Windows time zone data for your local time zone.
The Windows time zone data does not have the full history of time zone changes from all time. Microsoft's policy only guarantees historical data is present for dates from 2010 forward, though some time zones have historical data before then.
Conversely, Chrome is using time zone data from ICU, which uses IANA time zone data. IANA time zones have historical data since at least 1970, though many time zones have historical data before then.
With specific regard to Ireland, IANA has Irish time zone data going back to 1880. Windows has no history for Ireland at all, so it assumes the current rule has always been in effect. In reality, the current rule has been in effect since Oct 1968, so any dates before then will only have accurate time zone information in the IANA data.
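You can inspect what the browser's ICU/IANA data says for that window with Intl (the locale and the instant below are chosen only for illustration):

```javascript
// Format an instant from October 1958 in Ireland's zone; the IANA
// database knows the 1958 rules, so the rendered wall-clock time
// reflects the historical offset rather than the modern rule.
const fmt = new Intl.DateTimeFormat("en-IE", {
  timeZone: "Europe/Dublin",
  dateStyle: "full",
  timeStyle: "long",
});
console.log(fmt.format(Date.UTC(1958, 9, 15, 12))); // noon UTC, mid-October 1958
```

Running the same check on Windows-backed .NET for the same instant is how the discrepancy shows up.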
If you run the same .NET code shown above on Linux or macOS, you'll see that .NET uses IANA time zone data on those platforms, and your results will match up for 1958. Or, if you pick a more recent date, your results will match on Windows too.
In short - don't run these sorts of tests on old dates and expect to get the same fidelity that you'll get with modern dates.
You also asked:
Additionally, why is there a 2 hour difference between the output of ToUniversalTime and ToLocalTime during DST?
Your date variables are all DateTime where .Kind is DateTimeKind.Unspecified. The IsDaylightSavingTime method will treat such values as if they belonged to the local time zone, as if they actually had DateTimeKind.Local; the same is true for the ToUniversalTime method. The ToLocalTime method, however, assumes that DateTime values with DateTimeKind.Unspecified are actually in terms of UTC, as if they were DateTimeKind.Utc. Thus, when DST is in effect, date.ToUniversalTime() shifts an hour backward and date.ToLocalTime() shifts an hour forward.
You can avoid such ambiguities by using DateTimeOffset instead of DateTime.

Is there a fix for this obscure off-by-a-few-minutes iOS datetime-local bug?

I'm trying to access the native iOS date/time interface in a web app by using <input type="datetime-local"...>.
When using this style of <input>, the time value you set or get is a string in standard ISO format, like 2019-07-09T16:30.
You'd think that for whatever time you put in as a string value, the time you're going to see in the native interface (albeit reformatted and perhaps changed to 12-hour AM/PM form for some locales) is going to be the same time. And that's what I've seen so far using an Android tablet, and that's what I get most of the time using iOS.
But for dates over a century ago I'm finding that the time displayed by iOS is three minutes ahead of the ISO time string I put into the input. Likewise, if I edit the time via iOS, and stick to roughly the same antique time period, the time I get back out of the <input> is three minutes behind whatever I enter into the on-screen interface.
This offset is timezone dependent. My normal timezone is America/New_York. If I switch my iPhone or iPad to America/Chicago, the same sort of error occurs, but the difference is a full 9 minutes.
You can see the bug in action here: https://angular-ios-datetime.stackblitz.io (code here: https://stackblitz.com/edit/angular-ios-datetime)
I know what's going on actually...
On November 18, 1883, at 12:03:58 local mean solar time in New York City, the clocks were turned back three minutes and 58 seconds, back to exact noon. This was one of the many steps taken all over the country to switch to standardized time zones.
Chicago (and many other cities too) made a similar change, but Chicago's change from local mean time to standard time was closer to nine minutes.
What seems to be happening with <input type="datetime-local"...> is that there's a conflict between JavaScript and iOS, where one of these two (probably JavaScript) doesn't know about that long-ago timezone history, but the other (probably iOS) does know about it.
What I'd like to know then is this:
Is there a way to use <input type="datetime-local"...> that sticks with UTC instead? A non-standard feature of iOS, perhaps?
There used to be an option <input type="datetime"...> (without the -local), but it has been deprecated and isn't supported by most current web browsers, including Safari on iOS.
I agree that this is a bug in the iOS implementation of <input type="datetime-local" />. It impacts both Safari and Chrome on iOS. (I tested on iOS 12 with Safari Mobile 605.1 and Chrome 75.)
The interesting part is that it is not just a problem with historical dates, but also with any value that might be affected by the local time zone. For example, with the device set for New York time, try selecting 2020-03-08T02:00.
Note that I can get to 1 AM, but the 2 is grayed out. If you try to pick it, it moves to a different time. That's because 2:00 AM is invalid in New York on that day. The clock advances from 1:59 AM to 3:00 AM for the start of daylight saving time.
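The gap itself is easy to demonstrate from script (the times below assume current IANA data for America/New_York):

```javascript
// Wall-clock times in New York around the 2020-03-08 spring-forward gap.
const fmt = new Intl.DateTimeFormat("en-US", {
  timeZone: "America/New_York",
  hour: "2-digit",
  minute: "2-digit",
  hour12: false,
});

console.log(fmt.format(Date.UTC(2020, 2, 8, 6, 59))); // "01:59" (EST)
console.log(fmt.format(Date.UTC(2020, 2, 8, 7, 0)));  // "03:00" (EDT) — 02:xx never occurs
```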
The problem arises when the application is picking a time for a different time zone than the device's, such as Phoenix, Arizona, where DST doesn't apply, or a different country whose DST transitions fall on different dates.
This all boils down to interpretation of the word "local".
The HTML specification for datetime-local says (emphasis mine):
The input element represents a control for setting the element's value to a string representing a local date and time, with no time-zone offset information.
Furthermore, it defines a "local date and time" as:
... consisting of a year, a month, and a day, and a time, consisting of an hour, a minute, a second, and a fraction of a second, but expressed without a time zone.
In other words, it is "a local date and time", not "the user's local date and time". By the way, this aligns perfectly with the terminology of the ISO 8601 specification.
Unfortunately, not only is the iOS implementation incorrect in this regard, but so was the Mozilla MDN documentation, which previously included the text "The user's local time zone is used." (I submitted an update to correct this.)
In short, datetime-local is supposed to be not associated with any time zone.
As a workaround, consider using two controls.
<input type="date" /> <input type="time" />
They will appear like so in iOS:
And then you get separate pickers for each one.
It solves both the LMT issue and the DST-gap issue, because neither control does anything with the local time zone. As an added bonus, you get the year in the date picker, which is missing from the combined one.
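Recombining the two values into the single string a datetime-local control would have produced is a one-liner (the input names are hypothetical):

```javascript
// Join the values of <input type="date"> and <input type="time">
// back into an ISO-style local date-time string.
function combineDateTime(dateValue, timeValue) {
  return `${dateValue}T${timeValue}`;
}

console.log(combineDateTime("2019-07-09", "16:30")); // "2019-07-09T16:30"
```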

How to get the *real* date using Moment.js?

So this has been on my mind for a long time - I was trying to "trick" my website and change my device's date. Unfortunately, this worked. I easily changed my laptop's date in the settings, entered the website and got the fake date.
So my question is simple, how do I get the real date?
This was the code I used (the real PST day is the 17th of August; I changed my device's date to the 20th):
moment().tz('America/Los_Angeles').format('D') // returns 20
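There is no purely client-side way to get a trustworthy date: Moment, like the Date object it wraps, reads the device clock, so any zone conversion starts from the faked time. The only reliable source is a timestamp from a server you control; a minimal sketch (the /api/now endpoint is hypothetical):

```javascript
// Pure helper: difference between the device clock and a trusted
// server timestamp (both in epoch milliseconds).
function clockSkewMs(serverEpochMs, deviceNowMs = Date.now()) {
  return deviceNowMs - serverEpochMs;
}

// Usage sketch in the browser (endpoint is hypothetical):
// const { epochMs } = await (await fetch("/api/now")).json();
// const skew = clockSkewMs(epochMs);
// const realNow = new Date(Date.now() - skew); // corrected "real" date
```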

php timezone messing with javascript time

I'm using the TimeAgo plugin to update the time of posts automatically. It turns a time like 05-08-2014 15:02:39 into 5 minutes ago.
The problem is that the JavaScript uses the client's clock for the time. I'm from Bangladesh, Asia, but in PHP the default timezone varies from server to server. I'm outputting the format 05-08-2014 15:02:39 in PHP and converting it into something like 5 minutes ago using the TimeAgo plugin.
Because the server's timezone is different, instead of showing less than a minute ago on a recent post, it shows nn hours ago. If the server is American, it shows 11 hours ago; if it's Indian, it shows 30 minutes ago. How do I fix it?
As stated on the TimeAgo project page:
Are you concerned about time zone support? Don't be. Timeago handles this too. As long as your timestamps are in ISO 8601 format and include a full time zone designator (±hhmm), everything should work out of the box regardless of the time zone that your visitors live in.
So echo the dates server side with date('c',$timestamp) and everything should work fine.
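The reason the offset designator fixes it: a timestamp with an explicit offset identifies one absolute instant, so every visitor's browser computes the same "ago" difference from it. For example:

```javascript
// Two spellings of the same instant: a PHP date('c')-style timestamp
// with a +06:00 offset (Bangladesh) versus its UTC equivalent.
const withOffset = Date.parse("2014-08-05T15:02:39+06:00");
const utc = Date.parse("2014-08-05T09:02:39Z");

console.log(withOffset === utc); // true — same epoch milliseconds
```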

time zones: user preference vs client-side Javascript

In Javascript it's fairly straightforward to render and manipulate dates in the current user's local time zone using the methods of the Date object. For example, toLocaleString() for output and the 7-argument form of the Date constructor for input. So up until now I haven't even bothered setting a time zone as a user preference. Internally everything is stored, computed and sent back and forth to the client using UTC, and we do the translation client-side for both input and output.
For example, suppose a user has their local machine's time zone set to US Eastern. When looking at my web page, an event that occurred at Unix timestamp 1359416775000 would render as, say "Mon Jan 28 18:46:15 2013" with code no more complex than
new Date(1359416775000).toLocaleString();
But suppose I need to send that user an email about this event. What time zone should I use to render the timestamp in this email? The obvious answer is to let the user pick their time zone. Now suppose I do this and this user picks US/Eastern. Great. Now suppose the next time the user logs into my website their local machine is on US Central time. That same piece of Javascript code would now cause the timestamp to render as "Mon Jan 28 17:46:15 2013".
Is this really the right behavior? The user has picked a time zone in my application, but it only applies to email?
This seems like a common enough problem that I feel like there ought to be a common best practice, I'm just wondering what that is.
You should, by default, always display times in the user's local time zone. If at any point you display a time in another time zone, make that clear by also printing the time zone.
As such, if the user's time zone is US/Eastern, you would display a time in his time zone, in your example "Mon Jan 28 18:46:15 2013", while if you show him an event that actually happens in US/Central, you should show "Mon Jan 28 17:46:15 2013 US/Central".
Now if the user moves to a computer whose time zone is US/Central, then yes, by default you should show him the time in US/Central. So in both cases you would display the time in the computer's current zone, no time zone indicator necessary. They will have the computer's current time in the corner of the screen, so it won't cause much confusion.
If you let the user pick his time zone, which is common on sites where the time display isn't decided by the client's time zone setting, then you should by default show times in that time zone all the time, no matter what time zone the computer is in. Remember that it is up to the user to make sure his computer is in the right time zone. Most people who travel with their laptops won't change the time zone when they move.
It is theoretically possible to warn the user that he has selected a different time zone from the one he seems to be located in, by geolocating his IP. But unless you are writing a calendar application, I would think that annoys people more than it helps them.
Unfortunately, you cannot set the time zone used by the non-UTC Date methods.
You can only work around that by adding/subtracting your custom time zone offset when outputting or reading a date, as in this example.
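A sketch of that manual-offset workaround (a fixed offset, so it ignores DST transitions), alongside the Intl-backed option that modern engines support:

```javascript
// Render a UTC timestamp at a fixed offset by shifting the epoch and
// reading the result with UTC getters (no DST awareness).
function renderAtOffset(epochMs, offsetMinutes) {
  return new Date(epochMs + offsetMinutes * 60 * 1000).toUTCString();
}

// US/Eastern in January is UTC-5:
console.log(renderAtOffset(1359416775000, -300)); // "... 18:46:15 GMT"

// Modern engines can instead take an IANA zone name directly:
console.log(
  new Date(1359416775000).toLocaleString("en-US", { timeZone: "America/New_York" })
);
```

The Intl route also handles DST correctly, which the fixed-offset helper does not.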
