Trying to figure out whether this is an actual bug or a fundamental gap in my understanding of moment.js's utc() method. When the method is used to convert an existing date/time string, it returns an incorrect result on Ubuntu only.
using moment#2.22.2
On Ubuntu 16.04.4 LTS
> moment().utc().format()
'2018-11-10T16:30:28Z'
> moment('2018-11-13 19:00:00').utc().format()
'2018-11-13T19:00:00Z'
On macOS 10.13.2
> moment().utc().format()
'2018-11-10T16:29:24Z'
> moment('2018-11-13 19:00:00').utc().format()
'2018-11-14T00:00:00Z'
Moment interprets your string as a local time. Your Ubuntu machine's time zone is set to UTC, so it reads "2018-11-13 19:00:00" as a UTC time, and converting it to UTC in your code is a noop. Your Mac is on your local time, so it interprets the string as having been expressed in whatever time zone you're in, and then utc() translates it to UTC time. So you get different results.
If you want Moment to know that the string is expressed in UTC, you need to tell it that, for example by using ISO's "Z" (e.g. '2018-11-13T19:00:00Z') or by using moment.utc("2018-11-13 19:00:00", format)
Try providing the format of the date string. Note that a 24-hour value like 19:00:00 needs the HH token, not h:
console.log(moment('2018-11-13 19:00:00', 'YYYY-MM-DD HH:mm:ss').utc().format())
Related
I am comparing two dates, but the problem is the date my local computer generates when I use the setHours method. For example, console.log(new Date(new Date().setHours(8))); gives me 2020-07-22T04:41:46.624Z. My time zone is GMT+4 (Georgia Standard Time), but on my Heroku server (EU) or on repl.it, where the time zone offset is 0, the exact same command gives me 2020-07-22T08:41:46.624Z. How can I fix this? If I just log new Date() on both machines, they log the same date, even if I add time to them; it's setting the hours (or minutes, etc.) that alters the result.
A few things:
The setHours function interprets the input in the local time zone. So yes - you will get different results depending on the time zone of the computer where you run it. You are literally saying: "Set the hour to 8 am local time".
If you intended to set the hour to 8 am UTC, you can use the setUTCHours function instead. UTC is the same across the entire planet, so you will get the same result both locally and on your server, regardless of time zone. (Assuming your clock is synchronized correctly by your OS.)
The Z in the string output means UTC.
When you log a Date object, the resulting string format is environment specific. In some environments, you'll get output like you showed, which is the same as if you called toISOString - which is in ISO 8601 format and is always in UTC. In other environments you'll get the output that comes from toString - which is in a different format, and is usually based on local time. The point is - there's no standard that controls the log output for a Date object. Don't log it directly - log the string result of calling one of those two functions (probably toISOString).
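A minimal sketch of the difference, reusing the timestamp from the question (the UTC variant is deterministic; the local variant is not):

```javascript
// Start from a fixed instant: 2020-07-22T04:41:46Z.
const d = new Date(Date.UTC(2020, 6, 22, 4, 41, 46));

// Local: "set the hour to 8 am in THIS machine's time zone" —
// the resulting instant differs between a GMT+4 machine and a GMT+0 server.
const local = new Date(d.getTime());
local.setHours(8); // result depends on where this runs

// UTC: the same instant everywhere, regardless of server time zone.
const utc = new Date(d.getTime());
utc.setUTCHours(8);
console.log(utc.toISOString()); // "2020-07-22T08:41:46.000Z"
```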
I have a use case where I take input from the user, in the form YYYY-MM-DD'T'HH:mm, to create a Date object in JavaScript. The problem is that Firefox and Chrome interpret the input as local time (which is what I want), whilst Safari interprets the input as UTC and converts it to local. How do I force Safari to use the same interpretation of the input as the other two?
The simplest way is to parse the string yourself (it's trivial, after all) and use the multiple-argument Date constructor, which always works in local time. Or use a library like Moment.js to do it for you. I was tempted to suggest adding a timezone offset (+0400, etc.) to the string (based on getTimezoneOffset on a Date object), but determining the right timezone offset requires that you know the date/time (because of Daylight Savings Time), and so...you'd have to do the work anyway.
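A sketch of that manual-parsing approach, assuming the input always matches YYYY-MM-DDTHH:mm (parseLocal is a hypothetical helper name):

```javascript
// Parse "YYYY-MM-DDTHH:mm" as *local* time in every browser by feeding
// the pieces to the multi-argument Date constructor, which is always local.
function parseLocal(input) {
  const [datePart, timePart] = input.split("T");
  const [year, month, day] = datePart.split("-").map(Number);
  const [hours, minutes] = timePart.split(":").map(Number);
  return new Date(year, month - 1, day, hours, minutes); // months are 0-based
}

const d = parseLocal("2017-05-18T08:00");
console.log(d.getHours()); // 8, in every browser and time zone
```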
A really long title, I know, but I had to highlight that I'm confronting a situation that's a little different from the usual JavaScript date conversions.
I am getting the following datetime in a string from the server:
2017-05-18T08:00:00
When I put this string into the following statement:
var newDate = new Date("2017-05-18T08:00:00");
It assumes the string is in the UTC time zone, so it automatically adjusts and converts it to local time, which in Sydney becomes 2017/05/18 18:00:00.
Is there any way I can stop the Date constructor from assuming that the string is UTC time (and make it assume local time instead)?
Use the getTimezoneOffset() function to adjust for the time zone. By default, Date treats the string as UTC and then converts it to the local time zone :(
If you're applying your code in a serious application, consider a tool like Moment.js
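A hedged sketch of the offset-adjustment idea. The "Z" is added here only to make the demo reproducible, and the caveat in the comment is important:

```javascript
// One way to recover the intended wall-clock time when an engine has
// parsed the string as UTC: shift it back by the local offset. Caveat:
// this can misbehave across DST transitions; manually parsing the string
// into the multi-argument Date constructor is more robust.
const parsed = new Date("2017-05-18T08:00:00Z"); // explicit UTC for a reproducible demo
const shifted = new Date(parsed.getTime() + parsed.getTimezoneOffset() * 60000);
console.log(shifted.getHours()); // 8 in any time zone (away from DST edges)
```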
Why do the IE/FF and Chrome JavaScript engines differ in how they interpret this date format (YYYY-MM-DDTHH:mm:ss.fff) without the time zone designator?
new Date("2015-02-18T15:43:57.803").getUTCHours()
UTC Hours
Chrome: 15
IE11/FF: 21
I don't understand this - is it because Chrome assumes it's local whereas IE/FF assume it's UTC? This seems like a Chrome bug.
Interestingly, appending a "Z" to the end of the string tells both Chrome and IE/FF that the time is UTC, and they agree. Has anyone else noticed this JavaScript implementation discrepancy with Date?
new Date("2015-02-18T15:43:57.803Z").getUTCHours()
UTC Hours
Chrome: 15
IE11/FF: 15
Ultimately - this is the result of the out-of-box serializer for ASP.NET Web API, which I thought used JSON.NET, but now appears to be internal where JSON.NET uses IsoDateTimeConverter.
Checking GlobalConfiguration.Configuration.Formatters.JsonFormatter tells me we're using JsonMediaTypeFormatter. Is Web API not using JSON.NET serializer out of the box?
This is a pain point for Web API people - at least back in ASP.NET MVC we had a consistent date format (albeit proprietary - /Date(number of ticks)/) via the JavaScriptSerializer.
ES5 says that ISO 8601 format dates without a time zone should be treated as local (that interpretation has since been revised), but the ed. 6 draft says to treat them as UTC. Some script engines have implemented ed. 6, some ES5, and the rest neither.
The ed. 6 (and later) behaviour isn't consistent with the ISO 8601 specification.
The bottom line is don't use Date.parse (or pass strings to the Date constructor), manually parse date strings.
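For example, a sketch of a hand-rolled parser that pins the interpretation to UTC so every engine agrees (parseAsUTC is a hypothetical name):

```javascript
// Parse "YYYY-MM-DDTHH:mm:ss.fff" explicitly as UTC, sidestepping the
// engine-specific behaviour of Date.parse for offset-less strings.
function parseAsUTC(input) {
  const m = /^(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})(?:\.(\d{1,3}))?$/.exec(input);
  if (!m) throw new Error("Unexpected date format: " + input);
  return new Date(Date.UTC(+m[1], +m[2] - 1, +m[3], +m[4], +m[5], +m[6], +(m[7] || 0)));
}

console.log(parseAsUTC("2015-02-18T15:43:57.803").getUTCHours()); // 15 everywhere
```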
For us, the crux of this issue is that DateTimeStyles.RoundtripKind only works if your DateTime properties set DateTime.DateTimeKind (to something other than DateTimeKind.Unspecified, the default), or better yet, use DateTimeOffset, which enforces time zone specificity.
Since we already had DateTime class properties, we worked around this for now by assigning the JsonSerializerSettings.DateTimeZoneHandling from DateTimeZoneHandling.RoundtripKind (DateTime default) to DateTimeZoneHandling.Utc in our Global.asax.cs. This change essentially appends the "Z" to the end of the DateTime - however there is one more step to convert the Local time to UTC.
The second step is to provide the offset by assigning IsoDateTimeConverter.DateTimeStyles which JSON.NET JsonSerializer will pickup from a SerializerSettings.Converters and automatically convert from Local time to UTC - much like the out-of-the-box ASP.NET MVC does.
Obviously there are other options, but this solution worked for us.
Global.asax.cs
protected void Application_Start()
{
    GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.DateTimeZoneHandling = Newtonsoft.Json.DateTimeZoneHandling.Utc;
    GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.Converters.Add(new IsoDateTimeConverter() { DateTimeStyles = DateTimeStyles.AdjustToUniversal });
}
The reason this works is because RoundtripKind honors DateTime's DateTimeKind - which is Unspecified by default. We want to explicitly convert this to UTC - which JavaScriptSerializer used to do for us out of the box for ASP.NET MVC. The regional offset is provided by DateTimeStyles.AdjustToUniversal which converts your Local DateTime to UTC.
We have a simple function that works out a duration. It works fine in every browser apart from Safari on a Mac (it works in Chrome on the Mac, and in Safari on a PC).
For example,
new Date().toLocaleTimeString()
We expect this to give a time formatted like this:
11:59:25
However, on the Mac Safari we get this
11:59:25 GMT+01:00
Any calculations we do on these times are one hour out (it's adding the hour onto the calculation)
e.g.
11:59:25 - 11:59:25 = 01:00:00 (should be 00:00:00)
Any ideas?
Why is it adding the time zone to the string? This caused us a little issue with our database.
Why is it adding an hour to the string?
Why just in that one bloody browser!
Thanks for your time.
The toLocaleTimeString method relies on the underlying operating system in formatting dates. It converts the date to a string using the formatting convention of the operating system where the script is running. For example, in the United States, the month appears before the date (04/15/98), whereas in Germany the date appears before the month (15.04.98).
Methods such as getHours, getMinutes, and getSeconds give more consistent results than toLocaleTimeString. Use toLocaleTimeString when the intent is to display to the user a string formatted using the regional format chosen by the user. Be aware that this method, due to its nature, behaves differently depending on the operating system and on the user's settings.
Source: https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Date/toLocaleTimeString
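A sketch of computing the duration from timestamps rather than from formatted strings, so locale formatting (like Safari's trailing GMT+01:00) can't leak into the arithmetic (the sample times are made up for illustration):

```javascript
// Work out a duration from epoch milliseconds, not from the output of
// toLocaleTimeString(), which is display-only and locale-dependent.
const start = new Date(2021, 0, 1, 11, 59, 25);
const end = new Date(2021, 0, 1, 12, 30, 0);

const ms = end.getTime() - start.getTime();
const pad = (n) => String(n).padStart(2, "0");
const duration = [
  Math.floor(ms / 3600000),    // whole hours
  Math.floor(ms / 60000) % 60, // remaining minutes
  Math.floor(ms / 1000) % 60,  // remaining seconds
].map(pad).join(":");

console.log(duration); // "00:30:35"
```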
In OS X, the time format can be fine-tuned via Apple menu > System Preferences > Language & Region > Advanced > Time. The format used by toLocaleTimeString() is the Long format. You can adjust the format as desired for your needs, but keep in mind this change will be effective system wide.
Source: Apple Support - Customize Formats