I have a server date-time, which is also CST, and a date based on the Chicago time zone. When I find the difference between the two dates, the value is different for different time zones.
I am not able to get to the bottom of the problem.
Date A is my Chicago time 2019-05-22T04:02:14-05:00
Date B is my server time 2019-05-20T01:39:34-04:00
Hours difference between them is 51 when my time zone is set to EST.
When I change my time zone to IST:
Date A is my Chicago time 2019-05-22T04:03:34-05:00
Date B is my server time 2019-05-20T01:39:34+05:30
Hours difference between them is 60 when my time zone is set to IST.
Why is there a difference in hours when the dates are the same in both cases?
getIntervalTime(dateA, dateB): ITimer {
    console.log("Date A is my Chicago time", dateA);
    console.log("Date B is my server time", dateB);
    console.log(moment.utc(dateA).diff(moment.utc(dateB), "hours"));
    intervalHours = moment.utc(dateA).diff(moment.utc(dateB), "hours");
}
In your question, you gave two very different server times. They are not referencing the same actual point in time. In each case, 01:39:34 is the local time in the time zone offset provided.
2019-05-20T01:39:34-04:00 (EDT) = 2019-05-20T05:39:34Z (UTC) = 2019-05-20T11:09:34+05:30 (IST)
2019-05-20T01:39:34+05:30 (IST) = 2019-05-19T20:09:34Z (UTC) = 2019-05-19T16:09:34-04:00 (EDT)
As you can see just by comparing the UTC times, there is a 9.5 hour difference between these two timestamps. This is also reflected in the difference between the two offsets (5.5 - -4 = 9.5).
This is a common source of confusion, as often people view the + or - sign as an operator, and thus think of it as an instruction ("Oh, I see a plus or minus so I must need to add or subtract this value to get to the local time"). But in reality it is not an operator, but the sign of the offset. Positive offset values are ahead of UTC, while negative offset values are behind UTC. (Alternatively one can think of positive offsets as being east of GMT, while negative offsets are west of GMT.)
In other words, the date and time portion of an ISO 8601 formatted timestamp are already converted to the context provided.
Also note that the time zone of your server won't really matter, nor should it. Now is now - time zones don't change that. Thus, in most cases you should simply use the UTC time.
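For instance, a minimal check with Moment (which the question already uses) shows that the two server timestamps are different instants:

var a = moment.parseZone("2019-05-20T01:39:34-04:00");
var b = moment.parseZone("2019-05-20T01:39:34+05:30");
console.log(a.toISOString());          // 2019-05-20T05:39:34.000Z
console.log(b.toISOString());          // 2019-05-19T20:09:34.000Z
console.log(a.diff(b, "hours", true)); // 9.5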
Related
I want to get how far away the next occurrence of a particular PST time is, regardless of the client's timezone.
This would be trivial if the time were in UTC, but I don't know how to do it in PST while keeping in mind the observance of daylight saving time.
E.g. 4 PM PST would be 11 PM UTC since it is currently summer.
I would prefer not to have to manually input the dates of daylight saving time.
I am happy to use a library if this is not possible without one.
// returns the number of milliseconds from the current time until the specified time in PST.
function getTimeUntil (hour, minutes = 0, seconds = 0)
{
// implementation needed
}
The following is an explanation of why this is likely a duplicate of How to initialize a JavaScript Date to a particular time zone.
PST (presumably US Pacific Standard Time) is a time zone with a fixed offset, UTC-8. Places that observe PST and also observe daylight saving switch in summer to Pacific Daylight Time (PDT), which is UTC-7.
PST might also be Pitcairn Standard Time, which is also UTC -8 and observed all year round on Pitcairn Island. Converting PST to UTC is achieved by adding 8 hours.
However, likely you want to work with times and dates for a place that observes US PST in winter and US PDT in summer, e.g. Los Angeles. In that case you can use a library like Luxon or date.js that allows creating dates based on a timestamp and specified IANA representative location such as "America/Los_Angeles". If that is the case, then see the link above.
My implementation:
// returns a Luxon Duration (hours/minutes/seconds) from the current time until the
// next occurrence of the specified wall-clock time in Pacific time
// (America/Vancouver follows the same PST/PDT rules as Los Angeles).
function getTimeUntil(hour, minutes = 0, seconds = 0)
{
    let future = luxon.DateTime.now().setZone('America/Vancouver').set({
        hours: hour,
        minutes: minutes,
        seconds: seconds
    });
    let now = luxon.DateTime.now().setZone('America/Vancouver');
    if (future < now)
    {
        future = future.plus({ days: 1 });
    }
    return future.diff(now, ["hours", "minutes", "seconds"]);
}
console.log(getTimeUntil(13, 0, 0).toObject());
<script src="https://cdn.jsdelivr.net/npm/luxon@2.0.1/build/global/luxon.min.js"></script>
I was wondering what would be the best way to get the Unix timestamps of Monday and Sunday in the New Zealand time zone while the system clock (AWS Lambda) is in a different time zone.
I've tried the below and it seems to work well on my local computer; however, when executed on AWS it will obviously be running in a different time zone.
Can someone please suggest the best way to deal with time zones so the code can run in any location?
var monday = moment().day(-13).startOf('day').toDate().getTime() // Monday last week
var sunday = moment().day(-7).startOf('day').toDate().getTime() // Sunday last week
A Unix timestamp always represents time in UTC, regardless of your location. To get the local time for your location, apply the time zone offset to the Unix timestamp. https://en.wikipedia.org/wiki/Unix_time
A few things:
You can use Moment-Timezone to work with IANA time zone identifiers in Moment.
New Zealand has two different IANA time zone identifiers:
Pacific/Auckland covers most of New Zealand, which uses UTC+12:00 during standard time, and UTC+13:00 during daylight saving time.
Pacific/Chatham covers the Chatham Islands which is also part of New Zealand. However, Chatham uses UTC+12:45 during standard time, and UTC+13:45 during daylight saving time.
You do not need to convert to a Date object. Moment can give you a Unix timestamp directly, either as seconds with .unix(), or as milliseconds with .valueOf(). Both imply a conversion to UTC, as Unix timestamps are inherently UTC based.
One cannot get a Unix timestamp for an entire day, but rather only for a point in time. The timestamp you are probably looking for would be at the start of the local day, which is always midnight (00:00) in New Zealand (but not necessarily in other time zones on DST transition days, depending on the time of the transition).
When you use Moment's day function, day(-13) doesn't mean "Monday last week". It means "two Mondays ago." If you meant "one Monday ago", that would be day(-6). Likewise, day(-7) means "one Sunday ago." In both cases, it doesn't count the current day. For example, since today is Sunday 2019-09-01, day(-7) refers to Sunday 2019-08-25.
Putting this all together:
var oneMondayAgoInAuckland = moment.tz('Pacific/Auckland').day(-6).startOf('day').valueOf();
var oneSundayAgoInAuckland = moment.tz('Pacific/Auckland').day(-7).startOf('day').valueOf();
At the moment, these return 1566734400000 and 1566648000000 respectively.
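As a small sketch of point 3 above (assuming moment-timezone is loaded), both accessors refer to the same UTC-based instant, just in different units:

var startOfDayAuckland = moment.tz('Pacific/Auckland').startOf('day');
console.log(startOfDayAuckland.unix());     // Unix timestamp in seconds
console.log(startOfDayAuckland.valueOf());  // the same instant in milliseconds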
I use the moment-timezone v.0.5.3-2016c library to calculate the UTC offset for a timezone:
var z = moment().tz("America/Los_Angeles");
z.utcOffset(); // -420 mins or -7 hours
// check if DST is shifted
z.isDSTShifted(); // false
But here https://en.wikipedia.org/wiki/List_of_tz_database_time_zones the UTC offset for America/Los_Angeles is -8 hours.
Both Moment (tzdb 2016c) and the Wikipedia article use the same tz database.
So why are there two different results here? And which one is right?
P.S.: there is the same difference for America/Kentucky/Monticello and Europe/Tiraspol, as well.
Calling the moment creation function, moment() without any arguments returns the current moment in time. Since time zone offsets vary depending on what date and time they are attached to, your results will vary depending on when you call this function.
If you want to know whether or not the time is in daylight saving time, use isDST. The isDSTShifted function is for dealing with invalid local times, not for checking DST. It probably could have been named better.
The USA is currently in DST (Daylight Saving Time). Therefore, I'd use the UTC DST Offset column, which shows -07:00.
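A quick sketch of the distinction (assuming moment-timezone is loaded; the comments describe summer vs. winter behavior):

var la = moment.tz("America/Los_Angeles");
la.isDST();      // true during PDT (summer), false during PST (winter)
la.utcOffset();  // -420 minutes during PDT, -480 minutes during PST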
I am sending JSON data via a REST service to my client. The client should use the data to display it.
My client uses JavaScript.
I am converting the date in the following way:
var from = new Date(myJsonDate.match(/\d+/)[0] * 1);
The JSON looks like this:
...="From":"\/Date(1450134000000)\/" ...
My problem is that the dates are correct in Germany but are off by one day in Brazil (e.g. showing Sunday instead of Monday).
Does this code use time zones and calculate accordingly?
How could I turn this off?
I want the date to be displayed exactly as I sent it.
Date operations in JavaScript vary with the time zone the client machine is configured for.
I once had the opportunity to fix a function that showed a difference between dates, and nobody knew why. When you create a date, it is returned as something like: "Thu Feb 14 2008 08:41:27 GMT-0300 (Official Hour of Brazil)".
Note that the date includes GMT (Greenwich Mean Time) information, indicating which time zone the date is configured for.
I'll show how to avoid the time difference this causes in date operations. To do that, we create a function that always converts the date to the expected time zone.
var calculateTimeZone = function(date, offset) {
    // Shift the instant by the browser's own offset so the local getters read UTC wall-clock time...
    var miliseconds_with_utc = date.getTime() + (date.getTimezoneOffset() * 60000);
    // ...then shift again by the desired offset (in hours) so they read that zone's wall-clock time.
    return new Date(miliseconds_with_utc + (3600000 * offset));
}
Note that inside the function we invoke the getTime() method, which converts the local moment of the date to a number representing milliseconds since January 1st, 1970 (the Unix Epoch). We get the current time zone offset that is set in the browser from the getTimezoneOffset() method of the JavaScript Date API, multiply it by the number of milliseconds in a minute, and then add the two values together.
Why an hour?
Because an hour is the unit that usually represents each time zone. By default, however, this method returns the offset in minutes, which is why a conversion is necessary.
To arrive at the number 60000, remember that 1 second has 1000 milliseconds and 1 minute has 60 seconds, so converting minutes to milliseconds means multiplying 60 * 1000 = 60000.
At this point we have the UTC (Coordinated Universal Time) moment, represented by the variable "miliseconds_with_utc", obtained by adding the time zone offset in milliseconds to the local moment.
We now need to get a date starting from this UTC value plus the time zone offset of the destination; for example, a date expressed in time zone +5 transformed into Brazil's time zone (Brasília time).
Note that on the return line we take an offset (the time zone representation) in hours and convert it to milliseconds. Remember that 1 second has 1000 milliseconds and 1 hour has 3600 seconds, so to convert hours to milliseconds we multiply 1000 * 3600 = 3600000.
We add this result to the value of the "miliseconds_with_utc" variable and obtain the moment in the desired time zone. From there we create a new date based on the resulting milliseconds value and return that new date.
In this way we can maintain the desired integrity in the application when we need to express a date in the right time zone.
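For example, a hypothetical usage of the function above, rendering the timestamp from the question at Brazil's standard offset of UTC-3 (the offset passed in is fixed, so this ignores DST):

var from = calculateTimeZone(new Date(1450134000000), -3);
console.log(from.toString()); // the local getters now reflect the UTC-3 wall-clock time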
Does this code use time zones and calculate accordingly?
No. Passing a number to the Date constructor is interpreted as a time value, i.e. milliseconds since 1970-01-01T00:00:00Z. Regardless of the settings of the client, it will create a Date for exactly the same instant in time.
However, by default, Date.prototype.toString uses the host system settings to apply an offset to the displayed values as "local" time.
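A small illustration of both points, using the time value from the question:

var from = new Date(1450134000000);
console.log(from.toISOString()); // "2015-12-14T23:00:00.000Z" on every client
console.log(from.toString());    // rendered in local time: Dec 15 in Germany (UTC+1), Dec 14 in Brazil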
How could I turn this off?
Modify the script engine. It's part of the ECMAScript standard, so any implementation that doesn't do it is non-compliant.
I want the date to be displayed exactly as I sent it.
Either:
Send it as a plain string, not as a date, or
Send the time zone offset of the source as well, so you can apply it at the other end to keep the date the same.
ECMAScript offsets have the opposite sense to most standards: they're negative for east of UTC and positive for west. So, to get a Date with local settings that shows the same local date and time as the source system:
var d = new Date(timevalue);
d.setMinutes(d.getMinutes() + d.getTimezoneOffset() - sourceTimezoneOffset);
Where sourceTimezoneOffset is the offset of the source system in minutes, positive for west of UTC and negative for east.
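For example, a hypothetical sketch: if the source system is in Germany at UTC+1 (winter time), its ECMAScript-style offset is -60 minutes.

var timevalue = 1450134000000;   // from "\/Date(1450134000000)\/" in the JSON
var sourceTimezoneOffset = -60;  // Germany in winter: UTC+1, so -60 in ECMAScript terms
var d = new Date(timevalue);
d.setMinutes(d.getMinutes() + d.getTimezoneOffset() - sourceTimezoneOffset);
console.log(d.toString());       // now shows the same wall-clock date the source system saw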
Usually dates relate to a specific time zone, so as pointed out, the date in one place might be different from the date in another place at the same instant in time.
If you are not making any modifications to the dates when sending them from the server side, the dates will be in the time zone where the server is hosted.
So, if your server is hosted in Germany, dates will be in Germany's time zone.
There would be two ways to solve this:
Send dates to the client from the server in the user's time zone in the response.
Make adjustments in your client application to implement the appropriate date conversion.
I'm looking for best practices regarding dates - where it's the date itself that's important rather than a particular time on that day.
An excellent start was this question:
Daylight saving time and time zone best practices
I'd like some guidance applying this for my situation. I have medications starting on a particular date, and ending on another. I then need to query medications which are active in a given date range.
I've tried setting start and end dates as midnight local time, then storing in UTC on the database. I could add timezone entries too.
I'm using moment.js on both client and server, and can use moment timezone if needed.
I'm wondering how to deal with the effect of DST on my times - which makes an hour difference in my locally-midnight UTC times between DST and non DST periods.
The problem I have is for example when some medications have end dates set during a DST period, and some which were set in a non-DST period. Then, their UTC times differ by an hour. When a query is made for a particular date range starting at local midnight, it's not accurate as there are two different representations of midnight. The query itself may treat midnight as one of two different times, depending on when in the year the query is made.
The end result is that a medication may appear to end a day later than it should, or start a day earlier.
A simple but wonky workaround would be to consistently set the start date as 1am in standard (non DST) time, and end dates as 11:59pm standard (non DST) time, and query at midnight.
Or, should I check the start and end dates of each query, and work out what the UTC offset would be for each date?
But I'd much prefer to know what best practice is in this situation. Thanks.
Both the JavaScript Date object and the moment object in moment.js are for representing a specific instant in time. In other words, a date and a time. They internally track time by counting the number of milliseconds that have elapsed since the Unix Epoch (Midnight, Jan 1st 1970 UTC) - ignoring leap seconds.
That means, fundamentally, they are not the best way to work with whole calendar dates. When you have only a date, and you use a date+time value to track it, then you are arbitrarily assigning a time of day to represent the entire day. Usually, this is midnight - but as you pointed out, that leads to problems with daylight saving time.
Consider that in some parts of the world (such as Brazil) the transition occurs right at midnight - that is, in the spring, the clocks jump from 23:59:59 straight to 01:00:00, so midnight doesn't exist on that day. If you specify midnight on that date, the browser will either jump forward or jump backward (depending on which browser you are using)!
And if you convert a local date-at-midnight to a different time zone (such as UTC), you could change the date itself! If you must use a date+time to store a date-only value, use noon instead of midnight. This will mitigate most (but not all) of the adjustment issues.
The better idea is to treat whole dates as whole dates. Don't assign them a time, and don't try to adjust them to UTC. Don't use a Date or a moment. Instead, store them either as an ISO 8601 formatted string like "2014-11-25", or, if you need to do math on them, consider storing them as an integer number of whole days since some starting value. For example, using the same Jan 1st 1970 epoch date, we can represent November 25th 2014 as 16399 with the following JavaScript:
function dateToValue(year, month, day) {
    // Whole days since 1970-01-01, computed in UTC so the local zone can't shift the date.
    return Date.UTC(year, month - 1, day) / 86400000;
}
function valueToDate(value) {
    var d = new Date(86400000 * value);
    return { year  : d.getUTCFullYear(),
             month : d.getUTCMonth() + 1,
             day   : d.getUTCDate() };
}
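A brief usage sketch of the two helpers above, using the 2014-11-25 example:

var value = dateToValue(2014, 11, 25);  // 16399 whole days since 1970-01-01
console.log(valueToDate(value));        // { year: 2014, month: 11, day: 25 }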
There are a few other things to keep in mind when working with whole dates:
When working with ranges of whole dates, humans tend to use fully-inclusive intervals. For example, Jan 1st to Jan 2nd would be two days. This is different from date+time (and time-only) ranges, in which humans tend to use half-open intervals. For example, 1:00 to 2:00 would be one hour.
Due to time zones, everyone's concept of "today" is different around the globe. We usually define "today" by our own local time zone. So normally:
var d = new Date();
var today = { year  : d.getFullYear(),
              month : d.getMonth() + 1,
              day   : d.getDate() };
You usually don't want to shift this to UTC or another time zone, unless your business operates globally under that time zone. This is rare, but it does occur. (Example, StackOverflow uses UTC days for its calculations of badges and other achievements.)
I hope this gets you started. You asked a fairly broad question, so I tried to answer in way that would address the primary concerns. If you have something more specific, please update your question and I'll try to respond.
If you would like even more information on this subject, I encourage you to watch my Pluralsight course, Date and Time Fundamentals.