I am sending JSON data via a REST service to my client. The client should use the data to display it.
My client uses JavaScript.
I am converting the date in the following way:
var from = new Date(myJsonDate.match(/\d+/)[0] * 1);
The JSON looks like this:
...="From":"\/Date(1450134000000)\/" ...
My problem is that the dates are correct in Germany but are off by one day in Brazil (e.g. showing Sunday instead of Monday in Brazil).
Does this code use time zones and calculate accordingly?
How could I turn this off?
I want the date to be displayed exactly as I sent it.
Date operations in JavaScript vary with the time zone the client machine is configured with.
I once had to fix a function that showed a difference between dates, and nobody knew why. When you create a Date instance, it is rendered like: “Thu Feb 14 2008 08:41:27 GMT-0300 (Official Time of Brazil)”
Note that the date carries a GMT (Greenwich Mean Time) offset, which indicates which time zone the date is rendered in.
I'll show how to avoid the time difference this causes in date operations. For that, we create a function that always converts the date to the time zone we expect.
var calculateTimeZone = function(date, offset) {
    // Milliseconds since the Unix epoch, shifted by the browser's own offset so that
    // the local rendering of this value matches the UTC wall-clock time
    var miliseconds_with_utc = date.getTime() + (date.getTimezoneOffset() * 60000);
    // Add the desired offset (in hours, converted to milliseconds) and build a new Date
    return new Date(miliseconds_with_utc + (3600000 * offset));
}
Note that inside the function we invoke the method getTime(), which converts the date to a number of milliseconds since January 1st, 1970 (the Unix epoch). We then get the time zone offset configured in the browser via the getTimezoneOffset() method of the JavaScript Date API, convert it to milliseconds, and add the two values.
Why multiply by 60000?
Because getTimezoneOffset() returns the offset in minutes, so a conversion to milliseconds is necessary. To arrive at the number 60000, remember that 1 second has 1000 milliseconds and 1 minute has 60 seconds, so converting minutes to milliseconds means multiplying 60 * 1000 = 60000.
At this point we have UTC (Coordinated Universal Time), represented in the code by the variable miliseconds_with_utc, obtained by adding the time zone offset, in milliseconds, to the local instant.
Now we need a date built from this UTC value plus the offset of the destination time zone; for example, transforming a date expressed in time zone +5 into Brazil's time zone (Brasilia time).
Note that in the return statement we take the offset parameter (the target time zone) in hours and convert it to milliseconds. Remember that 1 second has 1000 milliseconds and 1 hour has 3600 seconds, so converting hours to milliseconds means multiplying 3600 * 1000 = 3600000.
We add this result to miliseconds_with_utc and get the moment in the desired time zone. From that value we create a new Date and return it.
This way we can keep the integrity we need in the application whenever a date has to be expressed in the right time zone.
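For instance, an illustrative use with the timestamp from the question: pinning the rendering to the server's zone (Germany in winter, UTC+1, hard-coded here, so DST changes are not handled) would look roughly like this:
var from = new Date(1450134000000);            // timestamp from the question's JSON
var fromGermany = calculateTimeZone(from, 1);  // 1 = German winter (CET) offset in hours
console.log(fromGermany.getDate(), fromGermany.getHours()); // 15 0, i.e. the German wall clock 2015-12-15 00:00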
Does this code use time zones and calculate accordingly?
No. Passing a number to the Date constructor is interpreted as a time value, i.e. milliseconds since 1970-01-01T00:00:00Z. Regardless of the settings of the client, it will create a Date for exactly the same instant in time.
However, by default, Date.prototype.toString uses the host system settings to apply an offset to the displayed values as "local" time.
How could I turn this off?
Modify the script engine. It's part of the ECMAScript standard so any implementation that doesn't do it is non-compliant.
I want the date to be displayed exactly as I sent it.
Either:
Send it as a plain string, not as a date
Or also send the time zone offset of the source so you can apply it at the other end to keep the date the same.
ECMAScript offsets have the opposite sense to most standards: they're negative for east and positive for west. So, to get a Date whose local settings show the same date and time as the source system:
var d = new Date(timevalue);
d.setMinutes(d.getMinutes() + d.getTimezoneOffset() - sourceTimezoneOffset);
Where sourceTimezoneOffset is the offset of the source system in minutes, +ve for west and -ve for east.
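A sketch of that idea end to end, assuming the server also sends its own offset (the field name TzOffsetMinutes is invented here) using the same sign convention, so Germany in winter (UTC+1) sends -60:
var payload = { From: "/Date(1450134000000)/", TzOffsetMinutes: -60 }; // hypothetical response shape

var timevalue = +payload.From.match(/\d+/)[0];
var d = new Date(timevalue);
d.setMinutes(d.getMinutes() + d.getTimezoneOffset() - payload.TzOffsetMinutes);
// d's local fields now read 2015-12-15 00:00, the same wall-clock date the server saw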
Usually dates relate to a specific time zone, so, as pointed out, the date in one place might differ from the date in another place at the same instant in time.
If you are not doing any modification to the dates when sending them from the server side, they will be in the time zone where the server is hosted.
So, if your server is hosted in Germany, dates will be in Germany's timezone.
There are two ways to solve this:
Send dates to the client in the user's time zone from the server in the response.
Make adjustments in your client application to implement the appropriate date conversion (see the sketch below).
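A minimal sketch of the second option, assuming the client knows the server's zone is Europe/Berlin: toLocaleDateString with the timeZone option renders the instant in that fixed zone no matter where the browser runs.
var from = new Date(1450134000000); // timestamp from the original question
console.log(from.toLocaleDateString('de-DE', { timeZone: 'Europe/Berlin' })); // "15.12.2015" on every client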
Related
I am currently working on a method in which the system has to give an automated call to the client in their time zone.
Let's say the client is in the "Africa/Blantyre" time zone and I am in the "Asia/Jakarta" time zone, and the client says to call at 7 P.M. Then I would need to store the time in the DB and the system has to call them.
The method I have thought of is to get the offset between the two time zones ("Africa/Blantyre" and "Asia/Jakarta"), then get 7 P.M. in "Asia/Jakarta", and finally add/subtract the offset to that time; that is how I would get the 7 P.M. time for "Africa/Blantyre". But I am not sure how I can implement this.
You can:
Create a moment for now
Set the timezone to Africa/Blantyre
Set the time to the required local time in Blantyre
Set the timezone to Asia/Jakarta
Display the time in Jakarta
Internally, the moment object uses an instance of the built-in Date object, which has an ECMAScript time value that represents a single moment in time as an offset from the ECMAScript epoch (1 Jan 1970). Changing the timezone just changes calculations for manipulating and displaying the date; it doesn't change the time value.
So after setting the timezone to Africa/Blantyre, setting the time to 19:00 sets it for 19:00 in Blantyre. Changing the timezone to Jakarta also doesn't change the underlying time value, it just means that the generated timestamp is for Jakarta.
So here's some code:
// First get a moment for "now"
let mBlantyre = moment();
// Set the timezone to Africa/Blantyre
mBlantyre.tz('Africa/Blantyre');
// Set the time to 7 pm
mBlantyre.startOf('day').hour(19);
// Copy the moment object
let mJakarta = moment(mBlantyre);
// Set the timezone to Asia/Jakarta
mJakarta.tz('Asia/Jakarta');
// Display as timestamps
console.log(
'Africa/Blantyre: ' + mBlantyre.format() +
'\nAsia/Jakarta : ' + mJakarta.format()
);
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.29.3/moment.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment-timezone/0.5.34/moment-timezone-with-data-10-year-range.min.js"></script>
I think you can fetch the time difference data through this site. As for the method, you can make a JSON file in which you add all the countries and their times, or use an API; you will find one through one of the sites for calculating time differences and time zones:
https://www.calculator.net/time-zone-calculator.html
I'm trying to compare the hours between a website visitor's location and a set location, by grabbing their current time and then inserting that into the Google Time Zone API with the location. However, no matter how I twist and turn it, it seems to be either 1 or 2 hours off.
Am I reading something wrong?
The request:
https://maps.googleapis.com/maps/api/timezone/json?location=40.7127753,-74.0059728&timestamp=1569956387&key=API_KEY
The Output:
{
"dstOffset" : 3600,
"rawOffset" : -18000,
"status" : "OK",
"timeZoneId" : "America/New_York",
"timeZoneName" : "Eastern Daylight Time"
}
The first part of the request is the latitude and longitude of New York City (retrieved from the Google Places API), and the second value (timestamp) is seconds since 1 January 1970. This I get from the visitor with the following JavaScript:
+ new Date()  // unary plus: milliseconds since the epoch
if (!Date.now) {
    Date.now = function() { return new Date().getTime(); }  // fallback for very old browsers
}
Math.floor(Date.now() / 1000)  // seconds since the epoch, which the Time Zone API expects
..which for me results in: 1569956387 (seconds)
Date.now() returns milliseconds since 1 January 1970, so I need to convert to seconds by dividing by 1000 as Google Time Zone API uses seconds since 1 January 1970.
And according to Google Time Zone API docs:
rawOffset: the offset from UTC (in seconds) for the given location. This does not take into effect daylight savings.
..rawOffset is the JSON output
timestamp specifies the desired time as seconds since midnight, January 1, 1970 UTC. The Time Zone API uses the timestamp to determine whether or not Daylight Savings should be applied, based on the time zone of the location. Note that the API does not take historical time zones into account. That is, if you specify a past timestamp, the API does not take into account the possibility that the location was previously in a different time zone.
..timestamp is the seconds since 1 January 1970 specified in the request link
My local time is currently 9:59 PM (1569956387). Calculating either (18000/60)/60 or (21600/60)/60 tells me New York City should be 5 or 6 hours away, respectively, while the truth is that it's 7 hours away.
A few things:
You don't need to make a function for Date.now unless you still have to support IE8, which is rare these days. IE9+ and all major browsers have that function built in.
Date.now() and new Date().getTime() return timestamps based on UTC, not your local time. It doesn't matter which time zone you run it from, it only matters that your computer's clock is synchronized.
The Google Time Zone API also takes its timestamp in terms of UTC, so you are doing the correct thing to pass it along (adjusting milliseconds to seconds).
Google is correctly telling you that for the given timestamp:
The lat/lon coordinates given are in the time zone identified with the America/New_York IANA time zone ID
The English long-form name to display is Eastern Daylight Time
The rawOffset is -18000 seconds, or UTC-5. This field does not include DST, so it is the offset that would be used if Eastern Standard Time were in effect (which it is not for the timestamp given)
There is a dstOffset in effect of 3600 seconds (or 1 hour), which when added to the raw offset gives you (-18000 + 3600 == -14400), or UTC-4
Google is only telling you the offsets from UTC, not from your local time zone. If you wanted to include that, you could call new Date().getTimezoneOffset() (based on the current time) and add that to the result.
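For example (a sketch only; response stands for the parsed JSON above), the visitor-to-New-York difference could be computed like this:
var googleOffsetSeconds = response.rawOffset + response.dstOffset;   // -14400 for the example response
var localOffsetSeconds = -new Date().getTimezoneOffset() * 60;       // visitor's current offset from UTC
var hoursApart = (localOffsetSeconds - googleOffsetSeconds) / 3600;  // 7 for a visitor at UTC+3, matching the question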
I have a server date-time which is in CST, and a date based on the Chicago time zone. When I find the difference between the two dates, the value is different for different time zones.
I am not able to figure out the problem.
Date A is my Chicago time 2019-05-22T04:02:14-05:00
Date B is my server time 2019-05-20T01:39:34-04:00
Hours difference between them 51 when my timezone is set to EST
When I change my timezone to IST
Date A is my Chicago time 2019-05-22T04:03:34-05:00
Date B is my server time 2019-05-20T01:39:34+05:30
Hours difference between them 60 when my timezone is set to IST
Why is there a difference in hours when the dates are the same in both cases?
getIntervalTime(dateA, dateB): ITimer {
console.log("Date A is my Chicago time", dateA)
console.log("Date B is my server time", dateB)
console.log(moment.utc(dateA).diff(moment.utc(dateB), "hours"));
intervalHours = moment.utc(dateA).diff(moment.utc(dateB), "hours")
}
In your question, you gave two very different server times. They are not referencing the same actual point in time. In each case, 01:39:34 is the local time in the time zone offset provided.
2019-05-20T01:39:34-04:00 (EDT) = 2019-05-20T05:39:34Z (UTC) = 2019-05-20T11:09:34+05:30 (IST)
2019-05-20T01:39:34+05:30 (IST) = 2019-05-19T20:09:34Z (UTC) = 2019-05-19T16:09:34-04:00 (EDT)
As you can see just by comparing the UTC times, there is a 9.5 hour difference between these two timestamps. This is also reflected in the difference between the two offsets (5.5 - -4 = 9.5).
This is a common source of confusion, as often people view the + or - sign as an operator, and thus think of it as an instruction ("Oh, I see a plus or minus so I must need to add or subtract this value to get to the local time"). But in reality it is not an operator, but the sign of the offset. Positive offset values are ahead of UTC, while negative offset values are behind UTC. (Alternatively one can think of positive offsets as being east of GMT, while negative offsets are west of GMT.)
In other words, the date and time portion of an ISO 8601 formatted timestamp are already converted to the context provided.
Also note that the time zone of your server won't really matter, nor should it. Now is now - time zones don't change that. Thus, in most cases you should simply use the UTC time.
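To illustrate with the question's own values (using moment here, as the question does):
moment.utc('2019-05-20T01:39:34-04:00').toISOString(); // "2019-05-20T05:39:34.000Z"
moment.utc('2019-05-20T01:39:34+05:30').toISOString(); // "2019-05-19T20:09:34.000Z"
moment.utc('2019-05-20T01:39:34-04:00')
    .diff(moment.utc('2019-05-20T01:39:34+05:30'), 'hours'); // 9 (9.5 hours, truncated toward zero)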
Using moment.js, I'm attempting to extract the offset from an ISO date string so I can use the offset later when formatting an epoch timestamp to ensure the conversion of the timestamp is in the same timezone.
Even though the offset in the string is -0400, the result is always 0;
var currentTime = "2015-03-18T16:10:00.001-0400";
var utcOffset = moment(currentTime).utcOffset(); // 0
I've attempted to use parseZone() as well without success. Is there a way to extract -0400 from the string so I can use it while formatting another time?
Thanks for the help!
KC
The correct way to extract the offset is indeed with parseZone:
var currentTime = "2015-03-18T16:10:00.001-0400";
var utcOffset = moment.parseZone(currentTime).utcOffset();
This should result in -240, which means 240 minutes behind UTC, which is the same as the -0400 in the input string. If you wanted the string form, instead of utcOffset() you could use .format('Z') for "-04:00" or .format('ZZ') for "-0400".
The form you gave in the question just uses the computer's local time zone. So if it is currently UTC+00:00 in your time zone (or wherever the code is running), that would explain why you get zero. You have to use parseZone to retain the offset of the input string.
Also - your use case is a bit worrying. Remember, an offset is not the same thing as a time zone. A time zone can change its offset at different points in time. Many time zones do this to accommodate daylight saving time. If you pick an offset off of one timestamp and apply it to another, you don't have any guarantees that the offset is correct for the new timestamp.
As an example, consider the US Eastern time zone, which just changed from UTC-05:00 to UTC-04:00 when daylight saving time took effect on March 8th, 2015. If you took a value like the one you provided, and applied it to a date of March 1st, you would be placing it into the Atlantic time zone instead of the Eastern time zone.
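That said, if you still want to apply the extracted offset when formatting another timestamp, a minimal sketch (the epoch value below is simply the same instant as the example string):
var offset = moment.parseZone("2015-03-18T16:10:00.001-0400").utcOffset(); // -240
moment(1426709400001).utcOffset(offset).format(); // "2015-03-18T16:10:00-04:00"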
Sorry if the title is a little convoluted. I'm bashing my head against the floor with times in NodeJS / Javascript. I can get the current UTC time like this:
var currentTime = Date.now();
I can get the current time for a user who is, for example, in the -3 timezone like this:
var offsetTime = Date.now() + (numTimeZone * 3600000);
But how do I get the local user time at, say, 6am, converted to UTC?
Practical application:
What I'm trying to do is create an auto-emailer which sends an email to a user at 6am in their local time. My server is in one timezone and they will be in another, so I'm trying to standardise against UTC: every minute the server checks the current UTC time (currentUTC), checks what the user's 6am is when converted to UTC (local6am), and if currentUTC > local6am then an email should be sent.
What's the best way to achieve this? Preferably without using a library if possible.
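For what it's worth, the plan described above can be sketched without a library, assuming the user's fixed UTC offset in hours (e.g. -3) is stored somewhere; note that a fixed offset ignores DST, which a time zone database would handle:
function local6amAsUtcMs(userOffsetHours) {
    // Shift "now" so its UTC fields read as the user's local wall clock
    var userNow = new Date(Date.now() + userOffsetHours * 3600000);
    // 06:00 on the user's current local date, converted back to a real UTC timestamp
    return Date.UTC(userNow.getUTCFullYear(), userNow.getUTCMonth(), userNow.getUTCDate(), 6, 0, 0)
        - userOffsetHours * 3600000;
}

// Every minute on the server:
// if (Date.now() >= local6amAsUtcMs(-3)) { /* send the email */ }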
UTC to local:
moment.utc('2014-02-19 05:24:32 AM').toDate();
Local to UTC:
Read this documentation.
MomentJS is parsing the date as a local date-time. If no hour is given, it assumes midnight.
Then, when you convert it to UTC, it is shifted forward or backwards according to your local time. If you are in UTC+N, you will get the previous date.
moment(new Date('02-19-2014')).utc().format("YYYY-MM-DD HH:mm").toString()
moment(new Date('02-19-2014 12:00')).utc().format("YYYY-MM-DD HH:mm").toString()
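For instance, on a machine at UTC+1 (and in a browser that parses '02-19-2014' as local time), the first call prints 2014-02-18 23:00 and the second 2014-02-19 11:00, which is the backwards shift described above.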
Or you can try this:
moment.utc('07-18-2013', 'MM-DD-YYYY')
moment.utc('07-18-2013', 'MM-DD-YYYY').format('YYYY-MM-DD')
You do not need to call toString explicitly.