I am working on a very time-sensitive web application. One of the business rules given to me is that the application's behavior must always depend on the time on the web server, regardless of what time is on the client's clock. To make this clear to the user, I was asked to display the server's time in the web application.
To this end, I wrote the following JavaScript code:
clock = (function () {
    var hours, minutes, seconds;

    function setupClock(updateDisplayCallback) {
        getTimeAsync(getTimeCallback);

        function getTimeCallback(p_hours, p_minutes, p_seconds) {
            hours = p_hours;
            minutes = p_minutes;
            seconds = p_seconds;
            setInterval(incrementSecondsAndDisplay, 1000);
        }

        function incrementSecondsAndDisplay() {
            seconds++;
            if (seconds === 60) {
                seconds = 0;
                minutes++;
                if (minutes === 60) {
                    minutes = 0;
                    hours++;
                    if (hours === 24) {
                        hours = 0;
                    }
                }
            }
            updateDisplayCallback(hours, minutes, seconds);
        }
    }

    // a function that makes an AJAX call and invokes callback, passing hours, minutes, and seconds.
    function getTimeAsync(callback) {
        $.ajax({
            type: "POST",
            url: "Default.aspx/GetLocalTime",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (response) {
                var date, serverHours, serverMinutes, serverSeconds;
                date = GetDateFromResponse(response);
                serverHours = date.getHours();
                serverMinutes = date.getMinutes();
                serverSeconds = date.getSeconds();
                callback(serverHours, serverMinutes, serverSeconds);
            }
        });
    }

    return {
        setup: setupClock
    };
})();
The function passed in for updateDisplayCallback is a simple function to display the date on the web page.
The basic idea is that the JavaScript makes an asynchronous call to look up the server's time, stores it on the client, and then updates it once per second.
At first, this appears to work, but as time goes by, the displayed time gets behind a few seconds every minute. I left it running overnight, and when I came in the next morning, it was off by more than an hour! This is entirely unacceptable because the web application may be kept open for days at a time.
How can I modify this code so that the web browser will continuously and accurately display the server's time?
JavaScript's setInterval is not accurate enough to let you keep time like this.
My solution would be:
Periodically get the server's time in milliseconds (it does not need to be very often as the two clocks will hardly deviate that much)
Get the client time in milliseconds
Calculate the clock deviation between server and client (client-server)
Periodically update the display of the clock by getting the client time and adding the clock deviation
Edit:
To be more accurate, you could measure the round trip time of the server's request, divide it by 2 and factor that delay into the clock deviation. Assuming round trips are symmetrical in their duration, this would give a more accurate calculation.
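A minimal sketch of that approach might look like the following (assuming jQuery, as in the question; the /servertime endpoint, its serverMillis field, and the updateDisplay function are hypothetical placeholders, not part of the original code):

var clockDeviation = 0; // server time minus client time, in milliseconds

// Periodically refresh the deviation; the endpoint and response shape are made up.
function syncWithServer() {
    var requestStart = Date.now();
    $.getJSON("/servertime", function (response) {
        var roundTrip = Date.now() - requestStart;
        // Assume the reply is half a round trip old by the time it arrives.
        var serverNow = response.serverMillis + roundTrip / 2;
        clockDeviation = serverNow - Date.now();
    });
}

syncWithServer();
setInterval(syncWithServer, 10 * 60 * 1000); // re-sync every 10 minutes

// Update the display often; accuracy comes from the clocks, not from the timer.
setInterval(function () {
    var serverTime = new Date(Date.now() + clockDeviation);
    updateDisplay(serverTime.getHours(), serverTime.getMinutes(), serverTime.getSeconds());
}, 250);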
setInterval is not a reliable way to schedule time-critical events. It may take more or less than 1000 ms to run your callback, depending on how busy the JavaScript engine is at the moment.
A better approach would be to take a shorter interval and use new Date().getTime() to check if a second has passed.
The minimum interval browsers allow can be as high as 10 ms.
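For example, something like this (a rough sketch; the 100 ms polling interval is arbitrary):

var lastSecond = Math.floor(Date.now() / 1000);

setInterval(function () {
    var currentSecond = Math.floor(Date.now() / 1000);
    if (currentSecond !== lastSecond) {
        lastSecond = currentSecond;
        // A full second (or more, if the tab was busy) has passed;
        // update the displayed time here.
    }
}, 100);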
Thanks for the answers. I have up-voted both answers so far as they contain useful information. However, I am not using the exact answer prescribed in either answer.
What I finally decided on is a bit different.
I wrote about what I learned on my personal web page.
First of all, I now understand that using setInterval(..., 1000) is not good enough to have something done once per second for a long time. However, 'polling' the time with a much shorter interval looking for the second to change seems very inefficient to me.
I decided that it does make sense to keep track of the 'offset' between the server time and the client time.
My final solution is to do the following (a code sketch follows the list):
(1) Do an AJAX call to the server to get the time. The function also checks the client time and computes the difference between the server time and the client time, in milliseconds. Due to network latency and other factors, this initial fetch may be off by a few seconds. For my purposes, this is okay.
(2) Execute a tick function. Each time tick executes, it checks how long it has been since the last time tick executed. It will use this time to compute an argument to be passed to setTimeout so that the time display is updated approximately once per second.
(3) Each time the tick function computes the time to be displayed, it takes the client time and adds the difference that was computed in step (1). This way, I don't depend upon the client to have the time set correctly, but I do depend upon the client to accurately measure elapsed time. For my purposes, this is okay. The most important thing is that regardless of how setTimeout may be inaccurate or interrupted by other processes (such as a modal dialog, for instance), the time displayed should always be accurate.
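A condensed sketch of those three steps (the Default.aspx/GetLocalTime endpoint and the GetDateFromResponse helper come from my code above; displayTime is a hypothetical stand-in for the display update, and the scheduling is simplified to aiming each tick at the next whole second of corrected time):

var serverClientDiff = 0; // step (1): server time minus client time, in milliseconds

$.ajax({
    type: "POST",
    url: "Default.aspx/GetLocalTime",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    success: function (response) {
        var serverDate = GetDateFromResponse(response);
        serverClientDiff = serverDate.getTime() - Date.now();
        tick();
    }
});

function tick() {
    // step (3): display the client time corrected by the stored difference
    var corrected = new Date(Date.now() + serverClientDiff);
    displayTime(corrected); // hypothetical display function

    // step (2): schedule the next tick so the display updates roughly once per
    // second, near the next whole second of corrected time
    var delay = 1000 - (Date.now() + serverClientDiff) % 1000;
    setTimeout(tick, delay);
}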
Related
I got this code over here:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    }
    else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);
On my computer it always executes correctly, with 1000 in the console output. Interestingly, on another computer running the same code, the timeout callback starts in less than a second and the difference currentDate - date is between 980 and 998.
I know the existence of libraries that solve this inaccuracy (for example, Tock).
Basically, my question is: what are the reasons why setTimeout does not fire at the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event earlier?
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way that setTimeout is usually implemented, it is just meant to execute after a given delay, and once the browser's thread is free to execute it.
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate - date);
}, 1000);
// Browser Test1 Test2 Test3 Test4
// Chrome 998 1014 998 998
// Firefox 1000 1001 1047 1000
// IE 11 1006 1013 1007 1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code; maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when trying to execute things with higher precision than about 50 ms. The reason for this is that even on an octa-core hyperthreaded processor the OS is usually juggling several hundred processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling each of them to get a slice of CPU time one after another, meaning they get at most a few milliseconds at a time to do their thing.
Implicitly, this means that if you set a timeout for 1000 ms, chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds have passed that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the correct point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them since it could theoretically allow someone to attack the OS's stability from inside a web page, by carefully constructing code that starves other threads by swamping them with a lot of dangerous timers.
As for why the test yields 980 I'm not sure - that would depend on exactly which browser you're using and which JavaScript engine. I can however fully understand if the browser just manually corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time" - it would make a lot of sense from the sandboxing principle to just approximate the amount of time required without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine)
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
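One way to see that quantization for yourself is to sample the clock in a tight loop and look at the size of the jumps between distinct values it returns (a quick-and-dirty sketch; the step you observe depends on the browser and OS):

var samples = [];
var previous = Date.now();
var stop = previous + 200; // sample for roughly 200 ms

// Spin, recording the size of every observed jump in the reported clock value.
while (Date.now() < stop) {
    var now = Date.now();
    if (now !== previous) {
        samples.push(now - previous);
        previous = now;
    }
}

console.log("observed clock steps (ms):", samples);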
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));

setTimeout(function ()
{
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // Wait until the next whole second to start.
I noticed it would skip a second every couple of seconds, and sometimes it would go longer between skips.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
JavaScript has a way of dealing with exact time frames. Here's one approach:
You could save Date.now() when you start waiting, create an interval with a small update period in milliseconds, and calculate the difference between the dates on each tick.
Example:
const startDate = Date.now()
const intervalId = setInterval(() => {
    const currentDate = Date.now()
    if (currentDate - startDate >= 1000) {
        // at least a second has passed
        clearInterval(intervalId)
        return
    }
    // a second has not passed yet
}, 50)
As I was looking for a simple Stopwatch implementation in JS, I found this code http://codepen.io/_Billy_Brown/pen/dbJeh/
The problem is that it's not working correctly; the clock goes way too fast. I got 30 seconds on the screen when only 23 seconds had gone by on my watch.
And I don't understand why. The timer function is called every millisecond and should be updating the time correctly.
setInterval(this.timer, 1);
Is the problem coming from the browser or from the JS code?
Thanks in advance.
Timers in JavaScript don't have millisecond precision.
There is a minimum time for the interval, which differs depending on the browser and browser version. Typical minimums are 4 ms for recent browsers and 10 ms for slightly older ones.
Also, you can't rely on the callback being called at exactly the time that you specify. JavaScript is single threaded, which means that if some other code is running when the timer triggers a tick, it has to wait until that other code finishes.
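You can measure that minimum yourself by asking for a 1 ms interval and timing what you actually get (a small sketch; the numbers vary by browser):

var last = Date.now();
var gaps = [];

var id = setInterval(function () {
    var now = Date.now();
    gaps.push(now - last); // actual gap between ticks, despite asking for 1 ms
    last = now;

    if (gaps.length >= 50) {
        clearInterval(id);
        console.log("requested 1 ms, got (ms):", gaps);
    }
}, 1);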
In fact, the code you linked imitates the flow of time, but it is not synchronized with the system time.
Every millisecond it just invokes the function this.time, which recounts the millis, seconds and so on without ever reading the native system time; it just adds 1 to a variable representing "imaginary milliseconds".
So we can say that the resulting pseudo-time you see depends on your CPU, your browser and who knows what else.
On our modern, fantastically fast computers the body of the this.time function executes in far less than a millisecond (I wonder what would happen on a Pentium 2 with IE5 on board).
Anyhow, there is no chance for this.time to be executed at exactly the same fixed period on all computers and browsers.
The simplest correct way to show the time passed since startpoint according to the system time is:
<body>
<script>
    var startTime = new Date() // assume this initialization to be start point

    function getTimeSinceStart()
    {
        var millisSinceStart = new Date() - startTime
          , millis = millisSinceStart % 1000
          , seconds = Math.floor(millisSinceStart / 1000)
        return [seconds, millis].join( ':' )
    }

    (function loop()
    {
        document.title = getTimeSinceStart() // look on the top of page
        setTimeout( loop, 10 )
    }())
</script>
</body>
P.S. What @Guffa says in his answer is correct (in general, for JS in browsers), but in this case it does not matter and does not affect the problem.
I have two Node.js web servers. I cache data inside the web servers, and I synchronize the cache load/clear based on system time. I have synchronized the time on all my hosts.
Right now I clear the cache every 15 minutes using the following code:
millisTillNexthour = "Calculate millis remaining until next hour"

setTimeout(function() {
    setInterval(function() {
        cache.clear();
    }, 60000 * 15);
}, millisTillNexthour);
My expectation is that even if this process runs forever, the cache will be cleared at every 15th minute of each hour of the day.
My question is: can setInterval drift over time?
For example: right now it clears the cache at 10:00, 10:15, 10:30, 10:45, 11:00, and so on.
Can it happen that setInterval gets executed at 10:20 system time when it was supposed to clear the cache at 10:15?
I am not sure how this works. Please shed some light. I hope I explained my question well.
I'm probably more than a bit late to the party here, but this is how I solved this particular time-slipping problem just now, using a recursively called setTimeout() function instead of using setInterval().
var interval = 5000;
var adjustedInterval = interval;
var expectedCycleTime = 0;

function runAtInterval() {
    // get timestamp at very start of function call
    var now = Date.now();

    // log with time to show interval
    console.log(new Date().toISOString().replace(/T/, ' ').replace(/Z/, '') + " runAtInterval()");

    // set next expectedCycleTime and adjustedInterval
    if (expectedCycleTime == 0) {
        expectedCycleTime = now + interval;
    }
    else {
        adjustedInterval = interval - (now - expectedCycleTime);
        expectedCycleTime += interval;
    }

    // function calls itself after delay of adjustedInterval
    setTimeout(function () {
        runAtInterval();
    }, adjustedInterval);
}
On each iteration, the function checks the actual execution time against the previously calculated expected time, and then deducts the difference from 'interval' to produce 'adjustedInterval'. This difference may be positive or negative, and the results show that actual execution times tend to oscillate around the 'true' value +/- ~5ms.
Either way, if you've got a task that is executing once a minute, and you run it for an entire day, using this function you can expect that - for the entire day - every single hour will have had 60 iterations happen. You won't have that occasional hour where you only got 59 results because eventually an entire minute had slipped.
setInterval is definitely drifting (although I agree that it should not be). I'm running a Node.js server with an interval of 30 seconds. On each tick, a few async web requests are made which from beginning to end take roughly 1 second. No other user-level/application processing happens in the intervening 29 seconds.
However, I notice from my server logs that over the course of 30 minutes, a drift of 100ms occurs. Of course, the underlying operating system is not to blame for the drift and it can only be some defect of Node.js's design or implementation.
I am very disappointed to notice that there is a bug in the NodeJS implementation of setInterval. Please take a look at here:
https://github.com/nodejs/node/issues/7346#issuecomment-300432730
You can use a Date object to set a specific time and then add a certain number of milliseconds to that date.
It definitely can, because of how JavaScript works (see the event loop).
The JavaScript event loop executes the setInterval queue when other queued events are finished. Those events take some time, which affects your setInterval function's execution time, so it will eventually drift as time passes.
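A small experiment illustrates this: block the event loop for a while and watch a pending interval slip (sketch only; the busy-wait stands in for "other queued events"):

var expected = Date.now() + 1000;

setInterval(function () {
    var late = Date.now() - expected;
    console.log("tick was " + late + " ms late");
    expected += 1000;
}, 1000);

// Simulate other work hogging the event loop for ~300 ms every 2.5 seconds.
setInterval(function () {
    var end = Date.now() + 300;
    while (Date.now() < end) { /* busy wait */ }
}, 2500);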
setInterval should not drift in a perfect world. It might be delayed when other things take up system resources. If you need a more precise solution than what you have, use the clock() function to "calibrate" your nodes.
Today I was introduced to the world of Web Workers in JavaScript. This made me rethink timers. I used to program timers the ugly way, like this:
var time = -1;

function timerTick()
{
    time++;
    setTimeout("timerTick()", 1000);
    $("#timeI").html(time);
}
I know this could be improved by saving the date when you start the timer, but I've never been a fan of that.
Now I came up with a method using Web Workers, and after a little benchmark I found it much more reliable. Since I am not an expert on JavaScript, I would like to know whether this approach works correctly or what problems it might have. Thanks in advance.
My JavaScript code (please note I use jQuery):
$(function() {
    //-- Timer using web worker.
    var worker = new Worker('scripts/task.js'); // External script
    worker.onmessage = function(event) { // Method called by external script
        $("#timeR").html(event.data);
    };
});
The external script ('scripts/task.js'):
var time = -1;

function timerTick()
{
    time++;
    setTimeout("timerTick()", 1000);
    postMessage(time);
}

timerTick();
You can also view a live demo on my website.
If you're trying to reliably display seconds ticking by, then the ONLY reliable way to do that is to get the current time at the start and use the timer ONLY for updating the screen. On each tick, you get the current time, compute the actual elapsed seconds and display that. Neither setTimeout() nor setInterval() are guaranteed or can be used for accurately counting time.
You can do it like this:
var start = +(new Date);
setInterval(function() {
    var now = +(new Date);
    document.getElementById("time").innerHTML = Math.round((now - start) / 1000);
}, 1000);
If the browser gets busy and timers are erratically spaced, you may get a slightly irregular update on screen, but the elapsed time will remain accurate when the screen is updated. Your method is susceptible to accumulating error in the elapsed time.
You can see this work here: http://jsfiddle.net/jfriend00/Gfwze/
The most accurate timer would be a comparison of two time stamps. You could increase the precision of your timer by updating more frequently (such as every 100ms). I prefer using setInterval() over setTimeout().
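For instance, the snippet above could update every 100 ms and show tenths of a second, while still deriving the value from the two timestamps (a small variation on the answer's code, not part of the original):

var start = Date.now();
setInterval(function () {
    var elapsed = Date.now() - start;
    // still computed from timestamps, only displayed more often
    document.getElementById("time").innerHTML = (elapsed / 1000).toFixed(1);
}, 100);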
In some JavaScript code I'm developing, a certain function should be called every second. But to make sure that this operation takes place every second, the following code is used:
var lastUpdate = 0;

setInterval(doIt, 500);

function doIt() {
    var now = (new Date()).getTime();
    if (now - lastUpdate >= 1000) {
        /// code...
        lastUpdate = now;
    }
}
As far as I know setInterval(doIt, 1000) doesn't always mean that it's called every one second.
Is the above solution a valid one? If not, what do you recommend?
You could use setTimeout instead of setInterval, and make dynamic adjustments each time your function is called. The idea is to set the timeout for a number of milliseconds sufficient to carry you to the next second boundary.
function timeoutFunc() {
    // do interesting things
    var time = new Date().getTime();
    setTimeout(timeoutFunc, 1000 - time % 1000);
}
You'd start it off with:
setTimeout(timeoutFunc, 1000 - new Date().getTime() % 1000);
Synchronizing with the server seems like a bad idea, because you have no way of knowing whether the client clock is synchronized to anything (like the NTP server network). If it's not, then your server synchronizations are going to make things look wrong at the client, because the client clock will always be what seems right.
Well, setInterval IS defined in milliseconds, so that means it's called every X milliseconds.
However, the system can freeze or something like that!
But there's no practically better solution; your approach is fine.
If you really have an extensive JavaScript client application, the results could stretch a little bit.
A possible solution for that is to get the system time and keep a counter in your function. Then, every X executions, you re-align with the system clock: calculate how many function calls you should have had by now, and speed the interval up or slow it down accordingly (see the sketch below).
That is as close to perfection as you can get, but it will only be a matter of milliseconds and is probably not worth the effort.
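Roughly, that re-alignment idea could look like this (a sketch, not a drop-in solution; doWork is a placeholder for your actual function):

var intervalMs = 1000;
var startTime = Date.now();
var callCount = 0;

function doWork() {
    // ... the actual per-second work goes here ...
}

function scheduleNext() {
    doWork();
    callCount++;

    // The next call should land at startTime + callCount * intervalMs;
    // compute the delay that puts us back on that schedule, which
    // effectively speeds the interval up or slows it down.
    var targetTime = startTime + callCount * intervalMs;
    var delay = Math.max(0, targetTime - Date.now());

    setTimeout(scheduleNext, delay);
}

scheduleNext();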
May I ask what you are developing?