Will setInterval drift? - javascript

This is a pretty simple question really. If I use setInterval(something, 1000), can I be completely sure that after, say, 31 days it will have triggered "something" exactly 60*60*24*31 times? Or is there any risk for so called drifting?

Short answer: No, you can't be sure. Yes, it can drift.
Long answer: John Resig on the Accuracy of JavaScript Time and How JavaScript Timers Work.
From the second article:
In order to understand how the timers work internally there's one important concept that needs to be explored: timer delay is not guaranteed. Since all JavaScript in a browser executes on a single thread, asynchronous events (such as mouse clicks and timers) are only run when there's been an opening in the execution.
Both articles (and anything on that site) are great reading, so have at it.
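To see the single-thread effect directly, here is a minimal sketch (not from the articles) in which a long synchronous loop blocks the thread and visibly delays an interval that asks for a 100 ms period:

var last = Date.now();
var timer = setInterval(function () {
    var now = Date.now();
    // With a free thread this prints ~100; while the loop below runs it prints much more.
    console.log('gap since last tick: ' + (now - last) + ' ms');
    last = now;
}, 100);

// Block the single JavaScript thread for roughly one second.
var blockUntil = Date.now() + 1000;
while (Date.now() < blockUntil) { /* busy-wait */ }

// Stop the demo after a few seconds.
setTimeout(function () { clearInterval(timer); }, 3000);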

Here's a benchmark you can run in Firefox:
var start = +new Date();
var count = 0;
setInterval(function () {
    console.log((new Date() - start) % 1000,
                ++count,
                Math.round((new Date() - start) / 1000));
}, 1000);
The first value should be as close to 0 or 1000 as possible (any other value shows how far off the timing of the trigger was). The second value is the number of times the code has been triggered, and the third value is how many times the code should have been triggered. You'll note that if you bog down your CPU it can get quite far off, but it seems to correct itself. Try running it for a longer period of time and see how it handles.

I had the same problem with a countdown function: I wrote a countdown() function and looped it with setInterval, but it drifted 1-3 milliseconds per loop. So I wrote a function that checks for drift and corrects it.
It only checks against the real minute and second. Here it is. It works fine for me; I hope it helps you too.
$.syncInterval(functionname,interval,controlinterval)
example:
function countdown() { /* some code */ }
$.syncInterval(countdown,1000,60);
This says: run the countdown function every 1000 milliseconds and check it against the clock every 60 seconds.
here is the code:
$.syncInterval = function (func, interval, control) {
    var now = new Date(),
        realMinute = now.getMinutes(),
        realSecond = now.getSeconds(),
        nowSecond = realSecond,
        nowMinute = realMinute,
        minuteError = 0,
        countingVar = 1,
        totalDiff = 0;

    var loopthat = setInterval(function () {
        if (nowSecond == 0) {
            nowMinute++;
            nowMinute = nowMinute % 60;
        }

        if (countingVar == 0) {
            // Control tick: compare the simulated time with the real clock.
            now = new Date();
            realSecond = now.getSeconds();
            realMinute = now.getMinutes();
            totalDiff = ((realMinute * 60) + realSecond) - ((nowMinute * 60) + nowSecond);

            if (totalDiff > 0) {
                // Behind the real clock: catch up by running func() extra times.
                for (var i = 1; i <= totalDiff; i++) {
                    func();
                    nowSecond++;
                    countingVar++;
                }
            } else if (totalDiff == 0) {
                func();
                nowSecond++;
                countingVar++;
            } else if (totalDiff < 0) {
                // Ahead of the real clock: skip this tick.
            }
        } else {
            func();
            nowSecond++;
            countingVar++;
        }

        countingVar = countingVar % control;
        nowSecond = nowSecond % 60;
    }, interval);
};

Related

Stopwatch not working, it's going way too fast

As I was looking for a simple Stopwatch implementation in JS, I found this code http://codepen.io/_Billy_Brown/pen/dbJeh/
The problem is that it's not working right: the clock goes way too fast. I get 30 seconds on the screen when my watch shows only 23 seconds.
And I don't understand why. The timer function is called every millisecond and should be updating the time correctly.
setInterval(this.timer, 1);
Is the problem coming from the browser or from the JS code?
Thanks in advance.
Timers in JavaScript don't have millisecond precision.
There is a minimum time for the interval, which differs depending on the browser and browser version. Typical minimums are 4 ms for recent browsers and 10 ms for somewhat older browsers.
Also, you can't rely on the callback being called at exactly the time that you specify. JavaScript is single-threaded, which means that if some other code is running when the timer triggers a tick, it has to wait until that other code finishes.
In fact, the code you linked imitates the flow of time, but it is not synchronized with the system time.
Every millisecond it just invokes this.timer, which recounts the milliseconds, seconds and so on
without ever reading the native system time; it just adds 1 to a variable representing "imaginary milliseconds".
So the resulting pseudo-time you see depends on your CPU, your browser and who knows what else.
On modern, fantastically fast computers the timer function body may well execute in less than a millisecond (one wonders what would happen on a Pentium 2 running IE5).
In any case there is no chance that it will execute at exactly the same fixed period on all computers and browsers.
The simplest correct way to show the time elapsed since a start point, according to the system time, is:
<body>
<script>
    var startTime = new Date() // assume this initialization to be the start point

    function getTimeSinceStart()
    {
        var millisSinceStart = new Date() - startTime
          , millis = millisSinceStart % 1000
          , seconds = Math.floor(millisSinceStart / 1000)
        return [seconds, millis].join(':')
    }

    (function loop()
    {
        document.title = getTimeSinceStart() // look at the top of the page
        setTimeout(loop, 10)
    }())
</script>
</body>
P.S. What @Guffa says in his answer is correct (for JS in browsers generally), but in this case it does not affect the problem.

Run a Function Every Millisecond

I am trying to run a function every millisecond. To achieve this, I used setInterval in JavaScript. My code is given below,
HTML:
<div id=test>0.0</div>
Script:
var xVal = 0;
var xElement = null;
xElement = document.getElementById("test");
var Interval = window.setInterval(startWatch, 1);
function startWatch() {
    xVal += 1;
    xElement.innerHTML = xVal;
}
The above code works, but when I test the result against a real clock, the real clock takes 1000 milliseconds to complete one second, while my counter takes more than 1000 milliseconds to complete a second.
DEMO
Can anybody tell me:
Are there any mistakes in my code? If so, how can I display milliseconds accurately?
There are no mistakes in your code, but JavaScript timers (setInterval and setTimeout) are not precise. Browsers cannot comply with such a short interval. So I'm afraid there is no way to precisely increment the milliseconds by one, and display the updates, on a web browser. In any case, that's not even visible to the human eye!
A precise workaround would involve a larger interval, and timestamps to calculate the elapsed time in milliseconds:
var start = new Date().getTime();
setInterval(function() {
    var now = new Date().getTime();
    xElement.innerHTML = (now - start) + 'ms elapsed';
}, 40);
You can't. There is a minimum delay that browsers use. You cannot run a function every millisecond.
From Mozilla's docs:
...4ms is specified by the HTML5 spec and is consistent across browsers...
Source: https://developer.mozilla.org/en-US/docs/Web/API/window.setTimeout#Minimum.2F_maximum_delay_and_timeout_nesting
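If you want to see the clamping yourself, a rough sketch like the following (an assumption: run it in a browser console; the 20-call cutoff is arbitrary) repeatedly schedules a 0 ms timeout and logs the gap it actually gets, which settles around the 4 ms minimum after the first few nested calls:

var prev = performance.now();
var calls = 0;
(function tick() {
    var now = performance.now();
    // Requested 0 ms; after ~5 nested calls the observed gap is clamped to ~4 ms.
    console.log('got ' + (now - prev).toFixed(2) + ' ms');
    prev = now;
    if (++calls < 20) {
        setTimeout(tick, 0);
    }
})();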
The DOM can't actually update 1000 times per second. Your monitor can't even display 1000 frames in one second, for that matter. Calculate the difference between the start time and current time in milliseconds within your function and use that:
(function () {
    var xElement = document.getElementById("test");
    var start = new Date;
    (function update() {
        xElement.innerHTML = (new Date - start);
        setTimeout(update, 0);
    })();
})();
Updated fiddle
You can't do this with your method because of the delay in rendering the HTML and running the interval. Doing it this way will display the time correctly at about 60 FPS.
http://jsfiddle.net/3hEs4/3/
var xElement = null;
var startTime = new Date();
xElement = document.getElementById("test");
var Interval = window.setInterval(startWatch, 17);
function startWatch() {
    var currentTime = new Date();
    xElement.innerHTML = currentTime - startTime;
}
You might also want to look into using requestAnimationFrame instead of a hardcoded setInterval like that.
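For example, a rough requestAnimationFrame version of the same stopwatch (assuming the same #test element) might look like this; the displayed value is derived from timestamps, so the refresh rate only affects how often it repaints, not its accuracy:

var xElement = document.getElementById("test");
var startTime = new Date();

function frame() {
    xElement.innerHTML = new Date() - startTime; // elapsed milliseconds
    window.requestAnimationFrame(frame);         // repaint on the next frame (~60 FPS)
}

window.requestAnimationFrame(frame);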
The setInterval callback probably does not happen with millisecond accuracy, since the thread the timer is running on might not even actually be running when the time is up, or the browser throttles events, or any other of quite a few things.
In addition, since most Javascript engines are single threaded, what the implementation of setInterval might do is once it triggers, run your callback, and then reset the clock for the next call. Since you're doing some DOM manipulation, that might take several milliseconds on its own.
In short, you're expecting a Real Time Operating System behavior from an interpreter running inside of another application, on top of what is more than likely not an RTOS.
I had the same question and couldn't find any working solution, so I created one myself. The code below essentially sets five setTimeouts every 5 ms, one for each ms between 5 and 10. This circumvents the minimum 4 ms constraint, and (having checked in Firefox, Chrome, and Opera) it works fairly well.
const start = performance.now();
let newNow = 0;
let oldNow = 0;

const runner = function (reset) {
    // whatever is here will run ca. every ms
    newNow = performance.now();
    console.log("new:", newNow);
    console.log(" diff:", newNow - oldNow);
    oldNow = newNow;
    if (newNow - start < 1000 && reset) {
        setTimeout(function () {
            runner(true);
        }, 5);
        for (let i = 6; i < 11; i++) {
            setTimeout(function () {
                runner(false);
            }, i);
        }
    }
};

runner(true);
It could of course be written more elegantly, e.g. so that you can more easily customize things like the graduation (e.g. 0.5 ms or 2 ms instead of 1 ms), but anyway the principle is there.
I know that in theory you could call 5 setIntervals instead, but that would in reality cause a drift that would quickly ruin the ms precision.
Note also that there are legitimate use cases for this. (I, for one, need continual measurement of touch force, which is not possible otherwise.)

can setInterval drift over time?

I have 2 Node.js web servers. I cache data inside the web servers and sync the cache load/clear based on system time. I have synchronized the time on all my hosts.
Now I clear cache every 15 mins using following code:
millisTillNexthour = "Calculate millis remaining until next hour"
setTimeout(function() {
    setInterval(function() {
        cache.clear();
    }, 60000 * 15);
}, millisTillNexthour);
My expectation is that even if this process runs forever, the cache will be cleared at every 15th minute of each hour of the day.
My question is: can setInterval drift over time?
For eg: right now it clears cache at 10:00 10:15 10:30 10:45 11:00 ......
Can it happen that, instead of at 10:15 system time, setInterval gets executed at 10:20 system time when it was supposed to clear the cache at 10:15?
I am not sure how this works. Please shed some light. I hope I explained my question well.
I'm probably more than a bit late to the party here, but this is how I solved this particular time-slipping problem just now, using a recursively called setTimeout() function instead of using setInterval().
var interval = 5000;
var adjustedInterval = interval;
var expectedCycleTime = 0;

function runAtInterval() {
    // get timestamp at very start of function call
    var now = Date.now();

    // log with time to show interval
    console.log(new Date().toISOString().replace(/T/, ' ').replace(/Z/, '') + " runAtInterval()");

    // set next expectedCycleTime and adjustedInterval
    if (expectedCycleTime == 0) {
        expectedCycleTime = now + interval;
    } else {
        adjustedInterval = interval - (now - expectedCycleTime);
        expectedCycleTime += interval;
    }

    // function calls itself after delay of adjustedInterval
    setTimeout(function () {
        runAtInterval();
    }, adjustedInterval);
}
On each iteration, the function checks the actual execution time against the previously calculated expected time, and then deducts the difference from 'interval' to produce 'adjustedInterval'. This difference may be positive or negative, and the results show that actual execution times tend to oscillate around the 'true' value +/- ~5ms.
Either way, if you've got a task that is executing once a minute, and you run it for an entire day, using this function you can expect that - for the entire day - every single hour will have had 60 iterations happen. You won't have that occasional hour where you only got 59 results because eventually an entire minute had slipped.
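To start it, you just call the function once; it then keeps rescheduling itself:

runAtInterval(); // the first call schedules every subsequent call via setTimeout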
setInterval is definitely drifting (although I agree that it should not be). I'm running a Node.js server with an interval of 30 seconds. On each tick, a few async web requests are made which from beginning to end take roughly 1 second. No other user-level/application processing happens in the intervening 29 seconds.
However, I notice from my server logs that over the course of 30 minutes, a drift of 100ms occurs. Of course, the underlying operating system is not to blame for the drift and it can only be some defect of Node.js's design or implementation.
I am very disappointed to notice that there is a bug in the Node.js implementation of setInterval. Please take a look here:
https://github.com/nodejs/node/issues/7346#issuecomment-300432730
You can use a Date object to represent a specific target time and then add a certain number of milliseconds to that date.
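For example, a rough sketch of that idea applied to the question (the helper name is illustrative; cache.clear() is taken from the question's code): re-derive the delay until the next quarter hour from a Date on every run, so errors cannot accumulate:

// Illustrative sketch: always re-derive the delay from the current clock.
function msUntilNextQuarterHour() {
    var now = new Date();
    var next = new Date(now.getTime());
    // Jump to the next :00 / :15 / :30 / :45 boundary, zeroing seconds and millis.
    next.setMinutes(Math.floor(now.getMinutes() / 15) * 15 + 15, 0, 0);
    return next - now;
}

function scheduleCacheClear() {
    setTimeout(function () {
        cache.clear();         // from the question's code
        scheduleCacheClear();  // re-align with the clock on every run
    }, msUntilNextQuarterHour());
}

scheduleCacheClear();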
It definitely can, because of how JavaScript works (see the event loop).
The JavaScript event loop only executes the setInterval callback once other queued events are finished. Those events take some time, which affects when your setInterval function executes, so it will eventually drift as time passes.
setInterval should not drift in a perfect world. It might be delayed due to other things taking up system resources. If you need a more precise solution than what you have, use the clock() function to "calibrate" your nodes.

JavaScript: Is this timer reliable?

Today I was introduced to the world of Web Workers in JavaScript. This made me rethink timers. I used to program timers the ugly way, like this.
var time = -1;
function timerTick()
{
    time++;
    setTimeout("timerTick()", 1000);
    $("#timeI").html(time);
}
I know this could be improved by saving the date when you start the timer, but I've never been a fan of that.
Now I've come up with a method using Web Workers. I did a little benchmark and found it much more reliable. Since I am not an expert in JavaScript, I would like to know whether this function works correctly or what problems it might have. Thanks in advance.
My JavaScript code (please note I use jQuery):
$(function() {
    //-- Timer using web worker.
    var worker = new Worker('scripts/task.js'); // External script
    worker.onmessage = function(event) { // Method called by external script
        $("#timeR").html(event.data);
    };
});
The external script ('scripts/task.js'):
var time = -1;
function timerTick()
{
    time++;
    setTimeout("timerTick()", 1000);
    postMessage(time);
}
timerTick();
You can also view a live demo on my website.
If you're trying to reliably display seconds ticking by, then the ONLY reliable way to do that is to get the current time at the start and use the timer ONLY for updating the screen. On each tick, you get the current time, compute the actual elapsed seconds and display that. Neither setTimeout() nor setInterval() are guaranteed or can be used for accurately counting time.
You can do it like this:
var start = +(new Date);
setInterval(function() {
    var now = +(new Date);
    document.getElementById("time").innerHTML = Math.round((now - start) / 1000);
}, 1000);
If the browser gets busy and timers are erratically spaced, you may get a slightly irregular update on screen, but the elapsed time will remain accurate when the screen is updated. Your method is susceptible to accumulating error in the elapsed time.
You can see this work here: http://jsfiddle.net/jfriend00/Gfwze/
The most accurate timer would be a comparison of two time stamps. You could increase the precision of your timer by updating more frequently (such as every 100ms). I prefer using setInterval() over setTimeout().
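A minimal sketch of that suggestion, reusing the same #time element as above and updating every 100 ms while always displaying the difference between two timestamps:

var start = Date.now();
setInterval(function () {
    // The displayed value comes from timestamps, so occasional late ticks
    // only delay the repaint; they don't accumulate into the shown time.
    var elapsedSeconds = (Date.now() - start) / 1000;
    document.getElementById("time").innerHTML = elapsedSeconds.toFixed(1);
}, 100);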

Getting certain frequency with setInterval method

In some JavaScript code I'm developing, a function should be called every second. To be sure this operation takes place every second, the following code is used:
setInterval(doIt, 500);
function doIt() {
    var now = (new Date()).getTime();
    if (lastUpdate + 1000 >= now) {
        /// code...
        lastUpdate = now;
    }
}
As far as I know setInterval(doIt, 1000) doesn't always mean that it's called every one second.
Is the above solution a valid one? If not, what do you recommend?
You could use setTimeout instead of setInterval, and make dynamic adjustments each time your function is called. The idea is to set the timeout for a number of milliseconds sufficient to carry you to the next second boundary.
function timeoutFunc() {
    // do interesting things
    var time = new Date().getTime();
    setTimeout(timeoutFunc, 1000 - time % 1000);
}
You'd start it off with:
setTimeout(timeoutFunc, 1000 - new Date().getTime() % 1000);
Synchronizing with the server seems like a bad idea, because you have no way of knowing whether the client clock is synchronized to anything (like the NTP server network). If it's not, then your server synchronizations are going to make things look wrong at the client, because the client clock will always be what seems right.
Well, setInterval IS defined in milliseconds, so it means it's called every X milliseconds.
However, the system can freeze or something like that!
But there's no practically better solution; your approach is fine.
If you really have an extensive JavaScript client application, the results could stretch a little bit.
A possible solution for that is to get the system time and keep a counter in your function. Then every X executions you align with the system clock, calculate how many function calls you should have had by now, and speed up the interval or slow it down, as sketched below.
This is as close as you can get to perfection, but it will only be a matter of milliseconds and probably not worth the effort.
May I ask what you are developing?
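A rough sketch of that calibration idea, reusing doIt() and the one-second target from the question (the check-every-10-ticks figure and the catch-up loop are illustrative, not a definitive implementation):

var startTime = Date.now();
var callCount = 0;

setInterval(function () {
    doIt();
    callCount++;

    // Every 10 ticks, compare the actual call count with how many calls
    // the wall clock says there should have been, and catch up if behind.
    if (callCount % 10 === 0) {
        var expectedCalls = Math.floor((Date.now() - startTime) / 1000);
        while (callCount < expectedCalls) {
            doIt();
            callCount++;
        }
    }
}, 1000);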

Categories

Resources