setInterval delays not accurate - javascript

I'm currently creating a countdown using setInterval, though at the moment it runs slower than it should. According to MDN, the delay parameter is in milliseconds; however, it isn't accurate.
I compared my countdown to the one on my phone and the phone runs nearly 5 times faster.
var count = setInterval( function() {
    if (iMil == 0) {
        if (iS == 0) {
            if (iMin == 0) {
                if (iH == 0) {
                    // DONE
                } else {
                    iH--;
                    iMin = 59;
                    iS = 59;
                    iMil = 999;
                }
            } else {
                iMin--;
                iS = 59;
                iMil = 999; // was `iMil == 999;`, a comparison with no effect
            }
        } else {
            iS--;
            iMil = 999;
        }
    } else {
        iMil--;
    }
    hours.text(iH);
    minutes.text(iMin);
    seconds.text(iS);
    milliseconds.text(iMil);
}, 1 );
This is the main part of my script. The variables hours, minutes, seconds and milliseconds are jQuery objects.
What I'm getting at is: is there a reason that it runs slower than it is supposed to?

setInterval() is not guaranteed to run perfectly on time in JavaScript. That's partly because JS is single-threaded, and partly for other reasons. If you want to display a time with setInterval(), then get the current time on each timer tick and display that. The setInterval() won't be your timer, but just a recurring screen-update mechanism. Your time display will always be accurate if you do it that way.
In addition, no browser will guarantee a call to your interval at 1ms intervals. In fact, many browsers will never call setInterval more often than every 5ms and some even longer than that. Plus, if there are any other events happening in the browser with other code responding to those events, the setInterval() call might be delayed even longer. The HTML5 spec proposes 4ms as the shortest interval for setTimeout() and 10ms as the shortest interval for setInterval(), but allows the implementor to use longer minimum times if desired.
In fact, if you look at this draft spec for timers, step 5 of the algorithm says:
If timeout is less than 10, then increase timeout to 10.
And, step 8 says this:
Optionally, wait a further user-agent defined length of time.
And, it includes this note:
This is intended to allow user agents to pad timeouts as needed to optimise the power usage of the device. For example, some processors have a low-power mode where the granularity of timers is reduced; on such platforms, user agents can slow timers down to fit this schedule instead of requiring the processor to use the more accurate mode with its associated higher power usage.
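For the questioner's countdown, that technique might look like the sketch below (the jQuery objects come from the question; endTime is a hypothetical target timestamp):
var endTime = Date.now() + 60 * 60 * 1000; // e.g. one hour from now

var count = setInterval(function () {
    // compute the remaining time from the clock on every tick,
    // instead of decrementing counters
    var remaining = Math.max(0, endTime - Date.now());
    var iH   = Math.floor(remaining / 3600000);
    var iMin = Math.floor(remaining / 60000) % 60;
    var iS   = Math.floor(remaining / 1000) % 60;
    var iMil = remaining % 1000;

    hours.text(iH);
    minutes.text(iMin);
    seconds.text(iS);
    milliseconds.text(iMil);

    if (remaining === 0) {
        clearInterval(count); // DONE
    }
}, 16); // the interval is only a refresh rate; accuracy comes from Date.now()
This stays correct no matter how irregularly the callback fires, because each repaint derives the display from the clock.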

All timeout/interval/scheduling functions are expected to run slower than requested.
That is in the nature of computers and operating systems: the CPU has many things to handle, and behaving like a real-time system would be too costly (and is not actually possible) in a general-purpose OS.
If you read the API documentation at https://developer.mozilla.org/en/docs/Web/API/window.setTimeout and https://developer.mozilla.org/en/docs/Web/API/window.setInterval , it says "after a specified delay" and "fixed time delay between each call". It does not say "at a specified time" or "called at a fixed period".


Javascript Date.now() function [duplicate]

I got this code over here:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    if(currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate-date);
    }
    else {
        console.log("It was less than a second!");
        console.log(currentDate-date);
    }
}, 1000);
On my computer, it always executes correctly, with 1000 in the console output. Interestingly, on another computer running the same code, the timeout callback starts in less than a second, and the difference currentDate - date is between 980 and 998.
I know of the existence of libraries that solve this inaccuracy (for example, Tock).
Basically, my question is: what are the reasons why setTimeout does not fire at the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event early?
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way that setTimeout is usually implemented, it is just meant to execute after a given delay, and once the browser's thread is free to execute it.
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate-date);
}, 1000);
// Browser Test1 Test2 Test3 Test4
// Chrome 998 1014 998 998
// Firefox 1000 1001 1047 1000
// IE 11 1006 1013 1007 1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps it could be that Chrome uses a different strategy for deciding when to execute the code: maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason for this is that even on an octacore hyperthreaded processor the OS is usually juggling several hundreds of processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling all of them to get a slice of CPU time one after another, meaning they get 'a few milliseconds of time at most to do their thing'.
Implicitly, this means that if you set a timeout for 1000 ms, the chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds that it should be executing the given callback.
Usually this is not a problem, it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel level timers that are far more precise than 1 ms, and allow a developer to execute code at precisely the correct point in time. JavaScript however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them since it could theoretically allow someone to attack the OS stability from inside a web page, by carefully constructing code that starves other threads by swamping it with a lot of dangerous timers.
As for why the test yields 980 I'm not sure - that would depend on exactly which browser you're using and which JavaScript engine. I can however fully understand if the browser just manually corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time" - it would make a lot of sense from the sandboxing principle to just approximate the amount of time required without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine)
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
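You can observe that granularity yourself by sampling the clock in a tight loop and recording how big the jumps between distinct values are (a rough sketch; results vary by browser and OS):
// Spin on Date.now() and record each change in the reported value.
// On a coarse system clock the deltas cluster around the timer
// resolution (e.g. ~15 ms on older Windows) rather than 1 ms.
function measureClockGranularity(samples) {
    var deltas = [];
    var last = Date.now();
    while (deltas.length < samples) {
        var now = Date.now();
        if (now !== last) {
            deltas.push(now - last);
            last = now;
        }
    }
    return deltas;
}

console.log(measureClockGranularity(10));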
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));
setTimeout(function ()
{
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // wait until the next whole second to start
I noticed it would skip a second every couple of seconds, though sometimes it would go longer between skips.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow, or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable, I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
JavaScript has a way of dealing with exact time frames. Here's one approach:
You could just save a Date.now when you start to wait, and create an interval with a low ms update frame, and calculate the difference between the dates.
Example:
const startDate = Date.now()
const timer = setInterval(() => {
    const currentDate = Date.now()
    if (currentDate - startDate >= 1000) { // was `=== 1000`, which a 50 ms tick can easily skip over
        // at least a second has passed
        clearInterval(timer) // clearInterval needs the id returned by setInterval
        return
    }
    // not a second yet
}, 50)

Can setTimeout's delay argument be affected at all by sub-millisecond inputs?

I understand that setTimeout doesn't necessarily fire at the exact delay you specify, because there could be other items in the queue at the instant that the timeout occurs, and the engine will do those things first (further delaying the time you've specified).
However, I'm wondering if it takes sub-millisecond inputs into consideration at all. For example, if I input 1.12345678ms, behind the scenes does it attempt to fire at that exact time, or does it parseInt the sub-millisecond value I've input before even truly setting the actual timeout (under the hood)?
Furthermore, let's say I'm determining the ms delay with long division and that division produces an exponent like 1.2237832530049438e-9. Do I need to parseInt that exponent before handing it to setTimeout(()=>{},ms) or will setTimeout do the right thing (as long as it is some type of number) without me ever having to worry about prepping the input?
Update: Here's a snippet of setTimeout dealing with smaller and smaller sub-millisecond delay values:
let count = 0;
function consoleLog(timeOut)
{
    let now = Date.now();
    timeOut = (timeOut / 1.12345);
    setTimeout(() =>
    {
        count += 1;
        if (count <= 6444)
        {
            console.log(`Timestamp ${now}`, `Timeout: ${timeOut}`, `Count: ${count}`);
            consoleLog(timeOut);
        }
    }, timeOut); // pass the ever-shrinking delay to setTimeout
}
consoleLog(1000);
Warning, the code above recurses 6,444 times in order to show that there comes a point where the timeOut value no longer gets smaller from dividing it further: after count 6440, the timeout sustains 2e-323 thereafter.
Modern browsers throttle setTimeout/setInterval calls to a minimum of once every 4 ms.
Also, MDN says that:
The delay argument is converted to a signed 32-bit integer. This
effectively limits delay to 2147483647 ms, since it's specified as a
signed integer in the IDL.
So, any fractions of milliseconds are not going to be effective.
The times are not JS specifications - they are specified in the DOM standards.
4 ms is specified by the HTML5 spec and is consistent across browsers
released in 2010 and onward. Prior to (Firefox 5.0 / Thunderbird 5.0 /
SeaMonkey 2.2), the minimum timeout value for nested timeouts was 10
ms.
However, in Node.js the timers used are the system-specific high-precision timers, which can have resolutions down to nanoseconds. Whether Node.js also stores timer delays as integers would have to be checked experimentally.
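Back in the browser, you can see the clamping and the loss of sub-millisecond precision for yourself by timing a few fractional delays (a rough sketch; the exact numbers vary by browser):
// Each timer reports how long it actually took to fire. Fractional
// delays below the clamped minimum all fire after roughly the same
// few milliseconds.
function timeIt(delay) {
    var start = performance.now();
    setTimeout(function () {
        console.log('delay ' + delay + ' fired after ' +
                    (performance.now() - start).toFixed(3) + ' ms');
    }, delay);
}

timeIt(0.1);
timeIt(1.12345678);
timeIt(1);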
ExceptionOr<int> DOMWindow::setTimeout(JSC::JSGlobalObject& state, std::unique_ptr<ScheduledAction> action, int timeout, Vector<JSC::Strong<JSC::Unknown>>&& arguments)
{
    auto* context = scriptExecutionContext();
    if (!context)
        return Exception { InvalidAccessError };

    // FIXME: Should this check really happen here? Or should it happen when code is about to eval?
    if (action->type() == ScheduledAction::Type::Code) {
        if (!context->contentSecurityPolicy()->allowEval(&state))
            return 0;
    }

    action->addArguments(WTFMove(arguments));
    return DOMTimer::install(*context, WTFMove(action), Seconds::fromMilliseconds(timeout), true);
}
According to this source code, setTimeout takes an int as input, which is a 32-bit signed integer.
So the answer is no: sub-millisecond values are not taken into consideration.

The javascript timing resolution in my browsers seems to be ~8ms. How can I increase it? [duplicate]

Something that has always bugged me is how unpredictable the setTimeout() method in Javascript is.
In my experience, the timer is horribly inaccurate in a lot of situations. By inaccurate, I mean the actual delay time seems to vary by 250-500ms more or less. Although this isn't a huge amount of time, when using it to hide/show UI elements the time can be visibly noticeable.
Are there any tricks that can be done to ensure that setTimeout() performs accurately (without resorting to an external API) or is this a lost cause?
Are there any tricks that can be done to ensure that setTimeout() performs accurately (without resorting to an external API) or is this a lost cause?
No and no. You're not going to get anything close to a perfectly accurate timer with setTimeout() - browsers aren't set up for that. However, you don't need to rely on it for timing things either. Most animation libraries figured this out years ago: you set up a callback with setTimeout(), but determine what needs to be done based on the value of (new Date()).getTime() (or equivalent). This allows you to take advantage of more reliable timer support in newer browsers, while still behaving appropriately on older browsers.
It also allows you to avoid using too many timers! This is important: each timer is a callback. Each callback executes JS code. While JS code is executing, browser events - including other callbacks - are delayed or dropped. When the callback finishes, additional callbacks must compete with other browser events for a chance to execute. Therefore, one timer that handles all pending tasks for that interval will perform better than two timers with coinciding intervals, and (for short timeouts) better than two timers with overlapping timeouts!
Summary: stop using setTimeout() to implement "one timer / one task" designs, and use the real-time clock to smooth out UI actions.
Ref: http://www.sitepoint.com/creating-accurate-timers-in-javascript/
This site bailed me out on a major scale.
You can use the system clock to compensate for timer inaccuracy. If you run a timing function as a series of setTimeout calls — each instance calling the next — then all you have to do to keep it accurate is work out exactly how inaccurate it is, and subtract that difference from the next iteration:
var start = new Date().getTime(),
    time = 0,
    elapsed = '0.0';

function instance()
{
    time += 100;
    elapsed = Math.floor(time / 100) / 10;
    if(Math.round(elapsed) == elapsed) { elapsed += '.0'; }
    document.title = elapsed;

    var diff = (new Date().getTime() - start) - time;
    window.setTimeout(instance, (100 - diff));
}

window.setTimeout(instance, 100);
This method will minimize drift and reduce the inaccuracies by more than 90%.
It fixed my issues, hope it helps
I had a similar problem not long ago and came up with an approach that combines requestAnimationFrame with performance.now(), which works very effectively.
I'm now able to make timers accurate to approx 12 decimal places:
window.performance = window.performance || {};
performance.now = (function() {
    return performance.now ||
        performance.mozNow ||
        performance.msNow ||
        performance.oNow ||
        performance.webkitNow ||
        function() {
            // Doh! Crap browser! Fall back to low-resolution Date timestamps.
            return new Date().getTime();
        };
})();
http://jsfiddle.net/CGWGreen/9pg9L/
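The answer shows only the polyfill; a minimal sketch of actually combining requestAnimationFrame with performance.now() might look like this (the duration and callbacks are placeholders, not taken from the fiddle):
// Drive the display from requestAnimationFrame and measure elapsed
// time with performance.now(), which has sub-millisecond resolution.
function startTimer(durationMs, onTick, onDone) {
    var start = performance.now();
    function frame(now) {
        var elapsed = now - start; // rAF passes a high-resolution timestamp
        if (elapsed < durationMs) {
            onTick(elapsed);
            requestAnimationFrame(frame);
        } else {
            onDone(elapsed);
        }
    }
    requestAnimationFrame(frame);
}

startTimer(1000,
    function (t) { /* update the UI with t */ },
    function (t) { console.log('finished after ' + t + ' ms'); });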
If you need to get an accurate callback on a given interval, this gist may help you:
https://gist.github.com/1185904
function interval(duration, fn){
    var _this = this
    this.baseline = undefined

    this.run = function(){
        if(_this.baseline === undefined){
            _this.baseline = new Date().getTime()
        }
        fn()
        var end = new Date().getTime()

        // schedule the next tick relative to the ideal baseline, so any
        // lateness in this tick is subtracted from the next delay
        var nextTick = duration - (end - _this.baseline)
        _this.baseline += duration
        if(nextTick < 0){
            nextTick = 0
        }
        _this.timer = setTimeout(function(){
            _this.run()
        }, nextTick)
    }

    this.stop = function(){
        clearTimeout(_this.timer)
    }
}
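Usage might look like this (the one-second duration and logging callback are arbitrary examples):
// Create a drift-compensating one-second ticker, start it, stop it later.
var ticker = new interval(1000, function(){
    console.log('tick', new Date().toISOString())
})
ticker.run()
// ...later: ticker.stop()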
shog9's answer is pretty much what I'd say, although I'd add the following about UI animation/events:
If you've got a box that's supposed to slide onto the screen, expand downwards, then fade in its contents, don't try to make all three events separate with delays timed to make them fire one after another - use callbacks, so once the first event is done sliding it calls the expander, once that's done it calls the fader. jQuery can do it easily, and I'm sure other libraries can as well.
If you're using setTimeout() to yield quickly to the browser so its UI thread can catch up with any tasks it needs to do (such as updating a tab, or to not show the Long Running Script dialog), there is a new API called Efficient Script Yielding, aka setImmediate(), that may work a bit better for you.
setImmediate() operates very similarly to setTimeout(), yet it may run immediately if the browser has nothing else to do. In many situations where you are using setTimeout(..., 16) or setTimeout(..., 4) or setTimeout(..., 0) (i.e. you want the browser to run any outstanding UI thread tasks and not show a Long Running Script dialog), you can simply replace your setTimeout() with setImmediate(), dropping the second (millisecond) argument.
The difference with setImmediate() is that it is basically a yield; if the browser has something to do on the UI thread (e.g., update a tab), it will do so before returning to your callback. However, if the browser is already all caught up with its work, the callback specified in setImmediate() will essentially run without delay.
Unfortunately it is only currently supported in IE9+, as there is some push back from the other browser vendors.
There is a good polyfill available though, if you want to use it and hope the other browsers implement it at some point.
If you are using setTimeout() for animation, requestAnimationFrame is your best bet as your code will run in-sync with the monitor's refresh rate.
If you are using setTimeout() on a slower cadence, e.g. once every 300 milliseconds, you could use a solution similar to what user1213320 suggests, where you monitor how long it was from the last timestamp your timer ran and compensate for any delay. One improvement is that you could use the new High Resolution Time interface (aka window.performance.now()) instead of Date.now() to get sub-millisecond resolution for the current time.
You need to "creep up" on the target time. Some trial and error will be necessary, but in essence:
Set a timeout to complete around 100ms before the required time,
and make the timeout handler function like this:
calculate_remaining_time
if remaining_time > 20ms   // maybe as much as 50
    re-queue the handler for 10ms time
else
{
    while( remaining_time > 0 ) calculate_remaining_time;
    do_your_thing();
    re-queue the handler for 100ms before the next required time
}
But your while loop can still get interrupted by other processes, so it's still not perfect.
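A rough one-shot translation of that pseudocode into JavaScript (the 100 ms, 20 ms and 10 ms thresholds are the tuning values mentioned above; unlike the pseudocode, this sketch does not re-queue for a next period):
// Get close with coarse timeouts, then busy-wait the last few milliseconds.
function fireAt(targetMs, fn) {
    function handler() {
        var remaining = targetMs - Date.now();
        if (remaining > 20) {
            setTimeout(handler, 10); // still far away: re-queue a short timeout
        } else {
            while (Date.now() < targetMs) { /* spin down the final stretch */ }
            fn();
        }
    }
    // aim to wake up roughly 100 ms before the target
    setTimeout(handler, Math.max(0, targetMs - Date.now() - 100));
}

// Example: fire as close to one second from now as possible.
fireAt(Date.now() + 1000, function () {
    console.log('fired at', Date.now());
});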
Here's an example demoing Shog9's suggestion. This fills a jquery progress bar smoothly over 6 seconds, then redirects to a different page once it's filled:
var TOTAL_SEC = 6;
var FRAMES_PER_SEC = 60;
var percent = 0;
var startTime = new Date().getTime();

setTimeout(updateProgress, 1000 / FRAMES_PER_SEC);

function updateProgress() {
    var currentTime = new Date().getTime();

    // 1000 to convert to milliseconds, and 100 to convert to percentage
    percent = (currentTime - startTime) / (TOTAL_SEC * 1000) * 100;
    $("#progressbar").progressbar({ value: percent });

    if (percent >= 100) {
        window.location = "newLocation.html";
    } else {
        setTimeout(updateProgress, 1000 / FRAMES_PER_SEC);
    }
}
This is a timer I made for a music project of mine which does this very thing, and it is accurate on all devices:
var Timer = function(){
    var msSinceInitialized = 0;
    var timeAtLastInterval = new Date().getTime();

    // a fast heartbeat that accumulates real elapsed time from the clock
    setInterval(function(){
        var frametime = new Date().getTime();
        var timeElapsed = frametime - timeAtLastInterval;
        msSinceInitialized += timeElapsed;
        timeAtLastInterval = frametime;
    }, 1);

    this.setInterval = function(callback, timeout, args) {
        var timeStarted = msSinceInitialized;
        var interval = setInterval(function(){
            var totaltimepassed = msSinceInitialized - timeStarted;
            if (totaltimepassed >= timeout) {
                callback(args);
                timeStarted = msSinceInitialized;
            }
        }, 1);
        return interval;
    }
}

var timer = new Timer();
timer.setInterval(function(){ console.log("This timer will not drift."); }, 1000);
Hate to say it, but I don't think there is a way to alleviate this. I do think that it depends on the client system, though, so a faster javascript engine or machine may make it slightly more accurate.
In my experience it is a lost effort, as the smallest reasonable amount of time I have ever seen JS act on is around 32-33 ms. ...
There is definitely a limitation here. To give you some perspective, the Chrome browser Google just released is fast enough that it can execute setTimeout(function() {}, 0) in 15-20 ms whereas older Javascript engines took hundreds of milliseconds to execute that function. Although setTimeout uses milliseconds, no javascript virtual machine at this point in time can execute code with that precision.
Dan, from my experience (which includes implementation of the SMIL2.1 language in JavaScript, where time management is the subject), I can assure you that you actually never need high precision from setTimeout or setInterval.
What does matter, however, is the order in which setTimeout/setInterval callbacks get executed when queued - and that always works perfectly.
JavaScript timeouts have a de facto limit of 10-15ms (I'm not sure what you're doing to get 200ms, unless you're doing 185ms of actual JS execution). This is due to Windows having a standard timer resolution of 15ms; the only way to do better is to use Windows' higher-resolution timers, but that is a system-wide setting, so it can interfere with other applications on the system and also chews battery life (Chrome has a bug report from Intel on this issue).
The de facto standard of 10-15ms comes from people using 0ms timeouts on websites but then coding in a way that assumes a 10-15ms timeout (e.g. JS games which assume 60fps but ask for 0ms/frame with no delta logic, so the game/site/animation runs a few orders of magnitude faster than intended). To account for that, even on platforms that don't have Windows' timer problems, browsers limit timer resolution to 10ms.
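For reference, the "delta logic" mentioned above just means scaling each update by the measured elapsed time instead of assuming a fixed tick. A minimal sketch (the speed constant and tick rate are arbitrary):
// Scale movement by measured frame time so the speed stays the same
// regardless of how often the browser actually fires the callback.
var last = Date.now();
var position = 0;
var SPEED = 100; // hypothetical units per second

setInterval(function () {
    var now = Date.now();
    var dt = (now - last) / 1000; // seconds since the last tick
    last = now;
    position += SPEED * dt; // same real-world speed at any tick rate
}, 16);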
Here are what I use. Since it's JavaScript, I will post both my Frontend and node.js solutions:
For both, I use the same decimal-rounding function, which I highly recommend keeping within arm's reach, because reasons:
const round = (places, number) => +(Math.round(number + `e+${places}`) + `e-${places}`)
places - the number of decimal places at which to round; this should be safe and should avoid any issues with floats (some numbers, like 1.0000000000005~, can be problematic). I spent time researching the best way to round decimals produced by high-resolution timers converted to milliseconds.
that + symbol - it is a unary operator that converts an operand into a number, virtually identical to Number()
Browser
const start = performance.now()
// I wonder how long this comment takes to parse
const end = performance.now()
const elapsed = end - start            // keep this as a number for rounding
const adjusted = round(2, elapsed)     // see above rounding function
const result = adjusted + ' ms'
node.js
// Start timer
const startTimer = () => process.hrtime()

// End timer
const endTimer = (time) => {
    const diff = process.hrtime(time)
    const NS_PER_SEC = 1e9
    const result = (diff[0] * NS_PER_SEC + diff[1])
    const elapsed = Math.round((result * 0.0000010))
    return elapsed
}
// This end timer converts the number from nanoseconds into milliseconds;
// you can find the nanosecond version if you need some seriously high-resolution timers.
const start = startTimer()
// I wonder how long this comment takes to parse
const end = endTimer(start)
console.log(end + ' ms')
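On Node 10.7+ there is also process.hrtime.bigint(), which avoids the [seconds, nanoseconds] tuple arithmetic; a small sketch of the same measurement:
// Measure elapsed time with the BigInt variant of the high-resolution timer
const start = process.hrtime.bigint()
// I wonder how long this comment takes to parse
const end = process.hrtime.bigint()
const elapsedMs = Number(end - start) / 1e6 // nanoseconds -> milliseconds
console.log(elapsedMs + ' ms')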
You could also consider using the HTML5 Web Audio clock, which runs off the audio subsystem's clock rather than the system timer, for better accuracy.

How to compensate for setInterval timing drift in Javascript audio

I have two instances of setInterval. Each is triggering a different function (these two functions are titled quarterNoteFunc and eighthNoteFunc) at repeated intervals. The interval for quarterNoteFunc is 600 milliseconds. The interval for eighthNoteFunc is 300 milliseconds. Both of these functions each trigger a different audio file at repeated intervals, hence creating a basic music rhythm. The rhythm between the two function calls eventually "drifts" in Google Chrome, making the rhythm between the two sounds dissolve. My question is:
It seems that even though browser-based timing is garbage, there should be a way to create some kind of "hard" timing reference so that the sounds stay locked even if the "global" timing gets offset, hence keeping the sounds in sync. I thought deriving both intervals from the same variable, milliseconds (code below), would inhibit this - but I was wrong.
The (abbreviated) code looks like this
milliseconds = 600;
quarterNote = setInterval(quarterNoteFunc, milliseconds);
eighthNote = setInterval(eighthNoteFunc, milliseconds/2);
Probably the best way to do this is to have a single, always active 1/8 note interval, then call the quarter-note every other tick:
// wrapped in a closure to allow for a private tickCount variable
// alternatively, you could use a more advanced object with start/stop methods, etc.
(function() {
    var tickCount = 0,
        tick = function() {
            eighthNoteFunc();
            if(tickCount % 2 == 0) {
                quarterNoteFunc();
            }
            tickCount++;
        };
    setInterval(tick, 300);
})();
This ensures that the methods are always called on the same tick. You can also expand this to support half notes (tickCount % 4 == 0) and whole notes (tickCount % 8 == 0).
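For instance, the expanded tick might look like this (halfNoteFunc and wholeNoteFunc are hypothetical handlers you would supply):
var tickCount = 0;
setInterval(function() {
    eighthNoteFunc();                           // every tick (300 ms)
    if (tickCount % 2 === 0) quarterNoteFunc(); // every 2nd tick (600 ms)
    if (tickCount % 4 === 0) halfNoteFunc();    // hypothetical half-note handler
    if (tickCount % 8 === 0) wholeNoteFunc();   // hypothetical whole-note handler
    tickCount++;
}, 300);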
This interested me, so I decided to create a fully-working sample (except, using animated backgrounds instead of audio): http://jsfiddle.net/SycBm/
This allows you to see the eighth-, quarter-, and half- notes ticking in sync, as well as start & stop the timer, and independently enable or disable the notes.
Enjoy!

How can I make my setTimeout functions run at the same speed?

Preface: I have a demo of the problem on my personal site (I hope this is ok. If not, I can try to set it up on jsfiddle). I'm intending this question to be a little fun, while also trying to understand the time functions take in javascript.
I'm incrementing the value of progress bars on a timeout. Ideally (if functions run instantaneously) they should fill at the same speed, but in the real world, they do not. The code is this:
function setProgress(bar, myPer) {
    bar.progressbar({ value: myPer })
        .children('.ui-progressbar-value')
        .html(myPer.toPrecision(3) + '%')
        .attr('align', 'center');
    myPer++;
    if(myPer == 100) { myPer = 0; }
}

function moveProgress(bar, myPer, inc, delay){
    setProgress(bar, myPer);
    if(myPer >= 100) { myPer = 0; }
    setTimeout(function() { moveProgress(bar, myPer+inc, inc, delay); }, delay);
}

$(function() {
    moveProgress($(".progressBar#bar1"), 0, 1, 500);
    moveProgress($(".progressBar#bar2"), 0, 1, 500);
    moveProgress($(".progressBar#bar3"), 0, .1, 50);
    moveProgress($(".progressBar#bar4"), 0, .01, 5);
});
Naively, one would think these should all run (fill the progress bar) at the same speed.
However, in the first two bars (if we call "setting the progress bar" a single operation), I'm performing one operation every 500 ms, for a total of 100 operations to fill the bar; in the third, I'm performing one operation every 50 ms, for a total of 1,000 operations; in the fourth, I'm performing one operation every 5 ms, for a total of 10,000 operations.
What part of my code takes the longest and causes these speed differences, and what could be altered so that they keep working the way they do (the fourth bar gets smaller increments) but run at the same speed?
The biggest problem with using setTimeout for things like this is that your code execution happens between timeouts and is not accounted for in the value sent to setTimeout. If your delay is 5 ms and your code takes 5 ms to execute, you're essentially doubling your time.
Another factor is that once your timeout fires, if another piece of code is already executing, it will have to wait for that to finish, delaying execution.
This is very similar to problems people have when trying to use setTimeout for a clock or stopwatch. The solution is to compare the current time with the time that the program started and calculate the time based on that. You could do something similar. Check how long it has been since you started and set the % based on that.
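Applied to this question, that means deriving the percentage from elapsed wall-clock time instead of counting calls. A sketch reusing the question's setProgress helper (the 50-second total matches the bars' intended fill time):
// Each bar computes its percentage from real elapsed time, so all of
// them fill at the same real-world rate however slow the callbacks are.
function moveProgress(bar, durationMs, delay) {
    var start = Date.now();
    (function step() {
        var myPer = Math.min(100, (Date.now() - start) / durationMs * 100);
        setProgress(bar, myPer);
        if (myPer < 100) setTimeout(step, delay);
    })();
}

$(function() {
    moveProgress($(".progressBar#bar1"), 50000, 500);
    moveProgress($(".progressBar#bar4"), 50000, 5);
});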
What causes the speed difference? Two things: first, the fact that you are executing more code to fill the bottom bar (as you allude to in the second-to-last paragraph). Also, every time you set a timeout, your browser queues it up; the actual delay may be longer than what you specify, depending on how much is in the queue (see MDN on window.setTimeout).
Love the question. I don't have a very precise answer, but here are my 2 cents:
JavaScript is a very fast language that deals very well with its event loop, and therefore eats setTimeouts and setIntervals for breakfast.
There are limits though, and they depend on a large number of factors, such as browser and computer speed, the quantity of functions you have on the event loop, the complexity of the code to execute, and the timeout values...
In this case, I think it's obvious that if you try to execute one function every 500ms, it is going to behave a lot better than executing it every 50ms, and therefore a lot better than every 5ms. If you take into account that you are running them all on top of each other, you can predict that the performance will not be optimal.
You can try this exercise:
Take the 500ms one and run it alone. Mark the total time it took to fill the bar (right here you will see that it's going to take a little longer than predicted).
Try executing two 500ms timeouts at the same time, and see that the total time just got a bit longer.
If you add the 50ms one, and then the 5ms one, you will see that you lose performance every time...
