In the question "What is the minimum millisecond value of setTimeout?", people talk about the "minimal timeout of setTimeout", but I can't really understand it.
It says the minimal timeout value in the HTML5 spec is 4 ms, so I thought that if I ran the following code in a browser (say Chrome):
setTimeout(function() { console.log("333"); }, 3);
setTimeout(function() { console.log("222"); }, 2);
setTimeout(function() { console.log("111"); }, 1);
setTimeout(function() { console.log("000"); }, 0);
the output should be:
333
222
111
000
But actually it is:
111
000
222
333
It seems they are still run in order of the specified timeout even when it is less than 4 ms (except that 0 and 1 behave the same).
How should I understand the value 4ms?
The limit of 4 ms is specified by the HTML5 spec and is consistent across browsers released in 2010 and onward.
http://www.whatwg.org/specs/web-apps/current-work/multipage/timers.html#timers
To implement a 0 ms timeout in a modern browser, you can use window.postMessage()
https://developer.mozilla.org/en-US/docs/Web/API/Window/postMessage
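For illustration, here is a minimal sketch of that technique (the setZeroTimeout name and the marker string below are arbitrary choices of mine, not part of any standard API):
// A rough sketch of a zero-delay scheduler built on postMessage.
// "setZeroTimeout" and the message string are arbitrary; any unique string works.
var timeouts = [];
var messageName = "zero-timeout-message";

function setZeroTimeout(fn) {
    timeouts.push(fn);
    window.postMessage(messageName, "*");
}

window.addEventListener("message", function (event) {
    if (event.source === window && event.data === messageName) {
        event.stopPropagation();
        if (timeouts.length > 0) {
            timeouts.shift()(); // run the oldest queued callback
        }
    }
}, true);

// Usage: fires on the next message-event turn, without the 4 ms clamp.
setZeroTimeout(function () { console.log("ran without the 4 ms clamp"); });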
More info
https://developer.mozilla.org/en-US/docs/Web/API/WindowTimers/setTimeout
https://developer.mozilla.org/en/docs/Web/JavaScript/EventLoop#Event_loop
After reading the recommended articles (thanks to @Rayon and @rafaelcastrocouto), and also this one:
http://www.adequatelygood.com/Minimum-Timer-Intervals-in-JavaScript.html
I realise that maybe I misunderstood the meaning of "minimal" delay value.
The specified timeout in setTimeout has two meanings:
The function will run at or after the specified timeout (not before it)
The order is decided by the value of the specified timeout: the smaller, the earlier (and on some platforms, 0 is treated the same as 1). Tasks with the same timeout run in the order they were added
We don't need to be concerned with the "minimal" delay value (e.g. 4 ms) at this layer.
The tasks are then executed by the JavaScript runtime. The runtime takes tasks from the event queue one by one (checking whether each timeout has elapsed) and executes them. For performance reasons, the runtime can't run the next task immediately after the previous one finishes; there may be a tiny delay (depending on the implementation), and in the HTML5 spec this delay should be >= 4 ms (it was 10 ms in earlier browsers).
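A rough way to see that clamp in action is to time a chain of nested zero-delay timeouts; this is just my own sketch, and the result will vary by browser and load:
// Schedule 100 nested setTimeout(fn, 0) calls and measure the total time.
// In browsers that clamp nested zero-delay timeouts to 4 ms, this typically
// takes on the order of 400 ms rather than finishing almost instantly.
var start = Date.now();
var remaining = 100;

function tick() {
    if (--remaining === 0) {
        console.log("100 nested zero-delay timeouts took " + (Date.now() - start) + " ms");
        return;
    }
    setTimeout(tick, 0);
}

setTimeout(tick, 0);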
Related
I got this code over here:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    } else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);
On my computer it always executes correctly, with 1000 in the console output. Interestingly, on another computer, with the same code, the timeout callback starts in less than a second and the difference currentDate - date is between 980 and 998.
I know there are libraries that solve this inaccuracy (for example, Tock).
Basically, my question is: what are the reasons why setTimeout does not fire at the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event earlier?
PS: Here is a screenshot of the code and the results executed in the Chrome JavaScript console:
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way that setTimeout is usually implemented, it is just meant to execute after a given delay, and once the browser's thread is free to execute it.
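As a small illustration of that last point (my own example, not from the quoted docs), a long-running synchronous task pushes the callback well past its nominal delay:
// The callback is due after 100 ms, but the busy loop below keeps the
// thread occupied for about 500 ms, so the callback can only run after that.
var scheduled = Date.now();
setTimeout(function () {
    console.log("fired after " + (Date.now() - scheduled) + " ms"); // ~500, not 100
}, 100);

var end = Date.now() + 500;
while (Date.now() < end) {
    // busy-wait, blocking the event loop
}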
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate - date);
}, 1000);
// Browser Test1 Test2 Test3 Test4
// Chrome 998 1014 998 998
// Firefox 1000 1001 1047 1000
// IE 11 1006 1013 1007 1005
Perhaps the < 1000 ms times from Chrome could be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code; maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't quite completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason for this is that even on an octa-core hyperthreaded processor the OS is usually juggling several hundred processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling them to each get a slice of CPU time one after another, which means they get a few milliseconds at most to do their thing.
Implicitly this means that if you set a timeout for 1000 ms, the chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds have passed that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the correct point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them since it could theoretically allow someone to attack the OS's stability from inside a web page, by carefully constructing code that starves other threads by swamping the system with dangerous timers.
As for why the test yields 980 I'm not sure - that would depend on exactly which browser you're using and which JavaScript engine. I can however fully understand if the browser just manually corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time" - it would make a lot of sense from the sandboxing principle to just approximate the amount of time required without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine)
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
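If you want a rough, unofficial check of the clock granularity on your own machine, you could sample Date.now() in a tight loop and tally the step sizes you observe:
// Sample Date.now() repeatedly and count how big the jumps between distinct
// values are. On a 1 ms clock you should mostly see steps of 1; on a coarse
// clock you may see steps of 10-16 ms. The loop only runs for a few tens of ms.
var last = Date.now();
var steps = {};
for (var i = 0; i < 1e6; i++) {
    var now = Date.now();
    if (now !== last) {
        steps[now - last] = (steps[now - last] || 0) + 1;
        last = now;
    }
}
console.log(steps); // e.g. { 1: 40 } on a fine clock, { 15: 3, 16: 1 } on a coarse one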
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));
setTimeout(function () {
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // Wait until the next whole second to start.
I noticed it would skip a second every couple of seconds; sometimes it would go for longer.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
JavaScript has a way of dealing with exact time frames. Here's one approach:
You could save Date.now() when you start waiting, create an interval with a short update period, and calculate the difference between the two timestamps.
Example:
const startDate = Date.now()
const timer = setInterval(() => {
  const currentDate = Date.now()
  if (currentDate - startDate >= 1000) {
    // at least a second has passed
    clearInterval(timer)
    return
  }
  // less than a second so far
}, 50)
I have a bit of node.js code as such:
var start = Date.now();
setTimeout(function() {
    console.log(Date.now() - start);
    for (var i = 0; i < 100000; i++) {
    }
}, 1000);
setTimeout(function() {
    console.log(Date.now() - start);
}, 2000);
Something strange happens when I run it on my machine: the times I get are somewhere between 970 and 980, and somewhere between 1970 and 1980. Why am I getting times that are earlier than the specified timeouts?
I believe you're experiencing these issues because of the Date precision. It can vary across platforms and browsers.
Here's a more detailed read on the accuracy of Date.
There are more high precision timers available on some platforms, but you will often have to do some mix and match (detect what's available and regress accordingly).
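For example, in Node.js you could rule out Date's precision by timing the callback with process.hrtime() instead; this is only a sketch, and you should adjust it to whatever high-resolution API your environment actually provides:
// Measure the actual setTimeout delay with the high-resolution timer
// process.hrtime() instead of Date.
var start = process.hrtime();

setTimeout(function () {
    var diff = process.hrtime(start); // [seconds, nanoseconds] since start
    var ms = diff[0] * 1e3 + diff[1] / 1e6;
    console.log("setTimeout fired after " + ms.toFixed(3) + " ms");
}, 1000);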
From the Node.js setTimeout documentation:
It is important to note that your callback will probably not be called
in exactly delay milliseconds - Node.js makes no guarantees about the
exact timing of when the callback will fire, nor of the ordering
things will fire in. The callback will be called as close as possible
to the time specified.
However, there is a nanotimer to play with:
https://www.npmjs.org/package/nanotimer
Some related questions:
node.js setTimeout resolution is very low and unstable
Why does node.js handle setTimeout(func, 1.0) incorrectly?
However, I do think being ~30 ms early is a bit much (compared to most browsers I've played with, which are usually no more than 10 ms late, as long as the CPU isn't maxed out, that is).
I have a method changeColor that updates the CSS on some elements in my HTML.
I also have a timer that controls this being applied, i.e.:
var timer = setInterval(changeColor,0);
The problem I'm facing is that using a time interval of 0 results in the changeColor method not being run; however, if I change it to something minimal like:
var timer = setInterval(changeColor, 1);
it works.
Now I'd be happy to use this; however, in IE8 this causes a slight delay in the colors appearing.
Any ideas on how to resolve this?
Thanks.
The setInterval function takes a function to call and a delay in milliseconds. You cannot have a delay of 0 milliseconds; there is a minimum delay in place (which according to the specs is 4ms). Take a look at the documentation for more.
// To call a function every second:
var timer = setInterval(myFunction, 1000);
// This doesn't throw an error as the 0 is being overridden by a default minimum:
var timer = setInterval(myFunction, 0);
If you want to call the function initially and ALSO every second after that, you should call the function when you set the interval:
var timer = setInterval(myFunction, 1000);
myFunction();
Here's what the Mozilla docs say about the minimum delay:
"In fact, 4ms is specified by the HTML5 spec and is consistent across browsers released in 2010 and onward. Prior to (Firefox 5.0 / Thunderbird 5.0 / SeaMonkey 2.2), the minimum timeout value for nested timeouts was 10 ms."
Regarding the slowness on IE8, the setInterval "lag" is probably caused by IE8 being too slow to keep up with what the function is trying to do. At each interval the function is called, but IE8's queue gets overloaded as a result, to the point that IE8 can't keep up. Increasing the delay would mask this issue, I'd imagine.
As Vasile says on this Google Code forum:
"When a new call is triggered if the previous call is not ended then the new call is queued and will wait for it's time to be executed; and here is the problem... (the timer will slow down when the queue will grow and the performance will go down over time)"
Note that this is a common problem for low delays in IE8; check this post for more on this specific issue.
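One common workaround for that pile-up, sketched below using the question's changeColor function and an arbitrary 100 ms delay, is a self-scheduling setTimeout, so the next call is only queued after the current one has finished:
// Self-scheduling timeout: the next tick is scheduled only after the current
// call to changeColor returns, so slow handlers can't cause queued calls to
// pile up the way they can with setInterval.
function tick() {
    changeColor();
    setTimeout(tick, 100); // adjust the delay to taste
}
setTimeout(tick, 100);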
Also, a quick point to note about setInterval delays is that inactive tabs are occasionally treated differently:
"In (Firefox 5.0 / Thunderbird 5.0 / SeaMonkey 2.2) and Chrome 11, timeouts are clamped to firing no more often than once per second (1000ms) in inactive tabs..."
See this related SO post for more.
I'm currently creating a countdown using setInterval, though at the moment it runs slower than it should. According to MDN, the delay parameter is in milliseconds; however, it isn't accurate.
I compared my countdown to the one on my phone and the phone runs nearly 5 times faster.
var count = setInterval(function() {
    if (iMil == 0) {
        if (iS == 0) {
            if (iMin == 0) {
                if (iH == 0) {
                    // DONE
                } else {
                    iH--;
                    iMin = 59;
                    iS = 59;
                    iMil = 999;
                }
            } else {
                iMin--;
                iS = 59;
                iMil = 999;
            }
        } else {
            iS--;
            iMil = 999;
        }
    } else {
        iMil--;
    }
    hours.text(iH);
    minutes.text(iMin);
    seconds.text(iS);
    milliseconds.text(iMil);
}, 1);
This is the main part of my script. The variables hours, minutes, seconds and milliseconds are jQuery object elements.
What I'm getting at is: is there a reason that it runs slower than it is supposed to?
setInterval() is not guaranteed to run perfectly on time in JavaScript. That's partly because JS is single-threaded and partly for other reasons. If you want to display a time with setInterval(), then get the current time on each timer tick and display that. The setInterval() won't be your timer, but just a recurring screen-update mechanism. Your time display will always be accurate if you do it that way.
In addition, no browser will guarantee a call to your interval at 1ms intervals. In fact, many browsers will never call setInterval more often than every 5ms and some even longer than that. Plus, if there are any other events happening in the browser with other code responding to those events, the setInterval() call might be delayed even longer. The HTML5 spec proposes 4ms as the shortest interval for setTimeout() and 10ms as the shortest interval for setInterval(), but allows the implementor to use longer minimum times if desired.
In fact, if you look at this draft spec for timers, step 5 of the algorithm says:
If timeout is less than 10, then increase timeout to 10.
And, step 8 says this:
Optionally, wait a further user-agent defined length of time.
And, it includes this note:
This is intended to allow user agents to pad timeouts as needed to
optimise the power usage of the device. For example, some processors
have a low-power mode where the granularity of timers is reduced; on
such platforms, user agents can slow timers down to fit this schedule
instead of requiring the processor to use the more accurate mode with
its associated higher power usage.
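Here is a minimal sketch of the clock-based approach described above, reusing the question's jQuery objects; the end time and the 50 ms refresh rate are arbitrary examples:
// Count down to a fixed end time. The interval only redraws the display;
// the value shown is always derived from Date.now(), so drift in the
// interval itself can't make the countdown run fast or slow.
var endTime = Date.now() + 60 * 60 * 1000; // e.g. one hour from now

var timer = setInterval(function() {
    var remaining = Math.max(0, endTime - Date.now());
    var h = Math.floor(remaining / 3600000);
    var min = Math.floor(remaining / 60000) % 60;
    var s = Math.floor(remaining / 1000) % 60;
    var mil = remaining % 1000;

    hours.text(h);
    minutes.text(min);
    seconds.text(s);
    milliseconds.text(mil);

    if (remaining === 0) {
        clearInterval(timer);
    }
}, 50); // ~20 updates per second is plenty for a display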
All timeout/interval/scheduling functions should be expected to run later than requested.
It is in the nature of computers, and very common in operating systems, that the CPU has many things to handle; behaving like a real-time system would be too costly (and is not really possible here).
If you read their API docs, https://developer.mozilla.org/en/docs/Web/API/window.setTimeout and https://developer.mozilla.org/en/docs/Web/API/window.setInterval , they say "after a specified delay" and "fixed time delay between each call". They do not say "at a specified time" nor "called at a fixed period".
This question already has answers here: Why does setTimeout() "break" for large millisecond delay values? (7 answers). Closed 8 years ago.
I found myself on this jsPerf page; why does that happen?
JSPerf smallest timeout
Any clues?
And why is 4 faster than 0 too?
There is a comment about this on the test page; you should read it closely.
Info
The smallest setTimeout timeout value allowed by the HTML5 specification is 4 ms. Smaller
values should clamp to 4 ms.
Therefore, the first two tests below should have about the same result.
P.S. Some browsers freak out when you use a value greater than 599147937791 for the
timeout (i.e. they use 0 or 4 ms instead), hence the last test.
Essentially, JavaScript has internal handling for 0 and 599147937792, as they qualify as under/overflow values for setTimeout, and they are rounded to the default minimum accepted value of 4 ms. This is probably because it is unreasonable to ask for a 0 ms delay, as it would likely take longer than that just to process the call and determine that this is what the user wants. The issue with the larger value is probably due to the fact that computers have limits on how large a number they can represent.
To understand why the tests with the large and small values finish after the 4 ms test, for example, note that the internal handling takes time as well; a very small amount, but time. Consider these two timelines:
Timeline 1
setTimeout(...,0) is called
The function checks boundary conditions (something like if (time < 4) {// use 4})
The function needs an extra step here to change the value from 0 -> 4.
Now it sets the timeout for 4 ms
Timeline 2
setTimeout(...,4) is called
The function checks boundary conditions (something like if (time < 4) {// use 4})
Everything is ok, moves along.
Now it sets the timeout for 4 ms
Step 3 in the two timelines takes longer in the first case as there is the extra step of changing the value. Both will wait for the same amount of time, but the second one will start its timing ever so slightly sooner. This is much the same with 599147937792, except the check will be for the upper bound.
The phrasing "freaks out" makes me think it might look more like
try {
// try with the given input
} catch (Exception) {
// Ahh I freaked out, just use 4 instead!!!!111!!!
}
From MDN:
Browsers including Internet Explorer, Chrome, Safari, and Firefox store the delay as a 32-bit signed Integer internally. This causes an Integer overflow when using delays larger than 2147483647, resulting in the timeout being executed immediately.
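You can see that overflow directly with a quick test (my own illustration; the exact behaviour may vary by browser):
// 2147483647 is the largest delay that fits in a signed 32-bit integer.
// One millisecond more overflows, and the callback fires almost immediately.
var t0 = Date.now();
setTimeout(function () {
    console.log("overflowed delay fired after " + (Date.now() - t0) + " ms"); // ~0-1 ms
}, 2147483648); // 2^31, one more than the maximum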