I'm considering adding an alert() to our JavaScript utility assert function.
We're an Ajax-heavy application, and our framework (Ext) implements Ajax by polling for the response with setInterval instead of waiting for readyState == 4. This causes all of our Ajax callbacks to execute in a setInterval stack context, where an exception/assert blowing out of a callback usually fails silently.
How does a low-level alert() impact the browser event loop? The message box by definition must allow the Win32 event loop to pump (to respond to the message box button). Does that mean other browser events, like future setIntervals generated by our framework, resize events, etc., are going to fire? Can this cause trouble for me?
IIRC, you can use Firefox 2 and Firefox 3.5 to see the difference I'm talking about.
alert('1');
setTimeout(function(){alert('2');}, 10);
alert('3');
Firefox 3.5 shows 1-3-2. Firefox 2[1] shows 1-2&3 (2 and 3 stacked on top of each other simultaneously). We can replicate 1-2&3 in IE8 with a Win32 message box launched from ActiveX instead of an alert as well, which wreaked havoc on us back in the day, and I want to make sure we don't go down that path again.
Can anyone point me to specific low-level resources that explain this behavior: what the expected behavior is, what exactly is going on at a low level, and why the behavior changed across Firefox versions?
[1] You can replicate this on Spoon.net, which I can't get working right now. I just reproduced it in a VM with Firefox 2.0.0.20.
First, timers in JavaScript are not very precise. Intervals smaller than 30ms might be considered all the same, and implementations vary. Don't rely on any implicit ordering.
An alert() will always halt the event loop. If an event or timer fires during the alert, it will be queued and called after the event loop resumes (when the alert box is closed).
Take this example:
var hello = document.getElementById('hello')
setTimeout(function(){
hello.style.backgroundColor = 'lime'
}, 5000)
alert('Stop!')
setTimeout(function(){
hello.innerHTML = 'collaborate'
}, 20)
setTimeout(function(){
hello.innerHTML = 'listen'
}, 1000)
There are two possible outcomes:
You close the alert box in under 5 seconds. The two timers that follow will be set and fire at specified intervals. You can see that the event loop is halted because regardless of how long you wait to close the alert, "listen" will always take 1s to execute.
You take longer than 5 seconds to close the alert. The first interval (bgColor) will have passed, so it executes immediately, followed by the two timers being set and called.
http://jsbin.com/iheyi4/edit
As for intervals, while the event loop is stopped it also "stops time", so in this case:
i = 0
setInterval(function(){
document.getElementById('n').innerHTML = ++i
}, 1000)
setTimeout(function(){
alert('stop')
}, 5500)
Regardless of how long you take to close the alert, the next number will always be 6; the setInterval callback won't fire multiple times to catch up.
http://jsbin.com/urizo6/edit
I haven't been able to replicate the 1-2&3 case, but here is a fiddle that may help you debug what is going on in the different browsers.
Assumptions: the rAF now time is calculated at the time the set of callbacks is triggered. Therefore any blocking that happens before the first callback of that frame is called doesn't affect the rAF now, and it's accurate, at least for that first callback.
Any performance.now() measurements made before a rAF set is triggered should be earlier than rAF now.
Test: Record before (a baseline time before anything happens). Set the next rAF. Compare rAF now and actual performance.now() to before to see how different they are.
Expected results: before should be earlier than the rAF now, which in turn should be earlier than the actual performance.now() taken inside the callback. The test code:
var before = performance.now(), frames = ["with blocking", "with no blocking"], calls = 0;
requestAnimationFrame(function frame(rAFnow) {
var actual = performance.now();
console.log("frame " + (calls + 1) + " " + frames[calls] + ":");
console.log("before frame -> rAF now: " + (rAFnow - before));
console.log("before frame -> rAF actual: " + (actual - before));
if (++calls < frames.length) { before = actual; requestAnimationFrame(frame); }
});
// blocking
for (var i = 0, l = 0; i < 10000000; i++) {
l += i;
}
Observations: When there is blocking before the frame starts, the rAF now time is at times incorrect, even for that first frame. Sometimes the first frame's now is actually an earlier time than the recorded before time.
Whether there is blocking happening before the frame or not, every so often the in-frame time rAFnow will be earlier than the pre-frame time before, even when I set up the rAF after I take my first measurement. This can also happen without any blocking whatsoever, though this is rarer.
(I get a timing error on the first blocking frame most of the time. Getting an issue on the others is rarer, but does happen occasionally if you try running it a few times.)
With more extensive testing, I've found bad times in 1% of 100 frames with blocking prior to the callback, and in roughly 0.22% of ~400 frames with no blocking, seemingly caused by opening a window or some other potentially CPU-intensive action by the user.
So it's fairly rare, but the problem is this shouldn't happen at all. If you want to do useful things with them, simulating time, animation, etc., then you need those times to make sense.
I've taken into account what people have said, but maybe I am still not understanding how things work. If this is all per-spec, I'd love some pseudo-code to solidify it in my mind.
And more importantly, if anyone has any suggestions for how I could get around these issues, that would be awesome. The only thing I can think of is taking my own performance.now() measurement every frame and using that--but it seems a bit wasteful, having it effectively run twice every frame, on top of any triggered events and so on.
The timestamp passed in to the requestAnimationFrame() callback is the time of the beginning of the animation frame. Multiple callbacks being invoked during the same frame all receive the same timestamp. Thus, it would be really weird if performance.now() returned a time before the parameter value, but not really weird for it to be after that.
Here's the relevant specification:
When the user agent is to run the animation frame callbacks for a Document document with a timestamp now, it must run the following steps:
If the value returned by the document object’s hidden attribute is true, abort these steps. [PAGE-VISIBILITY]
Let callbacks be a list of the entries in document’s list of animation frame callbacks, in the order in which they were added to the list.
Set document’s list of animation frame callbacks to the empty list.
For each entry in callbacks, in order: invoke the Web IDL callback function, passing now as the only argument, and if an exception is thrown, report the exception.
So you've registered a callback (let's say just one) for the next animation frame. Tick tick tick, BOOM, time for that animation frame to happen:
The JavaScript runtime makes a note of the time and labels that now.
The runtime makes a temporary copy of the list of registered animation frame callbacks, and clears the actual list (so that they're not accidentally invoked if things take so long that the next animation frame comes around).
The list has just one thing in it: your callback. The system invokes that with now as the parameter.
Your callback starts running. Maybe it's never been run before, so the JavaScript optimizer may need to do some work. Or maybe the operating system switches threads to some other system process, like starting up a disk buffer flush or handling some network traffic, or any of dozens of other things.
Oh right, your callback. The browser gets the CPU again and your callback code starts running.
Your code calls performance.now() and compares it to the now value passed in as a parameter.
Because a brief but non-ignorable amount of time may pass between step 1 and step 6, the return value from performance.now() may indicate that a couple of microseconds, or even more than a couple, have elapsed. That is perfectly normal behavior.
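To make that concrete, here's a tiny check (my own illustrative snippet, not from the question):
requestAnimationFrame(function (now) {
    // "now" was captured when the frame began (step 1); this code runs at
    // step 6, so a small positive difference is expected and harmless
    var delta = performance.now() - now;
    console.log("callback started " + delta.toFixed(3) + "ms after the frame timestamp");
});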
I encountered the same issue on Chrome, where calls to performance.now() would return a higher value than the now value passed into subsequent callbacks made by window.requestAnimationFrame().
My workaround was to set the baseline before using the now passed to the callback in the first window.requestAnimationFrame() rather than performance.now(). It seems that using only one of the two functions to measure time guarantees progressing values.
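A rough sketch of that workaround (the variable names are illustrative): take the baseline from the first rAF timestamp rather than from performance.now(), so every measurement comes from the same clock.
var before = null;
requestAnimationFrame(function frame(now) {
    if (before === null) before = now;  // baseline from the rAF clock itself
    var elapsed = now - before;         // same clock on both sides, so never negative
    // ... drive the animation with elapsed ...
    requestAnimationFrame(frame);
});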
I hope this helps anyone else suffering through this bug.
If I do:
setTimeout(function(){ alert('antani'); },400);
setTimeout(function(){ alert('supercazzola'); },400);
why does this script queue these two timeouts one after the other?
Shouldn't they both alert at the same moment?
As far as I can see when testing it, the first alert is executed, then the second.
Background
JavaScript has only one thread in which the interpreter is running. This means that events are never processed at the same time. When you set a timeout you actually subscribe to an internal event of the browser that fires each time the browser is in idle mode (when not busy with rendering, parsing or executing some script, etc.). The handlers of this event, as well as the times they were scheduled for, are posted to the queue in the order the setTimeout calls appeared in the script. Each time this internal event fires, the browser checks each handler and decides whether to execute it and remove it from the queue, or to skip it.
Same Time
When you schedule tasks one after another with the same estimation time:
setTimeout(function(){ $('.element').hide(); },400);
setTimeout(function(){ $('.element').show(); },400);
the .element will be first hidden and then shown. Note that it does not mean that the browser will render the change to .element after it's hidden. Most browsers will only render after the script has finished executing.
Different Time
When you schedule tasks with different estimation times:
setTimeout(function(){ $('.element').hide(); },401);
setTimeout(function(){ $('.element').show(); },400);
the result may be unpredictable. The following cases may occur:
More than or equal to 400 and less than 401 milliseconds have passed when the browser starts to process the event handlers. In this case .element will first be shown and then hidden. Note that there may be other setTimeouts scheduled to execute after 400 milliseconds, and they will run before .element is hidden.
The browser was busy for 401 milliseconds or more before it first started processing event handlers. In this case, most likely (depending on the browser implementation), .element will first be hidden and then shown, despite the fact that according to the estimation times it should be vice versa!
Regarding your question, whether it is the same to set timeouts with the same time or with some positive delta: the answer is NO. It is not the same; when you set timeouts with a delta, there is always a possibility that another event or timeout will be processed between them.
Please read: http://ejohn.org/blog/how-javascript-timers-work/
Here's a similar example:
function a(){
var num = 5;
console.log( ++num );
setTimeout( a, 100 );
};
setTimeout(a,2000);
In chronological order:
you are defining function a without calling it
you are scheduling a to be invoked after two seconds: setTimeout(a,2000)
it is called
when it is called, it schedules itself for invocation after 100 milliseconds
Your code basically sleeps for 2 seconds and then executes a repeatedly with 100 millisecond pauses.
However, judging by the context, you are asking what the priority is in the following situation:
setTimeout(a, 2000);
setTimeout(b, 100);
Well, most likely b will be called first (assuming there is no unpredictable pause between the first and second lines, e.g. due to an overall OS performance problem).
If you use the same timeouts:
setTimeout(a, 100);
setTimeout(b, 100);
a will most likely be called first. However, I don't think this is guaranteed; it depends on the JS engine (whether it uses a strict FIFO list for upcoming events, what the internal clock resolution is, etc.).
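A quick way to check in any given browser (in practice this logs a then b in every browser I've tried, subject to the caveat above):
// same delay, registered in this order
setTimeout(function () { console.log('a'); }, 100);
setTimeout(function () { console.log('b'); }, 100);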
I'd like to continuously execute a piece of JavaScript code on a page, spending all the CPU time I can on it, but allowing the browser to stay functional and responsive at the same time.
If I just run my code continuously, it freezes the browser's UI and browser starts to complain. Right now I pass a zero timeout to setTimeout, which then does a small chunk of work and loops back to setTimeout. This works, but does not seem to utilize all available CPU. Any better ways of doing this you might think of?
Update: To be more specific, the code in question is rendering frames on canvas continuously. The unit of work here is one frame. We aim for the maximum possible frame rate.
Probably what you want is to centralize everything that happens on the page and use requestAnimationFrame to do all your drawing. So basically you would have a function/class that looks something like this (you'll have to forgive some style/syntax errors; I'm used to MooTools classes, so just take this as an outline):
var Main = function(){
    this.queue = [];
    this.actions = {};
    // bind loop once so "this" refers to the instance inside requestAnimationFrame
    this.loop = this.loop.bind(this);
    requestAnimationFrame(this.loop);
};
Main.prototype.loop = function(){
    // drain queued events in FIFO order before rendering this frame
    while (this.queue.length){
        var action = this.queue.shift();
        this.executeAction(action);
    }
    // do your rendering here
    requestAnimationFrame(this.loop);
};
Main.prototype.addToQueue = function(e){
    this.queue.push(e);
};
Main.prototype.addAction = function(target, event, callback){
    if (this.actions[target] === void 0) this.actions[target] = {};
    if (this.actions[target][event] === void 0) this.actions[target][event] = [];
    this.actions[target][event].push(callback);
};
Main.prototype.executeAction = function(e){
    // targets are used as plain object keys, so strings (e.g. element ids) work best
    if (this.actions[e.target] !== void 0 && this.actions[e.target][e.type] !== void 0){
        for (var i = 0; i < this.actions[e.target][e.type].length; i++){
            this.actions[e.target][e.type][i](e);
        }
    }
};
So basically you'd use this class to handle everything that happens on the page. Every event handler just adds its event to the queue (e.g. onclick='main.addToQueue(event)' or however you want to wire up your page), and you use addAction to direct those events to whatever you want them to do. This way every user action gets executed as soon as your canvas is finished redrawing and before it gets redrawn again. As long as your canvas renders at a decent frame rate, your app should remain responsive.
EDIT: forgot the "this" in requestAnimationFrame(this.loop); note that loop also needs to be bound to the instance, as done in the constructor above.
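A hypothetical usage sketch (the element id and the descriptor shape are my own; plain strings as targets keep the key lookup straightforward):
var main = new Main();

// register what should happen for a given target/event pair
main.addAction('saveButton', 'click', function (e) {
    console.log('save clicked, handled at the frame boundary', e);
});

// a DOM handler only enqueues; the work runs in the next loop() pass
document.getElementById('saveButton').addEventListener('click', function () {
    main.addToQueue({ target: 'saveButton', type: 'click' });
});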
Web workers are something to try:
https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers
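A minimal sketch of the idea ("render-worker.js", computeFrame and drawFrameToCanvas are hypothetical names): the worker spins on the heavy computation off the UI thread and posts each finished result back, since workers can't touch the DOM or the canvas directly.
// main thread
var worker = new Worker('render-worker.js');
worker.onmessage = function (e) {
    drawFrameToCanvas(e.data);  // paint the posted frame data
};
worker.postMessage('start');

// render-worker.js
self.onmessage = function () {
    while (true) {               // free to spin: this thread owns no UI
        self.postMessage(computeFrame());
    }
};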
You can tune your performance by changing the amount of work you do per invocation. In your question you say you do a "small chunk of work". Establish a parameter which controls the amount of work being done and try various values.
You might also try to set the timeout before you do the processing. That way the time spent processing should count towards any minimum the browsers set.
One technique I use is to have a counter in my processing loop counting iterations. Then set up an interval of, say, one second; in that function, display the counter and reset it to zero. This provides a rough performance value with which to measure the effects of changes you make.
In general this is likely to be very dependent on specific browsers, even versions of browsers. With tunable parameters and performance measurements you could implement a feedback loop to optimize in real-time.
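A sketch of that setup, assuming your unit of work can be chunked (chunkSize and the reporting interval are illustrative starting points):
var chunkSize = 1000;   // the tunable parameter
var iterations = 0;

function doWork() {
    for (var i = 0; i < chunkSize; i++) {
        // ... one small unit of work (e.g. render part of a frame) ...
        iterations++;
    }
    setTimeout(doWork, 0);  // yield to the browser, then continue
}

// rough performance readout: iterations per second for the current chunkSize
setInterval(function () {
    console.log(iterations + ' iterations/sec');
    iterations = 0;
}, 1000);

doWork();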
One can use window.postMessage() to overcome the limitation on the minimum amount of time setTimeout enforces. See this article for details. A demo is available here.
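For reference, a sketch along the lines of the technique that article describes: callbacks queued here run on the next message event, which isn't subject to the several-millisecond minimum that nested setTimeout(fn, 0) calls get clamped to.
var timeouts = [];
var messageName = 'zero-timeout-message';

function setZeroTimeout(fn) {
    timeouts.push(fn);
    window.postMessage(messageName, '*');
}

window.addEventListener('message', function (event) {
    if (event.source === window && event.data === messageName) {
        event.stopPropagation();
        if (timeouts.length > 0) timeouts.shift()();  // run the oldest queued callback
    }
}, true);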
_.debounce() fires at most every x milliseconds with _.debounce(function, x). I want to adapt this to only execute a method x millis after the last _.debounce() call.
How do I go about this? (I've read that $.debounce does exactly that btw.)
I've tried to do this, but it isn't bullet-proof (not to mention butt-ugly)
var timeout;
$(window).on("resize",_.debounce(function(){
if(timeout){
clearTimeout(timeout);
}
//when debounce comes in we cancel it.. this means only the latest debounce actually fires.
//not bullet proof
timeout = setTimeout(resizeMap,100);
},50));
How to do this elegantly?
After reading your comment, this is clearer now.
well perhaps it's my browser (infrequent resize events causing _.debounce to be called? testing on Chrome), but while resizing I keep getting multiple calls to the body of the debounced function. As if it's behaving exactly like _.throttle, now I come to think of it. Weird stuff.
50ms is a pretty low debounce time. I'm betting it was working as intended and you just need a longer debounce time. 50ms is 1/20th of a second. I'm not sure the window resize event fires that quickly, but even if it does, the tiniest pause in mouse movement while resizing could trigger this.
Remove all this setTimeout nonsense in your debounced function and set the debounce time to something more like 250 and I bet it will work just like you want.
From http://underscorejs.org/#debounce:
_.debounce(function, wait, [immediate])
Pass true for the immediate parameter to cause debounce to trigger the
function on the leading instead of the trailing edge of the wait
interval. Useful in circumstances like preventing accidental
double-clicks on a "submit" button from firing a second time.
So, $(window).on("resize",_.debounce(resizeMap,100)) should just work.
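For contrast, a small sketch of the difference (assuming jQuery and Underscore are loaded; the 250ms figure follows the suggestion above):
// debounce: fires once, 250ms after the *last* resize event
$(window).on('resize', _.debounce(function () {
    console.log('resize settled');
}, 250));

// throttle: fires at most once every 250ms *while* resizing
$(window).on('resize', _.throttle(function () {
    console.log('still resizing');
}, 250));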
I have a complex animation sequence involving fades and transitions in JavaScript. During this sequence, which consists of four elements changing at once, a setTimeout is used on each element.
Tested in Internet Explorer 9, the animation works at realtime speed (it should take 1.6 seconds and it took exactly 1.6 seconds). ANY other browser will lag horribly, with animation times of 4 seconds (Firefox 3 and 4, Chrome, Opera) and something like 20 seconds in IE 8 and below.
How can IE9 go so fast while all other browsers are stuck in the mud?
I have tried to find ways of merging the elements into one, so as to one have one setTimeout at any given time, but unfortunately it wouldn't stand up to any interference (such as clicking a different link to start a new animation before the current one has finished).
EDIT: To elaborate in response to comments, here's the outline of the code:
link.onclick = function() {
clearTimeout(colourFadeTimeout);
colourFadeTimeout = setTimeout("colourFade(0);",25);
clearTimeout(arrowScrollTimeout);
arrowScrollTimeout = setTimeout("arrowScroll(0);",25);
clearTimeout(pageFadeOutTimeout);
pageFadeOutTimeout = setTimeout("pageFadeOut(0);",25);
clearTimeout(pageFadeInTimeout);
pageFadeInTimeout = setTimeout("pageFadeIn(0);",25);
}
Each of the four functions progresses the fade by one frame, then sets another timeout with the argument incremented, until the end of the animation.
You can see the page at http://adamhaskell.net/cw/index.html (Username: knockknock; Password: goaway) (it has sound and music, which can be disabled, but be warned!) - my JavaScript is very messy since I haven't really organised it properly, but it is commented a bit so hopefully you can see what the general idea is.
Several things:
Your timeout is 25ms. That translates to 40fps, which is a very high frame rate to try to achieve via JavaScript, especially for things involving DOM manipulation that may trigger reflows. Increase it to 50 or 60ms; 15fps should be more than fluid enough for the kinds of animation you're doing. You're not trying to display video here, just move things around the page.
Don't use strings as the first parameter to setTimeout(), especially if you care about performance. That forces JavaScript to recompile the string on each frame of animation. Use a function instead. If you need to pass an argument, use an anonymous function to wrap the function you want to execute:
setTimeout(function(){
pageFadeIn(0)
},50);
this will only get compiled once when the script is loaded.
As mentioned by Ben, it is cheaper to use a single setTimeout to schedule the functions. For that matter, code clarity may improve by using setInterval instead (or it may not; it depends on your coding style).
Additional answer:
Programming JavaScript animation is all about optimisation and compromise. It's possible to animate lots of things on the page with little slow-down, but you need to know how to do it right and decide what to sacrifice. As an example of just how much can be animated at once, there's a demo real-time strategy game I wrote a couple of years ago.
Among the things I did to optimize the game are:
The walking soldiers are made up of only two frames of animation and I simply toggle between the two images. But the effect is very convincing nonetheless. You don't need perfect animation, just one that looks convincing.
I use a single setInterval for everything. It's cheaper CPU-wise and easier to manage. Just decide on a base frame rate and then schedule for different animation to start at different times.
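A rough sketch of that single-interval approach (the tick rate and the animations array are my own illustration): one base tick drives everything, and each animation decides what to do on a given tick.
var frame = 0;
var animations = [];  // each entry: { start: tickNumber, step: function(frame){} }

setInterval(function () {
    frame++;
    for (var i = 0; i < animations.length; i++) {
        if (frame >= animations[i].start) animations[i].step(frame);
    }
}, 66);  // ~15fps base rate, as suggested above

// e.g. start a fade immediately and an arrow scroll one second later
animations.push({ start: 0,  step: function (f) { /* fade step */ } });
animations.push({ start: 15, step: function (f) { /* scroll step */ } });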
Well, that's a lot of JavaScript (despite the "quadruple-dose of awesomeness" :)
You're firing a lot of setTimeout sequences, and I'm not sure how well JS engines are optimised for this, particularly IE <= 8.
Ok, maybe just a rough suggestion... you could write a small timing engine.
Maintain a global object that stores your currently running timed events, with the function to run and the delay...
Then have a single setTimeout handler that checks against that global object, decreases each delay at every iteration, and calls the function when its delay becomes < 0.
Your global events object would look something like this:
var events = {
    fade1 : {
        fn : func_name,
        delay : 25,
        params : {}
    },
    fadeArrow : {
        fn : func_name,
        delay : 500,
        params : {}
    },
    slideArrow : {
        fn : func_name,
        delay : 500,
        params : {
            arrow : some_value
        }
    }
};
Then create a function that loops through these at an interval (maybe 10 or 20 ms), decreases the delays, and fires each function with params as a parameter when its delay runs out (check Function.call for that).
Once done, fire setTimeout again with the same delay.
To cancel an event, just unset the property from the events list.
Build a few methods to add/remove queued events, update params, and so on.
That would reduce everything to just one timeout handler.
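A minimal sketch of that loop, assuming the events object above (the 20ms tick and the remove-once-fired behavior are my assumptions):
var TICK = 20;

function tick() {
    for (var name in events) {
        var ev = events[name];
        ev.delay -= TICK;
        if (ev.delay < 0) {
            ev.fn.call(null, ev.params);  // fire with params, via Function.call
            delete events[name];          // one-shot: unset the property once fired
        }
    }
    setTimeout(tick, TICK);  // fire setTimeout again with the same delay
}

setTimeout(tick, TICK);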
(just an idea)