Most efficient way to throttle continuous JavaScript execution on a web page - javascript

I'd like to continuously execute a piece of JavaScript code on a page, using all the CPU time I can for it, while still keeping the browser functional and responsive.
If I just run my code continuously, it freezes the browser's UI and the browser starts to complain. Right now I pass a zero timeout to setTimeout, which does a small chunk of work and then loops back to setTimeout. This works, but it does not seem to utilize all the available CPU. Can you think of any better ways of doing this?
Update: To be more specific, the code in question is rendering frames on canvas continuously. The unit of work here is one frame. We aim for the maximum possible frame rate.
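For reference, the current approach is roughly this (a sketch; renderFrame stands in for the actual per-frame canvas work):

// Render one frame, then yield to the browser with a zero-delay timeout
// before the next one so the UI stays responsive.
function renderLoop() {
    renderFrame();             // placeholder: draw a single frame on the canvas
    setTimeout(renderLoop, 0); // yield back to the browser
}
renderLoop();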

Probably what you want is to centralize everything that happens on the page and use requestAnimationFrame to do all your drawing. So basically you would have a function/class that looks something like this (you'll have to forgive some style/syntax errors; I'm used to Mootools classes, so just take this as an outline):
var Main = function(){
    this.queue = [];
    this.actions = {};
    // bind so "this" still refers to the Main instance inside the callback
    requestAnimationFrame(this.loop.bind(this));
}
Main.prototype.loop = function(){
    // drain all queued events before rendering this frame
    while (this.queue.length){
        var action = this.queue.pop();
        this.executeAction(action);
    }
    // do your rendering here
    requestAnimationFrame(this.loop.bind(this));
}
Main.prototype.addToQueue = function(e){
    this.queue.push(e);
}
Main.prototype.addAction = function(target, event, callback){
    if (this.actions[target] === void 0) this.actions[target] = {};
    if (this.actions[target][event] === void 0) this.actions[target][event] = [];
    this.actions[target][event].push(callback);
}
Main.prototype.executeAction = function(e){
    if (this.actions[e.target] !== void 0 && this.actions[e.target][e.type] !== void 0){
        for (var i = 0; i < this.actions[e.target][e.type].length; i++){
            this.actions[e.target][e.type][i](e);
        }
    }
}
So basically you'd use this class to handle everything that happens on the page. Every event handler would be onclick='Main.addToQueue(event)' (or however you want to wire up events on your page); you just point them at adding the event to the queue, and use Main.addAction to direct those events to whatever you want them to do. This way every user action gets executed as soon as your canvas has finished redrawing and before it gets redrawn again. As long as your canvas renders at a decent framerate, your app should remain responsive.
EDIT: forgot the "this" in requestAnimationFrame(this.loop)
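For concreteness, a rough usage sketch (hypothetical element and handler; note that the outline above uses e.target itself as the lookup key, so the same element reference is used for registration and dispatch):

var main = new Main();
var startButton = document.getElementById('start'); // hypothetical element

// Point the DOM event at the queue rather than doing work in the handler.
startButton.onclick = function(e) {
    main.addToQueue(e);
};

// Register what should happen for that element/event pair; it will run
// inside the next loop() tick, just before the canvas is redrawn.
main.addAction(startButton, 'click', function(e) {
    // update state here
});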

Web workers are something to try:
https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers
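A minimal sketch of that idea (worker.js, computeFrame and drawFrame are hypothetical names; the main thread still has to do the actual canvas drawing):

// main.js: send work to the worker, draw results when they come back
var worker = new Worker('worker.js');
worker.onmessage = function(e) {
    drawFrame(e.data);      // hypothetical: paint the computed frame data
    worker.postMessage({}); // ask for the next frame's data
};
worker.postMessage({});     // kick things off

// worker.js: runs off the main thread, so it can compute continuously
onmessage = function(e) {
    var frameData = computeFrame(); // hypothetical heavy computation
    postMessage(frameData);
};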

You can tune your performance by changing the amount of work you do per invocation. In your question you say you do a "small chunk of work". Establish a parameter which controls the amount of work being done and try various values.
You might also try to set the timeout before you do the processing. That way the time spent processing should count towards any minimum the browsers set.
One technique I use is to keep a counter of iterations in my processing loop. Then set up an interval of, say, one second; in its callback, display the counter and reset it to zero. This gives a rough performance value with which to measure the effects of changes you make.
In general this is likely to be very dependent on specific browsers, even versions of browsers. With tunable parameters and performance measurements you could implement a feedback loop to optimize in real-time.
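As a concrete sketch of the tunable-chunk idea (chunkSize and doUnitOfWork are hypothetical names; the one-second interval is the measurement counter described above):

var chunkSize = 500;   // tunable: units of work per invocation
var iterations = 0;    // rough throughput counter

function workLoop() {
    setTimeout(workLoop, 0);            // schedule the next batch before working
    for (var i = 0; i < chunkSize; i++) {
        doUnitOfWork();                 // placeholder for the real work
        iterations++;
    }
}

// Report iterations per second so the effect of changing chunkSize is visible.
setInterval(function() {
    console.log(iterations + ' iterations/s');
    iterations = 0;
}, 1000);

workLoop();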

One can use window.postMessage() to overcome the limitation on the minimum amount of time setTimeout enforces. See this article for details. A demo is available here.
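The technique looks roughly like this (a sketch of the commonly used "setZeroTimeout" pattern, not the article's exact code):

// Queue of pending callbacks; postMessage lets us schedule them without
// the minimum delay browsers enforce on setTimeout(fn, 0).
var zeroTimeoutQueue = [];

function setZeroTimeout(fn) {
    zeroTimeoutQueue.push(fn);
    window.postMessage('zero-timeout', '*');
}

window.addEventListener('message', function(e) {
    if (e.source === window && e.data === 'zero-timeout') {
        e.stopPropagation();
        if (zeroTimeoutQueue.length) zeroTimeoutQueue.shift()();
    }
}, true);

// Usage: behaves like setTimeout(fn, 0) but without the clamped minimum.
setZeroTimeout(function() { /* next chunk of work */ });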

Related

is setInterval slowing down my site

I want to know whether setInterval is slowing down my site or not.
setInterval(function(){
    var uploadbtndiv = document.getElementById("imagesmaindiv");
    if (uploadbtndiv.childElementCount == 1) {
        document.getElementsByClassName("plusupload")[0].style.top = "17px";
    } else {
        document.getElementsByClassName("plusupload")[0].style.top = "-81px";
    }
}, 10);
setInterval doesn't slow down your site. Using it incorrectly can. In your code, you're scheduling an operation to happen roughly every 10ms. That's a lot. Even an efficient operation (and yours is tolerably efficient, though it could be more so) done 100 times a second can add up.
You probably don't want setInterval in your example. You appear to want to change where something is depending on how many elements there are in imagesmaindiv. I'd probably do that one of three different ways:
By putting that if/else in the code that adds/removes elements to/from imagesmaindiv
By using CSS, but it depends on the structure
By using a mutation observer on imagesmaindiv, so I only do the work when its contents change instead of 100 times a second
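A sketch of the mutation observer option, assuming the ids and classes from the question:

var container = document.getElementById('imagesmaindiv');
var plus = document.getElementsByClassName('plusupload')[0];

function positionPlusButton() {
    plus.style.top = (container.childElementCount == 1) ? '17px' : '-81px';
}

// Run the check only when the container's children actually change,
// instead of polling every 10ms.
new MutationObserver(positionPlusButton).observe(container, { childList: true });
positionPlusButton(); // set the initial position once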

Waiting for animations in a SetInterval gameloop

So my problem is I want to insert a custom animation but I don't want to ruin my gameloop.
My initial gameloop is stated here:
function init(){
    if(!gameOver){
        if(resetInterval > -1) clearInterval(resetInterval);
        createBlock();
        resetInterval = setInterval(moveDownCheck, gameSpeed);
    }
}
My game is a Tetris-like game, except instead of dropping tetriminos I drop 2x1 blocks of different colors. The moveDownCheck method checks whether there are any blocks under my 2x1 block and then drops it by one row. This works fine until a block ends up hanging with nothing underneath it, which can happen because the two cells of a 2x1 block are connected. I want to insert a drop animation that would take about a second and drop the hanging block by the same gameSpeed increment.
Here is my attempt that doesn't work:
function moveFallingDown(){
    fbDownFlag = false;
    clearInterval(resetInterval);
    fbInterval = setInterval(function(){
        fallingBlock.row++;
        console.log("Dropped One Row");
    }, gameSpeed);
    while(landscape[fallingBlock.row+1][fallingBlock.col] == 0){
        console.log("Waiting to Drop Falling Block");
    }
    clearInterval(fbInterval);
    resetInterval = setInterval(moveDownCheck, gameSpeed);
}
Here I am attempting to wait for the function(){fallingBlock.row++;} callback, but my game just crashes: in the console, "Dropped One Row" never appears, while "Waiting to Drop Falling Block" is printed thousands of times.
I guess I shouldn't be using a while loop here, but the only other solution I can think of would be a complete rework of my design, or nested setInterval methods which would just make my head hurt too much.
You can't do this with a while loop; you need to use a recursive function. window.setTimeout would work, but this seems like a good use case for requestAnimationFrame. Check it out here: https://developer.mozilla.org/en-US/docs/Web/API/window.requestAnimationFrame
You can use that to call your moveFallingDown method and check how much time has passed since the last animation frame, so you can move your animation the right amount according to the game speed, using the high-precision timestamp passed to the requestAnimationFrame callback.
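A rough sketch of that approach, reusing the names from the question (fallingBlock, landscape, gameSpeed, moveDownCheck, resetInterval):

var lastDrop = null;

function dropStep(timestamp) {
    if (lastDrop === null) lastDrop = timestamp;

    // Advance one row for every gameSpeed milliseconds that have elapsed.
    if (timestamp - lastDrop >= gameSpeed) {
        fallingBlock.row++;
        lastDrop = timestamp;
    }

    if (landscape[fallingBlock.row + 1][fallingBlock.col] == 0) {
        requestAnimationFrame(dropStep); // still falling, keep animating
    } else {
        resetInterval = setInterval(moveDownCheck, gameSpeed); // hand control back to the game loop
    }
}

requestAnimationFrame(dropStep);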
@Adrien Delessert's advice is correct, but I'll just add that you're definitely confusing setTimeout and setInterval.
First of all you don't need a while loop.
setInterval IS a loop. So, if you wanted to use it you'd need to wrap (basically) the whole game in a method that moves the game forward (whatever that means) and pass that to setInterval.
However, what you're doing (and this is actually not a terrible approach) is to use it as a means to animate specific things. In that case you should have a recursive(ish) function that keeps calling setTimeout when it's done, if the conditions for another round of animation are met.
I haven't ever used requestAnimationFrame, but that does sound like a much more elegant approach to the problem.
The reason it's better is that it leverages the browser's own refresh timer (about 60 times per second) and will slot your animation frames in along with its own refresh.
So yes, you listen for that callback and then react to it as necessary. If 60 times per second is too fast, you'll need to put in a modulo-based counter that decides which of those frames you actually react to.
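For example (a sketch; updateAnimation is a placeholder, and acting on every third frame gives roughly 20 updates per second):

var frameCount = 0;

function tick() {
    frameCount++;
    if (frameCount % 3 === 0) {
        updateAnimation(); // hypothetical: advance the animation one step
    }
    requestAnimationFrame(tick);
}

requestAnimationFrame(tick);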

Why is this for loop blocking if it is called afterwards?

Why does the div[id=box] not get updated until the for loop finishes? If I comment out the for loop, the div displays instantly.
document.getElementById('click').onclick = function() {
    document.getElementById('box').style.display = 'block';
    // loop after element update
    for (var i = 0; i < 2000000000; ++i) {}
};
http://jsfiddle.net/472BU/
Simply put: all browser work (JS, repainting the page, responding to user clicks and key presses, in most cases page navigation, even closing the tab) happens on the same thread.
Thankfully this isn't 100% true, 100% of the time, anymore.
Browser vendors are working to move different parts of the web platform onto different threads for a smoother experience, but typically, if you lock up your JS, you lock up everything.
This simply means that the browser won't actually repaint until the JS has finished running and control is handed back to the DOM.
The good news is that it means you can measure elements by unhiding them, grabbing their dimensions and hiding them again, at the end of the function. The width/height that they would take up is calculated on the spot, but a large portion of the page might have to be painted if you change an element, so if it's possible to change 30000 elements in a loop, then painting them all as it happens would be a very bad thing.
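For example, a measurement helper like this (a sketch; measureHidden is a hypothetical name) works precisely because no repaint happens before the function returns:

function measureHidden(el) {
    el.style.display = 'block';   // unhide just long enough to measure
    var width = el.offsetWidth;   // forces layout, but no paint occurs yet
    var height = el.offsetHeight;
    el.style.display = 'none';    // hide again before returning
    return { width: width, height: height };
}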
The cause is already explained by others. If you want the box to be painted instantly, the solution is simple. Put the loop in a timeout:
document.getElementById('click').onclick = function() {
    document.getElementById('box').style.display = 'block';
    // no delay anymore
    setTimeout(function(){ for (var i = 0; i < 2000000000; ++i) {} }, 10);
};
jsFiddle
Also check web workers
That amount of iterations running continuously will use up all of the browser's resources and it won't be able to worry with applying styles.
Your javascript is executed in the order it appears, but behind the scenes there is a queue for rendering style changes. In normal usage you wouldn't notice this behavior, but since you're running a poorly performing loop, it becomes evident.
Problem
It's because JavaScript is single-threaded and will only be able to run that loop.
Anything else will be on hold for as long as the loop lasts. As the DOM is wired into the JavaScript the DOM will be blocked as well (in general, except in browsers where DOM runs on a separate thread and will generate an event for the event queue instead which will be on hold until the current executing scope has finished).
Solution
To avoid this you need to split your functions into several asynchronous operations (not the same as multi-threaded) which will enable the browser to invoke some of the events queued up in the event queue (for example paint events).
You can do this by splitting up your function to perform iteration in segments using an inner mechanism to dispatch batches instead.
For example:
Live demo
function busyLoop(callback) {
    var segCounter = 0,      /// keep track of segment
        totCounter = 0,      /// keep track of total count
        max = 2000000000,    /// max count
        segment = 1000000;   /// segment size (smaller = better response)

    /// invoke first batch
    (function nextBatch() {
        segCounter = 0;      /// reset segment counter for each batch
        for (; segCounter < segment && totCounter <= max; segCounter++, totCounter++) {
            ///...work here...
        }
        if (totCounter < max) {
            /// call setTimeout() which makes it async, +/- 11ms gives browser
            /// chance to process other events such as paint events:
            setTimeout(nextBatch, 11);
            /// optional progress callback here
        } else {
            callback();
        }
    })();
}
Then call it with a callback function:
busyLoop(doneFunction);
Notice that you can now interact with DOM as well as getting feedback.
Tip: the smaller the segments, the more responsive the DOM, but the longer the total time, since the delays in between accumulate. Experiment to find a balance that suits your solution.
Hope this helps.

Why do multiple setTimeout() calls cause so much lag?

I have a complex animation sequence involving fades and transitions in JavaScript. During this sequence, which consists of four elements changing at once, a setTimeout is used on each element.
Tested in Internet Explorer 9, the animation works at realtime speed (it should take 1.6 seconds and it took exactly 1.6 seconds). ANY other browser will lag horribly, with animation times of 4 seconds (Firefox 3 and 4, Chrome, Opera) and something like 20 seconds in IE 8 and below.
How can IE9 go so fast while all other browsers are stuck in the mud?
I have tried to find ways of merging the elements into one, so as to one have one setTimeout at any given time, but unfortunately it wouldn't stand up to any interference (such as clicking a different link to start a new animation before the current one has finished).
EDIT: To elaborate in response to comments, here's the outline of the code:
link.onclick = function() {
    clearTimeout(colourFadeTimeout);
    colourFadeTimeout = setTimeout("colourFade(0);", 25);
    clearTimeout(arrowScrollTimeout);
    arrowScrollTimeout = setTimeout("arrowScroll(0);", 25);
    clearTimeout(pageFadeOutTimeout);
    pageFadeOutTimeout = setTimeout("pageFadeOut(0);", 25);
    clearTimeout(pageFadeInTimeout);
    pageFadeInTimeout = setTimeout("pageFadeIn(0);", 25);
}
Each of the four functions progress the fade by one frame, then set another timeout with the argument incremented, until the end of the animation.
You can see the page at http://adamhaskell.net/cw/index.html (Username: knockknock; Password: goaway) (it has sound and music, which can be disabled, but be warned!) - my JavaScript is very messy since I haven't really organised it properly, but it is commented a bit so hopefully you can see what the general idea is.
Several things:
Your timeout is 25ms. That translates to 40fps, which is a very high framerate to try to achieve via javascript, especially for things involving DOM manipulation that may trigger reflows. Increase it to 50 or 60ms; 15fps should be more than fluid enough for the kind of animation you're doing. You're not trying to display video here, just move things around the page.
Don't use strings as the first parameter to setTimeout(), especially if you care about performance: the string has to be re-evaluated on every frame of the animation. Use a function instead. If you need to pass an argument, wrap the call in an anonymous function:
setTimeout(function(){
    pageFadeIn(0);
}, 50);
this will only get compiled once when the script is loaded.
As mentioned by Ben, it is cheaper to use a single setTimeout to schedule all the functions. For that matter, code clarity may improve by using setInterval instead (or it may not; that depends on your coding style).
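For instance, a single-timer version might look roughly like this (a sketch only; the frame counter and the end condition are hypothetical, while the four functions and link come from the question):

var frame = 0;
var animating = false;

link.onclick = function() {
    frame = 0;          // restart the sequence on each click
    animating = true;
};

// One interval drives all four animations; each function gets the current frame.
setInterval(function() {
    if (!animating) return;
    colourFade(frame);
    arrowScroll(frame);
    pageFadeOut(frame);
    pageFadeIn(frame);
    frame++;
    if (frame > 64) animating = false; // 64 is an arbitrary last-frame index
}, 50);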
Additional answer:
Programming javascript animation is all about optimisation and compromise. It's possible to animate lots of things on the page with little slow-down, but you need to know how to do it right and decide what to sacrifice. As an example of just how much can be animated at once, see a demo real-time strategy game I wrote a couple of years ago.
Among the things I did to optimize the game are:
The walking soldiers are made up of only two frames of animation and I simply toggle between the two images. But the effect is very convincing nonetheless. You don't need perfect animation, just one that looks convincing.
I use a single setInterval for everything. It's cheaper CPU-wise and easier to manage. Just decide on a base frame rate and then schedule for different animation to start at different times.
Well, that's a lot of javascript (despite the "quadruple-dose of awesomeness" :)
You're firing a lot of setTimeout sequences, and I'm not sure how well JS engines are optimised for this, particularly IE <= 8.
Ok, maybe just a rough suggestion... You could maybe write a small timing engine.
Maintain a global object that stores your current running timed events with the function to run, and the delay...
Then have a single setTimeout handler that checks against that global object, decreases each delay on every iteration, and calls the function when its delay drops below 0.
Your global events object would look something like this:
var events = {
    fade1 : {
        fn : func_name,
        delay : 25,
        params : {}
    },
    fadeArrow : {
        fn : func_name,
        delay : 500,
        params : {}
    },
    slideArrow : {
        fn : func_name,
        delay : 500,
        params : {
            arrow : some_value
        }
    }
}
Then create a function that loops through these at an interval (maybe 10 or 20ms), decreases the delays, and when one completes fires its function with params as a parameter (check Function.call for that).
Once done, fire setTimeout again with the same delay.
To cancel an event, just unset its property from the events list.
Build a few methods to add/remove queued events, update params and so on.
That would reduce everything to just one timeout handler.
(just an idea)
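A minimal sketch of such a tick loop, assuming the events object above (tick and tickInterval are hypothetical names):

var tickInterval = 20; // base resolution in ms

(function tick() {
    for (var name in events) {
        var ev = events[name];
        ev.delay -= tickInterval;
        if (ev.delay <= 0) {
            ev.fn.call(null, ev.params); // fire the scheduled function
            delete events[name];         // or reset ev.delay to re-queue it
        }
    }
    setTimeout(tick, tickInterval);      // a single timeout drives everything
})();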

How to tell what's causing slow HTML5 Canvas performance?

How can I tell if the canvas's slow performance is caused by the drawing itself, or the underlying logic that calculates what should be drawn and where?
The second part of my question is: how do I calculate the canvas FPS? Here's how I did it; it seems logical to me, but I could be completely wrong. Is this the right way to do it?
var fps = 0;
setInterval(draw, 1000/30);
setInterval(checkFps, 1000);

function draw() {
    //...
    fps++;
}

function checkFps() {
    $("#fps").html(fps);
    fps = 0;
}
Edit:
I replaced the above with the following, according to Nathan's comments:
var lastTimeStamp = new Date().getTime();

function draw() {
    //...
    var now = new Date().getTime();
    $("#fps").html(Math.floor(1000/(now - lastTimeStamp)));
    lastTimeStamp = now;
}
So how's this one? You could also report only the difference in ms since the last update; performance differences can be seen that way too. By the way, I did a side-by-side comparison of the two, and they usually moved pretty much together (a difference of 2 at most), but the latter had bigger spikes when performance was extraordinarily low.
Your FPS code is definitely wrong
setInterval(checkFps, 1000);
Nothing guarantees that this function will be called exactly every second (it could be more than 1000ms, or less, but probably more), so
function checkFps() {
    $("#fps").html(fps);
    fps = 0;
}
is wrong (if fps is 32 at that moment, you might actually have had those 32 frames over 1.5s in an extreme case).
It's better to check how much real time has passed since the last update and calculate realTimePassed / frames (I'm sure javascript has a function to get the current time, though I'm not sure whether it's accurate enough, i.e. millisecond precision or better).
By the way, fps is not a good name: it holds the number of frames since the last update, not the number of frames per second, so frames would be a better name.
In the same way
setInterval(draw, 1000/30);
is also problematic: you want to achieve an FPS of 30, but setInterval is not very accurate (it will probably wait longer than you ask for), so you will end up with a lower FPS even if the CPU can handle the load.
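A sketch of the frame-based measurement suggested above (performance.now() is used here for millisecond-or-better precision; new Date().getTime() works too):

var frames = 0;
var lastReport = performance.now();

function draw() {
    //...render the frame...
    frames++;

    var now = performance.now();
    if (now - lastReport >= 1000) {
        // Divide by the real elapsed time instead of assuming exactly one second.
        var fps = frames / ((now - lastReport) / 1000);
        $("#fps").html(Math.round(fps));
        frames = 0;
        lastReport = now;
    }
}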
Webkit and Firebug both provide profiling tools to see where CPU cycles are being spent in your javascript code. I'd recommend starting there.
For the FPS calculation, I don't think your code is going to work, but I don't have any good recommendation :(
Reason being: Most (all?) browsers use a dedicated thread for running javascript and a different thread for running UI updates. If the Javascript thread is busy, the UI thread won't be triggered.
So, you can run some javascript looping code that'll "update" the UI 1000 times in succession (for instance, setting the color of some text) - but unless you add a setTimeout to allow the UI thread to paint the change, you won't see any changes until the 1000 iterations are finished.
That said, I don't know if you can assertively increment your fps counter at the end of the draw() routine. Sure, your javascript function has finished, but did the browser actually draw?
Check whether you're using some innerHTML method to debug your project. That can slow your project down more than you'd imagine, especially if you do concatenation like innerHTML += newDebugValues;
Or, as desau said, profile your CPU usage with Firebug or the WebKit debugger.
