Several changes to the DOM in the same browser tick cycle - JavaScript

If I add an element to the DOM, are the changes immediate? If I remove the same element in the next line of my code, will the element appear on the screen (for a short period of time)? Or does the display get updated when the current browser cycle ends?

It will NEVER show, no matter how fast your machine is. JavaScript will run to completion, blocking the UI until it's done.
Try this
HTML
<div id='d'></div>
JS
var d = document.getElementById('d');
var p = document.createElement('p');
p.innerText = "One";
d.appendChild(p);
for (var i = 0; i < 1000000; i++) {
  for (var z = 0; z < 10; z++) {
    // nonsense that runs for a second or two to block the JS thread
    // please never do this in production
  }
}
p.innerText = "Two";
This will pause your browser and then show "Two"... never "One".

Obviously, the appearance of elements depends on CPU power, browser algorithms, graphics-card render time, monitor refresh rate, and many other factors. However, the program (e.g. JavaScript) can keep acting on the in-memory DOM elements without errors, even before anything is painted.
On the other hand, the browser may or may not decide to render as each line executes. In my experience, if you run a heavy loop appending items to the body, Opera displays the items one by one, whereas Chrome renders the page only at the end of the loop. However, if you drive the loop with setTimeout, all browsers show the elements appearing one by one.
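A minimal sketch of that setTimeout-driven loop (the item count and delay are arbitrary choices for illustration, and the script is assumed to run after the DOM is ready):
var count = 0;
(function addNext() {
  var p = document.createElement('p');
  p.innerText = "Item " + count;
  document.body.appendChild(p);
  if (++count < 100) {
    // yield to the event loop so the browser gets a chance to paint
    setTimeout(addNext, 0);
  }
})();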

Related

How to delay rendering of DOM element to prevent unresponsiveness

I am fetching a huge list of people (1000) from the server via an API call.
Based on the data I have to render "cards" for each user, containing their name, picture, age, etc.
But when I do that, the browser gets stuck for a while until all the cards are rendered. Is there any way to tell the browser that it's not necessary to render everything at once, that it can do so one by one, without freezing?
Take a look here
If you add an element to the DOM, the page will be repainted. If you add 100 elements one by one, it will be repainted 100 times, and that's slow. Create a single container and add nodes for the items to it before appending it to your page.
var container = $('<div>');
$.each(items, function (index, itm) {
  var el = $('<div>');
  el.attr('id', ID_PREFIX + index);
  el.html(createHtml(itm));
  container.append(el);
});
$('#list-container').append(container);
Something like the above (using jQuery, but you can do fine with plain JS).
(Of course, createHtml is a utility function you can define to get the markup for your item, and items is the array with your data.)
That said, I agree with @Bergi above: do you really need to show all items at the same time? Could you set up a scrolling view that populates as you scroll down?
The next thing you can do: use Ractive.js or React to efficiently manage data-binding.
There are a few options here. The best options are, as others have mentioned, not to render at all:
Use an "infinite scroll" technique to only render what you need. The basic idea is to remove DOM elements that go offscreen and add those that come onscreen by inspecting the scroll position.
Use a different user-driven pagination mechanism.
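As a rough sketch of the scroll-position check behind the infinite-scroll idea (renderRange is a hypothetical function that renders only the given slice of items, and the fixed card height is an assumption):
var ITEM_HEIGHT = 120; // assumed fixed card height, in px
var visibleCount = Math.ceil(window.innerHeight / ITEM_HEIGHT);

window.addEventListener('scroll', function () {
  var first = Math.floor(window.scrollY / ITEM_HEIGHT);
  renderRange(first, first + visibleCount + 1); // render only what is on screen
});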
Barring that, you can get better performance by rendering all in one go, either by constructing and setting innerHTML or by using a document fragment. But with thousands of elements, you'll still get poor performance.
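A minimal sketch of the document-fragment approach (createCard is a hypothetical helper that builds one card element; the container id follows the earlier example):
var fragment = document.createDocumentFragment();
items.forEach(function (item) {
  fragment.appendChild(createCard(item)); // nodes are assembled off-DOM
});
// a single append means a single reflow/repaint
document.getElementById('list-container').appendChild(fragment);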
At this point, you probably want batch processing. This won't be faster, but it will free up the UI so that things aren't locked while you're rendering. A very basic batching approach might look like this:
// things to render
var items = [...];
var index = 0;
var batchSize = 100;
// time between batches, in ms
var batchInterval = 100;

function renderBatch(batch) {
  // rendering logic here
}

function nextBatch() {
  var batch = items.slice(index, index + batchSize);
  renderBatch(batch);
  index += batchSize;
  if (index < items.length) {
    // Render the next batch after a given interval
    setTimeout(nextBatch, batchInterval);
  }
}

// kick off
nextBatch();
It's worth noting, though, that there are limits to this: rendering is a bottleneck, but every DOM element impacts client memory too. With tens of thousands of elements, things will be slow and unresponsive even after rendering is complete, because memory usage is so high.

Why is this for loop blocking if it is called afterwards?

Why does the div[id=box] not get updated until the for loop finishes? If I comment out the for loop, the div displays instantly.
document.getElementById('click').onclick = function() {
  document.getElementById('box').style.display = 'block';
  // loop after element update
  for (var i = 0; i < 2000000000; ++i) {}
};
http://jsfiddle.net/472BU/
Simply put, ALL browser processes (JS, repainting the page, responding to user clicks/key presses, and in most cases page refreshes and changes... even closing the tab) happen on the same thread.
Thankfully this isn't 100% true, 100% of the time, anymore.
Certain browser vendors are working to move different parts of the web platform to different threads for a smoother experience, but typically, if you lock your JS up, you lock everything.
This simply means that the browser won't actually repaint until the JS has finished running and has given control back.
The good news is that this means you can measure elements by unhiding them, grabbing their dimensions, and hiding them again at the end of the function; the width/height they would take up is calculated on the spot. But a large portion of the page might have to be repainted whenever you change an element, so painting 30,000 elements one at a time, as a loop changes them, would be a very bad thing.
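A minimal sketch of that measure-then-rehide trick (the element id is hypothetical, and the element is assumed to start with display: none):
var el = document.getElementById('hidden-box');
el.style.display = 'block';   // unhide
var width = el.offsetWidth;   // reading layout forces a synchronous measure
var height = el.offsetHeight;
el.style.display = 'none';    // hide again before control returns
// nothing is painted in between, because the browser only repaints after JS yields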
The cause is already explained by others. If you want the box to be painted instantly, the solution is simple. Put the loop in a timeout:
document.getElementById('click').onclick = function() {
  document.getElementById('box').style.display = 'block';
  // defer the loop so the browser can repaint the box first
  setTimeout(function () {
    for (var i = 0; i < 2000000000; ++i) {}
  }, 10);
};
jsFiddle
Also check out Web Workers.
That many iterations running continuously will use up all of the browser's resources and it won't get a chance to apply the style change.
Your JavaScript is executed in the order it appears, but behind the scenes there is a queue for rendering style changes. In normal usage you wouldn't notice this behavior, but since you're running a poorly performing loop, it becomes evident.
Problem
It's because JavaScript is single-threaded and will only be able to run that loop.
Anything else will be on hold for as long as the loop lasts. Since the DOM is wired into JavaScript, DOM updates will be blocked as well (in general; in browsers where the DOM runs on a separate thread, an event is generated for the event queue instead, and it remains on hold until the currently executing scope has finished).
Solution
To avoid this you need to split your function into several asynchronous operations (not the same as multi-threaded), which will let the browser process some of the events queued up in the event queue (for example, paint events).
You can do this by splitting up your function to perform the iteration in segments, using an internal mechanism to dispatch batches.
For example:
Live demo
function busyLoop(callback) {
  var segCounter = 0,    /// keep track of segment
      totCounter = 0,    /// keep track of total count
      max = 2000000000,  /// max count
      segment = 1000000; /// segment size (smaller = better response)

  /// invoke first batch
  (function nextBatch() {
    segCounter = 0; /// reset segment counter each time
    for (; segCounter < segment && totCounter < max; segCounter++, totCounter++) {
      /// ...work here...
    }
    if (totCounter < max) {
      /// setTimeout() makes this async; +/- 11 ms gives the browser a
      /// chance to process other events, such as paint events:
      setTimeout(nextBatch, 11);
      /// optional progress callback here
    } else {
      callback();
    }
  })();
}
Then call it with a callback function:
busyLoop(doneFunction);
Notice that you can now interact with the DOM, and get feedback, while it runs.
Tip: the smaller the segments, the more responsive the DOM, but the longer the total run time, as the delays in between accumulate. Experiment to find a balance that suits your solution.
Hope this helps.

Javascript animation with recursion, strange behavior

I'm trying to code a Ruzzle solver in JavaScript. For now it just digs through the board and finds every possible path (in the future I will match them against a dictionary to find the actual valid words).
You can see it here : http://178.239.177.105/ruzzle/
I wanted to do it with an animation that shows how the algorithm works on the board, but I'm running into a problem.
If you load it, the page just doesn't show anything, and my browser crashes after a while.
BUT...
if you put an alert("") somewhere in the middle of the recursive function, you can step through the algorithm.
In particular, if you set the browser to suppress any further alert messages, you'll finally see the animation working on the board.
I was actually trying to do this via setInterval(), but it is not working.
So I have two questions:
- Why does the script cause the page to crash, but not when there's an alert?
- How can I properly show the animation using some kind of wait() mechanism?
Thanks
You can see all the code by going to the page and viewing the source; for the sake of clarity I'll paste the relevant code here:
You can also play with the code here : http://jsfiddle.net/Gcw2U/
(you will have to uncomment the last line to make it run)
// this matrix of chars represents the 4x4 puzzle
var ruzle_model = [["w","a","l","k"],["m","o","o","n"],["h","a","t","e"],["r","o","p","e"]];
// "offsets" represents the four motion vectors (up, down, left, right)
// used to visit the matrix
var offsets = [[1,0],[0,1],[-1,0],[0,-1]];
// recursive function to dig through the maze
function path(m, i, j, paths, checkeds) {
  alert("SET BROWSER TO AVOID NEXT ALERT MSGs!");
  // base case: out of bounds (hit a wall) or cell already checked
  if (!(i <= 3 && i >= 0 && j >= 0 && j <= 3) || isChecked(checkeds, i, j)) {
    terminal.innerHTML = terminal.innerHTML + "-" + paths;
    uncheckAllCells();
    return paths;
  }
  // call path for every direction (up, down, left, right) stored in offsets
  var tmp = [];
  for (var c = 0; c < offsets.length; ++c) {
    var offset = offsets[c];
    checkCells(i, j);
    checkeds.push(new Array(i, j));
    tmp.push(path(m, i + offset[0], j + offset[1], paths + m[i][j], copy(checkeds)));
  }
  return tmp;
}
// call path on every cell in the maze
function ruzzle(r) {
  var sol = [];
  for (var i = 0; i < 4; ++i) {
    for (var j = 0; j < 4; ++j) {
      var checkeds = new Array();
      sol.push(path(r, i, j, '', checkeds));
    }
  }
  terminal.innerHTML = sol;
  return sol;
}
JavaScript loops and recursion inhibit rendering of the page, so any changes made stay invisible until the script stops executing, such as when you spawn an alert. When a user sets "do not show alert messages", the alert still yields execution time to the underlying event loop, which updates the page.
For as-fast-as-possible (high fps) animations, use requestAnimationFrame().
In your case, setTimeout() is the best way to go. Set a timeout on the recursive call to path.
function recursive(args) {
  // do stuff to args
  setTimeout(function () {
    recursive(args);
  }, 5);
}
Example

Most efficient way to throttle continuous JavaScript execution on a web page

I'd like to continuously execute a piece of JavaScript code on a page, spending all the CPU time I can on it, but allowing the browser to remain functional and responsive at the same time.
If I just run my code continuously, it freezes the browser's UI and the browser starts to complain. Right now I pass a zero timeout to setTimeout, which does a small chunk of work and then loops back to setTimeout. This works, but does not seem to utilize all available CPU. Any better ways of doing this you might think of?
Update: To be more specific, the code in question is rendering frames on canvas continuously. The unit of work here is one frame. We aim for the maximum possible frame rate.
Probably what you want is to centralize everything that happens on the page and use requestAnimationFrame to do all your drawing. So basically you would have a function/class that looks something like this (you'll have to forgive some style/syntax errors; I'm used to MooTools classes, so just take this as an outline):
var Main = function () {
  this.queue = [];
  this.actions = {};
  // bind loop so "this" refers to the instance when rAF invokes it
  this.loop = this.loop.bind(this);
  requestAnimationFrame(this.loop);
};

Main.prototype.loop = function () {
  // drain all queued events before drawing
  while (this.queue.length) {
    var e = this.queue.pop();
    this.executeAction(e);
  }
  // do your rendering here
  requestAnimationFrame(this.loop);
};

Main.prototype.addToQueue = function (e) {
  this.queue.push(e);
};

Main.prototype.addAction = function (target, event, callback) {
  if (this.actions[target] === void 0) this.actions[target] = {};
  if (this.actions[target][event] === void 0) this.actions[target][event] = [];
  this.actions[target][event].push(callback);
};

Main.prototype.executeAction = function (e) {
  if (this.actions[e.target] !== void 0 && this.actions[e.target][e.type] !== void 0) {
    for (var i = 0; i < this.actions[e.target][e.type].length; i++) {
      this.actions[e.target][e.type][i](e); // invoke each registered callback
    }
  }
};
So basically you'd use this class to handle everything that happens on the page. Every event handler would be something like onclick='main.addToQueue(event)', or however you want to wire up your events; you just point them at adding the event to the queue, and use Main.addAction to direct those events to whatever you want them to do. This way every user action gets executed as soon as your canvas has finished redrawing and before it gets redrawn again. As long as your canvas renders at a decent frame rate, your app should remain responsive.
EDIT: forgot the "this" in requestAnimationFrame(this.loop)
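A hypothetical usage sketch of the outline above (the button id is made up, and a plain {target, type} object is queued so the string lookup in executeAction matches the key used in addAction):
var main = new Main();
main.addAction('save-button', 'click', function (e) {
  console.log('save clicked', e);
});
document.getElementById('save-button').onclick = function () {
  // queue a plain object keyed the same way addAction stored the callback
  main.addToQueue({ target: 'save-button', type: 'click' });
};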
Web Workers are something to try:
https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers
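A minimal sketch of moving a heavy loop into a worker (worker.js is a hypothetical file name):
// main.js
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
  console.log('worker finished:', e.data);
};
worker.postMessage(2000000000); // iteration count

// worker.js
onmessage = function (e) {
  var sum = 0;
  for (var i = 0; i < e.data; ++i) { sum += i; } // heavy loop, off the UI thread
  postMessage(sum);
};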
You can tune your performance by changing the amount of work you do per invocation. In your question you say you do a "small chunk of work". Establish a parameter which controls the amount of work being done and try various values.
You might also try to set the timeout before you do the processing. That way the time spent processing should count towards any minimum the browsers set.
One technique I use is to keep a counter in my processing loop, counting iterations. Then set up an interval of, say, one second; in its callback, display the counter and reset it to zero. This gives a rough performance figure with which to measure the effect of changes you make.
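A minimal sketch of that counter technique (doUnitOfWork is a hypothetical stand-in for one unit of processing, and the chunk size is the tunable parameter mentioned above):
var iterations = 0;

function processChunk() {
  for (var i = 0; i < 1000; i++) { // tunable chunk size
    doUnitOfWork();
    iterations++;
  }
  setTimeout(processChunk, 0); // yield so the UI stays responsive
}

setInterval(function () {
  console.log(iterations + ' iterations in the last second');
  iterations = 0; // reset for the next sample
}, 1000);

processChunk();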
In general this is likely to be very dependent on specific browsers, even versions of browsers. With tunable parameters and performance measurements you could implement a feedback loop to optimize in real-time.
One can use window.postMessage() to overcome the minimum delay that setTimeout enforces. See this article for details. A demo is available here.
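A minimal sketch of that postMessage trick (the message tag is arbitrary; the linked article maintains a callback queue, which this follows):
var timeouts = [];
var MESSAGE = 'zero-timeout'; // arbitrary tag to recognize our own messages

function setZeroTimeout(fn) {
  timeouts.push(fn);
  window.postMessage(MESSAGE, '*');
}

window.addEventListener('message', function (event) {
  if (event.source === window && event.data === MESSAGE) {
    event.stopPropagation();
    if (timeouts.length) timeouts.shift()(); // run the next queued callback
  }
}, true);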

How to optimally render large amounts of DOM elements using javascript?

On a web page I have a quite large list of items (say, product cards, each containing an image and text), about 1000 of them. I want to filter this list on the client (only the items which are not filtered away should be shown), but there is a rendering performance problem. I apply a very narrow filter so that only 10-20 items remain, then cancel it (so all items have to be shown again), and the browser (Chrome on a very nice machine) hangs for a second or two.
I re-render the list using following routine:
for (var i = 0, l = this.entries.length; i < l; i++) {
  $(this.cls_prefix + this.entries[i].id).css("display", this.entries[i].id in dict ? "block" : "none");
}
dict is a hash of the allowed items' ids.
The function itself runs instantly; it's the rendering that hangs. Is there a more optimal way to re-render than changing the "display" property of DOM elements?
Thanks for your answers in advance.
Why load 1000 items? First you should consider something like pagination, showing around 30 items per page. That way, you are not rendering that much at once.
Then, if you really want to loop over a lot of items, consider using timeouts. Here's a demo I had once that illustrates the consequences of looping: it blocks the UI and causes the browser to lag, especially on long loops. But when using timers, you delay each iteration, allowing the browser to breathe once in a while and do something else before the next iteration starts.
Another thing to note is that you should avoid repaints and reflows, which means avoiding moving elements around and changing styles more often than necessary. Also, remove from the DOM the nodes that are not actually visible. If you don't need to display something, remove it; why waste memory on something that isn't actually seen?
You can use the setTimeout trick to spread the loop iterations across separate event-loop turns and avoid freezing the client. I suspect that the total processing, from start to finish, would last the same amount of time, but at least this way the interface can still be used and the result is a better user experience. Note that each callback needs to capture the current entry (and this); otherwise every timeout would see the loop's final value of i:
var self = this;
for (var i = 0, l = this.entries.length; i < l; i++) {
  // capture the current entry; "i" will have changed by the time the timeout fires
  (function (entry) {
    setTimeout(function () {
      $(self.cls_prefix + entry.id).css("display", entry.id in dict ? "block" : "none");
    }, 0);
  })(this.entries[i]);
}
Dude - the best way to handle "large amounts of DOM elements" is to NOT do it on the client, and/or DON'T use Javascript if you can avoid it.
If there's no better solution (and I'm sure there probably is!), then at LEAST partition your working set down to what you actually need to display at that moment (instead of the whole, big, honkin' enchilada!)
