How to optimally render large amounts of DOM elements using JavaScript?

On a web page I have a fairly large list of items (say, product cards, each containing an image and text) - about 1000 of them. I want to filter this list on the client (only the items that are not filtered away should be shown), but there is a rendering performance problem. If I apply a very narrow filter so that only 10-20 items remain and then cancel it (so all items have to be shown again), the browser (Chrome on a very nice machine) hangs for a second or two.
I re-render the list using the following routine:
for (var i = 0, l = this.entries.length; i < l; i++) {
    $(this.cls_prefix + this.entries[i].id).css("display", this.entries[i].id in dict ? "block" : "none");
}
dict is the hash of allowed items' ids
This function itself runs instantly; it's the rendering that hangs. Is there a more efficient way to re-render than changing the "display" property of the DOM elements?
Thanks for your answers in advance.

Why load 1000 items at once? First, consider something like pagination, showing around 30 items per page; that way you are not rendering that much at a time.
Then, if you really do need to loop over a lot of items, consider using timeouts. Here's a demo I had once that illustrates the consequences of looping: a long synchronous loop blocks the UI and will make the browser lag. When you use timers instead, you delay each iteration, allowing the browser to breathe once in a while and do something else before the next iteration starts.
Another thing to note is that you should avoid repaints and reflows, which means avoiding moving elements around and changing styles more often than necessary. A further tip is to remove from the DOM the nodes that are not actually visible (a sketch of this follows below). If you don't need to display something, remove it; why waste memory on something that isn't actually seen?
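A minimal sketch of that "keep only the visible nodes" idea, applied to the filter from the question. It assumes each entry keeps a reference to its card node (entry.node) and that listContainer is the list's parent element; both names are hypothetical:
var fragment = document.createDocumentFragment();
for (var i = 0; i < entries.length; i++) {
    if (entries[i].id in dict) {
        // moving a node into the fragment detaches it from the page
        fragment.appendChild(entries[i].node);
    }
}
listContainer.innerHTML = '';        // drops the cards that were filtered out
listContainer.appendChild(fragment); // one append, one reflow, order preserved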

You can use the setTimeout trick: instead of doing all the style changes in one long, blocking run, each update is queued as its own task, so the browser stays responsive in between. I suspect that the total processing - from start to finish - would take about the same amount of time, but at least this way the interface can still be used and the result is a better user experience:
var self = this;
for (var i = 0, l = this.entries.length; i < l; i++) {
    (function (entry) {
        // capture the current entry and the outer "this" for the callback
        setTimeout(function () {
            $(self.cls_prefix + entry.id).css("display", entry.id in dict ? "block" : "none");
        }, 0);
    })(this.entries[i]);
}
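Note that this schedules one callback per item; grouping a handful of items per timeout (as in the batching example further down the page) keeps the callback overhead lower while still letting the browser breathe.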

Dude - the best way to handle "large amounts of DOM elements" is to NOT do it on the client, and/or DON'T use JavaScript if you can avoid it.
If there's no better solution (and I'm sure there probably is!), then at LEAST partition your working set down to what you actually need to display at that moment (instead of the whole, big, honkin' enchilada!)

Related

Is setInterval slowing down my site?

I want to know whether setInterval is slowing down my site or not.
setInterval(function(){
    var uploadbtndiv = document.getElementById("imagesmaindiv");
    if (uploadbtndiv.childElementCount == 1) {
        document.getElementsByClassName("plusupload")[0].style.top = "17px";
    } else {
        document.getElementsByClassName("plusupload")[0].style.top = "-81px";
    }
}, 10);
setInterval doesn't slow down your site. Using it incorrectly can. In your code, you're scheduling an operation to happen roughly every 10ms. That's a lot. Even an efficient operation (and yours is tolerably efficient, though it could be more so) done 100 times a second can add up.
You probably don't want setInterval in your example. You appear to want to change where something is depending on how many elements there are in imagesmaindiv. I'd probably do that one of three different ways:
By putting that if/else in the code that adds/removes elements to/from imagesmaindiv
By using CSS, but it depends on the structure
By using a mutation observer on imagesmaindiv, so I only do the work when its contents change instead of 100 times a second
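A minimal sketch of the third option (a MutationObserver), using the same element ids/classes as the code in the question; it re-checks the position only when the children of imagesmaindiv actually change:
var uploadbtndiv = document.getElementById("imagesmaindiv");
var plusupload = document.getElementsByClassName("plusupload")[0];
function positionPlusUpload() {
    plusupload.style.top = uploadbtndiv.childElementCount == 1 ? "17px" : "-81px";
}
new MutationObserver(positionPlusUpload).observe(uploadbtndiv, { childList: true });
positionPlusUpload(); // set the initial position once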

Several changes to the DOM in the same browser tick

If I add an element to the DOM, are the changes immediate? If I remove the same element in the next line of my code, will the element appear on the screen (for a short period of time)? Or does the display get updated when the current browser cycle ends?
It will NEVER show, no matter how fast your machine is. JavaScript runs to completion, blocking the UI until it's done.
Try this
HTML
<div id='d'></div>
JS
var d = document.getElementById('d');
var p = document.createElement('p');
p.innerText = "One";
d.appendChild(p);
for (var i = 0; i < 1000000; i++) {
    for (var z = 0; z < 10; z++) {
        // this is nonsense that runs for a sec or two to block the JS thread
        // please never do this in production
    }
}
p.innerText = "Two";
will pause your browser and then show Two ... never One
Obviously, whether and when elements appear on screen depends on CPU power, browser algorithms, graphics card render time, monitor refresh rate and many other factors. As far as your program (the JavaScript) is concerned, though, the DOM changes take effect immediately and it can keep working with those elements without errors, even if nothing has been painted yet.
On the other hand, the browser may decide to paint as it goes or only at the end. In my experience, if you run a heavy loop appending items to the body, Opera displays the items one by one while Chrome renders the page only after the loop finishes. However, if you drive the loop with setTimeout, in all browsers you will see the elements appearing one by one.
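A small demo of that last point, appending each item from a setTimeout callback so the browser gets a chance to paint between items (plain JS, no assumptions beyond a loaded document):
var count = 0;
function appendNext() {
    var p = document.createElement('p');
    p.innerText = 'item ' + count;
    document.body.appendChild(p);
    count++;
    if (count < 1000) {
        setTimeout(appendNext, 0); // yield so the browser can paint
    }
}
appendNext();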

Opening 500+ nodes at once in d3.js

Currently I am trying to expand a d3.js tree which contains over 100,000 nodes. Many leaves exist under multiple parents, as they fit multiple sections/items/regions. A search performed by the user results in the tree opening up to all leaves with that node ID. This can mean the graph tries to open, on rare occasions, up to 2000 leaf nodes at once. Currently, I have found the only way to do this without crashing Chrome is to use the following setInterval code.
var i = 0; // batch cursor (presumably initialised elsewhere in the original code)
var timeout = setInterval(function(){
    for (var j = i; j < i + 10; j++){
        makeEl(d[j]);
        Search.rules += (j+1) + ") " + Graph.findNode(d[j]) + "<br><br>";
        highlightPathTo(d[j].id);
        if (j >= d.length - 1){
            // When all of the elements have been iterated through.
            clearInterval(timeout);
            highlight.selected = d;
            $('#highlights').removeClass('empty');
            break;
        }
    }
    i += 10;
}, 500);
However, this takes minutes and is very laggy. Is there any other way to accomplish opening this number of nodes in one go that would result in quicker completion?
Not likely. JavaScript and browsers can't magically transcend the limits of physics (or of your computer). Browsers have come a long way in handling huge documents, but there are limits. I don't know for sure how much memory each tree node needs, but if each of them takes just 1 KB, we're talking about 100 MB of raw data which needs to be rendered on the screen. If each node takes just 10 ms to render, simply drawing the page will take 1000 seconds.
So my guess without looking more closely at the problem is that you can't do it. And you probably shouldn't: The human brain isn't able to process that much information nor is there a computer screen which could display it all. Think harder about what you really want to achieve and find a better representation.
You can dump a ton of data on your poor user but that will just drown them. Find a way to present just the important bits, the few gems under the ton of garbage.

How to delay rendering of DOM element to prevent unresponsiveness

I am fetching a huge list of people (1000) from the server via an API call.
Based on that data I have to render a "card" for each user, containing their name, picture, age, etc.
But when I do that, the browser gets stuck for a while until all the cards are rendered. Is there any way to tell the browser that it's not necessary to render everything at once, that it can do so one by one, without crashing itself?
Take a look here
If you add an element to the DOM, the page will be repainted. If you add 100 elements one by one, it will be repainted 100 times, and that's slow. Create a single container and add nodes for the items to it before appending it to your page.
var container = $('<div>');
$.each(items, function (index, itm) {
    var el = $('<div>');
    el.attr('id', ID_PREFIX + index);
    el.html(createHtml(itm));
    container.append(el);
});
$('#list-container').append(container);
Something like the above (using jQuery, but you can be fine with plain JS).
(Of course createHtml is a utility function you can define to get the markup for your item, and items is the array with your data.)
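For reference, a plain-JS equivalent of the snippet above using a DocumentFragment, so the live page is touched only once (same createHtml and ID_PREFIX as before):
var fragment = document.createDocumentFragment();
items.forEach(function (itm, index) {
    var el = document.createElement('div');
    el.id = ID_PREFIX + index;
    el.innerHTML = createHtml(itm);
    fragment.appendChild(el);
});
document.getElementById('list-container').appendChild(fragment);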
That said, I agree with @Bergi above: do you really need to show all the items at the same time? Could you set up a scrolling view that populates as you scroll down?
The next thing you can do is use RactiveJS or React to manage data-binding efficiently.
There are a few options here. The best options are, as others have mentioned, not to render at all:
Use an "infinite scroll" technique to only render what you need. The basic idea is to remove DOM elements that go offscreen and add those that come onscreen by inspecting the scroll position.
Use a different user-driven pagination mechanism.
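A minimal windowing sketch of the first option, assuming uniformly sized cards (ITEM_HEIGHT), a scrollable container element with id "list", a hypothetical renderCard(item) helper, and items as the data array:
var ITEM_HEIGHT = 120; // px, assumed uniform card height
var container = document.getElementById('list'); // must have a fixed height and overflow: auto
var spacer = document.createElement('div'); // keeps the scrollbar the right size
spacer.style.position = 'relative';
spacer.style.height = (items.length * ITEM_HEIGHT) + 'px';
container.appendChild(spacer);
function renderVisible() {
    var first = Math.floor(container.scrollTop / ITEM_HEIGHT);
    var count = Math.ceil(container.clientHeight / ITEM_HEIGHT) + 1;
    spacer.innerHTML = ''; // drop the cards that scrolled out of view
    for (var i = first; i < Math.min(first + count, items.length); i++) {
        var card = renderCard(items[i]); // hypothetical: builds one card element
        card.style.position = 'absolute';
        card.style.top = (i * ITEM_HEIGHT) + 'px';
        spacer.appendChild(card);
    }
}
container.addEventListener('scroll', renderVisible);
renderVisible();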
Barring that, you can get better performance by rendering all in one go, either through constructing and setting innerHTML or by using a document fragment. But with 1000's of elements, you'll still get poor performance.
At this point, you probably want batch processing. This won't be faster, but it will free up the UI so that things aren't locked while you're rendering. A very basic batching approach might look like this:
// things to render
var items = [ /* ... */ ];
var index = 0;
var batchSize = 100;
// time between batches, in ms
var batchInterval = 100;

function renderBatch(batch) {
    // rendering logic here
}

function nextBatch() {
    var batch = items.slice(index, index + batchSize);
    renderBatch(batch);
    index += batchSize;
    if (index < items.length) {
        // Render the next batch after a given interval
        setTimeout(nextBatch, batchInterval);
    }
}

// kick off
nextBatch();
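For completeness, here is one hedged way the "rendering logic here" placeholder could be filled in, assuming a hypothetical createCard(item) helper that returns a DOM node and a container with id "list-container"; a fragment keeps it to one reflow per batch:
function renderBatch(batch) {
    var fragment = document.createDocumentFragment();
    for (var i = 0; i < batch.length; i++) {
        fragment.appendChild(createCard(batch[i]));
    }
    document.getElementById('list-container').appendChild(fragment);
}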
It's worth noting, though, that there are limits to this - rendering is a bottleneck, but every DOM element is going to impact client memory too. With 10s of 1000s of elements, things will be slow and unresponsive even after rendering is complete, because the memory usage is so high.

Most efficient way to throttle continuous JavaScript execution on a web page

I'd like to continuously execute a piece of JavaScript code on a page, spending all the CPU time I can on it, while keeping the browser functional and responsive at the same time.
If I just run my code continuously, it freezes the browser's UI and browser starts to complain. Right now I pass a zero timeout to setTimeout, which then does a small chunk of work and loops back to setTimeout. This works, but does not seem to utilize all available CPU. Any better ways of doing this you might think of?
Update: To be more specific, the code in question is rendering frames on canvas continuously. The unit of work here is one frame. We aim for the maximum possible frame rate.
Probably what you want is to centralize everything that happens on the page and use requestAnimationFrame to do all your drawing. So basically you would have a function/class that looks something like this (you'll have to forgive some style/syntax errors; I'm used to Mootools classes, so just take this as an outline):
var Main = function(){
    this.queue = [];
    this.actions = {};
    // bind so "this" still refers to Main when the browser invokes loop
    requestAnimationFrame(this.loop.bind(this));
};
Main.prototype.loop = function(){
    while (this.queue.length){
        var action = this.queue.pop();
        this.executeAction(action);
    }
    // do your rendering here
    requestAnimationFrame(this.loop.bind(this));
};
Main.prototype.addToQueue = function(e){
    this.queue.push(e);
};
Main.prototype.addAction = function(target, event, callback){
    if (this.actions[target] === void 0) this.actions[target] = {};
    if (this.actions[target][event] === void 0) this.actions[target][event] = [];
    this.actions[target][event].push(callback);
};
Main.prototype.executeAction = function(e){
    if (this.actions[e.target] !== void 0 && this.actions[e.target][e.type] !== void 0){
        for (var i = 0; i < this.actions[e.target][e.type].length; i++){
            this.actions[e.target][e.type][i](e);
        }
    }
};
So basically you'd use this class to handle everything that happens on the page. Every event handler would be onclick='Main.addToQueue(event)' (or however you wire up events on your page); you just point them at adding the event to the queue, and use Main.addAction to direct those events to whatever you want them to do. This way every user action gets executed as soon as your canvas has finished redrawing and before it gets redrawn again. As long as your canvas renders at a decent framerate, your app should remain responsive.
EDIT: forgot the "this" in requestAnimationFrame(this.loop)
Web workers are something to try:
https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers
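If you try workers, note that they cannot touch the DOM; they are useful for offloading the computation part (physics, data crunching) while the page thread only draws. A tiny inline-worker sketch, where the heavy loop is just a stand-in for real work:
// the worker runs this source string; the loop is a stand-in for real work
var workerSource =
    'onmessage = function (e) {' +
    '    var total = 0;' +
    '    for (var i = 0; i < e.data; i++) { total += i; }' +
    '    postMessage(total);' +
    '};';
var worker = new Worker(URL.createObjectURL(new Blob([workerSource])));
worker.onmessage = function (e) {
    console.log('result from worker:', e.data);
};
worker.postMessage(1e8); // the heavy loop runs off the main thread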
You can tune your performance by changing the amount of work you do per invocation. In your question you say you do a "small chunk of work". Establish a parameter which controls the amount of work being done and try various values.
You might also set the timeout before you do the processing; that way the time spent processing should count towards any minimum delay the browser enforces.
One technique I use is to keep a counter in my processing loop counting iterations. Then set up an interval of, say, one second whose callback displays the counter and resets it to zero. This provides a rough performance figure with which to measure the effects of the changes you make.
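A rough sketch of that throughput meter, with a tunable chunk size as suggested above (doOneUnitOfWork is a hypothetical stand-in for your actual work):
var iterations = 0;
setInterval(function () {
    console.log(iterations + ' iterations in the last second');
    iterations = 0;
}, 1000);
function chunk() {
    var workPerChunk = 500; // the tunable parameter
    for (var i = 0; i < workPerChunk; i++) {
        doOneUnitOfWork(); // hypothetical unit of work
        iterations++;
    }
    setTimeout(chunk, 0); // yield to the browser between chunks
}
chunk();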
In general this is likely to be very dependent on specific browsers, even versions of browsers. With tunable parameters and performance measurements you could implement a feedback loop to optimize in real-time.
One can use window.postMessage() to get around the minimum delay that setTimeout enforces. See this article for details. A demo is available here.
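The usual shape of that trick is a "setZeroTimeout" helper along these lines (a sketch of the pattern the linked article describes; message-event callbacks are not subject to setTimeout's clamping, yet still yield to the browser):
var zeroTimeoutQueue = [];
var ZERO_TIMEOUT_MESSAGE = 'zero-timeout-message';
window.addEventListener('message', function (event) {
    if (event.source === window && event.data === ZERO_TIMEOUT_MESSAGE) {
        event.stopPropagation();
        if (zeroTimeoutQueue.length) {
            zeroTimeoutQueue.shift()(); // run the oldest queued callback
        }
    }
}, true);
function setZeroTimeout(fn) {
    zeroTimeoutQueue.push(fn);
    window.postMessage(ZERO_TIMEOUT_MESSAGE, '*');
}
// usage: setZeroTimeout(doNextChunkOfWork);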
