How to delay rendering of DOM elements to prevent unresponsiveness - javascript

I am fetching a huge list of people (1000) from the server via an API call.
Based on that data I have to render a "card" for each user, containing their name, picture, age, etc.
But when I do that, the browser gets stuck for a while until all the cards are rendered. Is there any way to tell the browser that it's not necessary to render everything at once, so it can do so one by one without freezing?

If you add an element to the DOM, the page has to be reflowed and repainted. If you add 100 elements one by one, you pay that cost 100 times, and that's slow. Instead, create a single container, add the item nodes to it, and only then append it to your page.
var container = $('<div>');
$.each(items, function (index, itm) {
    var el = $('<div>');
    el.attr('id', ID_PREFIX + index);
    el.html(createHtml(itm));
    container.append(el);
});
$('#list-container').append(container);
Something like the above (using jQuery, but you can do fine with plain JS).
(Of course, createHtml is a utility function you can define to get the markup for your item, and items is the array with your data.)
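For reference, a plain-JS version of the same idea might look like this, using a DocumentFragment as the off-page container (items, ID_PREFIX and createHtml are the same assumed names as above):
var fragment = document.createDocumentFragment();
items.forEach(function (itm, index) {
    var el = document.createElement('div');
    el.id = ID_PREFIX + index;
    el.innerHTML = createHtml(itm); // same assumed helper as above
    fragment.appendChild(el);
});
// one append, one reflow
document.getElementById('list-container').appendChild(fragment);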
That said, I agree with #Bergi above: do you really need to show all the items at the same time? Could you set up a scrolling view that populates as you scroll down?
Next, you could use Ractive.js or React to manage the data-binding efficiently.

There are a few options here. The best ones, as others have mentioned, avoid rendering everything at all:
Use an "infinite scroll" technique to only render what you need. The basic idea is to remove DOM elements that go offscreen and add those that come onscreen by inspecting the scroll position.
Use a different user-driven pagination mechanism.
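As a rough sketch of the first option (not the original answerer's code), an IntersectionObserver watching a sentinel element at the bottom of the list can drive the rendering; renderNextChunk is a hypothetical function that appends the next slice of cards:
// an empty marker element placed after the last rendered card
var sentinel = document.getElementById('sentinel');
var observer = new IntersectionObserver(function (entries) {
    if (entries[0].isIntersecting) {
        renderNextChunk(); // hypothetical: appends the next slice of cards
    }
});
observer.observe(sentinel);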
Barring that, you can get better performance by rendering everything in one go, either by constructing and setting innerHTML or by using a document fragment (sketch below). But with thousands of elements, you'll still get poor performance.
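A minimal sketch of the innerHTML variant, assuming an items array and a hypothetical itemHtml helper that returns the markup for one card:
// build one big string, then touch the DOM exactly once
var html = items.map(function (itm) {
    return itemHtml(itm); // hypothetical helper
}).join('');
document.getElementById('list-container').innerHTML = html;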
At this point, you probably want batch processing. This won't be faster, but it will free up the UI so that things aren't locked while you're rendering. A very basic batching approach might look like this:
// things to render
var items = [...];
var index = 0;
var batchSize = 100;
// time between batches, in ms
var batchInterval = 100;

function renderBatch(batch) {
    // rendering logic here
}

function nextBatch() {
    var batch = items.slice(index, index + batchSize);
    renderBatch(batch);
    index += batchSize;
    if (index < items.length) {
        // more items remain: render the next batch after a given interval
        setTimeout(nextBatch, batchInterval);
    }
}

// kick off
nextBatch();
It's worth noting, though, that there are limits to this: rendering is a bottleneck, but every DOM element also costs client memory. With tens of thousands of elements, things will be slow and unresponsive even after rendering is complete, because memory usage is so high.

Related

Should I use a for loop to iterate over my object's array?

I have created a DomObject class that can display boxes that move around.
jsFiddle
Naturally, if I increase the number of boxes in the DomObject, the movement function takes longer to run, since it loops over every box.
beginMovement = () => {
    if (!this.timer) {
        // console.log("Begin Movement");
        this.timer = setInterval(this.movement, this.time);
    }
};

movement = () => {
    // move every box on each tick
    let i = 0;
    while (i < this.boxes.length) {
        this.boxes[i].move();
        i++;
    }
};
When I increase the length of this.boxes, I notice performance issues.
So my main question is: should I be using a for loop in this instance? Or should I avoid moving items with plain HTML altogether and move on to using canvas?
It depends on what your goal is. You seem to be doing some sort of animation, in which case canvas/WebGL is the better option if you are going for raw speed.
Right now your objects live in the DOM, which wasn't originally designed to display fancy graphics and animations. Canvas, by contrast, was designed explicitly for animation and complex bitmap operations like colorization and blurring. You never have to "find" the things you draw on a canvas the way you query for DOM elements; you can update them far more quickly than even a cached DOM element. Even updating text on a canvas is faster than doing so in the DOM.
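For comparison, a minimal canvas version of the moving-boxes idea might look like the sketch below. The canvas id and the shape of the boxes array are assumptions, not the asker's actual class:
var canvas = document.getElementById('scene'); // assumed <canvas id="scene">
var ctx = canvas.getContext('2d');
// assumed shape: each box has a position and a velocity
var boxes = [
    { x: 0, y: 10, dx: 2, dy: 1 },
    { x: 50, y: 80, dx: -1, dy: 2 }
];

function frame() {
    // repaint the whole scene each frame; no DOM nodes involved
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    for (var i = 0; i < boxes.length; i++) {
        var b = boxes[i];
        b.x += b.dx;
        b.y += b.dy;
        ctx.fillRect(b.x, b.y, 10, 10);
    }
    requestAnimationFrame(frame);
}
requestAnimationFrame(frame);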
Here is a related SO discussion on canvas vs DOM.

Is setInterval slowing down my site

I want to know whether setInterval is slowing down my site or not.
setInterval(function () {
    var uploadbtndiv = document.getElementById("imagesmaindiv");
    if (uploadbtndiv.childElementCount == 1) {
        document.getElementsByClassName("plusupload")[0].style.top = "17px";
    } else {
        document.getElementsByClassName("plusupload")[0].style.top = "-81px";
    }
}, 10);
setInterval doesn't slow down your site; using it incorrectly can. In your code, you're scheduling an operation roughly every 10ms. That's a lot. Even an efficient operation (and yours is tolerably efficient, though it could be more so) adds up when run 100 times a second.
You probably don't want setInterval in your example. You appear to want to change where something is depending on how many elements there are in imagesmaindiv. I'd probably do that one of three different ways:
By putting that if/else in the code that adds/removes elements to/from imagesmaindiv
By using CSS, but it depends on the structure
By using a mutation observer on imagesmaindiv, so the work is done only when its contents change instead of 100 times a second (see the sketch below)
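A sketch of the third option, reusing the ids and classes from the snippet above:
var uploadbtndiv = document.getElementById("imagesmaindiv");
var plus = document.getElementsByClassName("plusupload")[0];

function reposition() {
    plus.style.top = uploadbtndiv.childElementCount == 1 ? "17px" : "-81px";
}

// run only when the children actually change, not 100 times a second
new MutationObserver(reposition).observe(uploadbtndiv, { childList: true });
reposition(); // set the initial position once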

Why is this for loop blocking if it is called afterwards?

Why does the div[id=box] not get updated until the for loop finishes? If I comment out the for loop, the div displays instantly.
document.getElementById('click').onclick = function () {
    document.getElementById('box').style.display = 'block';
    // loop after element update
    for (var i = 0; i < 2000000000; ++i) {}
};
http://jsfiddle.net/472BU/
Put simply, almost all browser work (JS, repainting the page, responding to user clicks and key presses, in most cases even page changes and closing the tab) happens on the same thread.
Thankfully this isn't 100% true, 100% of the time, anymore.
Browser vendors are working to move different parts of the web platform onto separate threads for a smoother experience, but typically, if you lock your JS up, you lock everything.
This simply means that the browser won't actually repaint until your JS has finished running and control is given back to the DOM.
The good news is that this lets you measure elements by unhiding them, grabbing their dimensions, and hiding them again at the end of the function: the width/height they would take up is calculated on the spot, with no visible flicker. And deferring the paint is deliberate: changing one element can force a large portion of the page to be repainted, so if a loop changes 30000 elements, painting each change as it happened would be a very bad thing.
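A quick sketch of that measuring trick (the element id is just an example):
var el = document.getElementById('box');
el.style.display = 'block';            // unhide; nothing is painted yet
var rect = el.getBoundingClientRect(); // forces layout, so the numbers are real
el.style.display = 'none';             // hide again before control returns
// rect.width and rect.height now hold the size the element would occupy,
// and the user never sees a flicker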
The cause is already explained by others. If you want the box to be painted instantly, the solution is simple: put the loop in a timeout:
document.getElementById('click').onclick = function () {
    document.getElementById('box').style.display = 'block';
    // the box is painted first; the loop runs after a short delay
    setTimeout(function () {
        for (var i = 0; i < 2000000000; ++i) {}
    }, 10);
};
jsFiddle
Also check out web workers.
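A minimal sketch of the worker idea, inlining the worker through a Blob URL so the example is self-contained. Note that a worker cannot touch the DOM, so this only suits pure computation:
var workerSrc = 'var n = 0;' +
    'for (var i = 0; i < 2000000000; ++i) { n++; }' +
    'postMessage(n);';
var worker = new Worker(URL.createObjectURL(new Blob([workerSrc])));
worker.onmessage = function (e) {
    // the page stayed responsive while the loop ran off the main thread
    console.log('loop finished:', e.data);
};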
That many iterations running continuously will use up all of the browser's resources, and it won't get around to applying styles.
Your JavaScript is executed in the order it appears, but behind the scenes there is a queue for rendering style changes. In normal usage you would never notice this behavior, but since you're running a poorly performing loop, it becomes evident.
Problem
It's because JavaScript is single-threaded, and while that loop runs it is the only thing that can run.
Anything else is put on hold for as long as the loop lasts. As the DOM is wired into the JavaScript, the DOM is blocked as well (in general; in browsers where the DOM runs on a separate thread, an event is generated for the event queue instead, and that queue is on hold until the currently executing scope has finished).
Solution
To avoid this you need to split your functions into several asynchronous operations (not the same as multi-threaded) which will enable the browser to invoke some of the events queued up in the event queue (for example paint events).
You can do this by splitting up your function so it performs the iteration in segments, with an internal mechanism dispatching the batches.
For example:
Live demo
function busyLoop(callback) {
    var segCounter = 0,    /// keep track of segment
        totCounter = 0,    /// keep track of total count
        max = 2000000000,  /// max count
        segment = 1000000; /// segment size (smaller = better response)

    /// invoke first batch
    (function nextBatch() {
        segCounter = 0; /// reset segment counter each time
        for (; segCounter < segment && totCounter < max; segCounter++, totCounter++) {
            /// ...work here...
        }
        if (totCounter < max) {
            /// call setTimeout(), which makes it async; ~11ms gives the browser
            /// a chance to process other events, such as paint events:
            setTimeout(nextBatch, 11);
            /// optional progress callback here
        } else {
            callback();
        }
    })();
}
Then call it with a callback function:
busyLoop(doneFunction);
Notice that you can now interact with the DOM while it runs, as well as get feedback.
Tip: the smaller the segments, the more responsive the DOM, but the longer the total time, as the delays in between accumulate. Experiment to find a balance that suits your solution.
Hope this helps.

Most efficient way to throttle continuous JavaScript execution on a web page

I'd like to continuously execute a piece of JavaScript code on a page, spending all the CPU time I can on it, but allowing the browser to stay functional and responsive at the same time.
If I just run my code continuously, it freezes the browser's UI and the browser starts to complain. Right now I pass a zero timeout to setTimeout, which does a small chunk of work and then loops back to setTimeout. This works, but it does not seem to utilize all the available CPU. Any better ways of doing this you might think of?
Update: To be more specific, the code in question is rendering frames on canvas continuously. The unit of work here is one frame. We aim for the maximum possible frame rate.
Probably what you want is to centralize everything that happens on the page and use requestAnimationFrame to do all your drawing. So basically you would have a function/class that looks something like this (you'll have to forgive some style/syntax errors; I'm used to MooTools classes, so just take this as an outline):
var Main = function () {
    this.queue = [];
    this.actions = {};
    // bind once so "this" survives being passed to requestAnimationFrame
    this.loop = this.loop.bind(this);
    requestAnimationFrame(this.loop);
};

Main.prototype.loop = function () {
    while (this.queue.length) {
        var action = this.queue.pop();
        this.executeAction(action);
    }
    // do your rendering here
    requestAnimationFrame(this.loop);
};

Main.prototype.addToQueue = function (e) {
    this.queue.push(e);
};

Main.prototype.addAction = function (target, event, callback) {
    if (this.actions[target] === void 0) this.actions[target] = {};
    if (this.actions[target][event] === void 0) this.actions[target][event] = [];
    this.actions[target][event].push(callback);
};

Main.prototype.executeAction = function (e) {
    if (this.actions[e.target] !== void 0 && this.actions[e.target][e.type] !== void 0) {
        for (var i = 0; i < this.actions[e.target][e.type].length; i++) {
            this.actions[e.target][e.type][i](e); // invoke each registered callback
        }
    }
};
So basically you'd use this class to handle everything that happens on the page. Every event handler would be onclick='Main.addToQueue(event)' (or however you wire events up on your page); they all just add the event to the queue, and Main.addAction directs those events to whatever you want them to do. This way every user action gets executed as soon as your canvas has finished redrawing and before it gets redrawn again. As long as your canvas renders at a decent framerate, your app should remain responsive.
(Note that this.loop must be bound to the instance, as in the constructor above, or requestAnimationFrame will invoke it with the wrong this.)
Web workers are something to try:
https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers
You can tune your performance by changing the amount of work you do per invocation. In your question you say you do a "small chunk of work". Make the amount of work a parameter and try various values.
You might also set the timeout before you do the processing. That way the time spent processing counts toward any minimum delay the browser imposes.
One technique I use is to keep a counter of iterations in my processing loop, and set up an interval of, say, one second that displays the counter and resets it to zero. This provides a rough performance figure with which to measure the effect of any changes you make.
In general this is likely to be very dependent on specific browsers, even versions of browsers. With tunable parameters and performance measurements you could implement a feedback loop to optimize in real-time.
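A rough sketch of that idea, where a once-per-second interval reports throughput so you can tune the chunk size (doUnitOfWork is a placeholder for one small piece of the job; the pump runs indefinitely, which matches the goal of continuous execution):
var chunkSize = 1000; // tunable: units of work per invocation
var count = 0;        // units completed since the last sample

function pump() {
    for (var i = 0; i < chunkSize; i++) {
        doUnitOfWork(); // placeholder
        count++;
    }
    setTimeout(pump, 0); // yield so the browser can paint and handle input
}

// once a second, report throughput and reset the counter
setInterval(function () {
    console.log(count + ' units/sec at chunkSize ' + chunkSize);
    count = 0;
}, 1000);

pump();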
One can use window.postMessage() to get around the minimum delay that setTimeout enforces. See this article for details. A demo is available here.
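The core of the technique looks roughly like this simplified version of the "setZeroTimeout" pattern described in the linked article:
// postMessage callbacks fire without setTimeout's clamped minimum delay
var queue = [];
window.addEventListener('message', function (e) {
    if (e.source === window && e.data === 'zero-timeout') {
        e.stopPropagation();
        if (queue.length) queue.shift()();
    }
}, true);

function setZeroTimeout(fn) {
    queue.push(fn);
    window.postMessage('zero-timeout', '*');
}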

How to optimally render large amounts of DOM elements using javascript?

On a web page I have a fairly large list of items (say, product cards, each containing an image and text), about 1000 of them. I want to filter this list on the client (only items that are not filtered away should be shown), but there is a rendering performance problem. If I apply a very narrow filter so that only 10-20 items remain, then cancel it (so all the items have to be shown again), the browser (Chrome on a very nice machine) hangs for a second or two.
I re-render the list using following routine:
for (var i = 0, l = this.entries.length; i < l; i++) {
    $(this.cls_prefix + this.entries[i].id).css("display", this.entries[i].id in dict ? "block" : "none");
}
dict is a hash of the allowed items' ids.
The function itself runs instantly; it's the rendering that hangs. Is there a more efficient way to re-render than changing the "display" property of the DOM elements?
Thanks for your answers in advance.
Why load 1000 items at all? First you should consider something like pagination, showing around 30 items per page; that way you are not rendering that much at once.
If you really do need to loop over that many items, consider using timeouts. Here's a demo I had once that illustrates the consequences of long loops: they block the UI and make the browser lag. With timers, you delay each iteration, allowing the browser to breathe once in a while and do something else before the next iteration starts.
Another thing to note is that you should avoid repaints and reflows, which means avoiding moving elements around and changing styles when it's not necessary. Also, remove from the DOM any nodes that are not actually visible; if you don't need to display something, remove it (see the sketch below). Why waste memory on something that isn't actually seen?
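A sketch of that last point, applied to the question: keep one node per entry in a plain object and rebuild the list on each filter change, so hidden cards are out of the DOM entirely. The container id and the nodesById map are assumptions:
var container = document.getElementById('list-container'); // assumed id
var nodesById = {}; // entry.id -> DOM node, filled once when the cards are first built

function applyFilter(entries, dict) {
    var fragment = document.createDocumentFragment();
    for (var i = 0; i < entries.length; i++) {
        if (entries[i].id in dict) {
            fragment.appendChild(nodesById[entries[i].id]);
        }
    }
    container.innerHTML = '';        // drop all old children in one go
    container.appendChild(fragment); // single append, single reflow
}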
You can use the setTimeout trick to offload the loop iterations from the current task and avoid freezing the client. I suspect the total processing, from start to finish, takes about the same amount of time, but at least this way the interface can still be used, and the result is a better user experience:
var self = this;
for (let i = 0, l = this.entries.length; i < l; i++) {
    setTimeout(function () {
        // "let" captures the right i per iteration; "self" keeps the outer "this"
        $(self.cls_prefix + self.entries[i].id).css("display", self.entries[i].id in dict ? "block" : "none");
    }, 0);
}
Dude - the best way to handle "large amounts of DOM elements" is to NOT do it on the client, and/or DON'T use Javascript if you can avoid it.
If there's no better solution (and I'm sure there probably is!), then at LEAST partition your working set down to what you actually need to display at that moment (instead of the whole, big, honkin' enchilada!)
