I have a single-page application (SPA) that dynamically shows different parts of DOM by adding and removing DOM subtrees (nodes with their children).
They act like "pages", switched by the menus and app logic, and may contain many child elements: controls, text, tables, etc.
The problem is that when a new node is added, it takes the browser a long time to draw it, which leads to flickering. I tried adding the node with opacity set to 0 and then changing the opacity to 1, but how can I work out the timeout required?
Is it possible to detect when redraw is completed, and to make new node visible only after that?
I tried to use MutationObserver, but it calls my callback immediately after the node is added, not when it is actually drawn. Right now I use setTimeout, but there is no way to work out the timeout value to use.
I don't want to make the app too laggy, but on the other hand, short timeouts for complex nodes with many children don't prevent the redraw flickering.
Please help.
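One pattern often suggested for exactly this situation is to defer the reveal with two nested requestAnimationFrame callbacks. This is a sketch, not a guaranteed fix (the browser does not expose paint completion directly), and appendAndReveal is a hypothetical helper name:

```javascript
// Append the node invisibly, then reveal it once the browser has had a
// chance to paint. A requestAnimationFrame callback runs before the next
// paint, so a callback nested inside it runs after that frame has been
// rendered - no guessed timeout needed.
function appendAndReveal(container, node) {
  node.style.opacity = '0';
  container.appendChild(node);
  requestAnimationFrame(function () {
    requestAnimationFrame(function () {
      node.style.opacity = '1';
    });
  });
}
```

The first frame is painted with the node at opacity 0 (layout already done), so flipping the opacity afterwards is a cheap, paint-only change.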
Related
I have recently started learning the React framework and read that React creates a virtual DOM that keeps track of changes. When React updates the real DOM, it only updates the objects that have changed in the virtual DOM. Does this mean that when I program in plain JavaScript and append a new object, for example a new list item, the entire DOM is updated even though I only added one new item?
In short, it depends. There are a few types of changes you can make to the DOM. From this article on rendering performance by Google:
If you change a “layout” property, so that’s one that changes an element’s geometry, like its width, height, or its position with left or top, the browser will have to check all the other elements and “reflow” the page.
These changes will require the entire DOM to repaint. However:
If you changed a “paint only” property, like a background image, text color, or shadows, in other words one that does not affect the layout of the page, then the browser skips layout, but it will still do paint.
So, adjusting say, the color property of some text would just require that element to repaint without needing to repaint other parts of the DOM. There are also changes you can make that go straight to compositing and do not require any kind of repaint.
The browser does its best to do as little work as necessary.
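The three classes of change described above can be illustrated with a small sketch (the element name and values here are hypothetical):

```javascript
// Illustrates which rendering-pipeline stages different style changes
// trigger. `el` is any DOM element.
function demoPipelineCosts(el) {
  // Layout ("reflow"): geometry changes force the browser to
  // re-examine the positions of other elements.
  el.style.width = '300px';

  // Paint only: no geometry change, so layout is skipped,
  // but the element's pixels are redrawn.
  el.style.color = 'rebeccapurple';

  // Composite only: transform and opacity can often skip both
  // layout and paint entirely.
  el.style.transform = 'translateX(100px)';
  el.style.opacity = '0.8';
}
```

So if animation performance matters, preferring transform and opacity over width/top/left keeps the browser's work to a minimum.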
When you update the DOM, reflow and repaint can happen.
Every time the DOM changes, the browser needs to recalculate styles, perform layout, and repaint the web page.
React doesn’t really do anything new. It’s just a strategic move.
What it does is store a replica of the real DOM in memory. When you modify the DOM, it first applies these changes to the in-memory copy. Then, using its diffing algorithm, it figures out what has actually changed.
Finally, it batches the changes and applies them to the real DOM in one go,
thus minimizing reflow and repaint.
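You can get a similar batching effect in plain JavaScript with a DocumentFragment: build the new nodes off-DOM and attach them in a single operation. A minimal sketch (appendItems is a hypothetical helper):

```javascript
// Build new list items inside a fragment, then attach them with a single
// appendChild, so the browser reflows once instead of once per item.
function appendItems(list, texts) {
  const fragment = document.createDocumentFragment();
  texts.forEach(function (text) {
    const li = document.createElement('li');
    li.textContent = text;
    fragment.appendChild(li); // no reflow: the fragment is off-DOM
  });
  list.appendChild(fragment); // one reflow for the whole batch
}
```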
We're using Angular together with Angular Material. We noticed that while rendering some of the more complicated views, scripting and rendering block the event loop, which makes the whole page unresponsive for a few seconds. I created a simple page that demonstrates this problem: https://quirky-hugle-91193d.netlify.com (source code: https://github.com/lukashavrlant/angular-lag-test). If you click the button, it will generate 200 rows, each with a single mat-checkbox component. There is also a counter that is refreshed every 10 ms if everything is fine. You can clearly see that the counter stops during rendering because the event loop is blocked by the *ngFor rendering the rows. I tried to measure the length of the block: it blocks for 400 ms on my machine.
My question is: is it completely "normal" behavior? Can it be avoided? Is it really so much data that it has to block event loop for that long? Because the delay is easily noticeable by user.
The only solution we found was to render it incrementally, i.e. render one row, wait 1 ms, render the next row, wait 1 ms, etc. Isn't there a better solution?
EDIT 1: I also tried Chrome's Performance tool. It says most of the time is spent in the "scripting" phase. I tried to look at the stack traces there but wasn't able to identify the problem.
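For reference, the incremental approach described above can be written generically. This is a sketch; renderInChunks and its parameter names are hypothetical:

```javascript
// Render `items` in chunks of `chunkSize`, yielding to the event loop
// between chunks via `schedule` (e.g. requestAnimationFrame or a
// zero-delay setTimeout), so the page stays responsive while rendering.
function renderInChunks(items, renderItem, chunkSize, schedule) {
  let index = 0;
  function step() {
    const end = Math.min(index + chunkSize, items.length);
    for (; index < end; index++) {
      renderItem(items[index]);
    }
    if (index < items.length) {
      schedule(step); // let the browser paint and handle input in between
    }
  }
  step();
}
```

In the browser you would call it as, say, `renderInChunks(rows, addRowToDom, 20, requestAnimationFrame)` (names hypothetical); tuning the chunk size trades total rendering time against responsiveness.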
Adding elements to the DOM has a cost that you won't ever be able to reduce.
However if you look at the performance tab of the devtools while you are adding the elements you will notice a few things.
Right now it looks like you are building in dev mode, not prod. A prod build should be significantly faster.
Secondly, while there's an associated cost to adding elements to the DOM, there's also a much higher cost to instantiating those elements, especially when it comes to Angular Material elements. They are really pretty and well made, but they are also on the heavy side in terms of both code and HTML nodes. If you have to include lots of checkboxes at once, using regular HTML elements with a bit of CSS to make them look similar, instead of an actual Angular Material element, might be the way to go.
In my case the actual operation of inserting everything into the DOM takes ~90 ms out of the whole 350 ms, which includes the click event triggering the creation of all the elements in the ngFor loop.
The last part is further supported by the cost of the elements after adding them, which you can also easily notice when looking at the performance tab. This is exacerbated by dev mode once again, but it's present nonetheless.
So I would recommend trying it out in a more optimized build mode, and if that's still not enough, maybe swap the Material component for a basic checkbox styled with CSS (maybe even a bit of SVG).
I have a container widget which is updated very often. When some event comes, I need to append a child to container, or remove one of its children. Think of this as of a list of chat participants which listens for the event of joining/leaving the chat.
If the number of children is really big (several hundred) along with a big number of events arriving, drawing the DOM starts to slow down and the user notices delays, i.e. adding/removing a sub-element of the container starts to take more and more time.
How can I speed up this process?
I am thinking of using a kind of 'batch update' technique, well known from database applications. Indeed, if I collected the update events and applied them all once per second, would that speed up my computations? The number of widgets to add would not change, but perhaps by applying them in one go rather than one by one I could get the 'batch' effect somehow?
This is just a guess.
I am using GWT, but that's not the point. Execution is especially slow in IE8, but my question is cross-browser.
Set display: none on the container before adding children; you can set it back to its original state afterwards.
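A minimal sketch of that idea (batchAppend is a hypothetical helper):

```javascript
// Hiding the container removes it from the render tree, so the child
// insertions don't trigger layout work until it is shown again.
function batchAppend(container, nodes) {
  const previousDisplay = container.style.display;
  container.style.display = 'none';          // one reflow to hide
  nodes.forEach(function (node) {
    container.appendChild(node);             // no layout while hidden
  });
  container.style.display = previousDisplay; // one reflow to show again
}
```

Be aware that toggling display can have side effects of its own (e.g. the container's scroll position may reset), so it's worth measuring before and after.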
I have a little problem with a condition and its triggering. I have two objects in my HTML (a div and an img) that I am trying to keep constantly aligned via JS. By constantly I mean that when the window size changes, they realign (since one is aligned to the center - and no :), I can't simply center-align the second one as well, because I also need to match the size, which definitely requires JS).
I made a little function that aligns them and sets the proper dimensions, and I trigger it on every window.onresize event (as well as on document ready). But I found out that it does not fire on zoom, and besides that, it would suit me better not to depend on window.onresize.
So I thought there would be a possibility to write a condition like
if (div.width() != img.width()) {
// do something to match it again
}
But it turned out that this condition only runs on the ready event (or rather the load event, since I have a picture). So my question is: is there any way for the condition to be checked continuously? I know I can probably set an interval to take care of that, but (a) I guess that 99% of all executions would be pointless, and (b) unless I set it to a very quick repetition, it would not fix the div's and img's mismatch immediately.
Thank you very much.
You can certainly define your own custom event and execute the aligning code when it occurs, but you still need a way to fire the event at the appropriate time.
That can only happen during the ordinary execution flow of the program (not an option here) or in the handler for one of the existing events: if none of those events is consistently fired when the trigger condition occurs, then you're only left with timers.
I'd be happy to be wrong on this, tho'.
Consider requestAnimationFrame as an alternative to setInterval.
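A sketch of that approach: compare the widths once per rendered frame and realign only when they differ (watchAlignment and realign are hypothetical names):

```javascript
// Compare the two widths once per frame. requestAnimationFrame fires
// right before the browser paints, so a mismatch gets fixed within one
// frame instead of waiting for an arbitrary setInterval delay, and the
// check is cheap when nothing has changed.
function watchAlignment(div, img, realign) {
  function check() {
    if (div.offsetWidth !== img.offsetWidth) {
      realign(); // make the sizes match again
    }
    requestAnimationFrame(check);
  }
  requestAnimationFrame(check);
}
```

Unlike window.onresize, this also catches zoom and any other cause of a size change, since it compares the actual widths each frame.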
do you have any experience with the following problem: JavaScript has to run hundreds of performance-intensive function calls which cannot be skipped, causing the browser to feel frozen for a few seconds (e.g. no scrolling and clicking)? Example: imagine 500 calls for getting an element's height and then doing hundreds of DOM modifications, e.g. setting classes etc.
Unfortunately there is no way to avoid the performance-intensive tasks. Web workers might be an approach, but they are not very well supported (IE...). I'm thinking of a timeout- or callback-based step-by-step rendering that gives the browser time to do something in between. Do you have any experience you can share on this?
Best regards
Take a look at this topic; it is related to your question:
How to improve the performance of your java script in your page?
If you're doing that much DOM manipulation, you should probably clone the elements in question, or the DOM itself, and make the changes on a cached version, then replace the whole thing in one go or in larger sections, not one element at a time.
What takes time isn't so much the calculations and functions etc., but the DOM manipulation itself, and doing that only once, or a couple of times in sections, will greatly improve the speed of what you're doing.
As far as I know web workers aren't really for DOM manipulation, and I don't think there will be much of an advantage in using them, as the problem probably is the fact that you are changing a shitload of elements one by one instead of replacing them all in the DOM in one batch instead.
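A sketch of the clone-and-swap idea described above (rebuildList and updateItem are hypothetical names):

```javascript
// Mutate a detached deep clone, then swap it into the document: the
// per-item changes happen off-DOM, and the live page is only touched
// once, at the final replaceWith call.
function rebuildList(list, updateItem) {
  const clone = list.cloneNode(true); // deep copy, not in the render tree
  Array.prototype.forEach.call(clone.children, updateItem);
  list.replaceWith(clone);            // a single live-DOM mutation
}
```

One caveat: cloneNode does not copy event listeners added with addEventListener, so this works best for mostly-static content or with event delegation on an ancestor.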
Here is what I can recommend in this case:
Checking the code again. Try to apply some standard optimisations as suggested, e.g. reducing lookups and making DOM modifications offline (e.g. with document.createDocumentFragment()...). Working with DOM fragments only helps in a limited way here, though: retrieving element heights and doing complex formatting won't work sufficiently well on detached nodes.
If 1. does not solve the problem, create a rendering solution that runs on demand, e.g. triggered by a scroll event. Or: render step by step with timeouts to give the browser time to do something in between, e.g. clicking a button or scrolling.
Short example for step by step rendering in 2.:
var elt = $(...);
function timeConsumingRendering() {
    // some rendering here related to the element "elt"
    elt = elt.next();
    if (elt.length) {
        // yield to the browser before rendering the next element
        window.setTimeout(timeConsumingRendering, 0);
    }
}
// start
timeConsumingRendering();