Do you have any experience with the following problem: JavaScript has to run hundreds of performance-intensive function calls which cannot be skipped and which cause the browser to feel frozen for a few seconds (e.g. no scrolling or clicking)? Example: imagine 500 calls to get an element's height, followed by hundreds of DOM modifications, e.g. setting classes etc.
Unfortunately there is no way to avoid the performance-intensive tasks. Web workers might be an approach, but they are not very well supported (IE...). I'm thinking of a timeout- or callback-based step-by-step rendering that gives the browser time to do something in between. Do you have any experiences you can share on this?
Best regards
Take a look at this topic, it is related to your question:
How to improve the performance of the JavaScript in your page?
If you're doing that much DOM manipulation, you should probably clone the elements in question (or the relevant part of the DOM), do the changes on that cached version, and then replace the whole thing in one go or in larger sections, not one element at a time.
What takes time isn't so much the calculations and functions etc., but the DOM manipulation itself; doing that only once, or a couple of times in sections, will greatly improve the speed of what you're doing.
As far as I know, web workers aren't really meant for DOM manipulation (they have no access to the DOM), and I don't think there would be much of an advantage in using them, as the problem is probably that you are changing a huge number of elements one by one instead of replacing them all in the DOM in one batch.
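A minimal sketch of that approach (the element id and class name are illustrative):
// Work on a detached clone so the browser only has to reflow once at the end.
// Note: cloneNode does not copy listeners added with addEventListener.
var list = document.getElementById('list');      // illustrative id
var clone = list.cloneNode(true);                // deep copy, not attached to the document

// Do all the heavy modifications on the clone, e.g. setting classes:
var items = clone.getElementsByTagName('li');
for (var i = 0; i < items.length; i++) {
    items[i].className = 'highlighted';
}

// Swap the finished clone back in with a single DOM operation.
list.parentNode.replaceChild(clone, list);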
Here is what I can recommend in this case:
1. Check the code again. Try to apply some standard optimisations as suggested, e.g. reducing lookups and making DOM modifications offline (e.g. with document.createDocumentFragment(), see the sketch after the example below). Working with DOM fragments only helps in a limited way here, though: retrieving element heights and doing complex formatting requires the elements to be in the document, so it can't be done entirely offline.
2. If 1. does not solve the problem, create a rendering solution that runs on demand, e.g. triggered by a scroll event. Or: render step by step with timeouts to give the browser time to do something in between, e.g. handle a button click or scrolling.
Short example for step by step rendering in 2.:
var elt = $(...); // the first element to render
function timeConsumingRendering() {
    // ... some rendering here related to the element "elt" ...
    elt = elt.next();
    if (elt.length) {
        // give the browser a chance to handle clicks/scrolling before the next step
        window.setTimeout(timeConsumingRendering, 0);
    }
}
// start
timeConsumingRendering();
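And a minimal sketch of the offline DOM modification mentioned in 1., using a document fragment (the container id and row markup are illustrative):
// Build all the new nodes in a fragment first; the document is touched only once.
var fragment = document.createDocumentFragment();

for (var i = 0; i < 500; i++) {
    var row = document.createElement('div');
    row.className = 'row';
    row.textContent = 'Row ' + i;
    fragment.appendChild(row);
}

// A single insertion (and a single reflow) instead of 500 separate ones.
document.getElementById('container').appendChild(fragment);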
Related
We're using Angular together with Angular Material. We noticed that while rendering some of the more complicated views, the scripting and rendering block the event loop, which makes the whole page unresponsive for a few seconds. I created a simple page that demonstrates this problem: https://quirky-hugle-91193d.netlify.com (source code: https://github.com/lukashavrlant/angular-lag-test). If you click the button, it will generate 200 rows, each with a single mat-checkbox component. There is also a counter that is refreshed every 10 ms if everything is fine. You can clearly see that the counter stops during the rendering because the event loop is blocked by the *ngFor rendering the rows. I tried to measure the length of the block; it blocks the event loop for about 400 ms on my machine.
My question is: is this completely "normal" behavior? Can it be avoided? Is it really so much data that it has to block the event loop for that long? Because the delay is easily noticeable by the user.
The only solution we found was to render it incrementally, i.e. render one row, wait 1 ms, render the next row, wait 1 ms, etc. Isn't there a better solution?
EDIT 1: I also tried Chrome's Performance tool. It says most of the time was spent in the "scripting" phase. I tried to look at the stack traces there but wasn't able to identify the problem.
Adding elements to the DOM has a cost that you won't ever be able to avoid.
However, if you look at the performance tab of the devtools while you are adding the elements, you will notice a few things.
First, right now it looks like you are running a dev build, not a prod build. A prod build should be significantly faster.
Secondly, while there's an associated cost to adding elements to the DOM, there's also a much higher cost to instantiating those elements, especially when it comes to Angular Material elements. They are really pretty and well made, but they are also on the heavy side in terms of both code and HTML nodes. If you have to include lots of checkboxes at once, using regular HTML elements with a bit of CSS to make them look similar, instead of an actual Angular Material element, might be the way to go.
In my case the actual operation of inserting everything into the DOM takes ~90 ms out of the whole 350 ms, which includes the click event triggering the creation of all the elements in the ngFor loop.
The last point is further supported by the ongoing cost of the elements after they have been added, which you can also easily notice when looking at the performance profile. This is exacerbated by dev mode once again, but it's present nonetheless.
So I would recommend trying it out in a more optimized build mode, and if that's still not enough, maybe swap the Material component for a basic checkbox styled with CSS (maybe even a bit of SVG).
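And if a plain checkbox is still not enough, the incremental rendering mentioned in the question does not have to be one row per 1 ms wait; you can insert rows in chunks and yield to the browser between chunks. A framework-agnostic sketch in plain JavaScript (function and variable names are illustrative):
// Insert rows in small chunks, yielding between chunks so the event loop
// (and e.g. the counter on the demo page) keeps running.
function renderInChunks(container, rows, chunkSize) {
    var index = 0;

    function renderChunk() {
        var fragment = document.createDocumentFragment();
        for (var i = 0; i < chunkSize && index < rows.length; i++, index++) {
            var row = document.createElement('div');
            row.textContent = rows[index];
            fragment.appendChild(row);
        }
        container.appendChild(fragment);

        if (index < rows.length) {
            requestAnimationFrame(renderChunk);  // continue on the next frame
        }
    }

    renderChunk();
}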
I would like to know the maximum amount of data the Angular framework can handle. Say I am displaying a chart using Angular and some charting framework like Chart.js. I'd like to know how much data the browser can display properly, how much it can display with slowness, and at what point it crashes.
Your question has no simple answer, but I will try to flatten it and give a simple answer, or at least simple things to consider...
Angular (at runtime), like many other frameworks, is simply JavaScript, so let us reduce the question to "the limitations of JavaScript and browsers with regard to the data loaded".
JavaScript has no hard upper limit on the memory or storage it can handle; I've seen JavaScript applications that require more than 15 GB of RAM, and they performed well too.
So assume data size itself is not an issue (unless your application is poorly implemented, leaking memory or just not very efficient, of course).
The main challenge, as I see it, is displaying and manipulating the information without causing unnecessary delays or unresponsiveness.
Displaying the information - let's say you have a list (or a table) containing 1,000,000 possible gifts which you then want to display for the user to select.
Adding the list items to the document one by one is tempting, but it will require the browser to repaint after each addition (causing a delay or full unresponsiveness until finished). A better way is to create some DOM element (call it N) that is kept in memory only, add all the elements corresponding to the list items to N (still just an in-memory operation), and finally add N, containing the entire list, to the document. That will be a much better solution for displaying the large amount of data.
Manipulating the information - displaying is indeed not enough; you will also want to move, drag, sort and filter the data being displayed. And as mentioned before, it is a bad idea to remove many elements directly from the DOM one by one. You should instead detach the container from the document's DOM into memory, manipulate the data in it, and then add the container right back to the document. Angular does this kind of magic for you.
(Toggling the CSS display property of many elements between none and block has a similar blocking effect, as I recall.)
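A rough sketch of that detach / manipulate / re-attach pattern (the element id is illustrative):
var list = document.getElementById('gift-list');  // illustrative id
var parent = list.parentNode;
var anchor = list.nextSibling;                     // remember where to put it back

parent.removeChild(list);                          // detach: no reflows/repaints while we work

// ... sort, filter, add or remove children of "list" here, off-document ...

parent.insertBefore(list, anchor);                 // one re-insertion, one reflow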
A good practice is to implement an application/page that shows only the amount of data a human can process at a single glance. The rest should be kept in the application's data layer, in memory, and loaded for display only when there is an appropriate need or request.
To conclude, you can deal with huge amounts of data as long as you provide a mechanism that efficiently filters the displayed information.
I hope it helps...
for further reading:
Slow and fast ways of adding elements to DOM
A question emphasizing the lack of a memory limit for JS
CSS display attribute performance
A good discussion about the reasons for slow DOM
About using HTML5 correctly - old but still true
Once the DOM creation procedure is understood, it is much easier to display data without affecting performance / user experience
I have already read 10 articles about React and Virtual DOM.
I understand that the virtual DOM uses a diffing algorithm and only updates the parts of the UI that changed. But I still don't understand why that is faster than updating the actual DOM.
Following is an example:
<div id="test">
Hello React!
</div>
Let's say we created a component and changed it using React. Let's say we changed the text to Hello World!
I can do the same thing using plain JS, right? document.getElementById('test').innerHTML = 'Hello World!'
My question:
Why is React faster? I feel like React is doing exactly the same thing under the hood, right?
I feel like I am missing something fundamental here.
In your case the plain JS call will definitely be faster. React is just very good for really complicated UIs. The more complicated your UI gets, you either need to write a lot of code to update it, or you just rebuild the whole UI on every rerender. However, those DOM updates are quite slow. React allows you to conceptually rerender everything while not actually rerendering the whole DOM, only updating some parts of it.
Actually, the virtual DOM is not faster than the actual DOM. The real DOM itself is fast: it can search, remove and modify elements of the DOM tree quickly. However, laying out and painting the elements of the HTML tree is slow. The real benefit of the virtual DOM is that it allows calculating the difference between changes and making minimal changes to the HTML document.
Now, why is React better when it comes to manipulating the DOM? Your browser does a lot of work to update the DOM. Changing the DOM can trigger reflows and repaints: when one thing changes, the browser has to recalculate the position of other elements in the flow of the page, and also has to do the work of re-drawing.
The browser has its own internal optimizations to reduce the impact of DOM changes (e.g. doing repaints on the GPU, isolating some repaints on their own layers, etc.), but broadly speaking, changing a few things can trigger expensive reflows and repaints.
It's common even when writing everything from scratch to build the UI off the DOM and then insert it all at once (e.g. document.createElement a div and insert a whole tree under it before attaching it to the main DOM), but React is engineered to watch changes and intelligently update small parts of the DOM to minimize the impact of reflows and repaints.
A few reasons off the top of my head:
React uses a virtual DOM, which is just a tree of JS objects, to represent the DOM. The "current" version of the virtual DOM consists of objects with references to the actual DOM elements, while the "next" vDOM is just plain objects. Objects are incredibly fast to manipulate because they are just memory changes, whereas real DOM changes require expensive style, layout, paint and rasterization steps.
React diffs the current vDOM against the next vDOM to produce the smallest number of changes required to make the real DOM reflect the next vDOM. This is called reconciliation. The fewer changes you make to the DOM, the faster layout calculations will be.
React batches DOM changes together so that it touches the real DOM as few times as possible. It also uses requestAnimationFrame everywhere to ensure that real DOM changes play "nicely" with the browser's layout calculation cycles.
Finally (probably React's least appreciated feature), React has an increasingly sophisticated scheduling step to distinguish between low- and high-priority updates. Low-priority updates are UI updates that can afford to take longer, e.g. data fetched from servers, whereas high-priority updates are things that the user will notice right away, e.g. user input fields. Low-priority updates use the very new requestIdleCallback API to ensure that they run when the browser's main thread is actually idle and that they frequently yield back to the main thread to avoid locking up the UI.
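For illustration, that API can be used directly like this; a generic sketch of idle-time scheduling, not React's actual scheduler code:
// Run low-priority tasks only while the browser's main thread is idle,
// yielding back as soon as the idle budget is used up.
var lowPriorityTasks = [];  // functions to run whenever there is spare time

function processIdleQueue(deadline) {
    while (deadline.timeRemaining() > 0 && lowPriorityTasks.length > 0) {
        var task = lowPriorityTasks.shift();
        task();
    }
    if (lowPriorityTasks.length > 0) {
        requestIdleCallback(processIdleQueue);  // more work left: wait for the next idle period
    }
}

requestIdleCallback(processIdleQueue);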
why that is faster than updating the actual DOM.
It is not faster. Explicit and controllable DOM updates are faster than anything else.
React may schedule better update graphs on sites similar to Facebook, but at the cost of diff processing, O(D*N). On other sites React could be just a waste of CPU power.
No silver bullet here - each framework is optimal for the site it was created for initially. For other sites you will be lucky if a particular framework is at least sub-optimal.
Real and complex web app UIs are not built on any "framework" but on their own primitives: GMail, Google Docs, etc.
You might just use vanilla-js as you described:
document.getElementById('test').innerHTML = 'Hello World!';
That's great, but it gets super hard for medium/big projects.
Why? Because React handles all your DOM interaction and minimizes it (as much as it can). If you used vanilla JS, most of your code would be just manipulation of the DOM, extracting data and inserting data; with React you can put those worries aside and put all your efforts into creating the best site/project.
Moreover, the virtual DOM does all the calculation behind the scenes. If you tried to do it manually, you would have to deal with all that calculation every time (when you update one list and when you update another one); most probably, from some point on, most of your code would be calculations about DOM updates.
Sound familiar? Well, exactly for that you already have React.
Don't Repeat Yourself
if someone has already done it, reuse it.
Separation Of Concerns
one part of your project should handle the UI, another should handle the logic
And the list can go on and on, but in conclusion the most important thing is: vanilla JS is faster. With the virtual DOM or without it, it is vanilla JS running in the end. Make your life simpler, make your code readable.
React is NOT faster than PURE JavaScript.
The big difference between them is:
Pure JavaScript: if you have mastered the JavaScript language and have the time to find the best solution (with the lowest impact on browser performance), you definitely get a UI rendering system more powerful than React (because you have created a specific engine tailored to your needs).
React: if you want to spend more time on your data structures without worrying about UI update performance, React (or Vue.js, an up-and-coming candidate for UI development) is the best choice.
I'm currently debugging an Ajax chat that just endlessly fills the page with DOM elements. If you have a chat going for like 3 hours, you will end up with god knows how many thousands of DOM nodes.
What are the problems related to extreme DOM Usage?
Is it possible that the UI becomes totally unresponsive (especially in Internet Explorer)?
(And related to this question is of course the solution, if there are any solutions other than manual garbage collection and removal of DOM nodes.)
Most modern browsers should be able to deal pretty well with huge DOM trees. And "most" usually doesn't include IE.
So yes, your browser can become unresponsive, either because it needs too much RAM (-> swapping) or because its renderer is just overwhelmed.
The standard solution is to drop elements, say after the page has 10'000 lines worth of chat. Even 100'000 lines shouldn't be a big problem. But I'd start to feel uneasy for numbers much larger than that (say millions of lines).
[EDIT] Another problem is memory leaks. Even though JS uses garbage collection, if you make a mistake in your code and keep references to deleted DOM elements in global variables (or in objects referenced from a global variable), you can run out of memory even though the page itself contains only a few thousand elements.
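A small sketch of the kind of mistake described above (names are illustrative):
// Leak: the rows are removed from the document, but a global array still
// references them, so the garbage collector can never reclaim them.
var removedRows = [];

function pruneChat(container, keep) {
    while (container.children.length > keep) {
        removedRows.push(container.removeChild(container.firstElementChild));
    }
}

// Fix: don't keep references you no longer need, e.g. simply
//     container.removeChild(container.firstElementChild);
// and let the node be garbage collected.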
Just having lots of DOM nodes shouldn't be much of an issue (unless the client is short on RAM); however, manipulating lots of DOM nodes will be pretty slow. For example, looping through a group of elements and changing the background color of each is fine if you're doing this to 100 elements, but may take a while if you're doing it on 100,000. Also, some old browsers have problems when working with a huge DOM tree--for example, scrolling through a table with hundreds of thousands of rows may be unacceptably slow.
A good solution to this is to buffer the view. Basically, you only show the elements that are visible on the screen at any given moment, and when the user scrolls, you remove the elements that get hidden, and show the ones that get revealed. This way, the number of DOM nodes in the tree is relatively constant, but you don't really lose anything.
Another similar solution to this is to implement a cap on the number of messages that are shown at any given time. This way, any messages past, say, 100 get removed, and to see them you need to click a button or link that shows more. This is sort of what Facebook does with their profiles, if you need a reference.
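A minimal sketch of such a cap for a chat (the element, limit and function names are illustrative):
var MAX_VISIBLE_MESSAGES = 100;

function appendMessage(chatEl, text) {
    var msg = document.createElement('li');
    msg.textContent = text;
    chatEl.appendChild(msg);

    // Drop the oldest messages once the cap is exceeded.
    while (chatEl.children.length > MAX_VISIBLE_MESSAGES) {
        chatEl.removeChild(chatEl.firstElementChild);
    }
}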
Problems with extreme DOM usage can boil down to performance. DOM scripting is very expensive, so constantly accessing and manipulating the DOM can result in poor performance (and a poor user experience), particularly when the number of elements becomes very large.
Consider HTML collections such as document.getElementsByTagName('div'), for example. This is a query against the document and it will be reexecuted every time up-to-date information is required, such as the collection's length. This could lead to inefficiencies. The worst cases will occur when accessing and manipulating collections inside loops.
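For example (a sketch; the tag name is illustrative):
// getElementsByTagName returns a *live* collection, so its length reflects
// the current state of the document on every loop iteration.
var divs = document.getElementsByTagName('div');

// Potentially slow (and if the loop body appends divs, it never terminates):
for (var i = 0; i < divs.length; i++) {
    // ... work with divs[i] ...
}

// Faster and safer: cache the length (or copy the collection into an array) first.
for (var j = 0, len = divs.length; j < len; j++) {
    // ... work with divs[j] ...
}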
There are many considerations and examples, but like anything it depends on the application.
Recently we redesigned one of our pages and suddenly the page size increased from 1 MB to 1.98 MB.
I compared the number of DOM elements and it increased from 1600 to 2300. I found the number of elements with the command below:
document.getElementsByTagName('*').length
We did a load test and found the load time also increased from 1.1 to 2 seconds. Is this the reason for all the problems?
I think the above line won't count any inline CSS and JS, right, since they are not DOM elements?
Can you please advise?
Without knowing exactly what you redesigned, it's impossible to know what change caused the increase. But even a 1MB page is pretty large. JavaScript (and particularly jQuery) can change the number of DOM objects... consider this:
$('p').append('<span>Blah</span> <span>blah</span> <span>blah</span>');
That will add 3 DOM objects for each p tag on the page (which could be a lot!) and yet it adds only 71 bytes to your page. jQuery can similarly remove DOM objects. So I don't think the number of DOM objects is really much of a consideration.
The JavaScript that runs can manipulate the DOM and create new nodes, which would affect your count. However, it shouldn't make the page load any slower, as it's rendered on the client side.
I think you need to include more information if you expect to get a better answer.
Also, you should look into browser plugins (for Firefox) like YSlow, or Firebug (Net tab), which show you all the files being loaded and how long they take to load.
Any time you have more information crossing the wire, it will take longer. Therefore, with more DOM elements in the page, the loading time will be slower. I hope this answers your question, because I'm not really sure what you are actually asking.