We have a timer that removes the top item in an unordered list and moves it to the bottom of the list. Each item has images, custom fonts, rollovers, etc.
For some reason, the longer the page runs, the slower it gets. You can notice the latency when hovering over the ribbons. The ribbons are supposed to turn red on hover, but when it slows down you'll notice it can take several seconds to see the hover state.
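Conceptually, the rotation looks something like this (a simplified jQuery-style sketch, not the actual implementation; the selector and interval are placeholders):

    // Hypothetical reconstruction of the described behaviour: every few seconds,
    // detach the first <li> and re-append it at the end of the list.
    setInterval(function () {
      var $list = $('#ribbon-list');          // placeholder list id
      var $first = $list.children('li').first();
      $first.detach().appendTo($list);        // move the top item to the bottom
    }, 5000);                                 // placeholder interval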
I have no idea why this is happening. I believe we're properly cleaning everything up, but something is obviously wrong.
Here's the page in question...
http://gmfg.trailerparkinteractive.com/
Let me know if I can provide any additional detail.
It seems you have a memory leak; here's how you can detect one.
It seems one of your scripts is allocating and deallocating a lot of memory over a short period of time.
Drilling further into the retaining tree, we find that some HTML element nodes are being deleted from the DOM but not released.
My advice: try running your site with different scripts disabled, and retest with this method to narrow down which plugin is responsible.
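As a generic illustration (not taken from your code), this is the kind of pattern that produces nodes that are removed from the DOM but never released, which is exactly what shows up in the retaining tree:

    // Illustrative only: a removed node that can never be garbage-collected,
    // because a long-lived array still references it.
    var removedItems = [];

    function rotateList(list) {
      var first = list.firstElementChild;
      list.removeChild(first);      // gone from the DOM...
      removedItems.push(first);     // ...but still reachable from JS, so it leaks
    }

    // The fix is to drop every reference (and any handlers/timers bound to the
    // node) once it is no longer needed, e.g. removedItems.length = 0;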
We're using Angular together with Angular Material. We noticed that when rendering some of the more complicated views, the scripting and rendering block the event loop, making the whole page unresponsive for a few seconds. I created a simple page that demonstrates the problem: https://quirky-hugle-91193d.netlify.com (source code: https://github.com/lukashavrlant/angular-lag-test). If you click the button, it generates 200 rows, each with a single mat-checkbox component. There is also a counter that is refreshed every 10 ms as long as everything is fine. You can clearly see that the counter stops during rendering because the event loop is blocked by the *ngFor rendering the rows. I measured the length of the block: it blocks the event loop for about 400 ms on my machine.
My question is: is this completely "normal" behavior? Can it be avoided? Is it really so much data that it has to block the event loop for that long? The delay is easily noticeable by the user.
The only solution we found was to render the rows incrementally, i.e. render one row, wait 1 ms, render the next row, wait 1 ms, and so on. Isn't there a better solution?
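For reference, the incremental approach looks roughly like this (shown as a framework-agnostic sketch with plain DOM calls rather than our actual Angular code; all names are made up):

    // Sketch: render `rows` in small batches so the event loop is released
    // between batches and the page stays responsive. The batch size is arbitrary.
    function renderIncrementally(container, rows, batchSize) {
      var index = 0;

      function renderBatch() {
        var end = Math.min(index + batchSize, rows.length);
        for (; index < end; index++) {
          var div = document.createElement('div');
          div.textContent = rows[index].label;   // assumed row shape
          container.appendChild(div);
        }
        if (index < rows.length) {
          requestAnimationFrame(renderBatch);    // yield to the browser, then continue
        }
      }

      renderBatch();
    }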
EDIT 1: I also tried Chrome's Performance tool. It says most of the time was spent in the "Scripting" phase. I tried to look at the stack traces there but wasn't able to identify the problem.
Adding elements to the DOM has a cost that you will never be able to eliminate entirely.
However, if you look at the Performance tab of the devtools while you are adding the elements, you will notice a few things.
Right now it looks like you are building in dev mode, not prod. A prod build should be significantly faster.
Secondly, while there is a cost associated with adding elements to the DOM, there is a much higher cost to instantiating those elements, especially when it comes to Angular Material elements. They are really pretty and well made, but they are also on the heavy side in terms of both code and HTML nodes. If you have to include lots of checkboxes at once, using regular HTML elements with a bit of CSS to make them look similar, instead of the actual Angular Material component, might be the way to go.
In my case, the actual operation of inserting everything into the DOM takes ~90 ms of the whole 350 ms, which includes the click event that triggers the creation of all the elements in the ngFor loop.
The last part is further supported by the ongoing cost of the elements after they have been added, which you can also easily notice when looking at the performance. This is exacerbated by dev mode once again, but it is present nonetheless.
So I would recommend trying it out in a more optimized build mode, and if that's still not enough, maybe swap the Material component for a basic checkbox styled with CSS (maybe even a bit of SVG).
I am working on a simple sorting algorithm demo app. I have a few sorting algorithms written in JavaScript, and an HTML page to try them out. I have tested the algorithms separately and they work fine.
You can input your own array, or a random one can be generated for you. If the array was too large, whether it was being generated or sorted, the UI would hang, so I added a web worker that performs these operations in a separate thread.
The app logs what it is doing. Before the web worker, when working with really large arrays, the log would appear all of a sudden only after the process was completed; thanks to the web worker, log entries now appear as soon as each individual step is completed. Also, the page remains scrollable, while without the worker everything would freeze until the job was done.
Having delegated the computationally heavy parts to the web worker, I was expecting the elements on the page to remain fully interactive, but this doesn't seem to be the case. All buttons etc. on the page become unresponsive anyway. To be exact, they stay interactive for a really short while and then freeze: for example, if I mouse over a button, the cursor will change shape, but then it stays frozen in that shape no matter where I move it, and if I click, the click won't register until after the web worker has finished its job. This is a problem, because I have 'stop' buttons the user may click to stop the web worker if it is taking too long to generate or sort an array. As things stand, the stop buttons freeze, and clicking them only takes effect after the worker has finished anyway, effectively making them useless.
How can I make sure my controls on the page remain usable? Thanks!
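For context, the wiring is roughly this (a simplified sketch; the file name, message format, log() and currentArray are placeholders, not the actual app code):

    var worker = new Worker('sort-worker.js');

    worker.onmessage = function (e) {
      log(e.data);                            // the worker posts progress/log messages
    };

    document.getElementById('sort').addEventListener('click', function () {
      worker.postMessage({ cmd: 'sort', array: currentArray });
    });

    document.getElementById('stop').addEventListener('click', function () {
      worker.terminate();                     // kill the running worker immediately
      worker = new Worker('sort-worker.js');  // re-create it for the next run
      // (re-attaching onmessage is omitted here for brevity)
    });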
So, I ran into what I think is the same problem, and was having a bear of a time with it for days. Thankfully I ran across this: https://nolanlawson.com/2016/02/29/high-performance-web-worker-messages/
For some reason it seems on both Chrome and Firefox, when webworkers post messages, it freezes the main thread when in the process of packing the data.
If you use JSON to fully serialize all of your objects when posting messages, it seems to make everything run much, much more smoothly.
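In practice that means something like this (a minimal sketch; the message shape and handler names are made up):

    // Inside the worker: serialize to a plain string before posting, instead of
    // posting a large object graph directly.
    self.postMessage(JSON.stringify({ step: stepIndex, array: partialResult }));

    // On the main thread: parse it back.
    worker.onmessage = function (e) {
      var data = JSON.parse(e.data);
      updateLog(data.step, data.array);   // placeholder handler
    };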
I have a use case where thousands of items are shown to the user in a list. They are loaded a small batch at a time; I can see the network traffic going in and out and the data getting loaded. I can see the DOM getting bigger, but the list itself in the UI stops updating (Chrome).
When I examine the page, I see thousands of items in the markup, and when I select the items through the console and count them, I get the proper number. But in the page itself, I don't see these items displayed. The list uses drag-and-drop to move items from it into another list (and to load additional data about them).
I'm not using jquery.datatables at the moment, but I've been meaning to migrate to it for a long time. I can't find a good example, though; everything I see uses pagination to split the list, but what if that is not an option?
How can I pinpoint what is preventing the items from being displayed? The number of entries will vary between 500 and 20,000.
Never mind, everything works as intended. I was stupid and missed something very obvious: the items had "display: none" for a very good reason that I had totally forgotten about (it has to do with the core logic of the application). Next time, hit me with a stick so I remember to pay more attention.
I'm not sure what you mean when you say the 'DOM is getting bigger' but you 'don't see the items get displayed'.
Typically, JS has a main thread that handles functions/callbacks as well as view refreshes. So if your operation is blocking, the view will not refresh.
As for pagination not being an option: you can consider a DOM lazy-loading mechanism where you only put into the DOM what should be in the current viewport. As the user scrolls, you calculate the scroll position dynamically to add/remove items to/from the DOM. One thing to remember is that you typically need to define a fixed height for your rows so that you can do the calculation. This lazy-loading approach is a common way of solving this type of problem and is widely used by different frameworks like GXT, angular-grid, etc.
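A stripped-down sketch of that idea (fixed row height, only the visible window of rows is kept in the DOM; all names here are made up):

    // Windowed rendering: given a fixed ROW_HEIGHT, only the rows that fall
    // inside the scrolled viewport are rendered. Assumes the spacer element is
    // position: relative so the rows can be absolutely positioned inside it.
    var ROW_HEIGHT = 30; // px, must be fixed for the math to work

    function renderVisible(viewport, spacer, items) {
      var first = Math.floor(viewport.scrollTop / ROW_HEIGHT);
      var count = Math.ceil(viewport.clientHeight / ROW_HEIGHT) + 1;

      spacer.style.height = items.length * ROW_HEIGHT + 'px'; // keeps the scrollbar size correct
      spacer.innerHTML = '';                                  // drop rows that scrolled away

      for (var i = first; i < Math.min(first + count, items.length); i++) {
        var row = document.createElement('div');
        row.style.position = 'absolute';
        row.style.top = i * ROW_HEIGHT + 'px';
        row.style.height = ROW_HEIGHT + 'px';
        row.textContent = items[i].name;                      // assumed item shape
        spacer.appendChild(row);
      }
    }

    // viewport.addEventListener('scroll', function () { renderVisible(viewport, spacer, items); });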
I have a very basic ajax slideshow on my website. On every scroll, the new images and response content continually increase the amount of memory used by the browser.
I've done my research and tried all suggestions to reset the XHR object on each new request, but this does absolutely nothing to help.
The slideshows are basic but may contain hundreds of slides. I want a user to be able to navigate the slideshow indefinitely without crashing their browser. Is this even possible?
Thanks, Brian
Increasing memory usage is normal. You are, after all, loading more data each time: the HTML from your AJAX response, as well as the images being displayed. Unless you're using Adobe PageMill-generated HTML, the HTML/text is only going to be a few hundred bytes; it's the images that will take up the most space. Everything gets stuffed into the browser's cache.
Since you're not doing anything fancy with the DOM directly (building sub-trees and whatnot), just replacing a chunk of HTML repeatedly, the browser will eventually do a cleanup, chuck some of the unused/old/stale image data from memory/cache, and reclaim some of that memory.
Now, if you were doing some highly complex DOM manipulations, generating lots of new nodes on the fly, and leaking some nodes here and there, THEN you'd have a memory problem, as those leaked nodes would eventually bury the browser.
But just increasing memory usage by loading images is nothing to worry about; it's like a normal extended surfing session, except you're loading some new pictures.
If it's a slideshow, are you only showing one image at a time? If you only show one at a time and never get rid of the last one shown, memory usage will keep increasing. If you remove the slides that are not being shown, it should help.
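For example, something along these lines in the AJAX success handler (a sketch only; the element id is a placeholder):

    function showSlide(html) {
      var container = document.getElementById('slideshow');   // placeholder id
      container.innerHTML = html;   // replaces (rather than appends to) the previous
                                    // slide, so its nodes can be garbage-collected
    }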
I am experimenting with jQuery and the animate() functionality. I don't believe the work is a final piece; however, I have a problem that I can't seem to figure out on my own or by trawling search engines.
I've created some randomly animated blocks with a color array etc., and everything is working as intended, including the creation and deletion of the blocks (divs). My issue is that within 2 minutes of running the page, Firefox 4 is already at more than 500,000 K according to my task manager. IE9 and Chrome show very little noticeable impact, yet their processes still continue to grow.
Feel free to check out the link here: http://truimage.biz/wip300/project%202/
My best guess is that the divs are being created faster than the 2000 ms interval at which they are removed; however, I was hoping an expert might either have a solution or explain what I am doing wrong, along with some suggestions.
On another note, from the start of my typing this out till now, the process is at 2,500,000 K. Insane!
It could be a lot of things other than just your script there; it could be a memory leak in one of the jQuery features you use. It's pretty hard to say.
Something you could try is this, though:
Instead of creating new squares, use a "square pool". Let's say you create 20 squares and just keep re-using them instead of creating new ones.
You'd basically just have an array for the pool: take elements out of it when they are displayed, and put them back when the animation finishes.
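Something along these lines (a rough sketch, not a drop-in replacement for your code; it assumes the .square class is absolutely positioned in the CSS):

    // A "square pool": a fixed set of divs is created once and re-used,
    // instead of creating and destroying a div for every animation.
    var POOL_SIZE = 20;
    var pool = [];

    for (var i = 0; i < POOL_SIZE; i++) {
      pool.push($('<div class="square"></div>').hide().appendTo('body'));
    }

    function spawnSquare(color) {
      var $square = pool.pop();           // take one out of the pool
      if (!$square) { return; }           // pool exhausted: skip (or grow the pool)

      $square.css({ background: color, left: 0, opacity: 1 }).show()
        .animate({ left: 300, opacity: 0 }, 2000, function () {
          $square.hide();
          pool.push($square);             // put it back when the animation finishes
        });
    }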