Page hangs after too many DOM creations - javascript

I currently make some async ajax calls and create rows in a table based on the returned data. If there are around 400-500 rows, the page hangs after the DOM creation, e.g. if I click any text box or drop-down, it gets stuck forever. If there are around 100-200 rows and I then click a text box, it is still slow, but at least it does not get stuck.
So I think the problem is that too many DOM nodes are being created, and this causes problems in the browser or the page.
Do you have any ideas or solutions to improve this performance?

You need to lazy load your data somehow. Ever noticed on sites like Twitter, Facebook, and others that when you scroll to the bottom of the page they begin loading more records from the server? Good apps will also garbage collect old records that have scrolled out of view.
When you scroll through your Facebook news feed, it's not loading all your friends' posts since 2007 into the browser at the same time. Once a maximum number of posts exists in the DOM, Facebook starts removing the oldest ones you scrolled past to make room for more, and grabs fresh posts from the server so you can continue scrolling. You can even see your browser scroll bar jump up as you scroll down, because more posts are being added to the DOM.
No browser is going to be able to handle that much data. You're going to have to sit down and think of a better way to show that data to the user. Only you know what experience your users actually want, but no matter what, you'll definitely have to reduce the number of elements you're including on the page.
Example: watch Twitter's timeline. Notice how the browser scroll bar jumps up a bit when it reaches the bottom: Twitter gets to the bottom and then loads more data to scroll through. It will eventually start cleaning up data at the top of the page as well if you scroll far enough.
The simplest solution is probably going to be to pass a page number up with your ajax requests and have your server return only the results for that page of data.
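As a minimal sketch of that idea (assuming a hypothetical /api/rows endpoint that accepts a page parameter and returns one page of rows as JSON):

    // Hypothetical endpoint: GET /api/rows?page=N returns { rows: [...] }
    let currentPage = 1;

    async function loadPage(page) {
      const response = await fetch(`/api/rows?page=${page}`);
      const data = await response.json();
      const tbody = document.querySelector('#results tbody');
      for (const row of data.rows) {
        const tr = document.createElement('tr');
        const td = document.createElement('td');
        td.textContent = row.name; // render whatever fields you actually have
        tr.appendChild(td);
        tbody.appendChild(tr);
      }
    }

    loadPage(currentPage); // load only the first page up front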

Related

API with paginated data, seeking advice

So I'm looking for a couple of questions to be answered. I am using an API which returns a list of products (15,000+); however, it uses pagination, so it only returns 20 per page.
I would like to be able to show all of this data in my shop so users can search through it and so on. However, there's the issue: it takes a very long time to loop through it all.
Is there a good method to do this? Shall I just loop through all the data and add it to an array as it loads? Is there something "special" we can do with the pagination?
I am new to this, and just seeking advice on the above.
Kind Regards,
Josh
There are a few thoughts that strike me straight away so let's cover those first:
From a pure UX perspective, it's very, VERY unlikely that any user will ever need to click through 15k+ rows of anything. So loading them all doesn't serve your users, even if you could figure out how to do it in an efficient way.
Instead, look at what serves your users, which in this case is likely some sort of filtering or search option. I would look into whether your API has support for things like categories (a set that should be small enough to fetch in a single request), which are much easier to display and let the user cut the data set down considerably. Then also look into whether it offers some sort of query or search filter, perhaps on the names of whatever is being displayed. This lets your users narrow things down to a manageable data set (roughly 100 items max). From there, at 20 items per page, that is just 5 pages. Still, you should only load one page at a time and focus on better ways to offer sorting; if the user can find what they need on the first page, you don't need to load the other 4 pages. Hopefully that gives you some ideas of what to look for in your API, or what to add if you can extend it yourself.
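For example, if the API happens to expose category and search parameters (the names here are hypothetical; check the API's documentation for what it actually supports), the request might look like:

    // Hypothetical query parameters: category, q, page.
    async function searchProducts({ category, query, page = 1 }) {
      const params = new URLSearchParams({ category, q: query, page });
      const response = await fetch(`/api/products?${params}`);
      return response.json(); // e.g. 20 matching products for that page
    }

    // Usage: narrow 15k products down to a manageable set before rendering.
    searchProducts({ category: 'shoes', query: 'boot' })
      .then((result) => console.log(result));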
If not, perhaps it would be worth loading the data into your own database and setting up a background/nightly task that fetches any updates from the API and stores them. Then you build your own API around your own database, with functionality for filtering/searching.
A final option is indeed to simply ask for the first page, then display it while you wait for the second page to load. But this risks making an awful lot of wasted requests, which wastes not only your users' bandwidth but also puts pressure on the API for what is likely wasted work. There are a few other UX ideas around this as well, like infinite scrolling: load the first 1 or 2 pages and then stop until the user scrolls past the first page and a half, then request page 3, and so on. This way you only load pages as the user scrolls, but it's a bit more fluid than numbered pagination. Still, you'd likely want to offer some way to sort the set so that it becomes more likely they'll find what they need in the first few "pages".
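A rough sketch of that infinite-scroll idea, assuming a loadPage(n) helper (like the paginated-request sketch earlier on this page) that fetches and appends page n:

    let nextPage = 3;   // pages 1 and 2 were loaded up front
    let loading = false;

    window.addEventListener('scroll', async () => {
      const nearBottom =
        window.innerHeight + window.scrollY >=
        document.body.offsetHeight - 800; // start ~one screen early
      if (nearBottom && !loading) {
        loading = true;
        await loadPage(nextPage); // fetch and append the next page of items
        nextPage += 1;
        loading = false;
      }
    });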

Render big state data in reactjs

My customer gave me a table with 12k records (about 20-30MB) and they want to show it all on screen; at this time they don't want pagination.
When the page is loaded, I call the API and update the component's state, but it takes about 10s to finish rendering, and when I scroll the list, it's slow too.
My question is: how do I make it faster?
This is the second case: when I try 33k records (about 51MB), a memory leak occurs and a white screen appears.
My question is: what is the limit on state size? Can I update state with bigger data?
First of all, what you need is infinite scroll.
It's like what Netflix or Prime Video does.
First you fetch 20 records, and when you scroll to the bottom it fetches 20 more, and so on.
So it will start with 20, and as soon as you are about to hit the bottom of the scrollbar, you call the API to fetch 20 more and add them to the old list.
Now, if you have scrolled a lot and have 2000+ records and it slows down, use the react-window or react-virtualized package. What these do is render only the content you are currently viewing into the DOM.
Check this video for reference https://www.youtube.com/watch?v=QhPn6hLGljU.
For the first question, the reason it becomes slow is that the DOM you are rendering is gigantic, so it consumes far too much memory and your browser starts to hog RAM. You should implement virtual scrolling so that only visible elements are loaded in the DOM.
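As a minimal sketch of the react-window approach mentioned above (assuming the 12k records are already held in a records array and each record has a hypothetical name field):

    import React from 'react';
    import { FixedSizeList } from 'react-window';

    // Only the rows currently inside the viewport are mounted in the DOM;
    // the rest of the records stay as plain data in memory.
    function RecordList({ records }) {
      const Row = ({ index, style }) => (
        <div style={style}>{records[index].name}</div>
      );
      return (
        <FixedSizeList
          height={600}              // viewport height in px
          width="100%"
          itemCount={records.length}
          itemSize={35}             // fixed row height in px
        >
          {Row}
        </FixedSizeList>
      );
    }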

List with 3500+ items is loaded via ajax a small batch at a time. Stops updating in the UI, but I see the li tags being appended to the DOM

I have a use case where thousands of items are shown to the user in a list. They are loaded a small batch at a time, and I see the network traffic going in and out, I see the data getting loaded, I see the DOM getting bigger, but the list itself in the UI stops updating (Chrome).
When I examine it, I see thousands of items in the code; when I select the items through the console and count them, I see the proper number. But on the page itself, I don't see these items displayed. The list uses drag-and-drop to put items from it into another list (and load additional data about them).
I'm not using jquery.dataTables at the moment, but I've been meaning to migrate to it for a long time. I can't find a good example, though; everything I see uses pagination to split the data, but what if that is not an option?
How can I pinpoint what is preventing the items from being displayed? The number of entries will vary between 500 and 20,000.
Never mind, everything works as intended. I was stupid and missed something very obvious: the items had display: none for a very good reason that I had totally forgotten about (it has to do with the core logic of the application). Next time hit me with a stick so I remember to pay more attention.
I'm not sure what you mean by saying the 'DOM is getting bigger' but you 'don't see items get displayed'.
Typically, JS has a main thread which handles functions/callbacks as well as view refreshes, so if your operation is blocking, the view will not refresh.
As for pagination not being an option: you can consider a DOM lazy-loading mechanism where you only put what should be in the current viewport into the DOM. As the user scrolls, you calculate the scroll height dynamically to add/remove items to/from the DOM. One thing to remember is that you typically need a fixed height for your rows so that you can do the calculation. This lazy-loading approach is a common way of solving this type of problem and is widely used by different frameworks like GXT, angular-grid, etc.
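A minimal sketch of that calculation (assuming a fixed row height, a scrollable #list element with a position: relative #spacer child, and the full data set already in an allItems array):

    const ROW_HEIGHT = 30; // px; rows must have a fixed height for the math
    const container = document.querySelector('#list');  // scrollable element
    const spacer = document.querySelector('#spacer');   // position: relative child

    function renderVisibleRows(items) {
      const first = Math.floor(container.scrollTop / ROW_HEIGHT);
      const count = Math.ceil(container.clientHeight / ROW_HEIGHT) + 1;
      spacer.style.height = `${items.length * ROW_HEIGHT}px`; // keeps the scrollbar honest
      spacer.innerHTML = ''; // drop rows that scrolled out of view
      items.slice(first, first + count).forEach((item, i) => {
        const row = document.createElement('div');
        row.style.position = 'absolute';
        row.style.top = `${(first + i) * ROW_HEIGHT}px`;
        row.style.height = `${ROW_HEIGHT}px`;
        row.textContent = item;
        spacer.appendChild(row);
      });
    }

    container.addEventListener('scroll', () => renderVisibleRows(allItems));
    renderVisibleRows(allItems);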

Setting up infinite scroll for tabular display of data and rendering only a small subset of data

I am working on a project that fetches at least 50,000 data sets that have to be displayed as tabular data. The data sets are fetched from our MySQL database. The users have the following requirements...
The users don't want to set up pagination but they want to render more and more table rows as they scroll down the page.
If they scroll up, the users should not see the table rows that were previously displayed when they scrolled down. That means that I have to either delete the table rows that were previously created or hide them.
There should be no perceptible lag or load time; the user should not notice any obvious latency.
Our project uses a LAMP (Python) stack with Django as the framework. We use Django templates, which are server-side templates. I don't know how to translate the same experience to the client side. I have an idea for an approach; can someone either validate or correct it?
My intended approach is given below:
1. Fetch a certain subset of rows of the original data set on load (say 5,000). This will be cached server-side with memcached. On page load, only a certain subset of it (say 100) will be displayed. Each page will have certain state associated with it, such as page number or page size. These will be initialized using the HTML5 History pushState API (see the sketch below).
2. When an ajax request is made, additional data sets will be fetched and additional table rows will be appended to the existing table rows.
3. When the user scrolls up and reaches what would have been a previous page, the table rows get deleted or hidden.
I don't know how to implement step 3. What event should I listen to? What preconditions should I check for in order to trigger step 3? I also don't know if step 2 is the correct approach.
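For the pushState part of step 1, a minimal sketch (the parameter names are hypothetical):

    // Record the current page in the URL so a reload or back/forward
    // navigation can restore the position.
    function recordPage(pageNumber, pageSize) {
      history.pushState(
        { page: pageNumber, size: pageSize },
        '',
        `?page=${pageNumber}&size=${pageSize}`
      );
    }

    // Restore state when the user navigates back or forward.
    window.addEventListener('popstate', (event) => {
      if (event.state) {
        // re-fetch and re-render the rows for event.state.page here
      }
    });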
Here's what I would do:
1. Add a class (scrolled-past) to any row that is scrolled past the window top. (If you're using jQuery, jQuery Waypoints is a great tool for detecting this.)
2. Fetch more data when reaching the bottom.
3. When the data has been fetched, hide all .scrolled-past elements, append the new rows, and scroll to the top of the table (so that the first row is now the one that was previously the uppermost visible row).
There might be glitches when hiding, appending, and scrolling, but I bet you'll nail it with an hour of tweaking. Adding the exact top offset of the uppermost visible row to the scroll offset in step 3 is one way to make it neater.
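A rough sketch of those three steps without jQuery Waypoints (assuming a fetchMoreRows() helper that resolves to an array of new tr elements):

    const table = document.querySelector('#data-table');
    let loading = false;

    window.addEventListener('scroll', async () => {
      // Step 1: tag rows whose bottom edge has passed the window top.
      for (const row of table.querySelectorAll('tr')) {
        row.classList.toggle('scrolled-past',
          row.getBoundingClientRect().bottom < 0);
      }

      // Step 2: fetch more data when reaching the bottom.
      const atBottom =
        window.innerHeight + window.scrollY >= document.body.offsetHeight - 10;
      if (!atBottom || loading) return;
      loading = true;
      const newRows = await fetchMoreRows();

      // Step 3: hide scrolled-past rows, append the new ones, and scroll so
      // the previously uppermost visible row sits at the top again.
      const firstVisible = table.querySelector('tr:not(.scrolled-past)');
      table.querySelectorAll('tr.scrolled-past')
        .forEach((row) => { row.style.display = 'none'; });
      newRows.forEach((row) => table.appendChild(row));
      if (firstVisible) firstVisible.scrollIntoView();
      loading = false;
    });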

How to free used memory after loading different pages using AJAX? [duplicate]

I have a very basic ajax slideshow on my website. On every scroll, the new images and response content continually increase the amount of memory used by the browser.
I've done my research and tried all suggestions to reset the XHR object on each new request, but this does absolutely nothing to help.
The slideshows are basic but may contain hundreds of slides. I want a user to be able to navigate the slideshow indefinitely without crashing their browser. Is this even possible?
Thanks, Brian
Increasing memory usage is normal. You are, after all, loading more data each time: the HTML from your AJAX response, as well as the images that are being displayed. Unless you're using Adobe PageMill-generated HTML, that's only going to be a few hundred bytes of HTML/text; it's the images that will take up the most space. Everything gets stuffed into the browser's cache.
Since you're not doing anything fancy with the DOM directly (building sub-trees and whatnot), just replacing a chunk of HTML repeatedly, the browser will eventually do a cleanup, discard some of the unused/old/stale image data from memory/cache, and reclaim some of that memory.
Now, if you were doing highly complex DOM manipulation, generating lots of new nodes on the fly, and leaking some nodes here and there, THEN you'd have a memory problem, as those leaked nodes would eventually bury the browser.
But increasing memory usage just from loading images is nothing to worry about; it's just like a normal extended surfing session, except you're only loading some new pictures.
If it's a slideshow, are you only showing one image at a time? If you only show one at a time and never get rid of the last one shown, memory will always increase. If you remove the slides that are not being shown, it should help.
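A minimal sketch of that fix, assuming a single #slideshow container that holds the current slide:

    const container = document.querySelector('#slideshow');

    function showSlide(imageUrl) {
      // Remove the previous slide first so old <img> nodes don't accumulate.
      while (container.firstChild) {
        container.removeChild(container.firstChild);
      }
      const img = document.createElement('img');
      img.src = imageUrl;
      container.appendChild(img);
    }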
