browser not responding while rendering data to create table - javascript

I am using an AJAX call to get JSON data. The data contains an array of, say, 10000 elements.
data.LocationQuality.forEach(function (datum) {
    // build a tr, fill its td's from datum, append it to the table
});
I am using this loop to go through all the elements and create a tr for each one, filling all the td's from datum.
The problem I am facing is the large amount of data: the loop takes a lot of time, and ultimately the browser shows "not responding" and I have to kill the page.
How to solve this problem?

There are a few things to be said here.
Firstly, as mentioned, pagination would definitely be the best solution for this. Keeping 10000 rows in memory is not a problem for browsers nowadays, but displaying 10000 rows in a table often does take ages. Only displaying 20 rows at a time, for example, is much faster.
A solution like jqGrid does not require AJAX (but does support it). You can load your data once, and the grid will keep the data in memory and only display a number of items at a time, with navigational buttons to flip through pages.
You can get it here: http://www.trirand.com/blog/?page_id=6
Secondly, something should be said about the DOM. I expect that you are currently adding a row to the DOM 10000 times. This is much, much slower than building the table in memory and then adding the single table with 10000 rows in one step. It will still be slow, but not as slow as hitting the DOM so many times.
So rather than
$.each(data, function (index, row) {
    $('#yourTable').append('your row definition'); // hitting the DOM for each row
});
do this instead
var table = $('<table></table>');
$.each(data, function (index, row) {
    table.append('your row definition'); // adding it to the table in memory
});
$('#yourTable').replaceWith(table); // hit the DOM once

Without code this is a stab in the dark:
1: optimise your page creation loop. DOM operations are very slow - you might be able to improve things by doing them in a different way
2: download your data bit by bit. As the user scrolls your page make another AJAX call to get the next portion of the data.
3: download all the data, but then render only a page at a time.

You also might want to look at this: https://developer.mozilla.org/en-US/docs/Web/API/document.createDocumentFragment?redirectlocale=en-US&redirectslug=DOM%2Fdocument.createDocumentFragment
It allows you to build large parts of the DOM without adding them to the document yet. That way you can build your table and append it to the DOM in one go, which should be a lot faster than appending rows one at a time.
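A minimal sketch of the fragment approach, reusing the data from the question; the field name datum.quality and a tbody inside #yourTable are assumptions:

var fragment = document.createDocumentFragment();

data.LocationQuality.forEach(function (datum) {
    var tr = document.createElement('tr');
    var td = document.createElement('td');
    td.textContent = datum.quality; // assumed field name
    tr.appendChild(td);
    fragment.appendChild(tr); // no reflow yet: the fragment lives off-document
});

// a single DOM insertion triggers one reflow instead of 10000
document.querySelector('#yourTable tbody').appendChild(fragment);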

You have a problem displaying a large amount of data, and you can't get the data in multiple queries?
So, get all the data in one call, parse it and place it in an array, and do pagination with that "local" array.
Instead of requesting data from the server for each page, just get the data from the "local" array.
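A quick sketch of that idea, with made-up names (allRows, pageSize, renderRows) standing in for your own code:

var allRows = [];  // filled once from the single AJAX call
var pageSize = 20;

function showPage(pageIndex) {
    var start = pageIndex * pageSize;
    renderRows(allRows.slice(start, start + pageSize)); // re-render 20 rows, not 10000
}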

What is faster? Refresh the whole table with constant freq. or refresh necessary rows with same freq

For example, I have a table with N rows.
The table is attached to a WebSocket. Let it be a quotes table.
There are two ways.
1) Update the whole table when any quote is received, but limit updates to 1 per second.
2) Update only the row that represents the quote, but limit the frequency to 1 per second for any particular row.
In the first case I have a fixed number of redraws per second, but each redraw takes more time as the table grows.
In the second case I have a fixed time per redraw of a single row, but the number of redraws grows as the table grows.
What is better?
It is important to say that I use a cluster of DIVs instead of a table element to represent the data. I'm not sure whether this affects the result or not.
Changing a particular row is much faster than redrawing the whole table: redrawing a small part of the DOM is much more efficient than redrawing the whole thing.
This is why the concept of a Virtual DOM is a thing (as in React and Vue).
DOM manipulation is the heart of the modern, interactive web. Unfortunately, it is also a lot slower than most JavaScript operations.
As an example, let's say that you have a list that contains ten items. You check off the first item. Most JavaScript frameworks would rebuild the entire list. That's ten times more work than necessary! Only one item changed, but the remaining nine get rebuilt exactly how they were before.
https://www.codecademy.com/articles/react-virtual-dom
Using something like React as a framework will abstract these problems away from you, so you don't have to worry about what you are updating; React will update only what is needed. Doing it manually is also possible but might be tiresome.
The bottom line is that changing the smallest parts of the DOM is better than redrawing the whole thing.
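A sketch of the second option from the question (per-row updates throttled to one per second); socket, rowsById, and the quote field names are all assumptions:

var pending = {};   // latest quote per symbol, waiting to be drawn
var lastDrawn = {}; // timestamp of the last redraw per symbol

socket.onmessage = function (event) {
    var quote = JSON.parse(event.data);
    pending[quote.symbol] = quote; // overwrite: only the newest quote matters
};

setInterval(function () {
    var now = Date.now();
    Object.keys(pending).forEach(function (symbol) {
        if (!lastDrawn[symbol] || now - lastDrawn[symbol] >= 1000) {
            rowsById[symbol].textContent = pending[symbol].price; // touch one node only
            lastDrawn[symbol] = now;
            delete pending[symbol];
        }
    });
}, 250);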

ExtJS Grid storing Cell State automagically

I am porting quite a huge piece of software to an ExtJS Grid. Lots of data (and I mean lots of data) is loaded on demand into spans that are placed inside the grid's cells.
Imagine grid cells having <span id="foo_bar"></span> as content; special AJAX handlers poll the backend for updated information, and once it is available the spans are filled with it.
Now, if I collapse some part of the grid and then re-expand it again, I lose all the automatically filled cell content and am left with empty spans (which is what I started from in the first place).
I know the correct way is to setup a store and push all data into the store. But as I've mentioned above: I am porting quite a huge piece of legacy software to ExtJS, and I do not really have much choice here.
Is there a way to automagically push grid cell values to the store?
Update:
A grid is loaded with, suppose, 2000 cells (this can vary tremendously). Every cell contains various grades of HTML; mostly this is a single span, but it can be pretty much anything (including several spans or divs in one cell). In the background there is a Comet process pushing new data to the HTML page almost in real time. This data is populated to the corresponding SPANs and DIVs based on their IDs, their class, or both.
What I want to achieve is that either:
a) the model for the grid is automagically updated with the new HTML content of the cells (how can I achieve this?), or
b) when collapsing/expanding the tree's nodes, the model data is NOT reloaded afresh.
Is either a or b possible? If so, how?
Just update your store when the 'special ajax handlers' successfully return values, instead of manually messing with the DOM.
So in the success callback do something like -> loadData()
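A rough sketch of what that callback could look like; the endpoint, the row lookup, and the cellHtml field are assumptions, not your actual code:

Ext.Ajax.request({
    url: '/poll', // assumed endpoint
    success: function (response) {
        var data = Ext.decode(response.responseText);
        // find the record this update belongs to (index lookup is an assumption)
        var record = grid.getStore().getAt(data.rowIndex);
        // writing to the store makes the grid re-render the cell itself,
        // so the content survives collapsing and expanding
        record.set('cellHtml', data.html);
    }
});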
Edit:
Seems like you are working with very bad legacy code :) if adding a simple line of code to the AJAX handler is a tremendous effort. You can, however, attach DOM listeners, but it's very bad practice. Here are some events to listen to.
Edit:
Here is an example of how you could listen to DOM events; it's rather a pure JS solution but, meh.. it works..

Does pagination on static HTML table help improve performance

I have a big HTML table (~10K rows, 4 columns, text only) dumped from a database. I'm experiencing poor performance when opening it in Chrome/Firefox.
I do not have direct access to database, so it is impossible to load page by page. All data is static HTML.
Does pagination with some jQuery plugin help improve performance in this case? Any other suggestions?
When applicable, setting table-layout: fixed helps in reducing rendering time a lot. The slowness is mostly caused by the fact that in the default table layout method, the browser has to parse and process the entire table, calculating width requirements for all cells, before it can render any part of the table.
Fixed layout tells the browser to set column widths according to the first row (paying attention to any CSS that may apply to it). This may, of course, result in a mess, if the widths are not suitable for other rows. For a data table where rows are generally very similar, the widths can probably be set suitably.
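For reference, the fixed layout can be applied from script as well as from a stylesheet; #bigTable and the width values below are assumptions:

var table = document.getElementById('bigTable'); // assumed id
table.style.tableLayout = 'fixed';
table.style.width = '100%';
// widths are taken from the first row, so pin them there explicitly
table.rows[0].cells[0].style.width = '10%'; // example value, adjust to your data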
This does not change the issue that very few people, if any, are going to read all the 10,000 rows. People will probably be looking for some specific data there. It might be better to set up a search form that lets the user get just what he wants.
I had a similar problem and made a jQuery plugin:
https://github.com/lperrin/infinitable
To use it, you can load all your data with an AJAX call, turn it into a huge array, and pass it to the plugin.
It basically hides cells that are not visible, while making it easy to sort or filter cells. There are other solutions to achieve that, but this one has no dependencies besides jQuery.
The project contains a demo with a table containing 100,000 elements.
Pagination would most certainly solve this problem. You could also try initially setting the table style to display: none, although the pagination should likely take effect before the browser attempts to render the table.
You could also store the table in a separate HTML file, do an AJAX call for the document, and implement live scrolling. This depends on how you expect the user to explore the data, though. If jumping to a particular range like 100-199 is useful, a paginated table would be ideal.

Huge Data in Listbox(Table)

I want to show a listbox (table) with nearly 20 million rows.
How can I do so with low memory usage and without letting my server die (stop responding)?
Even if you only have a theoretical idea, please do share (I will try to implement it).
I need a solution very urgently.
I know I cannot load all the rows at once. I need to ask the server for new rows every time I scroll. I have tried this, but my scrolling is not smooth enough.
Thanks & Regards,
Aman
Why not just retrieve the first 100 entries, then once the client scrolls to the bottom append another 100 entries, and so on.
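A bare-bones sketch of that pattern; fetchRows and appendRows are assumed helpers wrapping your AJAX call and your rendering:

var offset = 0, batch = 100, loading = false;

function loadMore() {
    if (loading) return; // don't fire overlapping requests
    loading = true;
    fetchRows(offset, batch, function (rows) { // assumed AJAX wrapper
        appendRows(rows);                      // assumed rendering helper
        offset += rows.length;
        loading = false;
    });
}

window.addEventListener('scroll', function () {
    // start loading a little before the user actually hits the bottom
    if (window.innerHeight + window.scrollY >= document.body.offsetHeight - 200) {
        loadMore();
    }
});

loadMore(); // first batch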
Maybe you could wait for ZK's new feature.
Reference
http://books.zkoss.org/wiki/Small_Talks/2012/March/Handling_a_Trillion_Data_Using_ZK
You could use a Listbox Renderer: http://books.zkoss.org/wiki/ZK_Developer's_Reference/MVC/View/Renderer/Listbox_Renderer
public void render(Listitem listitem, Object data, int index)
To start, you can implement render in such a way that it fetches the element to render from the datasource by the index passed to the render method. You can use a standard cache (if Hibernate is in place) or a custom-written one otherwise (look also at EhCache).
@Erik's solution is really fast to implement. In addition, you could add a button, so the user is aware that loading more records costs some time and can decide whether they really need to load more. Scrolling can make your Listbox just hang for a moment.
And always put an upper constraint on the maximal number of records you will show at one time - don't pollute your server's memory.
Make paging for the table values and retrieve a specific number of records on demand.
You can use the DataTables plugin to paginate the data records.
Notice that you can retrieve data in a synchronous or an asynchronous way using this library.

Fastest way to append HTML content to a div using JavaScript

I have an HTML page that uses AJAX to retrieve messages from a server. I'm appending these messages to a div as they are retrieved by setting its innerHTML property.
This works fine while the amount of text is small, but as it grows it causes Firefox to use all available CPU and the messages slow down to a crawl. I can't use a textbox, because I want some of the text to be highlighted in colour or using other HTML formatting. Is there any faster way to do this that wouldn't cause the browser to lock up?
I've tried using jQuery as well, but from what I've read setting .innerHTML is faster than its .html() function and that seems to be the case in my own experience, too.
Edit: Perceived performance is not an issue - messages are already being written as they are returned (using Comet). The issue is that the browser starts locking up. The amount of content isn't that huge - 400-500 lines seems to do it. There are no divs within that div. The entire thing is inside a table, but hopefully that shouldn't matter.
You specifically said that you were appending, meaning that you are attaching it to the same parent. Are you doing something like:
myElem.innerHTML += newMessage;
or
myElem.innerHTML = myElem.innerHTML + newMessage;
because this is extremely inefficient (see this benchmark: http://jsben.ch/#/Nly0s). It causes the browser to first do a very, very large string concatenation (which is never good), and then, even worse, to re-parse, re-insert, and re-render everything you had previously appended. Much better would be to create a new div element, use innerHTML to put the message into it, and then call the DOM method appendChild to insert the newly created div. Then the browser only has to insert and render the new message.
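In code, the suggestion looks roughly like this; messageContainer is an assumed reference to your target div:

function addMessage(html) {
    var div = document.createElement('div');
    div.innerHTML = html; // only the new message is parsed
    messageContainer.appendChild(div); // existing content is left untouched
}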
Can you break up your text and append it in chunks? Wrap portions of the text into div tags and then split them apart adding the smaller divs into the page.
While this won't speed up the overall process, the perceived performance will likely be better, as the user sees the first elements more quickly while the rest loads later, presumably off the visible page.
Additionally you probably should reconsider how much data you are sending down and embedding into the page. If the browser is slow chances are the amount of data is bound to be huge - does that really make sense to the user? Or would a paging interface (or other incremental load mechanism) make more sense?
You can use insertAdjacentHTML("beforeend", "<HTML to append>") for this.
Here is an article about its performance, with benchmarks showing insertAdjacentHTML to be ~150x faster than the .innerHTML += operation. Browsers may have optimised this in the years since the article was written, though.
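As a one-line sketch, with container standing in for your div and newMessage for the retrieved HTML:

container.insertAdjacentHTML('beforeend', '<div class="msg">' + newMessage + '</div>');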
Going along with Rick's answer, why not just pass the info back as JSON? Then you can go through it with setTimeout, displaying perhaps 2-5 messages per batch, then call setTimeout again for the next batch, until the JSON array has been processed.
You should still use innerHTML, so your JavaScript can create the content dynamically and add it to the div, but I would only do that for the first batch, to get everything up quickly.
After that I would clone the first batch, change the innerHTML for each of the other messages (along with other info), and add that to the DOM tree.
Cloning will be faster than creating new elements, and you won't have problems if anything else changes the DOM tree while you are processing.
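A sketch of the batching part; jsonMessages and the container id are assumptions:

var template = document.createElement('div'); // cloned for every message

function renderBatch(messages, container, start) {
    var end = Math.min(start + 5, messages.length); // 2-5 messages per batch
    for (var i = start; i < end; i++) {
        var div = template.cloneNode(false); // cloning beats creating from scratch
        div.innerHTML = messages[i];
        container.appendChild(div);
    }
    if (end < messages.length) {
        // yield so the browser can repaint, then continue with the next batch
        setTimeout(function () { renderBatch(messages, container, end); }, 0);
    }
}

renderBatch(jsonMessages, document.getElementById('log'), 0); // assumed names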
"The entire thing is inside a table, but hopefully that shouldn't matter."
Actually it matters a lot. Due to the nature of tables, a cell often cannot be rendered until the width and height of all cells in its column and row have been calculated. table-layout: fixed overcomes this at the cost of locking cell width and height based on the first row.
In short, it may be best not to wrap the content in a table, or, if the data really is tabular, to try fixed table layout.
http://www.w3schools.com/Css/pr_tab_table-layout.asp
I wrote something similar, and it was challenging. First, you need to isolate the problem.
Is it the rendering code? Try commenting out all the rendering and see whether the slowdown goes away. If it does, the rendering is the problem: try different approaches to it, as outlined above.
Is it the network? Try commenting out the AJAX and just run your innerHTML setting periodically. If this is the problem, you may need to experiment with different timing settings.
