Setting innerHTML/html() on DOM-elements within nested loop - IE performance - javascript

I am working on some horrible legacy code where a JSON object is fetched from the server. The object is a list of IDs that correspond to DOM nodes. Each list item contains new HTML snippets to be inserted for that node. Basically it looks like this:
var $table = $("#theTable");
var tableData = getData();
for (var row in tableData) {
    // find the TR
    var $row = $table.find('tr[data-id=' + row + ']');
    // update many sub-elements of the TR (~100-1000 nodes)
    for (var el in tableData[row]) {
        var $el = $row.find("td[data-element='" + el + "']");
        if ($el.length) {
            $el[0].innerHTML = tableData[row][el];
        }
    }
}
Obviously it is bad practice to select nodes or set .innerHTML/.html() within a loop, but at this point there is no other choice. This results in very bad performance, especially in IE. I tried to hide/detach the node before doing the updates and reattach/show it again when everything is done, but it did not improve performance whatsoever; it just got worse.
Is there any way to easily speed this up?


Javascript performance optimization

I created the following JS function:
function csvDecode(csvRecordsList)
{
    var cel;
    var chk;
    var chkACB;
    var chkAF;
    var chkAMR;
    var chkAN;
    var csvField;
    var csvFieldLen;
    var csvFieldsList;
    var csvRow;
    var csvRowLen = csvRecordsList.length;
    var frag = document.createDocumentFragment();
    var injectFragInTbody = function () {tblbody.replaceChild(frag, tblbody.firstElementChild);};
    var isFirstRec;
    var len;
    var newEmptyRow;
    var objCells;
    var parReEx = new RegExp(myCsvParag, 'ig');
    var tblbody;
    var tblCount = 0;
    var tgtTblBodyID;
    for (csvRow = 0; csvRow < csvRowLen; csvRow++)
    {
        if (csvRecordsList[csvRow].startsWith(myTBodySep))
        {
            if (frag.childElementCount > 0)
            {
                injectFragInTbody();
            }
            tgtTblBodyID = csvRecordsList[csvRow].split(myTBodySep)[1];
            newEmptyRow = getNewEmptyRow(tgtTblBodyID);
            objCells = newEmptyRow.cells;
            len = newEmptyRow.querySelectorAll('input')[0].parentNode.cellIndex; // Finds the cell index holding the first input (check-box or button)
            tblbody = getElById(tgtTblBodyID);
            chkAF = toBool(tblbody.dataset.acceptfiles);
            chkACB = toBool(tblbody.dataset.acceptcheckboxes);
            chkAN = toBool(tblbody.dataset.acceptmultiplerows);
            tblCount++;
            continue;
        }
        csvRecordsList[csvRow] = csvRecordsList[csvRow].replace(parReEx, myInnerHTMLParag); // Replaces every paragraph symbol ¶ used in the db.csv file with the <br> tag needed in the HTML content of table cells, so line breaks can be used inside table cells
        csvFieldsList = csvRecordsList[csvRow].split(myEndOfFld);
        csvFieldLen = csvFieldsList.length;
        for (csvField = 0; csvField < csvFieldLen; csvField++)
        {
            cel = chkAN ? csvField + 1 : csvField;
            if (chkAF && cel === 1) {objCells[cel].innerHTML = makeFileLink(csvFieldsList[csvField]);}
            else if (chkACB && cel === len) {objCells[cel].firstChild.checked = toBool(csvFieldsList[csvField]);}
            else {objCells[cel].innerHTML = csvFieldsList[csvField];}
        }
        frag.appendChild(newEmptyRow.cloneNode(true));
    }
    injectFragInTbody();
    var recNum = getElById(tgtTblBodyID).childElementCount;
    customizeHtmlTitle();
    return csvRow - tblCount + ' (di cui ' + recNum + ' record di documenti)';
}
More than 90% of records could contain file names that have to be processed by the following makeFileLink function:
function makeFileLink(fname)
{
    return ['<a href="', dirDocSan, fname, '" target="', previewWinName, '" title="Apri il file allegato: ', fname, '" >', fname, '</a>'].join('');
}
It aims to decode a record list from a special type of *.db.csv file (a comma-separated-values file where commas are replaced by another symbol I hard-coded into the variable myEndOfFld). (This special type of *.db.csv is created by another function I wrote, and it is just a plain text file.)
The record list to decode and append to the HTML tables is passed to the function through its single parameter (csvRecordsList).
The csv file hosts data coming from several HTML tables.
Tables differ in the number of rows and columns and in some of the contained data types (which could be filenames, numbers, strings, dates, checkbox values).
Some tables could be just 1 row; others accept more rows.
A row of data has the following basic structure:
data field content 1|data field content 2|data field content 3|etc...
Once decoded by my algorithm, it will be rendered correctly in the HTML td element even if a field contains several paragraphs. In fact the <br> tag will be added where needed by the code:
csvRecordsList[csvRow].replace(parReEx, myInnerHTMLParag)
which replaces every occurrence of the character I chose to represent the paragraph symbol, hard-coded into the variable myCsvParag.
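For example, assuming the hard-coded symbols were ¶ and <br> (illustrative values; the real ones live in my variables), the replacement works like this:

```javascript
// Illustrative values; the real symbols are hard-coded in my variables.
var myCsvParag = '\u00B6';       // paragraph symbol stored in the db.csv file
var myInnerHTMLParag = '<br>';   // what it becomes inside a table cell
var parReEx = new RegExp(myCsvParag, 'ig');

var field = 'first paragraph\u00B6second paragraph';
var decoded = field.replace(parReEx, myInnerHTMLParag);
// decoded === 'first paragraph<br>second paragraph'
```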
It isn't possible to know at programming time the number of records to load into each table, the number of records loaded from the CSV file, the number of fields of each record, or which table fields will contain data and which will be empty: in the same record some fields could contain data while others are empty. Everything has to be discovered at runtime.
In the special csv file each table is separated from the next by a row which contains just a string with the following pattern: myTBodySep + tablebodyid, where myTBodySep = "targettbodydatatable" is just a hard-coded string of my choice.
tablebodyid is just a placeholder for a string representing the id of the target table tbody element to insert new records into, for example tBodyDataCars, tBodyDataAnimals... etc.
So when the first for loop finds in csvRecordsList a string starting with the value of myTBodySep, it gets the tablebodyid from the same row: this will be the new tbody id to target for injecting the next records.
Each table is archived in the CSV file this way.
The first for loop scans the csv record list from the file, and the second for loop prepares what is needed to fill the targeted table with data.
The above code works well, but it is a little slow: loading about 300 records from the CSV file into the HTML tables takes a bit more than 2.5 seconds on a computer with 2 GB of RAM and a Pentium Core 2 4300 dual-core at 1800 MHz, but if I comment out the row that updates the DOM the function needs less than 0.1 s. So IMHO the bottleneck is the fragment and DOM-manipulating part of the code.
My aim is to optimize the speed of the above code without losing functionality.
Notice that I'm targeting just modern browsers; I don't care about older, non-standards-compliant browsers... I feel sorry for them...
Any suggestions?
Thanks in advance.
Edit 16.02.2018
I don't know if it is useful, but lately I've noticed that if the data is loaded from the browser's sessionStorage, the load and rendering time is more or less halved. Strangely, it is the exact same function that loads the data from both the file and sessionStorage.
I don't understand this different behavior, considering that the data is exactly the same and in both cases it is assigned to a variable handled by the function itself before performance timing starts.
Edit 18.02.2018
Number of rows varies with the target table: from 1 to 1000 (could be even more in particular cases).
Number of columns varies with the target table: from 10 to 18-20.
Building the table using DOM manipulation is indeed way slower than a simple innerHTML update of the table element.
If you rewrote your code to prepare an HTML string and assign it to the table's innerHTML, you would see a significant performance boost.
Browsers are optimized to parse the text/html they receive from the server, since that is their main purpose. DOM manipulation via JS is secondary, so it is not as heavily optimized.
I've made a simple benchmark for you.
Let's make a table 300x300 and fill its 90000 cells with 'A'.
There are two functions.
The first one is a simplified variant of your code which uses DOM methods:
var table = document.querySelector('table tbody');
var cells_in_row = 300, rows_total = 300;

var start = performance.now();
fill_table_1();
console.log('using DOM methods: ' + (performance.now() - start).toFixed(2) + 'ms');
table.innerHTML = '<tbody></tbody>';

function fill_table_1() {
    var frag = document.createDocumentFragment();
    var injectFragInTbody = function() {
        table.replaceChild(frag, table.firstElementChild);
    };
    var getNewEmptyRow = function() {
        var row = table.firstElementChild;
        if (!row) {
            row = table.insertRow(0);
            for (var c = 0; c < cells_in_row; c++) row.insertCell(c);
        }
        return row.cloneNode(true);
    };
    for (var r = 0; r < rows_total; r++) {
        var new_row = getNewEmptyRow();
        var cells = new_row.cells;
        for (var c = 0; c < cells_in_row; c++) cells[c].innerHTML = 'A';
        frag.appendChild(new_row.cloneNode(true));
    }
    injectFragInTbody();
    return false;
}
<table><tbody></tbody></table>
The second one prepares an HTML string and puts it into the table's innerHTML:
var table = document.querySelector('table tbody');
var cells_in_row = 300, rows_total = 300;

var start = performance.now();
fill_table_2();
console.log('setting innerHTML: ' + (performance.now() - start).toFixed(2) + 'ms');
table.innerHTML = '<tbody></tbody>';

function fill_table_2() { // setting innerHTML
    var html = '';
    for (var r = 0; r < rows_total; r++) {
        html += '<tr>';
        for (var c = 0; c < cells_in_row; c++) html += '<td>A</td>';
        html += '</tr>';
    }
    table.innerHTML = html;
    return false;
}
<table><tbody></tbody></table>
I believe you'll come to some conclusions.
I've got two thoughts for you.
1: If you want to know which parts of your code are (relatively) slow, you can do very simple performance testing using the technique described here. I didn't read all of the code sample you gave, but you can add those performance tests yourself and check which operations take the most time.
2: What I know of JavaScript and the browser is that changing the DOM is an expensive operation; you don't want to change the DOM too many times. What you can do instead is build up a set of changes and then apply them all with one DOM change. This may make your code less tidy, but that's often the tradeoff when you want high performance.
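A rough sketch of that "one DOM change" idea applied to a table like yours (the record shape and the buildRow helper are invented for illustration):

```javascript
// Build every row as a plain string first (cheap), then touch the DOM once.
// The record shape and buildRow helper are assumptions for illustration.
function buildRow(record) {
    return '<tr><td>' + record.name + '</td><td>' + record.cost + '</td></tr>';
}

function buildTableBody(records) {
    var html = '';
    for (var i = 0; i < records.length; i++) {
        html += buildRow(records[i]); // pure string work, no DOM involved
    }
    return html;
}

// The single DOM change (browser side):
// document.querySelector('#myTable tbody').innerHTML = buildTableBody(records);
```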
Let me know how this works out for you.
You should start by refactoring your code into multiple functions to make it a bit more readable. Make sure that you separate DOM manipulation functions from data processing functions. Ideally, create a class and get those variables out of your function; this way you can access them with this.
Then you should execute each data-processing function in a web worker, so you're sure that your UI won't get blocked by the processing. You won't be able to access this in a web worker, so you will have to limit workers to pure "input/output" operations.
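A minimal sketch of that split (the worker file name and message shape are illustrative assumptions, not an existing API):

```javascript
// csv-worker.js – pure "input/output"; no DOM access is available in a worker:
function decodeCsv(csv) {
    // turn each record into a table row string, entirely off the UI thread
    return csv.split('\n').map(function (rec) {
        return '<tr><td>' + rec.split('|').join('</td><td>') + '</td></tr>';
    }).join('');
}
// self.onmessage = function (e) { self.postMessage({ html: decodeCsv(e.data.csv) }); };

// main.js – browser side (sketch): the UI thread only does one innerHTML update.
// var worker = new Worker('csv-worker.js');
// worker.onmessage = function (e) {
//     document.getElementById('tBodyData').innerHTML = e.data.html; // one DOM touch
// };
// worker.postMessage({ csv: rawCsvText });
```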
You can also use promises instead of homemade callbacks. They make the code a bit more readable and, honestly, easier to debug. You can do some cool stuff like:
this.processThis('hello').then((resultThis) => {
    this.processThat(resultThis).then((resultThat) => {
        this.displayUI(resultThat);
    }, (error) => {
        this.errorController.show(error); // processThat error
    });
}, (error) => {
    this.errorController.show(error); // processThis error
});
Good luck!

Set HTML table rows all at once

I currently have code that runs through every row of an HTML table and updates it with a different row.
Here is the code:
function sort(index) {
    var rows = $table.find('tbody tr');
    var a = $table.find('tbody tr');
    // Only sort if it has not been sorted yet and the index is not the same
    if (sortedIndex === index) {
        for (var i = 0; i < rows.length; i++) {
            a[i].outerHTML = rows[(rows.length - i) - 1].outerHTML;
        }
        toggleSorted();
    } else {
        sortedIndex = index;
        rows.sort(naturalSort);
        for (var i = 0; i < rows.length; i++) {
            a[i].outerHTML = rows[i].outerHTML;
        }
        sortedDown = true;
    }
    $(`#${tableId}`).trigger('repaginate');
}
What I am trying to do is, instead of going through every single row in the for loop and setting a[i].outerHTML = rows[i].outerHTML;, to set all of the rows at once. Currently it takes about 1.5 seconds to set them, and that is very slow... The only issue is I cannot seem to find a way to do this. Is this actually possible? (It takes 1.5 seconds on the large data sets I am working with.)
Since the rows are the same, just reordered, you can .append them in the new order:
var $tBody = $table.find('tbody');
var $rows = $tBody.find('tr');
if (sortedIndex === index) {
    toggleSorted();
    $tBody.append($rows.get().reverse());
}
else {
    sortedIndex = index;
    sortedDown = true;
    $tBody.append($rows.get().sort(naturalSort));
}
Here's a fiddle that demonstrates the above: http://jsfiddle.net/k4u45Lnn/1/
Unfortunately, the only way to "set all of your rows at once" is to loop through all of your rows and perform an operation on each one. There may be libraries with methods that make it look like you're doing the operation on all rows in one shot, but ultimately, if you want to edit every element in a set, you need to iterate through the set and carry out the action on each element, since HTML doesn't really provide any way to logically "link" the attributes of your elements.

Live search in html table using cache

I used this code to live-search a table:
$search.keyup(function() {
    var value = this.value.toLowerCase().trim();
    $.each($table.find('tbody'), function() {
        if ($(this).text().toLowerCase().indexOf(value) == -1)
            $(this).hide();
        else
            $(this).show();
    });
});
It worked great, but now I have 1000+ rows in the table and searching is really slow on older machines, so I created an array from the table:
function getTableData() {
    var data = [];
    $table.find('tbody').each(function (rowIndex, r) {
        var cols = [];
        $(this).find('td').each(function (colIndex, c) {
            cols.push(c.textContent);
        });
        data.push(cols);
    });
    return data;
}
And I don't know how to search this array the same way to get the tbody index.
Update 3: uses the cache and a hidden results box.
Here we hide the table and create a results list on the fly. Please note the HTML markup is very simple and not strictly correct, but it demonstrates the speed improvement of not touching the DOM while we cycle through our memory cache. The next improvement would be to cache the HTML of each row in our cache object, so there would be no need to fetch the row from the DOM when it matches.
http://jsfiddle.net/hh6aed45/5/
var $search = $('input[type="text"]');
var $table = $('table');
var $result = document.getElementById("search");
var i = 1;

function uuid() {
    i++;
    return "A" + i;
}

$search.keyup(function() {
    var value = this.value.toLowerCase().trim(), html = "";
    for (var row in cache) {
        // the getElementById call could be sped up by caching the HTML in our memory cache!
        if (cache[row].indexOf(value) !== -1) html += document.getElementById(row).innerHTML;
    }
    $result.innerHTML = html;
});
function getTableData() {
    var cache = {};
    $table.find('tbody').each(function (rowIndex, r) {
        $(this).find("tr").each(function (rowIndex, r) {
            var cols = [], id = uuid();
            r.id = id;
            $(this).find('td').each(function (colIndex, c) {
                cols.push(c.textContent);
            });
            cache[id] = cols.join(" ").toLowerCase();
        });
    });
    return cache;
}

var cache = getTableData();
Update 2: uses just a memory cache.
http://jsfiddle.net/hh6aed45/3/
This caches the DOM as you requested. I add a uuid as an id so that the search can find each row; I could have cached an object and included a live reference to the DOM node. In fairness, as I said, I would only switch on the rows that filter in, the minority, as that speeds up the loop: no DOM changes occur for the rows that don't match, which out of 10,000 would be many. I would also toggle a CSS class for "in" and "out", which would allow a live list of all "in" items to be cached and hidden before the search starts. So basically: hide all 10,000 rows, then turn on only what matches, and the loop never touches the DOM unless it has to. That is about as fast as you will get it.
var $search = $('input[type="text"]');
var $table = $('table');
var i = 1;

function uuid() {
    i++;
    return "A" + i;
}

$search.keyup(function() {
    var value = this.value.toLowerCase().trim();
    for (var row in cache) {
        document.getElementById(row).style.display = (cache[row].indexOf(value) === -1) ? "none" : "table-row";
    }
});

function getTableData() {
    var cache = {};
    $table.find('tbody').each(function (rowIndex, r) {
        $(this).find("tr").each(function (rowIndex, r) {
            var cols = [], id = uuid();
            r.id = id;
            $(this).find('td').each(function (colIndex, c) {
                cols.push(c.textContent);
            });
            cache[id] = cols.join(" ").toLowerCase();
        });
    });
    return cache;
}

var cache = getTableData();
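The class-toggle refinement described above (hide everything first, then switch on only the matching minority) could be sketched like this; the matching step is pure, so the loop never touches the DOM for non-matching rows (class names are my choice):

```javascript
// Pure helper: given the cache ({ rowId: "lower-cased row text" }), decide
// which rows should be visible for a query. No DOM work happens here.
function matchingRowIds(cache, query) {
    var value = query.toLowerCase().trim();
    var ids = [];
    for (var row in cache) {
        if (cache[row].indexOf(value) !== -1) ids.push(row);
    }
    return ids;
}

// Browser side (sketch):
// tbody.classList.add('all-hidden');                      // one change hides everything
// matchingRowIds(cache, value).forEach(function (id) {
//     document.getElementById(id).classList.add('match'); // switch on the minority
// });
```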
Update 1: uses native raw JavaScript with optimizations.
For this kind of thing plain JavaScript is a must. If you're talking about performance, first things first: throw jQuery out of your loop. Cache the DOM first; don't keep querying it.
Speed is an illusion when you're building programs.
For example, you could do the lookup after two or three letters, not one. You could also change the code so that a row that does not match is skipped, not switched into view.
So, step one: turn all rows on (find all rows that are off; cache the view, as views are live). Step two: remove rows, rather than turning rows on and off as you go; that will speed up the loop. Here is a sped-up version walking the DOM; if we had 10,000 rows to test we could keep improving it.
Here is a starting point; from here we apply the steps suggested above:
var $search = $('input[type="text"]'),
    $table = $('table'),
    $body = $table.find('tbody'),
    $rows = $body.find('tr'); // this view of the dom is live!

$search.keyup(function() {
    var value = this.value.toLowerCase().trim();
    // loops are costly, native is ultra fast
    for (var i = 0, l = $rows.length; i < l; i++) {
        $rows[i].style.display = ($rows[i].innerHTML.toLowerCase().indexOf(value) === -1) ? "none" : "table-row";
    }
});
http://jsfiddle.net/hh6aed45/
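The "lookup after two or three letters" suggestion above can be bolted onto any of these versions; the threshold of 2 characters is an arbitrary choice:

```javascript
// Sketch: skip the row loop entirely until the query is long enough.
var MIN_QUERY_LENGTH = 2; // arbitrary threshold

function shouldSearch(query) {
    return query.trim().length >= MIN_QUERY_LENGTH;
}

// In the keyup handler (browser side):
// if (!shouldSearch(this.value)) return; // bail out before touching any row
```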

jQuery .prependTo() created table causing browser to "lag"

I have a problem I am not able to fix, quite difficult to explain but I will do my best.
Basically I have created a web application (in CodeIgniter) that takes data from an array coming from a JSON encoding and adds rows to a table using the jQuery .prependTo() method in the success function.
Every row of the table contains different elements from the database (coming from the JSON), depending on the value of the i counter.
The code is as follows (I cut the part with the <tr> and <td> content; it's just styling):
$.ajax({
    type: "get",
    async: false,
    ....
    ....
    success: function(data) {
        var i;
        var item_cost = new Array();
        var item_name = new Array();
        var item_code = new Array();
        var item_interno = new Array();
        for (i = 0; i < data.length; i++) {
            item_cost[i] = data[i].cost;
            item_name[i] = data[i].name;
            item_code[i] = data[i].code;
            item_interno[i] = data[i].codiceinterno;
            var newTr = // creates the <tr>
            newTr.html('// creates the <td>')
            newTr.prependTo("#myTable");
        }
    },
I am sorry if it is a bit unclear; if you need me to update the code because I missed something important, let me know and I will do it, but this code alone should explain my problem.
The application works beautifully if there are just a small number of rows in the database (for example, i = 300). They are rendered correctly and the browser doesn't slow down. But when I work with i = 4000 rows, the browser starts acting slow, as if the JavaScript code is too heavy to render, and I get lag while scrolling the HTML table or typing values into the input boxes inside the table (to later update it by clicking a button). This is my problem: I am quite sure I'm doing something wrong that is using up too much memory, as I tested this also on very powerful computers. Even totally disabling my CSS won't do the trick.
Thanks for any help you can give me, it would be really appreciated.
The problem
You are using a lot of function calls inside a loop. That takes a lot of juice, and the more items you have, the slower it gets.
To solve that, we need to reduce the number of function calls.
My suggestion
Working with native JavaScript will save on performance here. So I suggest you use string concatenation instead of jQuery's DOM manipulation methods.
Let's rework your loop. Since you want your data in descending order, we need to reverse the loop:
for (i = data.length - 1; i >= 0; i--)
Then simple string concatenation builds a tr. For that you need a var outside the loop:
var myHTML = ''; //That's the one!
var i;
var item_cost = new Array();
var item_name = new Array();
var item_code = new Array();
var item_interno = new Array();
And build the tr with +=. Using a single line would make it faster, but less readable:
for (i = data.length - 1; i >= 0; i--) {
    item_cost[i] = data[i].cost;
    item_name[i] = data[i].name;
    item_code[i] = data[i].code;
    item_interno[i] = data[i].codiceinterno;
    myHTML += '<tr>';
    myHTML += '<td>' + yourData + '</td>'; // add your data here
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '</tr>';
}
Then, prepend it to your table:
var myHTML = '';
var i;
var item_cost = new Array();
var item_name = new Array();
var item_code = new Array();
var item_interno = new Array();
for (i = data.length - 1; i >= 0; i--) {
    item_cost[i] = data[i].cost;
    item_name[i] = data[i].name;
    item_code[i] = data[i].code;
    item_interno[i] = data[i].codiceinterno;
    myHTML += '<tr>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '<td>' + yourData + '</td>';
    myHTML += '</tr>';
}
$('#myTable').prepend(myHTML);
Limitation
From my tests, the string length can be 2^28 characters but cannot be 2^29. That makes a maximum length of approx. 268,435,456 (approx. might not be the best word here, since the real limit is somewhere between 268,435,456 and 536,870,912).
If your data character count is higher than that (but let's be honest, that would be a lot of data), you might have to split your string into 2 variables.
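If you ever did hit that limit, one way out is to build and prepend the rows in chunks; the chunk size and helper below are illustrative:

```javascript
// Sketch: join the row strings in chunks so no single string grows past the
// engine's limit. The chunk size is arbitrary here.
function buildInChunks(rowStrings, rowsPerChunk) {
    var chunks = [];
    for (var i = 0; i < rowStrings.length; i += rowsPerChunk) {
        chunks.push(rowStrings.slice(i, i + rowsPerChunk).join(''));
    }
    return chunks;
}

// Browser side: prepend the chunks in reverse order so the final row order
// matches what a single big prepend would have produced.
// buildInChunks(allRowStrings, 2000).reverse().forEach(function (chunk) {
//     $('#myTable').prepend(chunk);
// });
```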
I know this is not a real answer to your question, but...
jQuery definitely causes the lag; no wonder. But whether you build the table with jQuery or just concatenate HTML, let's agree on this: 4000 rows is definitely a lot of data. Too much data? I would say 200 already is. If we go into the thousands, the question pops up: why? Is there any application-related reason you really need to retrieve such a big number of records at once? Some of them will probably never be read.
Why not try an ajax-based lazy-load approach? Loading 50-record portions at a time would seem more robust IMO, and would definitely be a better UX.
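A minimal sketch of that lazy-load idea (the endpoint, parameter names and helper functions are assumptions, not your actual API):

```javascript
// Sketch: track how many records were already fetched and request the next
// 50-record page on demand. Endpoint and parameter names are made up.
var PAGE_SIZE = 50;
var offset = 0;

function nextPageParams() {
    var params = { offset: offset, limit: PAGE_SIZE };
    offset += PAGE_SIZE; // advance for the following request
    return params;
}

// Browser side (jQuery):
// $(window).on('scroll', function () {
//     if (nearBottom()) {                                  // hypothetical helper
//         $.get('items.php', nextPageParams(), function (rows) {
//             $('#myTable tbody').append(buildRows(rows)); // hypothetical helper
//         });
//     }
// });
```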
My two cents: why not retrieve the rows already composed from the server, instead of making the client do the work of composing them? You could even cache the result. The only downside is that you'll have more traffic, but in the browser it'll go smoother.
UPDATE
To achieve it, you can make your method distinguish whether the input came via ajax, and return only the $tbody or the whole table accordingly. It would be something like:
$tbody = $this->load->view( 'view_with_only_tbody', $data, true );
if ( $this->input->is_ajax_request() ) {
    return $tbody;
} else {
    $data['tbody'] = $tbody;
    $this->load->view('your_view_with_table_with_tbody_as_var', $data);
}
Note: the code above will vary according to your views and controller, whether your AJAX uses JSON, where you place the return, etc. It is just to clarify my point with CI.
Ah, and you'll have to handle the response on the client to append/prepend/replace the body of the table.

Object - An Object in the Object - An array of those Objects

I'm new to javascript so let me just say that right up front.
A web site I frequent has 50 or so items, with details about each item, in a table. Each table row contains several td cells. Some rows hold similar types of things, like USB drives. I want to capture each row so that I can group and reorder them to suit my tastes.
I have this object:
function vnlItemOnPage(){
    this.Category = "unknown";
    this.ItemClass = "vnlDefaultClass";
    this.ItemBlock = {};
}
This represents one row.
What I've been trying to figure out is how to capture the block of HTML <tr>stuff</tr> and save it into this.ItemBlock.
That part is pretty easy:
vnlItemOnPage.ItemBlock = element.getElementsByClassName('className')[0]
?
That seems pretty straightforward. Am I missing something?
This is the part where I am stuck:
There'll be 50 of them so I need an array of vnlItemOnPage?
vnlAllItems = ???
var vnlAllItems = [vnlItemOnPage]?
And, how would I add to the array and delete from the array? I probably won't delete from the array; if that is complicated, don't bother with it.
Once I capture the <tr> HTML, I can just append it to a table element like so:
myTable.appendChild(vnlAllItems[0].ItemBlock);
Correct?
I'm open to any suggestions if you think I'm approaching this from the wrong direction. Performance is not a big issue - at least right now. Later I may try to conflate several pages for a couple hundred items.
Thanks for your assistance!
[edit]
Perhaps the second part of the question is so basic it's hard to believe I don't know the answer.
The array could be: var vnlAllItems = []
And then it is just:
var row1 = new vnlItemOnPage;
vnlAllItems.push(row1);

var row2 = new vnlItemOnPage;
row2.ItemBlock = element.getElementsByClassName('className')[0];
I'd like to close the question but I hate to do that without something about handling the array.
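For removing, I guess plain push and splice should cover it (a small untested sketch):

```javascript
// Sketch: plain Array methods handle add/remove.
// vnlItemOnPage is the constructor from above.
function vnlItemOnPage() {
    this.Category = "unknown";
    this.ItemClass = "vnlDefaultClass";
    this.ItemBlock = {};
}

var vnlAllItems = [];
vnlAllItems.push(new vnlItemOnPage()); // add
vnlAllItems.push(new vnlItemOnPage());
vnlAllItems.splice(0, 1);              // delete: remove 1 item at index 0
// vnlAllItems.length is now 1
```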
JQuery is your friend here.
This will give you the inner HTML for the first row in the body of your desired table:
var rowHtml = $('table#id-of-desired-table tbody tr:first').html() ;
To get the outer HTML, you need a jQuery extension method:
jQuery.fn.outerHTML = function() {
    return $('<div>').append( this.eq(0).clone() ).html();
};
Usage is simple:
var rowHtml = $('table#id-of-desired-table tbody tr:first').outerHTML();
Enjoy!
Not sure if it is what you are looking for, but if I wanted to manipulate table rows I would store:
The row's whole HTML (<td>1</td>...<td>n</td>) as a string, so I can quickly reconstruct the row
For each row, the actual cell values [1, ..., n], so I can do some manipulation of the values (sorting)
To get the row as HTML you can use:
var rowHtml = element.getElementsByClassName('className')[0].innerHTML;
To get the array of cell values you can use:
var cells = [];
var cellElements = element.getElementsByClassName('className')[0].cells;
for (var i = 0; i < cellElements.length; i++) {
    cells.push(cellElements[i].innerText);
}
So the object to store all this would look something like:
function vnlItemOnPage(){
    this.Category = "unknown";
    this.ItemClass = "vnlDefaultClass";
    this.RowHtml = "";
    this.RowCells = [];
}
