I created the following JS function:
function csvDecode(csvRecordsList)
{
var cel;
var chk;
var chkACB;
var chkAF;
var chkAMR;
var chkAN;
var csvField;
var csvFieldLen;
var csvFieldsList;
var csvRow;
var csvRowLen = csvRecordsList.length;
var frag = document.createDocumentFragment();
var injectFragInTbody = function () {tblbody.replaceChild(frag, tblbody.firstElementChild);};
var isFirstRec;
var len;
var newEmbtyRow;
var objCells;
var parReEx = new RegExp(myCsvParag, 'ig');
var tblbody;
var tblCount = 0;
var tgtTblBodyID;
for (csvRow = 0; csvRow < csvRowLen; csvRow++)
{
if (csvRecordsList[csvRow].startsWith(myTBodySep))
{
if (frag.childElementCount > 0)
{
injectFragInTbody();
}
tgtTblBodyID = csvRecordsList[csvRow].split(myTBodySep)[1];
newEmbtyRow = getNewEmptyRow(tgtTblBodyID);
objCells = newEmbtyRow.cells;
len = newEmbtyRow.querySelectorAll('input')[0].parentNode.cellIndex; // Finds the index of the cell that holds the first input (checkbox or button)
tblbody = getElById(tgtTblBodyID);
chkAF = toBool(tblbody.dataset.acceptfiles);
chkACB = toBool(tblbody.dataset.acceptcheckboxes);
chkAN = toBool(tblbody.dataset.acceptmultiplerows);
tblCount++;
continue;
}
csvRecordsList[csvRow] = csvRecordsList[csvRow].replace(parReEx, myInnerHTMLParag); // Replaces every paragraph symbol ¶ used in the db.csv file with the <br> tag needed in the HTML content of table cells, so line breaks can be used inside cells
csvFieldsList = csvRecordsList[csvRow].split(myEndOfFld);
csvFieldLen = csvFieldsList.length;
for (csvField = 0; csvField < csvFieldLen; csvField++)
{
cel = chkAN ? csvField + 1 : csvField;
if (chkAF && cel === 1) {objCells[cel].innerHTML = makeFileLink(csvFieldsList[csvField]);}
else if (chkACB && cel === len) {objCells[cel].firstChild.checked = toBool(csvFieldsList[csvField]);}
else {objCells[cel].innerHTML = csvFieldsList[csvField];}
}
frag.appendChild(newEmbtyRow.cloneNode(true));
}
injectFragInTbody();
var recNum = getElById(tgtTblBodyID).childElementCount;
customizeHtmlTitle();
return csvRow - tblCount + ' (di cui ' + recNum + ' record di documenti)'; // Italian: "(of which N document records)"
}
More than 90% of records could contain file names that have to be processed by the following makeFileLink function:
function makeFileLink(fname)
{
return ['<a href="', dirDocSan, fname, '" target="', previewWinName, '" title="Apri il file allegato: ', fname, '" >', fname, '</a>'].join('');
}
It aims to decode a record list from a special type of *.db.csv file (a comma-separated-values file where the commas are replaced by another symbol, hard-coded in the variable myEndOfFld). This special *.db.csv file is created by another function I wrote and is just a plain text file.
The record list to decode and append to the HTML tables is passed to the function through its lone parameter, csvRecordsList.
The CSV file holds data coming from several HTML tables.
The tables differ in number of rows and columns and in some of the data types they contain (file names, numbers, strings, dates, checkbox values).
Some tables have just one row; others accept multiple rows.
A row of data has the following basic structure:
data field content 1|data field content 2|data field content 3|etc...
Once decoded by my algorithm, a field will be rendered correctly into its HTML td element even if it contains multiple paragraphs. In fact, the <br> tag is added where needed by the code:
csvRecordsList[csvRow].replace(parReEx, myInnerHTMLParag)
which replaces every occurrence of the character I chose to represent the paragraph symbol, hard-coded in the variable myCsvParag.
It isn't possible to know at programming time the number of records to load into each table, the number of records loaded from the CSV file, the number of fields in each record, or which table fields will contain data and which will be empty: in the same record some fields may contain data while others are empty. Everything has to be discovered at runtime.
Inside the special CSV file each table is separated from the next by a row which contains just a string with the following pattern: myTBodySep + tablebodyid, where myTBodySep = "targettbodydatatable" is just a hard-coded string of my choice.
tablebodyid is just a placeholder for the id of the target table tbody element to insert new records into, for example: tBodyDataCars, tBodyDataAnimals, etc.
So when the first for loop finds in csvRecordsList a string starting with the value of myTBodySep, it reads the tablebodyid from that same row: this becomes the new tbody id to target when injecting the records that follow.
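For illustration, a record list in this format might look like the following (the '|' separator and the example data are made up, not taken from the real file):
var csvRecordsList = [
    'targettbodydatatabletBodyDataCars',   // separator row: targets the tbody with id tBodyDataCars
    'manual.pdf|Fiat|Panda|2015|true',     // a record: file name, strings, a checkbox value
    'invoice.pdf|Ford|Fiesta|2017|false',
    'targettbodydatatabletBodyDataAnimals',
    'Rex|dog|2019'
];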
Each table is archived in the CSV file. The first for loop scans the CSV record list from the file and the second for loop prepares what is needed to fill the targeted table with data.
The above code works well but it is a little slow: loading about 300 records from the CSV file into the HTML tables takes a bit more than 2.5 seconds on a computer with 2 GB of RAM and a Pentium Core 2 4300 dual-core at 1800 MHz, yet if I comment out the row that updates the DOM the function needs less than 0.1 seconds. So IMHO the bottleneck is the fragment and DOM-manipulating part of the code.
My aim and hope is to optimize the speed of the above code without losing functionalities.
Notice that I'm targeting just modern browsers and I don't care about older, non-standards-compliant browsers... I feel sorry for them...
Any suggestions?
Thanks in advance.
Edit 16.02.2018
I don't know if it is useful, but lately I've noticed that if the data is loaded from the browser's sessionStorage, the load-and-render time is more or less halved. Strangely, it is the exact same function that loads the data from both the file and sessionStorage.
I don't understand this different behavior, considering that the data is exactly the same and in both cases it is assigned to a variable handled by the function itself before the performance timing starts.
Edit 18.02.2018
Number of rows varies depending on the target table: from 1 to 1000 (could be even more in particular cases).
Number of columns depends on the target table: from 10 to 18-20.
In fact, building the table with DOM manipulation is way slower than a simple innerHTML update of the table element.
If you rewrite your code to prepare an HTML string and put it into the table's innerHTML, you will see a significant performance boost.
Browsers are optimized to parse the text/html they receive from the server, as that is their main purpose. DOM manipulation via JS is secondary, so it is not as optimized.
I've made a simple benchmark for you.
Let's make a table of 300x300 and fill its 90,000 cells with 'A'.
There are two functions.
The first one is a simplified variant of your code which uses DOM methods:
var table = document.querySelector('table tbody');
var cells_in_row = 300, rows_total = 300;
var start = performance.now();
fill_table_1();
console.log('using DOM methods: ' + (performance.now() - start).toFixed(2) + 'ms');
table.innerHTML = ''; // note: "table" actually holds the tbody, so this just empties it between runs
function fill_table_1() {
var frag = document.createDocumentFragment();
var injectFragInTbody = function() {
table.replaceChild(frag, table.firstElementChild)
}
var getNewEmptyRow = function() {
var row = table.firstElementChild;
if (!row) {
row = table.insertRow(0);
for (var c = 0; c < cells_in_row; c++) row.insertCell(c);
}
return row.cloneNode(true);
}
for (var r = 0; r < rows_total; r++) {
var new_row = getNewEmptyRow();
var cells = new_row.cells;
for (var c = 0; c < cells_in_row; c++) cells[c].innerHTML = 'A';
frag.appendChild(new_row.cloneNode(true));
}
injectFragInTbody();
return false;
}
<table><tbody></tbody></table>
The second one prepares an HTML string and puts it into the table's innerHTML:
var table = document.querySelector('table tbody');
var cells_in_row = 300, rows_total = 300;
var start = performance.now();
fill_table_2();
console.log('setting innerHTML: ' + (performance.now() - start).toFixed(2) + 'ms');
table.innerHTML = ''; // note: "table" actually holds the tbody, so this just empties it between runs
function fill_table_2() { // setting innerHTML
var html = '';
for (var r = 0; r < rows_total; r++) {
html += '<tr>';
for (var c = 0; c < cells_in_row; c++) html += '<td>A</td>';
html += '</tr>';
}
table.innerHTML = html;
return false;
}
<table><tbody></tbody></table>
I believe you'll come to some conclusions.
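Applied to the function in the question, the row-building part could be sketched roughly like this (a sketch only, not a drop-in replacement: the checkbox handling is left out, and records/fileLinkCol are assumed inputs; makeFileLink is the question's own helper):
function buildTbodyHtml(records, fileLinkCol) {
    // records: an array of already-split field arrays;
    // fileLinkCol: assumed index of the file-name column, or -1 if the table has none
    var html = '';
    for (var r = 0; r < records.length; r++) {
        html += '<tr>';
        for (var f = 0; f < records[r].length; f++) {
            var cell = (f === fileLinkCol) ? makeFileLink(records[r][f]) : records[r][f];
            html += '<td>' + cell + '</td>';
        }
        html += '</tr>';
    }
    return html; // then assign once per table: getElById(tgtTblBodyID).innerHTML = html;
}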
I've got two thoughts for you.
1: If you want to know which parts of your code are (relatively) slow, you can do very simple performance testing using the technique described here. I didn't read all of the code sample you gave, but you can add those performance tests yourself and check which operations take the most time.
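For example, a pair of performance.now() calls around a suspect section is enough to get millisecond timings:
var t0 = performance.now();
// ... the code section you want to measure ...
var t1 = performance.now();
console.log('section took ' + (t1 - t0).toFixed(2) + ' ms');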
2: What I know of JavaScript and the browser is that changing the DOM is an expensive operation; you don't want to change the DOM too many times. What you can do instead is build up a set of changes and then apply all of them with one DOM change. This may make your code less nice, but that's often the tradeoff when you want high performance.
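As a sketch of that idea (the #list element is made up for the example):
var parts = [];
for (var i = 0; i < 1000; i++) {
    parts.push('<li>item ' + i + '</li>');
}
document.getElementById('list').innerHTML = parts.join(''); // one single DOM change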
Let me know how this works out for you.
You should start by refactoring your code into multiple functions to make it a bit more readable. Make sure that you separate DOM-manipulation functions from data-processing functions. Ideally, create a class and get those variables out of your function; this way you can access them with this.
Then you should execute each data-processing function in a web worker, so you're sure that your UI won't get blocked by the processing. You won't be able to access this in a web worker, so you will have to limit it to pure "input/output" operations.
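A minimal sketch of that split (worker.js is a hypothetical file name, #myTable is a stand-in for the target table, and the '|' separator mirrors the question's field separator):
// main.js: hand the raw records to a worker, render on reply
var worker = new Worker('worker.js'); // hypothetical file
worker.onmessage = function (e) {
    document.querySelector('#myTable tbody').innerHTML = e.data; // single DOM write
};
worker.postMessage(csvRecordsList); // the array from the question

// worker.js: pure input/output, no DOM access available here
onmessage = function (e) {
    var rows = e.data.map(function (rec) {
        return '<tr><td>' + rec.split('|').join('</td><td>') + '</td></tr>';
    });
    postMessage(rows.join(''));
};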
You can also use promises instead of homemade callbacks. They make the code a bit more readable, and honestly easier to debug. You can do some cool stuff like:
this.processThis('hello').then((resultThis) => {
this.processThat(resultThis).then((resultThat) => {
this.displayUI(resultThat);
}, (error) => {
this.errorController.show(error); //processThat error
});
}, (error) => {
this.errorController.show(error); //processThis error
});
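Note that returning the next promise from each callback lets you flatten that nesting into a single chain, with one handler catching an error from either step:
this.processThis('hello')
    .then((resultThis) => this.processThat(resultThis))
    .then((resultThat) => this.displayUI(resultThat))
    .catch((error) => this.errorController.show(error));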
Good luck!
Related
I have a Google spreadsheet and a Google document. The document is a report which gets filled by the spreadsheet. The spreadsheet also defines what goes into the report. Therefore I have a script which gathers a bunch of placeholders depending on values in the document.
After all the placeholders have been inserted in the document (there are a couple of pages before that) it looks kind of like this:
{{header1.1}}
{{text1.1}}//this is already a couple lines of text
{{table1.1}}
{{table.dir}}
{{blob1.1}}
{{blob.dir}}
I already have a script which inserts all the text parts, and I have set up a script which should be capable of writing the tables at the correct position. So far I can replace the {{header1.1}}; defining it as a heading works, but everything after header1.1 becomes a heading too.
I've been at this problem for quite a while and it's always one step forward, one step back. Also, this is my first question after a couple of years of just reading on Stack Overflow. I'd appreciate it if someone could help.
function myUeberschriftenboi() {
  var doc = DocumentApp.openById('someID');
  console.log(doc.getName());
  var body = doc.getBody();
  // formats
  const plain3style = {};
  plain3style[DocumentApp.Attribute.HEADING] = DocumentApp.ParagraphHeading.HEADING3;
  var lvl2array = ["{{header1.1}}", "{{header1.2}}"];
  var fill2array = ["Energy", "Energyflow"];
  var lvl2count = 1;
  for (var j = 0; j < lvl2array.length; j++) {
    var seek = body.findText(lvl2array[j]);
    if (seek != null) {
      body.replaceText(lvl2array[j], "1.1." + lvl2count + " " + fill2array[j] + "\n");
      var seek2 = body.findText("1." + lvl2count + " " + fill2array[j]);
      seek2.getElement().getParent().getChild().setAttributes(plain3style);
      lvl2count++;
    }
  }
}
OK, I know there are similar questions out there to mine, but so far I have yet to find any answers that work for me. What I am trying to do is gather data from an entire HTML table on the web (https://www.sports-reference.com/cbb/schools/indiana/2022-gamelogs.html) and then parse it/transfer it to a range in my Google Sheet. The code below is probably the closest thing I've found so far, because at least it doesn't error out, but it will only find one string or value, not the whole table. I've found other answers that use XmlService.parse; however, that doesn't work for me, I believe because the HTML has issues that it can't parse. Does anyone have an idea of how to edit what I have below, or a whole new idea that may work for this website?
function SAMPLE() {
const url="http://www.sports-reference.com/cbb/schools/indiana/2022-gamelogs.html#sgl-basic?"
// Get all the static HTML text of the website
const res = UrlFetchApp.fetch(url, {muteHttpExceptions: true}).getContentText();
// Find the index of the string of the parameter we are searching for
var index = res.search("td class");
// create a substring to only get the right number values ignoring all the HTML tags and classes
var sub = res.substring(index + 92, index + 102);
Logger.log(sub);
return sub;
}
I understand that I can use IMPORTHTML natively in a Google Sheet, and that's what I'm currently doing. However, I am doing this for over 350 webpage tables, iterating through each one to load it and then copying the values to another sheet. Apps Script bogs down quite a bit when it is repeatedly waiting on Sheets to load an IMPORTHTML call, grab some data, and do it all over again on another URL. I apologize for any formatting issues in this post or things I've done wrong; this is my first time posting here.
Edit: OK, I've found a method that works, but it's still much slower than I would like, because it uses the Drive API to create a document with the HTML data and then parses it and creates an array from there. The Drive.Files.insert line is the most time-consuming part. Anyone have an idea of how to make this quicker? It may not seem that slow right now, but when I need to do this 350 times, it adds up.
function parseTablesFromHTML() {
var html = UrlFetchApp.fetch("https://www.sports-reference.com/cbb/schools/indiana/2022-gamelogs.html");
var docId = Drive.Files.insert(
{ title: "temporalDocument", mimeType: MimeType.GOOGLE_DOCS },
html.getBlob()
).id;
var tables = DocumentApp.openById(docId)
.getBody()
.getTables();
var res = tables.map(function(table) {
var values = [];
for (var row = 0; row < table.getNumRows(); row++) {
var temp = [];
var cols = table.getRow(row);
for (var col = 0; col < cols.getNumCells(); col++) {
temp.push(cols.getCell(col).getText());
}
values.push(temp);
}
return values;
});
Drive.Files.remove(docId);
var range = SpreadsheetApp.getActive().getSheetByName("Test").getRange(3, 6, res[0].length, res[0][0].length);
range.setValues(res[0]);
SpreadsheetApp.flush();
}
Solution by formula
Try
=importhtml(url,"table",1)
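With the URL from the question, that would be:
=IMPORTHTML("https://www.sports-reference.com/cbb/schools/indiana/2022-gamelogs.html","table",1)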
Other solution by script
function importTableHTML() {
var url = 'https://www.sports-reference.com/cbb/schools/indiana/2022-gamelogs.html'
var html = '<table' + UrlFetchApp.fetch(url, {muteHttpExceptions: true}).getContentText().replace(/(\r\n|\n|\r|\t| )/gm,"").match(/(?<=\<table).*(?=\<\/table)/g) + '</table>';
var trs = [...html.matchAll(/<tr[\s\S\w]+?<\/tr>/g)];
var data = [];
for (var i=0;i<trs.length;i++){
var tds = [...trs[i][0].matchAll(/<(td|th)[\s\S\w]+?<\/(td|th)>/g)];
var prov = [];
for (var j=0;j<tds.length;j++){
var donnee = tds[j][0].match(/(?<=\>).*(?=\<\/)/g)[0];
prov.push(stripTags(donnee));
}
data.push(prov);
}
return(data);
}
function stripTags(body) {
var regex = /(<([^>]+)>)/ig;
return body.replace(regex,"");
}
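To get the result into the sheet, the returned array can then be written in a single call. A possible wrapper (the sheet name and anchor cell are taken from the question's code; note that setValues requires every row to have the same number of cells):
function writeTableToSheet() {
  var data = importTableHTML();
  var sheet = SpreadsheetApp.getActive().getSheetByName("Test"); // assumed sheet name
  sheet.getRange(3, 6, data.length, data[0].length).setValues(data); // same anchor as the question
}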
I have a script to create notes on cells based on their value, but the process is very slow and my sheet has 15,000 rows. Is it possible to reduce the delay by optimizing the script?
PS: I use a spreadsheet with French parameters.
function InsertCellsNotes(){
var plage = SpreadsheetApp.getActiveSpreadsheet().getSelection().getActiveRange();
var Notes = plage.getValues();
var NB_lines = Notes.length;
for (var i=1; i<NB_lines+1; i++){ // add +1!
var myCell = plage.getCell(i, 1);
var cellValue = Notes[i-1];
if (cellValue == "#N/A" || ""){ }
else { myCell.setNote(cellValue); }
}
}
An example of the sheet : https://docs.google.com/spreadsheets/d/1lu7dEoyO2NDHV4phXeh8DAAkbBuQG5EQWwMA6SJDP1A/edit?usp=sharing
Explanation:
Two tricks:
Avoid unnecessary API calls when possible. You are iteratively using methods that interact with the spreadsheet file, and that causes extreme delays. Read the best practices.
When you use null as an argument for setNote, no note is set. We can take advantage of this and construct an array by using the map method. Namely, if the value is #N/A or blank "", assign null to the element; otherwise take the value of the cell:
var notes = rng.getValues().flat().map(v => [v == "#N/A" || v == "" ? null : v]);
This will allow you to get rid of the for loops and also create an array that can be used directly with the setNotes function: rng.setNotes(notes);
Solution - active range:
Select (with your mouse) a particular range and insert notes (depending on the condition):
function InsertCellsNotes(){
var rng = SpreadsheetApp.getActiveRange();
var notes = rng.getValues().flat().map(v => [v == "#N/A" || v == "" ? null : v]);
rng.setNotes(notes);
}
Solution - predefined range:
This is a more static approach. You define a particular sheet, "Sheet1", and for all the cells in column B (down to the last row with content in the sheet) you insert notes (depending on the condition):
function InsertCellsNotes(){
var plage = SpreadsheetApp.getActive().getSheetByName("Sheet1");
var rng = plage.getRange(1,2,plage.getLastRow(),1);
var notes = rng.getValues().flat().map(v => [v == "#N/A" || v == "" ? null : v]);
rng.setNotes(notes);
}
I've just written my first Google Apps Script, ported from VBA, which formats a column of customer order information (thanks to you all for your direction).
Description:
The code identifies state codes by their "-" prefix, then combines the following first name with a last name (if one exists). It then writes "Order complete" where the last name would have been. Finally, it inserts a necessary blank cell if there is no gap between the orders (see image below).
Problem:
The issue is processing time. It cannot handle longer columns of data. I am warned that
Method Range.getValue is heavily used by the script.
Existing Optimizations:
Per the responses to this question, I've tried to keep as many variables outside the loop as possible and also improved my if statements. @MuhammadGelbana suggests calling the Range.getValue method just once and moving around with its value... but I don't understand how this would/could work.
Code:
function format() {
var ss = SpreadsheetApp.getActiveSpreadsheet();
var s = ss.getActiveSheet();
var lastRow = s.getRange("A:A").getLastRow();
var row, range1, cellValue, dash, offset1, offset2, offset3;
//loop through all cells in column A
for (row = 0; row < lastRow; row++) {
range1 = s.getRange(row + 1, 1);
//if cell substring is number, skip it
//because substring cannot process numbers
cellValue = range1.getValue();
if (typeof cellValue === 'number') {continue;};
dash = cellValue.substring(0, 1);
offset1 = range1.offset(1, 0).getValue();
offset2 = range1.offset(2, 0).getValue();
offset3 = range1.offset(3, 0).getValue();
//if -, then merge offset cells 1 and 2
//and enter "Order complete" in offset cell 2.
if (dash === "-") {
range1.offset(1, 0).setValue(offset1 + " " + offset2);
//Translate
range1.offset(2, 0).setValue("Order complete");
};
//The real slow part...
//if - and offset 3 is not blank, then INSERT CELL
if (dash === "-" && offset3) {
//select from three rows down to last
//move selection one more row down (down 4 rows total)
s.getRange(row + 1, 1, lastRow).offset(3, 0).moveTo(range1.offset(4, 0));
};
};
}
Formatting Update:
For guidance on formatting the output with font or background colors, check this follow-up question here. Hopefully you can benefit from the advice these pros gave me :)
Issue:
Usage of .getValue() and .setValue() in a loop resulting in increased processing time.
Documentation excerpts:
Minimize calls to services:
Anything you can accomplish within Google Apps Script itself will be much faster than making calls that need to fetch data from Google's servers or an external server, such as requests to Spreadsheets, Docs, Sites, Translate, UrlFetch, and so on.
Look ahead caching:
Google Apps Script already has some built-in optimization, such as using look-ahead caching to retrieve what a script is likely to get and write caching to save what is likely to be set.
Minimize "number" of read/writes:
You can write scripts to take maximum advantage of the built-in caching, by minimizing the number of reads and writes.
Avoid alternating read/write:
Alternating read and write commands is slow
Use arrays:
To speed up a script, read all data into an array with one command, perform any operations on the data in the array, and write the data out with one command.
Slow script example:
/**
* Really Slow script example
* Get values from A1:D2
* Set values to A3:D4
*/
function slowScriptLikeVBA(){
const ss = SpreadsheetApp.getActive();
const sh = ss.getActiveSheet();
//get A1:D2 and set it 2 rows down
for(var row = 1; row <= 2; row++){
for(var col = 1; col <= 4; col++){
var sourceCellRange = sh.getRange(row, col, 1, 1);
var targetCellRange = sh.getRange(row + 2, col, 1, 1);
var sourceCellValue = sourceCellRange.getValue();//1 read call per loop
targetCellRange.setValue(sourceCellValue);//1 write call per loop
}
}
}
Notice that two calls are made per loop (the Spreadsheet ss, Sheet sh, and range calls are excluded; only the expensive get/set value calls are counted). With the two loops, 8 read calls and 8 write calls are made in this example for a simple copy-paste of a 2x4 array.
In addition, notice that read and write calls alternate, making the "look-ahead" caching ineffective.
Total calls to services: 16
Time taken: ~5+ seconds
Fast script example:
/**
* Fast script example
* Get values from A1:D2
* Set values to A3:D4
*/
function fastScript(){
const ss = SpreadsheetApp.getActive();
const sh = ss.getActiveSheet();
//get A1:D2 and set it 2 rows down
var sourceRange = sh.getRange("A1:D2");
var targetRange = sh.getRange("A3:D4");
var sourceValues = sourceRange.getValues();//1 read call in total
//modify `sourceValues` if needed
//sourceValues looks like this two dimensional array:
//[//outer array containing rows array
// ["A1","B1","C1",D1], //row1(inner) array containing column element values
// ["A2","B2","C2",D2],
//]
//#see https://stackoverflow.com/questions/63720612
targetRange.setValues(sourceValues);//1 write call in total
}
Total calls to services: 2
Time taken: ~0.2 seconds
References:
Best practices
What does the range method getValues() return and setValues() accept?
Using methods like .getValue() and .moveTo() can be very expensive in execution time. An alternative approach is to use a batch operation where you get all the column values, iterate across the data reshaping it as required, and then write to the sheet in one call. When you run your script you may have noticed the following warning:
The script uses a method which is considered expensive. Each
invocation generates a time consuming call to a remote server. That
may have critical impact on the execution time of the script,
especially on large data. If performance is an issue for the script,
you should consider using another method, e.g. Range.getValues().
Using .getValues() and .setValues() your script can be rewritten as:
function format() {
var ss = SpreadsheetApp.getActiveSpreadsheet();
var s = ss.getActiveSheet();
var lastRow = s.getLastRow(); // more efficient way to get last row
var row;
var data = s.getRange("A:A").getValues(); // gets a [][] of all values in the column
var output = []; // we are going to build a [][] to output result
//loop through all cells in column A
for (row = 0; row < lastRow; row++) {
var cellValue = data[row][0];
var dash = false;
if (typeof cellValue === 'string') {
dash = cellValue.substring(0, 1);
} else { // if a number copy to our output array
output.push([cellValue]);
}
// if a dash
if (dash === "-") {
var name = (data[(row+1)][0]+" "+data[(row+2)][0]).trim(); // build name
output.push([cellValue]); // add row -state
output.push([name]); // add row name
output.push(["Order complete"]); // row order complete
output.push([""]); // add blank row
row++; // jump an extra row to speed things up
}
}
s.clear(); // clear all existing data on sheet
// if you need other data in sheet then could
// s.deleteColumn(1);
// s.insertColumns(1);
// set the values we've made in our output [][] array
s.getRange(1, 1, output.length).setValues(output);
}
Testing your script with 20 rows of data revealed it took 4.415 seconds to execute; the above code completes in 0.019 seconds.
I have a problem I am not able to fix. It is quite difficult to explain, but I will do my best.
Basically, I have created a web application (in CodeIgniter) that takes data from an array coming from a JSON response and adds rows to a table by using the jQuery .prependTo() method in the success function.
Every row of the table contains different elements from the database (coming from the JSON), depending on the value of the i counter.
The code is as follows (I cut the part about the <tr> and <td> content; it's just styling):
$.ajax({
type: "get",
async: false,
....
....
success: function(data) {
var i;
var item_cost = new Array();
var item_name = new Array();
var item_code = new Array();
var item_interno = new Array();
for(i = 0; i < data.length; i++) {
item_cost[i] = data[i].cost;
item_name[i] = data[i].name;
item_code[i] = data[i].code;
item_interno[i] = data[i].codiceinterno;
var newTr = // creates the <tr>
newTr.html('//creates the <td>')
newTr.prependTo("#myTable");
}
},
I am sorry if it is a bit unclear; if you need me to update the code because I missed something important, let me know and I will do it, but this code alone should explain my problem.
The application works beautifully if there are just a small number of rows in the database (for example, i = 300). They are shown and rendered correctly and the browser does not slow down. But when I work with i = 4000 rows, the browser starts acting slow, as if the JavaScript code is too heavy to render, and I get lag while scrolling the HTML table or typing values into the input boxes inside the table (used later to update records by clicking a button). This is my problem: I am quite sure I'm doing something wrong that is eating up too much memory, as I tested this on very powerful computers too. Even totally disabling my CSS won't do the trick.
Thanks for any help you can give me, it would be really appreciated.
The problem
You are using a lot of function calls inside a loop. That takes a lot of juice, and the more items you have, the slower it gets.
To solve that, we need to reduce the number of function calls.
My suggestion
Working with native JavaScript will save on performance here. So I suggest you use string concatenation instead of jQuery's DOM manipulation methods.
Let's rework your loop. Since you want your data in descending order, we need to reverse the loop (starting from data.length - 1, so the first iteration doesn't read past the end of the array):
for(i = data.length - 1; i >= 0; i--)
Then use simple string concatenation to build each tr. For that you need a variable declared outside the loop:
var myHTML = ''; //That's the one!
var i;
var item_cost = new Array();
var item_name = new Array();
var item_code = new Array();
var item_interno = new Array();
And build the tr with +=. (Using a single statement per row would be faster, but less readable.)
for(i = data.length - 1; i >= 0; i--) {
item_cost[i] = data[i].cost;
item_name[i] = data[i].name;
item_code[i] = data[i].code;
item_interno[i] = data[i].codiceinterno;
myHTML += '<tr>';
myHTML += '<td>'+yourData+'</td>';//add those data here
myHTML += '<td>'+yourData+'</td>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '</tr>';
}
Then, prepend it to your table :
var myHTML = '';
var i;
var item_cost = new Array();
var item_name = new Array();
var item_code = new Array();
var item_interno = new Array();
for(i = data.length - 1; i >= 0; i--) {
item_cost[i] = data[i].cost;
item_name[i] = data[i].name;
item_code[i] = data[i].code;
item_interno[i] = data[i].codiceinterno;
myHTML += '<tr>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '<td>'+yourData+'</td>';
myHTML += '</tr>';
}
$('#myTable').prepend(myHTML);
Limitation
From my tests, the string length can be 2^28 characters but cannot be 2^29. That makes a maximum length of approximately 268,435,456 characters (approximately might not be the best word here, since the actual limit lies somewhere between 268,435,456 and 536,870,912).
If your data's character count is higher than that (but let's be honest, that would be a lot of data), you might have to split your string into two variables.
I know this is not a real answer to your question, but...
jQuery definitely causes the lag, no wonder. But whether you choose jQuery to build the table or just concatenate HTML, let's agree on this: 4000 rows is definitely a lot of data. Too much data? I would say 200 already is. If we go into the thousands, the question pops up: why? Is there any application-related reason you really need to retrieve such a big number of records at once? Some of them will probably never be read.
Why not try an AJAX-based lazy-load approach? Loading 50-record portions at a time would seem more robust IMO, and would definitely be a better UX.
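A sketch of that approach (the endpoint and its offset/limit parameters are hypothetical; the field names come from the question):
var offset = 0, pageSize = 50;
function loadMoreRows() {
    $.getJSON('items/page', { offset: offset, limit: pageSize }, function (data) {
        var html = '';
        for (var i = 0; i < data.length; i++) {
            html += '<tr><td>' + data[i].code + '</td><td>' + data[i].name + '</td><td>' + data[i].cost + '</td></tr>';
        }
        $('#myTable').append(html);
        offset += data.length; // the next call fetches the following portion
    });
}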
My two cents: why not retrieve the rows already composed from the server, instead of making the client do the work of composing them? You could even cache the result. The only drawback is that you'll have more traffic, but in the browser it'll go smoother.
UPDATE
To achieve it, you can make your method distinguish between AJAX input or not, and return only the tbody or the whole table accordingly. It would be something like:
$tbody = $this->load->view( 'view_with_only_tbody', $data, true );
if ( $this->input->is_ajax_request() ) {
return $tbody;
} else {
$data['tbody'] = $tbody;
$this->load->view('your_view_with_table_with_tbody_as_var', $data);
}
Note: The code above will vary according to your views and controller, whether you have JSON in your AJAX or not, where you write the return, etc. It is just to clarify my point with CI.
Ah! And you'll have to manage the response on the client to append/prepend/replace the body of the table.
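For instance, on the client it could boil down to something like this (the URL is a placeholder for your controller method):
$.get('controller/method', function (tbodyHtml) {
    $('#myTable tbody').replaceWith(tbodyHtml); // swap in the tbody returned by CI
});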