I currently have code that runs through every row of an HTML table and replaces each row with a different one. Here is the code:
function sort(index) {
    var rows = $table.find('tbody tr');
    var a = $table.find('tbody tr');
    // Only sort if it has not been sorted yet and the index is not the same
    if (sortedIndex === index) {
        for (var i = 0; i < rows.length; i++) {
            a[i].outerHTML = rows[(rows.length - i) - 1].outerHTML;
        }
        toggleSorted();
    } else {
        sortedIndex = index;
        rows.sort(naturalSort);
        for (var i = 0; i < rows.length; i++) {
            a[i].outerHTML = rows[i].outerHTML;
        }
        sortedDown = true;
    }
    $('#' + tableId).trigger('repaginate');
}
What I am trying to do is, instead of going through every single row in the for loop and setting a[i].outerHTML = rows[i].outerHTML;, to set all of the rows at once. Currently it takes about 1.5 seconds to set them, which is very slow, and I cannot seem to find a way to do this. Is this actually possible? (It takes 1.5 seconds on large data sets, which is what I am working with.)
Since the rows are the same, just reordered, you can .append them in the new order:
var $tBody = $table.find('tbody');
var $rows = $tBody.find('tr');

if (sortedIndex === index) {
    toggleSorted();
    $tBody.append($rows.get().reverse());
} else {
    sortedIndex = index;
    sortedDown = true;
    $tBody.append($rows.get().sort(naturalSort));
}
Here's a fiddle that demonstrates the above: http://jsfiddle.net/k4u45Lnn/1/
Unfortunately, the only way to "set all of your rows at once" is to loop through all of your rows and perform an operation on each one. There may be libraries with methods that make it look like you're doing the operation on all of your rows in one shot, but ultimately, if you want to edit every element in a set, you need to iterate through the set and carry out the action on each element, since HTML doesn't provide any way to logically link the attributes of your elements.
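That said, you can reduce the cost of all those per-row writes by batching them: moving the rows into a DocumentFragment and appending the fragment once means the live document is only touched a single time. A minimal plain-DOM sketch, assuming tbody is the table body element and sortedRows is an array holding its <tr> elements in the desired order (both names are placeholders, not from the code above):

var fragment = document.createDocumentFragment();
// appendChild moves each existing row out of the table and into the fragment,
// so there is no cloning and no outerHTML rewriting
sortedRows.forEach(function (row) {
    fragment.appendChild(row);
});
// a single append, so the browser reflows once instead of once per row
tbody.appendChild(fragment);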
Related
I have an HTML table. The first row contains a checkbox, and there is a JavaScript method attached to the checkbox's change event. If the checkbox is checked, the code adds a few rows to the table and fills them. If the checkbox is unchecked, the code removes all rows but the first one (the one that contains the checkbox).
The first part of the code works fine: the rows are properly added.
I have an issue with the second part. Here is my code:
if (checkboxValue) {
    // Add first row
    var tr1 = document.createElement("tr");
    var td1_1 = document.createElement("td");
    ....
    tr1.appendChild(td1_1);
    var td1_2 = document.createElement("td");
    ...
    tr1.appendChild(td1_2);
    table.appendChild(tr1);
    // Add second row
    var tr2 = document.createElement("tr");
    var td2_1 = document.createElement("td");
    ...
    tr2.appendChild(td2_1);
    var td2_2 = document.createElement("td");
    ...
    tr2.appendChild(td2_2);
    table.appendChild(tr2);
} else {
    // Remove all rows but the first
    var rows = table.getElementsByTagName("tr");
    var nbrRows = rows.length;
    if (nbrRows > 1) {
        for (var i = 1; i < nbrRows; i++) {
            var row = rows[i];
            row.parentNode.removeChild(row);
        }
    }
}
The issue always comes from rows[2] being undefined. I have no idea why!
If, instead of using removeChild, I write row.innerHTML = "", I get the visual effect I am looking for (all rows gone), but this is inelegant, since the table then contains several empty rows (their number increasing every time I check/uncheck the checkbox).
A clue? Thank you very much for your time!
Don't use a for loop to remove DOM elements like this. The problem is that rows is a live collection, meaning it updates every time you remove elements from the DOM. As a result, the i counter shifts and you eventually hit an undefined spot.
Instead, use a while loop. For example, to remove all rows except the first one you could do:
var rows = table.getElementsByTagName("tr");
while (rows.length > 1) {
    rows[1].parentNode.removeChild(rows[1]);
}
Also note that the method is getElementsByTagName, without an s on the end.
UPD: Or iterate backwards if you like for loops better:
var rows = table.getElementsByTagName("tr");
for (var i = rows.length - 1; i > 0; i--) {
    rows[i].parentNode.removeChild(rows[i]);
}
Demo: https://jsfiddle.net/9y03co6w/
You remove a row from the live collection you are iterating over. This is always a bad idea and probably the reason for your error.
Solution: start iterating from the end instead of the beginning.
Try to replace this line:
var rows = table.getElementsByTagNames("tr");
with:
var rows = table.find("tr");
(Note that .find() is a jQuery method, so this assumes table is a jQuery object; the correct plain-DOM name is getElementsByTagName, without the extra s.)
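If jQuery is already on the page, another option is to work with a static jQuery set instead of a live collection, so removals don't shift the set you are iterating. A minimal sketch, assuming table is wrapped as a jQuery object:

// .slice(1) skips the first row (the one holding the checkbox),
// and .remove() detaches all the remaining rows in one call
$(table).find("tr").slice(1).remove();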
I used this code to live-search the table:
$search.keyup(function() {
    var value = this.value.toLowerCase().trim();
    $.each($table.find('tbody'), function() {
        if ($(this).text().toLowerCase().indexOf(value) == -1)
            $(this).hide();
        else
            $(this).show();
    });
});
It worked great, but now I have 1000+ rows in the table and searching is really slow on older machines, so I created an array from the table:
function getTableData() {
    var data = [];
    $table.find('tbody').each(function (rowIndex, r) {
        var cols = [];
        $(this).find('td').each(function (colIndex, c) {
            cols.push(c.textContent);
        });
        data.push(cols);
    });
    return data;
}
And I don't know how to search this array the same way so I can get back the tbody index.
Update 3: uses a cache and a hidden results box.
Here we hide the table and build a results list on the fly. Note that the HTML markup is very simple and not strictly valid, but it demonstrates the speed improvement of not touching the DOM while we cycle through our in-memory cache. The next improvement would be to cache the HTML of each row in our cache object as well, so there would be no need to fetch the row from the DOM when it matches.
http://jsfiddle.net/hh6aed45/5/
var $search = $('input[type="text"]');
var $table = $('table');
var $result = document.getElementById("search");
var i = 1;

function uuid() {
    i++;
    return "A" + i;
}

$search.keyup(function() {
    var value = this.value.toLowerCase().trim(), html = "";
    for (var row in cache) {
        // the getElementById can be sped up by caching the html in our memory cache!
        if (cache[row].indexOf(value) !== -1) html += document.getElementById(row).innerHTML;
    }
    $result.innerHTML = html;
});

function getTableData() {
    var cache = {};
    $table.find('tbody').each(function (rowIndex, r) {
        $(this).find("tr").each(function (rowIndex, r) {
            var cols = [], id = uuid();
            r.id = id;
            $(this).find('td').each(function (colIndex, c) {
                cols.push(c.textContent);
            });
            cache[id] = cols.join(" ").toLowerCase();
        });
    });
    return cache;
}

var cache = getTableData();
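As a sketch of the improvement mentioned above (caching each row's HTML alongside its text, so the match loop never reads from the DOM at all), reusing the same $search, $table, $result and uuid from the snippet; the shape of the cache entries here is my own assumption:

function getTableData() {
    var cache = {};
    $table.find('tbody tr').each(function (rowIndex, r) {
        var id = uuid();
        r.id = id;
        // store both the searchable text and the ready-to-use markup
        cache[id] = {
            text: $(r).text().toLowerCase(),
            html: r.innerHTML
        };
    });
    return cache;
}

$search.keyup(function () {
    var value = this.value.toLowerCase().trim(), html = "";
    for (var row in cache) {
        // no getElementById lookup needed: the markup is already in memory
        if (cache[row].text.indexOf(value) !== -1) html += cache[row].html;
    }
    $result.innerHTML = html;
});

var cache = getTableData();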
Update 2: uses just a memory cache.
http://jsfiddle.net/hh6aed45/3/
This caches the DOM as you requested. I add a UUID as an id so that the search can find each row; I could have cached an object and included a live reference to the DOM instead. In fairness, as I said, I would only switch on the rows that filter in, switching in the minority, as that speeds up the loop: no DOM changes occur for the rows that don't match, which out of 10,000 would be many. I would also toggle a CSS class for "in" and "out", which would allow a live list of all "in" items to be cached and hidden before the search starts. So, in short: hide all 10,000 rows, then turn on only what matches, so the loop doesn't touch the DOM unless it has to. That is about as fast as you will get it.
var $search = $('input[type="text"]');
var $table = $('table');
var i = 1;

function uuid() {
    i++;
    return "A" + i;
}

$search.keyup(function() {
    var value = this.value.toLowerCase().trim();
    for (var row in cache) {
        document.getElementById(row).style.display = (cache[row].indexOf(value) === -1) ? "none" : "table-row";
    }
});

function getTableData() {
    var cache = {};
    $table.find('tbody').each(function (rowIndex, r) {
        $(this).find("tr").each(function (rowIndex, r) {
            var cols = [], id = uuid();
            r.id = id;
            $(this).find('td').each(function (colIndex, c) {
                cols.push(c.textContent);
            });
            cache[id] = cols.join(" ").toLowerCase();
        });
    });
    return cache;
}

var cache = getTableData();
Update 1: uses native raw JavaScript with optimizations.
For this kind of thing plain JavaScript is a must. If you're talking about performance, first things first: throw jQuery out of your loop and cache the DOM up front rather than querying it repeatedly.
Speed is an illusion when you're building programs. For example, you could do the lookup only after two or three letters, not one. You could also change the code so that a row that does not match is skipped rather than switched into view.
So, step one: turn all rows on (find all rows that are off; cache the view, since these collections are live). Step two: remove rows rather than turning them on and off as you go; that will speed up the loop. Here is a sped-up version walking the DOM; if we had 10,000 rows to test, we could keep improving it.
Here is a starting point; from here we use the steps suggested above:
var $search = $('input[type="text"]'),
    $table = $('table'),
    $body = $table.find('tbody'),
    $rows = $body.find('tr'); // this view of the dom is live!

$search.keyup(function() {
    var value = this.value.toLowerCase().trim();
    // loops are costly, native is ultra fast
    for (var i = 0, l = $rows.length; i < l; i++) {
        $rows[i].style.display = ($rows[i].innerHTML.toLowerCase().indexOf(value) === -1) ? "none" : "table-row";
    }
});
http://jsfiddle.net/hh6aed45/
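And as a sketch of the "lookup only after two or three letters" suggestion above, applied to the same loop (the threshold of 3 is arbitrary):

$search.keyup(function () {
    var value = this.value.toLowerCase().trim();
    // skip the work entirely until the query is long enough to be selective;
    // one- or two-letter queries match almost every row anyway
    if (value.length < 3) return;
    for (var i = 0, l = $rows.length; i < l; i++) {
        $rows[i].style.display = ($rows[i].innerHTML.toLowerCase().indexOf(value) === -1) ? "none" : "table-row";
    }
});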
I am trying to write a script in Google Apps Script that takes cell data from one sheet and copies it to another sheet, both grabbing only certain columns to display on the second sheet and applying a condition based on the values in the cells of a certain column. Here is what I have so far:
function onMyEdit() {
    var myMaster = SpreadsheetApp.openById("xxxxx");
    var masterSheet = myMaster.setActiveSheet(myMaster.getSheets()[0]);
    var myNames = SpreadsheetApp.openById("xxxxx");
    var namesSheet = myNames.setActiveSheet(myNames.getSheets()[0]);
    var row1 = masterSheet.getRange(1, 1, masterSheet.getLastRow(), 1);
    var rowV = row1.getValues();
    var firstArray = masterSheet.getDataRange().getValues();
    var dataList = [];
    for (var i = 1; i < rowV.length; i++) {
        dataList.push(firstArray[i][0]);
        dataList.push(firstArray[i][1]);
        dataList.push(firstArray[i][2]);
        dataList.push(firstArray[i][3]);
    }
    for (var j = 0; j < rowV.length - 1; j++) {
        namesSheet.getRange(2, j + 1, 1, 1).setValue(dataList[j]);
    }
}
So as of now it only works on one row, starting from the second row (to allow for column headers). And I suppose when I want to grab rows conditionally based on cell data, I will use an if statement for the condition inside the for loop, but I want the data to copy to the next available row in both sheets. I suppose I'd use something like
getLastRow() + 1
or something like that. I need this code to be as efficient as possible because of the amount of data and its purpose. I am pretty new to programming, so please explain in detail, and thanks again.
I'm not sure I understood exactly what you wanted to do, but from what I understood this code snippet should give you a better way to start with...
(I added a few comments to explain in the code itself.)
function onMyEdit() {
    var myMaster = SpreadsheetApp.openById("MasterSheet ID");
    var masterSheet = myMaster.getSheets()[0]; // get the first sheet
    var myNames = SpreadsheetApp.openById("NamesSheet ID");
    var namesSheet = myNames.getSheets()[0]; // get the first sheet
    var firstArray = masterSheet.getDataRange().getValues();
    var dataList = [];
    for (var r = 1; r < firstArray.length; r++) { // iterate the first column of masterSheet
        if (firstArray[r][0] == 'some condition') {
            // if the value in the first column matches the condition, push the
            // second-column cell into the new array (change this to grab whatever you want)
            dataList.push([firstArray[r][1]]);
        }
    }
    Logger.log(dataList);
    if (dataList.length > 0) {
        namesSheet.getRange(1, namesSheet.getLastColumn() + 1, dataList.length, 1).setValues(dataList); // copy data into a column after the last column
    }
}
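If you would rather append the filtered rows below whatever namesSheet already contains (the getLastRow() + 1 idea from the question) instead of writing them into a new column, the last block can be swapped for something like this, keeping the same one-column dataList:

if (dataList.length > 0) {
    // one batched setValues call, starting on the first empty row
    namesSheet.getRange(namesSheet.getLastRow() + 1, 1, dataList.length, 1).setValues(dataList);
}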
I have a table:
<table>
    <tr><td>1</td></tr>
    <tr><td>2</td></tr>
    <tr><td>3</td></tr>
</table>
And an array that tells where every row should go: [{index: 2}, {index: 1}, {index: 0}] (the new first row is the original row at index 2, the new second row is the one at index 1, and the new third row is the one at index 0).
Here is my approach.
// create a new temporary tbody outside the DOM (similar to jQuery's detach)
var tbody_tmp = document.createElement('tbody');

// iterate through the array in the order of the new table
for (var i = 0, j = data.length; i < j; i++) {
    // make a copy of the current row (otherwise appendChild removes the row from
    // the rows collection and messes up the index lookup; there's got to be a better way)
    var row = rows[data[i].index].cloneNode(true);
    tbody_tmp.appendChild(row);
    // reset the index to reflect the new table order (optional, outside the sample)
    data[i].index = i;
}

// Note that tbody is a jQuery object
tbody.parent()[0].replaceChild(tbody_tmp, tbody[0]);
However, the cloning approach is slow: with 10,000+ records it takes ~1200 ms. Furthermore, a jQuery-less approach would be preferable.
Posting this in case someone else finds it simple enough for their needs (with fewer than 1,000 rows).
After hours of restless thinking, I've ended up with the following. If this sample isn't enough, I've written a whole blog post explaining the logic behind it: http://anuary.com/57/sorting-large-tables-with-javascript.
// Will use this to re-attach the tbody object.
var table = tbody.parent();

// Detach the tbody to prevent unnecessary overhead related
// to the browser environment.
tbody = tbody.detach();

// Convert the NodeList into an array.
rows = Array.prototype.slice.call(rows, 0);

var last_row = rows[data[data.length - 1].index];

// Knowing the last element in the table, move all the other elements before it
// in the order they appear in the data map.
for (var i = 0, j = data.length - 1; i < j; i++) {
    tbody[0].insertBefore(rows[data[i].index], last_row);
    // Restore the index.
    data[i].index = i;
}

// Restore the index.
data[data.length - 1].index = data.length - 1;

table.append(tbody);
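For what it's worth, the cloning from the question isn't strictly necessary even without the insertBefore trick: appendChild moves a node rather than copying it, so you can rebuild the order inside a detached DocumentFragment and swap it in with a single append. A plain-DOM sketch, assuming rows has already been converted to a real array (so moving elements doesn't shift it) and tbody here is the plain DOM element rather than a jQuery object:

var fragment = document.createDocumentFragment();
for (var i = 0, j = data.length; i < j; i++) {
    // appendChild moves the existing row into the fragment, no clone needed
    fragment.appendChild(rows[data[i].index]);
    // reset the index to reflect the new table order
    data[i].index = i;
}
// one reflow when the reordered rows go back in
tbody.appendChild(fragment);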
I'm trying to create a table using the following code, but it's not working. Please point out where I'm going wrong.
var i, j;

function cell(ih) {
    var tcell = document.createElement('td');
    tcell.innerHTML = ih;
    return tcell;
}

mutable = document.createElement('table');
for (i = 0; i < 10; i++) {
    row1 = document.createElement('tr');
    for (j = 0; j < 10; j++) {
        row1.appendChild(cell(j));
    }
    mutable.appendChild(row1);
    document.write(mutable);
}
You have several problems; the first two are the big ones, the second two are a matter of style and of risk of clashes with other code:
You are trying to document.write HTMLElement nodes. document.write only deals with strings. Grab a container element (e.g. with document.getElementById) and append to it.
You are trying to document.write the entire table every time you add a row to it. Append the table once it is complete, not every time you go through the loop.
You are using globals all over the place; learn to love the var keyword.
row1 is a poor variable name for the row you are operating on, which usually isn't the first.
Use document.body.appendChild(...) instead of document.write(...).
You can do it by changing your script to use document.body.appendChild(mutable) after your nested for loop:
var i, j;

function cell(ih) {
    var tcell = document.createElement('td');
    tcell.innerHTML = ih;
    return tcell;
}

mutable = document.createElement('table');
for (i = 0; i < 10; i++) {
    row1 = document.createElement('tr');
    for (j = 0; j < 10; j++) {
        row1.appendChild(cell(j));
    }
    mutable.appendChild(row1);
}
document.body.appendChild(mutable);
This appends the entire mutable table object you've created to the <body> element of your page. You can see it working here.
Also note that most of the time in markup you don't see the <tbody> element, but it is good practice to append it as a child of the <table> and as the parent of all of your rows. So your script should look more like this:
function cell(ih) {
    var tcell = document.createElement('td');
    tcell.innerHTML = ih; // I would suggest you use document.createTextNode(ih) instead
    return tcell;
}

function appendTable() { // you now have to call this function at some point
    mutable = document.createElement("table");
    var tBody = mutable.appendChild( document.createElement("tbody") ); // technique using "fluid interfaces"
    for (var i = 0; i < 10; i++) {
        var row1 = tBody.appendChild( document.createElement('tr') ); // fluid interface call again
        for (var j = 0; j < 10; j++) {
            row1.appendChild(cell(j));
        }
    }
    document.body.appendChild(mutable);
}
I made some style changes to your script, and I would suggest making even more, but as far as correctness, it should work.