I'm cloning some table rows and would like to increment the IDs of their child nodes. I've tried doing this by:
var rowID = document.getElementById('RowTbl').rows.length / 2;
var NameRowCopy= document.getElementById('NameRow' + rowID).cloneNode(true);
NameRowCopy.getElementByID('txtName1').setAttribute('id', 'txtName' + (rowID + 1));
So I get the latest set of rows (2 are created each time), and divide by 2 to get the current row ID. I then store the latest table row into a var, ready for cloning, and attempt to set the child node IDs from there.
Unfortunately, Firebug errors out silently, so I'm left clueless as to what's happening. What is going on?
There's no such thing as "errors out silently" if you're running in Firebug - it'll either error (verbosely), or do the right thing (silently).
In this case, you don't appear to be adding the cloned node back into the DOM.
Also, getElementById() is a document method, you can't use it on any arbitrary HTML element.
You should be getting an error when you attempt to invoke element.getElementById() telling you that there's no such method.
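For reference, here is a minimal sketch of the corrected approach, assuming the markup follows the naming scheme in the question (row N contains an input with id txtNameN) and that the clone should simply be appended to the same tbody:
var rowID = document.getElementById('RowTbl').rows.length / 2;
var nameRow = document.getElementById('NameRow' + rowID);
var nameRowCopy = nameRow.cloneNode(true);

// bump the ids on the clone; querySelector works on elements, unlike getElementById
nameRowCopy.id = 'NameRow' + (rowID + 1);
nameRowCopy.querySelector('#txtName' + rowID).id = 'txtName' + (rowID + 1);

// the clone only becomes visible once it is inserted back into the DOM
nameRow.parentNode.appendChild(nameRowCopy);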
As you can see in the code below, I have an HTML snippet in a string which is then parsed into a document. But when I iterate through the resulting collection and append each item, only every other item actually gets moved; items get lost from the collection along the way, and I end up with only two items remaining in it.
JSFiddle - https://jsfiddle.net/hari7190/xpvt214o/878427/
var response = "<option value='Bobs Dock'>Bobs Dock</option><option
value='Johns Dock'>Johns Dock</option><option value='Mikes Dock'>Mikes Dock</option><option value='Jacob Dock'>Jacob Dock</option><option value='Foo Dock'>Foo Dock</option>"
parser = new DOMParser();
doc = parser.parseFromString("<select>" + response + "</select>", "text/html");
var options = doc.getElementsByTagName("option");
for(i=0; i<options.length; i++){
console.log(i, options.length);
document.getElementById("list").append(options[i]);
}
The results of the iteration go like this:
index - 1 options.length - 3
index - 2 options.length - 2
Can anyone explain why this implementation behaves this way?
Please note: I understand how to achieve the result (like here), but I am looking for an explanation why the above code behaves this way.
The answer lies in the append function.
You see, the append function (which is based on the more standard appendChild) adds the node to its new parent and detaches it from its current parent.
See documentation:
The Node.appendChild() method adds a node to the end of the list of children of a specified parent node. If the given child is a reference to an existing node in the document, appendChild() moves it from its current position to the new position (there is no requirement to remove the node from its parent node before appending it to some other node).
Now, the current parent of these option nodes is your select node.
Now, your options variable: it might look like it's an array, but it's actually an HTMLCollection. And, again, from the documentation:
An HTMLCollection in the HTML DOM is live; it is automatically updated when the underlying document is changed.
So in your loop, each time you append an element it is removed from the select node; the live HTMLCollection in options sees this and becomes shorter, but since i keeps incrementing, you skip over an element.
To see this for yourself you can add a debugger; line in your jsfiddle and debug this line by line in the browser.
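For completeness, a sketch of two ways to work around the liveness, using the same variables as the question: either snapshot the collection into a plain array first, or deliberately lean on the fact that it shrinks.
var options = doc.getElementsByTagName("option"); // live HTMLCollection
var list = document.getElementById("list");

// Option 1: copy the live collection into a static array before moving anything
var snapshot = Array.prototype.slice.call(options);
for (var i = 0; i < snapshot.length; i++) {
    list.appendChild(snapshot[i]);
}

// Option 2: rely on the liveness and always move the first remaining option
// while (options.length) {
//     list.appendChild(options[0]);
// }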
I am trying to access an element in jQuery using a function input. To clarify, here is an example of what I have tried that isn't working:
function openReceivedMessage(messageid)
{
// ajax post query that is executing fine
// set row to not be highlighted
var rowid = 'receivedrow' + messageid.toString();
document.getElementById(rowid).style.background-color = "#ffffff";
// other code that is executing fine
}
Essentially, this is for a message inbox page. I have displayed the messages in a table, and since the number of messages changes for each user, I used a loop to populate it. In order to open a message, I hoped to use a jQuery function (titled above), so when the loop populated the table I set each of the subject lines to execute the above function onclick, with the unique messageid passed in as the argument. Upon opening, I want to change other things in the table (which I named, similarly to the message function, things like 'receivedrow#' where # is the messageid).
Would hugely appreciate any help here; I feel like there must be a simple way to create a string (like I did with rowid above) and access the element with that id (in the table there is a row with id="receivedrow#" whose CSS I want to adjust).
I recommend using jQuery to find the element
var rowid = 'receivedrow' + messageid.toString();
var $el = $("#" + rowid);
Then simply operate on $el
$el.css({'background-color':'#FFFFFF'});
If you're having trouble still, I recommend checking that rowid is correct and that the jQuery is then giving you the right element back.
function openReceivedMessage(messageid)
{
var rowid = 'receivedrow' + messageid.toString();
var $el = $('#'+rowid);
$el.css({'background-color':'#FFFFFF'});
// other code that is executing fine
}
Seems what you have posted is not jQuery at all :D
It is plain JavaScript trying to get a DOM element whose id is stored in rowid. It may not be working for one of the following two reasons:
there is no element with that id, which you can easily verify;
background-color is not a property; it's backgroundColor.
Try using this:
document.getElementById(rowid).style.backgroundColor = "#ffffff";
If what you really need is an easy way to get the id onclick, then simply use the this keyword in your row markup:
onclick="openReceivedMessage(this);"
Then access in your function as such:
function openReceivedMessage(row)
{
// set row to not be highlighted
var rowid = 'receivedrow' + row.id;
$('#' + rowid).css({'background-color':'#ffffff'});
}
I am trying to work out some performance problems with some JavaScript I've been working on for a few days. One of the pieces of the functions is below:
var removeAddress = function(pk) {
var startTime = new Date();
jQuery('.add_driver select.primary_address:has(option[value=' + pk + ']:selected)').each(function(c, o) {
console.log("Shouldn't get here yet...");
showInputs(o);
});
console.log('removeAddress1: ', (new Date() - startTime) / 1000);
jQuery('.add_driver select.primary_address option[value=' + pk + ']').remove();
console.log('removeAddress2: ', (new Date() - startTime) / 1000);
};
This code is quite peppy in Firefox:
removeAddress1: 0.004
removeAddress2: 0.023
But in IE8 it is another story:
LOG: removeAddress1: 0.203
LOG: removeAddress2: 0.547
The form in question is a 20-person input form with first name, last name, and 5 address fields. I've also put in a drop-down for selecting other addresses already existing in the form (.primary_address). This code removes an address from the primary address select boxes.
I'm trying to nail down why this is taking so long, and the only thing which stands out is the option[value=????] section. This was the most practical way to find the elements in question, so I ran with it. Is there something about these two selectors which is causing IE to lose its lunch?
The option element is always temperamental. Perhaps it's simpler to just get all the SELECT elements and then query their values. The selected OPTION will always give its value property to the SELECT as well. So:
jQuery('.add_driver select.primary_address').filter(function() {
    return this.value === pk; // the select's value comes from its selected option
});
jQuery('.add_driver select.primary_address[value='+pk+']');
Maybe one of those will be faster - not sure if the second will work.
You can likely speed this up a lot by breaking down your uber-selector string.
To start, begin with an id, or even better a cached element. Then get your select elements using .children(). Instead of using the :has selector, use .has(). Methods are generally faster than complex selector syntax because jQ doesn't have to parse a string to figure out what you mean. Then, as Rafael said, you can skip the :selected and just look at the value of the matched selects.
formElem = document.getElementById('formId');
jQuery('.add_driver', formElem)
.children('select.primary_address')
.has('[value=' + pk + ']')
.each(...);
Passing formElem as the second arg uses it as the context for the search so jQ doesn't have to start at the root.
To .remove() the elements either cache the jQuery object from above or chain it on after the .each() so you don't have to reselect everything again.
Maybe precompute $('#formId .add_driver select') outside of the removeAddress function, then reuse that so removeAddress() doesn't have to enumerate so many elements.
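A rough sketch of that idea, assuming the form has id formId as in the previous answer (showInputs and pk come from the question; primarySelects is just an illustrative name, and the cached set would need refreshing if driver rows are added or removed):
// cache the jQuery set once, outside removeAddress
var primarySelects = jQuery('#formId .add_driver select.primary_address');

var removeAddress = function(pk) {
    primarySelects.each(function(i, sel) {
        // the select's value is the value of its selected option
        if (sel.value == pk) {
            showInputs(sel);
        }
        // drop the option from every select, as in the original code
        jQuery(sel).find('option[value="' + pk + '"]').remove();
    });
};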
I am developing a small web-utility that displays some data from some database tables.
I have the utility running fine on FF, Safari, Chrome..., but the memory management on IE8 is horrendous. The largest JSON request will return information to create around 5,000 or so rows in a table within the browser (3 columns in the table).
I'm using jQuery to get the data (via getJSON). To remove the old/existing table, I'm just doing a $('#my_table_tbody').empty(). To add the new info to the table, within the getJSON callback, I am just appending each table row that I am creating to a variable, and then once I have them all, I am using $('#my_table_tbody').append(myVar) to add it to the existing tbody. I don't add the table rows as they are created because that seems to be a lot slower than just adding them all at once.
Does anyone have any recommendation on what someone should do who is trying to add thousands of rows of data to the DOM? I would like to stay away from pagination, but I'm wondering if I don't have a choice.
Update 1
So here is the code I was trying after the innerHTML suggestion:
/* Assuming a div called 'main_area' holds the table */
document.getElementById('main_area').innerHTML = '';
$.getJSON("my_server", {my: JSON, args: are, in: here}, function(j) {
    var mylength = j.length;
    var k = 0;
    var tmpText = '';
    tmpText += /* Add the table, thead stuff, and tbody tags here */;
    for (k = mylength - 1; k >= 0; k--)
    {
        tmpText += '<tr class="' + j[k].row_class + '"><td class="col1_class">' + j[k].col1 + '</td><td class="col2_class">' + j[k].col2 + '</td><td class="col3_class">' + j[k].col3 + '</td></tr>';
    }
    document.getElementById('main_area').innerHTML = tmpText;
});
That is the gist of it. I've also tried using just a $.get request, and having the server send the formatted HTML, and just setting that in the innerHTML (i.e. document.getElementById('main_area').innerHTML = j;).
Thanks for all of the replies. I'm floored with the fact that you all are willing to help.
var tmpText = [];
for (k = mylength - 1; k >= 0; k--)
{
    // build each row as a string and push it onto the array
    tmpText.push('<tr class="' + j[k].row_class + '"><td class="col1_class">' + j[k].col1 + '</td><td class="col2_class">' + j[k].col2 + '</td><td class="col3_class">' + j[k].col3 + '</td></tr>');
}
$('#main_area').html(tmpText.join(''));
You don't need the document.getElementById('main_area').innerHTML = '' line.
The method here is to push into an array, then join and use jQuery's html function to update. This is the fastest method I know. Sorry for the format here - it's my first post and I thought I'd give something back to Stack Overflow.
To get IE to respond quickly you should be creating your table rows as string representations of HTML, appending them to a string variable, and then adding the result to your table's tbody like this:
myTable.myTbody.innerHTML = allThoseRowsAsAString;
It's not a memory issue: 5,000 rows should be trivial. That's got to be far less than one megabyte.
Robusto is right about innerHTML assignment being a better way to go. IE sucks at dynamic DOM creation
Why not form your innerHTML on the server using a JSP and stream it back via Ajax in one shot? It will definitely speed things up, remove complexity from your JavaScript, and delegate markup creation to its proper place.
As Plodder said, IE has big problems when working with the DOM. jQuery best practices recommend building the markup as a single string and appending it just once inside the container.
Besides this, I recently had a similar problem with hierarchical data, around 5,000 records. I asked myself: does the user really need all that information available at a given moment? Then I realized the best I could do was present a "first chunk of data" and then load more data on user demand.
Finally, just one good tool: Dynatrace Ajax (it helps a lot in finding the JavaScript functions that take the most time to run).
Since you are dealing with thousands of data rows I wouldn't call $('#my_table_tbody').empty() and add the new data with new DOM elements. Instead I'd follow the Object Pool Pattern. Thus instead of dropping all the tr's you can reuse existing ones and just populate with the new data.
If your new data set has less rows then the previous one, remove the rest of the rows from the DOM, but keep references to them in some pool so that garbage collector won't destroy them. If your new data set is bigger - just create new tr's on demand.
You can look at the implementation of YUI DataTable, here's the source. IIRC they use this approach to speed up the render time.
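A rough sketch of that row-reuse idea (the id and the render() signature are illustrative; it assumes each record has exactly the three columns from the question):
var tbody = document.getElementById('my_table_tbody');

function render(records) {   // records: array of {col1, col2, col3}
    // grow the pool: add rows (with three cells each) only if we are short
    while (tbody.rows.length < records.length) {
        var tr = tbody.insertRow(-1);
        tr.insertCell(-1);
        tr.insertCell(-1);
        tr.insertCell(-1);
    }
    // shrink: drop surplus rows (or detach and keep them in a pool array)
    while (tbody.rows.length > records.length) {
        tbody.deleteRow(-1);
    }
    // reuse the existing rows by just rewriting their cell contents
    for (var i = 0; i < records.length; i++) {
        var cells = tbody.rows[i].cells;
        cells[0].innerHTML = records[i].col1;
        cells[1].innerHTML = records[i].col2;
        cells[2].innerHTML = records[i].col3;
    }
}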
I have the following problem:
I need to insert N rows after row X. The set of rows I need to insert is passed to my function as a chunk of HTML consisting of TR elements. I also have access to the TR after which I need to insert.
This is slightly different than what I have done before, where I was replacing a TBODY with another TBODY.
The problem I am having is that appendChild requires a single element, but I have to insert a set.
Edit:
Here is the solution:
function appendRows(node, html){
    var temp = document.createElement("div");
    var tbody = node.parentNode;
    var nextSib = node.nextSibling;
    temp.innerHTML = "<table><tbody>" + html;
    var rows = temp.firstChild.firstChild.childNodes;
    while (rows.length) {
        // rows is a live NodeList: insertBefore moves rows[0] out of temp,
        // so the list shrinks on every iteration
        tbody.insertBefore(rows[0], nextSib);
    }
}
see this in action:
http://programmingdrunk.com/test.htm
if(node.nextSibling && node.nextSibling.nodeName.toLowerCase() === "tr")
What's this for? I don't think you need it. If node.nextSibling is null, it doesn't matter. You can pass that to insertBefore and it will act the same as appendChild.
And there's no other element allowed inside a tbody than ‘tr’ anyway.
for(var i = 0; i < rows.length; i++){
tbody.insertBefore(rows[i], node.nextSibling);
This won't work for multiple rows. Once you've done one insertBefore, node.nextSibling will now point to the row you just inserted; you'll end up inserting all your rows in reverse order. You'll need to remember the original nextSibling.
ETA: plus, if ‘rows’ is a live DOM NodeList, every time you insert one of the rows into the new body, it removes it from the old body! Thus, you are destroying the list as you iterate over it. This is a common cause of ‘every other one’ errors: you process item 0 of the list, and in doing so remove it from the list, moving item 1 down into where item 0 was. Next you access the new item 1, which is the original item 2, and the original item 1 never gets seen.
Either make a copy of the list in a normal non-live ‘Array’, or, if you know it's going to be a live list, you can actually use a simpler form of loop:
var parent= node.parentNode;
var next= node.nextSibling;
while (rows.length!=0)
parent.insertBefore(rows[0], next);
I have to insert a set.
Usually when you think about inserting a set of elements at once, you want to be using a DocumentFragment. However, unfortunately, you can't set ‘innerHTML’ on a DocumentFragment, so you'd have to set the rows on a table like above, then move them one by one into a DocumentFragment, then insertBefore the documentFragment. Whilst this could theoretically be faster than appending into the final target table (due to less childNodes list-bashing), in practice by my testing it isn't actually reliably significantly faster.
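Roughly, that fragment-based variant looks like this (reusing node and html from the solution above, and the same detached-table trick to parse the row string):
// parse the rows in a throwaway table, since a fragment can't take innerHTML
var temp = document.createElement("div");
temp.innerHTML = "<table><tbody>" + html + "</tbody></table>";
var parsedRows = temp.firstChild.firstChild.childNodes;

// move the rows one by one into a DocumentFragment
var frag = document.createDocumentFragment();
while (parsedRows.length) {
    frag.appendChild(parsedRows[0]); // live list: each append removes the row from temp
}

// then insert the whole fragment after the reference row in one call
node.parentNode.insertBefore(frag, node.nextSibling);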
Another approach is insertAdjacentHTML, an IE-only extension method. You can't call insertAdjacentHTML on the child of a tbody, unfortunately, in the same way that you can't set tbody.innerHTML, due to the IE bug. You can, however, set it inside a DocumentFragment:
var frag = document.createDocumentFragment();
var div = document.createElement('div');
frag.appendChild(div);
div.insertAdjacentHTML('afterEnd', html);
frag.removeChild(div);
node.parentNode.insertBefore(frag, node.nextSibling);
Unfortunately, insertAdjacentHTML works very slowly in this context for some mysterious reason. The above method is about half the speed of the one-by-one insertBefore for me. :-(
What altCognito said; just be aware of the big innerHTML tbody bug in case you want to replace everything at once using innerHTML, because you're doing a lot of this and the DOM operations turn out to be too slow.
There is an error with IE using innerHTML to insert the rows, so that won't work. Straight from the horse's mouth: http://www.ericvasilik.com/2006/07/code-karma.html
I would recommend just updating the tbody, but appending your code where it belongs in the new structure.
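One way to do that whole-tbody swap without assigning to tbody.innerHTML (which IE won't allow) is sketched below; it parses the new rows inside a detached table and then replaces the old tbody node in a single operation (replaceTbody and rowsHtml are illustrative names):
function replaceTbody(table, rowsHtml) {
    // let the browser parse the row markup inside a throwaway table
    var temp = document.createElement("div");
    temp.innerHTML = "<table><tbody>" + rowsHtml + "</tbody></table>";
    var newTbody = temp.firstChild.firstChild;

    // swap the freshly built tbody in for the existing one
    table.replaceChild(newTbody, table.tBodies[0]);
}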
With your table object you can insert a new row into it.
var tbl = document.getElementById(tableBodyId);
var lastRow = tbl.rows.length;
// if there's no header row in the table, then iteration = lastRow + 1
var iteration = lastRow;
var row = tbl.insertRow(lastRow);
It will insert a row at the end of your table.
This worked a lot better for me when inserting a row anywhere other than the last position:
var rowNode = refTable.insertRow(0);
var cellNode = rowNode.insertCell();
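Putting those two together, a small usage sketch (the id and cell text are placeholders):
var tbl = document.getElementById('myTableBody');  // hypothetical tbody id
var row = tbl.insertRow(tbl.rows.length);          // append a row at the end
var cell = row.insertCell(0);
cell.appendChild(document.createTextNode('new cell content'));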
You'll need to append them one by one.
You need to determine if row X is the last row:
// determine if lastRow...
// if not, determine row_after_row_x
for (var i = 0; i < n_rows.length; i++) {
    if (lastRow) {
        tbodyObj.appendChild(n_rows[i]);
    } else {
        tbodyObj.insertBefore(n_rows[i], row_after_row_x);
    }
}