I have an AJAX chat application that constantly gets new data and appends it to a div (#container) in the form of another div (.child). Multiple .child's can be inserted per second, but I only want to keep the most recent 10. Every time a download occurs, I call the following function:
function cleanup() {
    var current = $('#container');
    var allData = current.children();
    if (allData.length > 10) {
        // Remove everything except the last 10 children
        for (var i = 0; i < allData.length - 10; i++) {
            $(allData[i]).remove(); // children() yields DOM nodes, so re-wrap in $()
        }
    }
}
This works, but it lags horrendously; I have to switch away from the current tab and back just to see the CSS render correctly. Am I doing something wrong?
I cannot change the data flow, as the chat depends on getting all the data that is sent. I am just looking for the most efficient way of deleting old elements.
Example:
If I had 30 children in my div, the first 20 children would be .remove()'d and only the last 10 would remain.
Rather than letting some code add divs and having the cleanup function clean them up afterwards - which, if I understand you correctly, could add a whole bunch of children, not just one - why not keep what is effectively a queue of length 10 in memory, keep pushing things onto it (popping them off the back once you reach 10+ items), and then set those children on your #container periodically.
This way you are always going to get the latest 10 elements, but you can update the container at a rate that makes sense (and therefore reflow the visual document at a rate that makes sense).
This could be on an interval, or you could even do it every time you process a message - but the point is, you are not adding to the document, reflowing it, then removing from it again. That seems like an inefficient way to approach the problem.
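A minimal sketch of that idea (the interval, the .child markup, and the assumption that messages are already HTML-safe strings are all placeholders for whatever your app actually does):

var queue = [];

// Call this for every incoming message instead of appending immediately
function enqueue(message) {
    queue.push(message);
    if (queue.length > 10) queue.shift(); // drop the oldest
}

// Periodically render the whole queue in one DOM operation
setInterval(function () {
    var html = '';
    for (var i = 0; i < queue.length; i++) {
        html += '<div class="child">' + queue[i] + '</div>';
    }
    $('#container').html(html);
}, 250); // pick an interval that makes sense for your chat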
You can use the :lt() selector to target everything but the last 10 divs (the :lt() selector is zero-index based):
$('#container .child:lt(' + ($('#container .child').length - 10) + ')').remove();
Also, it's better to replace the content rather than appending new content.
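An alternative one-liner along the same lines, using .slice() with a negative index instead of :lt():

// Remove every .child except the last 10, in a single jQuery call
$('#container .child').slice(0, -10).remove();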
CSS Solution (note that this hides, rather than removes, all but the last 10 children):
#container .child:nth-last-child(n+11) {
    display: none;
}
Related
I am making a project in JavaScript that mostly runs on a canvas, but it also involves dynamically creating a lot of menus in the DOM. I am not using jQuery; instead I am using a custom function that creates an element using document.createElement and then adds it to a parent with appendChild. To remove elements, I use element.parentNode.removeChild(element). Elements created can be of any type: inputs, images, and so on.
The problem is that after creating and removing elements a lot of times, the page starts to slow down significantly, and this worsens steadily the more elements are created and removed, even though there is no point where an especially large number of elements exist at once. The Javascript does not slow down, though; the main update loop is fine, but mouse events and everything related to manipulating the DOM becomes slow until the page is reloaded.
I have experienced similar issues before, but have generally ignored them because they did not involve creating and removing large numbers of elements to the same degree as this one.
The only guess I have for a cause is that elements created by document.createElement continue to exist in memory even if their reference is cleared and they are removed from the visible part of the DOM. Or perhaps removing a parent element does not properly remove all of its children, even though they seem to be gone.
My question is: Are created elements retained by the DOM even when they are not visible and no JS variable points to them, and can this be the cause of slowdown? If so, how do I destroy a DOM element properly?
As described here:
The removed child node still exists in memory, but is no longer part of the DOM. With the first syntax form shown, you may reuse the removed node later in your code, via the oldChild object reference. In the second syntax form, however, no oldChild reference is kept, so assuming your code has not kept any other reference to the node elsewhere, it will immediately become unusable and irretrievable, and will usually be automatically deleted from memory after a short time.
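For reference, the two syntax forms the quote refers to look like this:

// First form: keep a reference, so the node can be reused later
var oldChild = element.removeChild(child);

// Second form: no reference kept; assuming nothing else points to the
// node, it becomes eligible for garbage collection
element.removeChild(element.lastChild);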
And you can try something described as here:
var garbageBin;
window.onload = function () {
    if (typeof garbageBin === 'undefined') {
        // Create a hidden 'garbage bin' element to temporarily
        // hold elements that are to be discarded
        garbageBin = document.createElement('div');
        garbageBin.style.display = 'none'; // Make sure it is not displayed
        document.body.appendChild(garbageBin);
    }
};

function discardElement(element) {
    // This works due to the phenomenon whereby child nodes of an
    // element whose innerHTML is emptied are removed from memory.
    // Move the element to the garbage bin...
    garbageBin.appendChild(element);
    // ...then empty the garbage bin
    garbageBin.innerHTML = '';
}
Then you can delete your DOM element like:
discardElement(yourDomElement);
When something is appended to the DOM in memory, does that cause a browser reflow? Or is it only when the pixels on the screen are told to change that the reflow happens? For example:
Case 1: Img elements appended to the DOM one at a time
var parentDiv = $('#imgHolder');
var imgArray = []; // Array of img paths, populated in another function
$.each(imgArray, function ()
{
    // createImgEle() returns an <img> with the right src
    parentDiv.append(createImgEle(this));
});
Case 2: Img elements are put in a separate array and then appended to the DOM
var parentDiv = $('#imgHolder');
var imgArray = []; // Array of img paths, populated in another function
var tempArray = []; // Holds the img elements until it is fully populated
$.each(imgArray, function ()
{
    tempArray.push(createImgEle(this));
});
parentDiv.append(tempArray);
Case 3: Either case 1 or 2 but by default, parentDiv is set to display:none; and made visible after the each loop is done.
Basically, what I want to know is, does the browser only start to reflow when the pixels of the screen are told to change?
Btw, the code is only for example purposes and not in production so don't slam me for any logic errors. Thank you for any advice.
Basically, what I want to know is, does the browser only start to reflow when the pixels of the screen are told to change?
No, the browser reflows when the DOM changes. After that, it will repaint (tell the pixels on the screen to change).
For the details, have a look at this dev.opera.com article and the question When does reflow happen in a DOM environment?.
In short: there is of course optimisation for subsequent DOM changes, for example if you insert an array of elements in a loop. I would not expect your cases 1 and 2 to differ noticeably.
Only if you are doing really heavy DOM changes might you need case 3: because the subtree is hidden, reflows that would otherwise happen during the insert loop are essentially prevented. However, the two display changes before and after the loop can lead to flickering in some browsers.
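A minimal sketch of case 3, reusing the question's hypothetical createImgEle helper and imgArray:

var parentDiv = $('#imgHolder');
parentDiv.hide(); // one reflow; the appends below happen in a hidden subtree
$.each(imgArray, function () {
    parentDiv.append(createImgEle(this));
});
parentDiv.show(); // one reflow/repaint once everything is in place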
I have an autocomplete form where the user can type in a term and it hides all <li> elements that do not contain that term.
I originally looped through all <li> with jQuery's each and applied .hide() to the ones that did not contain the term. This was WAY too slow.
I found that a faster way is to loop through all <li> and apply class .hidden to all that need to be hidden, and then at the end of the loop do $('.hidden').hide(). This feels kind of hackish though.
A potentially faster way might be to rewrite the CSS rule for the .hidden class using document.styleSheets. Can anyone think of an even better way?
EDIT: Let me clarify something that I'm not sure too many people know about. If you alter the DOM in each iteration of a loop, and that alteration causes the page to be redrawn, that is going to be MUCH slower than "preparing" all your alterations and applying them all at once when the loop is finished.
Whenever you're dealing with thousands of items, DOM manipulation will be slow. It's usually not a good idea to loop through many DOM elements and manipulate each element based on that element's characteristics, since that involves numerous calls to DOM methods in each iteration. As you've seen, it's really slow.
A much better approach is to keep your data separate from the DOM. Searching through an array of JS strings is several orders of magnitude faster.
This might mean loading your dataset as a JSON object. If that's not an option, you could loop through the <li>s once (on page load), and copy the data into an array.
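If you go the copy-on-page-load route, a one-time jQuery pass like this would do (a sketch, assuming the text of each <li> is the data you need):

// Copy each <li>'s text into a plain JS array once, on page load
var dataset = $('li').map(function () {
    return $(this).text();
}).get();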
Now that your dataset isn't dependent on DOM elements being present, you can simply replace the entire contents of the <ul> using .html() each time the user types. (This is much faster than per-element DOM manipulation because the browser can optimize the changes when you replace the innerHTML in one go.)
var dataset = ['term 1', 'term 2', 'something else', ... ];
$('input').keyup(function() {
    var i, o = '', q = $(this).val();
    // Build the entire result list as one HTML string
    for (i = 0; i < dataset.length; i++) {
        if (dataset[i].indexOf(q) >= 0) o += '<li>' + dataset[i] + '</li>';
    }
    // A single .html() call swaps in the whole list in one DOM operation
    $('ul').html(o);
});
As you can see, this is extremely fast.
Note, however, that if you up it to 10,000 items, performance begins to suffer on the first few keystrokes. This is more related to the number of results being inserted into the DOM than the raw number of items being searched. (As you type more, and there are fewer results to display, performance is fine – even though it's still searching through all 10,000 items.)
To avoid this, I'd consider capping the number of results displayed to a reasonable number. (1,000 seems as good as any.) This is autocomplete; no one is really looking through all the results – they'll continue typing until the resultset is manageable for a human.
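A hedged tweak to the loop above that enforces such a cap:

// Stop building HTML once 1,000 matches have been collected
var shown = 0;
for (i = 0; i < dataset.length && shown < 1000; i++) {
    if (dataset[i].indexOf(q) >= 0) {
        o += '<li>' + dataset[i] + '</li>';
        shown++;
    }
}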
I know this question is old, but I'm not satisfied with any of the answers. I'm currently working on a YouTube project that uses a jQuery Selectable list with around 120,000 items. These lists can be filtered by text, and then show the corresponding items. The only acceptable way to hide all non-matching elements was to hide the ul element first, then hide the li elements, and show the list (ul) element again.
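In code, that order of operations looks roughly like this (a sketch; #list and the matches() test are placeholders for your own markup and filter logic):

var $list = $('#list');      // hypothetical <ul>
$list.hide();                // take the list out of layout: one reflow
$list.children('li').each(function () {
    var $li = $(this);
    // matches(...) stands in for whatever text test you use
    $li.toggle(matches($li.text()));
});
$list.show();                // a single reflow once everything is done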
You can select all <li>s directly, then filter them: $("li").filter(function(){...}).hide() (see here)
(sorry, I previously posted the wrong code)
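Fleshed out, that one-liner might look like this (term is a hypothetical variable holding the search text):

$('li').filter(function () {
    // Keep (and hide) only the items that do NOT contain the term
    return $(this).text().indexOf(term) === -1;
}).hide();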
You can use the jQuery contains() selector to find all items in a list with particular text, and then just hide those, like this:
HTML:
<ul id="myList">
<li>this</li>
<li>that</li>
</ul>
jQuery
var term = 'this';
$('li:contains("' + term + '")').hide();
You could use a more unique technique that uses technically no JavaScript to do the actual hiding, by putting a copy of the data in an attribute, and using a CSS attribute selector.
For example, if the term is secret, and you put a copy of the data in a data-term attribute, you can use the following CSS:
li[data-term*="secret"] {
display: none;
}
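For this to work, each <li> needs its data-term attribute populated; a hypothetical one-time jQuery pass could copy each item's text over:

// Copy each item's text into its data-term attribute once, up front
$('li').attr('data-term', function () {
    return $(this).text();
});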
To do this dynamically you would have to add a style to the head in JavaScript:
function hideTerm(term) {
    var css = 'li[data-term*="' + term + '"]{display:none;}';
    var style = $('<style type="text/css">').text(css);
    $('head').append(style);
}
If you were to do this you would want to be sure to clean up the style tags as you stop using them.
This would probably be the fastest, as CSS selection is very quick in modern browsers. It would be hard to benchmark so I can't say for sure though.
How about:
<style>
.hidden{ display: none; }
</style>
That way you don't have to do the extra query using $('.hidden').hide() ?
Instead of redefining the stylesheet rules, you can define a 'hide' class with display: none beforehand; then in your page you can just apply that class after verifying the condition in JavaScript, like below.
$("li").each(function(){if(condition){$(this).addClass('hide');}});
and later, if you want to show those li's again, you can just remove the class like below
$("li").each(function(){if(condition){$(this).removeClass('hide');}});
Can I determine the number of jQuery objects on a page?
I want to use the number of elements as a sort of weak benchmark for page complexity. I'm thinking that if I can reduce the number of elements that jQuery knows about, the page might run more efficiently.
Does this make sense?
Is it as simple as doing a * select and counting the results?
related:
How can I clear content without getting the dreaded “stop running this script?” dialog?
http://api.jquery.com/size/
var elementCount = $('*').size();
Although this might be more what you want:
var elementCount = $('body').find('*').size();
var n = 0;
for (var i in jQuery.cache)
    n++;
Now n holds the number of elements jQuery has ‘touched’ (added data to, such as event handlers).
Previously this used to be a whole lot, as it would ‘touch’ every element it was even checking for data. This unpleasantness is fixed in jQuery 1.4.
As for clearing content, yes, you can use innerHTML= '' to remove all the content without giving jQuery the chance to detach its data so very slowly. If you know there are no ‘touched’ elements inside the element that's a win, but otherwise it's a potential memory leak until the page is reloaded, as unused jQuery.cache data stays around.
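Concretely, the trade-off looks like this:

// Fast: clears content without giving jQuery a chance to detach its data.
// Safe only if no descendant was 'touched' by jQuery; otherwise the
// orphaned jQuery.cache entries linger until the page is reloaded.
document.getElementById('container').innerHTML = '';

// Slower but leak-free: jQuery removes data and event handlers first.
$('#container').empty();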
Using live()/delegate() event binding avoids adding data to its target elements, which can allow you to use this short-cut more freely. However if you have a lot of events to bind and they're not selectors that are very easy to match quickly, this can make event handling slow.
(This is because there is no browser-native speed-up like querySelectorAll for matching elements against a particular selector, which is what delegation needs to do; such an API is proposed for Selectors-API Level 2.)
var elementCount = $('*').length;
This doesn't really have much to do with jQuery except insofar as it's a handy way to get the answer.
I am working on a project where I am building a treeview, and in some cases my tree could have 1000+ child nodes. The problem is that it's really slow on, for example, an IE7 machine.
I'm not doing any kind of animation or anything, just simply trying to hide the next UL in the item using jQuery's toggle feature. Does anyone have any ideas on how to improve the performance?
Thanks!!
If toggle is slow, you can set CSS styles directly via jQuery, like:
$('.tree_item').click(function(){
    // Check whether the next ul is expanded or not
    if ($(this).next('ul:hidden').length == 0) {
        // If expanded, hide it
        $(this).next('ul').css('display', 'none');
    } else {
        // If not shown, show it
        $(this).next('ul').css('display', 'block');
    }
});
Such an approach might help. I don't know if it will work faster, but give it a try...
Sinan.
I'm not surprised at all that this is slow if your treeview is that big. Silverlight 3 handles this problem with UI Virtualization.
You'll have to roll your own in javascript, but it shouldn't be that hard. Just make a huge blank div that's the size of what the rendered tree would have been, and put it inside a scrollable div, and then only render what should show up. Change it on the onscroll event.
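A rough sketch of that roll-your-own technique (the row height, element ids, and the items array are all assumptions for illustration; #viewport is a fixed-height scrollable div containing #canvas, which has position:relative):

var ROW_HEIGHT = 20;              // assumed fixed height per row, in px
var $viewport = $('#viewport');   // scrollable div: overflow:auto, fixed height
var $canvas = $('#canvas');       // inner div: position:relative
$canvas.height(items.length * ROW_HEIGHT); // reserve space for all rows

function renderVisible() {
    var first = Math.floor($viewport.scrollTop() / ROW_HEIGHT);
    var count = Math.ceil($viewport.height() / ROW_HEIGHT) + 1;
    var last = Math.min(first + count, items.length);
    var html = '';
    for (var i = first; i < last; i++) {
        // Position each row where it would sit in the full list
        html += '<div style="position:absolute;top:' + (i * ROW_HEIGHT) +
                'px">' + items[i] + '</div>';
    }
    $canvas.html(html); // only the on-screen rows ever exist in the DOM
}

$viewport.scroll(renderVisible);
renderVisible();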
You can try building your own tree object from the DOM document at startup.
Just iterate through all elements and nest them in standard attributes and variables. You can make additional parent and children pointers using $(element).get(0).parent = $(parent).get(0);
Then, if you want to get specific elements, use the $.map function.
We used this to prepare something like Firebug on a project. It rebuilt an entire 5000+ node portal in 3 seconds and provided fast access in IE6+.
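For instance, a hypothetical $.map lookup over such a prebuilt node array (allNodes standing in for the array built during the startup pass):

// Collect every prebuilt node whose tag name matches; $.map drops
// null return values automatically
var divs = $.map(allNodes, function (node) {
    return node.tagName === 'DIV' ? node : null;
});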
I've found that .toggle(showOrHide) is fast. It was added in jQuery 1.3 and really makes a difference on large collections (50+ elements) in IE8 if animation is not required. The current visibility state can be obtained from the first element, e.g.:
var isVisible = $('.myClass').first().is(':visible');
$('.myClass').toggle(!isVisible);