Improving performance of javascript intervals on IE8 - javascript

I'm using a JavaScript loop (via setInterval) that runs through a list of search results, highlighting the search term by adding a CSS-styled <span> around search hits as it goes. I'm using setInterval like this to release control of the browser while it works.
In Chrome and Firefox this works well - even with a setInterval parameter of 10-20ms - and the user has full control of the browser (scrolling, clicking links etc.) while the results are rapidly highlighted:
mylooper = setInterval(function() {
// my functionality is here
},15); // 15ms
Unfortunately, in the dreaded IE8 the browser locks up and takes a really long time to add the <span>s and style the search results. It also takes a long time just to load the page in the first place, which is shortened a great deal when this script is removed.
So far I've tried:
changing the interval values (I've read that IE8 doesn't detect intervals of sub 15ms);
using setTimeout instead of setInterval;
removing the interval to check that this is in fact what is causing the slow-down (it is!); and
swearing about Internet Explorer a lot;
var highlightLoop;
var index = 0;
highlightLoop = setInterval(function () {
    var regex = RegExp(regexPhrase, "gi"); // regexPhrase created elsewhere
    var searchResults = resultElements.eq(index).get(0); // step through resultElements, which contains all the nodes with search results in them
    findAndReplaceDOMText( // a function that does the searching and inserting of styling
        regex,
        searchResults,
        function (fill, matchIndex) {
            called = true;
            var span = document.createElement("span");
            span.className = "result-highlight";
            span.innerHTML = fill;
            return span;
        }
    );
    if (index == resultElements.length || searchTermUpdated == true) { // stop the interval loop when the search term changes or we reach the end of the results - variables set elsewhere
        searchTermUpdated = false;
        clearInterval(highlightLoop); // stop the loop
    }
    index++;
}, 50); // 50ms does not improve performance.
Any advice on workarounds for this kind of javascripting in IE would be massively appreciated. Thanks all.

I believe you may be able to improve the performance by tweaking findAndReplaceDOMText, and maybe its callback too. I suppose findAndReplaceDOMText appends the element returned by the callback to the DOM from within a loop over all matches. If so, try to move the DOM insertion outside the loop and apply all the changes to the DOM at once. That should result in better performance, as repainting the page after each DOM update is expensive.
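For example, if the replacement nodes for one result node are built off-DOM first and written back in a single operation, the browser only repaints once per node. This is just a sketch of the batching idea (matches and targetNode are placeholders here), not the actual findAndReplaceDOMText internals:
// build all the replacement nodes off-DOM first
var fragment = document.createDocumentFragment();
for (var i = 0; i < matches.length; i++) {
    var span = document.createElement("span");
    span.className = "result-highlight";
    span.appendChild(document.createTextNode(matches[i]));
    fragment.appendChild(span);
}
// the live DOM is touched (and repainted) only once
targetNode.appendChild(fragment);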

Try this recursive approach instead:
get a list of all elements to be acted upon into array X (one time cost)
while the array X has length, keep repeating the next actions
shift the first element off the array
process the single element
start this process again with the new array X (now of length n - 1) on a setTimeout
The code looks like this in general:
function processArray(array) {
    var element = array.shift();
    processElement(element);
    if (array.length) {
        setTimeout(function () { processArray(array); }, 15); // 15ms
    }
}
There might be something else to be done with this recursion, but it works fairly well in all browsers and never blocks, because you're only initiating the repeat when the last one has had time to finish.
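If one element per tick turns out to be too slow overall (IE clamps timers to roughly 15ms, so that's at least 15ms per element), the same idea works with a small batch per tick. A sketch, where the batch size of 10 is just a number to tune and processElement is the same placeholder as above:
function processArrayInChunks(array, chunkSize) {
    // handle up to chunkSize elements in this tick
    for (var i = 0; i < chunkSize && array.length; i++) {
        processElement(array.shift());
    }
    if (array.length) {
        // yield to the browser, then continue with the rest
        setTimeout(function () { processArrayInChunks(array, chunkSize); }, 15);
    }
}
// hypothetical usage: .get() turns the jQuery set from the question into a plain array
processArrayInChunks(resultElements.get(), 10);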

Related

Append items ordering by placed amount

I'm using this function to append new items in order by the amount. This function is being called every 30-50ms.
var insertBefore = false;
container.find('.roll-user-row[data-user-id="' + user_data.id + '"]').remove();
container.children().each(function () {
var betContainer = $(this), itemAmount = $(this).attr('data-amount'), betId = $(this).attr('data-user-id');
if (itemAmount < betData.totalAmount) {
insertBefore = betContainer;
return false;
}
});
if (insertBefore) {
$(template).insertBefore(container);
} else {
container.prepend(template);
}
itemAmount = $(this).attr('data-amount') is an integer, and betData.totalAmount is an integer too. And if appending happens no faster than about every 300ms, everything works well. In case of fast appending I get this result:
and that's not even close to what I want; it's random. How to solve this?
1. Refactoring
First of all, note that inside an .each() callback only return false breaks out of the whole loop; a plain return just skips to the next iteration. If you want explicit control over when the cycle stops, a simple for-loop with a break (or return) statement is clearer. Then, I would recommend calling $() as rarely as possible, because it is expensive. So I would suggest the following refactoring of your function:
function run() {
    container.find('.roll-user-row[data-user-id="' + user_data.id + '"]').remove();
    var children = container.children();
    for (var i = 0; i < children.length; i++) {
        var betContainer = $(children[i]); // cache the children[i] wrapping
        var itemAmount = betContainer.attr('data-amount');
        var betId = betContainer.attr('data-user-id');
        if (itemAmount < betData.totalAmount) {
            $(template).insertBefore(betContainer); // insert before the first row with a smaller amount
            return; // instead of "break", less code for the same logic
        }
    }
    container.prepend(template); // not executed when the insertBefore branch ran, thanks to "return"
}
2. Throttling
To run a repeating process every 50ms, you are using something like setInterval(run, 50). If all you need is a guaranteed 300ms between runs, then you can just use setInterval(run, 300). But if the process is initialized in a way you can't change, and 50ms is a fixed interval there, then you can protect the run call with lodash's throttle or the jQuery throttle plugin:
var throttledRun = _.throttle(run, 300); // var throttledRun = $.throttle(300, run);
setInterval(throttledRun, 50);
setInterval here is just an example; you need to replace your initial run with the throttled version (throttledRun) wherever your repeater is initialized. This means that run will not execute again until 300ms have passed since its previous execution.
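If you'd rather not pull in a library, a very small throttle can be hand-rolled. This is only a sketch of the idea (it simply drops calls that arrive too early, unlike the more featureful lodash/jQuery versions):
function throttle(fn, wait) {
    var last = 0; // time of the last call that was let through
    return function () {
        var now = (new Date()).getTime();
        if (now - last >= wait) {
            last = now;
            fn.apply(this, arguments);
        }
    };
}
var throttledRun = throttle(run, 300);
setInterval(throttledRun, 50); // run executes at most once every 300ms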
I am only posting the approach here; if my understanding is right, I'll post code later. The first thing that came to my mind reading this was the 'Virtual DOM' concept. Here is what you can do:
Use the highly frequent function calls only to maintain a data structure, such as an object. Don't touch the DOM there.
Then use a much less frequent setInterval call to redraw (or update) your DOM from that data structure.
I am not sure whether there is any reason you can't take this approach, but it is an efficient way to handle the DOM in a time-critical use case.
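A minimal sketch of that split, with hypothetical names (betsByUser holds the data, recordBet is called by the fast updates, redraw is the slow DOM pass); container is the jQuery element from the question and the row markup is simplified:
var betsByUser = {}; // updated by the fast 30-50ms calls; no DOM work here
function recordBet(betData) {
    betsByUser[betData.id] = betData; // cheap bookkeeping only
}
// much less frequent: rebuild the list from the data in one pass
setInterval(function redraw() {
    var bets = [];
    for (var id in betsByUser) { bets.push(betsByUser[id]); }
    bets.sort(function (a, b) { return b.totalAmount - a.totalAmount; }); // highest amount first
    var html = '';
    for (var i = 0; i < bets.length; i++) {
        html += '<div class="roll-user-row" data-user-id="' + bets[i].id +
                '" data-amount="' + bets[i].totalAmount + '"></div>';
    }
    container.html(html); // one DOM update instead of many
}, 300);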

Searching element faster using document.querySelector in a large DOM

In a huge DOM with hundreds of elements, finding elements using document.querySelector("input[name='foo'][value='bar']") takes about 3-5 seconds for each element. Is there a way I can reduce this time? Maybe by giving the full path of the element, say document.querySelector("parent child grandchild and so on, and then input[name='foo'][value='Modem']"), or some other way?
I'm using CasperJS to test a large webpage and it takes really long to fetch each element, which makes my test run for an hour. I've also tried __utils__.findOne(), but the result is the same: 3-4 secs for each element. Since my test is focused on a very small part of the entire page, I wish there were some way I could tell document.querySelector to restrict the element search to a particular portion of the page.
So could someone tell me what the fastest way is, if any, to fetch elements from a large DOM?
Update: This is how I measured the time
var init = (new Date()).getTime();
var element=this.evaluate(function() {
return document.querySelector('input[value="somethin"][name="somethin"]');
});
this.echo('Time Taken :'+((new Date()).getTime() - init));
Somehow the time is very high when I fetch radio buttons from the form; select elements and text boxes, however, return within a few milliseconds (I noticed this only today).
When I run document.querySelector('input[value="somethin"][name="somethin"]') in a modern browser console like Chrome's, the time is less than a second.
I don't know if it has to do with PhantomJS's headless browser or something. Only on one particular page in that website is fetching elements slowing down.
And yes, the page is very large, with hundreds of thousands of elements. It's a legacy webapp that's a decade old. While on that page with IE 8, pressing F12 to view source hangs IE for 5 minutes, but not Chrome or Firefox. Maybe it's PhantomJS's memory overload or something; occasionally PhantomJS crashes when I run the test on that particular page. I don't know if this info helps, but I'm not sure what's relevant.
General considerations
The fastest selector would be the id selector, but even if you had ids higher up the tree, they would not gain you much. As Ian pointed out in the comments, selectors are parsed/evaluated right to left. This means the engine first looks up all inputs that have the matching attributes, even if only one matches, and only then searches up the tree to see whether the ancestor parts of the selector match.
I found that if you can know in what enclosing element the inputs are, you can use JavaScript DOM properties to walk over the DOM and run querySelector over a smaller part of the tree. At least in my tests, this reduces the time by more than half.
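A sketch of that, assuming the inputs of interest live inside a wrapper element with a known id (formWrapper is a made-up name):
// an id lookup is cheap, even in a huge document
var wrapper = document.getElementById('formWrapper');
// querySelector on the wrapper only searches that subtree
var input = wrapper.querySelector('input[name="foo"][value="bar"]');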
Memory problem
Judging by your updated question, it seems that it is really a memory problem. When you have hundreds of thousands of elements the relatively old PhantomJS WebKit engine will try to allocate enough memory. When it takes more memory than is free or even more than your machine has, the OS compensates by using swap memory on the hard disk.
When your script tries to query an element that is currently only in swap, this query takes very long, because it has to fetch the data from the high latency hard disk which is very slow compared to memory.
My tests run for 100k forms with one element each in under 30 msec per query. When I increased the number of elements, the execution time grew linearly until at some point I got (by registering onError)
runtime error R6016
- not enough space for thread data
So I cannot reproduce your problem of 3-5 seconds per query on Windows.
Possible solutions
1. Better hardware:
Try to run it on a machine with more memory and see if it runs better.
2. Reduce used memory by closing unnecessary applications
3. Manipulate the page to reduce the memory footprint:
If there are parts of the page that you don't need to test, you can simply remove them from the DOM before running the tests (a sketch of this follows below). If you need to test all of it, you could run multiple tests on the same page, but each time remove everything that is currently not being tested.
Don't load images if this is an image-heavy site, by setting casper.options.pageSettings.loadImages = false;.
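For instance, stripping an untested subtree before the run could look like this (only a sketch; #bigUntestedSection is a hypothetical id):
casper.thenEvaluate(function () {
    // remove a part of the page that is not under test to shrink the DOM
    var section = document.getElementById('bigUntestedSection');
    if (section && section.parentNode) {
        section.parentNode.removeChild(section);
    }
});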
Test script
var page = require('webpage').create();
var content = "",
max = 100000,
i;
for(i = 0; i < max; i++) {
content += '<form id="f' + i + '"><input type="hidden" name="in' + i + '" value="iv' + i + '"></form>';
}
page.evaluate(function(content){
document.body.innerHTML = content;
}, content);
console.log("FORMS ADDED");
setTimeout(function(){
var times = page.evaluate(function(max){
var obj = {
cssplain: 0,
cssbyForm: 0,
cssbyFormChild: 0,
cssbyFormJsDomChild: 0,
cssbyFormChildHybridChild: 0,
cssbyFormHybridChild: 0,
xpathplain: 0,
xpathbyForm: 0
},
idx, start, el, i,
repeat = 100;
function runTest(name, obj, test) {
var idx = Math.floor(Math.random()*max);
var start = (new Date()).getTime();
var el = test(idx);
obj[name] += (new Date()).getTime() - start;
return el;
}
for(i = 0; i < repeat; i++){
runTest('cssplain', obj, function(idx){
return document.querySelector('input[name="in'+idx+'"][value="iv'+idx+'"]');
});
runTest('cssbyForm', obj, function(idx){
return document.querySelector('#f'+idx+' input[name="in'+idx+'"][value="iv'+idx+'"]');
});
runTest('cssbyFormChild', obj, function(idx){
return document.querySelector('form:nth-child('+(idx+1)+') input[name="in'+idx+'"][value="iv'+idx+'"]');
});
runTest('cssbyFormJsDomChild', obj, function(idx){
return document.body.children[max-1].querySelector('input[name="in'+idx+'"][value="iv'+idx+'"]');
});
runTest('cssbyFormChildHybridChild', obj, function(idx){
return document.querySelector('form:nth-child('+(idx+1)+')').querySelector('input[name="in'+idx+'"][value="iv'+idx+'"]');
});
runTest('cssbyFormHybridChild', obj, function(idx){
return document.querySelector('#f'+idx).querySelector('input[name="in'+idx+'"][value="iv'+idx+'"]');
});
runTest('xpathplain', obj, function(idx){
return document.evaluate('//input[@name="in'+idx+'" and @value="iv'+idx+'"]', document, null, XPathResult.FIRST_ORDERED_NODE_TYPE, null);
});
runTest('xpathbyForm', obj, function(idx){
return document.evaluate('//form[@id="f'+idx+'"]//input[@name="in'+idx+'" and @value="iv'+idx+'"]', document, null, XPathResult.FIRST_ORDERED_NODE_TYPE, null);
});
}
for(var type in obj) {
obj[type] /= repeat;
}
return obj;
}, max);
console.log("TIMES");
for(var type in times) {
console.log(type+":\t"+times[type]);
}
phantom.exit();
}, 0); // just in case the content is not yet evaluated
Output on my machine (nicer):
cssbyForm: 29.55
cssbyFormChild: 29.97
cssbyFormChildHybridChild: 11.51
cssbyFormHybridChild: 10.17
cssbyFormJsDomChild: 11.73
cssplain: 29.39
xpathbyForm: 206.66
xpathplain: 207.05
Note: I used PhantomJS directly. It should not have different results when the same technique is used in CasperJS.

Optimizing Javascript Loop for Wheel Game

I have a game I'm creating where lights run around the outside of a circle, and you must try and stop the light on the same spot three times in a row. Currently, I'm using the following code to loop through the lights and turn them "on" and "off":
var num_lights = 20;
var loop_speed = 55;
var light_index = 0;
var prevent_stop = false; //If true, prevents user from stopping light
var loop = setTimeout(startLoop, loop_speed);
function startLoop() {
prevent_stop = false;
$(".light:eq(" + light_index + ")").css("background-color", "#fff");
light_index++;
if(light_index >= num_lights) {
light_index = 0;
}
$(".light:eq(" + light_index + ")").css("background-color", "red");
loop = setTimeout(startLoop, loop_speed);
}
function stopLoop() {
clearTimeout(loop);
}
For the most part, the code seems to run pretty well, but if I have a video running simultaneously in another tab, the turning on and off of the lights seems to chug a bit. Any input on how I could possibly speed this up would be great.
For an example of the code from above, check out this page: http://ericditmer.com/wheel
When optimizing, the first thing to look at is not doing twice anything you only need to do once. Looking up an element in the DOM can be expensive, and you know exactly which elements you want, so why not pre-fetch all of them once and avoid doing that lookup multiple times?
What I mean is that you should
var lights = $('.light');
So that you can later just say
lights.eq(light_index).css("background-color", "red");
Just be sure to do the first thing in a place which keeps lights in scope for the second.
EDIT: Updated per comment.
I would make a global array of your selector references, so the selector doesn't have to be executed every time the function is called. I would also consider swapping class names, rather than setting style attributes.
Here's some information of jQuery performance:
http://www.componenthouse.com/article-19
EDIT: that article is quite old though, and jQuery has evolved a lot since. This is more recent: http://blog.dynatrace.com/2009/11/09/101-on-jquery-selector-performance/
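A sketch of both suggestions combined, reusing the question's variables: the jQuery set is fetched once, and the loop toggles an assumed CSS class (here called light-on) instead of setting inline styles:
var lights = $('.light'); // fetched once, outside the loop function
function startLoop() {
    prevent_stop = false;
    lights.eq(light_index).removeClass('light-on'); // switch the current light off
    light_index = (light_index + 1) % num_lights;
    lights.eq(light_index).addClass('light-on');    // switch the next light on
    loop = setTimeout(startLoop, loop_speed);
}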
You could try storing the light elements in an array instead of using a selector each time. Class selectors can be a little slow.
var elements = $('.light');
function startLoop() {
prevent_stop = false;
$(elements[light_index]).css('background-color', '#fff');
...
}
This assumes that the elements are already in their intended order in the DOM.
One thing I will note is that you have used a setTimeout() and really just engineered it to behave like setInterval().
Try using setInterval() instead. I'm no js engine guru but I would like to think the constant reuse of setTimeout has to have some effect on performance that would not be present using setInterval() (which you only need to set once).
Edit:
Courtesy of Diodeus, a related post to back up my statement:
Related Stack Question - setTimeout() vs setInterval()
OK, this includes some "best practice" improvements; whether it really optimizes the execution speed should be tested. At least you can proclaim you're now coding ninja style lol
// create a helper that lends the Array reverse function to jQuery sets to reverse
// their order. A jQuery set is only array-like, not a real array, so it has no
// reverse method of its own
$.fn.reverse = Array.prototype.reverse;
var loop,
loop_speed = 55,
prevent_stop = false,
// prefetch a jQuery set of all lights and reverse it to keep the right
// order when iterating backwards (small performance optimization)
lights = $('.light').reverse();
// this named function executes as soon as it's initialized
// I wrapped everything into a second function, so the variable prevent_stop is
// only set once at the beginning of the loop
(function startLoop() {
// keep variables always in the scope they are needed
// changed the iteration to count down, because checking for 0 is faster.
var num_lights = light_index = lights.length - 1;
prevent_stop = false;
// This is an auto-executing, self-referencing function
// which avoids the 55ms delay when starting the loop
loop = setInterval((function() {
// work with css-class changing rather than css manipulation
lights.eq( light_index ).removeClass('active');
// if not 0 iterate else set to num_lights
light_index = (light_index)? --light_index:num_lights;
lights.eq( light_index ).addClass('active');
// return a reference to this function so it can be executed by setInterval()
return arguments.callee;
})(), loop_speed);
})();
function stopLoop() {
clearInterval(loop);
}
Cheers neutronenstern

slide a div using javascript

I wanted to write JavaScript code that will slide a div in a specific direction, over a specific distance, in a given time. I wrote this small script, but it doesn't work at all. Instead the browser gets slow, and no change in position is visible.
Can someone tell me how to achieve this? I know there are many ready-made libraries that can do it easily, but I just wanted to give it a try.
<script type="text/javascript" language="javascript">
var element = '';
var slidePerMS = '';
function slideIt(ele, direction, distance, slideDuration){
element = ele;
var i=0;
slidePerMS = distance / (slideDuration*1000);
for(i=0; i<3000; i++){
setTimeout("changePosition()",1);
}
}
function changePosition(){
var currElement = document.getElementById(element);
currElement.style.left = "'"+slidePerMS+"px'";
}
</script>
SOO many things wrong with that code it's not even funny... Let's see...
You are trying to render a 1,000 FPS animation. This is simply impossible for a browser.
You are passing a string as parameter to setTimeout, which is as evil as eval.
You set slidePerMS once but never change it after, resulting in the div being moved to the exact same spot over and over.
You are setting the style with extra quotes inside - do you put quotes in a CSS file?
That's to name but a few. Try this instead:
<script type="text/javascript" language="javascript">
function slideIt(elem, direction, distance, slideDuration){
var elmt = document.getElementById(elem),
i=0, step = distance / (slideDuration*20),
stepper = setInterval(function() {
i = Math.min(distance,i+step);
elmt.style.left = i+'px';
if( i == distance) clearInterval(stepper);
},50);
}
</script>
You have many problems.
You are treating setTimeout as if it was sleep. Don't do that. It isn't like sleep at all, it runs a function after a given period of time, but doesn't pause the execution of anything else.
This means you just hammer the function repeatedly 3000 times, which is what is locking up the browser.
Instead of using a for loop, you should be using setInterval.
Don't pass a string to setInterval (or setTimeout), it gets evaled, which is slow and hard to debug, and it breaks scope. Pass a function instead.
Inside changePosition you rely on the variable slidePerMS, which is assigned once in slideIt and never updated afterwards, so the element is told to move to the same position every time.
You are also trying to set left to "'123px'". You can't quote your values in CSS.
Get rid of both the 's.
This is why you can't see any change. Invalid values are ignored in CSS.
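Putting those points together, a corrected version might look roughly like this (only a sketch: it assumes the div is positioned relative or absolute, and it ignores the direction parameter just as the original did):
var slideTimer;
function slideIt(ele, direction, distance, slideDuration) {
    var element = document.getElementById(ele);
    var stepMS = 20;                                        // one step every 20ms
    var step = distance / (slideDuration * 1000 / stepMS);  // pixels per step
    var position = 0;
    clearInterval(slideTimer);                              // stop any previous slide
    slideTimer = setInterval(function () {                  // pass a function, not a string
        position = Math.min(distance, position + step);
        element.style.left = position + "px";               // no extra quotes
        if (position === distance) {
            clearInterval(slideTimer);
        }
    }, stepMS);
}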

JavaScript & string length: why is this simple function slow as hell?

I'm implementing a character counter in the UI, so a user can see how many characters are left for input.
To count, I use this simple function:
function typerCount(source, layerID)
{
outPanel = GetElementByID(layerID);
outPanel.innerHTML = source.value.length.toString();
}
source contains the field whose value we want to measure
layerID contains the element ID of the object we want to put the result in (a span or div)
outPanel is just a temporary var
If I activate this function, the machine really slows down while typing and I can see that FF is using one core at 100%. You can't write fluently because it hangs after every few letters.
The problem, it seems, may be the value.length lookup in the second line?
Regards
I can't tell you why it's that slow, there's just not enough code in your example to determine that. If you want to count characters in a textarea and limit input to n characters, check this jsfiddle. It's fast enough to type without obstruction.
The problem could be with outPanel. Every time you call that function, it looks up that DOM node again. If you are always targeting the same DOM node, doing that lookup on every single keystroke is needlessly expensive for the browser.
Also, this is too verbose:
source.value.length.toString();
This is sufficient:
source.value.length;
JavaScript is dynamic; the value is converted to a string automatically when it is assigned to innerHTML.
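A sketch of that caching idea, with the output element looked up once instead of on every keystroke (charsLeft is a hypothetical id):
// look the output element up once, e.g. when the page has loaded
var outPanel = document.getElementById("charsLeft");
function typerCount(source) {
    // reuse the cached node instead of querying the DOM on each keypress
    outPanel.innerHTML = source.value.length;
}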
I doubt your problem is with the use of innerHTML or getElementById().
I would try to isolate the problem by removing parts of the function and seeing how the cpu is used. For instance, try it all these ways:
var len;
function typerCount(source, layerID)
{
len = source.value.length;
}
function typerCount(source, layerID)
{
len = source.value.length.toString();
}
function typerCount(source, layerID)
{
outPanel = GetElementByID(layerID);
outPanel.innerHTML = "test";
}
As artyom.stv mentioned in the comments, cache the result of your GetElementByID call. Also, as a side note, what is GetElementByID doing? Is it doing anything else other than calling document.getElementById?
How would you cache this you say?
var outPanelsById = {};
function getOutPanelById(id) {
var panel = outPanelsById[id];
if (!panel) {
panel = document.getElementById(id);
outPanelsById[id] = panel;
}
return panel;
};
function typerCount(source, layerId) {
var panel = getOutPanelById(layerId);
panel.innerHTML = source.value.length.toString();
};
I'm thinking there has to be something else going on though, as even getElementById calls are extremely fast in FF.
Also, what is "source"? Is it a DOMElement? Or is it something else?
