Append items ordered by placed amount - javascript

I'm using this function to append new items in order of amount. The function is called every 30-50ms.
var insertBefore = false;
container.find('.roll-user-row[data-user-id="' + user_data.id + '"]').remove();
container.children().each(function () {
    var betContainer = $(this),
        itemAmount = $(this).attr('data-amount'),
        betId = $(this).attr('data-user-id');
    if (itemAmount < betData.totalAmount) {
        insertBefore = betContainer;
        return false;
    }
});
if (insertBefore) {
    $(template).insertBefore(container);
} else {
    container.prepend(template);
}
itemAmount ($(this).attr('data-amount')) is an integer, and betData.totalAmount is an integer too. As long as appends happen no faster than about every 300ms, everything works well. With fast appending, though, the rows come out in what looks like random order - not even close to what I want. How do I solve this?

1. Refactoring
First of all, note that a plain return inside the .each callback doesn't break out of the loop - it only ends the current iteration (you need return false for that). If you want a straightforwardly interruptible cycle, use a simple for-loop and a break statement. Then, I would recommend calling $() as rarely as possible, because it is expensive. So I would suggest the following refactoring of your function:
function run() {
    container.find('.roll-user-row[data-user-id="' + user_data.id + '"]').remove();
    var children = container.children();
    for (var i = 0; i < children.length; i++) {
        var betContainer = $(children[i]); // cache the children[i] wrapping
        // attr() returns a string, so parse it before the numeric comparison
        var itemAmount = parseInt(betContainer.attr('data-amount'), 10);
        var betId = betContainer.attr('data-user-id');
        if (itemAmount < betData.totalAmount) {
            $(template).insertBefore(betContainer); // insert before this row, not before the container
            return; // instead of "break", less code for the same logic
        }
    }
    container.append(template); // no smaller row found, so the new row goes last
                                // (not executed in the insertBefore case, due to "return")
}
2. Throttling
To run a repeating 50ms process, you are presumably using something like setInterval(run, 50). If all you need is to be sure run executes no more often than every 300ms, you can just use setInterval(run, 300). But if the process is initialized in a way you can't change, and 50ms is its fixed interval, then you can protect the run call with lodash's throttle or the jQuery throttle plugin:
var throttledRun = _.throttle(run, 300); // var throttledRun = $.throttle(300, run);
setInterval(throttledRun, 50);
setInterval is just an example here; replace your initial run with the throttled version (throttledRun) in your repeater's initialization logic. run will then not execute until 300ms have passed since its previous execution.
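If you'd rather not pull in a library, a leading-edge throttle is only a few lines. This is an illustrative sketch, not lodash's actual implementation (which also handles trailing calls and cancellation):

function throttle(fn, wait) {
    var last = 0;
    return function () {
        var now = Date.now();
        if (now - last >= wait) { // enough time has passed since the last run
            last = now;
            fn.apply(this, arguments); // forward this and arguments unchanged
        }
        // otherwise the call is simply dropped
    };
}

var throttledRun = throttle(run, 300);
setInterval(throttledRun, 50); // run still fires at most once per 300ms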

I am only posting the approach here; if my understanding is right, I'll post code. The first thing that came to mind reading this was the 'Virtual DOM' concept. Here is what you can do:
Use the highly frequent function calls only to maintain a data structure, such as an object. Don't rely on DOM updates there.
Then use a much less frequent setInterval callback to redraw (or update) your DOM from that data structure.
I am not sure whether there is any reason you can't take this approach, but it will be the most efficient way to handle the DOM in a time-critical use-case.
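To make that concrete, here is a rough sketch under the question's setup; bets, onBetUpdate and buildRowTemplate are invented names for illustration:

// keep the fast-changing state in plain data, not in the DOM
var bets = {}; // keyed by user id

function onBetUpdate(user_data, betData) { // called every 30-50ms
    bets[user_data.id] = { id: user_data.id, amount: betData.totalAmount };
}

// redraw from the data structure on a slower, fixed cadence
setInterval(function () {
    var sorted = Object.keys(bets).map(function (id) {
        return bets[id];
    }).sort(function (a, b) {
        return b.amount - a.amount; // numeric sort, highest amount first
    });
    container.empty();
    sorted.forEach(function (bet) {
        container.append(buildRowTemplate(bet)); // buildRowTemplate: your row markup
    });
}, 300);

Since the sort runs on numbers held in memory, the string-versus-number pitfalls of reading data- attributes back off the DOM disappear as well.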

Related

understanding setInterval in javascript

I have a function which does something async, like saving to a database. I want a mechanism that first inserts a row, and where the next insertion only occurs when the first insert operation has finished.
Here is what I have tried, and it somewhat works.
var interval = true;
function insert() {
    model.save(function () {
        interval = true;
    });
}
foreach(row, function (key, val) {
    var interval1 = setInterval(function () {
        if (interval) {
            insert();
            interval = false;
            clearInterval(interval1);
        }
    }, 100);
});
Is this the correct approach? Please shed some light on my understanding of timers in JavaScript.
No, you should not be creating timers to poll for when something is done. That's probably the worst way you can do it. What you want to do is to explicitly start the next iteration each time the previous one finishes.
Here's the general idea for how you do this without polling. The idea is that you need to create a function that can be called successive times and each time it's called, it will perform the next iteration. You can then call that function from the completion handler of your async operation. Since you don't have a nice convenient foreach loop to control the iteration, you then have to figure out what state variables you need to keep track of to guide each iteration. If your data is an array, all you need is the index into the array.
function insertAll(rows) {
    // I'm assuming rows is an array of row items;
    // rowIndex keeps track of where we are in the iteration
    var rowIndex = 0;
    function insert() {
        // keep going as long as we have more rows to process
        if (rowIndex < rows.length) {
            // get rows[rowIndex] and do whatever you need to do with it
            // increment our rowIndex counter for the next iteration
            ++rowIndex;
            // save, and when done, kick off the next insert
            model.save(insert);
        }
    }
    // start the first iteration
    insert();
}
If you don't have your data in an array that is easy to step through one at a time this way, then you can either fetch each next iteration of the data when needed (stopping when there is no more data) or you can collect all the data into an array before you start the operation and use the collected array.
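For example, if the rows currently live in the DOM, you might collect them once up front (the selector and per-row data here are illustrative):

// gather everything into a plain array before starting the async chain
var rows = [];
$('.row').each(function () {
    rows.push($(this).data()); // or whatever per-row data you need to save
});
insertAll(rows);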
No, this is absolutely not the right way to do this. Let's assume that row contains 10 values: you are then creating 10 independent timers which continuously run and check whether they may insert. And it's not even guaranteed that they execute in the order they were created.
As jfriend00 already mentioned, you should omit the "loop" and make use of the completion callback of the save operation. Something like this:
var rows = [...];
function insert(rows, index) {
    index = index || 0;
    var current_element = rows[index];
    model.save(function () {
        if (index < rows.length - 1) {
            insert(rows, index + 1);
        }
    });
}
insert(rows);
Notice how the function calls itself after the save operation completes, increasing the index so the next element in the array is "saved".
I would use a library that handles async stuff, such as async.js.
BTW, it seems your model.save method takes a callback, which you can use directly to trigger the next insert. And if the insert function is one you wrote yourself, rather than part of some bigger framework, I suggest rewriting it to take a callback as a parameter and using that, instead of using setInterval to check when the async work is done.
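For instance, with async.js, eachSeries runs strictly one item at a time. This sketch assumes rows is an array and that model.save takes a completion callback, as in the question:

// eachSeries starts the next iteration only when done() is called
async.eachSeries(rows, function (row, done) {
    // ...prepare the current row, then save it
    model.save(function () {
        done(); // this row is saved; move on to the next one
    });
}, function (err) {
    if (err) console.error('a save failed', err);
});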

Improving performance of javascript intervals on IE8

I'm using a JavaScript loop (via setInterval) that runs through a list of search results, highlighting the search term by wrapping search hits in a CSS-styled <span> as it goes. I'm using setInterval like this to release control of the browser while it works.
In Chrome and Firefox this works well - even with a setInterval parameter of 10-20ms; and the user has full control of the browser (i.e. scrolling, clicking links etc.) while the results are rapidly highlighted:
mylooper = setInterval(function () {
    // my functionality is here
}, 15); // 15ms
Unfortunately, when using the dreaded IE8, the browser locks up and takes a really long time to add the <span>'s and style the search results. It also takes a long time just to load the page in the first place - shortened a great deal when this script is removed.
So far I've tried:
changing the interval values (I've read that IE8 can't handle intervals below about 15ms);
using setTimeout instead of setInterval;
removing the interval to check that this is in fact what is causing the slow-down (it is!); and
swearing about Internet Explorer a lot;
Here's the loop:
var highlightLoop;
var index = 0;
highlightLoop = setInterval(function () {
    var regex = RegExp(regexPhrase, "gi"); // regexPhrase created elsewhere
    var searchResults = resultElements.eq(index).get(0); // resultElements contains all the nodes with search results in them
    findAndReplaceDOMText( // a function that does the searching and inserting of styling
        regex,
        searchResults,
        function (fill, matchIndex) {
            called = true;
            var span = document.createElement("span");
            span.className = "result-highlight";
            span.innerHTML = fill;
            return span;
        }
    );
    if (index == resultElements.length || searchTermUpdated == true) { // stop the interval loop when the search term changes or we reach the end of the results - variables set elsewhere
        searchTermUpdated = false;
        clearInterval(highlightLoop); // stop the loop
    }
    index++;
}, 50); // 50ms does not improve performance.
Any advice on workarounds for this kind of javascripting in IE would be massively appreciated. Thanks all.
I believe you may be able to improve performance by tweaking findAndReplaceDOMText, and maybe its callback too. I suppose findAndReplaceDOMText appends the element returned by the callback to the DOM from within a loop over all matches. If it does that inside the loop, try to move the DOM insertion outside the loop and apply all the changes at once. That should perform better, because repainting the page after each DOM update is expensive.
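As an illustration of the batching idea only (the real change depends on findAndReplaceDOMText's internals, since highlight spans normally replace text in place rather than being appended): build every node off-DOM in a DocumentFragment, then touch the live DOM once. Here, matches stands in for whatever per-match loop the function runs:

var fragment = document.createDocumentFragment();
matches.forEach(function (fill) {
    var span = document.createElement('span');
    span.className = 'result-highlight';
    span.innerHTML = fill;
    fragment.appendChild(span);
});
searchResults.appendChild(fragment); // a single DOM insertion, one reflow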
Try this recursive approach instead:
get a list of all elements to be acted upon into array X (one time cost)
while the array X has length, keep repeating the next actions
shift the first element off the array
process the single element
start this process again with the new array X (now of length n - 1) on a setTimeout
The code looks like this in general
function processArray(array) {
    var element = array.shift(); // take the first remaining element
    processElement(element);     // process the single element
    if (array.length) {          // anything left? schedule the next pass
        setTimeout(function () { processArray(array); }, 15); // 15ms
    }
}
There might be more that could be squeezed out of this recursion, but it works fairly well in all browsers and never blocks, because you only initiate the next pass when the last one has had time to finish.
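In this question's terms, you would seed it once with the result elements and let the recursion drain the array; toArray() converts the jQuery set into a plain array that shift() can consume:

processArray(resultElements.toArray());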

How to stop long javascript loops from crashing the browser?

I have a button in my page that loops over a series of DIVs and edits them (appending text mostly, nothing serious). The thing is, the number of DIVs changes as the user works (the user can add or remove them freely). I'm looping over the DIVs via jQuery's $.each function:
var DomToEdit = $('.divs_to_edit');
$.each(DomToEdit, function() { $(this).append('text'); ... });
The variable DomToEdit contains a practically unlimited number of divs, which I then process via the $.each function.
Sometimes during the $.each loop the user has to wait a couple of seconds, and in worse cases the browser crashes.
Is there a way to prevent this? Maybe having the loop "sleep" after every 50 DIVs?
Thanks
EDIT: I didn't use the same ID, sorry - it was a flaw in my explanation. I use the same class. :)
The first argument of the .each callback is the index of the current element. You can simply check it and return false to stop the loop.
$.each(DomToEdit, function (i) { // or DomToEdit.each(function (i) {
    if (i === 50) return false;
    ..
DomToEdit is a jQuery object, so $.each(DomToEdit, fn) and DomToEdit.each(fn) are equivalent.
A more effective method is to cut the set down up front, using .slice(0, 50):
DomToEdit.slice(0, 50).each(function (i) {
    ..
Add a timer which appends text to 50 divs every 5 seconds and works through the set of divs until it has iterated over all of them.
The code below handles 50 divs every 5 seconds.
var DomToEdit = $('#divs_to_edit');
var timer = setInterval(function () { //<-- create a timer
    $.each(DomToEdit, function (index) { //<-- iterate through the divs
        if (index == 50) return false; //<-- break out at 50; the rest are handled on a later tick
        $(this).append('text');
    });
    DomToEdit = DomToEdit.slice(50); //<-- drop the processed divs
    // clear the timer when all elements are processed
    if (DomToEdit.length == 0) {
        clearInterval(timer);
    }
}, 5000); // <-- do the steps every 5 secs
Whenever I think that code can potentially cause a crash, I create a self-decrementing breaker variable that breaks out of the loop after a certain number of cycles.
var breaker = 100;
while (true) {
    breaker--;
    if (breaker < 0) {
        console.log("Oh snap batman");
        break;
    }
    console.log("CRASH");
}
The method could execute alternative code that works around the crash as well. Usually, I just try to fix the code somehow ;)
You could setTimeout with a delay of 0 in order to queue the processing of each element (a delay of 0 simply queues the callback without any extra wait):
$.each(DomToEdit, function () {
    var elem = $(this);
    setTimeout(function () { elem.append('text'); }, 0);
});
You could queue the tasks and then execute them in batches of X every Y milliseconds:
var queue = [];
$.each(DomToEdit, function () {
    queue.push($.proxy(function () {
        $(this).append('text');
    }, this));
});

window.setInterval(function () {
    var l = 100;
    // keep executing tasks until there are none left, or the maximum
    // number of tasks for this batch has been executed
    while (queue.length && l--) {
        queue.shift()();
    }
}, 50);
The real fix is of course to carefully review what you are doing and fix that. $('#divs_to_edit') returns at most a single element, so .each doesn't make much sense there, for example...
With an extremely large number of elements, might it actually be less processor-intensive to pull the entire container's markup out as a string, run a JavaScript .replace() on it, and swap in the whole container again, than to loop through hundreds of thousands of elements?
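As a sketch of that idea (the container id, search term, and class name are invented for illustration):

// one big string operation and a single reflow,
// instead of thousands of per-element DOM edits
var box = document.getElementById('results');
box.innerHTML = box.innerHTML.replace(
    /(searchterm)/gi,
    '<span class="result-highlight">$1</span>'
);

Be aware that a naive replace like this can also match text inside tags and attributes, and it discards any event handlers bound inside the container, so it trades safety for speed.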

Optimizing Javascript Loop for Wheel Game

I have a game I'm creating where lights run around the outside of a circle, and you must try to stop the light on the same spot three times in a row. Currently, I'm using the following code to loop through the lights and turn them "on" and "off":
var num_lights = 20;
var loop_speed = 55;
var light_index = 0;
var prevent_stop = false; // if true, prevents user from stopping the light
var loop = setTimeout(startLoop, loop_speed);

function startLoop() {
    prevent_stop = false;
    $(".light:eq(" + light_index + ")").css("background-color", "#fff");
    light_index++;
    if (light_index >= num_lights) {
        light_index = 0;
    }
    $(".light:eq(" + light_index + ")").css("background-color", "red");
    loop = setTimeout(startLoop, loop_speed);
}

function stopLoop() {
    clearTimeout(loop);
}
For the most part, the code seems to run pretty well, but if I have a video running simultaneously in another tab, the turning on and off of the lights seems to chug a bit. Any input on how I could possibly speed this up would be great.
For an example of the code from above, check out this page: http://ericditmer.com/wheel
When optimizing, the first thing to look at is not doing twice anything you only need to do once. Looking up an element in the DOM can be expensive, and you know exactly which elements you want, so why not pre-fetch all of them and avoid doing the lookup multiple times?
What I mean is that you should
var lights = $('.light');
So that you can later just say
lights.eq(light_index).css("background-color", "red");
Just be sure to do the first thing in a place which keeps lights in scope for the second.
EDIT: Updated per comment.
I would make a global array of your selector references, so the selector doesn't have to be executed every time the function is called. I would also consider swapping class names rather than manipulating style attributes directly.
Here's some information of jQuery performance:
http://www.componenthouse.com/article-19
EDIT: that article is quite old, though, and jQuery has evolved a lot since. This one is more recent: http://blog.dynatrace.com/2009/11/09/101-on-jquery-selector-performance/
You could try storing the light elements in an array instead of using a selector each time. Class selectors can be a little slow.
var elements = $('.light');
function startLoop() {
prevent_stop = false;
$(elements[light_index]).css('background-color', '#fff');
...
}
This assumes that the elements are already in their intended order in the DOM.
One thing I will note is that you have used a setTimeout() and really just engineered it to behave like setInterval().
Try using setInterval() instead. I'm no JS-engine guru, but I would expect the constant re-registration of setTimeout callbacks to have some cost that isn't present with setInterval(), which you only need to set once.
Edit:
Courtesy of Diodeus, a related post to back my statement:
Related Stack Question - setTimeout() vs setInterval()
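Applied to the question's code, the change is small. A rough sketch using the question's own variables (untested, just to show the shape):

var loop = setInterval(startLoop, loop_speed); // scheduled once

function startLoop() {
    prevent_stop = false;
    $(".light:eq(" + light_index + ")").css("background-color", "#fff");
    light_index = (light_index + 1) % num_lights; // wrap around at num_lights
    $(".light:eq(" + light_index + ")").css("background-color", "red");
    // no re-scheduling needed: setInterval keeps firing until stopLoop() runs
}

function stopLoop() {
    clearInterval(loop);
}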
OK, this includes some "best practice" improvements; whether it really optimizes execution speed should be tested. At least you can proclaim you're now coding ninja style, lol.
// create a helper that lends the Array reverse function to jQuery sets,
// to reverse their order. A jQuery set is an array-like object, not an
// array, so calling reverse() on it directly would fail
$.fn.reverse = Array.prototype.reverse;

var loop,
    loop_speed = 55,
    prevent_stop = false,
    // prefetch a jQuery set of all lights, reversed to keep the right
    // order when iterating backwards (small performance optimization)
    lights = $('.light').reverse();

// this named function executes as soon as it's initialized;
// everything is wrapped in a second function so the variable prevent_stop
// is only set once at the beginning of the loop
(function startLoop() {
    // keep variables in the scope where they are needed;
    // the iteration counts down, because checking against 0 is faster
    var num_lights = light_index = lights.length - 1;
    prevent_stop = false;
    // this is a self-executing, self-referencing function,
    // which avoids the 55ms delay when starting the loop
    loop = setInterval((function () {
        // swap css classes rather than manipulating css directly
        lights.eq(light_index).removeClass('active');
        // if not 0, decrement; else wrap around to num_lights
        light_index = (light_index) ? --light_index : num_lights;
        lights.eq(light_index).addClass('active');
        // return a reference to this function so it can be executed by setInterval()
        return arguments.callee;
    })(), loop_speed);
})();

function stopLoop() {
    clearInterval(loop);
}
Cheers neutronenstern

Javascript: How to put a simple delay in between execution of javascript code?

I have a for loop which iterates more than 10,000 times in my JavaScript code. The loop creates <div> tags and adds them into a box in the current page's DOM:
for (i = 0; i < data.length; i++) {
    tmpContainer += '<div> ' + data[i] + ' </div>';
    if (i % 50 == 0) { /* some delay function */ }
}
containerObj.innerHTML = tmpContainer;
I want to put a delay after every 50 <div> tags, so what code should go in place of
/* some delay function */
? Loading all 10,000 <div> tags at once takes too much time, and I want to update the box in chunks of 50 <div> tags.
Thanks in advance.
There's a handy trick in these situations: use a setTimeout with 0 milliseconds. This will cause your JavaScript to yield to the browser (so it can perform its rendering, respond to user input and so on), but without forcing it to wait a certain amount of time:
for (i = 0; i < data.length; i++) {
    tmpContainer += '<div> ' + data[i] + ' </div>';
    if (i % 50 == 0 || i == data.length - 1) {
        (function (html) { // create a closure to preserve the current value of tmpContainer
            setTimeout(function () {
                // add to the document using html, rather than tmpContainer
            }, 0); // 0 milliseconds
        })(tmpContainer);
        tmpContainer = ""; // "flush" the buffer
    }
}
Note: T.J. Crowder correctly mentions below that the above code will create unnecessary functions in each iteration of the loop (one to set up the closure, and another as an argument to setTimeout). This is unlikely to be an issue, but if you wish, you can check out his alternative which only creates the closure function once.
A word of warning: even though the above code will provide a more pleasant rendering experience, having 10000 tags on a page is not advisable. Every other DOM manipulation will be slower after this because there are many more elements to traverse through, and a much more expensive reflow calculation for any changes to layout.
You could use the window.setTimeout function to delay the execution of some code:
if (i % 50 == 0) {
    window.setTimeout(function () {
        // this will execute 1 second later
    }, 1000);
}
But your javascript will continue to execute. It won't stop.
I'd break out the code creating the divs into a function, and then schedule execution of that function periodically via setTimeout, like this:
function createThousands(data) {
    var index;
    index = 0;
    doAChunk();

    function doAChunk() {
        var counter;
        for (counter = 50; counter > 0; --counter) {
            // are we done?
            if (index >= data.length) {
                // yup
                return;
            }
            // ...create a div...
            // move to the next entry
            ++index;
        }
        // schedule the next pass
        setTimeout(doAChunk, 0); // 0 = defer to the browser, but come back ASAP
    }
}
This uses a single closure, doAChunk to do the work. That closure is eligible for garbage collection once its work is done. (More: Closures are not complicated)
It takes so much time because of the reflows. You should create a document fragment and then append the parts. See:
When does reflow happen in a DOM environment?
Javascript Performance - Dom Reflow - Google Article
Sleeping will not solve your problem. On the other hand, you are building a string of markup and then assigning it via innerHTML. The string concatenation itself is cheap, but when the .innerHTML assignment executes, it starts a process which parses your string, creates the elements, and appends them - and that process cannot be slept or interrupted.
You need to generate the elements one by one and, after every 50 elements added, schedule a setTimeout delay:
var frag = document.createDocumentFragment();

function addelements() {
    var e, i;
    for (i = 0; i < 50; ++i) {
        e = document.createElement('div');
        frag.appendChild(e);
    }
    dest.appendChild(frag); // dest is the target container element
    // (a complete version would also stop re-scheduling once all elements are added)
    window.setTimeout(addelements, 1000);
}
Here is the real trick to put a delay in JavaScript without hanging the browser:
you need to use an ajax call with the synchronous method, requesting a PHP page, and in that PHP page you can use the PHP sleep() function!
http://www.hklabs.org/articles/put-delay-in-javascript
