Recursive AJAX post causing computer to run slowly - javascript

I have this recursive loop where, inside the function, I have at least two AJAX get/post calls, and the recursion happens after the first AJAX get. My function structure is like this:
function Loop() {
    $.get(url, data, function(result) {
        for loop to render the result {
            // render the result here
        }
        for loop to get another data using the result {
            $.post(url, result.data, function(postResult) {
                // I don't know what happens here since
                // I don't have access to this post
            });
            // is there a way to not proceed here if the post is not done yet?
        }
        setTimeout("", 1000); // I wait for 1 second for the post to finish
        Loop(); // call the recursion
    }, "json");
}
Can anyone tell me what's wrong with this code? Why do I get a warning from the computer that my script is causing it to run slowly? I know this code is the one causing it, but I don't know the workaround.
I know the second loop inside the get is consuming a lot of memory. Is there a way to avoid looping back if the AJAX post is not finished?

Your setTimeout will not neatly pause the code for one second: it will just set a timer for an (empty, in your case) event to go off after a certain time. The rest of the script will continue to execute parallel to that.
So you're currently calling your recursion function a lot more frequently than you think you are. That's your first problem.
Your biggest problem, though, is that regardless of what you're doing in the result of your post, that's in another scope entirely, and you cannot break out of the Loop function from there. There is nothing in your code to break the recursion, so it is infinite, and very fast, and it sends off Ajax requests on top of that.
You need to describe in more detail what you want to achieve, and perhaps somebody can show you how you should do it. The only thing that is certain is that you need to use callbacks. I've written an example, but it makes a lot of assumptions. It's an approximation of what I think you might want to achieve, and no doubt you'll need to tweak it a bit to fit your needs. Hopefully it'll give you an idea of the workflow you need to use:
function Loop() {
    $.get(url, data, function(result) {
        for loop to render the result {
            // render the result here
        }
        // this is what you're looping over in your second loop
        var postQueue = result.someArray;
        renderChildData(postQueue, 0);
    }, "json");
}
function renderChildData(array, index) {
    // this is just one item in the loop
    var currentItem = array[index];
    $.post(url, currentItem, function(postResult) {
        // we have received the result for one item
        // render it, and proceed to fetch the next item in the list
        index++;
        if (index < array.length) {
            renderChildData(array, index);
        }
    });
}
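For completeness, here is a sketch of the same sequential workflow using async/await. This is my addition, assuming jQuery 3+ (where $.get and $.post return promise-compatible objects); url, data, renderResult and renderChild are placeholders, not names from the question:
// Sequential flow with async/await: each await pauses this function
// (not the browser) until the request completes.
async function loop() {
    const result = await $.get(url, data, null, "json");
    renderResult(result); // hypothetical helper: render the first response
    for (const item of result.someArray) {
        const postResult = await $.post(url, item); // wait for each post in turn
        renderChild(postResult); // hypothetical helper
    }
    setTimeout(loop, 1000); // schedule the next cycle only after all posts finish
}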

First of all this line:
setTimeout("", 1000); // I wait for 1 second for the post to finish
doesn't make your script wait, since it's improper usage of the setTimeout function. I think you should consider using setInterval instead, like this:
function Loop() {
    $.get(url, data, function(result) {
        for loop to render the result {
            // render the result here
        }
        for loop to get another data using the result {
            $.post(url, result.data, function(postResult) {
                // I don't know what happens here since
                // I don't have access to this post
            });
        }
    }, "json");
}
setInterval(Loop, 1000);
This will execute your function every second. I guess this is exactly what you wanted. There is no reason to make a recursive call here.
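One caveat worth adding (mine, not part of the original answer): setInterval fires every second whether or not the previous request has finished, so slow responses can pile up. A variation that never overlaps runs is to re-arm a timer only after the GET completes:
// Sketch: schedule the next run only when the current one has finished,
// so requests never overlap even when the server is slow.
function Loop() {
    $.get(url, data, function(result) {
        // ... render the result here ...
        setTimeout(Loop, 1000); // next run starts 1 second after this one finished
    }, "json");
}
Loop();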

This basically happens when you run a huge amount of code on a page, so just try to trim this code down.

Related

Facebook shares count, pause until callback function finishes

I'm trying to get the Facebook share count for many URLs on one page. To do this I'm calling http://graph.facebook.com for each of them. Because I'm doing this in a loop I run into a problem: my loop gets executed before my callback function finishes, so I run into trouble. My code looks like this:
$('span.share_counter').each(function() {
    site_url = $(this).attr('id');
    getJSON(site_url, function(err, data) {
        alert(site_url);
    });
});
Is there anything to make the loop wait until the callback function of getJSON finishes and then continue, or am I approaching this in the wrong way?
There is basically no good way to do what you want. Moreover, you shouldn't be trying to do this. JavaScript is run on an event loop, meaning that if you have one particular action blocking, the entire UI freezes.
The creation of callbacks allows the UI to continue by letting other jobs get a turn to process. getJSON takes a callback for a reason. It could potentially block for a long time, which would be terrible for your event loop.
You should try to restructure your code so that you don't need the getJSON() call to return immediately.
Potentially what you want to do:
$('span.share_counter').each(function() {
    var curElement = $(this);
    var site_url = curElement.attr('id');
    getJSON(site_url, function(err, data) {
        modifyElement(curElement, data);
    });
});
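If you also need to know when every request has finished (the fix above only solves the closure problem), one approach is to count completed callbacks. This is my sketch, assuming getJSON invokes its callback exactly once per call, as in the question; allDone is a hypothetical function:
// Run all requests in parallel; call allDone() after the last callback fires.
var spans = $('span.share_counter');
var pending = spans.length;
spans.each(function() {
    var curElement = $(this);
    getJSON(curElement.attr('id'), function(err, data) {
        modifyElement(curElement, data);
        if (--pending === 0) {
            allDone(); // hypothetical: every share count has been fetched
        }
    });
});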

setTimeout for incremental page load

Please see the code below:
function GetSQLTable() {
    var str = $("#<%=fieldGroupReferences.ClientID%>")[0].value
    var res = str.split(",");
    $("#LoadingImage").show();
    $("#LoadingImage2").show();
    for (var i = 0; i < res.length; i++) {
        alert(res[i])
        (function(i, res) {
            setTimeout(function (i, res) {
                alert(res[i])
                GetSQLTable2(i, res.length, res)
            }, 0)
        })(i, res)
    }
}
The first alert displays the correct information. The second alert errors (undefined). Why is this?
Also, I am informed that this approach should stop the webpage from crashing when there are lots of AJAX requests (it is an incremental page load). However, I do not understand how setting a timeout of zero seconds between AJAX requests will help. GetSQLTable2 executes an AJAX call.
fieldGroupReferences can contain up to about 50 values.
This JSBin example should answer your question.
When you call
setTimeout(function(res, i) {
    // using res and i here
}, 0)
don't use res and i as function parameters; by the logic of closures, res and i are already available inside the function. If you declare res and i as parameters, you are creating a function with new parameters for which no values are being passed.
So it should just be
setTimeout(function() {
    // using res and i here
}, 0)
I believe that when you call setTimeout(function(){}, 0), setTimeout works something like this:
function setTimeout(callback, time) {
    // after 'time' has elapsed:
    callback(); // observe that we are not passing anything as arguments; I am not
                // sure, but setTimeout may pass its own values, like how $.on
                // passes 'events' to callbacks
}
To answer your second part, as to how setting 0 seconds will help: by setting 0 seconds you are, in layman's terms, continuing the flow in an asynchronous manner. Hence, no matter how many AJAX responses are received, they won't block each other, since the code runs asynchronously.
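Putting the advice together, the corrected function would look roughly like this (my sketch; GetSQLTable2 and the element IDs come from the question):
function GetSQLTable() {
    var str = $("#<%=fieldGroupReferences.ClientID%>")[0].value;
    var res = str.split(",");
    $("#LoadingImage").show();
    $("#LoadingImage2").show();
    for (var i = 0; i < res.length; i++) {
        (function(i) { // capture the current i in a closure
            setTimeout(function() { // no parameters: i and res come from the closure
                GetSQLTable2(i, res.length, res);
            }, 0);
        })(i);
    }
}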

Critical Section in JavaScript or jQuery

I have a webpage, in which a certain Ajax event is triggered asynchronously. This Ajax section could be called once or more than once. I do not have control over the number of times this event is triggered, nor the timing.
Also, there is a certain piece of code in that Ajax section that should run as a critical section, meaning that while it is running, no other copy of that code should be running.
Here is a pseudo code:
Run JavaScript or jQuery code
Enter a critical section that is Ajax (when a certain process is waiting for a response callback, do not enter this section again until that process is done)
Run more JavaScript or jQuery code
My question is: how can I run step 2 the way described above? How do I create/guarantee a mutual-exclusion section using JavaScript or jQuery?
I understand the theory (semaphores, locks, ...etc.), but I could not implement a solution using either JavaScript or jQuery.
EDIT
In case you are suggesting a Boolean variable to get into the critical section, this would not work; the lines below explain why.
The code for the critical section would be as follows (using the Boolean variable suggestion):
load_data_from_database = function () {
    // Load data from the database. Only load data if we almost reach the end of the page
    if (jQuery(window).scrollTop() >= jQuery(document).height() - jQuery(window).height() - 300) {
        // Enter critical section
        if (window.lock == false) {
            // Lock the critical section
            window.lock = true;
            // Make Ajax call
            jQuery.ajax({
                type: 'post',
                dataType: 'json',
                url: 'path/to/script.php',
                data: {
                    action: 'action_load_posts'
                },
                success: function (response) {
                    // First do some stuff when we get a response
                    // Then we unlock the critical section
                    window.lock = false;
                }
            });
            // End of critical section
        }
    }
};
// The jQuery ready function (start code here)
jQuery(document).ready(function() {
    window.lock = false; // This is a global lock variable
    jQuery(window).on('scroll', load_data_from_database);
});
Now this is the code for the lock section, as suggested, using a Boolean variable. This would not work, as explained below:
The user scrolls down, and (based on the association jQuery(window).on('scroll', load_data_from_database);) more than one scroll event is triggered.
Assume two scroll events are triggered right at almost the same moment
Both call the load_data_from_database function
The first event checks if window.lock is false (answer is true, so if statement is correct)
The second event checks if window.lock is false (answer is true, so if statement is correct)
The first event enters the if statement
The second event enters the if statement
The first statement sets window.lock to true
The second statement sets window.lock to true
The first statement runs the Ajax critical section
The second statement runs the Ajax critical section.
Both finish the code
As you notice, both events are triggered almost at the same time, and both enter the critical section. So a lock is not possible.
I think the most helpful information you provided above was your analysis of the locking.
The user scrolls down, and (based on the association jQuery(window).on('scroll', load_data_from_database);) more than one scroll event is triggered.
Assume two scroll events are triggered right at almost the same moment
Both call the load_data_from_database function
The first event checks if window.lock is false (answer is true, so if statement is correct)
The second event checks if window.lock is false (answer is true, so if statement is correct)
Right away this tells me that you have come to a common (and quite intuitive) misunderstanding.
Javascript is asynchronous, but asynchronous code is not the same thing as concurrent code. As far as I understand, "asynchronous" means that a function's subroutines aren't necessarily explored in depth-first order as we would expect in synchronous code. Some function calls (the ones you are calling "ajax") will be put in a queue and executed later. This can lead to some confusing code, but nothing is as confusing as thinking that your async code is running concurrently. "Concurrency" (as you know) is when statements from different functions can interleave with one another.
Solutions like locks and semaphores are not the right way to think about async code. Promises are the right way. This is the stuff that makes programming on the web fun and cool.
I'm no promise guru, but here is a working fiddle that (I think) demonstrates a fix.
load_data_from_database = function () {
    // Load data from the database. Only load data if we almost reach the end of the page
    if (jQuery(window).scrollTop() >= jQuery(document).height() - jQuery(window).height() - 300) {
        console.log(promise.state());
        if (promise.state() !== "pending") {
            promise = jQuery.ajax({
                type: 'post',
                url: '/echo/json/',
                data: {
                    json: { name: "BOB" },
                    delay: Math.random() * 10
                },
                success: function (response) {
                    console.log("DONE");
                }
            });
        }
    }
};

var promise = new $.Deferred().resolve();

// The jQuery ready function (start code here)
jQuery(document).ready(function() {
    jQuery(window).on('scroll', load_data_from_database);
});
I'm using a global promise to ensure that the ajax part of your event handler is only called once. If you scroll up and down in the fiddle, you will see that while the ajax request is processing, new requests won't be made. Once the ajax request is finished, new requests can be made again. With any luck, this is the behaviour you were looking for.
However, there is a pretty important caveat to my answer: jQuery's implementation of promises is notoriously broken. This isn't just something that people say to sound smart; it is actually pretty important. I would suggest using a different promise library and mixing it with jQuery. This is especially important if you are just starting to learn about promises.
EDIT: On a personal note, I was recently in the same boat as you. As little as 3 months ago, I thought that some event handlers I was using were interleaving. I was stupefied and unbelieving when people started to tell me that javascript is single-threaded. What helped me is understanding what happens when an event is fired.
In synchronous coding, we are used to the idea of a "stack" of "frames", each representing the context of a function. In JavaScript, and other asynchronous programming environments, the stack is augmented by a queue. When you trigger an event in your code, or use an asynchronous request like that $.ajax call, you push an event to this queue. The event will be handled the next time that the stack is clear. So for example, if you have this code:
function () {
    this.on("bob", function () { console.log("hello"); });
    this.do_some_work();
    this.trigger("bob");
    this.do_more_work();
}
The two functions do_some_work and do_more_work will fire one after the other, immediately. Then the function will end, and the event you enqueued will start a new function call (on the stack), and "hello" will appear in the console. Things get more complicated if you trigger an event in your handler, or if you trigger an event in a subroutine.
This is all well and good, but where things start to get really crappy is when you want to handle an exception. The moment you enter asynchronous land, you leave behind the beautiful oath of "a function shall return or throw". If you are in an event handler, and you throw an exception, where will it be caught? This,
function () {
    try {
        $.get("stuff", function (data) {
            // uh, now call that other API
            $.get("more-stuff", function (data) {
                // hope that worked...
            });
        });
    } catch (e) {
        console.log("pardon me?");
    }
}
won't save you now. Promises allow you to take back this ancient and powerful oath by giving you a way to chain your callbacks together and control where and when they return. So with a nice promises API (not jQuery) you chain those callbacks in a way that lets you bubble exceptions the way you expect, and control the order of execution. This, in my understanding, is the beauty and magic of promises.
Someone stop me if I'm totally off.
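To make the exception point concrete, here is a rough sketch (mine, using jQuery's thenable interface, jQuery 1.8+) of how chaining gives you a single place to handle failures from either request:
// With a promise chain, one rejection handler covers both requests,
// which the nested-callback version above cannot do.
$.get("stuff")
    .then(function(data) {
        // first call succeeded; chain the second one
        return $.get("more-stuff");
    })
    .then(function(data) {
        console.log("both requests succeeded");
    }, function(err) {
        console.log("one of the requests failed"); // single place for errors
    });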
I would recommend a queue which only allows one item to be running at a time. This will require some modification (though not much) to your critical function:
function critical(arg1, arg2, completedCallback) {
    $.ajax({
        ....
        success: function() {
            // normal stuff here.
            ....
            // at the end, call the completed callback
            completedCallback();
        }
    });
}
var queue = [];
function queueCriticalCalls(arg1, arg2) {
    // this could be done abstractly to create a decorator pattern
    queue.push([arg1, arg2, queueCompleteCallback]);

    // if there's only one in the queue, we need to start it
    if (queue.length === 1) {
        critical.apply(null, queue[0]);
    }

    // this is only called by the critical function when one completes
    function queueCompleteCallback() {
        // clean up the call that just completed
        queue.splice(0, 1);

        // if there are any calls waiting, start the next one
        if (queue.length !== 0) {
            critical.apply(null, queue[0]);
        }
    }
}
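Usage would look something like this (my sketch; the arguments are placeholders):
// Both calls return immediately, but the second AJAX request only
// starts after the first one invokes completedCallback.
queueCriticalCalls("first", 1);
queueCriticalCalls("second", 2);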
UPDATE: Alternative solution using jQuery's Promise (requires jQuery 1.8+)
function critical(arg1, arg2) {
    return $.ajax({
        ....
    });
}

// initialize the queue with an already completed promise so the
// first call will proceed immediately
var queuedUpdates = $.when(true);

function queueCritical(arg1, arg2) {
    // update the promise variable to the result of the new promise
    queuedUpdates = queuedUpdates.then(function() {
        // this returns the promise for the new AJAX call
        return critical(arg1, arg2);
    });
}
Yup, the Promise of cleaner code was realized. :)
You can wrap the critical section in a function and then swap the function so it does nothing after the first run:
// this function does nothing
function noop() {};

function critical() {
    critical = noop; // swap the functions
    // do your thing
}
Inspired by user I Hate Lazy in "Function in javascript that can be called only once".
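One caveat (my addition): as written, this makes critical a run-once function. If you instead want to block re-entry only while an AJAX request is pending, a rough sketch is to restore the original function on completion:
// Re-entrancy guard instead of run-once: calls made while the request
// is pending hit noop; afterwards critical works again.
function noop() {}

function critical() {
    var original = critical;
    critical = noop; // block re-entry while the request is pending
    $.ajax({
        url: '/path/to/script', // placeholder endpoint
        complete: function() {
            critical = original; // allow the next call once finished
        }
    });
}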

Javascript: Non-blocking way to wait until a condition is true

I have several ASP.NET UpdatePanels, each with an AsyncPostBackTrigger tied to the same button's serverside click event. Since only one UpdatePanel can be doing its thing at a time, I use .get_isInAsyncPostBack() of the PageRequestManager to prevent a user from being able to access another part of the page until the async postback is complete.
Another part of this page needs to dynamically update multiple update panels consecutively. Since the update panels use async triggers, calling __doPostBack("<%=ButtonName.ClientID %>", 'PanelId'); fires asynchronously. Because of this, it will quickly move along to the next iteration of the loop and try to update the next panel. However, the second iteration fails because another update panel is already doing an async postback.
Ideally, there would be a way to wait until .get_isInAsyncPostBack() returns false without blocking other client activity.
Research has led me to a lot of people with my problem, almost all of whom are advised to use setTimeout(). I do not think this will work for me. I don't want to wait for a specified amount of time before executing a function. I simply want my JavaScript to wait while another script is running, preferably until a specific condition is true.
I understand that many will probably want to suggest that I rethink my model. It's actually not my model, but one that was handed to our development team and is currently a total mess under the hood. Due to time constraints, rewriting the model is not an option. The only option is to make this work. I think that if I had a way to make the client code wait without blocking, my problem would be solved.
There is no functionality such as wait or sleep in JavaScript, since it would stop the browser from responding.
In your case I would go with something similar to the following:
function wait() {
    if (!condition) {
        setTimeout(wait, 100);
    } else {
        // CODE GOES IN HERE
    }
}
It's easy to make a mistake when calling setTimeout that will cause the JavaScript call stack to fill up. If your function has parameters, you need to pass those in at the end of the setTimeout parameter list like this:
function wait(param1, param2) {
    if (!condition) {
        setTimeout(wait, 100, param1, param2);
    } else {
        // CODE GOES IN HERE
    }
}
If you pass parameters or even include empty () after the name of the function, it will be executed immediately and fill up the stack.
// This is the wrong way to do it!
function wait(param1, param2) {
    if (!condition) {
        setTimeout(wait(param1, param2), 100); // you'll get a max call stack error if you do this!
    } else {
        // CODE GOES IN HERE
    }
}
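An alternative that avoids extra setTimeout arguments entirely (useful because older versions of Internet Explorer did not support them) is to wrap the call in an anonymous function; the wrapper delays evaluation, so nothing runs until the timer fires:
function wait(param1, param2) {
    if (!condition) {
        setTimeout(function() {
            wait(param1, param2); // evaluated only when the timer fires
        }, 100);
    } else {
        // CODE GOES IN HERE
    }
}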
I needed to slow down a process and came up with a helpful little method.
const wait = (seconds) =>
    new Promise(resolve =>
        setTimeout(() => resolve(true), seconds * 1000)
    );
And you can use it like this.
const doWork = async () => {
    // After 3 seconds do something...
    await wait(3);
    console.log('work done');
};
This function calls condFunc, which should return true when the condition is met. When that happens, readyFunc is called. checkInterval sets the checking rate in milliseconds.
var wait = function(condFunc, readyFunc, checkInterval) {
    var checkFunc = function() {
        if (condFunc()) {
            readyFunc();
        } else {
            setTimeout(checkFunc, checkInterval);
        }
    };
    checkFunc();
};
Usage:
wait(
    function() { return new Date().getSeconds() == 10; },
    function() { console.log("Done"); },
    100
);
prints "Done" when current time is 10 seconds after minute

Checking if a JavaScript setTimeout has fired

I'd like to be able to dispatch a bunch of work via JavaScript to be done in the browser in such a way that the browser stays responsive throughout.
The approach I'm trying to take is to chunk up the work, passing each chunk to a function that is then queued with a setTimeout(func, 0) call.
I need to know when all the work is done, so I'm storing the returned timer ID in a map (id -> true|false). This mapping is set to false in the next block of code after I have the timer ID, and the queued function sets the mapping to true when it completes... except, of course, the queued function doesn't know its timer ID.
Maybe there's a better/easier way... or some advice on how I can manipulate my map as I need to?
I would queue the work in an array, use one timeout to process the queue and call a callback once the queue is empty. Something like:
var work = [...];

var run = function(work, callback) {
    setTimeout(function() {
        if (work.length > 0) {
            process(work.shift());
            setTimeout(arguments.callee, 25);
        } else {
            callback();
        }
    }, 25);
};

run(work, function() {
    alert('Work is done!');
});
As JavaScript in browsers is single-threaded, there is no real advantage to running multiple timeouts (at least I think this is what you are doing). It may even slow down the browser.
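A side note (my addition): arguments.callee is forbidden in strict mode, so a named function expression is a safer way to get the same self-reference:
// The same queue runner, using the name 'step' instead of arguments.callee.
var run = function(work, callback) {
    setTimeout(function step() {
        if (work.length > 0) {
            process(work.shift()); // 'process' is assumed, as in the answer above
            setTimeout(step, 25);
        } else {
            callback();
        }
    }, 25);
};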
I'd like to add that although JavaScript is single-threaded, you can still have multiple AJAX calls going at once. I recently had a site that needed to make potentially hundreds of AJAX calls and the browser just couldn't handle it. I created a queue that used setTimeout to run 5 calls at once. When one of the AJAX calls returned, it fired a callback (which is handled by a single thread) and then made the next call on the stack.
Imagine you're a manager that can only talk to one person at a time, you give 5 employees assignments, then wait for their responses, which may come in any order. Once the first employee comes back and gives you the information, you give them a new assignment and wait for the next employee (or perhaps even the same employee) to come back. So although you're "single threaded" 5 things are going on at once.
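A minimal version of that kind of queue might look like this (my sketch, not the poster's actual code; listOfUrls and the result handler are placeholders):
// Run at most 'limit' AJAX calls at a time; each completion starts the
// next queued URL, like the manager handing out new assignments.
function runLimited(urls, limit, handleResult) {
    var queue = urls.slice(); // copy so we can consume it
    function next() {
        if (queue.length === 0) return;
        $.get(queue.shift(), handleResult)
            .always(next); // success or error, start the next one
    }
    for (var i = 0; i < limit; i++) {
        next(); // start up to 'limit' parallel "workers"
    }
}

runLimited(listOfUrls, 5, function(data) { /* use the response */ });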
There is an example right in the HTML Standard of how best to handle it:
To run tasks of several milliseconds back to back without any delay, while still yielding back to the browser to avoid starving the user interface (and to avoid the browser killing the script for hogging the CPU), simply queue the next timer before performing work:
function doExpensiveWork() {
    var done = false;
    // ...
    // this part of the function takes up to five milliseconds
    // set done to true if we're done
    // ...
    return done;
}

function rescheduleWork() {
    var handle = setTimeout(rescheduleWork, 0); // preschedule next iteration
    if (doExpensiveWork())
        clearTimeout(handle); // clear the timeout if we don't need it
}

function scheduleWork() {
    setTimeout(rescheduleWork, 0);
}

scheduleWork(); // queues a task to do lots of work
The moment the work is finished is pretty clear: it's when clearTimeout is called.
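If the goal from the question is to run something at that moment, you can thread a callback through (a small sketch of mine on top of the standard's example):
// Notify a callback the moment doExpensiveWork() reports completion.
function scheduleWorkWithCallback(onDone) {
    setTimeout(function reschedule() {
        var handle = setTimeout(reschedule, 0); // preschedule next iteration
        if (doExpensiveWork()) {
            clearTimeout(handle); // no further iterations needed
            onDone(); // all work is done
        }
    }, 0);
}

scheduleWorkWithCallback(function() { alert('Work is done!'); });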
