Run code on completion of AJAX within an AJAX - jQuery - JavaScript

I have an AJAX call that retrieves data; on success, it runs a loop that calls functions which make further AJAX calls.
CODE:
success: function(data) {
    // FIRST MAKE SURE DATA WAS FOUND
    console.log(data);
    if (data["status"] == "found") {
        // CREATE NEEDED ARRAY FROM SINGLE STRING
        var restrictionArray = data["data_retrieved"].split(',');
        var loop_amount = restrictionArray.length; // AMOUNT TO BE LOOPED FOR BUILDING FORMS
        //var Action = this.Elevation.getActionsByOptionId(Option.getID())[i];
        for (var j = 0; j < loop_amount; j++) {
            var EditRowRestriction = OptionRules.prototype.getEditRowRestriction(j);
            var check = $(EditRowRestriction).find("select");
            console.log(check[0]);
            var EditRowRestirction_select_ability = OptionRules.prototype.getEditRowRestriction_select_ability(j);
            //var EditRowRestirction_access_ability = OptionRules.prototype.getEditRowRestriction_access_ability(j);
            EditRowRestriction.onremove = function() {
                $(this).next().empty(); // RESTRICTION SELECT ABILITY REMOVE
                //$(this).next().next().empty(); // RESTRICTION ACCESS ABILITY REMOVE
                $(this).empty();
                //var Action = this.Action;
                //that.removeAction(Action);
            }
            tbody.appendChild(EditRowRestriction);
            tbody.appendChild(EditRowRestirction_select_ability);
            console.log(check[1]);
        }
    }
},
error: function() {
    alert("An error occurred, please try again.");
}
Here's the problem: inside the for loop, those methods call another method that invokes an AJAX call. Even before those AJAX calls finish, the loop keeps going and the methods keep being called. What I want is to pause the loop until those methods have returned, i.e. until their AJAX calls have finished, and only then invoke the last two lines of code within the loop:
tbody.appendChild(EditRowRestriction);
tbody.appendChild(EditRowRestirction_select_ability);
What would be my best approach to accomplishing this?
Suggestions, thoughts?

It would be best to consolidate all of this looping into a single server-side script; however, if that isn't an option, you can use .then:
var def = $.Deferred(function(def) {
    def.resolve();
}).promise(); // used to start the .then chain

for (var i = 0; i < 10; i++) {
    def = def.then(function() {
        return $.ajax({});
    });
}

def.done(function() {
    // All requests from the chain are done
    console.log('all done');
});
http://jsfiddle.net/kg45U/

You could modify the calls to $.ajax within the loop to execute synchronously, rather than asynchronously. (See SO question How can I get jQuery to perform a synchronous, rather than asynchronous, AJAX request?.) This would have the effect of "pausing" the loop while the inner ajax calls execute. This would be the most straightforward approach, but does have the disadvantage of locking up the user's browser while those ajax requests execute.
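For illustration, here is a minimal sketch of the synchronous variant, assuming the inner AJAX call can be wrapped in a small helper (the URL and parameter names are placeholders, not your actual methods). Keep in mind that synchronous requests freeze the page while they run and are deprecated in modern browsers:
// Hypothetical helper; '/restriction' and the response handling are assumptions.
function getRowSynchronously(index) {
    var row = null;
    $.ajax({
        url: '/restriction',       // placeholder endpoint
        data: { index: index },
        async: false,              // blocks here until the server responds
        success: function(data) {
            row = data;            // filled in before $.ajax returns
        }
    });
    return row;                    // the loop can use this immediately
}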
The alternative is to break up the code with the loops into pieces, so that you complete the processing with the loop in a callback function that is invoked after the inner-ajax calls have completed.
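As a rough sketch of that second approach, assuming you can change the row-building method to return a promise (getEditRowRestrictionAsync below is a hypothetical variant that resolves with the built row once its inner AJAX call completes):
var chain = $.Deferred().resolve().promise(); // already-resolved promise to start the chain

for (var j = 0; j < loop_amount; j++) {
    (function(index) {
        chain = chain.then(function() {
            return OptionRules.prototype.getEditRowRestrictionAsync(index)
                .then(function(editRowRestriction) {
                    // Runs only after the inner AJAX call for this row has finished.
                    tbody.appendChild(editRowRestriction);
                    // The select-ability row would be appended here the same way.
                });
        });
    })(j);
}

chain.done(function() {
    console.log('all rows appended');
});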

Related

Failing to delay function until after 'for' loop has fully completed

I've got a tool in place which is splitting a large query into manageable chunks, then using a simple AJAX method to spit this out. The destination for the AJAX form is just a script which delegates some form data to a function, including which 'chunk' to process.
<script>
var passes = Math.ceil($max / $offset);

for (i = 0; i < passes; i++) {
    $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form" . $i . "').serialize(),
        success: function(data) {
            $('#update" . $i . "').append(data);
        }
    });
}
</script>
As this can iterate a few times, I was looking to execute a script for when the looping (i.e. the function itself) has finished.
As this isn't anything too snazzy, I thought it would be a simple case of adding if(i == passes - 1) { alert('test'); } to the end of the loop, like this:
for (i = 0; i < passes; i++) {
    $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form" . $i . "').serialize(),
        success: function(data) {
            $('#update" . $i . "').append(data);
        }
    });
    if (i == passes - 1) { alert('test'); }
}
...but the alert fires as soon as the page loads, before the loop's requests have completed.
Likewise, adding a simple function after the loop achieves the same result.
I would have thought (though I'm quite fresh at JS) that it would complete one iteration before moving on to the next, but it doesn't seem to do so. The page acts as if all of the requests are sent instantly, completing the loop and executing the code after it, and then the success callbacks echo back in their own time. This seems even more evident in that it sometimes appends the results for the second iteration of i before the first.
Questions...
1) Have I made an error in how I've constructed the loop, considering the purpose?
2) Why does the loop seem to execute code after the loop when it seems like it is still processing the loop itself?
What I'm trying to achieve
Each iteration should perform a MySQL query, return the function's HTML output, and print it before moving on to the next. It does this 99% correctly, just with the occasional problem of not always appending in order.
After all loops have completed and appended to the container, I would like to run some code to confirm that the operation is complete.
Many Thanks in advance, hope this is clear enough
This is a "Promise" based solution to your problem.
First, decompose each pass into a function that does one unit of work:
function makePass(i) {
    return $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form' + i).serialize()
    }).then(function(data) {
        $('#update' + i).append(data);
    });
}
Then you can make a general purpose function that pseudo-recursively makes the desired number of calls to the supplied function, calling it each time with the current pass number, and finally returning a new "resolved" promise once every pass has been completed:
function makeNPasses(f, n) {
    var i = 0;
    return (function loop() {
        if (i < n) {
            return f(i++).then(loop);
        } else {
            return Promise.resolve();
        }
    })();
}
You can then register a handler to be invoked once everything is done:
var passes = Math.ceil($max / $offset);

makeNPasses(makePass, passes).then(function() {
    console.log("All done!");
});
JavaScript has an asynchronous flow: it doesn't wait for a request to come back with data before moving on. Instead it just keeps firing the following statements in a row.
To get around this, there are three options:
1. The ideal approach is to make a single HTTP request to the server and get all the data back as one JSON array. This is more efficient and simpler, and it follows best practice (a sketch follows after this list).
2. Make the AJAX call synchronous by setting async: false. There is useful background in the answer to jQuery ajax success anonymous function scope, but callbacks are recommended over async: false; .when() or .then() is easier, because ultimately they are callbacks too.
3. A recursive function can get you through this kind of task. It is a somewhat dirty approach, since deep recursion can hit the call-stack limits of ES5 engines (ES6 relaxes this), and it adds overhead that can bottleneck your page load.
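For the first option, a minimal sketch of what the consolidated request could look like (do_all.php and the response shape, a JSON array of HTML strings, are assumptions, not part of your existing code):
// Hypothetical sketch: one request that returns every chunk at once.
$.ajax({
    type: 'POST',
    url: 'do_all.php',                          // assumed consolidated endpoint
    dataType: 'json',
    data: { passes: Math.ceil($max / $offset) },
    success: function(chunks) {
        // One response containing every chunk, appended in order.
        $.each(chunks, function(i, html) {
            $('#update' + i).append(html);
        });
        alert('test'); // everything is done at this point
    }
});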
AJAX calls are asynchronous, and therefore they cannot succeed before the loop ends (as JavaScript is non-blocking). In fact, what you want is to perform an action after the AJAX calls complete, not just after the loop (which is synchronous), so I would suggest either using a Promise, chaining, or aggregating success events like below:
var passes = Math.ceil($max / $offset);

for (let i = 0; i < passes; i++) { // let gives each success callback its own i
    $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form' + i).serialize(),
        success: function(data) {
            $('#update' + i).append(data);
            resolve();
        },
        error: reject
    });
}

var resolved = 0;

function resolve() {
    if (++resolved >= passes) {
        alert('test');
    }
}

function reject() {
    alert('One of the AJAX requests failed');
}

jQuery each not waiting for function inside to finish

I have this inside an ajax function
$.each(data['Result'][0], function(Key, Value) {
    InputGen(Value['Type'], Value['Name'], Value['Other'], Value['Value'], function(Html) {
        Table = Table + "<tr><td>" + Key + "</td><td>" + Html + "</td></tr>";
    });
});
InputGen has a callback from another AJAX function, but when I run this, the loop does not seem to wait for the AJAX calls to finish. How would I achieve this?
...the loop does not seem to be waiting for the ajax to finish.
No, because the "a" in "ajax" stands for asynchronous; it doesn't happen when you make the call, it happens later. But there's no reason the $.each loop would know to sit around and wait for it to finish.
If you don't need it to (e.g., it's okay for the ajax calls to overlap, which normally it should be unless they rely on each other), look at Rocket Hazmat's approach.
If you need each ajax call to wait for the previous one to finish, you can't use $.each (or at least, not the way you were); instead, use an index and respond to the callback by triggering the next request:
// Start with the first entry
var index = 0;
var array = data['Result'][0];

// Do the first request
doRequest();

function doRequest() {
    var value = array[index];
    InputGen(value['Type'], value['Name'], value['Other'], value['Value'], function(Html) {
        Table = Table + "<tr><td>" + index + "</td><td>" + Html + "</td></tr>";
        // This request is done, move on to the next if any
        if (++index < array.length) {
            doRequest();
        }
    });
}
Side note: Overwhelmingly in JavaScript, variables and non-constructor functions are named starting with a lower-case letter: value rather than Value, etc. So I used index, array, and value above.
Side note 2: value['Type'] can be more simply written as value.Type. (And so on.)
This is because AJAX is asynchronous. Nothing is going to wait for it to finish. The callback will run in the future at some point when the call is done. By then your $.each (and the code after) is long done.
The solution here is to use promises. That way you can run a callback once all the AJAX calls are done.
You can use jQuery's $.Deferred for this. Without editing the InputGen() function, you can do something like this:
var promises = [];

$.each(data['Result'][0], function(Key, Value) {
    var d = new $.Deferred();
    InputGen(Value['Type'], Value['Name'], Value['Other'], Value['Value'], function(Html) {
        d.resolve([Key, Html]);
    });
    promises.push(d.promise());
});

$.when.apply($, promises).done(function() {
    for (var i = 0, length = arguments.length; i < length; i++) {
        var ele = arguments[i],
            Key = ele[0],
            Html = ele[1];
        Table = Table + "<tr><td>" + Key + "</td><td>" + Html + "</td></tr>";
    }
    // In here is where you can use your updated `Table` variable.
    // You *cannot* use it outside of here, since it has not been updated yet.
});

HTTP request in while loop in JavaScript

I am trying to send an XMLHttpRequest in a while loop and want to do something with the response in the same loop. Since the requests are asynchronous, how can I achieve this? I need to execute everything serially.
while (i < n) {
    response = self.sendHttpRequest(params);
    // do something with the response
}
Any help would be appreciated.
Even if I use a callback, how can I get back to the same while loop after executing the callback?
There are two ways I can think of:
1. Add a polling loop after the request that waits until response.readyState is set and then processes the response:
while (i < n) {
    response = self.sendHttpRequest(params);
    while (response.readyState != 4) {
        // polling wait
    }
    // do something with the response
}
This option is not really recommended, since it blocks the flow of the code and you can get stuck in the loop if readyState never changes. In fact, for an asynchronous request a busy-wait like this will never see readyState change at all, because the response handler cannot run while the loop is spinning.
2. You can encapsulate the request in a function that will be called recursively when the last response handling finishes:
var i = 0;
var response;

function handle(response) {
    // handle the response here
    i++;
    if (i < n) sendRequest();
}

function sendRequest() {
    // Your request setup code
    response = self.sendHttpRequest(params);
    response.onreadystatechange = function() {
        if (response.readyState == 4) handle(response);
    };
}

sendRequest(); // kick off the first request
The second method is preferred in my opinion, as it maintains the asynchronicity of the HTTP request and doesn't stop the flow of the code; however, it does "break" the loop structure. The first method keeps the loop structure, but is not very good coding practice.
Are you using an AJAX library or plain JS? If you are not using any library, you can pass false as the third argument to the open method, like below:
var xmlHttp = new XMLHttpRequest();
xmlHttp.open({YOUR_METHOD}, {YOUR_PATH}, false);
Passing false to the open method makes the call synchronous, so you can handle the response in the same loop.
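A slightly fuller sketch of that idea, with a placeholder URL; keep in mind that synchronous XHR blocks the page while each request runs:
// Minimal synchronous-XHR sketch; 'do.php?p=' is a placeholder URL.
var i = 0;
while (i < n) {
    var xmlHttp = new XMLHttpRequest();
    xmlHttp.open('GET', 'do.php?p=' + i, false); // false = synchronous
    xmlHttp.send(null);
    if (xmlHttp.status == 200) {
        // The response is available immediately because the call blocked.
        console.log(xmlHttp.responseText);
    }
    i++;
}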

Queuing/throttling jQuery ajax requests

I need to fire a number of AJAX requests at a server and then run a callback when they are finished. Normally this would be easy using jQuery's deferred.done(). However, to avoid overwhelming the server, I'm queuing the requests and firing one every X milliseconds.
e.g.:
var promisesList = [];

var addToQueue = function(workflow) {
    workflowQueue.push(workflow);
}

var startWorkflow = function(workflow) {
    return $.ajax($endointURL, {
        type: "POST",
        data: {
            action: workflow.id
        },
        success: function() {
        },
        error: function(jqXHR, textStatus, errorThrown) {
        }
    });
};

var startWorkflows = function() {
    var promisesList = [];
    if (workflowQueue.length > 0) {
        var workflow = workflowQueue.shift();
        promisesList.push(startWorkflow(workflow));
        setTimeout(startWorkflows, delay);
    }
};

startWorkflows();

$.when(promisesList).done(function() {
    // do stuff
});
The problem with this is that the promisesList array is initially empty, so the done() callback fires immediately, and only then do the AJAX requests start getting sent by setTimeout(). Is there an easy way to create the AJAX requests up front and kind of "pause" them, then fire them using setTimeout()?
I've found various throttle/queue implementations to fire ajax requests sequentially, but I'm happy for them to be fired in parallel, just with a delay on them.
The first thing you're stumbling upon is that when() doesn't work this way with arrays. It accepts an arbitrary list of promises, so you can get around that by applying the array:
$.when.apply(null, promisesList).done(function() {
    // Do something
    // use the `arguments` magic property to get an ordered list of results
});
Secondly, for the throttling, I've proposed a solution that sets up a new deferred for each request; the deferred is returned immediately (to keep the order), but the underlying request is only started, and the deferred only resolved, after a timeout.
See http://jsfiddle.net/9Acb2/1/ for an interactive example
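Roughly, the idea looks like this (the wrapper below is a sketch of what the fiddle does, reusing the question's $endointURL, workflowQueue and delay; it is not the exact fiddle code):
// Hypothetical sketch: return a promise immediately, but only start the
// underlying $.ajax call after an increasing per-request delay.
var startWorkflowDelayed = function(workflow, wait) {
    var deferred = $.Deferred();
    setTimeout(function() {
        $.ajax($endointURL, {
            type: "POST",
            data: { action: workflow.id }
        }).done(deferred.resolve).fail(deferred.reject);
    }, wait);
    return deferred.promise();
};

var promisesList = workflowQueue.map(function(workflow, index) {
    return startWorkflowDelayed(workflow, index * delay); // stagger the requests
});

$.when.apply(null, promisesList).done(function() {
    // all delayed requests have completed
});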

JavaScript function for N AJAX calls

I frequently need to run a function on a webpage after somewhere between two and five modest-sized data files have loaded. Let's say a maximum of 3MB of data split over a maximum of five files.
I try to optimize the load time by making all the AJAX calls at the same time and running an initialize() function after they have all loaded, like so:
var data1, data2;

$(document).ajaxStop(function() {
    $(this).unbind("ajaxStop"); // prevent running again when other calls finish
    initialize();
});

$.ajax({
    url: url_to_data_1,
    success: function(d) {
        data1 = d;
    }
});

$.ajax({
    url: url_to_data_2,
    success: function(d) {
        data2 = d;
    }
});

function initialize() { /* do something with data1 and data2 */ }
But I'm tired of pasting this in every time, so I want a function like this:
function multi_Ajax(urls, callback) {
    var data = {};
    // call init() after both the list of prayers and the word concordance index load
    $(document).ajaxStop(function() {
        $(this).unbind("ajaxStop"); // prevent running again when other calls finish
        callback(data);
    });
    for (var c = 0; c < urls.length; c += 1) {
        // data files
        $.ajax({
            url: urls[c],
            dataType: "json",
            success: function(d) {
                data[urls[c]] = d;
                console.log("Loaded " + urls[c]);
            }
        });
    }
}
This does not work, of course, since the ajax calls do not exist for ajaxStop to catch. But I do not understand how ajaxStop works well enough to get much further. Thank you!
I'm not sure why your second attempt wouldn't work. According to the documentation, whenever an AJAX request completes, jQuery checks whether any other requests are still outstanding and fires ajaxStop if there are none. Calling $.ajax in a loop shouldn't be any different from hardcoding each call, as far as I can tell.
However, as Barmar suggested in the comments, $.when seems like a cleaner way to do what you want to do. Here's an example of how to pass an array (which you can populate in a loop) to $.when: Pass in an array of Deferreds to $.when()
Using $.when seems cleaner than $.ajaxStop because if someone later comes along and adds an unrelated ajax request somewhere before or after your loop, that would interfere with when ajaxStop triggers. $.when allows you to explicitly say which ajax requests you want to wait for.
EDIT: Here's a fiddle showing ajaxStop working for multiple calls issued in a loop: http://jsfiddle.net/zYk5W/
It looks like the reason this wasn't working for you has nothing to do with .ajax or .ajaxStop; instead it's a scope issue in your success callback. Your success callback closes over c, but this is the same c that the outer (loop) scope uses. By the time any of the success callbacks runs, the for loop has completed and c has been incremented to urls.length. Thus every time your success callback runs, urls[c] is undefined. See the fiddle I linked or JavaScript closure inside loops – simple practical example for an example of how to give each success callback its own c in a scope separate from the loop's c.
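Putting both suggestions together, here is a rough sketch of multi_Ajax rewritten to use $.when with one promise per URL, which also sidesteps the closure-over-c problem (the exact structure is an assumption, not the original poster's final code):
// Sketch: collect one promise per URL and give each success callback its own url.
function multi_Ajax(urls, callback) {
    var data = {};
    var requests = $.map(urls, function(url) {
        return $.ajax({
            url: url,
            dataType: "json",
            success: function(d) {
                data[url] = d; // `url` is fixed per request, unlike the shared `c`
                console.log("Loaded " + url);
            }
        });
    });
    // Fires once every request in the array has completed successfully.
    $.when.apply($, requests).done(function() {
        callback(data);
    });
}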
