How to use jQuery $.when to process ajax calls in order?

How do I use $.when in jQuery with chained promises to ensure my ajax requests complete in the right order?
I have an array called costArray made up of a number of dynamic objects. For each item in this array, I make an Ajax request called GetWorkOrder, which returns a WorkOrder (basically a table row element with the class .workOrder) and appends it to the table with id #tbodyWorkOrders.
Once all the items in the array are processed, I use $.when to tell me when I can calculate the SubTotal of each WorkOrder.
My problem is that my WorkOrders are being inserted in random order, since the ajax requests are processed asynchronously. How can I ensure my ajax requests are processed and appended in the correct order?
var i = 0;
$.each(costArray, function (key, value) {
    var d1 = $.get('/WorkOrders/GetWorkOrder', { 'i': i }, function (html) {
        $('#tbodyWorkOrders').append(html);
        $('.workOrder').last().find('input').val(value.Subtotal);
    });
    $.when(d1).done(function () {
        SetSubtotal();
        i++;
    });
});
Edit:
costArray is taken from an earlier ajax call and is an array of items that I am inserting into the table rows:
var costArray = JSON.parse('[{"Trade":"Plasterer","Notes":"Test","Subtotal":"3781.00"}]');
The line:
$('.workOrder').last().find('input').val(value.Subtotal);
is one of many that takes the values from GetWorkOrder and puts them into the correct inputs, but I've left out the extra code for clarity

$.when() processes all the promises you pass it in parallel, not sequentially (the async operations have already been started before you even get to $.when()).
It will collect the results for you in the order you pass the promises to $.when(), but there is no guarantee about the execution order of the operations passed to it.
What I would suggest is that you collect all the results (in order), then insert them in order after they are all done.
I've tried to restructure your code, but it is not clear which items from costArray you want to pass to your Ajax call. You weren't passing anything from costArray in your original code, although the text of your question says you should be. So, anyway, here's a structural outline for how this could work:
var promises = costArray.map(function (value, index) {
    // Fix: you need to pass something from costArray to your ajax call here
    return $.get('/WorkOrders/GetWorkOrder', { 'i': value });
});
$.when.apply($, promises).done(function () {
    // all ajax calls are done here and their results are in order
    // due to the wonders of jQuery, the results of the ajax calls
    // are in arguments[0][0], arguments[1][0], arguments[2][0], etc.
    for (var i = 0; i < arguments.length; i++) {
        var html = arguments[i][0];
        $('#tbodyWorkOrders').append(html);
    }
    SetSubtotal();
});

Wrap it in a function and call it again from your ajax success callback:
ajax(0);

function ajax(key) {
    $.get('/WorkOrders/GetWorkOrder', { 'i': key }, function (html) {
        $('#tbodyWorkOrders').append(html);
        $('.workOrder').last().find('input').val(costArray[key].Subtotal);
        SetSubtotal();
        key++;
        if (key < costArray.length) {
            ajax(key);
        }
    });
}
Edit: On further consideration, while this is one way to do it, it means the ajax calls execute only one at a time, which isn't very time-efficient. I'd go with jfriend00's answer.

Related

NodeJS Synchronous Trouble [duplicate]

Using NodeJS, I'm trying to get info from the database for a list of IDs and populate an array of objects. I need this to happen synchronously: each call should wait for the previous one to finish before being made, and all of the 'each' iterations should finish before the return. I'm having trouble figuring out how to do this.
Example:
getInfo();

function getInfo() {
    var return_arr = [];
    var ids = getListOfIDs();
    // loop through each id, getting more info from the db
    $.each(ids, function (i, v) {
        return_arr.push(getFullInfo(v));
    });
    return return_arr;
}

function getListOfIDs() {
    // database call to get IDs
    // returns array
}

function getFullInfo() {
    // database call to get full info
    // returns object
}
This is a simplified example, so assume that a single query to get all info will not work as there are multiple joins going on and post processing that needs to take place in js. Also, assume I'm doing proper error handling, which I omitted in my example.
Your database queries are asynchronous so you need to either use callbacks or promises to perform your work once the database returns with results.
function getListOfIDs(callback) {
    // database call to get IDs
    db.query('...', function (data) {
        // call the callback and pass it the array
        callback(data);
    });
}

function getInfo() {
    // pass an anonymous function as a callback
    getListOfIDs(function (data) {
        // we now have access to the data passed into the callback
        console.log(data);
    });
}
Your current code example is synchronous, although I don't know how your db code works. The each loop is synchronous; it just iterates over your ids and calls the getFullInfo function.
I'm not sure why you want synchronous code though, as it doesn't utilise Node's event-loop architecture.
What I would do is use a good Promise framework such as Bluebird (http://bluebirdjs.com/docs/api/promise.each.html) with Promise.each, ensuring each iteration occurs serially. Or you could use a callback library such as async (http://caolan.github.io/async/) with async.eachSeries. Either of these ensures that you (a) get the benefit of asynchrony and (b) iterate in a serial fashion.
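For illustration, here is a minimal sketch of the same serial-iteration idea using only native promises (no Bluebird or async needed); eachSeries is a made-up helper name here, not part of either library:

```javascript
// Run an async function over items one at a time, resolving with the
// collected results once every item has been processed in order.
function eachSeries(items, asyncFn) {
    var results = [];
    return items.reduce(function (chain, item) {
        return chain.then(function () {
            return asyncFn(item);          // start the next call only after
        }).then(function (result) {        // the previous one has resolved
            results.push(result);
        });
    }, Promise.resolve()).then(function () {
        return results;
    });
}
```

Bluebird's Promise.each and async's eachSeries follow the same serial pattern, with extra conveniences on top.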
Promise way to do it:
function getListOfIDs() {
    return new Promise((resolve, reject) => {
        // database call
        if (err) {
            reject(your_db_err);
            return;
        }
        resolve(your_db_result);
    });
}

function getInfo() {
    getListOfIDs()
        .then((result) => {
            // do whatever you want with result
        })
        .catch((err) => {
            // handle your err
        });
}

Running Dojo xhr.get in for loop. Executing out of order

The function below is supposed to load a set of JSON files into JavaScript objects and return them as an array. From my debugging I can see that this is all working as it is supposed to, except that the load: function(data) handler executes last, after all the other JavaScript has finished.
So the function executes, runs the xhr.get 15 times, and returns currentDataSet (empty). Only then does the currentDataSet.push(data) inside load: function(data) execute, populating currentDataSet. At that point it is too late, as the return has already run.
require(['dojo/_base/xhr', 'dojo/dom', 'dojo/date/stamp'], function (xhr, dom) {
    generateDataSet = function (startTime) {
        var dataSetFilesNames = generateFilenameSet(startTime);
        var currentDataSet = [];
        for (var j = 0; j < 15; j++) {
            xhr.get({
                url: '/JSONFiles/' + dataSetFilesNames[j],
                handleAs: 'json',
                load: function (data) {
                    // the .push below executes 15 times, after the loop has finished
                    currentDataSet.push(data);
                }
            });
        }
        return currentDataSet;
    };
});
I'll admit this is all new territory for me and I may be misunderstanding how xhr.get works. I guess the load function executes as the responses come back from the requests. Is there any way I can get the function above to execute in the right order, e.g. by waiting for the response of each xhr.get? Open to suggestions.
Thanks in advance.
As you're probably aware, xhr.get happens asynchronously, meaning the code will continue to execute and not wait for it to finish. This is why the order is unpredictable.
Fortunately, this is not a new problem and can be solved using an array of Promises. A Promise interface basically has a handler that will be fired when asynchronous results are returned, whether the call is successful or not. In your case, since you have an array of Promises that you need to handle at once, you can use a function called all.
all (in the case of dojo/promise/all) accepts an array of Promises as input, and executes when all of them have results returned. This means you can use all with a Promise for each of your calls to wait until they're all finished, and then process them. I haven't used dojo/_base/xhr, but I can provide an example using dojo/request/xhr which I'm sure is similar:
// dojo/promise/all and dojo/request/xhr need to be in the surrounding require()
var promises = [];
for (var j = 0; j < 15; j++) {
    var promise = xhr(url, {
        handleAs: "json"
    }).then(function (data) {
        // process data, return results here
        return data;
    });
    promises.push(promise);
}
all(promises).then(function (results) {
    // process the array of all results here
});
I've attached links to all and Promise so you can read about them more.
dojo/promise/all
dojo/promise/Promise

Wait for multiple deferred to complete [duplicate]

So, I have a page that loads and through jquery.get makes several requests to populate drop downs with their values.
$(function () {
    LoadCategories($('#Category'));
    LoadPositions($('#Position'));
    LoadDepartments($('#Department'));
    LoadContact();
});
It then calls LoadContact(), which does another call and, when it returns, populates all the fields on the form. The problem is that often the dropdowns aren't all populated yet, so it can't set them to the correct values.
What I need is for LoadContact to execute only once the other methods are complete and their callbacks have finished executing.
But I don't want to put a bunch of flags at the end of the dropdown-population callbacks that I then check, with a recursive setTimeout call polling them, before calling LoadContact().
Is there something in jQuery that allows me to say, "Execute this, when all of these are done."?
More Info
I am thinking something along these lines
$().executeAfter(
    function () { // When these are done
        LoadCategories($('#Category'));
        LoadPositions($('#Position'));
        LoadDepartments($('#Department'));
    },
    LoadContact // Do this
);
...it would need to keep track of the ajax calls that happen during the execution of the methods, and when they are all complete, call LoadContact;
If I knew how to intercept ajax that are being made in that function, I could probably write a jQuery extension to do this.
My Solution
;(function ($) {
    $.fn.executeAfter = function (methods, callback) {
        var stack = [];
        var trackAjaxSend = function (event, XMLHttpRequest, ajaxOptions) {
            var url = ajaxOptions.url;
            stack.push(url);
        };
        var trackAjaxComplete = function (event, XMLHttpRequest, ajaxOptions) {
            var url = ajaxOptions.url;
            var index = jQuery.inArray(url, stack);
            if (index >= 0) {
                stack.splice(index, 1);
            }
            if (stack.length == 0) {
                callback();
                $this.unbind("ajaxComplete");
            }
        };
        var $this = $(this);
        $this.ajaxSend(trackAjaxSend);
        $this.ajaxComplete(trackAjaxComplete);
        methods();
        $this.unbind("ajaxSend");
    };
})(jQuery);
This binds to the ajaxSend event while the methods are being called and keeps a list of urls (need a better unique id though) that are called. It then unbinds from ajaxSend so only the requests we care about are tracked. It also binds to ajaxComplete and removes items from the stack as they return. When the stack reaches zero, it executes our callback, and unbinds the ajaxComplete event.
You can use .ajaxStop() like this:
$(function () {
    $(document).ajaxStop(function () {
        $(this).unbind("ajaxStop"); // prevent running again when other calls finish
        LoadContact();
    });
    LoadCategories($('#Category'));
    LoadPositions($('#Position'));
    LoadDepartments($('#Department'));
});
This will run when all current requests are finished, then unbind itself so it doesn't run if future ajax calls in the page execute. Also, make sure to put it before your ajax calls so it gets bound early enough; this matters more with .ajaxStart(), but it's best practice to do it with both.
Expanding on Tom Lianza's answer, $.when() is now a much better way to accomplish this than using .ajaxStop().
The only caveat is that you need to be sure the asynchronous methods you need to wait on return a Deferred object. Luckily jQuery ajax calls already do this by default. So to implement the scenario from the question, the methods that need to be waited on would look something like this:
function LoadCategories(argument) {
    var deferred = $.ajax({
        // ajax setup
    }).then(function (response) {
        // optional callback to handle this response
    });
    return deferred;
}
Then to call LoadContact() after all three ajax calls have returned and optionally executed their own individual callbacks:
// setting variables to emphasize that the functions must return deferred objects
var deferred1 = LoadCategories($('#Category'));
var deferred2 = LoadPositions($('#Position'));
var deferred3 = LoadDepartments($('#Department'));
$.when(deferred1, deferred2, deferred3).then(LoadContact);
If you're on jQuery 1.5 or later, I suspect the Deferred object is your best bet:
http://api.jquery.com/category/deferred-object/
The helper method, when, is also quite nice:
http://api.jquery.com/jQuery.when/
But, I don't want to have to put a bunch of flags in the end of the drop down population callbacks, that I then check, and have to have a recursive setTimeout call checking, prior to calling LoadContact();
No need for setTimeout. You just check in each callback that all three lists are populated (or better setup a counter, increase it in each callback and wait till it's equal to 3) and then call LoadContact from callback. Seems pretty easy to me.
The ajaxStop approach might work too; I'm just not very familiar with it.
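As a rough sketch of the counter idea above (makeJoin is a made-up name for illustration): keep one shared counter, bump it in each success callback, and fire LoadContact only when it reaches the number of outstanding calls:

```javascript
// Returns a function to call from each ajax success callback; it invokes
// onAllDone exactly once, after being called `total` times.
function makeJoin(total, onAllDone) {
    var finished = 0;
    return function () {
        finished++;
        if (finished === total) {
            onAllDone();
        }
    };
}
```

Usage would be something like var done = makeJoin(3, LoadContact); with each of LoadCategories, LoadPositions and LoadDepartments calling done() at the end of its own callback.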

How to use jQuery when function with more postJSON queries?

I have a problem at work: I have a section of installations which are dependent on servers. When a user deletes a server, I want to loop through the installations collection and delete all dependent installations. For that I use jQuery's 'when' function, which is supposed to wait for a response from the server and then move on to the 'then' function. It works flawlessly when there is only one dependent installation. A problem occurs when there are more installations, however, because it moves on to the 'then' function immediately after receiving the first JSON response.
The question is: how do I make 'when' wait for all the server responses? E.g. I send out three delete requests through $.postJSON and want to move on only after I get all three responses. If it's not possible with 'when', what should I use to make it happen? If it helps, I maintain all my entity collections with KnockoutJS. Thanks!
EDIT:
I have it like this:
$.when(DeleteDependentInstallations())
    .then(function () {
        ...
    });
DeleteDependentInstallations looks like (pseudocode):
Search the installations collection;
If installation.ID equals server.InstallationID {
    Add to dependent installations collection;
}
Repeat until the whole collection is searched;

for (i = 0; i < dependentInstallations.length; i++) {
    DeleteInstallation(dependentInstallations[i]);
}
DeleteInstallation is a simple function that uses $.postJSON.
The problem is the .then function executes immediately after the first JSON response.
I think you need to have DeleteDependentInstallations return an array of jQuery deferreds. $.when allows you to pass it multiple arguments, so it knows it has to wait for each one.
I don't know the whole context of what you're doing, but I think something like this might work:
function DeleteDependentInstallations() {
    var installations = getDependantInstallations();
    var promises = [];
    for (var i = 0; i < installations.length; i++) {
        var installation = installations[i];
        promises.push(DeleteInstallation(installation));
    }
    return promises;
}

function DeleteInstallation(installation) {
    // do whatever here, but return the result of $.post
    return $.post('/foo', installation);
}
Now when you use that method, it should wait for all returned promises to complete.
$.when.apply(null, DeleteDependentInstallations()).then(function() { alert('wee!'); });
The .apply() is so we can pass an array as an arguments collection.
EDIT: I was confusing "deferreds" and promises in my head. Deferreds are what the $.ajax calls return, and a promise is what the $.when() function returns.
EDIT2: You might also want to look at the .done() method, if the behavior of .then() doesn't suit your needs.

trying to incorporate the asynchronous scheme

I have an application that uses several instances of getJSON, and I'm getting into lots of trouble. Pointy once suggested reworking the main routine to include asynchronous processing, and I'm agreeing (now that I understand something of this).
Before attempting to rework this, it was structured like this:
Fill some arrays;
Call processArray to create a set of strings for each;
Stick the strings into the DIVs.
In the processArray routine I call $.getJSON a couple of times, and you folks have pointed out that I'm expecting values I have no right to expect. The overall routine processes an array into a complex string, but some arrays have to be sorted (unconventionally) first. My original structure began by asking: is this an array to be sorted? If so, I did such and such (involving getJSON) and then returned to the main routine. But what I had done to the array did not make it back to the main routine, which continued to see the original array contents.
So, that processArray was configured like so:
function processArray(arWorking, boolToSort...) {
    if (boolToSort) { /* do special stuff */ }
    // continue on with processing
    return complexString;
}
I figured that I would try to guarantee the inclusion of the sorted array in the main routine by replacing the 'arWorking' argument with a function that did the sorting if processArray was called with boolToSort = true. In my thinking, the rest of the main routine would go on with one of two forms of array: the original as passed or the sorted array. To this end, I made the sorting routine a separate routine: SortArray(arrayToUse).
I came up with this:
function processArray( function (arWorking) {
    if (boolToSort) SortArray(arWorking);
    else return arWorking;
}, boolToSort, ...) {
    // main routine
    return complexString;
}
Both FireFox and IE9 object. FF breaks to jQuery, while IE9 wants an identifier in the calling arguments.
What looks to be wrong? Can I use boolToSort in my "argument function?"
The first part of understanding this is this:
$.getJSON() does its work asynchronously. That means that when you call it, all it does is start the operation. The code following that function call continues to execute while the $.getJSON() call works in the background. Some time later, the JSON results become available and the success handler gets called.
ONLY in the success handler can you use those results.
As such, you cannot write normal procedural code that does this:
function processArray() {
    $.getJSON(url, function (data) {
        // only in here can you process the data returned from the getJSON call
    });
    // you cannot use the JSON data here as it is not yet available
    // you cannot return any of the JSON data from the processArray function
}
Instead, you must write code that uses the success handler. Here's one way of doing that:
function processArrays(urlToProcess1, urlToProcess2, callbackWhenDone) {
    $.getJSON(urlToProcess1, function (data) {
        // only in here can you process the data returned from the getJSON call
        // do whatever you want to do with the JSON data here
        // when you are done processing it, you can make your next getJSON call
        $.getJSON(urlToProcess2, function (data) {
            // do whatever you want to do with the JSON data here
            // when done, call your callback function to continue on with other work
            callbackWhenDone();
        });
    });
}
Another thing you cannot do is this:
function processArray() {
    var result;
    $.getJSON(url, function (data) {
        // only in here can you process the data returned from the getJSON call
        result = data;
    });
    return result;
}

var data = processArray();
// code that uses data
You cannot do this because the result data is not available when processArray() returns. That means you cannot use it inside processArray() (outside the success handler), you cannot return it from processArray(), and you cannot use it in code written after processArray(). You can only use that data from within the success handler, or in code called from the success handler.
If you had a whole bunch of URLs to process and you used the same code on each one, you could pass an array of URLs and loop through them, starting the next getJSON call only when the success handler of the first was called.
If you had a whole bunch of URLs each with different code, you could pass an array of URLs and an array of callback functions (one for each URL).
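A rough sketch of that array-of-URLs approach, assuming a getJSON-style function that takes a URL and a success callback (processUrlsInOrder is a made-up name for illustration):

```javascript
// Fetch each URL in sequence: the next request starts only inside the
// success handler of the previous one, so results stay in order.
function processUrlsInOrder(urls, getJson, onDone) {
    var results = [];
    function next(index) {
        if (index >= urls.length) {
            onDone(results);       // every URL has been processed
            return;
        }
        getJson(urls[index], function (data) {
            results.push(data);    // handle this response...
            next(index + 1);       // ...then kick off the next request
        });
    }
    next(0);
}
```

For the different-code-per-URL case, you would pass a parallel array of callbacks and invoke callbacks[index] on each response before calling next.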
FYI, I see no issue with passing boolToSort. It sounds to me like the issue is with how you handle asynchronous ajax calls.
For completeness, it is possible to use synchronous ajax calls, but that is NOT recommended because it makes for a bad user experience: it locks up the browser for the duration of the network operations. It's much better to use normal asynchronous ajax calls and structure your code to work properly with them.
