jQuery deferred from callback of ajax calls - javascript

I'm trying to write a procedure that does something after 2 objects are returned as a result of the callbacks of 2 ajax calls.
I know the classic example of using jQuery's $.when():
$.when($.get("http://localhost:3000/url1"),
       $.get("http://localhost:3000/url2")).done(/* do something */);
But in my case, I don't want to trigger the when on the execution of the ajax functions; I want the when to trigger on the callbacks of those ajax functions. Something like:
$.get("http://localhost:3000/url1", function(data){
function(){
//do something with the data, and return myobj1;
}
});
$.get("http://localhost:3000/url2", function(data){
function(){
//do something with the data, and return myobj2;
}
});
$.when(obj1, obj2).done(function(){
//do something with these 2 objects
});
But of course, that doesn't work. Ideas?

That actually should work. jQuery.when() takes multiple arguments and fires once they have all completed, passing each request's result arguments through as an array:
var req1 = $.get("http://localhost:3000/url1");
var req2 = $.get("http://localhost:3000/url2");
$.when(req1, req2).done(function(res1, res2) {
    // do something with these 2 objects
});
If you don't want to handle the requests together you can create your own deferreds and use those:
var deferred1 = $.Deferred(),
    deferred2 = $.Deferred();
$.get("http://localhost:3000/url1", function(data){
    // do something with the data to build myobj1, then:
    deferred1.resolve(myobj1);
});
$.get("http://localhost:3000/url2", function(data){
    // do something with the data to build myobj2, then:
    deferred2.resolve(myobj2);
});
$.when(deferred1, deferred2).done(function(myobj1, myobj2){
    // do something with these 2 objects
});

Or you can manage the control flow yourself:
$(function(){ $('body').addClass('doc-ready'); });
var thingsToLoad = ['blabla.js', 'blublu.js', 'weee.js'];
var launch = function(){
    // do whatever you want to do after loading is complete
    // this will be invoked after dom ready.
    // objectCollection will have everything you loaded.
    // and you can wrap your js files in functions, and execute whenever you want.
};
var loadTester = (function() {
    var loadCounter = 0,
        loadEnds = thingsToLoad.length; // total number of items that need to be loaded
    return function() {
        loadCounter += 1;
        if (loadCounter === loadEnds) {
            if ($('body').hasClass('doc-ready')) {
                launch();
            } else {
                /* if body doesn't have the ready class yet, attach a ready handler and launch then */
                $(function() {
                    launch();
                });
            }
        }
    };
}());
$.each(thingsToLoad, function(i) {
    $.ajax({
        url      : thingsToLoad[i],
        mimeType : 'application/javascript',
        dataType : 'script',
        success  : function(data) {
            loadTester();
        }
    });
});
Add your files to the thingsToLoad array; it will be iterated over, and each successful load will call loadTester.
loadTester counts the loaded files against the length of thingsToLoad; when the counts match and the DOM is ready, it calls launch().
If you're just loading HTML or text files, you can pass the data argument from the ajax success handler into loadTester and accumulate it there (in a private variable, like loadCounter and loadEnds), then pass the accumulated array or object to launch(), as in the sketch below.
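For illustration, a minimal sketch of that variation (hypothetical; it assumes launch() is changed to accept the collected results):
var loadTester = (function() {
    var loadCounter = 0,
        loadEnds = thingsToLoad.length,
        results = []; // private accumulator, like loadCounter and loadEnds
    return function(data) {
        results.push(data); // each success handler passes its response in
        loadCounter += 1;
        if (loadCounter === loadEnds) {
            $(function() {
                launch(results); // launch() receives everything that was loaded
            });
        }
    };
}());
The success handler in the $.each loop would then call loadTester(data) instead of loadTester().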

Related

Stopping Code until AJAX call completes inside $.each loop

I have a function that loads HTML from an external file via an AJAX call using jQuery.
This ajax call runs inside a $.each loop, and I need the call to finish before the loop continues. Here is my code:
$('img').each(function(){
    var cotainer_html = $('.cotainer_html').clone();
    /* Get Image Content */
    $.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
        }
    });
});
I know I can set async:false but I hear that is not a good idea. Any ideas?
To achieve this you can push each request into an array and apply() that array to $.when. Try this:
var requests = [];
$('img').each(function(){
    var cotainer_html = $('.cotainer_html').clone();
    /* Get Image Content */
    requests.push($.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
        }
    }));
});

$.when.apply($, requests).done(function() {
    console.log('all requests complete');
});
Note that you're replacing the same content on each request, so the only one which will have any effect on the UI is the last request. The preceding ones are redundant.
Also note that you should never, ever use async: false. It locks the UI thread of the browser until the request completes, which makes it look like it has crashed to the user. It is terrible practice. Use callbacks.
The OP appears to want the calls to run in series, not in parallel. If that's the case, you could use recursion:
function makeRequest($els, index) {
    var cotainer_html = $('.cotainer_html').clone();
    $.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
            if ($els.eq(index + 1).length) {
                makeRequest($els, ++index);
            } else {
                console.log('all requests complete');
            }
        }
    });
}
makeRequest($('img'), 0);
You can use a pseudo-recursive loop:
var imgs = $('img').get();

var done = (function loop() {
    var img = imgs.shift();
    if (img) {
        var cotainer_html = $('.cotainer_html').clone();
        /* Get Image Content */
        return $.get('/assets/ajax/get_studio_image.php')
            .then(function(data) {
                cotainer_html.find('.replaceme').replaceWith(data);
            }).then(loop);
    } else {
        return $.Deferred().resolve(); // resolved when the loop terminates
    }
})();
This will take each element of the list, get the required image, and .then() start over until there's nothing left.
The immediately invoked function expression itself returns a Promise, so you can chain a .then() call to it that will be invoked once the loop has completed:
done.then(function() {
// continue your execution here
...
});

Finding out when both JSON requests have completed

I currently have the following code:
function render(url1, url2, message) {
    utility.messageBoxOpen(message);
    $.getJSON(url1, function (items) {
        // Do something
        utility.messageBoxClose();
    });
    $.getJSON(url2, function (items) {
        // Do something
    });
}
When the function is executed, a modal window appears to inform the user that something is loading. Initially I only had one $.getJSON request, so the modal window closed when that request was done, as per the code above.
I am looking to add another $.getJSON request, but want to close the modal window only when both $.getJSON requests have completed.
What is the best way of achieving this?
You're looking for $.when().
All jQuery ajax requests (including shortcuts like $.getJSON) return deferred objects, which can be used to coordinate other actions:
var dfd1 = $.getJSON(url1, function (items) {
    // Do something
});
var dfd2 = $.getJSON(url2, function (items) {
    // Do something
});
$.when(dfd1, dfd2).then(function(){
    // both succeeded
    utility.messageBoxClose();
}, function(){
    // one or more of them failed
});
If you don't care whether the getJSON calls succeed and only care that they have finished, you can instead:
$.when(dfd1, dfd2).done( utility.messageBoxClose );
You could also use a flag variable:
function render(url1, url2, message) {
    utility.messageBoxOpen(message);
    var isOneDone = false;
    $.getJSON(url1, function (items) {
        // Do something
        if (!isOneDone)
            isOneDone = true;
        else
            utility.messageBoxClose();
    });
    $.getJSON(url2, function (items) {
        // Do something
        if (!isOneDone)
            isOneDone = true;
        else
            utility.messageBoxClose();
    });
}
You can replace the getJSON() call with one using $.ajax, which accomplishes the same thing but gives you more flexibility:
$.ajax({
    url: 'http://whatever',
    dataType: 'json',
    async: false,
    data: { /* data */ },
    success: function(data) {
        // do the thing
    }
});
Note the async:false part - this makes the code execution pause until the request is completed. So you could simply make your two calls this way, and close the dialog after the second call is completed.
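A sketch of what that looks like in full (bearing in mind the caveat elsewhere in this thread: async: false freezes the browser while each request is in flight):
function render(url1, url2, message) {
    utility.messageBoxOpen(message);
    // each call blocks until its response has arrived
    $.ajax({ url: url1, dataType: 'json', async: false,
             success: function (items) { /* Do something */ } });
    $.ajax({ url: url2, dataType: 'json', async: false,
             success: function (items) { /* Do something */ } });
    // both requests are guaranteed to be finished here
    utility.messageBoxClose();
}
That keeps the original structure, but at the cost of locking the UI for the duration of both requests.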
function render(url1, url2, message) {
    utility.messageBoxOpen(message);
    $.when($.getJSON(url1, function (items) {
        // Do something
    }), $.getJSON(url2, function (items) {
        // Do something
    })).then(function () {
        // Both complete
        utility.messageBoxClose();
    });
}
jQuery.when

Sequencing ajax requests

I find I sometimes need to iterate some collection and make an ajax call for each element. I want each call to return before moving to the next element so that I don't blast the server with requests - which often leads to other issues. And I don't want to set async to false and freeze the browser.
Usually this involves setting up some kind of iterator context that I step through upon each success callback. I think there must be a cleaner, simpler way?
Does anyone have a clever design pattern for how to neatly work thru a collection making ajax calls for each item?
jQuery 1.5+
I developed an $.ajaxQueue() plugin that uses $.Deferred, .queue(), and $.ajax() to pass back a promise that is resolved when the request completes.
/*
 * jQuery.ajaxQueue - A queue for ajax requests
 *
 * (c) 2011 Corey Frang
 * Dual licensed under the MIT and GPL licenses.
 *
 * Requires jQuery 1.5+
 */
(function($) {

// jQuery on an empty object, we are going to use this as our Queue
var ajaxQueue = $({});

$.ajaxQueue = function( ajaxOpts ) {
    var jqXHR,
        dfd = $.Deferred(),
        promise = dfd.promise();

    // queue our ajax request
    ajaxQueue.queue( doRequest );

    // add the abort method
    promise.abort = function( statusText ) {
        // proxy abort to the jqXHR if it is active
        if ( jqXHR ) {
            return jqXHR.abort( statusText );
        }

        // if there wasn't already a jqXHR we need to remove from queue
        var queue = ajaxQueue.queue(),
            index = $.inArray( doRequest, queue );

        if ( index > -1 ) {
            queue.splice( index, 1 );
        }

        // and then reject the deferred
        dfd.rejectWith( ajaxOpts.context || ajaxOpts,
            [ promise, statusText, "" ] );
        return promise;
    };

    // run the actual query
    function doRequest( next ) {
        jqXHR = $.ajax( ajaxOpts )
            .done( dfd.resolve )
            .fail( dfd.reject )
            .then( next, next );
    }

    return promise;
};

})(jQuery);
jQuery 1.4
If you're using jQuery 1.4, you can utilize the animation queue on an empty object to create your own "queue" for your ajax requests for the elements.
You can even factor this into your own $.ajax() replacement. This plugin $.ajaxQueue() uses the standard 'fx' queue for jQuery, which will auto-start the first added element if the queue isn't already running.
(function($) {
    // jQuery on an empty object, we are going to use this as our Queue
    var ajaxQueue = $({});

    $.ajaxQueue = function(ajaxOpts) {
        // hold the original complete function
        var oldComplete = ajaxOpts.complete;

        // queue our ajax request
        ajaxQueue.queue(function(next) {
            // create a complete callback to fire the next event in the queue
            ajaxOpts.complete = function() {
                // fire the original complete if it was there
                if (oldComplete) oldComplete.apply(this, arguments);
                next(); // run the next query in the queue
            };

            // run the query
            $.ajax(ajaxOpts);
        });
    };
})(jQuery);
Example Usage
So, we have a <ul id="items"> which has some <li> that we want to copy (using ajax!) to the <ul id="output">
// get each item we want to copy
$("#items li").each(function(idx) {
    // queue up an ajax request
    $.ajaxQueue({
        url: '/echo/html/',
        data: { html: "[" + idx + "] " + $(this).html() },
        type: 'POST',
        success: function(data) {
            // Write to #output
            $("#output").append($("<li>", { html: data }));
        }
    });
});
jsfiddle demonstration - 1.4 version
A quick and small solution using deferred promises. Although this uses jQuery's $.Deferred, any other deferred implementation should do.
var Queue = function () {
    var previous = new $.Deferred().resolve();
    return function (fn, fail) {
        return previous = previous.then(fn, fail || fn);
    };
};
Usage: call Queue() to create new queues:
var queue = Queue();

// Queue empty, will start immediately
queue(function () {
    return $.get('/first');
});

// Will begin when the first has finished
queue(function () {
    return $.get('/second');
});
See the example with a side-by-side comparison of asynchronous requests.
This works by creating a function that automatically chains promises together. The sequential behaviour comes from the fact that we wrap the $.get calls in functions and push them onto a queue. The execution of these functions is deferred; each is only called when it reaches the front of the queue.
A requirement of the code is that each function you pass must return a promise. That returned promise is chained onto the latest promise in the queue. When you call the queue(...) function it chains onto the last promise, hence the previous = previous.then(...).
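To make the chaining concrete: after the two queue(...) calls above, previous is effectively the tail of a chain like this (a conceptual sketch, not literal library output):
var previous = new $.Deferred().resolve()            // starts out resolved
    .then(function () { return $.get('/first'); })   // runs immediately
    .then(function () { return $.get('/second'); }); // runs after /first resolves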
You can wrap all that complexity into a function to make a simple call that looks like this:
loadSequentially(['/a', '/a/b', 'a/b/c'], function() { alert('all loaded'); });
Below is a rough sketch (a working example, except for the ajax call). This can be modified to use a queue-like structure instead of an array.
// load sequentially the given array of URLs and call 'funCallback' when all's done
function loadSequentially(arrUrls, funCallback) {
    var idx = 0;

    // callback function that is called when an individual ajax call is done
    // internally calls the next ajax URL in the sequence, or if there aren't any left,
    // calls the final user-specified callback function
    var individualLoadCallback = function() {
        if (++idx >= arrUrls.length) {
            doCallback(arrUrls, funCallback);
        } else {
            loadInternal();
        }
    };

    // makes the ajax call
    var loadInternal = function() {
        if (arrUrls.length > 0) {
            ajaxCall(arrUrls[idx], individualLoadCallback);
        } else {
            doCallback(arrUrls, funCallback);
        }
    };

    loadInternal();
};

// dummy function, replace with an actual ajax call
function ajaxCall(url, funCallBack) {
    alert(url);
    funCallBack();
};

// final callback when everything's loaded
function doCallback(arrUrls, func) {
    try {
        func();
    } catch (err) {
        // handle errors
    }
};
Ideally, a coroutine with multiple entry points, so that every callback from the server can call the same coroutine, would be neat. Damn, this is about to be implemented in JavaScript 1.7.
Let me try using a closure...
function BlockingAjaxCall(URL, arr, AjaxCall, OriginalCallBack)
{
    // counter wrapped in a closure; each call returns the next index
    var nextindex = (function()
    {
        var i = 0;
        return function()
        {
            return i++;
        };
    })();

    var AjaxCallRecursive = function(){
        var currentindex = nextindex();
        AjaxCall
        (
            URL,
            arr[currentindex],
            function()
            {
                OriginalCallBack();
                if (currentindex < arr.length - 1)
                {
                    AjaxCallRecursive();
                }
            }
        );
    };
    AjaxCallRecursive();
}
// suppose you always call Ajax like AjaxCall(URL, element, callback); you would then do it this way:
BlockingAjaxCall(URL, myArray, AjaxCall, CallBack);
Yeah, while the other answers will work, they involve a lot of code and look messy. Frame.js was designed to elegantly address this situation. https://github.com/bishopZ/Frame.js
For instance, this will cause most browsers to hang:
for (var i = 0; i < 1000; i++) {
    $.ajax('myserver.api', { data: i, type: 'post' });
}
While this will not:
for (var i = 0; i < 1000; i++) {
    Frame(function(callback) {
        $.ajax('myserver.api', { data: i, type: 'post', complete: callback });
    });
}
Frame.start();
Also, using Frame allows you to waterfall the response objects and deal with them all after the entire series of AJAX request have completed (if you want to):
var listOfAjaxObjects = [ {}, {}, ... ]; // an array of option objects for $.ajax

$.each(listOfAjaxObjects, function(i, item){
    Frame(function(nextFrame){
        item.complete = function(response){
            // do stuff with this response or wait until end
            nextFrame(response); // ajax response objects will waterfall to the next Frame()
        };
        $.ajax(item);
    });
});
Frame(function(callback){ // runs after all the AJAX requests have returned
    var ajaxResponses = [];
    $.each(arguments, function(i, arg){
        if (i !== 0) { // the first argument is always the callback function
            ajaxResponses.push(arg);
        }
    });
    // do stuff with the responses from your AJAX requests
    // if an AJAX request returned an error, the error object will be present in place of the response object
    callback();
});
Frame.start();
I am posting this answer thinking that it might help others in the future who are looking for simple solutions in the same scenario.
This is now also possible using the native promise support introduced in ES6. You can wrap the ajax call in a promise and return it to the handler of the element.
function ajaxPromise(elInfo) {
    return new Promise(function (resolve, reject) {
        // Do anything as desired with the elInfo passed as parameter
        $.ajax({
            type: "POST",
            url: '/someurl/',
            data: { data: "somedata" + elInfo },
            success: function (data) {
                // Do anything as desired with the data received from the server,
                // and then resolve the promise
                resolve();
            },
            error: function (err) {
                reject(err);
            },
            async: true
        });
    });
}
Now call the function recursively, from where you have the collection of the elements.
function callAjaxSynchronous(elCollection) {
    if (elCollection.length > 0) {
        var el = elCollection.shift();
        ajaxPromise(el)
            .then(function () {
                callAjaxSynchronous(elCollection);
            })
            .catch(function (err) {
                // Abort further ajax calls/continue with the rest
                // callAjaxSynchronous(elCollection);
            });
    } else {
        return false;
    }
}
I use http://developer.yahoo.com/yui/3/io/#queue to get that functionality.
The only solutions I can come up with are, as you say, maintaining a list of pending calls/callbacks, or nesting the next call in the previous callback, which feels a bit messy (an example of that nesting is sketched below).
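The nesting mentioned above would look something like this (hypothetical URLs), which shows why it gets messy as the collection grows:
$.get('/item/1', function (data1) {
    $.get('/item/2', function (data2) {
        $.get('/item/3', function (data3) {
            // each request starts only after the previous one returns,
            // but every element adds another level of indentation
        });
    });
});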
You can achieve the same thing using then.
var files = [
    'example.txt',
    'example2.txt',
    'example.txt',
    'example2.txt',
    'example.txt',
    'example2.txt',
    'example2.txt',
    'example.txt'
];

nextFile().done(function(){
    console.log("done", arguments);
});

function nextFile(text){
    var file = files.shift();
    if (text)
        $('body').append(text + '<br/>');
    if (file)
        return $.get(file).then(nextFile);
}
http://plnkr.co/edit/meHQHU48zLTZZHMCtIHm?p=preview
I would suggest a slightly more sophisticated approach which is reusable for different cases.
I use it, for example, when I need to slow down a call sequence while the user is typing in a text editor.
But I am sure it should also work when iterating through a collection. In that case it can queue requests and send a single AJAX call instead of 12.
queueing = {
    callTimeout: undefined,
    callTimeoutDelayTime: 1000,
    callTimeoutMaxQueueSize: 12,
    callTimeoutCurrentQueueSize: 0,

    queueCall: function (theCall) {
        clearTimeout(this.callTimeout);
        if (this.callTimeoutCurrentQueueSize >= this.callTimeoutMaxQueueSize) {
            theCall();
            this.callTimeoutCurrentQueueSize = 0;
        } else {
            var _self = this;
            this.callTimeout = setTimeout(function () {
                theCall();
                _self.callTimeoutCurrentQueueSize = 0;
            }, this.callTimeoutDelayTime);
        }
        this.callTimeoutCurrentQueueSize++;
    }
};
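Usage might look like this (a sketch; #editor and saveDraft are hypothetical):
$('#editor').on('keyup', function () {
    queueing.queueCall(function () {
        // fires after the user pauses for callTimeoutDelayTime (1s),
        // or immediately once 12 calls have queued up
        saveDraft($('#editor').val()); // saveDraft is a hypothetical AJAX wrapper
    });
});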
There's a very simple way to achieve this: add async: false as a property of the ajax call. This makes sure the ajax call completes before the code after it runs. I have used this successfully in loops many times.
Eg.
$.ajax({
    url: "",
    type: "GET",
    async: false
    ...

Parallel asynchronous Ajax requests using jQuery

I'd like to update a page based upon the results of multiple ajax/json requests. Using jQuery, I can "chain" the callbacks, like this very simple stripped down example:
$.getJSON("/values/1", function(data) {
    // data = {value: 1}
    var value_1 = data.value;
    $.getJSON("/values/2", function(data) {
        // data = {value: 42}
        var value_2 = data.value;
        var sum = value_1 + value_2;
        $('#mynode').html(sum);
    });
});
However, this results in the requests being made serially. I'd much rather a way to make the requests in parallel, and perform the page update after all are complete. Is there any way to do this?
jQuery's $.when() and .done() are exactly what you need:
$.when($.ajax("/page1.php"), $.ajax("/page2.php"))
.then(myFunc, myFailure);
Try this solution, which can support any specific number of parallel queries:
var done = 5; // number of total requests
var sum = 0;

/* Normal loops don't create a new scope */
$([1, 2, 3, 4, 5]).each(function() {
    var number = this;
    $.getJSON("/values/" + number, function(data) {
        sum += data.value;
        done -= 1;
        if (done == 0) $("#mynode").html(sum);
    });
});
Run multiple AJAX requests in parallel
When working with APIs, you sometimes need to issue multiple AJAX requests to different endpoints. Instead of waiting for one request to complete before issuing the next, you can speed things up by requesting the data in parallel, using jQuery's $.when() function:
JS
$.when($.get('1.json'), $.get('2.json')).then(function (r1, r2) {
    console.log(r1[0].message + " " + r2[0].message);
});
The callback function is executed when both of these GET requests finish successfully. $.when() takes the promises returned by two $.get() calls, and constructs a new promise object. The r1 and r2 arguments of the callback are arrays, whose first elements contain the server responses.
Here's my attempt at directly addressing your question.
Basically, you just build up an AJAX call stack, execute them all, and a provided function is called upon completion of all the events; the provided argument is an array of the results from all the supplied ajax requests.
Clearly this is early code; you could get more elaborate with this in terms of flexibility.
<script type="text/javascript" src="http://jqueryjs.googlecode.com/files/jquery-1.3.2.min.js"></script>
<script type="text/javascript">

var ParallelAjaxExecuter = function( onComplete )
{
    this.requests = [];
    this.results = [];
    this.onComplete = onComplete;
}

ParallelAjaxExecuter.prototype.addRequest = function( method, url, data, format )
{
    this.requests.push( {
          "method"    : method
        , "url"       : url
        , "data"      : data
        , "format"    : format
        , "completed" : false
    } )
}

ParallelAjaxExecuter.prototype.dispatchAll = function()
{
    var self = this;
    $.each( self.requests, function( i, request )
    {
        request.method( request.url, request.data, function( r )
        {
            return function( data )
            {
                r.completed = true;
                self.results.push( data );
                self.checkAndComplete();
            }
        }( request ) )
    } )
}

ParallelAjaxExecuter.prototype.allRequestsCompleted = function()
{
    var i = 0, request;
    while ( request = this.requests[i++] )
    {
        if ( request.completed === false )
        {
            return false;
        }
    }
    return true;
}

ParallelAjaxExecuter.prototype.checkAndComplete = function()
{
    if ( this.allRequestsCompleted() )
    {
        this.onComplete( this.results );
    }
}

var pe = new ParallelAjaxExecuter( function( results )
{
    alert( eval( results.join( '+' ) ) );
} );

pe.addRequest( $.get, 'test.php', {n:1}, 'text' );
pe.addRequest( $.get, 'test.php', {n:2}, 'text' );
pe.addRequest( $.get, 'test.php', {n:3}, 'text' );
pe.addRequest( $.get, 'test.php', {n:4}, 'text' );

pe.dispatchAll();

</script>
here's test.php
<?php
echo pow( $_GET['n'], 2 );
?>
Update: Per the answer given by Yair Leviel, this answer is obsolete. Use a promise library, like jQuery.when() or Q.js.
I created a general purpose solution as a jQuery extension. Could use some fine tuning to make it more general, but it suited my needs. The advantage of this technique over the others in this posting as of the time of this writing was that any type of asynchronous processing with a callback can be used.
Note: I'd use Rx extensions for JavaScript instead of this if I thought my client would be okay with taking a dependency on yet-another-third-party-library :)
// jQuery extension for running multiple async methods in parallel
// and getting a callback with all results when all of them have completed.
//
// Each worker is a function that takes a callback as its only argument, and
// fires up an async process that calls this callback with its result.
//
// Example:
//      $.parallel(
//          function (callback) { $.get("form.htm", {}, callback, "html"); },
//          function (callback) { $.post("data.aspx", {}, callback, "json"); },
//          function (formHtml, dataJson) {
//              // Handle success; each argument to this function is
//              // the result of the corresponding ajax call above.
//          }
//      );
(function ($) {
    $.parallel = function (anyNumberOfWorkers, allDoneCallback) {
        var workers = [];
        var workersCompleteCallback = null;

        // To support any number of workers, use the "arguments" variable to
        // access function arguments rather than the names above.
        var lastArgIndex = arguments.length - 1;
        $.each(arguments, function (index) {
            if (index == lastArgIndex) {
                workersCompleteCallback = this;
            } else {
                workers.push({ fn: this, done: false, result: null });
            }
        });

        // Short circuit this edge case
        if (workers.length == 0) {
            workersCompleteCallback();
            return;
        }

        // Fire off each worker process, asking it to report back to onWorkerDone.
        $.each(workers, function (workerIndex) {
            var worker = this;
            var callback = function () { onWorkerDone(worker, arguments); };
            worker.fn(callback);
        });

        // Store results and update status as each item completes.
        // The [0] on workerResult below assumes the client only needs the first parameter
        // passed into the return callback. This simplifies the handling in allDoneCallback,
        // but may need to be removed if you need access to all parameters of the result.
        // For example, $.post calls back with success(data, textStatus, XMLHttpRequest). If
        // you need textStatus or XMLHttpRequest then pull off the [0] below.
        function onWorkerDone(worker, workerResult) {
            worker.done = true;
            worker.result = workerResult[0]; // this is the [0] ref'd above.
            var allResults = [];
            for (var i = 0; i < workers.length; i++) {
                if (!workers[i].done) return;
                else allResults.push(workers[i].result);
            }
            workersCompleteCallback.apply(this, allResults);
        }
    };
})(jQuery);
UPDATE And another two years later, this looks insane because the accepted answer has changed to something much better! (Though still not as good as Yair Leviel's answer using jQuery's when)
18 months later, I just hit something similar. I have a refresh button, and I want the old content to fadeOut and then the new content to fadeIn. But I also need to get the new content. The fadeOut and the get are asynchronous, but it would be a waste of time to run them serially.
What I do is really the same as the accepted answer, except in the form of a reusable function. Its primary virtue is that it is much shorter than the other suggestions here.
var parallel = function(actions, finished) {
    var finishedCount = 0;
    var results = [];
    $.each(actions, function(i, action) {
        action(function(result) {
            results[i] = result;
            finishedCount++;
            if (finishedCount == actions.length) {
                finished(results);
            }
        });
    });
};
You pass it an array of functions to run in parallel. Each function should accept another function to which it passes its result (if any). parallel will supply that function.
You also pass it a function to be called when all the operations have completed. This will receive an array with all the results in. So my example was:
refreshButton.click(function() {
    parallel([
        function(f) {
            contentDiv.fadeOut(f);
        },
        function(f) {
            portlet.content(f);
        },
    ],
    function(results) {
        contentDiv.children().remove();
        contentDiv.append(results[1]);
        contentDiv.fadeIn();
    });
});
So when my refresh button is clicked, I launch jQuery's fadeOut effect and also my own portlet.content function (which does an async get, builds a new bit of content and passes it on), and then when both are complete I remove the old content, append the result of the second function (which is in results[1]) and fadeIn the new content.
As fadeOut doesn't pass anything to its completion function, results[0] presumably contains undefined, so I ignore it. But if you had three operations with useful results, they would each slot into the results array, in the same order you passed the functions.
You could do something like this:
var allData = [];
$.getJSON("/values/1", function(data) {
    allData.push(data);
    if (allData.length == 2) {
        processData(allData); // where processData processes all the data
    }
});
$.getJSON("/values/2", function(data) {
    allData.push(data);
    if (allData.length == 2) {
        processData(allData); // where processData processes all the data
    }
});
var processData = function(data) {
    var sum = data[0].value + data[1].value;
    $('#mynode').html(sum);
};
Here's an implementation using mbostock/queue:
queue()
    .defer(function(callback) {
        $.post('/echo/json/', { json: JSON.stringify({value: 1}), delay: 1 }, function(data) {
            callback(null, data.value);
        });
    })
    .defer(function(callback) {
        $.post('/echo/json/', { json: JSON.stringify({value: 3}), delay: 2 }, function(data) {
            callback(null, data.value);
        });
    })
    .awaitAll(function(err, results) {
        var result = results.reduce(function(acc, value) {
            return acc + value;
        }, 0);
        console.log(result);
    });
The associated fiddle: http://jsfiddle.net/MdbW2/
With the following extension of jQuery (which can also be written as a standalone function) you can do this:
$.whenAll({
    val1: $.getJSON('/values/1'),
    val2: $.getJSON('/values/2')
})
.done(function (results) {
    var sum = results.val1.value + results.val2.value;
    $('#mynode').html(sum);
});
The JQuery (1.x) extension whenAll():
$.whenAll = function (deferreds) {
    function isPromise(fn) {
        return fn && typeof fn.then === 'function' &&
            String($.Deferred().then) === String(fn.then);
    }
    var d = $.Deferred(),
        keys = Object.keys(deferreds),
        args = keys.map(function (k) {
            return $.Deferred(function (d) {
                var fn = deferreds[k];
                (isPromise(fn) ? fn : $.Deferred(fn))
                    .done(d.resolve)
                    .fail(function (err) { d.reject(err, k); });
            });
        });
    $.when.apply(this, args)
        .done(function () {
            var resObj = {},
                resArgs = Array.prototype.slice.call(arguments);
            resArgs.forEach(function (v, i) { resObj[keys[i]] = v; });
            d.resolve(resObj);
        })
        .fail(d.reject);
    return d;
};
See jsbin example:
http://jsbin.com/nuxuciwabu/edit?js,console
The most professional solution for me would be using async.js and Array.reduce, like so:
async.map([1, 2, 3, 4, 5], function (number, callback) {
    $.getJSON("/values/" + number, function (data) {
        callback(null, data.value);
    });
}, function (err, results) {
    $("#mynode").html(results.reduce(function (previousValue, currentValue) {
        return previousValue + currentValue;
    }));
});
If the result of one request depends on the other, you can't make them parallel.
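In that case you chain them instead, feeding the first response into the second request. A sketch, assuming the first response carries the id the second request needs:
$.getJSON('/values/1').then(function (data1) {
    // the second request can only be built from the first response
    return $.getJSON('/values/' + data1.nextId); // nextId is hypothetical
}).then(function (data2) {
    $('#mynode').html(data2.value);
});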
Building on Yair's answer.
You can define the ajax promises dynamically.
var start = 1; // starting value
var len = 2;   // no. of requests
var i = start;
var promises = (new Array(len)).fill().map(function () {
    return $.ajax("/values/" + i++);
});

$.when.apply($, promises)
    .then(myFunc, myFailure);
Suppose you have an array of file names.
var templateNameArray = ["test.html", "test2.html", "test3.html"];
var htmlTemplatesLoadStateMap = {};
var deferreds = [];
for (var i = 0; i < templateNameArray.length; i++) {
    if (!htmlTemplatesLoadStateMap[templateNameArray[i]]) {
        deferreds.push($.get("./Content/templates/" + templateNameArray[i],
            function (response, status, xhr) {
                if (status == "error") { }
                else {
                    $("body").append(response);
                }
            }));
        htmlTemplatesLoadStateMap[templateNameArray[i]] = true;
    }
}
$.when.apply($, deferreds).always(function () {
    yourfunctionTobeExecuted(yourPayload);
});
I needed multiple, parallel ajax calls, and the jQuery $.when syntax wasn't amenable to the full $.ajax format I am used to working with. So I just created a setInterval timer to periodically check whether each of the ajax calls had returned. Once they were all back, I could proceed from there.
I read there may be browser limitations on how many simultaneous ajax calls you can have going at once (2?), but $.ajax is inherently asynchronous, so making the ajax calls one by one would still result in parallel execution (within the browser's possible limitation).
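The answer doesn't include its code, but the polling idea might look roughly like this (a sketch with hypothetical URLs):
var finished = [false, false];
$.ajax({ url: '/values/1', complete: function () { finished[0] = true; } });
$.ajax({ url: '/values/2', complete: function () { finished[1] = true; } });

var poll = setInterval(function () {
    // periodically check whether every call has returned
    if (finished.every(function (f) { return f; })) {
        clearInterval(poll);
        // proceed from here
    }
}, 100);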

Best way to add a 'callback' after a series of asynchronous XHR calls

I stumbled on a piece of Ajax code that is not 100% safe since it mixes asynchronous and synchronous code... basically, in the code below I have a jQuery.each that grabs information from the elements and launches an Ajax get request for each:
$(search).each(function() {
    $.ajax({
        url: 'save.x3?id=' + $(this).attr("id") + '&value=' + $(this).data("value"),
        success: function(o){
            // Update UI
        },
        error: function(o){
            // Update UI
        }
    });
});
// code to do after saving...
So obviously the 'code to do after saving...' often gets executed before all the requests are complete. In an ideal world I would have the server-side code handle all of them at once and move the //code to do after saving into the success callback, but assuming that's not possible, I changed the code to something like the following to make sure all requests come back before continuing, which I'm still not in love with:
var recs = [];
$(search).each(function() {
    recs[recs.length] = 'save.x3?id=' + $(this).attr("id") + '&value=' + $(this).data("value");
});
var counter = 0;

function saveRecords(){
    $.ajax({
        url: recs[counter],
        success: function(o){
            // Update progress
            counter++;
            if (counter < recs.length){
                saveRecords();
            } else {
                doneSavingRecords();
            }
        },
        error: function(o){
            // Update progress
            doneSavingRecords(o.status);
        }
    });
}

function doneSavingRecords(text){
    // code to do after saving...
}

if (recs.length > 0){
    saveRecords(); // will recursively call itself until a failed request or until all records are saved
} else {
    doneSavingRecords();
}
So I'm looking for the 'best' way to add a bit of synchronous functionality to a series of asynchronous calls ?
Thanks!!
Better Answer:
function saveRecords(callback, errorCallback){
    $('<div></div>').ajaxStop(function(){
        $(this).remove(); // keep future AJAX events from affecting this
        callback();
    }).ajaxError(function(e, xhr, options, err){
        errorCallback(e, xhr, options, err);
    });

    $(search).each(function() {
        $.get('save.x3', { id: $(this).attr("id"), value: $(this).data("value") });
    });
}
Which would be used like this:
saveRecords(function(){
    // Complete will fire after all requests have completed with a success or error
}, function(e, xhr, options, err){
    // Error will fire for every error
});
Original Answer: This is good if they need to be in a certain order or you have other regular AJAX events on the page that would affect the use of ajaxStop, but this will be slower:
function saveRecords(callback){
    var recs = $(search).map(function(i, obj) {
        return { id: $(obj).attr("id"), value: $(obj).data("value") };
    }).get(); // .get() converts the jQuery-wrapped result to a plain array

    var save = function(){
        if (!recs.length) return callback();

        $.ajax({
            url: 'save.x3',
            data: recs.shift(), // shift removes/returns the first item in an array
            success: function(o){
                save();
            },
            error: function(o){
                // Update progress
                callback(o.status);
            }
        });
    };
    save();
}
Then you can call it like this:
saveRecords(function(error){
// This function will run on error or after all
// commands have run
});
If I understand what you're asking, I think you could use $.ajaxStop() for this purpose.
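A minimal sketch of that idea; the global ajaxStop handler fires once no jQuery requests remain in flight (note it is global, so unrelated AJAX on the page will delay it):
$(document).ajaxStop(function () {
    // runs once every outstanding jQuery request has completed
    doneSavingRecords();
});
$(search).each(function () {
    $.get('save.x3', { id: $(this).attr("id"), value: $(this).data("value") });
});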
This is easily solved by calling the same function to check that all AJAX calls are complete. You just need a simple queue shared between functions, and a quick check (no loops, timers, promises, etc).
// a list of URLs for which we'll make async requests
var queue = ['/something.json', '/another.json'];

// will contain our return data so we can work with it
// in our final unified callback ('displayAll' in this example)
var data = [];

// work through the queue, dispatch requests, check if complete
function processQueue( queue ){
    for (var i = 0; i < queue.length; i++){
        (function(url){ // capture the URL so the fail handler reports the right one
            $.getJSON( url, function( returnData ) {
                data.push(returnData);
                // reduce the length of queue by 1
                // don't care what URL is discarded, only that length is -1
                queue.pop();
                checkIfLast(function() { displayAll(data); });
            }).fail(function() {
                throw new Error("Unable to fetch resource: " + url);
            });
        })(queue[i]);
    }
}

// see if this is the last successful AJAX call (when queue.length == 0 it is last)
// if this is the last success, run the callback
// otherwise don't do anything
function checkIfLast(callback){
    if (queue.length == 0){
        callback();
    }
}

// when all the things are done
function displayAll(things){
    console.log(things); // show all data after the final ajax request has completed
}

// begin
processQueue(queue);
Edit: I should add that I specifically aimed for an arbitrary number of items in the queue. You can simply add another URL and this will work just the same.
>> In the ideal world I would like to have the server-side code handle all of them at once and move //code to do after saving in the success callback
You'll need to think about this in terms of events.
Closure's net.BulkLoader (or a similar approach) will do it for you:
http://closure-library.googlecode.com/svn/trunk/closure/goog/docs/class_goog_net_BulkLoader.html
http://closure-library.googlecode.com/svn/trunk/closure/goog/docs/closure_goog_net_bulkloader.js.source.html
See:
goog.net.BulkLoader.prototype.handleSuccess_ (for individual calls)
&
goog.net.BulkLoader.prototype.finishLoad_ (for completion of all calls)
