I have a function that loads HTML from an external file via an AJAX call using jQuery.
This AJAX call runs inside a $.each loop, and I need the call to finish before the loop continues. Here is my code:
$('img').each(function(){
    var cotainer_html = $('.cotainer_html').clone();
    /* Get Image Content */
    $.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
        }
    });
});
I know I can set async:false but I hear that is not a good idea. Any ideas?
To achieve this you can put each request into an array and apply() that array to $.when. Try this:
var requests = [];
$('img').each(function(){
    var cotainer_html = $('.cotainer_html').clone();
    /* Get Image Content */
    requests.push($.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
        }
    }));
});

$.when.apply($, requests).done(function() {
    console.log('all requests complete');
});
Note that you're replacing the same content on each request, so the only one which will have any effect on the UI is the last request. The preceding ones are redundant.
Also note that you should never, ever use async: false. It locks the UI thread of the browser until the request completes, which makes it look like it has crashed to the user. It is terrible practice. Use callbacks.
The OP appears to want the calls to run in series, not in parallel. If that is the case, you could use recursion:
function makeRequest($els, index) {
    var cotainer_html = $('.cotainer_html').clone();
    $.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
            if ($els.eq(index + 1).length) {
                makeRequest($els, ++index);
            } else {
                console.log('all requests complete');
            }
        }
    });
}

makeRequest($('img'), 0);
You can use a pseudo-recursive loop:
var imgs = $('img').get();

var done = (function loop() {
    var img = imgs.shift();
    if (img) {
        var cotainer_html = $('.cotainer_html').clone();
        /* Get Image Content */
        return $.get('/assets/ajax/get_studio_image.php')
            .then(function(data) {
                cotainer_html.find('.replaceme').replaceWith(data);
            }).then(loop);
    } else {
        return $.Deferred().resolve(); // resolved when the loop terminates
    }
})();
This will take each element of the list, get the required image, and .then() start over until there's nothing left.
The immediately invoked function expression itself returns a Promise, so you can chain a .then() call to it that will be invoked once the loop has completed:
done.then(function() {
    // continue your execution here
    ...
});
Related
Let me explain my situation.
I have a list of checkboxes in a fieldset. For each checked checkbox I would like to send a jQuery GET and wait for the response from the server, which can take anywhere from 10 seconds to a long time. Once I get the result, I want to display it and then continue to the next iteration of the loop.
$(function() {
    $("button[name=distributeGroupProgramsToCustomersNow]").click(function(event) {
        event.preventDefault();
        $("button[name=distributeGroupProgramsToCustomersNow]").attr("disabled", "disabled");
        $("input[name='distributeGroups-GroupsList']:checked").each(function () {
            // the loop that waits for the response
        });
        $("button[name=distributeGroupProgramsToCustomersNow]").removeAttr("disabled");
    });
});
How do I achieve this in jQuery?
Any help is greatly appreciated. Thanks!
You can do this by making your AJAX call synchronous; that way the loop has to wait for the response before it can continue.
$.each(arrayOfItems, function(i, v){
    $.ajax({
        url: 'path/to/my/file.extension',
        type: 'GET', //this is already the default, only placed as an example.
        async: false, // this is what you want to make sure things 'wait'.
        data: 'checkedValue=' + v, //value of our item.
        success: function(data){
            //manipulate / insert the data where you need.
            $("#someElement").append(data);
        }
    });
});
How it works
For each item, we make an AJAX call. Disabling asynchronous AJAX forces the browser to wait for the previous AJAX request to complete before sending the next one. This replicates the behavior that you want.
You can write a function that calls itself:
var ajaxLoopInput = $("input[name='distributeGroups-GroupsList']:checked");
ajaxIt(0);

function ajaxIt(index){
    if(index < ajaxLoopInput.length){
        ajaxLoopInput.eq(index).dosomething(); //if you need the current item in the list.
        $.ajax({
            //your ajax stuff
            success: function(){
                ajaxIt(index + 1);
            }
        });
    }else{
        return false;
    }
}
The best way to handle this without blocking or using synchronous calls is to use the array as a queue, and have a function that pulls the first item from the queue, runs the query, and then calls itself.
var myQueueArray = [1, 2, 3];
var currentIndex = 0;

function queueGet() {
    if(currentIndex >= myQueueArray.length) {
        return;
    }
    $.get("/my/url/", { 'checkedValue': myQueueArray[currentIndex] }, function(data) {
        //this is the important part...call the queue again until it's done
        currentIndex += 1;
        queueGet();
    });
}

queueGet();
The version above is non-destructive of the array. You could of course consume the array with shift() instead, as sketched below.
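For illustration, here is a hedged sketch of that destructive variant; the URL and parameter name are carried over from the example above:
var myQueueArray = [1, 2, 3];

function queueGet() {
    if (myQueueArray.length === 0) {
        return; // queue drained, nothing left to request
    }
    // shift() removes and returns the first item, consuming the queue
    $.get("/my/url/", { 'checkedValue': myQueueArray.shift() }, function (data) {
        queueGet(); // fire the next request only after this one has returned
    });
}

queueGet();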
Hi, the JavaScript below is called when I submit a form. It first splits a bunch of URLs from a textarea. It then:
1) Adds a row to the table for each URL, with "Not Started" in the last ('status') column.
2) Loops through each URL again; first it makes an AJAX call to check on the status (status.php), which returns a percentage from 0 - 100.
3) In the same loop it kicks off the actual process via AJAX (process.php). When the process has completed (bearing in mind the continuous status updates), it puts "Completed" in the status column and exits the auto_refresh.
4) It should then go to the next 'each' and do the same for the next URL.
function formSubmit(){
    var lines = $('#urls').val().split('\n');

    $.each(lines, function(key, value) {
        $('#dlTable tr:last').after('<tr><td>'+value+'</td><td>Not Started</td></tr>');
    });

    $.each(lines, function(key, value) {
        var auto_refresh = setInterval(function () {
            $.ajax({
                url: 'status.php',
                success: function(data) {
                    $('#dlTable').find("tr").eq(key+1).children().last().replaceWith("<td>"+data+"</td>");
                }
            });
        }, 1000);

        $.ajax({
            url: 'process.php?id='+value,
            success: function(msg) {
                clearInterval(auto_refresh);
                $('#dlTable').find("tr").eq(key+1).children().last().replaceWith("<td>completed rip</td>");
            }
        });
    });
}
What you want is to run several asynchronous actions in sequence, right? I'd build an array of the functions to execute and run it through a sequence helper.
https://github.com/michiel/asynchelper-js/blob/master/lib/sequencer.js
var actions = [];

$.each(lines, function(key, value) {
    actions.push(function(callback) {
        $.ajax({
            url: 'process.php?id='+value,
            success: function(msg) {
                clearInterval(auto_refresh);
                //
                // Perform your DOM operations here and be sure to call the
                // callback!
                //
                callback();
            }
        });
    });
});
As you can see, we build an array of scoped functions that take an arbitrary callback as an argument. A sequencer will run them in order for you.
Use the sequence helper from the GitHub link and run:
var sequencer = new Sequencer(actions);
sequencer.start();
It is, btw, possible to define sequencer functions in a few lines of code. For example,
function sequencer(arr) {
    (function() {
        ((arr.length != 0) && (arr.shift()(arguments.callee)));
    })();
}
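Hedged usage sketch: with the actions array built above, you would kick it off like this (note that arguments.callee is forbidden in ES5 strict mode, so this minimal version suits older, non-strict code):
sequencer(actions); // each action receives a callback that shifts and runs the next action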
AJAX is asynchronous.
That's exactly what's supposed to happen.
Instead of using each, you should send the next AJAX request in the completion handler of the previous one.
You can also set AJAX to run synchronously using the "async" property. Add the following:
$.ajax({ "async": false, ... other options ... });
AJAX API reference here. Note that this will lock the browser until the request completes.
I prefer the approach in SLaks answer (sticking with asynchronous behavior). However, it does depend on your application. Exercise caution.
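A minimal sketch of that callback-chaining approach, reusing the URLs and table updates from the question above (untested, for illustration only):
function processLine(lines, key) {
    if (key >= lines.length) return; // all URLs processed

    $.ajax({
        url: 'process.php?id=' + lines[key],
        success: function (msg) {
            $('#dlTable').find("tr").eq(key + 1).children().last()
                .replaceWith("<td>completed rip</td>");
            processLine(lines, key + 1); // the next request starts only now
        }
    });
}

processLine(lines, 0);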
I would give the same answer as this one: jquery json populate table
This code should give you an idea of how to use callbacks with loops and AJAX, but I have not tested it and there will be bugs. I derived the following from my old code:
var processCnt; //Global variable - index of the line currently being processed
var totalFiles; //Global variable - total number of lines to process
var processingTimer;

function formSubmit(){
    var lines = $('#urls').val().split('\n');
    $.each(lines, function(key, value) {
        $('#dlTable tr:last').after('<tr><td>'+value+'</td><td>Not Started</td></tr>');
    });
    completeProcessing(lines, function(success) {
        //all status polling finished; now kick off the actual processing for each line
        $.each(lines, function(key, value) {
            $.ajax({
                url: 'process.php?id='+value,
                success: function(msg) {
                    $('#dlTable').find("tr").eq(key+1).children().last().replaceWith("<td>completed rip</td>");
                }
            });
        });
    });
}

//Following two functions added by me
function completeProcessing(lines, callback)
{
    processCnt = 0;
    totalFiles = lines.length;
    processingTimer = setInterval(function() {
        singleProcessing(lines[processCnt], function(completeProcessingSuccess) {
            if (completeProcessingSuccess) {
                clearInterval(processingTimer);
                callback(true);
            }
        });
    }, 1000);
}

function singleProcessing(line, callback)
{
    var key = processCnt;
    var val = line;
    if (processCnt < totalFiles)
    { //Files to be processed
        $.ajax({
            url: 'status.php',
            success: function(data) {
                $('#dlTable').find("tr").eq(key+1).children().last().replaceWith("<td>"+data+"</td>");
                processCnt++;
                callback(false);
            }
        });
    }
    else
    {
        callback(true);
    }
}
How do I make data overwrite the results variable?
var ajax = {
    get: {
        venues: function(search){
            var results = "#";
            $.getJSON("http://x.com/some.php?term="+search+"&callback=?", function(data){ results = data; });
            return results;
        }
    }
};
data is overwriting results, just after results has been returned.
You can use the ajax function instead of getJSON, since getJSON is just shorthand for
$.ajax({
    url: url,
    dataType: 'json',
    data: data,
    success: callback
});
and then also set async to false, so that the call will block.
However, in your case this won't work, because JSONP requests (with "?callback=?") cannot be synchronous.
The other (better) option is to have whatever code is dependent on the results return value get called by the success callback.
So, instead of something like this:
var results = ajax.get.venues('search');
$('#results').html(translateResults(results));
Maybe something like this:
ajax.get.venues('search', function (results) {
    $('#results').html(translateResults(results));
});
venues = function (search, callback) {
    $.getJSON("http://x.com/some.php?term="+search+"&callback=?", function(data){
        callback(data);
    });
};
Your problem is the asynchronous nature of JavaScript. results does get overwritten, but only later, after the function has already exited, because the callback is executed when the request has finished.
You would have to make the Ajax call synchronous using async: false (this is usually not a good idea, just mentioning it for completeness's sake) or restructure your code flow so it doesn't depend on the return value any more, and everything you need to do gets done in the callback function.
This isn't a scope problem. It's because $.getJSON is asynchronous; results is returned before $.getJSON finishes. Try making a callback for $.getJSON to call when it's done.
function JSON_handler(data){
    // do stuff...
}

$.getJSON("http://x.com/some.php?term="+search+"&callback=?", JSON_handler);
You could put the logic you want to run in a callback.
var ajax = {
    get: {
        venues: function(search, fnCallback){
            var results = "#";
            $.getJSON("http://x.com/some.php?term="+search+"&callback=?", function(data){
                // success
                results = data;
                (typeof fnCallback == 'function') && fnCallback(data);
            });
            return results; // note: this still returns "#" immediately; the data only arrives via the callback
        }
    }
};
ajax.get.venues(term, function(result){
    // Do stuff with data here :)
});
Functional programming can be fun.
I find I sometimes need to iterate over some collection and make an AJAX call for each element. I want each call to return before moving to the next element so that I don't blast the server with requests, which often leads to other issues. And I don't want to set async to false and freeze the browser.
Usually this involves setting up some kind of iterator context that I step through upon each success callback. I think there must be a cleaner, simpler way?
Does anyone have a clever design pattern for how to neatly work thru a collection making ajax calls for each item?
jQuery 1.5+
I developed an $.ajaxQueue() plugin that uses the $.Deferred, .queue(), and $.ajax() to also pass back a promise that is resolved when the request completes.
/*
 * jQuery.ajaxQueue - A queue for ajax requests
 *
 * (c) 2011 Corey Frang
 * Dual licensed under the MIT and GPL licenses.
 *
 * Requires jQuery 1.5+
 */
(function($) {

// jQuery on an empty object, we are going to use this as our Queue
var ajaxQueue = $({});

$.ajaxQueue = function( ajaxOpts ) {
    var jqXHR,
        dfd = $.Deferred(),
        promise = dfd.promise();

    // queue our ajax request
    ajaxQueue.queue( doRequest );

    // add the abort method
    promise.abort = function( statusText ) {
        // proxy abort to the jqXHR if it is active
        if ( jqXHR ) {
            return jqXHR.abort( statusText );
        }

        // if there wasn't already a jqXHR we need to remove from queue
        var queue = ajaxQueue.queue(),
            index = $.inArray( doRequest, queue );

        if ( index > -1 ) {
            queue.splice( index, 1 );
        }

        // and then reject the deferred
        dfd.rejectWith( ajaxOpts.context || ajaxOpts,
            [ promise, statusText, "" ] );

        return promise;
    };

    // run the actual query
    function doRequest( next ) {
        jqXHR = $.ajax( ajaxOpts )
            .done( dfd.resolve )
            .fail( dfd.reject )
            .then( next, next );
    }

    return promise;
};

})(jQuery);
jQuery 1.4
If you're using jQuery 1.4, you can utilize the animation queue on an empty object to create your own "queue" for your ajax requests for the elements.
You can even factor this into your own $.ajax() replacement. This plugin $.ajaxQueue() uses the standard 'fx' queue for jQuery, which will auto-start the first added element if the queue isn't already running.
(function($) {
    // jQuery on an empty object, we are going to use this as our Queue
    var ajaxQueue = $({});

    $.ajaxQueue = function(ajaxOpts) {
        // hold the original complete function
        var oldComplete = ajaxOpts.complete;

        // queue our ajax request
        ajaxQueue.queue(function(next) {
            // create a complete callback to fire the next event in the queue
            ajaxOpts.complete = function() {
                // fire the original complete if it was there
                if (oldComplete) oldComplete.apply(this, arguments);

                next(); // run the next query in the queue
            };

            // run the query
            $.ajax(ajaxOpts);
        });
    };
})(jQuery);
Example Usage
So, we have a <ul id="items"> which has some <li> that we want to copy (using ajax!) to the <ul id="output">
// get each item we want to copy
$("#items li").each(function(idx) {
    // queue up an ajax request
    $.ajaxQueue({
        url: '/echo/html/',
        data: { html: "[" + idx + "] " + $(this).html() },
        type: 'POST',
        success: function(data) {
            // Write to #output
            $("#output").append($("<li>", { html: data }));
        }
    });
});
jsfiddle demonstration - 1.4 version
A quick and small solution using deferred promises. Although this uses jQuery's $.Deferred, any other should do.
var Queue = function () {
    var previous = new $.Deferred().resolve();

    return function (fn, fail) {
        return previous = previous.then(fn, fail || fn);
    };
};
Usage: call Queue() to create new queues:
var queue = Queue();

// Queue empty, will start immediately
queue(function () {
    return $.get('/first');
});

// Will begin when the first has finished
queue(function () {
    return $.get('/second');
});
See the example with a side-by-side comparison of asynchronous requests.
This works by creating a function that automatically chains promises together. The synchronous nature comes from the fact that we wrap each $.get call in a function and push it into a queue. The execution of these functions is deferred; each is only called when it reaches the front of the queue.
A requirement for the code is that each of the functions you give must return a promise. This returned promise is then chained onto the latest promise in the queue. When you call the queue(...) function it chains onto the last promise, hence the previous = previous.then(...).
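The second argument to queue() is an optional failure handler; a hedged example (the endpoint is a placeholder):
// Will run even if the previous request failed, because the failure
// handler recovers the chain
queue(function () {
    return $.get('/third'); // placeholder URL
}, function (err) {
    console.log('a previous request failed, continuing anyway', err);
});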
You can wrap all that complexity into a function to make a simple call that looks like this:
loadSequantially(['/a', '/a/b', 'a/b/c'], function() {alert('all loaded')});
Below is a rough sketch (a working example, except for the AJAX call). This can be modified to use a queue-like structure instead of an array.
// load sequentially the given array of URLs and call 'funCallback' when all's done
function loadSequantially(arrUrls, funCallback) {
    var idx = 0;

    // callback function that is called when an individual ajax call is done;
    // internally calls the next ajax URL in the sequence, or if there aren't any left,
    // calls the final user-specified callback function
    var individualLoadCallback = function() {
        if(++idx >= arrUrls.length) {
            doCallback(arrUrls, funCallback);
        }else {
            loadInternal();
        }
    };

    // makes the ajax call
    var loadInternal = function() {
        if(arrUrls.length > 0) {
            ajaxCall(arrUrls[idx], individualLoadCallback);
        }else {
            doCallback(arrUrls, funCallback);
        }
    };

    loadInternal();
};

// dummy function, replace with actual ajax call
function ajaxCall(url, funCallBack) {
    alert(url);
    funCallBack();
};

// final callback when everything's loaded
function doCallback(arrUrls, func) {
    try {
        func();
    }catch(err) {
        // handle errors
    }
};
Ideally, a coroutine with multiple entry points, so that every callback from the server can re-enter the same coroutine, would be neat. Damn, this is about to be implemented in JavaScript 1.7.
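For the curious, here is a rough sketch of how that coroutine idea could look with generators (ES6 syntax, which did not exist when this was written; the endpoints are placeholders):
// drive a generator: each yielded jQuery promise re-enters the
// coroutine with the server's response once it resolves
function run(makeGenerator) {
    var it = makeGenerator();
    (function step(data) {
        var result = it.next(data);
        if (!result.done) {
            result.value.done(step); // result.value is a jqXHR/promise
        }
    })();
}

run(function* () {
    var first = yield $.get('/first');   // placeholder URL
    var second = yield $.get('/second'); // placeholder URL
    console.log('all requests complete', first, second);
});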
Let me try using closures...
function BlockingAjaxCall(URL, arr, AjaxCall, OriginalCallBack)
{
    // counter held in a closure; each call to nextindex() yields the next index
    var nextindex = (function()
    {
        var i = 0;
        return function()
        {
            return i++;
        };
    })();

    var AjaxCallRecursive = function(){
        var currentindex = nextindex();
        AjaxCall
        (
            URL,
            arr[currentindex],
            function()
            {
                OriginalCallBack();
                if (currentindex < arr.length - 1)
                {
                    AjaxCallRecursive();
                }
            }
        );
    };
    AjaxCallRecursive();
}
// suppose you always call Ajax like AjaxCall(URL,element,callback) you will do it this way
BlockingAjaxCall(URL,myArray,AjaxCall,CallBack);
Yeah, while the other answers will work, they involve a lot of code and look messy. Frame.js was designed to elegantly address this situation. https://github.com/bishopZ/Frame.js
For instance, this will cause most browsers to hang:
for(var i=0; i<1000; i++){
    $.ajax('myserver.api', { data:i, type:'post' });
}
While this will not:
for(var i=0; i<1000; i++){
    Frame(function(callback){
        $.ajax('myserver.api', { data:i, type:'post', complete:callback });
    });
}
Frame.start();
Also, using Frame allows you to waterfall the response objects and deal with them all after the entire series of AJAX requests has completed (if you want to):
var listOfAjaxObjects = [ {}, {}, ... ]; // an array of objects for $.ajax

$.each(listOfAjaxObjects, function(i, item){
    Frame(function(nextFrame){
        item.complete = function(response){
            // do stuff with this response or wait until end
            nextFrame(response); // ajax response objects will waterfall to the next Frame()
        };
        $.ajax(item);
    });
});
Frame(function(callback){ // runs after all the AJAX requests have returned
    var ajaxResponses = [];
    $.each(arguments, function(i, arg){
        if(i!==0){ // the first argument is always the callback function
            ajaxResponses.push(arg);
        }
    });
    // do stuff with the responses from your AJAX requests
    // if an AJAX request returned an error, the error object will be present in place of the response object
    callback();
});
Frame.start();
I am posting this answer thinking that it might help others in the future who are looking for simple solutions to the same scenario.
This is now possible also using the native promise support introduced in ES6. You can wrap the ajax call in a promise and return it to the handler of the element.
function ajaxPromise(elInfo) {
    return new Promise(function (resolve, reject) {
        //Do anything as desired with the elInfo passed as parameter

        $.ajax({
            type: "POST",
            url: '/someurl/',
            data: {data: "somedata" + elInfo},
            success: function (data) {
                //Do anything as desired with the data received from the server,
                //and then resolve the promise
                resolve();
            },
            error: function (err) {
                reject(err);
            },
            async: true
        });
    });
}
Now call the function recursively, from where you have the collection of the elements.
function callAjaxSynchronous(elCollection) {
    if (elCollection.length > 0) {
        var el = elCollection.shift();
        ajaxPromise(el)
            .then(function () {
                callAjaxSynchronous(elCollection);
            })
            .catch(function (err) {
                //Abort further ajax calls/continue with the rest
                //callAjaxSynchronous(elCollection);
            });
    }
    else {
        return false;
    }
}
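Hedged usage note: shift() consumes the collection, so pass a copy if you still need the original afterwards. For example (the id-collecting logic here is an assumption for illustration):
// collect whatever per-element info you need, e.g. the ids of some elements
var elInfos = $('img').map(function () { return $(this).attr('id'); }).get();

callAjaxSynchronous(elInfos.slice()); // slice() keeps elInfos intact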
I use http://developer.yahoo.com/yui/3/io/#queue to get that functionality.
The only solution I can come up with is, as you say, maintaining a list of pending calls/callbacks. Or nesting the next call in the previous callback, but that feels a bit messy.
You can achieve the same thing using .then().
var files = [
    'example.txt',
    'example2.txt',
    'example.txt',
    'example2.txt',
    'example.txt',
    'example2.txt',
    'example2.txt',
    'example.txt'
];

nextFile().done(function(){
    console.log("done", arguments);
});

function nextFile(text){
    var file = files.shift();
    if(text)
        $('body').append(text + '<br/>');
    if(file)
        return $.get(file).then(nextFile);
}
http://plnkr.co/edit/meHQHU48zLTZZHMCtIHm?p=preview
I would suggest a bit more sophisticated approach which is reusable for different cases.
I am using it for example when I need to slow down a call sequence when the user is typing in text editor.
But I am sure it should also work when iterating through a collection. In that case it can queue the requests and send a single AJAX call instead of 12.
queueing = {
    callTimeout: undefined,
    callTimeoutDelayTime: 1000,
    callTimeoutMaxQueueSize: 12,
    callTimeoutCurrentQueueSize: 0,

    queueCall: function (theCall) {
        clearTimeout(this.callTimeout);
        if (this.callTimeoutCurrentQueueSize >= this.callTimeoutMaxQueueSize) {
            theCall();
            this.callTimeoutCurrentQueueSize = 0;
        } else {
            var _self = this;
            this.callTimeout = setTimeout(function () {
                theCall();
                _self.callTimeoutCurrentQueueSize = 0;
            }, this.callTimeoutDelayTime);
        }
        this.callTimeoutCurrentQueueSize++;
    }
}
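A hedged usage sketch for the text-editor case (the selector and the save endpoint are assumptions):
$('#editor').on('keyup', function () {          // '#editor' is an assumed selector
    var draft = $(this).val();
    queueing.queueCall(function () {
        $.post('/save-draft', { body: draft }); // '/save-draft' is an assumed endpoint
    });
});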
There's a very simple way to achieve this: add async: false as a property of the AJAX call. This will make sure the AJAX call completes before the rest of the code executes. I have used this successfully in loops many times. For example:
$.ajax({
url: "",
type: "GET",
async: false
...
I stumbled on a piece of AJAX code that is not 100% safe since it mixes asynchronous and synchronous styles of code. Basically, in the code below I have a jQuery.each that grabs information from elements and launches an AJAX GET request for each:
$(search).each(function() {
    $.ajax({
        url: 'save.x3?id=' + $(this).attr("id") + '&value=' + $(this).data("value"),
        success: function(o){
            //Update UI
        },
        error: function(o){
            //Update UI
        }
    });
});
//code to do after saving...
So obviously the 'code to do after saving...' often gets executed before all the requests have completed. In an ideal world I would like the server-side code to handle all of them at once and move the //code to do after saving into the success callback, but assuming this is not possible, I changed the code to the following to make sure all requests come back before continuing, which I'm still not in love with:
var recs = [];
$(search).each(function() {
    recs[recs.length] = 'save.x3?id=' + $(this).attr("id") + '&value=' + $(this).data("value");
});
var counter = 0;

function saveRecords(){
    $.ajax({
        url: recs[counter],
        success: function(o){
            //Update progress
            if (counter < recs.length - 1){
                counter++;
                saveRecords();
            }else{
                doneSavingRecords();
            }
        },
        error: function(o){
            //Update progress
            doneSavingRecords(o.status);
        }
    });
}

function doneSavingRecords(text){
    //code to do after saving...
}

if (recs.length > 0){
    saveRecords(); //will recursively call itself until a failed request or until all records were saved
}else{
    doneSavingRecords();
}
So I'm looking for the 'best' way to add a bit of synchronous functionality to a series of asynchronous calls ?
Thanks!!
Better Answer:
function saveRecords(callback, errorCallback){
    $('<div></div>').ajaxStop(function(){
        $(this).remove(); // Keep future AJAX events from affecting this
        callback();
    }).ajaxError(function(e, xhr, options, err){
        errorCallback(e, xhr, options, err);
    });

    $(search).each(function() {
        $.get('save.x3', { id: $(this).attr("id"), value: $(this).data("value") });
    });
}
Which would be used like this:
saveRecords(function(){
    // Complete will fire after all requests have completed with a success or error
}, function(e, xhr, options, err){
    // Error will fire for every error
});
Original Answer: This is good if they need to be in a certain order or you have other regular AJAX events on the page that would affect the use of ajaxStop, but this will be slower:
function saveRecords(callback){
    var recs = $(search).map(function(i, obj) {
        return { id: $(obj).attr("id"), value: $(obj).data("value") };
    }).get(); // .get() converts the jQuery-wrapped result to a plain array so shift() works

    var save = function(){
        if(!recs.length) return callback();

        $.ajax({
            url: 'save.x3',
            data: recs.shift(), // shift removes/returns the first item in an array
            success: function(o){
                save();
            },
            error: function(o){
                //Update progress
                callback(o.status);
            }
        });
    };

    save();
}
Then you can call it like this:
saveRecords(function(error){
    // This function will run on error or after all
    // commands have run
});
If I understand what you're asking, I think you could use $.ajaxStop() for this purpose.
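A minimal sketch of that idea, reusing the question's save loop (note that ajaxStop fires for all jQuery AJAX activity on the page, so it is best suited when these are the only requests running):
// fires once, when every in-flight jQuery AJAX request has finished
$(document).ajaxStop(function () {
    //code to do after saving...
});

$(search).each(function () {
    $.get('save.x3', { id: $(this).attr("id"), value: $(this).data("value") });
});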
This is easily solved by calling the same function to check that all AJAX calls are complete. You just need a simple queue shared between functions, and a quick check (no loops, timers, promises, etc).
//a list of URLs for which we'll make async requests
var queue = ['/something.json', '/another.json'];

//will contain our return data so we can work with it
//in our final unified callback ('displayAll' in this example)
var data = [];

//work through the queue, dispatch requests, check if complete
function processQueue( queue ){
    //iterate over a copy so each callback captures its own url
    queue.slice().forEach(function( url ){
        $.getJSON( url, function( returnData ) {
            data.push(returnData);
            //reduce the length of queue by 1
            //don't care what URL is discarded, only that length is -1
            queue.pop();
            checkIfLast(function() {
                displayAll(data);
            });
        }).fail(function() {
            throw new Error("Unable to fetch resource: " + url);
        });
    });
}

//see if this is the last successful AJAX call (when queue.length == 0 it is last)
//if this is the last success, run the callback
//otherwise don't do anything
function checkIfLast(callback){
    if(queue.length == 0){
        callback();
    }
}

//when all the things are done
function displayAll(things){
    console.log(things); //show all data after final ajax request has completed.
}

//begin
processQueue(queue);
Edit: I should add that I specifically aimed for an arbitrary number of items in the queue. You can simply add another URL and this will work just the same.
>> In the ideal world I would like to have the server-side code handle all of them at once and move //code to do after saving in the success callback
You'll need to think about this in terms of events.
Closure's net.BulkLoader (or a similar approach) will do it for you:
http://closure-library.googlecode.com/svn/trunk/closure/goog/docs/class_goog_net_BulkLoader.html
http://closure-library.googlecode.com/svn/trunk/closure/goog/docs/closure_goog_net_bulkloader.js.source.html
See:
goog.net.BulkLoader.prototype.handleSuccess_ (for individual calls) and
goog.net.BulkLoader.prototype.finishLoad_ (for completion of all calls)