I have a peculiar ajax scenario that I just can't wrap my head around.
This is the sequence of events that I am trying to co-ordinate:
1. A single request returns an array of id numbers.
2. For each id, a new request is sent.
3. When all those requests have returned, their data is collected and rendered.
It seems simple enough, but I can't figure out how to express it.
My understanding of chaining deferred objects is that they are all instantiated immediately, and then data flows through them as they are resolved. But how does this work when one of the items is an array, potentially of size zero?
I know I'm going to need $.when.apply(), to watch the array of responses come in.
I think maybe I need a single deferred object that somehow stands in for the array, but I can't figure out how to express this.
Honestly, this sounds like a bad pattern to use, as you fire N+1 requests. If the array has many items and/or you have many users, you could end up DDoSing your own server.
Because of this, it would be better practice to change point 2 so that all ids are sent in a single request, or even to change point 1 so it returns all the required data for the given ids.
That being said, you can achieve what you are asking for by saving the promises returned by jQuery's AJAX methods to an array and then applying that to $.when(), like this:
$.ajax({
    url: '/endpoint1',
    data: { foo: 'bar' },
    success: function (data) {
        var requests = [];
        for (var i = 0; i < data.ids.length; i++) {
            requests.push(
                $.ajax({
                    url: '/endpoint2',
                    data: { id: data.ids[i] }
                })
            );
        }
        // $.when takes promises as separate arguments, hence the apply()
        $.when.apply($, requests).done(function () {
            // do something when all AJAX requests have completed here...
        });
    }
});
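One edge case worth noting, since the question mentions arrays that are potentially of size zero: $.when with no arguments returns an already-resolved promise, so the empty-id-list case works for free:

// with an empty requests array, done fires immediately
$.when.apply($, []).done(function () {
    console.log('no requests to wait for');
});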
I've been searching online for hours and I know that I probably have to do something with deferred objects, but I can't get it working.
Firstly, here is my code:
$('#uploadButton').on('click', function () {
//some preparation stuff (deleted)
var panels = boxesContainer.find('.panel');
var ajaxes = [];
$.each(panels, function (index, panel) {
//more preparation + declaration of variables
function getIdAndPrepareData() {
$.when(
$.ajax({
type: 'POST',
url: 'url',
data: {
'title': name
}
}))
.done(function (result, textStatus, jqXHR) {
if (result.success) {
var id = result.id;
} else {
getIdAndPrepareData();
}
console.log('Created Id: ' + id);
$.each(panelForms, function (index, form) {
var filesInput = //again, prep and vars
$.each($(filesInput)[0].files, function (index, file) {
var formData = new FormData();
formData.append('file', file);
//more stuff
ajaxes.push({
'formData': formData,
'file': file
});
});
});
}).fail(function () {
getIdAndPrepareData();
});
}
getIdAndPrepareData();
});
$.ajax().promise().done(function () {
console.log('bla bla bla');
});
});
So, basically, I am looping through certain DIVs (.panel) and creating a new database entity via AJAX for each of them. Then I prepare data to be sent to the server via AJAX after all the loops complete. I need to send this data only after all the loops finish, because the next AJAX calls (which I plan to make after iterating through the .panel DIVs and preparing the data) will create other entities related to the ones created for the .panel DIVs (I push all this data to the ajaxes array and plan to use it later on).
I am using jQuery's deferred objects inside the panels loop in order to get the newly created ID of each panel and hold it in the ajaxes array. But I do not know how to execute any code after the panels loop.
I tried to make a promise (I am quite new to this technique) for all AJAX calls at the end ($.ajax().promise().done), but it doesn't seem to work. Sometimes the console.log in the promise fires at the end, sometimes at the beginning.
I am not an expert in jQuery and JS, so I would like to ask for some explanation of how to work with asynchronous AJAX calls inside loops. What should I do in this situation? I want to execute some code at the end, after all the data is prepared.
Thank you.
You will get answers using arrays of promises evaluated against $.when via apply(), etc., but there is a handy shortcut where you can chain $.when calls with only a slight overhead:
Pseudo code below:
var promise; // undefined is treated as a resolved promise by $.when
for (items in a loop) {
    promise = $.when(promise, $.ajax({...}));
}
promise.done(function () {
    // All done
});
Notes:
$.ajax returns a promise. That promise is to call you back on completion with the data or an error.
$.when calls you back when a number of promises have completed (or when any fails)
If you call $.when with an existing promise and a new promise you get back a third promise that will complete when both are done. These can be chained together in sequence.
The downside of this shortcut is that the final data values passed to done are more complex than expected with normal evaluation of an array of promises against done.
I use this technique, in preference to arrays of promises, when I just need to know overall completion and not all the individual data/results. It makes for far simpler code and the overhead of the extra promises is minimal. It also works great with sets of animations.
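To make that downside concrete, here is roughly what the final done callback sees after three chained calls (an illustration; the exact shapes can vary by jQuery version):

promise.done(function (nested, last) {
    // the arguments nest leftward, roughly:
    //   nested = [[undefined, [data1, status1, xhr1]], [data2, status2, xhr2]]
    //   last   = [data3, status3, xhr3]
    // hence this shortcut suits "overall completion" checks only
});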
So far I'm having no issue setting up an AngularJS model in my Rails application and giving it data to access on the front end. I even set it up to be populated with data from an AJAX request using $http. However, I need this model to contain the data of multiple $http calls. Here's the code I've got thus far:
function DropboxCtrl($scope, $http) {
var $infiniteLoader = $(".infiniteLoader");
var theUIDS = $infiniteLoader.attr('data-dropbox-uids').split(',');
if($infiniteLoader.attr('data-dropbox-uids') != "") {
var theData = {};
$.each(theUIDS, function(key) {
$http({ url: '/dropbox/files/get', method: 'GET', params: { uid: theUIDS[key] }}).success(function(data) {
theData = data;
});
});
$scope.dropboxes = theData;
}
}
I have a method called DropboxCtrl which starts by getting all the UIDs that I need to call a GET request on. I loop through each of them and then append data to theData, which is a JavaScript object. After the each I set my dropboxes model to the value of theData. Currently the method returns absolutely nothing, with no JavaScript errors. I am positive that my url works completely, and I actually did get the code working with just one AJAX request, like so:
$.each(theUIDS, function(key) {
$http({ url: '/dropbox/files/get', method: 'GET', params: { uid: theUIDS[key] }}).success(function(data) {
$scope.dropboxes = data;
});
});
However... that code block only keeps the last AJAX call's data, because the earlier ones are overwritten. Maybe what I'm missing is just incorrect JavaScript, or maybe it's a lack of understanding of the "Angular way" of things. I'm skilled in JavaScript and jQuery, but very new to Angular. Any help?
AngularJS is a high-level JavaScript framework, but the code is ultimately just JavaScript. Within your $.each, you can push results onto an array or into an initialized collection like
$scope.dropboxes = [{uid: 1234}, {uid: 2345}] and so on.
Within the $.each, locate the record for the uid and attach the results.
I usually use the underscore.js library for operations on collections, arrays, etc.,
so something like
_.findWhere($scope.dropboxes, {uid: data.uid}).data = data;
assuming the returned data has the uid in it. If not, there should be another way to map each result back to its request. Note that there is no guarantee of the order of responses, so you cannot use array indexes to map results.
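Putting those pieces together, a minimal sketch of the controller (assuming the same /dropbox/files/get endpoint; mapping by the uid captured in the closure avoids needing a uid in the response):

function DropboxCtrl($scope, $http) {
    var uids = $('.infiniteLoader').attr('data-dropbox-uids').split(',');

    // initialize the collection up front so the view can bind to it
    $scope.dropboxes = _.map(uids, function (uid) {
        return { uid: uid, data: null };
    });

    _.each(uids, function (uid) {
        $http({ url: '/dropbox/files/get', method: 'GET', params: { uid: uid } })
            .success(function (data) {
                // match the response to its request via the closed-over uid;
                // response order is not guaranteed
                _.findWhere($scope.dropboxes, { uid: uid }).data = data;
            });
    });
}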
I have a collection which has to call 4 external APIs, e.g. http://www.abc.com, http://www.fgt.com, http://www.jkl.com, and http://www.rty.com.
I have a collection named Todos.js. Is there a way I can fetch the 4 APIs together into a single collection, since all four would provide the same model response?
The response I get from the 4 APIs has the same data structure, i.e. "name" and "link".
Is there a way I can append all the responses to the same collection? What is the best way to achieve this?
I think the way is to override fetch, where you make the Ajax call to each of the APIs. Store the returned partial sets in a temporary array, and when all 4 are complete, create the collection using this.reset. (You could use jQuery's Deferred I suppose, or just keep an internal count of how many calls have returned.)
Something like this:
var Collection = Backbone.Collection.extend({
    fetch: function() {
        this.completeCount = 0;
        this.errorCount = 0;
        this.temp = [];
        this.urls = [ 'url1', 'url2', 'url3', 'url4' ];
        var self = this;
        // make an Ajax call for each URL
        // (note: $.get() treats a second object argument as request data,
        // so use $.ajax() to pass success/error callbacks)
        _.each(this.urls, function(url) {
            $.ajax({ url: url, success: function(data) {
                console.log("Got partial collection from " + url);
                self.addPartial(data);
                // alternatively, just call "self.add(data);" here
            }, error: function(response) {
                console.log("Oops, the Ajax call failed for some reason... ignoring");
                self.completeCount ++;
                self.errorCount ++;
                self.checkComplete();
            } });
        });
    },
    // add a JSON array that contains a subset of the collection
    addPartial: function(data) {
        this.completeCount ++;
        var self = this;
        // add each item to temp
        _.each(data, function(item) {
            self.temp.push(item);
        });
        this.checkComplete();
    },
    // if all calls have returned (success or error), create the collection
    checkComplete: function() {
        if (this.completeCount == this.urls.length) {
            this.reset(this.temp);
        }
    }
});
Here's a Fiddle where I replaced the Ajax call with a method that just returns dummy data after a short delay.
Response to comment
Adding the responses to the collection as they come in is probably better (it's easier anyway). Here's an updated Fiddle.
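The "as they come in" version is mostly a matter of calling add() from each success handler; a minimal sketch (not the exact Fiddle code):

var Collection = Backbone.Collection.extend({
    urls: ['url1', 'url2', 'url3', 'url4'],
    fetch: function () {
        var self = this;
        this.reset();
        _.each(this.urls, function (url) {
            // models show up in the collection as each response arrives
            $.ajax({ url: url, success: function (data) {
                self.add(data);
            } });
        });
    }
});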
I know it's an old question, but if someone lands here, this information may help.
To preserve the data previously fetched by a collection, you can change the url and call fetch() as many times as needed with these options:
reset: false,
remove: false,
Like this:
yourCollection.fetch({reset: false, remove: false, [other]: [whatever]})
And that's all; there is no need to override the method. (Maybe in 2012 it was necessary, I don't know. The fact is that these options work for Backbone 1.1.2 or later.) Be aware that I'm not sure whether this will merge the new data or just add it, even if it's repeated.
The documentation (http://backbonejs.org/#Collection-fetch) is a little confusing about the 'reset' option, because it says it is set to false by default; perhaps that only applies when the url remains static and single.
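A minimal sketch of that approach for the four APIs in the question; note that the url property is read synchronously when each fetch() builds its request, so reassigning it in a loop is safe:

var todos = new Todos();
var urls = ['http://www.abc.com', 'http://www.fgt.com',
            'http://www.jkl.com', 'http://www.rty.com'];

_.each(urls, function (url) {
    todos.url = url;
    // reset:false / remove:false accumulate models instead of replacing them
    todos.fetch({ reset: false, remove: false });
});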
I'm writing some JavaScript/AJAX code.
Is there any way to ensure that the server receives the XML requests in the order that they are sent?
If not with plain Ajax, do I get this guarantee if I send everything over a single WebSocket?
Thanks!
If it is of utmost importance that they're received in the proper order, and attaching an iterating id to the form isn't enough:
msg_number = 1; sendAJAX(msg_number); msg_number++;
Then I'd suggest building your own queue system, and sending each subsequent request as the callback of the previous one.
Rather than each element having its own AJAX-access, create one centralized spot in your application to handle that.
Your different AJAX-enabled sections don't even need to know that it is a queue:
AJAX.send({ url : "......", method : "post", success : function () {}, synchronous : true });
On the other side of that, you could have something like:
AJAX.send = function (obj) {
    if (obj.synchronous) {
        addToSyncQueue(obj);
        checkQueue();
    } else {
        fireRequest(obj);
    }
};
Inside of your sync queue, all you'd need to do is wrap a new function around the old callback:
callback = (function (old_cb) {
return function (response) {
checkQueue();
old_cb(response);
};
}(obj.success));
obj.success = callback;
AJAX.call(obj);
Inside of checkQueue, you'd just need to see if it was empty, and if it wasn't, use
nextObj = queue.shift(); (if you're .push()-ing objects onto the queue, that gives you first-in, first-out, like you wanted).
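Pulling those pieces together, a minimal sketch of such a queue (AJAX.send mirrors the pseudocode's API above and is illustrative, not a real library):

var AJAX = {
    queue: [],
    busy: false,

    send: function (obj) {
        if (obj.synchronous) {
            this.queue.push(obj);
            this.checkQueue();
        } else {
            $.ajax(obj);
        }
    },

    checkQueue: function () {
        if (this.busy || this.queue.length === 0) { return; }
        this.busy = true;

        var self = this;
        var obj = this.queue.shift();
        var old_complete = obj.complete;

        // advance the queue from complete (fires on success *or* error)
        // so one failed request cannot stall everything behind it
        obj.complete = function (jqXHR, textStatus) {
            self.busy = false;
            self.checkQueue();
            if (old_complete) { old_complete(jqXHR, textStatus); }
        };

        $.ajax(obj);
    }
};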
A couple of options come to mind:
1. Send them synchronously, waiting for a successful response from the server to each XML request before sending the next one (i.e. make a queue).
2. If you know the number of requests you'll be sending beforehand, send the request number as a tag with each request, e.g. <requestNum>1</requestNum><numRequests>5</numRequests>. This doesn't guarantee the order that they're received in, but guarantees that they can be put back in order afterwards, and has the added benefit of being sure that you have all the data.
At my company we use this little ajaxQueue plugin, written by one of the core jQuery contributors:
http://gnarf.net/2011/06/21/jquery-ajaxqueue/
I'm creating a script that performs several functions and I want to update the user as the functions are completed. I have nested $.ajax() calls with each subsequent call in the previous call's success block.
There are a total of 4 calls made for each loop. Let's call them scan_1 through scan_4. The success block of scan_1 calls scan_2 and so on down the chain.
For example, let's say I'm looping over 3 objects. I want the process to go like this:
Loop 1
scan_1
scan_2
scan_3
scan_4
Loop 2
scan_1
scan_2
scan_3
scan_4
Loop 3
scan_1
scan_2
scan_3
scan_4
The problem is that it's running through all the scan_1 calls first. I must be missing something, but I can't seem to figure it out. Any advice would be much appreciated.
For reference, here is a snippet of scan_1 (irrelevant stuff snipped):
for(var i = 1; i <= 3; i++)
{
$.ajax({
type: 'GET',
url: url,
data: 'do=scan&step=1&' + string,
dataType: 'json',
success: function (result)
{
if(result.proceed == 'true')
{
$('#scan_progress').append(result.message);
scan_2();
}
else
{
$('#scan_progress').append(result.message);
}
}
});
}
Thoughts?
Thanks in advance.
Sounds like you need to use jQuery deferred. It basically allows you to chain multiple event handlers to the jQuery Ajax object and gives you finer control over when the callbacks are invoked.
Further reading:
http://msdn.microsoft.com/en-us/scriptjunkie/gg723713
http://www.erichynds.com/jquery/using-deferreds-in-jquery/
It's asynchronous: the "success" fires sometime in the future, and the script does not wait for it. Since you're firing off three requests in your loop, they will all be scan_1.
scan_2 will be called as each request completes.
Change the requests to synchronous if you want to control the order of events.
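For illustration, that just means adding async: false to the settings from the question (with the caveat that synchronous requests block the browser UI and are deprecated for most uses in modern jQuery):

$.ajax({
    type: 'GET',
    url: url,
    data: 'do=scan&step=1&' + string,
    dataType: 'json',
    async: false,  // the loop now waits for each response before continuing
    success: function (result) {
        // same handling as before
    }
});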
You are starting by sending off three AJAX calls at once:
scan_1 (loop 1)
scan_1 (loop 2)
scan_1 (loop 3)
When each scan_1 completes, its subsequent scan_2 through scan_4 are called.
What did you actually want to happen? Scans 1-4 of loop 1, then scans 1-4 of loop 2, and then scans 1-4 of loop 3? That would require more nesting, or possibly deferred objects.
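A rough sketch of the "deferred objects" route, assuming each scan_N is rewritten to return the promise of its $.ajax call (and jQuery 1.8+, where .then chains properly):

// hypothetical: assumes scan_1(i)..scan_4(i) each return their $.ajax promise
function runChain(i) {
    return scan_1(i)
        .then(function () { return scan_2(i); })
        .then(function () { return scan_3(i); })
        .then(function () { return scan_4(i); });
}

function runLoop(i) {
    if (i > 3) { return; }
    runChain(i).then(function () {
        runLoop(i + 1);  // start the next loop only after this one finishes
    });
}

runLoop(1);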
Instead of using the success callback for each $.ajax() call, you can store each set of AJAX requests (their jqXHR objects) in an array and wait for all of them to resolve:
function scan_1() {
    // array to store the jqXHR (deferred) objects
    var jqXHRs = [];
    for (var i = 1; i <= 3; i++) {
        // $.ajax() returns an object that resolves when the response arrives
        jqXHRs.push($.ajax({
            type: 'GET',
            url: url,
            data: 'do=scan&step=1&' + string,
            dataType: 'json'
        }));
    }
    // wait for all three AJAX requests to resolve before running scan_2();
    // $.when expects separate arguments, hence the .apply()
    $.when.apply($, jqXHRs).then(function () {
        // each argument is a [data, textStatus, jqXHR] array for one request
        var allProceed = true;
        $.each(arguments, function (i, args) {
            if (args[0].proceed != 'true') {
                allProceed = false;
            }
        });
        if (allProceed) {
            scan_2();
        }
    });
}
I've had similar problems working heavily with SharePoint web services - you often need to pull data from multiple sources to generate input for a single process.
To solve it I embedded this kind of functionality into my AJAX abstraction library. You can easily define a request which will trigger a set of handlers when complete. However each request can be defined with multiple http calls. Here's the component (and detailed documentation):
DPAJAX at DepressedPress.com
This simple example creates one request with three calls and then passes that information, in the call order, to a single handler:
// The handler function
function AddUp(Nums) { alert(Nums[1] + Nums[2] + Nums[3]) };
// Create the pool
myPool = DP_AJAX.createPool();
// Create the request
myRequest = DP_AJAX.createRequest(AddUp);
// Add the calls to the request
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [5,10]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [4,6]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [7,13]);
// Add the request to the pool
myPool.addRequest(myRequest);
Note that, unlike many of the other solutions provided, this method does not force single-threading of the calls being made: each will still run as quickly (or as slowly) as the environment allows, but the single handler will only be called when all are complete. It also supports the setting of timeout values and retry attempts if your service is a little flaky.
I've found it insanely useful (and incredibly simple to understand from a code perspective). No more chaining, no more counting calls and saving output. Just "set it and forget it".