Chained jQuery AJAX with promise - javascript

I am currently working on a project where four GET requests are fired simultaneously. At the same time I am using fade effects, and the asynchronous nature of this intermittently results in empty data.
I have been looking into the approach described in Prefer way of doing multiple dependent ajax synchronous call to replace what I am currently doing:
$.get('ajax_call_1').then(function(value) {
    return $.get('ajax_call_2');
}).then(function(result) {
    // success with both here
}, function(err) {
    // error with one of them here
});
But my question is: how can I access the return value of each request individually with the above?

You've said the requests are fired simultaneously, but as written your code sends them sequentially. With Promise.all you can instead wait on all of the requests' promises and be given an array of their results:
Promise.all([
    $.get('ajax_call_1'),
    $.get('ajax_call_2'),
    $.get('ajax_call_3'),
    $.get('ajax_call_4')
]).then(function(results) {
    var first = results[0];
    var second = results[1];
    // ...
}).catch(function(err) {
    // called if one of the requests fails
});
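If you would rather stay within jQuery's own deferreds, $.when gives the same kind of access to each result. A minimal sketch (note that for $.get promises, each argument passed to done is an array of [data, textStatus, jqXHR]):
$.when(
    $.get('ajax_call_1'),
    $.get('ajax_call_2'),
    $.get('ajax_call_3'),
    $.get('ajax_call_4')
).done(function(r1, r2, r3, r4) {
    var firstData = r1[0];   // data from ajax_call_1
    var secondData = r2[0];  // data from ajax_call_2
    // ...
}).fail(function(jqXHR, textStatus, errorThrown) {
    // called as soon as any one of the requests fails
});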

Related

jQuery: Retrieve data in a separate thread

I have a web application that has to perform the following task. For a chosen date range, it makes a GET request to a web service for each date in the range; this can take a while, and because I want to visualize the data afterwards, all the calls are synchronous (the result of each request gets stored into an array). This retrieval takes several seconds, which means the main thread "freezes".
What would be a good way to avoid this? (E.g. doing the retrieval in a separate thread and getting notified once it's done.)
Consider using promises.
They let you make non-blocking calls to the API, which is basically what you are asking for.
EDIT: You can use when() specifically to be notified when all operations are done.
Make your GET requests asynchronous and visualize once all of them have completed:
var get1 = $.get(/* ... */);
var get2 = $.get(/* ... */);
var get3 = $.get(/* ... */);
$.when(get1, get2, get3).done(function (res1, res2, res3) {
    // do something with the responses
    visualize();
});
In fact there is a simple solution. Let's implement a function that needs to be executed once all responses have arrived:
function onFinished(responses) {
    // Do something
}
Now, let's suppose you have a function which returns the dates as an array:
function getDates(range) {
    // Do something
}
Also, we need a getURL, like this:
function getURL(date) {
    // Do something
}
Finally, let's suppose you have a variable called dateRange which holds the range you will pass to getDates. Then we can do this:
var requestDates = getDates(dateRange);
var requestsPending = requestDates.length;
var responses = [];

for (var i = 0; i < requestDates.length; i++) {
    $.ajax({
        url: getURL(requestDates[i]),
        method: "GET"
        // You might pass data here as well if needed, in the form of
        // data: yourobject,
    }).done(function(data, textStatus, jqXHR) {
        // Handle the response, parse it and store the result using responses.push()
    }).fail(function(jqXHR, textStatus, errorThrown) {
        // Handle failed requests
    }).always(function(param1, param2, param3) {
        if (--requestsPending === 0) {
            onFinished(responses);
        }
    });
}
This sends an AJAX request for each date you need and waits for the responses asynchronously, so you effectively wait only as long as the slowest request rather than the sum of all of them, which is a great optimization. You cannot solve this in a multithreaded fashion, as JavaScript is single-threaded, so you need to wait for the answers asynchronously; the requests won't wait for each other on the server. If you own the server as well, you do not need to send a request for each date: implement a server-side API function that handles date ranges, so the client sends a single request and waits for a single answer.
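As a sketch of that last idea (the endpoint name and the shape of dateRange are assumptions, not part of the original code), the client side collapses to a single call:
// Hypothetical server-side endpoint that accepts a whole date range at once.
$.ajax({
    url: "/api/data-for-range",                          // placeholder URL
    method: "GET",
    data: { from: dateRange.start, to: dateRange.end }   // assumed shape of dateRange
}).done(function(data) {
    onFinished(data);   // one response covering every date in the range
}).fail(function(jqXHR, textStatus, errorThrown) {
    // Handle the failed request
});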

AJAX requests not executing in parallel using $.when(...)

My script makes around 15 AJAX calls to my server, and I want them to execute in parallel. I am trying to use $.when to run the requests in parallel, but for some reason they still seem to be handled one by one.
My code is below:
var requests = $.map(results, function(result, i) {
    var suggestionContainer = $('<div class="suggestion-' + i + '">' + result + '</div>');
    resultsContainer.append(suggestionContainer);
    return $.get('check.php', { keyword: result }).done(function(res) {
        res = JSON.parse(res);
        suggestionContainer.append(generateSocialMediaList(res.social_media))
            .append(generateDomainList(res.domains));
    }).fail(function() {
        suggestionContainer.remove();
    });
});
$.when(requests).done(function() {
    console.log('Complete')
}, function() {
    alert('Something Failed');
});
Is there anything I'm doing wrong?
The reason I am making 15 requests is that check.php calls a third-party API. The API is slow, and unfortunately there is no alternative. Making 15 requests in parallel would be much quicker than sending one request and waiting for check.php to complete before sending the next.
The code works like so:
1. A request to suggestions.php is made (not included, as it's not required to solve the problem). The results are stored in the array results.
2. The results (there are about 15) are returned and iterated over with map.
3. I insert each suggestion into the page and then return a promise.
4. The requests array now contains 10-15 promises.
5. I use $.when to execute the requests in parallel (that's what I'm trying to do, at least).
6. On success the DOM is updated and the results from check.php are inserted into it.
Your code will execute the requests with as much parallelism as the browser allows.
Browsers used to limit concurrent requests to the same domain to about six; this may have changed.
The responses to these requests are serviced one by one because JavaScript is single-threaded, and that is what you are observing.
Also: double-check that your AJAX calls are not failing. $.when rejects as soon as any one of the promises is rejected.
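One more thing worth checking in the snippet above: $.when does not accept an array directly (see the queuing answer further down), so the array has to be applied. A small sketch of that correction:
// $.when expects promises as separate arguments, not a single array, so apply it:
$.when.apply($, requests).done(function() {
    // one argument per request, in the same order as `requests`
    console.log('Complete');
}).fail(function() {
    alert('Something Failed');
});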

Multiple AJAX calls in loop

I need a way to send multiple AJAX calls at the same time in Javascript/Angular.
After some searching I couldn't find an answer.
What I want to do is send all my requests as fast as possible.
If I execute my calls in a for loop or in a queue of promises with the $q library in Angular, a request gets sent, it waits for the callback to execute, and then the next one is sent.
This is some example code:
var array = [];
Page.get({id: 1}, function(result) {
    for (var i = 0; i < result.region[0].hotspots.length; i++) {
        var promise = Hotspot.get({id: result.region[0].hotspots[i].id});
        array.push(promise);
    }
    $q.all(array).then(function(data) {
        console.log(data);
    });
});
Page is an Angular resource with a get method that requires an ID.
What I want is for them all to be sent at the same time, each calling its callback when ready. The order in which the calls return doesn't really matter.
Thanks
Think outside the box with Web Workers
An interesting approach to this question is to use web workers to execute the requests in a different thread. If you are not familiar with web workers, I advise you to start with this great tutorial by techsith. Basically, you will be able to execute multiple jobs at the same time. See also the W3Schools documentation.
This article from HTML5 Rocks teaches us how to use Web Workers without a separate script file.
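A rough sketch of the inline-worker idea follows (the endpoint is just a placeholder, and jQuery is not available inside a worker, so plain fetch is used there instead):
// Build the worker from a string so no separate script file is needed.
var workerSource =
    "self.onmessage = function (e) {" +
    "    fetch(e.data.url)" +
    "        .then(function (res) { return res.json(); })" +
    "        .then(function (json) { self.postMessage(json); });" +
    "};";

var blobUrl = URL.createObjectURL(new Blob([workerSource], { type: "application/javascript" }));
var worker = new Worker(blobUrl);

worker.onmessage = function (e) {
    // back on the main thread, with the parsed response
    console.log("result from worker:", e.data);
};
worker.postMessage({ url: "https://httpbin.org/get" }); // placeholder endpoint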
Have you tried using the Async.js module?
You can achieve the desired behaviour with something like:
Page.get({id: 1}, function(result) {
    async.each(result.region[0].hotspots, callAsync, function(err, res) {
        console.log(res);
    });
});

function callAsync(hotspot, callback) {
    callback(null, Hotspot.get({id: hotspot.id}));
}
From the docs:
each(coll, iteratee, [callback])
Applies the function iteratee to each item in coll, in parallel. The iteratee is called with an item from the list, and a callback for when it has finished. If the iteratee passes an error to its callback, the main callback (for the each function) is immediately called with the error.
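Note that each's final callback only receives an error, so res above will be undefined. If you also need the collected responses, async.map from the same library returns them in order. A sketch, reusing the resource-callback style already used with Page.get:
Page.get({id: 1}, function(result) {
    async.map(result.region[0].hotspots, callAsyncMap, function(err, results) {
        // `results` holds one response per hotspot, in the original request order
        console.log(err, results);
    });
});

function callAsyncMap(hotspot, callback) {
    // $resource's get takes a success callback; hand the result to async
    Hotspot.get({id: hotspot.id}, function(res) {
        callback(null, res);
    });
}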
The $http service sends XHRs in parallel. The code below demonstrates 10 XHRs being sent to httpbin.org and subsequently being received in a different order.
angular.module('myApp').controller('myVm', function ($scope, $http) {
    var vm = $scope;
    vm.sentList = [];
    vm.rcvList = [];

    // XHRs with delays from 9 down to 1 second
    for (var n = 9; n > 0; n--) {
        var url = "https://httpbin.org/delay/" + n;
        vm.sentList.push(url);
        $http.get(url).then(function (response) {
            vm.rcvList.push(response.data.url);
        });
    }

    // XHR with a 3 second delay
    var url = "https://httpbin.org/delay/3";
    vm.sentList.push(url);
    $http.get(url).then(function (response) {
        vm.rcvList.push(response.data.url);
    });
});
The DEMO on JSFiddle.
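If the responses are needed together once everything has resolved, as in the original $q.all attempt, the same requests can be wrapped with $q.all. A sketch (the controller name and scope property are only for illustration):
angular.module('myApp').controller('myVmAll', function ($scope, $q, $http) {
    var promises = [];
    for (var n = 9; n > 0; n--) {
        promises.push($http.get("https://httpbin.org/delay/" + n));
    }
    // the requests are already in flight; $q.all just waits for all of them
    $q.all(promises).then(function (responses) {
        $scope.urls = responses.map(function (response) {
            return response.data.url;   // ordered by request, not by arrival
        });
    });
});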

jQuery: AJAX request fires multiple times until response is successful, using $.Deferreds

Problem!
While trying to perform a set of AJAX requests, most of the time at least one of them is stuck with a pending response, which results in a loop of requests until a successful response arrives. Please note that I am using jQuery.when so I can ensure that both requests have been executed.
This behaviour results in:
Multiple requests to the same source
jQuery.always executing as many times as there are requests performed
The interface crashing due to multiple updates to its DOM.
Example
var request = [];

request.push(getProductPrice().done(
    function(price) {
        updateProductPrice(price);
    }
));

request.push(getProductInfo().done(
    function(information) {
        updateProductInformation(information);
    }
));

jQuery.when.apply(undefined, request).always(function() {
    doSomeStuff1();
    doSomeStuff2();
    // ...
    doSomeStuffN();
});

function updateProductPrice(obj) {
    return jQuery.get(...);
}

function updateProductInformation(obj) {
    return jQuery.get(...);
}
Questions?
Is there any reason why I am getting a pending response?
Is this problem related to jQuery.when trying to release the AJAX requests in order to fire up the callbacks?
Facts
If I make the requests to the mentioned sources synchronously, I never get a pending response. I am just trying to avoid using async: false.
Update #1
By pending status I meant the status the web browser reports for my request, which is simply the AJAX call still waiting for its response. The main problem lies in how those AJAX requests are being treated: I am noticing that the functions updateProductPrice() and updateProductInformation() are called N times until the response from the server is successful, and as a result the functions declared in the .always() callback for the requests performed by updateProductPrice() and updateProductInformation() are also called that many times.

Queuing/throttling jQuery ajax requests

I need to fire a number of AJAX requests at a server and then run a callback when they are finished. Normally this would be easy using jQuery's deferred.done(). However, to avoid overwhelming the server, I'm queuing the requests and firing one every X milliseconds.
For example:
var promisesList = [];

var addToQueue = function(workflow) {
    workflowQueue.push(workflow);
};

var startWorkflow = function(workflow) {
    return $.ajax($endpointURL, {
        type: "POST",
        data: {
            action: workflow.id
        },
        success: function() {
        },
        error: function(jqXHR, textStatus, errorThrown) {
        }
    });
};

var startWorkflows = function() {
    var promisesList = [];
    if (workflowQueue.length > 0) {
        var workflow = workflowQueue.shift();
        promisesList.push(startWorkflow(workflow));
        setTimeout(startWorkflows, delay);
    }
};

startWorkflows();

$.when(promisesList).done(function() {
    // do stuff
});
The problem with this is that the promisesList array is initially empty, so the done() callback fires immediately, and only then do the AJAX requests start being sent by setTimeout(). Is there an easy way to create the AJAX requests up front and kind of "pause" them, then fire them using setTimeout()?
I've found various throttle/queue implementations that fire AJAX requests sequentially, but I'm happy for them to be fired in parallel, just with a delay on them.
The first thing you're stumbling over is that when() doesn't work this way with arrays. It accepts an arbitrary list of promises, so you can get around that by applying the array:
$.when.apply(null, promiseList).done(function() {
    // Do something
    // use the `arguments` magic property to get an ordered list of results
});
Secondly, the throttling could be done with an $.ajax {delay: timeInSeconds} parameter, but I've proposed a solution that sets up a new deferred which is returned immediately (to keep the order) but resolved after a timeout.
See http://jsfiddle.net/9Acb2/1/ for an interactive example
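The JSFiddle isn't reproduced here, but the idea described, returning a deferred immediately and only starting the request after a delay, looks roughly like this sketch (it reuses $endpointURL, workflowQueue and delay from the question; delay is assumed to be in milliseconds):
// Return a promise right away, but only start the AJAX call after `wait` ms.
function startWorkflowDelayed(workflow, wait) {
    var deferred = $.Deferred();
    setTimeout(function() {
        $.ajax($endpointURL, {
            type: "POST",
            data: { action: workflow.id }
        }).done(deferred.resolve).fail(deferred.reject);
    }, wait);
    return deferred.promise();
}

// Build the full list up front so $.when sees every promise from the start.
var promisesList = $.map(workflowQueue, function(workflow, i) {
    return startWorkflowDelayed(workflow, i * delay);   // stagger the start times
});

$.when.apply(null, promisesList).done(function() {
    // all delayed requests have finished; individual results are in `arguments`
});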
