I need a way to send multiple AJAX calls at the same time in JavaScript/Angular.
After some searching I couldn't find an answer.
What I want to do is send all my requests as fast as possible.
If I execute my calls in a for loop, or in a queue of promises with Angular's $q library, a request gets sent, waits for its callback to execute, and only then sends the next one.
This is an example of the code:
var array = [];
Page.get({id: 1}, function(result) {
    for (var i = 0; i < result.region[0].hotspots.length; i++) {
        var promise = Hotspot.get({id: result.region[0].hotspots[i].id});
        array.push(promise);
    }
    $q.all(array).then(function(data) {
        console.log(data);
    });
});
Page is an Angular resource with a get method that requires an ID.
What I want is for all the requests to be sent at the same time, each calling its callback when ready. The order in which the calls return doesn't really matter.
Thanks
Think outside the box with Web Workers
An interesting approach to this problem is to use Web Workers to execute the requests in a different thread. If you are not familiar with Web Workers, I advise you to start with this great tutorial by techsith. Basically, you will be able to execute multiple jobs at the same time. See also the W3Schools documentation.
This article from Html5Rocks teaches us how to use Web Workers without a separate script file.
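For instance, here is a minimal sketch of that inline-worker idea (this is not the Html5Rocks code itself, and the /api/hotspot URLs are placeholders): the worker source is kept in a string, turned into a Blob, and the worker fetches all the resources in parallel before posting the results back in one message.
// Worker source kept as a string so no separate file is needed
var workerSource =
    "self.onmessage = function (e) {" +
    "  Promise.all(e.data.urls.map(function (url) {" +
    "    return fetch(url).then(function (res) { return res.json(); });" +
    "  })).then(function (results) {" +
    "    self.postMessage(results);" +
    "  });" +
    "};";

var blob = new Blob([workerSource], { type: "application/javascript" });
var worker = new Worker(URL.createObjectURL(blob));

worker.onmessage = function (e) {
    console.log("all responses:", e.data);
};

// hypothetical endpoints, replace with your own
worker.postMessage({ urls: ["/api/hotspot/1", "/api/hotspot/2", "/api/hotspot/3"] });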
Have you tried using the Async.js module?
You can achieve the desired behavior with something like this:
Page.get({id: 1}, function(result) {
    var hotspots = [];
    async.each(result.region[0].hotspots, callAsync, function(err) {
        console.log(err || hotspots);
    });

    // signal completion only once the individual request has resolved
    function callAsync(hotspot, callback) {
        Hotspot.get({id: hotspot.id}, function(res) {
            hotspots.push(res);
            callback(null);
        });
    }
});
From the docs:
each(coll, iteratee, [callback])
Applies the function iteratee to each item in coll, in parallel. The
iteratee is called with an item from the list, and a callback for when
it has finished. If the iteratee passes an error to its callback, the
main callback (for the each function) is immediately called with the
error.
The $http service sends XHRs in parallel. The code below demonstrates 10 XHRs being sent to httpbin.org and subsequently being received in a different order.
angular.module('myApp').controller('myVm', function ($scope, $http) {
    var vm = $scope;
    vm.sentList = [];
    vm.rcvList = [];

    // XHRs with delays from 9 down to 1 second
    for (var n = 9; n > 0; n--) {
        var url = "https://httpbin.org/delay/" + n;
        vm.sentList.push(url);
        $http.get(url).then(function (response) {
            vm.rcvList.push(response.data.url);
        });
    }

    // XHR with a 3 second delay
    var url = "https://httpbin.org/delay/3";
    vm.sentList.push(url);
    $http.get(url).then(function (response) {
        vm.rcvList.push(response.data.url);
    });
})
The DEMO on JSFiddle.
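Tying this back to the original question, the same demo translates directly to $q.all: collect the $http promises in an array and wait on all of them. A minimal sketch (the /api/hotspot URL is a placeholder):
var requests = [];
for (var id = 1; id <= 5; id++) {
    // all five requests are sent immediately, without waiting for each other
    requests.push($http.get("/api/hotspot/" + id));
}
$q.all(requests).then(function (responses) {
    // responses arrive here in the same order as the requests,
    // even if they completed in a different order
    responses.forEach(function (response) {
        console.log(response.data);
    });
});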
Related
I am currently working on a project where 4 GET requests are fired simultaneously. I am also using fade effects, and the asynchronous nature of this intermittently results in empty data.
I have been looking into the method described in "Prefer way of doing multiple dependent ajax synchronous call" to replace how I am currently doing it:
$.get('ajax_call_1').then(function(value) {
    return $.get('ajax_call_2');
}).then(function(result) {
    // success with both here
}, function(err) {
    // error with one of them here
});
But, my question is: How can I access the return from each request individually with the above?
You said the requests are sent simultaneously, but the way you've written your code, they are sent sequentially. Instead, with Promise.all you can wait on all of the requests' promises, and you'll be given an array with the results:
Promise.all([
    $.get('ajax_call_1'),
    $.get('ajax_call_2'),
    $.get('ajax_call_3'),
    $.get('ajax_call_4')
]).then(function(results) {
    var first = results[0];
    var second = results[1];
    // ...
}).catch(function(err) {
    // called if one of the requests fails
});
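If you also need to react to each response individually (which is what the question asks), you can attach a handler to each promise before handing them all to Promise.all; a small sketch:
var first = $.get('ajax_call_1');
var second = $.get('ajax_call_2');

// individual handlers fire as soon as each request finishes
first.then(function(result1) { console.log('first done', result1); });
second.then(function(result2) { console.log('second done', result2); });

// the combined handler still fires once both have finished
Promise.all([first, second]).then(function(results) {
    console.log('both done', results);
});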
SOLVED: I solved my problem by making each XMLHttpRequest() recursively. Basically, at the end of my xhr.onload I make another request and check whether I've reached the end of my data; when I have, I return.
I'm fairly new to JavaScript and have some familiarity with the D3 library. I'm trying to read a CSV file from my computer using the D3 library and send specific information from the file to an API through an XMLHttpRequest().
With each call to the API which returns a JSON object to me, I store that object in a dataset for later use. I'm trying to have it so that my whole CSV file is read and processed before I work with the dataset, however I'm running into a problem since the API calls are asynchronous.
My code looks something like this:
var myData = [];
d3.csv("myFile.csv", function(data) {
    for (var i = 0; i < data.length; i++) {
        // Get appropriate data from the data object
        // Make API call with XMLHttpRequest() and store the result in the myData array
    }
});
// Handle the fully updated myData array here
As it is, my code goes through the loop almost instantly, makes all the API calls asynchronously, and then proceeds to work on the data without waiting for anything to update.
Is there a way to ensure that my CSV file has been processed and all the API calls have returned before I can work with this dataset? I've tried callback functions and promises but had no success.
You can easily do this with a simple counter
var counter = 0;
var myData = [];
d3.csv("myFile.csv", function(data) {
    for (var i = 0; i < data.length; i++) {
        // Get appropriate data from the data object
        $.get("your/api/path", function(result) {
            counter++; // increments on each completed xhr call
            myData.push(result);
            if (counter === data.length) cb(myData); // if this was the last outstanding request, call our cb function
        });
    }
});

function cb(data) {
    // This runs only when all the http requests are complete
    // Do something with data
}
All this code does is make sure that all of the requests have completed before our cb function is called (cb is where you write your further logic). This approach guarantees that cb runs only when every xhr request is complete.
I think the answer in this post could help:
d3: make the d3.csv function synchronous
You can also use the Promise API.
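For example, a rough sketch of the Promise-based version (it assumes the API call can be made with a promise-returning function such as fetch, and the endpoint and the row.id column are placeholders):
var myData = [];
d3.csv("myFile.csv", function(data) {
    // one promise per row; all requests are issued immediately
    var requests = data.map(function(row) {
        return fetch("your/api/path?id=" + row.id)   // hypothetical endpoint and column
            .then(function(response) { return response.json(); });
    });

    Promise.all(requests).then(function(results) {
        myData = results;
        // Handle the fully updated myData array here
    });
});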
I need to fire a number of ajax requests at a server and then run a callback when they are finished. Normally this would be easy using jQuery's deferred.done(). However, to avoid overwhelming the server, I'm queuing the requests and firing one every X milliseconds.
e.g.:
var workflowQueue = []; // populated elsewhere via addToQueue
var promisesList = [];

var addToQueue = function(workflow) {
    workflowQueue.push(workflow);
};

var startWorkflow = function(workflow) {
    return $.ajax($endointURL, {
        type: "POST",
        data: {
            action: workflow.id
        },
        success: function() {
        },
        error: function(jqXHR, textStatus, errorThrown) {
        }
    });
};

var startWorkflows = function() {
    if (workflowQueue.length > 0) {
        var workflow = workflowQueue.shift();
        promisesList.push(startWorkflow(workflow));
        setTimeout(startWorkflows, delay);
    }
};

startWorkflows();
$.when(promisesList).done(function() {
    //do stuff
});
The problem with this is that the promisesList array is initially empty, so the done() callback fires immediately, and only then do the ajax requests start getting sent by the setTimeout(). Is there an easy way to create the ajax requests up front, keep them "paused", and then fire them with setTimeout()?
I've found various throttle/queue implementations to fire ajax requests sequentially, but I'm happy for them to be fired in parallel, just with a delay on them.
The first thing you're stumbling on is that when() doesn't work this way with arrays. It accepts an arbitrary list of promises, so you can get around that by applying the array:
$.when.apply(null, promiseList).done(function() {
    // Do something
    // use the `arguments` magic property to get an ordered list of results
});
Secondly, to throttle the requests, I've proposed a solution that sets up a new Deferred which is returned immediately (to keep the order) but only resolved after a timeout.
See http://jsfiddle.net/9Acb2/1/ for an interactive example
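Roughly, the idea looks like this (a sketch, not the exact fiddle code): each workflow gets a Deferred that is returned immediately, while the real $.ajax call is started from a setTimeout, so the requests go out staggered but can still be awaited together. $endointURL and workflowQueue are the variables from the question.
function delayedRequest(workflow, delay) {
    var deferred = $.Deferred();
    setTimeout(function() {
        $.ajax($endointURL, {
            type: "POST",
            data: { action: workflow.id }
        }).done(deferred.resolve).fail(deferred.reject);
    }, delay);
    return deferred.promise();
}

var promisesList = workflowQueue.map(function(workflow, i) {
    return delayedRequest(workflow, i * 200); // stagger by 200 ms, adjust as needed
});

$.when.apply(null, promisesList).done(function() {
    // all requests finished; the results are in `arguments`
});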
I have an ajax call that retrieves data and, on success, runs a loop that calls functions which make more ajax calls.
CODE:
success: function(data) {
    // FIRST MAKE SURE DATA WAS FOUND
    console.log(data);
    if (data["status"] == "found") {
        // CREATE NEEDED ARRAY FROM SINGLE STRING
        var restrictionArray = data["data_retrieved"].split(',');
        loop_amount = restrictionArray.length; // AMOUNT TO BE LOOPED FOR BUILDING FORMS
        //var Action = this.Elevation.getActionsByOptionId(Option.getID())[i];
        for (var j = 0; j < loop_amount; j++) {
            var EditRowRestriction = OptionRules.prototype.getEditRowRestriction(j);
            var check = $(EditRowRestriction).find("select");
            console.log(check[0]);
            var EditRowRestirction_select_ability = OptionRules.prototype.getEditRowRestriction_select_ability(j);
            //var EditRowRestirction_access_ability = OptionRules.prototype.getEditRowRestriction_access_ability(j);
            EditRowRestriction.onremove = function() {
                $(this).next().empty(); // RESTRICTION SELECT ABILITY REMOVE
                //$(this).next().next().empty(); // RESTRICTION ACCESS ABILITY REMOVE
                $(this).empty();
                //var Action = this.Action;
                //that.removeAction(Action);
            };
            tbody.appendChild(EditRowRestriction);
            tbody.appendChild(EditRowRestirction_select_ability);
            console.log(check[1]);
        }
    }
},
error: function() {
    alert("An error occurred, please try again.");
}
Here's the problem: inside the for loop, those methods delegate to another method that invokes an ajax call. The loop keeps running and keeps calling those methods even before the ajax calls have finished. What I want is to hold the loop until those methods have returned (i.e. their ajax calls have completed), and only then invoke the last two lines of code within the loop:
tbody.appendChild(EditRowRestriction);
tbody.appendChild(EditRowRestirction_select_ability);
What would be my best approach to accomplishing this?
Suggestions, thoughts?
It would be best to consolidate all of this looping into a single server-side script; however, if that isn't an option, you can use .then:
var def = $.Deferred(function(def) {
    def.resolve();
}).promise(); // used to start the .then chain

for (var i = 0; i < 10; i++) {
    def = def.then(function() {
        return $.ajax({});
    });
}

def.done(function() {
    // All requests from the chain are done
    console.log('all done');
});
http://jsfiddle.net/kg45U/
You could modify the calls to $.ajax within the loop to execute synchronously, rather than asynchronously. (See SO question How can I get jQuery to perform a synchronous, rather than asynchronous, AJAX request?.) This would have the effect of "pausing" the loop while the inner ajax calls execute. This would be the most straightforward approach, but does have the disadvantage of locking up the user's browser while those ajax requests execute.
The alternative is to break the code up so that the processing inside the loop is finished in a callback function that is invoked after the inner ajax calls have completed, as sketched below.
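For example, if the methods that fire the inner ajax calls can be changed to return promises (the names below are hypothetical stand-ins, since the question doesn't show those methods), each loop iteration can wait for its own pair of requests before appending:
for (var j = 0; j < loop_amount; j++) {
    (function (j) {
        // hypothetical: assume each method now returns a promise that resolves
        // with the built row element once its ajax call has finished
        var restrictionReq = OptionRules.prototype.getEditRowRestrictionAsync(j);
        var selectAbilityReq = OptionRules.prototype.getEditRowRestriction_select_abilityAsync(j);

        $.when(restrictionReq, selectAbilityReq).done(function (restrictionRow, selectAbilityRow) {
            // runs only after both inner ajax calls for this iteration are done
            tbody.appendChild(restrictionRow);
            tbody.appendChild(selectAbilityRow);
        });
    })(j);
}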
I'm trying to make long-poll ajax calls, back to back. The problem with the way I'm currently doing it is that I make each successive call from the callback function of the previous call. Is this a problem? Firebug doesn't show any of my ajax calls as completed, even though the data is returned and the callback is executed. The recursive structure seems inefficient. Any ideas?
window.addEvent('domready', function()
{
    server = new Request({
        url: "chat.php",
        method: 'get',
        link: 'ignore',
        onSuccess: callback
    });
    request = server.send();
});

function callback(data)
{
    console.log(data);
    var data = JSON.decode(data);
    messId = data.max;
    for (var i = 0; i < data.messages.length; i++)
    {
        print("", data.messages[i].text);
    }
    var sendString = "messId=" + messId;
    request = server.send(sendString);
}
You're right: you have to maintain a stack and closures for no purpose when you do long polling that way, and depending on the situation and the implementation you might get a stack overflow, or at least run low on memory. That said, I don't know for sure what optimizations the various JS implementations perform (e.g. tail-call elimination, which would make those problems go away).
The easy alternative is to use window.setTimeout(funcName) (with a zero delay), which will call funcName from the global scope as soon as the current call stack has cleared.
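Applied to the code in the question, a minimal sketch: schedule the next poll from setTimeout instead of sending it directly inside the callback, so each request starts from the global scope once the previous callback has returned. server, messId, and print are the same as in the question's code.
function poll(sendString)
{
    request = server.send(sendString);
}

function callback(data)
{
    var decoded = JSON.decode(data);
    messId = decoded.max;
    for (var i = 0; i < decoded.messages.length; i++)
    {
        print("", decoded.messages[i].text);
    }
    // schedule the next poll instead of calling server.send() here directly
    window.setTimeout(function() {
        poll("messId=" + messId);
    }, 0);
}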