JavaScript: how to determine when AJAX calls in a loop are done

I've spent the last few days trying to tackle this issue and have read all sorts of solutions on StackOverflow and other sites.
I'm building a site that grabs XML data from an outside source and then gets more XML depending on the results to build a network graph. The problem is that I have to essentially wait until this loop of AJAX calls (which may loop into more AJAX calls) is finished before drawing.
I don't know if this just has an especially high cognitive load, but it really has me stumped.
My Code:
function cont(outerNodes) {
    for (var i = 0; i < outerNodes.length; i++) {
        var node = outerNodes.pop();
        getXML(node["label"], node["id"]);
    }
    // I want the code to wait until the loop is done, and then draw.
    draw(nodes, edges);
}
function getXML(term, fromId) {
    var url = someURL;
    $.ajax({
        url: url,
        dataType: "xml",
        success: function(result) {
            var outerNodes = process(result, fromId, term);
            cont(outerNodes);
        }
    });
}
Note: I understand I may be completely misunderstanding JavaScript synchronicity here, and I very likely am. I have used callbacks and promises successfully in the past, I just can't seem to wrap my head around this one.
If I have not been totally clear, please let me know.
I did try implementing a counter of sorts that is incremented in the process() function, like so:
if (processCount < 15) {
    for (var i = 0; i < outerNodes.length; i++) {
        var node = outerNodes.pop();
        getXML(node["label"], node["id"]);
    }
} else {
    draw(nodes, edges);
}
However, this ended up with several draw() calls which made my performance abysmal.

There are nice, new, well-supported APIs and language constructs we can use: the Fetch API, await, and for...of loops.
The Fetch API uses Promises. Promises can be awaited. The for...of loop is aware of await and won't continue the loop until the await has passed.
// Loop through, one at a time
for (const node of outerNodes) {
    // Make the HTTP request
    const res = await fetch(someUrl);
    // Do something with the response here...
}
Don't forget a try/catch (which also works with await), and check res.ok.
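For instance, a minimal sketch of that loop inside an async function, with the error handling this answer recommends (someUrl and the response handling are placeholders, not part of the original answer):
async function cont(outerNodes) {
    for (const node of outerNodes) {
        try {
            const res = await fetch(someUrl); // someUrl: whichever endpoint you query per node
            if (!res.ok) {
                throw new Error("HTTP " + res.status);
            }
            const xmlText = await res.text(); // parse the XML out of the text as needed
            // ...process the response, possibly collecting more nodes...
        } catch (err) {
            console.error("Request for node failed:", err);
        }
    }
    // Only reached after every awaited request has settled
    draw(nodes, edges);
}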

Brad's answer changes the code to be synchronous, and to me that defeats the purpose. If you are constantly waiting for every request to finish, it can take a while, whereas normal browsers can handle multiple requests in parallel.
The problem in your original question is one of scope. Since each call to cont(outerNodes) triggers its own scope, it has no idea what the other calls are doing. So basically, if you call cont(outerNodes) twice, each call will handle its own list of outerNodes and then call draw.
The solution is to share information between the different scopes, which can be done with one, but preferably two, global variables: one to track active processes and one to track errors.
var inProcess = 0;
var nrErrors = 0;

function cont(outerNodes) {
    // make sure you have outerNodes before you call outerNodes.length
    if (outerNodes) {
        for (var i = 0; i < outerNodes.length; i++) {
            var node = outerNodes.pop();
            inProcess++; // one more in process
            getXML(node["label"], node["id"]);
        }
    }
    // only trigger when nothing is in process
    if (inProcess == 0) {
        // I want the code to wait until the loop is done, and then draw.
        draw(nodes, edges);
    }
}

function getXML(term, fromId) {
    var url = someURL;
    $.ajax({
        url: url,
        dataType: "xml",
        success: function(result) {
            var outerNodes = process(result, fromId, term);
            inProcess--; // one is done
            cont(outerNodes);
        },
        error: function() {
            inProcess--; // one is done
            nrErrors++; // one more error
            cont(null); // run without new outerNodes, to trigger a possible draw
        }
    });
}
Please note that I track nrErrors but don't do anything with it. You could use it to stop further processing altogether, or to warn the user that the drawing is incomplete.
[important] Keep in mind that this works in JavaScript because it only mimics multithreading. That means the call to inProcess--; and the cont(outerNodes); right after it are always executed directly after each other.
If you were to port this to a true multithreading environment, it could very well be that another scope/version of cont(null); would cut in between those two lines and there would still be multiple draws.
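As an illustration of the warning idea mentioned above (my own addition, not part of the original answer), you could check nrErrors just before drawing:
// only trigger when nothing is in process
if (inProcess == 0) {
    if (nrErrors > 0) {
        // purely illustrative: surface the failures however fits your UI
        console.warn(nrErrors + " request(s) failed; the graph may be incomplete.");
    }
    draw(nodes, edges);
}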

The best way to solve this is to use either a promise or a callback.
If you really want to avoid promises and callbacks (although I don't know why...), you can try a counter.
let processCount = 0;

// Increase processCount in the getXML success callback
function getXML(term, fromId) {
    var url = someURL;
    $.ajax({
        url: url,
        dataType: "xml",
        success: function(result) {
            processCount++;
            var outerNodes = process(result, fromId, term);
            cont(outerNodes);
        }
    });
}

for (var i = 0; i < outerNodes.length; i++) {
    var node = outerNodes.pop();
    getXML(node["label"], node["id"]);
}

while (processCount < outerNodes.length) {
    // do nothing, just wait
}

draw(nodes, edges);

If, after testing it many times, you know that it will never take more than, say, 5 seconds... you can use a setTimeout.
function cont(outerNodes) {
    for (var i = 0; i < outerNodes.length; i++) {
        var node = outerNodes.pop();
        getXML(node["label"], node["id"]);
    }
    // Display a 5 second progress bar here
    setTimeout(function() { draw(nodes, edges); }, 5000);
}

Related

Async call from function, calling API

I'm still learning about async.
I can't get my asynchronous code to work. I have a function that calls an API.
I call this function from another function, so that I can manipulate the data later and merge it.
But my code always moves on without waiting for the first function's answer.
Async / promise problem
var exchangeListing = {
    "exchange": [
        {
            "name": "Kraken",
            "pair": ['XBTUSD', 'XETHUSD']
        }, {
            "name": "Coinbase",
            "pair": ["BTC-USD", "ETH-USDC"]
        }, {
            "name": "Bittrex",
            "pair": ["BTC-USD", "BTC-ETH"]
        }
    ]
}

function BXgetPrice(currencies) {
    var ret;
    bittrex.getmarketsummary({ market: currencies }, function(data, err) {
        if (data != null) {
            ret = {
                message: {
                    type: 'success',
                    data: {
                        last: data.result[0].Last,
                        volume: data.result[0].Volume,
                        date: data.result[0].TimeStamp,
                    }
                }
            };
            return ret;
        } else {
            return;
        }
    });
    return ret;
}
// Loop to get all the data we need from all exchanges, and merge it into a JSON to send to the HTML page
function exchangeListe(exchangeListing) {
    var exchangeListing = JSON.stringify(exchangeListing);
    var exchangeListing = JSON.parse(exchangeListing);
    console.log(exchangeListing);
    for (var i = 0, len = exchangeListing.exchange.length; i < len; i++) {
        var a = exchangeListing.exchange[i].name;
        for (var j = 0, l = exchangeListing.exchange[i].pair.length; j < l; j++) {
            if (a == "Kraken") {
                // add code to manipulate data
            } else if (a == "Bittrex") {
                console.log("bittrex");
                BXgetPrice(exchangeListing.exchange[i].pair[j], function(data, err) {
                    console.log("hi" + data);
                    // add code to manipulate data
                });
            } else if (a == "Coinbase") {
                // add code to manipulate data
            }
        }
    }
}
In exchangeListe, I need my loop to go through all the available exchanges and all pairs, get the data from the API call, and merge it afterwards.
For now, when I launch exchangeListe(exchangeListing), I don't get the data from the BXgetPrice function. Data is empty.
I tried adding async to both functions, declaring the result as a const, and using await, but nothing helped; in that case I get Promise {}.
Thanks for your help
This behavior is by design. An asynchronous call means that you do something (like send an AJAX request) and do not wait for it; your program keeps going before the result has been received. You expect your code to run in a synchronous manner, but it does not. You could make your code synchronous, but that would be a bad idea, because while you wait for synchronous responses your page might freeze, which is bad UX. The right way to do this is either to call exchangeListe in your callback (the function which is passed to the asynchronous function is the callback, i.e. it will be executed when the async job is done), or to refactor your code and use promises instead, with equivalent results but more elegant code.
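As a rough sketch of the promise route (assuming the bittrex client's callback signature is (data, err), as in the question's code; this is not part of the original answer), BXgetPrice could resolve a promise instead of returning ret, and the caller could wait on it:
// Sketch: wrap the callback-based API in a Promise
function BXgetPrice(currencies) {
    return new Promise(function(resolve, reject) {
        bittrex.getmarketsummary({ market: currencies }, function(data, err) {
            if (data != null) {
                resolve({
                    last: data.result[0].Last,
                    volume: data.result[0].Volume,
                    date: data.result[0].TimeStamp
                });
            } else {
                reject(err);
            }
        });
    });
}

// Caller: await the result inside an async function
async function exchangeListe(exchangeListing) {
    for (const exchange of exchangeListing.exchange) {
        if (exchange.name === "Bittrex") {
            for (const pair of exchange.pair) {
                const price = await BXgetPrice(pair); // waits for the API before continuing
                console.log(pair, price);
            }
        }
        // handle Kraken / Coinbase similarly
    }
}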

Async request into for loop angular.js

I have an array and I need to send the values of the array to a web service through HTTP POST requests, one by one. In Node.js I'm using the "async" package to do that, e.g. async.eachSeries does it well. How can I do the same thing in AngularJS? My normal async code:
// this code sends all queries of the array (maybe 5,000 requests at the same time, which is hard for the web service to process :=) ) at once and waits for all responses.
// it works, but actually the responses should wait for each other; the loop should run one by one,
// like the async.eachSeries module!
for (var i = 0; i < myArr.length; i++) {
    (function (i) {
        var data = {
            "myQuery": myArr[i].query
        };
        $http.post("/myServiceUrl", data).success(function (result) {
            console.log(result);
        });
    })(i);
}
Both Matt Way's and Chris L's answers are correct; you can look into Chris's answer to understand how to sequence async functions in for loops.
You can use $q to create a similar requirement by chaining promises together. For example:
var chain = $q.when();
angular.forEach(myArr, function(item){
    chain = chain.then(function(){
        var data = {
            myQuery: item.query
        };
        return $http.post('/myServiceUrl', data).success(function(result){
            console.log(result);
        });
    });
});
// the final chain object will resolve once all the posts have completed.
chain.then(function(){
    console.log('all done!');
});
Essentially you are just running the next promise once the previous one has completed. Emphasis here on the fact that each request will wait until the previous one has completed, as per your question.
function logResultFromWebService(value) {
    $http.post("/myServiceUrl", value).success(console.log);
}

angular.forEach(myArray, logResultFromWebService);
If I understand your question correctly, you want to run a for loop in a synchronized manner such that the next iteration only occurs once the previous iteration has completed. For that, you can use a synchronized loop with callbacks, especially if the order matters.
var syncLoop = function (iterations, process, exit) {
    var index = 0,
        done = false,
        shouldExit = false;
    var loop = {
        next: function () {
            if (done) {
                if (shouldExit && exit) {
                    return exit(); // Exit if we're done
                }
            }
            // If we're not finished
            if (index < iterations) {
                index++; // Increment our index
                process(loop); // Run our process, pass in the loop
            // Otherwise we're done
            } else {
                done = true; // Make sure we say we're done
                if (exit) exit(); // Call the callback on exit
            }
        },
        iteration: function () {
            return index - 1; // Return the loop number we're on
        },
        break: function (end) {
            done = true; // End the loop
            shouldExit = end; // Passing end as true means we still call the exit callback
        }
    };
    console.log('running first time');
    loop.next();
    return loop;
}
For your particular implementation:
syncLoop(myArray.length, function (loop) {
    var index = loop.iteration();
    var data = {
        "myQuery": myArray[index].query
    };
    $http.post("/myServiceUrl", data).success(function (result) {
        console.log(result);
        loop.next();
    });
}, function () {
    console.log('done');
});
If you intend to do something with the data once it is returned (such as performing calculations), you can do so with this method because the data comes back in a specified order.
I implemented something similar in a statistical calculation web app I built.
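For example, a minimal sketch (my own addition, reusing the names from the snippet above) of collecting the responses in order before acting on them:
var results = [];
syncLoop(myArray.length, function (loop) {
    var index = loop.iteration();
    var data = { "myQuery": myArray[index].query };
    $http.post("/myServiceUrl", data).success(function (result) {
        results.push(result); // results arrive strictly in array order
        loop.next();          // only now start the next request
    });
}, function () {
    // every request has completed; results[i] matches myArray[i]
    console.log(results);
});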
EDIT:
To illustrate the problem I had when using $q.when I have set up a fiddle. Hopefully this will help illustrate why I did this the way I did.
https://jsfiddle.net/chrislewispac/6atp3w8o/
Using the following code from Matt's answer:
var chain = $q.when(promise.getResult());
angular.forEach(myArr, function (item) {
    chain = chain.then(function () {
        $rootScope.status = item;
        console.log(item);
    });
});
// the final chain object will resolve once all the posts have completed.
chain.then(function () {
    console.log('all done!');
});
And this fiddle is an example of my solution:
https://jsfiddle.net/chrislewispac/Lgwteone/3/
Compare the $q version to my version. View the console and imagine those being delivered to the user interface for user intervention in the process and/or performing statistical operations on the sequential returns.
You will see that Matt's answer does not give the numbers 1, 2, 3, 4, etc. sequentially, either in the console or in the view. It 'batches' the responses and then returns them. Therefore, if step 3 should or should not run depending on the response to step 2, there is no way, at least in the answer provided, to break out or explicitly control the sequential operation. This presents a significant problem when attempting to perform sequential calculations and/or allow the user to control break points, etc.
Now, I am digging through both the $q libraries and the Q library to see if there is a more elegant solution for this problem. However, my solution does work as requested and is very explicit which allows me to place the function in a service and manipulate for certain use cases at my will because I completely understand what it is doing. For me, that is more important than using a library (at least at this stage in my development as a programmer and I am sure there are lots of other people at the same stage on StackOverflow as well).
If the order in which they are sent doesn't matter:
var items = [/* your array */];
var promises = [];

angular.forEach(items, function(value, key){
    var promise = $http.post("/myServiceUrl", { "myQuery": value.query });
    promises.push(promise);
});

return $q.all(promises);

Wait for all $http requests to complete in Angular JS

I have a page that can make a different number of $http requests depending on the length of a variable, and then I want to send the data to the scope only when all the requests have finished. For this project I do not want to use jQuery, so please do not include jQuery in your answer. At the moment, the data is sent to the scope as each of the requests finishes, which isn't what I want to happen.
Here is part of the code I have so far.
for (var a = 0; a < subs.length; a++) {
    $http.get(url).success(function (data) {
        for (var i = 0; i < data.children.length; i++) {
            rData[data.children.name] = data.children.age;
        }
    });
}
Here is the part that I am sceptical about, because something needs to be an argument for $q.all(), but it is not mentioned on the docs for Angular and I am unsure what it is meant to be.
$q.all().then(function () {
    $scope.rData = rData;
});
Thanks for any help.
An $http call always returns a promise, which can be used with the $q.all function.
var one = $http.get(...);
var two = $http.get(...);
$q.all([one, two]).then(...);
You can find more details about this behaviour in the documentation:
all(promises)
promises - An array or hash of promises.
In your case you need to create an array and push all the calls into it in the loop. This way, you can use $q.all(…) on your array the same way as in the example above:
var arr = [];
for (var a = 0; a < subs.length; ++a) {
    arr.push($http.get(url));
}

$q.all(arr).then(function (ret) {
    // ret[0] contains the response of the first call
    // ret[1] contains the second response
    // etc.
});
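Building on that, a minimal sketch (my own, assuming each response body has a children array of {name, age} objects, based on the question's loop over data.children) of assembling rData only after every request has resolved:
var arr = [];
for (var a = 0; a < subs.length; ++a) {
    arr.push($http.get(url));
}

$q.all(arr).then(function (responses) {
    var rData = {};
    angular.forEach(responses, function (res) {
        // assumed shape: res.data.children = [{name: ..., age: ...}, ...]
        angular.forEach(res.data.children, function (child) {
            rData[child.name] = child.age;
        });
    });
    $scope.rData = rData; // the scope is only updated once everything is in
});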

Callback to prevent looping before completion of ajax

I have read countless examples of similar posts calling for help and also explanations of the theory behind callbacks, but I just can't grasp it. I have gotten to the stage where I'd rather find a solution for my particular scenario and move on even if I don't really understand 'why/how' it works.
I have an AJAX call that needs to run in a loop, and I need to find a way to prevent the next call from starting before the previous one has completed. Could you suggest how I might use a callback or another method to make this happen?
Here's the code (which works, but doesn't run the ajax calls 1-by-1 so I'm getting memory errors and page crashes). The function that runs is quite intensive and can take up to 20 seconds (but as little as 1 second)
function returnAjax(startLoc, startRow) {
    var url = 'index.php?option=com_productfinderrtw&format=raw&task=goThroughProcess';
    var data = 'startloc=' + startLoc + '&starttour=' + startRow;
    var request = new Request({
        url: url,
        method: 'get',
        data: data,
        onSuccess: function(responseText) {
            document.getElementById('fields-container').innerHTML = responseText;
            // I realise this is where on-success code needs to go - is this where the callback belongs?
        }
    }).send();
}
function iterator(startLoc, startRow) {
    if (startRow < 20) {
        startRow++;
    } else {
        startRow = 1;
        startLoc++;
    }
    return [startLoc, startRow];
}
function runRAA() {
    var startLoc = 0;
    var startRow = 1;
    while (startLoc < 47) {
        returnAjax(startLoc, startRow);
        $counter = iterator(startLoc, startRow);
        var newLoc = $counter[0];
        var newRow = $counter[1];
        startLoc = newLoc;
        startRow = newRow;
    }
}
runRAA() is the main function that runs on a button press. How can I rearrange this to make sure that returnAjax doesn't run until the previous time is completed?
Thanks in advance for this. I KNOW that similar questions have been asked, so I beg that you don't direct me to other explanations- chances are I've read them but just don't grasp the concept.
Cheers!
PS. I understand that the iterator() function needs to run only when the returnAjax() is complete also, as iterator() sets the new parameter values for each instance of the returnAjax() function
Allow a callback parameter to be passed in; it will be called when the AJAX call completes.
function returnAjax(startLoc, startRow, callback) {
    //...
    onSuccess: function(responseText) {
        document.getElementById('fields-container').innerHTML = responseText;
        if (callback) {
            callback.apply(this, arguments); // call the callback
        }
    }
    //...
}
Then you can do something like this:
function runRAA(startLoc, startRow) {
    startLoc = startLoc || 0;
    startRow = startRow || 1;
    if (startLoc < 47) {
        returnAjax(startLoc, startRow, function (responseText) {
            var counter = iterator(startLoc, startRow);
            // do something with the response
            // perform the next ajax request
            runRAA(counter[0], counter[1]);
        });
    }
}
runRAA(); //start the process

how to wait until Array is filled (asynchronous)

I have an array which is filled asynchronously and will contain 28 items. I want to wait until the array is filled with all of the items.
function checkIfFinished() {
    return (Results.length >= 28);
}

var isfinished = false;

while (!isfinished) {
    if (checkIfFinished()) {
        returnResults();
        isfinished = true;
    } else {
        // Wait 100ms
    }
}
Well, but in Javascript there is no wait function! I tried it with setTimeout, but I don't know how to insert it... I just get errors with too much recursion and stuff :D
Thank you!
Try:
var timeout = setInterval(function() {
    if (checkIfFinished()) {
        clearInterval(timeout);
        isFinished = true;
    }
}, 100);
This will call your check function every 100 ms until checkIfFinished() returns true.
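In the asker's setup you would presumably also hand the results on from inside the interval, something like this (a sketch using the question's own function names, not part of the original answer):
var timeout = setInterval(function() {
    if (checkIfFinished()) {
        clearInterval(timeout);
        returnResults(); // the array now holds all 28 items
    }
}, 100);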
If you're using jQuery 1.5+, this sounds like a perfect opportunity to use deferred objects and promises in your code. I'm assuming that you're using AJAX calls to populate your array.
In a nutshell, something like this should work for you:
$(function() {
    var $ajaxcalls = [],
        myArray = [];

    // set up all the ajax calls that will populate my array
    for (var i = 0; i < 28; i++) {
        $ajaxcalls[i] = $.ajax({
            url : 'http://your.domain.com/blah',
            data : i
        }).success(function(m) {
            myArray.push(m);
        });
    }

    // this will set up the promise ---
    // what will run when all 28 AJAX calls complete?
    $.when.apply(null, $ajaxcalls).then(function() {
        returnResults();
    });
});
I've written about this some time back as well. I really think it's a nifty feature/concept that can be very powerful when used correctly. JavaScript timers and schedules would work as well, but they can be unwieldy and may add a bit of wait time before the completion logic actually fires.
