jQuery AJAX requests in a loop: array issue - javascript

I've got some AJAX requests in a for loop. The loop steps through an array of URLs and sends a request to each of them, checks for something on each page, and pages containing the right data have an entry written to an array. The array is a 'global' variable.
The loop works fine. The requests work fine. The array isn't working properly. The done() function for request 1 will push to the array, but the next done() will just overwrite. This consistently happens: done() calls in rapid succession overwrite sections of the array instead of adding to it. I don't know why this is happening.
I understand the AJAX calls come back in unpredictable orders and timings, but I thought my use of array.push() would get around the need to specify indexes. What's going on?
var globalArray = [];
for (var i = 0; i < URLList.length; i++) {
    (function (i) {
        $.ajax({
            url: URLList[i],
            cache: false
        }).done(function (html) {
            if (html.indexOf('someString') != -1) {
                globalArray.push(URLList[i]);
            }
        });
    })(i);
}
So basically, even when the second done() triggers after the first done(), the second done()'s push() acts as though the array were still empty. I don't know why it doesn't see the earlier entries each time.

Improvement on Varrinder's answer: triggering the next request from .always() keeps the chain going even when a request fails.
var URLList = ["url1", "url2", "url3"];
var ajaxResultsArray = [];
makeRequest();
function makeRequest() {
    var urlToRequest = URLList.shift();
    if (urlToRequest != undefined) {
        $.ajax({
            url: urlToRequest,
            cache: false
        }).done(function (html) {
            if (html.indexOf("someString") != -1) {
                ajaxResultsArray.push(urlToRequest);
            }
        }).always(makeRequest);
    }
}

Hope I'm not completely missing the point here.
I'd try Array.shift().
Something like below:
var URLList = ["url1", "url2", "url3"];
var ajaxResultsArray = [];
makeRequest();
function makeRequest() {
    var urlToRequest = URLList.shift();
    if (urlToRequest != undefined) {
        $.ajax({
            url: urlToRequest,
            cache: false
        }).done(function (html) {
            if (html.indexOf("someString") != -1) {
                ajaxResultsArray.push(urlToRequest);
            }
            makeRequest();
        });
    }
}

I have declared a global integer and used it as an explicit index.
So far, this now seems to work. I can only surmise that push() takes too long to work things out. Incidentally, in my console.log I also check globalArray.length, and surprisingly it doesn't seem to keep up very well with the globalArrayIndex method I'm now using. I guess .length is really slow, so many done() calls in quick succession caused the issue.
var globalArray = [];
var globalArrayIndex = 0;
for (var i = 0; i < URLList.length; i++) {
    (function (i) {
        $.ajax({
            url: URLList[i],
            cache: false
        }).done(function (html) {
            if (html.indexOf('someString') != -1) {
                globalArray[globalArrayIndex] = URLList[i];
                globalArrayIndex++;
            }
        });
    })(i);
}

Related

JavaScript how to determine when AJAX calls are done in a loop

I've spent the last few days trying to tackle this issue and have read all sorts of solutions on StackOverflow and other sites.
I'm building a site that grabs XML data from an outside source and then gets more XML depending on the results to build a network graph. The problem is that I have to essentially wait until this loop of AJAX calls (which may loop into more AJAX calls) is finished before drawing.
I don't know if this just has an especially high cognitive load, but it really has me stumped.
My Code:
function cont(outerNodes) {
    for (var i = 0; i < outerNodes.length; i++) {
        var node = outerNodes.pop();
        getXML(node["label"], node["id"]);
    }
    // I want the code to wait until the loop is done, and then draw.
    draw(nodes, edges);
}
function getXML(term, fromId) {
    var url = someURL;
    $.ajax({
        url: url,
        dataType: "xml",
        success: function (result) {
            var outerNodes = process(result, fromId, term);
            cont(outerNodes);
        }
    });
}
Note: I understand I may be completely misunderstanding JavaScript synchronicity here, and I very likely am. I have used callbacks and promises successfully in the past, I just can't seem to wrap my head around this one.
If I have not been totally clear, please let me know.
I did try implementing a counter of sorts that is incremented in the process() function, like so:
if (processCount < 15) {
    for (var i = 0; i < outerNodes.length; i++) {
        var node = outerNodes.pop();
        getXML(node["label"], node["id"]);
    }
} else {
    draw(nodes, edges);
}
However, this ended up with several draw() calls, which made my performance abysmal.
There are nice, new, well-supported APIs and language constructs we can use: the Fetch API, await, and for...of loops.
The Fetch API uses Promises. Promises can be awaited. A for...of loop is aware of await and won't continue to the next iteration until the await has resolved.
// Loop through, one at a time.
// (This must run inside an async function, or somewhere top-level await is allowed.)
for (const node of outerNodes) {
    // Make the HTTP request
    const res = await fetch(someUrl);
    // Do something with the response here...
}
Don't forget a try/catch (which also works with await), and check res.ok.
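For instance, the loop above could be wrapped like this (a minimal sketch; checkNodes is just an illustrative name, and someUrl remains the placeholder from the snippet):
async function checkNodes(outerNodes) {
    for (const node of outerNodes) {
        try {
            const res = await fetch(someUrl);
            if (!res.ok) {
                // HTTP-level failures (4xx/5xx) do not throw, so check res.ok
                console.error('Request failed with status ' + res.status);
                continue;
            }
            const html = await res.text();
            // Do something with the response here...
        } catch (err) {
            // Network errors and aborted requests land here
            console.error(err);
        }
    }
}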
Brad's answer changes the code to be synchronous, and to me that defeats the purpose: if you constantly wait for each request to finish before starting the next, it can take a while, whereas normal browsers can handle multiple requests in parallel.
The problem you have in your original question is with scope. Since each call to cont(outerNodes) runs in its own scope, it has no idea what the other calls are doing. So basically, if you call cont(outerNodes) twice, each call will handle its own list of outerNodes and then call draw.
The solution is to share information between the different scopes, which can be done with one, but preferably two, global variables: one to track active requests and one to track errors.
var inProcess = 0;
var nrErrors = 0;

function cont(outerNodes) {
    // make sure you have outerNodes before you call outerNodes.length
    if (outerNodes) {
        for (var i = 0; i < outerNodes.length; i++) {
            var node = outerNodes.pop();
            inProcess++; // one more in process
            getXML(node["label"], node["id"]);
        }
    }
    // only trigger when nothing is in process
    if (inProcess == 0) {
        // I want the code to wait until the loop is done, and then draw.
        draw(nodes, edges);
    }
}
function getXML(term, fromId) {
    var url = someURL;
    $.ajax({
        url: url,
        dataType: "xml",
        success: function (result) {
            var outerNodes = process(result, fromId, term);
            inProcess--; // one is done
            cont(outerNodes);
        },
        error: function () {
            inProcess--; // one is done
            nrErrors++;  // one more error
            cont(null);  // run without new outerNodes, to trigger a possible draw
        }
    });
}
Please note that I track nrErrors but don't do anything with it. You could use it to stop further processing altogether, or to warn the user that the draw is not complete.
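For example (a minimal sketch; warnUser is a hypothetical helper, not part of the code above), the final check in cont() could become:
if (inProcess == 0) {
    if (nrErrors > 0) {
        warnUser(nrErrors + " request(s) failed; the draw may be incomplete."); // warnUser is hypothetical
    }
    draw(nodes, edges);
}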
[important] Keep in mind that this works in JavaScript because the runtime only mimics multithreading: it is single-threaded, so the call to inProcess--; and the cont(outerNodes); right after it always execute directly after each other.
If you ported this to a true multithreading environment, another thread's cont(null); could very well cut in between those two lines, and there would still be multiple draws.
The best way to solve this is with either a promise or a callback.
If you really want to avoid promises and callbacks (although I don't know why...),
you can try a counter.
let processCount = 0;

// Increase processCount in the getXML success callback
function getXML(term, fromId) {
    var url = someURL;
    $.ajax({
        url: url,
        dataType: "xml",
        success: function (result) {
            processCount++;
            var outerNodes = process(result, fromId, term);
            cont(outerNodes);
        }
    });
}
for (var i = 0; i < outerNodes.length; i++) {
    var node = outerNodes.pop();
    getXML(node["label"], node["id"]);
}
while (processCount < outerNodes.length) {
    // do nothing, just wait
}
draw(nodes, edges);
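For reference, the promise-based route recommended at the top might look like this for a single level of requests (a sketch only: it reuses someURL and process() from the question and does not handle the nested calls that cont() can spawn):
// Collect the jqXHR promises, then draw once all of them resolve.
var requests = outerNodes.map(function (node) {
    return $.ajax({ url: someURL, dataType: "xml" }).done(function (result) {
        process(result, node["id"], node["label"]);
    });
});
$.when.apply($, requests).then(function () {
    draw(nodes, edges);
});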
If, after testing it many times, you know that it will never take more than, say, 5 seconds... you can use a setTimeout.
function cont(outerNodes) {
    for (var i = 0; i < outerNodes.length; i++) {
        var node = outerNodes.pop();
        getXML(node["label"], node["id"]);
    }
    // Display a 5-second progress bar here
    setTimeout(function () { draw(nodes, edges); }, 5000);
}

Wait until all Ajax requests in a for loop are done before moving on?

I have to make a call to the Agile Central API to get a list of defect suites, then iterate through the list and make a nested call to get the list of defects in each suite; the nested call depends on the outer call. I then have to append the rows of data to a table and call doneCallback() to signal the end of data collection. The problem I'm having is that doneCallback() is being called before the requests have completed, so none of the data is actually passed on.
I've tried the approaches in this post: Wait until all jQuery Ajax requests are done? and this post: how to wait until Array is filled (asynchronous). In the console I can see that all the data I want is there but nothing gets appended. My question is: how can I make sure I don't call doneCallback() until all the requests that are made in the loop have finished and pushed the data? Here's my code right now:
function getSuites() {
    return $.ajax({
        url: suitesURL("71101309592") + "&fetch=Name,FormattedID,Defects",
        type: "GET",
        xhrFields: {
            withCredentials: true
        },
        headers: {
            "zsessionid": apiKey
        }
    });
}

function getDefects(_ref) {
    return $.ajax({
        url: _ref,
        type: "GET",
        xhrFields: {
            withCredentials: true
        },
        headers: {
            "zsessionid": apiKey
        }
    });
}
// Download the data
myConnector.getData = function (table, doneCallback) {
    console.log("Getting Data...");
    var ajaxCalls = [], tableData = [];
    var suitesJSON = getSuites();
    suitesJSON.done(function (data) {
        var suites = data.QueryResult.Results;
        for (var i = 0; i < suites.length; i++) {
            (function (i) {
                var defectsJSON = getDefects(suites[i].Defects._ref + "?fetch=Name,FormattedID,State,Priority,CreationDate,c_RootCause,c_RootCauseCRM");
                ajaxCalls.push(defectsJSON);
                defectsJSON.done(function (data) {
                    var defects = data.QueryResult.Results;
                    for (var j = 0; j < defects.length; j++) {
                        tableData.push({
                            "suiteName": suites[i].Name, // the name of the suite collected in the outer call
                            "defectName": defects[j].Name,
                            "FormattedID": defects[j].FormattedID,
                            "State": defects[j].State,
                            "Priority": defects[j].Priority,
                            "CreationDate": defects[j].CreationDate,
                            "RootCause": defects[j].c_RootCause,
                            "RootCauseCRM": defects[j].c_RootCauseCRM
                        });
                    }
                });
            })(i);
        }
    });
    $.when.apply($, ajaxCalls).then(function () {
        console.log(tableData);
        table.appendRows(tableData);
        doneCallback();
    });
};
You should use a better model for getting multiple items. Using a for loop to issue many GET requests is the problem; the real solution is to refactor so that you make one request that returns everything you need.
If that doesn't seem possible to you, here's a way to do what you want in jQuery.
$.when(
    $.get(path, callback), $.get(path, callback), $.get(path, callback)
).then(function () {
    // This is called after all requests are done
});
You could create an array of all your requests, like [$.get(path, callback), request2, request3, ...], and then use the spread operator to pass them as arguments:
var args = [$.get(path, callback), request2, request3 /* etc... */];
$.when(...args).then(() => { /* call here */ });
This link has the rest of the information
https://css-tricks.com/multiple-simultaneous-ajax-requests-one-callback-jquery/
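Put together, that might look like this (a sketch; paths is a placeholder list of URLs, and $.when.apply($, requests) also works on jQuery setups without spread support):
var paths = ["/api/a", "/api/b", "/api/c"]; // placeholder URLs
var requests = paths.map(function (path) {
    return $.get(path);
});
$.when.apply($, requests).then(function () {
    // Each argument here corresponds to one finished request, in array order
    console.log("all requests done");
});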
I think the problem is that you are calling $.when right after getSuites() is executed.
$.when 'sees' the ajaxCalls array empty (because getSuites() hasn't finished yet) and executes doneCallback().
Try calling $.when INSIDE the suitesJSON.done function; that way it runs after the ajaxCalls array has been filled by the first response:
myConnector.getData = function (table, doneCallback) {
    console.log("Getting Data...");
    var ajaxCalls = [], tableData = [];
    var suitesJSON = getSuites();
    suitesJSON.done(function (data) {
        var suites = data.QueryResult.Results;
        for (var i = 0; i < suites.length; i++) {
            (function (i) {
                var defectsJSON = getDefects(suites[i].Defects._ref + "?fetch=Name,FormattedID,State,Priority,CreationDate,c_RootCause,c_RootCauseCRM");
                ajaxCalls.push(defectsJSON);
                defectsJSON.done(function (data) {
                    var defects = data.QueryResult.Results;
                    for (var j = 0; j < defects.length; j++) {
                        tableData.push({
                            "suiteName": suites[i].Name, // the name of the suite collected in the outer call
                            "defectName": defects[j].Name,
                            "FormattedID": defects[j].FormattedID,
                            "State": defects[j].State,
                            "Priority": defects[j].Priority,
                            "CreationDate": defects[j].CreationDate,
                            "RootCause": defects[j].c_RootCause,
                            "RootCauseCRM": defects[j].c_RootCauseCRM
                        });
                    }
                });
            })(i);
        }
        $.when.apply($, ajaxCalls).then(function () {
            console.log(tableData);
            table.appendRows(tableData);
            doneCallback();
        });
    });
};

Run ajax request until it returns results

I currently rely on a simple ajax call to fetch some data from our query service API. It is unfortunately not the most robust API and can at times return an empty result set. As such, I want to retry the ajax call until resultSet.length > 0.
I could use setTimeout and break the loop once I find results, but it seems like an inelegant solution, especially as the time to completion is anywhere between 1s and 6s. I currently have the following, but it doesn't seem to break the loop when needed, and remains inelegant. Any help would be appreciated!
var resultSet = 0;

function fetchQueryData(query, time, iter) {
    (function myLoop(i) {
        if (i == iter) {
            fetchData(resultSet, dataset, query);
        } else {
            setTimeout(function () {
                if (resultSet == 0) {
                    fetchData(resultSet, dataset, query);
                }
                if (--i) myLoop(i);
            }, time);
        }
    })(iter);
}
fetchQueryData('select * from table', 6000, 5);
function fetchData(resultSet, dataset, query) {
    var dataString = 'query=' + encodeURIComponent(query);
    $.ajax({
        type: "POST",
        data: dataString,
        url: "/queryapi",
        success: function (json) {
            var data = [];
            var schema = json.data.schema;
            var rows = json.data.rows;
            if (typeof schema != 'undefined') {
                resultSet = 1;
                for (var i = 0; i < rows.length; i++) {
                    var obj = {};
                    for (var j = 0; j < schema.length; j++) {
                        obj[schema[j]['name']] = rows[i][j];
                    }
                    data.push(obj);
                }
            }
        }
    });
}
Instead of using a setTimeout, wrap the request in a function, and call that same function in the success callback of the request if the returned set is empty.
This will prevent you from sending more than one request to your API at a time, and will also terminate as soon as you get back a satisfactory response.
(In short, you're using recursion instead of an explicit loop.)
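A minimal sketch of that idea, reusing the question's endpoint (the retry cap retriesLeft is an extra safety net, not something the answer specifies, and the dataset plumbing is omitted):
function fetchDataWithRetry(query, retriesLeft) {
    var dataString = 'query=' + encodeURIComponent(query);
    $.ajax({
        type: "POST",
        data: dataString,
        url: "/queryapi",
        success: function (json) {
            var schema = json.data.schema;
            if (typeof schema == 'undefined') {
                // Empty result set: issue the same request again from the
                // success callback, at most retriesLeft more times
                if (retriesLeft > 0) fetchDataWithRetry(query, retriesLeft - 1);
                return;
            }
            // ...build the data array from json.data.rows as before...
        }
    });
}
fetchDataWithRetry('select * from table', 5);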

Handling multiple call asynchronous callbacks

I am learning node.js with learnyounode.
I am having a problem with JUGGLING ASYNC.
The problem is described as follows:
You are given three urls as command line arguments. You are supposed to make http.get() calls to get data from these urls and then print them in the same order as their order in the list of arguments.
Here is my code:
var http = require('http')
var truecount = 0;
var printlist = []

for (var i = 2; i < process.argv.length; i++) {
    http.get(process.argv[i], function (response) {
        var printdata = "";
        response.setEncoding('utf8');
        response.on('data', function (data) {
            printdata += data;
        })
        response.on('end', function () {
            truecount += 1
            printlist.push(printdata)
            if (truecount == 3) {
                printlist.forEach(function (item) {
                    console.log(item)
                })
            }
        })
    })
}
Here is the question I do not understand:
I am trying to store the completed data in response.on('end', function(){}) for each url, using a dictionary. However, I do not know how to get the url for that http.get(). A local variable inside http.get() would be great, but I think whenever I declare a variable as var url, it always ends up pointing to the last url, since it is global and keeps updating through the loop. What is the best way to store the completed data as the value, with the key equal to the url?
This is how I would go about solving the problem.
#!/usr/bin/env node
var http = require('http');

var argv = process.argv.splice(2),
    truecount = argv.length,
    pages = [];

function printUrls() {
    if (--truecount > 0)
        return;
    for (i = 0; i < pages.length; i++) {
        console.log(pages[i].data + '\n\n');
    }
}

function HTMLPage(url) {
    var _page = this;
    _page.data = '### [URL](' + url + ')\n';
    http.get(url, function (res) {
        res.setEncoding('utf8');
        res.on('data', function (data) {
            _page.data += data;
        });
        res.on('end', printUrls);
    });
}

for (var i = 0; i < argv.length; i++)
    pages.push(new HTMLPage(argv[i]));
It adds the requests to an array at the start of each request; that way, once they're done, I can iterate nicely through the responses knowing that they are in the correct order.
When dealing with asynchronous processing, I find it much easier to think about each process as something with a concrete beginning and end. If you require the order of the requests to be preserved, the entry must be made on creation of each process, and then you refer back to that record on completion. Only then can you guarantee that you have things in the right order.
If you were desperate to use your above method, you could define a variable inside your get callback closure and use that to store the urls; that way you wouldn't end up with the last url overwriting your variables. If you do go this way, though, you'll dramatically increase your overhead when you have to use your urls from process.argv to access each response in that order. I wouldn't advise it.
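That closure-variable version might look like the sketch below (results is an illustrative name; it stores each body keyed by its url, as the question asks, and you would still need extra bookkeeping to print in argument order):
var results = {};
for (var i = 2; i < process.argv.length; i++) {
    (function () {
        var url = process.argv[i]; // evaluated now, local to this closure
        http.get(url, function (response) {
            var body = '';
            response.setEncoding('utf8');
            response.on('data', function (data) { body += data; });
            response.on('end', function () {
                results[url] = body; // keyed by url
            });
        });
    })();
}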
I went about this challenge a little differently. I'm creating an array of functions that call http.get and immediately invoking them with their specific context. The streams write to an object whose keys are the ports of the servers each stream is relevant to. When the end event triggers, it adds that server to a completed array; when that array is full, it iterates through and echoes the responses in the original order the servers were given.
There's no right way, but there are probably a dozen or more ways. Wanted to share mine.
var http = require('http'),
    request = [],
    dataStrings = {},
    orderOfServerInputs = [];

var completeResponses = [];

for (server in process.argv) {
    if (server >= 2) {
        orderOfServerInputs[orderOfServerInputs.length] = process.argv[server].substr(-4);
        request[request.length] = function (thisServer) {
            http.get(process.argv[server], function (response) {
                response.on("data", function (data) {
                    dataStrings[thisServer.substr(-4)] = dataStrings[thisServer.substr(-4)] ? dataStrings[thisServer.substr(-4)] : ''; // if not set, set to ''
                    dataStrings[thisServer.substr(-4)] += data.toString();
                });
                response.on("end", function (data) {
                    completeResponses[completeResponses.length] = true;
                    if (completeResponses.length > 2) {
                        for (item in orderOfServerInputs) {
                            serverNo = orderOfServerInputs[item].substr(-4)
                            console.log(dataStrings[serverNo]);
                        }
                    }
                });
            });
        }(process.argv[server]);
    }
}
An Immediately-Invoked Function Expression (IIFE) could be a solution to your problem. It allows us to bind a specific value to a function; in your case, the url which gets the response. In the code below, I bind the variable i to the parameter index, so whichever url gets the response updates its own slot in printlist. For more information, refer to this website.
var http = require('http')
var truecount = 0;
var printlist = [];

for (var i = 2; i < process.argv.length; i++) {
    (function (index) {
        http.get(process.argv[index], function (response) {
            response.setEncoding('utf8');
            response.on('data', function (data) {
                if (printlist[index] == undefined)
                    printlist[index] = data;
                else
                    printlist[index] += data;
            })
            response.on('end', function () {
                truecount += 1
                if (truecount == 3) {
                    printlist.forEach(function (item) {
                        console.log(item)
                    })
                }
            })
        })
    })(i)
}

Callback to prevent looping before completion of ajax

I have read countless examples of similar posts calling for help, and also explanations of the theory behind callbacks, but I just can't grasp it. I have gotten to the stage where I'd rather find a solution for my particular scenario and move on, even if I don't really understand 'why/how' it works.
I have an ajax call that needs to run in a loop, and I need a way to prevent the next call from starting before the previous one has completed. Could you suggest how I might use a callback or other method to make this happen?
Here's the code (which works, but doesn't run the ajax calls one by one, so I'm getting memory errors and page crashes). The function that runs is quite intensive and can take up to 20 seconds (but as little as 1 second).
function returnAjax(startLoc, startRow) {
    var url = 'index.php?option=com_productfinderrtw&format=raw&task=goThroughProcess';
    var data = 'startloc=' + startLoc + '&starttour=' + startRow;
    var request = new Request({
        url: url,
        method: 'get',
        data: data,
        onSuccess: function (responseText) {
            document.getElementById('fields-container').innerHTML = responseText;
            // I realise this is where on-success code needs to go - is this where the callback belongs?
        }
    }).send();
}
function iterator(startLoc, startRow) {
    if (startRow < 20) {
        startRow++;
    } else {
        startRow = 1;
        startLoc++;
    }
    return [startLoc, startRow];
}
function runRAA() {
    var startLoc = 0;
    var startRow = 1;
    while (startLoc < 47) {
        returnAjax(startLoc, startRow);
        $counter = iterator(startLoc, startRow);
        var newLoc = $counter[0];
        var newRow = $counter[1];
        startLoc = newLoc;
        startRow = newRow;
    }
}
runRAA() is the main function that runs on a button press. How can I rearrange this to make sure that returnAjax() doesn't run until the previous call has completed?
Thanks in advance for this. I KNOW that similar questions have been asked, so I beg that you don't direct me to other explanations; chances are I've read them but just don't grasp the concept.
Cheers!
PS. I understand that the iterator() function also needs to run only when returnAjax() is complete, as iterator() sets the new parameter values for each instance of the returnAjax() function.
Allow returnAjax() to take a callback parameter that will be called when the ajax call is completed.
function returnAjax(startLoc, startRow, callback) {
    //...
    onSuccess: function (responseText) {
        document.getElementById('fields-container').innerHTML = responseText;
        if (callback) {
            callback.apply(this, arguments); // call the callback
        }
    }
    //...
}
Then you can do something like this:
function runRAA(startLoc, startRow) {
    startLoc = startLoc || 0;
    startRow = startRow || 1;
    if (startLoc < 47) {
        returnAjax(startLoc, startRow, function (responseText) {
            var counter = iterator(startLoc, startRow);
            // do something with the response
            // perform the next ajax request
            runRAA(counter[0], counter[1]);
        });
    }
}
runRAA(); //start the process
