I have a range of hosted images, but their number and URL can change. However, they always follow the same incrementing naming convention. To record which images exist, I loop through the available images until I encounter an error, at which point I store the results and continue with the operations.
function checkForExistingImage(u, r = [], i = 0) {
    var url = u + i;
    $.get(url).done(function() {
        r.push(url);
        checkForExistingImage(u, r, i + 1);
    }).fail(function() {
        // continue working with the 'r' array
    });
}
However, this will always result in a 403 (Image not found) error in the console, because the last image checked will never exist.
How can I not trigger this error, or maybe suppress it if needed?
I would definitely rewrite this in a more civilised manner. The function below does not log any errors to the console, apart from the informational messages it prints itself via console.info and console.warn (which you can remove).
It should also be safer to use, as it does not bombard the server with too many requests per second; you can reduce or remove the timeout if that's not a problem.
function Timeout(time) {
    return new Promise(function(resolve) { setTimeout(resolve, time); });
}

async function checkForExistingImages(baseUrl, maxImages, startImage = 0) {
    const results = [];
    // Some sanity check on params: swap them if they arrived reversed
    if (maxImages < startImage) {
        let tmp = maxImages;
        maxImages = startImage + 1;
        startImage = tmp;
    }
    // from start to max
    for (let i = startImage; i < maxImages; ++i) {
        // Create image URL, change this as needed
        const imageURL = baseUrl + i + ".png";
        // `fetch` does not usually throw, but we want to avoid errors in the console
        try {
            // Response will have status 200/404/403 or something else
            const response = await fetch(imageURL);
            if (response.status === 200) {
                results.push(imageURL);
            } else {
                console.info("Image", imageURL, "was removed.");
                // stop loading
                break;
            }
        } catch (e) {
            // complete failure, some major error occurred
            console.warn("Image", imageURL, "failed to send request!");
            break;
        }
        // If you're getting throttled by the server, use the timeout
        await Timeout(200);
    }
    return results;
}
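For example, you could call it like this (a minimal sketch; the URL and count are placeholders):
// Scan for up to 50 sequentially numbered images
checkForExistingImages("https://example.com/images/img", 50).then(function(urls) {
    // urls holds every image found before the first missing one
    console.log("Found", urls.length, "images:", urls);
});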
I am new to Node.js and am trying to get an initial page key from an API response header and loop through the pages until the page-key header no longer exists, meaning I've reached the last page and the loop can end.
For some reason the pageKey never gets changed in the while loop the way I have it set up:
async function doThis() {
    let gotAllPages = false;
    let pageKey;
    var response = await got(url, options);
    try {
        pageKey = response.headers['next-page'];
    } catch (e) {
        gotAllPages = true;
    }
    while (!gotAllPages) {
        var pageOptions = {
            method: 'GET',
            headers: {
                Authorization: `xxx`,
                nextPage: pageKey
            }
        };
        response = await got(url, pageOptions);
        try {
            pageKey = response.headers['next-page'];
            console.log(pageKey);
        } catch (e) {
            console.log("No header value found, loop is done.");
            gotAllPages = true;
            break;
        }
    }
}
This ends up going into an infinite loop, because the pageKey never gets changed from the first response, so it just keeps trying to grab the same page.
Being new to Node.js, I don't really understand if there's a scoping issue with how I've set the variables up or what, so any advice would be appreciated!
Try-catch blocks do not fail when a variable is set to undefined or null; reading a missing header simply yields undefined rather than throwing. Instead of using a try-catch block, use an if statement:
pageKey = response.headers['next-page'];
if (!pageKey) {
    console.log("No header value found, loop is done.");
    gotAllPages = true;
    break;
}
The key access response.headers['next-page'] isn't throwing anything, so the catch where gotAllPages is set never runs.
We can fix this, and tidy up a bit...
async function getAllPages() {
    let results = [];
    let nextPage = 0;
    let options = {
        method: 'GET',
        headers: { Authorization: `xxx` }
    };
    while (nextPage !== undefined) {
        options.headers.nextPage = nextPage;
        let response = await got(url, options);
        results.push(response);
        nextPage = response.headers['next-page'];
    }
    return results;
}
This assumes a couple things: (1) that the API expects nextPage === 0 on the first request, (2) that the API signals no more pages by leaving the next-page header undefined, (3) that the caller wants the full response, rather than some part like response.data or response.body, (4) that the caller catches errors thrown here and handles them.
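A minimal usage sketch under those assumptions (picking out response.body is illustrative only; keep whatever part of the response you actually need):
(async () => {
    const pages = await getAllPages();
    console.log('Fetched ' + pages.length + ' pages');
    // e.g. keep only the response bodies
    const bodies = pages.map(response => response.body);
})().catch(console.error);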
I have a Google Apps Script web app ("Web App") that executes as the user, then calls individual functions from another Apps Script project ("API Executable") via the Apps Script API using UrlFetchApp.fetch() and executes them as me (see Get user info when someone runs Google Apps Script web app as me).
A limitation of this method is that UrlFetchApp.fetch() has a 60s timeout, and one of my functions often takes longer than this. The API Executable function finishes running successfully, but the web app throws a timeout exception. I would like to handle this exception by running a second "follow-up" function that finds and returns the URL of the Google Sheet successfully created by the original function. However, I'll need to pass the follow-up function one of the parameters passed to the original function, and it appears I can't do this within a standard try...catch block.
My idea was to throw an exception that contains the needed parameter, but I can't figure out how to throw my own timeout exception; since Google Apps Script is synchronous, there's no way to track how long UrlFetchApp.fetch() has been running while it's running.
Is there a way to throw your own timeout exception? Or, is there another way I can pass the needed parameter to a function that executes if there's a timeout error?
I tagged Javascript in this post as well since there's a lot of overlap with Google Apps Script and I figured it would improve my chance of connecting with someone who has an answer--hope that's okay. Below is the function I'm using in my web app to call my API Executable functions, in case that's helpful.
EDIT: Based on @TheMaster's comment, I decided to write the script as though parameters passed to executeAsMe() WERE being passed to the catch() block, to see what happened. I expected an exception regarding the fact that opt_timeoutFunction was undefined, but strangely it looks like only the first line of the catch() block is even running, and I'm not sure why.
function executeAsMe(functionName, paramsArray, opt_timeoutFunction, opt_timeoutParams) {
    try {
        console.log('Using Apps Script API to call function ' + functionName.toString() + ' with parameter(s) ' + paramsArray.toString());
        var url = 'https://script.googleapis.com/v1/scripts/Mq71nLXJPX95eVDFPW2DJzcB61X_XfA8E:run';
        var payload = JSON.stringify({"function": functionName, "parameters": paramsArray, "devMode": true});
        var params = {
            method: "POST",
            headers: {Authorization: 'Bearer ' + getAppsScriptService().getAccessToken()},
            payload: payload,
            contentType: "application/json",
            muteHttpExceptions: true
        };
        var results = UrlFetchApp.fetch(url, params);
        var jsonResponse = JSON.parse(results).response;
        if (jsonResponse == undefined) {
            var jsonResults = undefined;
        } else {
            var jsonResults = jsonResponse.result;
        }
    } catch(error) {
        console.log('error = ' + error); // I'm seeing this in the logs...
        // ...but not this. It skips straight to the finally block, likely because
        // 'error' is an Error object rather than a string: error.indexOf is not a
        // function, so this line itself throws, and the return in the finally block
        // swallows that exception. Note .toString is also missing its parentheses.
        console.log('error.indexOf("Timeout") = ' + error.indexOf("Timeout").toString);
        if (error.indexOf('Timeout') > 0) { // If error is a timeout error, call follow-up function
            console.log('Using Apps Script API to call follow-up function ' + opt_timeoutFunction.toString() + ' with parameter(s) ' + paramsArray.toString());
            var url = 'https://script.googleapis.com/v1/scripts/Mq71nLXJPX95eVDFPW2DJzcB61X_XfA8E:run';
            var payload = JSON.stringify({"function": opt_timeoutFunction, "parameters": opt_timeoutParams, "devMode": true});
            var params = {
                method: "POST",
                headers: {Authorization: 'Bearer ' + getAppsScriptService().getAccessToken()},
                payload: payload,
                contentType: "application/json",
                muteHttpExceptions: true
            };
            var results = UrlFetchApp.fetch(url, params);
            var jsonResponse = JSON.parse(results).response;
            if (jsonResponse == undefined) {
                var jsonResults = undefined;
            } else {
                var jsonResults = jsonResponse.result;
            }
        }
    } finally {
        console.log('jsonResults = ' + jsonResults);
        return jsonResults;
    }
}
I ended up using the catch() block to throw an exception back to the client side and handle it there.
Google Apps Script:
function executeAsMe(functionName, paramsArray) {
    try {
        console.log('Using Apps Script API to call function ' + functionName.toString() + ' with parameter(s) ' + paramsArray.toString());
        var url = 'https://script.googleapis.com/v1/scripts/Mq71nLXJPX95eVDFPW2DJzcB61X_XfA8E:run';
        var payload = JSON.stringify({"function": functionName, "parameters": paramsArray, "devMode": true});
        var params = {
            method: "POST",
            headers: {Authorization: 'Bearer ' + getAppsScriptService().getAccessToken()},
            payload: payload,
            contentType: "application/json",
            muteHttpExceptions: true
        };
        var results = UrlFetchApp.fetch(url, params);
        var jsonResponse = JSON.parse(results).response;
        if (jsonResponse == undefined) {
            var jsonResults = undefined;
        } else {
            var jsonResults = jsonResponse.result;
        }
        return jsonResults;
    } catch(error) {
        console.log('error = ' + error);
        if (error.toString().indexOf('Timeout') > 0) {
            console.log('Throwing new error');
            throw new Error('timeout');
        } else {
            throw new Error('unknown');
        }
    } finally {
    }
}
Client-side Javascript (a simplified version):
function createMcs() {
    var userFolder = getDataFromHtml().userFolder;
    google.script.run
        .withSuccessHandler(createMcsSuccess)
        .withFailureHandler(createMcsFailure)
        .withUserObject(userFolder)
        .executeAsMe('createMasterCombinedSchedule', [userFolder]);
}

function createMcsSuccess(mcsParameter) {
    if (mcsParameter == undefined) {
        simpleErrorModal.style.display = "block"; // A generic error message
    } else {
        document.getElementById("simpleAlertHeaderDiv").innerHTML = 'Master Combined Schedule Created Successfully';
        document.getElementById("simpleAlertBodyDiv").innerHTML = 'Your Master Combined Schedule was created successfully. Click here to view.';
        simpleAlertModal.style.display = "block";
    }
}

function createMcsFailure(mcsError, userFolder, counter) { // The exception I threw will activate this function
    if (!counter) { // Added a counter to increment every time checkForCreatedMcs() runs so it doesn't run indefinitely
        var counter = 0;
    }
    if (mcsError.message == 'Error: timeout' && counter < 5) { // If timeout error, wait 10s and look for MCS URL
        // Pass a function reference; invoking checkForCreatedMcs(...) directly here would run it immediately
        window.setTimeout(function() { checkForCreatedMcs(mcsError, userFolder, counter); }, 10000);
    } else if (mcsError.message == 'Error: timeout' && counter == 5) { // If it hasn't worked after 5 tries, show generic error message
        simpleErrorModal.style.display = "block";
    } else { // For any error that's not a timeout exception, show generic error message
        simpleErrorModal.style.display = "block";
    }
}

function checkForCreatedMcs(mcsError, userFolder, counter) {
    counter++;
    google.script.run
        .withSuccessHandler(checkForCreatedMcsSuccess)
        .withUserObject([mcsError, userFolder, counter])
        .executeAsMe('checkIfMcsExists', [userFolder]); // checkIfMcsExists() is a pre-existing function in my API Executable project I had already used elsewhere
}

function checkForCreatedMcsSuccess(mcsExistsParameter, params) {
    var mcsError = params[0];
    var userFolder = params[1];
    var counter = params[2];
    if (mcsExistsParameter == undefined) { // If function returns undefined, show generic error message
        simpleErrorModal.style.display = "block";
    } else if (mcsExistsParameter == false) { // If function returns false, wait 10s and try again
        createMcsFailure(mcsError, userFolder, counter);
    } else { // If function returns URL, show success modal with link
        document.getElementById("simpleAlertHeaderDiv").innerHTML = 'Master Combined Schedule Created Successfully';
        document.getElementById("simpleAlertBodyDiv").innerHTML = 'Your Master Combined Schedule was created successfully. Click here to view.';
        simpleAlertModal.style.display = "block";
    }
}
I am sure there has to be a tidier/less complex way to do this, but this worked!
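For what it's worth, one tidier shape might be to wrap google.script.run in a promise so the polling loop reads linearly. This is only a sketch: runAsMe and waitForCreatedMcs are made-up names, and it assumes the same executeAsMe and checkIfMcsExists functions as above.
function runAsMe(functionName, paramsArray) {
    return new Promise(function(resolve, reject) {
        google.script.run
            .withSuccessHandler(resolve)
            .withFailureHandler(reject)
            .executeAsMe(functionName, paramsArray);
    });
}

async function waitForCreatedMcs(userFolder, maxTries = 5, delayMs = 10000) {
    for (var attempt = 0; attempt < maxTries; attempt++) {
        // wait before polling, as in the setTimeout version above
        await new Promise(function(resolve) { setTimeout(resolve, delayMs); });
        var mcsUrl = await runAsMe('checkIfMcsExists', [userFolder]);
        if (mcsUrl) {
            return mcsUrl; // the sheet exists, so the original function finished
        }
    }
    throw new Error('timeout'); // still missing after maxTries polls
}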
I am working on a script that will fetch posts from a paginated REST API provided by a website managed by WordPress.
What I realized was that the WordPress REST API is paginated and has a cap of 100 objects per request.
In the code below, I am trying to fetch posts page by page, 20 posts per page, and in the end join all the fetched objects into one big array.
My problem is that the fetch fails with an HTTP 404 response, since the last request contains fewer than 20 posts.
I would like to adjust the variable named 'limitPerPage' if the fetch returns a 404, decrementing the variable until I get a 200 HTTP response.
My challenge is that I'm not experienced working with fetch promises.
Please see my current script below:
console.log('REST API is this: ' + apiUrl);

const getPosts = async function(pageNo = 1) {
    let limitPerPage = 20;
    let requestUrl = apiUrl + `?page=${pageNo}&per_page=${limitPerPage}`;
    let apiResults = await fetch(requestUrl)
        .then(function(response) {
            return response.json();
        })
        .catch(function(error) {
            console.log(error.status);
        });
    return apiResults;
}

const getEntirePostList = async function(pageNo = 1) {
    const results = await getPosts(pageNo);
    console.log('Retrieving data from API for page : ' + pageNo);
    if (results.length > 0) {
        return results.concat(await getEntirePostList(pageNo + 1));
    } else {
        return results;
    }
};

(async () => {
    const entireList = await getEntirePostList();
    console.log(entireList);
})();
I expect the code to decrement the variable 'limitPerPage' by 1, if the fetch returns a 404 HTTP response.
I am not necessarily asking for a final solution to my problem. I would appreciate a suggestion for another way to structure my code to get the result I need.
Thanks!
I think the following code should work. Always use a try...catch block inside async functions.
let entireList = [];
let finishEvent = new Event('finished');

document.addEventListener('finished', function (e) {
    console.log(entireList);
}, false);

const getPosts = function (pageNo = 1) {
    let limitPerPage = 20;
    let requestUrl = `${apiUrl}?page=${pageNo}&per_page=${limitPerPage}`;
    return fetch(requestUrl)
        .then(function (response) {
            return response.json();
        })
        .catch(function (error) {
            console.log(error.status);
            return false;
        });
}

const getEntirePostList = async function (pageNo = 1) {
    try {
        const results = await getPosts(pageNo);
        console.log('Retrieving data from API for page : ' + pageNo);
        if (results && (results.length > 0)) {
            entireList = entireList.concat(results); // concat returns a new array, so reassign
            await getEntirePostList(pageNo + 1);
        } else {
            document.dispatchEvent(finishEvent);
        }
        return;
    } catch (e) {
        console.log(e);
    }
};

getEntirePostList();
I managed to solve the issue myself with help from the suggestions above. Please find the solution here:
// Constant variable with the assigned value of a joined string containing the base URL structure of the REST API request.
const apiUrl = websiteUrl + '/wp-json/wp/v2/punkt/';

// Logging out the base URL of the REST API.
console.log('REST API is this: ' + apiUrl);

// Local variable with the default value of true assigned. The variable is used to control the paginated fetch of the API.
let keepfetchingPosts = true;

// Local variable that contains the limit of posts per request.
let limitPerPage = 20;

// Constant variable, which is assigned an async function as its value, so it will return a promise.
// The function takes one argument. The argument holds the value of the page number.
const getPosts = async function(pageNo = 1) {
    // Local variable assigned with the base URL string of the REST API request. Additional queries are added to the end of the string.
    let requestUrl = apiUrl + `?page=${pageNo}&per_page=${limitPerPage}`;
    // Logging out the REST API request
    console.log('URL is this: ' + requestUrl);
    // Logging out the argument 'pageNo'
    console.log('Retrieving data from API for page : ' + pageNo);
    // Local variable assigned from a fetch call that returns a promise. The request URL is used as the function argument.
    let apiResults = await fetch(requestUrl)
        // If the request succeeds, log and return the following to the local variable.
        .then(async function(response) {
            // Logging out the status code of the response.
            console.log('HTTP response is: ' + response.status);
            // Return the parsed JSON and the status code of the request.
            // (response.json() itself returns a promise, so it has to be awaited.)
            return {
                data: await response.json(),
                status: response.status
            };
        })
        // Catch the error and log it out within the console.
        .catch(function(error) {
            console.log('HTTP response is: ' + error.status);
        });
    // If the number of returned posts is less than the limitPerPage variable and the status code is 200, then...
    if (apiResults.data.length < limitPerPage && apiResults.status === 200) {
        // Set the boolean to false
        keepfetchingPosts = false;
        // Return the JSON of the successful response.
        return apiResults.data;
    } else if (apiResults.status === 200) {
        // If the status code is 200, then return the JSON of the successful response
        return apiResults.data;
    } else {
        // Otherwise, set the boolean to false
        keepfetchingPosts = false;
    }
}
// Defining a constant variable that holds an async function. An async function will always return a promise.
// The function takes one argument, which is set to 1 by default.
const getEntirePostList = async function(pageNo = 1) {
    // Try and catch statement used to handle the errors that might occur.
    try {
        // Constant variable which is set to the return value of getPosts(), i.e. the successful paginated response of the request.
        const results = await getPosts(pageNo);
        // Logging out a string including the length of the array.
        console.log('Current array contains ' + results.length + ' items...');
        // Conditional statement that checks if the length of the array named 'results' is less than the variable named limitPerPage, and whether the boolean is still true.
        // If the conditions are met, the code will join the arrays into one big array.
        if (results.length < limitPerPage && keepfetchingPosts === true) {
            // Logging out a string that indicates an attempt to combine the last array with the existing array.
            console.log('Combining last array!');
            // Return the combined array.
            return results;
        } else if (keepfetchingPosts === true) {
            // Logging out a string that indicates an attempt to combine the recently fetched array with the existing array.
            console.log('Combining arrays!');
            // Returning the new combined array, incrementing the pageNo variable by 1.
            return results.concat(await getEntirePostList(pageNo + 1));
        } else {
            // Logging out a string that indicates the script will stop fetching more posts from the REST API.
            console.log('Stop fetching posts and return results');
            // Returning the complete array.
            return results;
        }
    // Catch statement that takes the argument of the error that occurred.
    } catch (error) {
        // Logging out the error.
        console.log(error);
    }
};

(async () => {
    // Constant variable with the assigned value received from the function
    const entireList = await getEntirePostList();
    // Logging out the entire list of results collected from the REST API
    console.log(entireList);
})();
The code above returns a complete array of the JSON response from all paginated REST API calls.
You could use a while loop, and decrement limitPerPage if the status code isn't 200:
console.log('REST API is this: ' + apiUrl);

const getPosts = async pageNo => {
    let limitPerPage = 20;
    let res = { status: 0 };
    while (res.status !== 200 && limitPerPage > 0) {
        // Rebuild the URL on each pass so the decremented limitPerPage takes effect
        const requestUrl = apiUrl + `?page=${pageNo}&per_page=${limitPerPage}`;
        res = await fetch(requestUrl)
            .then(async r => ({ data: await r.json(), status: r.status }))
            .catch(({ status }) => {
                console.log(status);
                return { status: 0 }; // treat a failed request like a non-200 and retry
            });
        if (res.status !== 200) limitPerPage--;
    }
    return res.data;
};

const getEntirePostList = async (pageNo = 1) => {
    const results = await getPosts(pageNo);
    console.log('Retrieving data from API for page : ' + pageNo);
    return results.length > 0
        ? results.concat(await getEntirePostList(pageNo + 1))
        : results;
};

(async () => {
    const entireList = await getEntirePostList();
    console.log(entireList);
})();
I'm trying to run some code once all the jqXHR elements of an array are completed (have either succeeded or failed).
You can see the full code here: http://jsfiddle.net/Lkjcrdtz/4/
Basically I'm expecting the always hook from here:
$.when
    .apply(undefined, reqs)
    .always(function(data) {
        console.log('ALL ALWAYS', data);
    });
to run when all the requests that were piled up there have either succeeded or failed. Currently, you can observe in the console that ALL ALWAYS is logged earlier.
A simple solution for modern browsers would be to use the newer fetch() API along with Promise.all()
var makeReq = function(url, pos) {
    var finalUrl = url + pos;
    // intentionally make this request a failed one
    if (pos % 2 === 0) {
        finalUrl = "https://jsonplaceholder.typicode.com/423423rfvzdsv";
    }
    return fetch(finalUrl).then(function(resp) {
        console.log('Request for user #', pos);
        // if the request succeeds, return the data promise; otherwise return something
        // that can be used to filter out failures in the final results
        return resp.ok ? resp.json() : {error: true, status: resp.status, id: pos };
    });
};

// mock API
var theUrl = "https://jsonplaceholder.typicode.com/users/";
var reqs = [];
for (var i = 1; i <= 5; i++) {
    reqs.push(makeReq(theUrl, i));
}

Promise.all(reqs).then(function(results) {
    console.log('---- ALL DONE ----');
    // filter out good requests
    results.forEach(function(o) {
        if (o.error) {
            console.log(`No results for user #${o.id}`);
        } else {
            console.log(`User #${o.id} name = ${o.name}`);
        }
    });
});
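If converting failures into resolved values like this isn't an option, modern browsers also offer Promise.allSettled(), which waits for every promise to either fulfill or reject. A minimal sketch, reusing the reqs array from above:
Promise.allSettled(reqs).then(function(outcomes) {
    console.log('---- ALL SETTLED ----');
    outcomes.forEach(function(o) {
        if (o.status === 'fulfilled') {
            console.log('Result:', o.value);
        } else {
            console.log('Failed:', o.reason);
        }
    });
});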
I'm new to promises and I'm sure there's an answer/pattern out there but I just couldn't find one that was obvious enough to me to be the right one. I'm using node.js v4.2.4 and https://www.promisejs.org/
This should be pretty easy I think...I need to do multiple blocks of async in a specific order, and one of the middle blocks will be looping through an array of HTTP GETs.
//New Promise = asyncblock1 - FTP List, resolve the returned list array
//.then(asynchblock2(list)) - loop through list array and HTTP GET needed files
//.then(asynchblock3(list)) - update local log
I tried creating a new Promise, resolving it, passing the list to the .then, doing the GET loop, then the file update. I tried using a nested Promise.all inside asynchblock2, but it's actually going in reverse order, 3, 2, and 1, due to the timing of those events. Thanks for any help.
EDIT: Ok, this is the pattern that I'm using which works, I just need a GET loop in the middle one now.
var p = new Promise((resolve, reject) => {
    setTimeout(() => {
        console.log('2 sec');
        resolve(1);
    }, 2000);
}).then(() => {
    return new Promise((resolve) => {
        setTimeout(() => {
            console.log('1.5 sec');
            // instead of this section, here I'd like to do something like:
            // for(var i = 0; i < dynamicarray.length; i++){
            //     globalvar[i] = ftpclient.getfile(dynamicarray[i])
            // }
            // after this loop is done, resolve
            resolve(1);
        }, 1500);
    });
}).then(() => {
    return new Promise((resolve) => {
        setTimeout(() => {
            console.log('1 sec');
            resolve(1);
        }, 1000);
    });
});
EDIT: Here is the almost working code!
var pORecAlert = (function(){
    var pa;
    var newans = [];
    var anstodownload = [];
    var anfound = false; // anfound in log file
    var nexttab;
    var lastchar;
    var po;
    var fnar = [];
    var antext = '';
    //-->> This section works fine; it's just creating a JSON object from a local file
    try {
        console.log('trying');
        porfile = fs.readFileSync('an_record_files.json', 'utf8');
        if (porfile == null || porfile == '') {
            console.log('No data in log file - uploaded_files_data.json being initialized!');
            plogObj = [];
        }
        else {
            plogObj = JSON.parse(porfile);
        }
    }
    catch(jpfp) {
        console.log('Error parsing log file for PO Receiving Alert: ' + jpfp);
        return endPORecAlertProgram();
    };
    if ((typeof plogObj) === 'object') {
        console.log('an_record_files.json log file found and parsed for PO Receiving Alert!');
    }
    else {
        return mkError(ferror, 'pORecAlert');
    };
    //finish creating JSON Object
    pa = new Client();
    pa.connect(ftpoptions);
    console.log('FTP Connection for FTP Check Acknowledgement begun...');
    pa.on('greeting', function(msg) {
        console.log('FTP Received Greeting from Server for ftpCheckAcknowledgement: ' + msg);
    });
    pa.on('ready', function() {
        console.log('on ready');
        //START PROMISE LIST
        var listpromise = new Promise((reslp, rejlp) => {
            pa.list('/public_html/test/out', false, (cerr, clist) => {
                if (cerr) {
                    return mkError(ferror, 'pORecAlert');
                }
                else {
                    console.log('Resolving clist');
                    reslp(clist);
                }
            });
        });
        listpromise.then((reclist) => {
            ftpplist:
            for (var pcl = 0; pcl < reclist.length; pcl++) {
                console.log('reclist iteration: ' + pcl);
                console.log('checking name: ', reclist[pcl].name);
                if (reclist[pcl].name.substring(0, 2) !== 'AN') {
                    console.log('Not AN - skipping');
                    continue ftpplist;
                }
                else { //found an AN
                    for (var plc = 0; plc < plogObj.length; plc++) {
                        if (reclist[pcl].name === plogObj[plc].anname) {
                            //console.log('Found reclist[pcl].name in local log');
                            anfound = true;
                        };
                    };
                    if (anfound === false) {
                        console.log('Found AN file to download: ', reclist[pcl].name);
                        anstodownload.push(reclist[pcl].name);
                    };
                };
            };
            console.log('anstodownload array:');
            console.dir(anstodownload);
            return anstodownload;
        }).then((fnar) => {
            //for simplicity/transparency, here is the array being overwritten
            fnar = new Array('AN_17650_37411.699.txt', 'AN_17650_37411.700', 'AN_17650_37411.701', 'AN_17650_37411.702.txt', 'AN_17650_37411.801', 'AN_17650_37411.802.txt');
            // (Note: this map callback never returns a promise for the get, so Promise.all has nothing to wait on.)
            return Promise.all(fnar.map((gfname) => {
                var nsalertnames = [];
                console.log('Getting: ', gfname);
                debugger;
                pa.get(('/public_html/test/out/' + gfname), function(err, anstream) { //THE PROBLEM IS THAT THIS GET GETS TRIGGERED AN EXTRA TIME FOR EVERY OTHER FILE!!!
                    antext = '';
                    console.log('Get begun for: ', gfname);
                    debugger;
                    if (err) {
                        ferror.nsrest_trace = 'Error - could not download new AN file!';
                        ferror.details = err;
                        console.log('Error - could not download new AN file!');
                        console.log('************************* Exiting *************************');
                        logError(ferror, gfname);
                    }
                    else {
                        // anstream.on('data', (anchunk) => {
                        //     console.log('Receiving data for: ', gfname);
                        //     antext += anchunk;
                        // });
                        // anstream.on('end', () => {
                        //     console.log('GET end for: ', gfname);
                        //     //console.log('path to update - gfname ', gfname, '|| end text.');
                        //     fs.appendFileSync(path.resolve('test/from', gfname), antext);
                        //     console.log('Appended file');
                        //     return antext;
                        // }); //end end
                    };
                }); //get end
            })); //end Promise.all and map
        }).then((res99) => {
            // pa.end();
            // return Promise(() => {
            console.log('end all. res99: ', res99);
            //     //res4(1);
            //     return 1;
            // });
        });
    });
})();
-->> What happens here:
So I added the almost working code. What is happening is that for every other file, an additional Get request gets made (I don't know how it's being triggered), which fails with an "Unable to make data connection".
So for my iteration over this array of 6, there ends up being 9 Get requests. Element 1 gets requested (works and expected), then 2 (works and expected), then 2 again (fails and unexpected/don't know why it was triggered). Then 3 (works and expected), then 4 (works and expected), then 4 again (fails and unexpected) etc
What you need is Promise.all(). Sample code for your app:
...
}).then(() => {
    return Promise.all(arry.map(item => ftpclient.getFile(item)));
}).then((resultArray) => {
    ...
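Note this assumes ftpclient.getFile returns a promise. If your client is callback-based, like the pa.get call above, wrap it first; a minimal sketch (getFileAsPromise is a made-up name):
function getFileAsPromise(client, path) {
    return new Promise(function(resolve, reject) {
        client.get(path, function(err, stream) {
            if (err) return reject(err);
            resolve(stream);
        });
    });
}
Returning these promises from the .map callback gives Promise.all something to actually wait on.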
So thanks for the help (and the negative votes with no useful direction!)
I actually reached out to a good Node.js programmer and he said that there seemed to be a bug in the FTP module I was using, and even when trying to use a Bluebird .map, the quick succession of requests somehow kicked off an error. I ended up using promise-ftp, Bluebird, and promiseTaskQueue - the kicker was that I needed an interval. Without it, the quick succession of requests would end up causing a strange, illogical error in the ftp module.
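For reference, that combination might look roughly like this; a sketch only, assuming promise-ftp and Bluebird, reusing ftpoptions and anstodownload from the code above, with a Bluebird.delay standing in for the configured interval:
const Bluebird = require('bluebird');
const PromiseFtp = require('promise-ftp');

const ftp = new PromiseFtp();
ftp.connect(ftpoptions)
    .then(() => Bluebird.map(anstodownload,
        (name) => ftp.get('/public_html/test/out/' + name)
            .then((stream) => { /* consume the stream here */ })
            // space out the requests so quick succession doesn't trip the server
            .then(() => Bluebird.delay(500)),
        { concurrency: 1 })) // one transfer at a time
    .then(() => ftp.end());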
You need the async library. Use async.eachSeries in situations where you need to run asynchronous operations within a loop one after another, then execute a function when all of those are complete (async.each, shown below, is the parallel variant). There are many variations depending on the flow you want, but this library does it all.
https://github.com/caolan/async
async.each(theArrayToLoop, function(item, callback) {
    // Perform async operation on item here.
    doSomethingAsync(item).then(function() {
        callback();
    }).catch(callback); // pass failures along so the final callback receives them
}, function(err) {
    // All your async calls are finished; continue along here
});