Multiple paginated GET API calls in parallel/async in Node - javascript

I am making calls to the Bitbucket API to get all the files that are in a repo. I have reached a point where I can get the list of all the folders in the repo, make the first API call to all the root folders in parallel, and get the list of the first 1000 files for each folder.
But the problem is that the Bitbucket API can only give me 1000 files per folder at a time.
I need to append a query param &start=nextPageStart and make the call again, until it is null and isLastPage is true for each API call. How can I achieve that with the code below?
I get the nextPageStart from the first call to the API. See the API response below.
Below is the code that I have so far.
Any help or guidance is appreciated.
Response from the individual API that's called per folder:
{
  "values": [
    "/src/js/abc.js",
    "/src/js/efg.js",
    "/src/js/ffg.js",
    ...
  ],
  "size": 1000,
  "isLastPage": false,
  "start": 0,
  "limit": 1000,
  "nextPageStart": 1000
}
Function where I make asynchronous calls to get the list of files:
export function getFilesList() {
const foldersURL: any[] = [];
getFoldersFromRepo().then((response) => {
const values = response.values;
values.forEach((value: any) => {
//creating API URL for each folder in the repo
const URL = 'https://bitbucket.abc.com/stash/rest/api/latest/projects/'
+ value.project.key + '/repos/' + value.slug + '/files?limit=1000';
foldersURL.push(URL);
});
return foldersURL;
}).then((res) => {
// console.log('Calling all the URLS in parallel');
async.map(res, (link, callback) => {
const options = {
url: link,
auth: {
password: 'password',
username: 'username',
},
};
request(options, (error, response, body) => {
// TODO: How do I make the GET call again so that I can paginate and append the response to the body till the last page?
callback(error, body);
});
}, (err, results) => {
console.log('In err, results function');
if (err) {
return console.log(err);
}
//Consolidated results after all API calls.
console.log('results', results);
});
})
.catch((error) => error);
}

I was able to get it working by creating a function with a callback.
export function getFilesList() {
const foldersURL: any[] = [];
getFoldersFromRepo().then((response) => {
const values = response.values;
values.forEach((value: any) => {
//creating API URL for each folder in the repo
const URL = 'https://bitbucket.abc.com/stash/rest/api/latest/projects/'
+ value.project.key + '/repos/' + value.slug + '/files?limit=1000';
foldersURL.push(URL);
});
return foldersURL;
}).then((res) => {
// console.log('Calling all the URLS in parallel');
async.map(res, (link, callback) => {
const options = {
url: link,
auth: {
password: 'password',
username: 'username',
},
};
const myarray = [];
// This function will consolidate the responses up to the last page for each API URL.
consolidatePaginatedResponse(options, link, myarray, callback);
}, (err, results) => {
console.log('In err, results function');
if (err) {
return console.log(err);
}
//Consolidated results after all API calls.
console.log('results', results);
});
})
.catch((error) => error);
}
function consolidatePaginatedResponse(options, link, myarray, callback) {
request(options, (error, response, body) => {
const content = JSON.parse(body);
content.link = options.url;
myarray.push(content);
if (content.isLastPage === false) {
options.url = link + '&start=' + content.nextPageStart;
consolidatePaginatedResponse(options, link, myarray, callback);
} else {
// Final response after consolidation per API
callback(error, JSON.stringify(myarray));
}
});
}
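A small variation on the same idea, in case only the file paths are needed per folder: instead of collecting whole page objects, concatenate the values arrays across pages. This is just a sketch, assuming the same request options and response shape shown above:
function consolidateFilePaths(options, link, files, callback) {
  request(options, (error, response, body) => {
    if (error) {
      return callback(error);
    }
    const content = JSON.parse(body);
    // Keep only the file paths from this page.
    files.push(...content.values);
    if (content.isLastPage === false) {
      options.url = link + '&start=' + content.nextPageStart;
      consolidateFilePaths(options, link, files, callback);
    } else {
      callback(null, files);
    }
  });
}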

I think the best way is to wrap it in an old-school for loop (forEach doesn't work with async/await, since it's synchronous and it will cause all the requests to be spawned at the same time).
What I understood is that you do some sort of bootstrap query where you get the values array and then you should iterate over the pages. Here is some code; I didn't fully grasp the APIs, so I'll give a simplified (and hopefully readable) answer that you should be able to adapt:
export async function getFilesList() {
logger.info(`Fetching all the available values ...`);
await getFoldersFromRepo().then( async values => {
logger.info("... Folders values fetched.");
for (let i = 0; ; i++ ) {
logger.info( `Working on page ${i}`);
try {
// with await, the result is not the promise but the resolved value already
const pageResult: PageResult = await yourPagePromise(i);
if (pageResult.isLastPage) {
break;
}
} catch(err) {
console.error(`Error on page ${i}`, err);
break;
}
}
logger.info("Done.");
});
logger.info(`All finished!`);
}
The logic behind it is that getFoldersFromRepo() first returns a promise which yields the values, and then I sequentially iterate over all the available pages through the yourPagePromise function (which returns a promise). The async/await construct allows you to write more readable code, rather than having a waterfall of then() calls.
I'm not sure it respects your API's specs, but it's the logic you can use as a foundation! ^^
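For reference, here is a rough sketch of how yourPagePromise could be adapted to the Bitbucket files endpoint from the question. It is a hypothetical helper that wraps request in a Promise and pages using the start/isLastPage/nextPageStart fields shown above:
// Hypothetical helper: fetch one page of the /files endpoint.
function yourPagePromise(baseUrl, start) {
  return new Promise((resolve, reject) => {
    const options = {
      url: start ? baseUrl + '&start=' + start : baseUrl,
      auth: { username: 'username', password: 'password' },
    };
    request(options, (error, response, body) => {
      if (error) {
        return reject(error);
      }
      resolve(JSON.parse(body)); // { values, isLastPage, nextPageStart, ... }
    });
  });
}

// Usage sketch: collect file paths until the API reports the last page.
async function getAllFiles(baseUrl) {
  let files = [];
  let start;
  while (true) {
    const page = await yourPagePromise(baseUrl, start);
    files = files.concat(page.values);
    if (page.isLastPage) {
      break;
    }
    start = page.nextPageStart;
  }
  return files;
}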

Related

Javascript Return Value in graphql

I have an apollo-server setup in my backend. I want to use a POST request API inside one of my resolvers and then return the response from that API to my client, but the problem I am having is that the return statement runs before the API response gets returned. Here is my code sample below.
module.exports = {
Mutation: {
checkFace: async () => {
console.log("Checking.....");
let confidence;
var parameters = {
image_url1: "link to image 1",
image_url2: "link to image 2",
};
facepp.post("/compare", parameters, function (err, res) {
if (!err) {
confidence = res.confidence;
} else {
console.log(err);
}
});
return confidence
},
},
};
That's because facepp.post is probably running asynchronously. Your checkFace function is correctly declared as async, that's fine, but inside, where you call the POST, you should await the response and then return it:
confidence = await res.confidence;
Also, when using the function, make sure you await it, so call it with:
let someResponse = await Mutation.checkFace();
console.log(someResponse);
or
Mutation.checkFace().then(response => {
console.log(response);
});
Whichever you prefer depending on your situation.
https://nodejs.dev/learn/modern-asynchronous-javascript-with-async-and-await
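Since facepp.post in the question takes a callback rather than returning a promise, one way to make the resolver actually wait for it is to wrap the call in a Promise and await that. This is a rough sketch, assuming the (err, res) callback shape shown in the question:
module.exports = {
  Mutation: {
    checkFace: async () => {
      const parameters = {
        image_url1: "link to image 1",
        image_url2: "link to image 2",
      };
      // Wrap the callback-based API so the resolver can await it.
      const confidence = await new Promise((resolve, reject) => {
        facepp.post("/compare", parameters, (err, res) => {
          if (err) {
            return reject(err);
          }
          resolve(res.confidence);
        });
      });
      return confidence;
    },
  },
};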

Unable to run a loop to update Object Array in SQLite with React Native

So this has been troubling me for a while. I have an array of objects that I want to insert into my SQLite DB. Each of the objects has 5 parameters, and I have the SQL query in place to run it. I was using a loop to iterate through the array and populate each of the objects via db transactions to SQLite. However, the db tasks are asynchronous, which leads to the loop completing before the task runs and incorrect data being populated into the db. The while loop in the code below doesn't work, and I have tried the same thing with a for loop to no avail.
var i=0;
while(i<rawData.length){
console.log(rawData[i],i)
db.transaction(function (tx) {
console.log(rawData,i," YAY")
tx.executeSql(
'Update all_passwords SET title=?,userID=?,password=?,notes=?,category=? WHERE ID =? ',
[rawData[i].title,rawData[i].userID,rawData[i].password,rawData[i].notes,rawData[i].category,rawData[i].id],
(tx, results) => {
console.log("saved all data")
tx.executeSql(
"SELECT * FROM all_passwords ORDER BY id desc",
[],
function (tx, res) {
i++
console.log("Print Out Correct Data")
for(var i=0;i<res.rows.length;i++){
console.log(res.rows.item(i), i )
}
});
}
);
console.log("EXIT")
}
,
(error) => {
console.log(error);
}
);
}
I'm not familiar with using async tasks with hooks, but I believe that might be a potential solution. My intention is to populate the rawData array of objects into the SQLite DB in one go while I use a state to maintain the loading screen.
I did refer to the sources below but wasn't able to come up with anything concrete.
react native insertion of array values using react-native-sqlite-storage
https://medium.com/javascript-in-plain-english/how-to-use-async-function-in-react-hook-useeffect-typescript-js-6204a788a435
Thanks in advance!
I made a little write-up for you on how I would solve it. Read the comments in the code. If anything is unclear, feel free to ask!
const rawData = [
{ title: "title", userID: "userID", password: "password", notes: "notes", category: "category", id: "id" },
{ title: "title_1", userID: "userID_1", password: "password_1", notes: "notes_1", category: "category_1", id: "id_1" },
{ title: "title_2", userID: "userID_2", password: "password_2", notes: "notes_2", category: "category_2", id: "id_2" }
];
// You can mostly ignore this. It's just a mock for the db
const db = {
tx: {
// AFAIK if there is a transaction it's possible to execute multiple statements
executeSql: function(sql, params, success, error) {
// just for simulating an error
if (params.title === "title_2") {
error(new Error("Some sql error"));
} else {
console.log(sql, params.title);
success();
}
}
},
transaction: function(tx, error) {
// simulating async
setTimeout(() => {
return tx(this.tx);
}, parseInt(Math.random() * 1000));
}
}
// Lets make a class which handles our dataccess in an async way
class DataAccess {
// as transaction has callback functions it's wrapped in a promise
// on success the transaction is resolved
// if there is an error it will be thrown
transaction = () => {
return new Promise(resolve => {
db.transaction(tx => resolve(tx), error => {
throw error;
});
});
}
// the actual executeSql function which "hides" all the transaction stuff
// awaits a transaction and executes the sql on it
// if the execution was successful resolve
// if not throw the error
executeSql = async(sql, params) => {
const tx = await this.transaction();
tx.executeSql(sql, params, () => Promise.resolve(), error => {
throw error;
});
}
}
const dal = new DataAccess();
// every statement that can succeed is executed (each in its own transaction)
async function insert_with_execute() {
// promise all does not guarantee execution order
// but it is a possibility to await an array of promises (async functions)
await Promise.all(rawData.map(async rd => {
try {
await dal.executeSql("sql_execute", rd);
} catch (error) {
console.log(error.message);
}
}));
}
// no SQL takes effect because of the error, and all statements share the same transaction
async function insert_with_transaction() {
const tx = await dal.transaction();
for (let i = 0; i < rawData.length; i++) {
tx.executeSql("sql_transaction", rawData[i], () => console.log("success"), error => console.log(error.message));
}
}
async function test() {
await insert_with_execute();
console.log("---------------------------------")
await insert_with_transaction();
}
test();
Apparently the best approach is to use anonymous functions that create a separate execution scope for each value of i. This is a good example of how to do it:
Javascript SQL Insert Loop
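To make that concrete, here is a rough sketch of how the loop from the question could be rewritten so each transaction sees the right row: promisify the update and await it in a for...of loop. The helper names are hypothetical, and it assumes the db.transaction / tx.executeSql callback signatures used in the question:
// Hypothetical promise wrapper around the callback-based transaction API.
function updateRow(db, row) {
  return new Promise((resolve, reject) => {
    db.transaction(tx => {
      tx.executeSql(
        'UPDATE all_passwords SET title=?, userID=?, password=?, notes=?, category=? WHERE id=?',
        [row.title, row.userID, row.password, row.notes, row.category, row.id],
        () => resolve(),
        error => reject(error)
      );
    }, error => reject(error));
  });
}

// Each row is bound to the loop variable, so no shared index leaks into async callbacks.
async function saveAll(db, rawData) {
  for (const row of rawData) {
    await updateRow(db, row);
  }
  console.log('saved all data');
}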

Synchronously call a REST API in JavaScript

I am new to JavaScript and the npm world. I am trying to upload some data to my REST service via a REST POST call. I fetch the data from a CSV file. So far so good. For each fetched line I convert the data (for my needs) and call the REST API to upload it. Since I have many lines (approx. 700) the API gets called quite often in quick succession. After some calls (500 or so, I guess) I get a socket error:
events.js:136
throw er; // Unhandled 'error' event
^
Error: connect ECONNRESET 127.0.0.1:3000
at Object._errnoException (util.js:999:13)
at _exceptionWithHostPort (util.js:1020:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1207:14)
I guess this is because I call the REST API too often. What I don't understand is:
How should I make the calls synchronously in order to avoid so many connections?
Or shouldn't I?
What would be the proper solution in JS for this?
I have tried Promises and so on, but all of that didn't help; it just moved the issue a few function calls earlier...
This is my code:
readCsv()
function readCsv() {
var csvFile = csvFiles.pop()
if (csvFile) {
csv({ delimiter: ";" }).fromFile(csvFile).on('json', async (csvRow) => {
if (/.*\(NX\)|.*\(NI\)|.*\(NA\)|.*\(WE\)|.*\(RA\)|.*\(MX\)/.test(csvRow["Produkt"])) {
var data = await addCallLog(
csvRow["Datum"],
csvRow["Zeit"],
csvRow["Menge-Zeit"],
csvRow["Zielrufnummer"],
csvRow["Produkt"]);
}
}).on('done', (error) => {
//console.log('end')
readCsv()
})
} else {
}
}
function addCallLog(date, time, duration, number, product) {
return new Promise(resolve => {
args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" }
client.methods.addCallLog(args, (data, response) => {
// client.methods.getCallLog((data, response) => {
// console.log(data)
// })
//console.log("addCallLog resolve")
resolve(data)
})
})
}
As you can see, I had the same issue with reading more than one CSV file in parallel. I solved that by calling the readCsv function recursively, popping the next file once the previous file had been read.
You can't call things synchronously. But you can sequence the async REST calls, which is what I presume you mean.
A problem here is that await addCallLog() won't keep the next json events from being generated, so you will end up with a zillion requests in flight at the same time, and apparently you have so many that you run out of resources.
One way around that is to collect the rows you want into an array and then use a regular for loop to iterate that array; you can use await successfully in the for loop. Here's what that would look like:
readCsv()
function readCsv() {
var csvFile = csvFiles.pop()
if (csvFile) {
let rows = [];
csv({ delimiter: ";" }).fromFile(csvFile).on('json', (csvRow) => {
if (/.*\(NX\)|.*\(NI\)|.*\(NA\)|.*\(WE\)|.*\(RA\)|.*\(MX\)/.test(csvRow["Produkt"])) {
rows.push(csvRow);
}
}).on('done', async (error) => {
for (let csvRow of rows) {
var data = await addCallLog(
csvRow["Datum"],
csvRow["Zeit"],
csvRow["Menge-Zeit"],
csvRow["Zielrufnummer"],
csvRow["Produkt"]
);
}
readCsv();
})
} else {
}
}
function addCallLog(date, time, duration, number, product) {
return new Promise(resolve => {
args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" }
client.methods.addCallLog(args, (data, response) => {
// client.methods.getCallLog((data, response) => {
// console.log(data)
// })
//console.log("addCallLog resolve")
resolve(data)
})
})
}
Your code appears to be missing error handling. The client.methods.addCallLog() call needs a way to communicate back an error.
You probably also need an error event handler for the csv iterator.
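As an illustration of that point, here is a sketch of addCallLog with a reject path. It assumes the response object passed to the callback exposes an HTTP statusCode, which may differ for your REST client:
function addCallLog(date, time, duration, number, product) {
  return new Promise((resolve, reject) => {
    args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" };
    client.methods.addCallLog(args, (data, response) => {
      // Assumption: a non-2xx status code means the upload failed.
      if (!response || response.statusCode >= 300) {
        return reject(new Error('addCallLog failed with status ' + (response && response.statusCode)));
      }
      resolve(data);
    });
  });
}
The for...of loop above could then wrap the await in a try/catch to decide whether to skip the row or stop entirely.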
After filling the buffer in a previous function, I check that buffer for data and upload the entries one by one using the then callback of the promise:
var callLogBuffer = []
checkForUpload()
function checkForUpload() {
console.log("checkForUpload")
if (callLogBuffer.length > 0) {
addCallLog(callLogBuffer.pop()).then((data) => {
checkForUpload()
})
}
}
function addCallLog(callLog) {
return new Promise(resolve => {
args.data = { number: callLog.number, name: "", timestamp: getTimestamp(callLog.date, callLog.time), duration: getDuration(callLog.duration), type: "OUTGOING" }
client.methods.addCallLog(args, (data, response) => {
// client.methods.getCallLog((data, response) => {
// console.log(data)
// })
//console.log("addCallLog resolve")
resolve(data)
})
})
}
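The same drain loop can also be written with async/await, which avoids the recursion. A small sketch, assuming the same callLogBuffer and addCallLog as above:
async function uploadAll() {
  // Upload buffered call logs one at a time, waiting for each to finish.
  while (callLogBuffer.length > 0) {
    await addCallLog(callLogBuffer.pop());
  }
  console.log('all call logs uploaded');
}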

Nodejs, close mongo db connection via callback

I have a problem with callbacks, async thinking, etc.
Program flow:
Connect to MongoDB.
Create the URL - https://example.com + a part from locArray.
Send a GET request (for each).
Save the data to MongoDB.
Close the connection.
Problem:
If the connection is closed on the last line of jsonDataFromApi, "server instance pool was destroyed" appears before all the data from each request has been saved to the db.
So callback(db) was moved to another place - closeMongoDb -
but then an error appeared:
"Cannot read property 'close' of undefined".
I think the problem is with async, sending callbacks, etc.
const MongoClient = require('mongodb').MongoClient;
const Array = require('node-array');
const request = require('request');
var locationArray = [
'location1',
'location2',
'location3',
'location4'
];
var dataFromLocApi = (loc, callback) => {
request({
url: `https://example.com/${loc}`,
json: true
}, (error, response, body) => {
if (error){
callback('Error connection to url.');
} else{
callback(undefined, body.result);
}
});
};
var jsonDataFromApi = (urldb, callback) => {
MongoClient.connect(urldb, (err, db) => {
if (err) {
console.log('MongoDb connection error.');
}
console.log('MongoDb - connected.');
locationArray.forEachAsync(function(loc, index, arr) {
dataFromLocApi(loc, (errorMessage, results) => {
if (errorMessage) {
console.log(errorMessage);
} else {
console.log(JSON.stringify(results, undefined, 2));
db.collection('testCollection').insert(results, function(error, record) {
if (error)
throw error;
console.log("data saved");
});
}
});
}, function() {
console.log('complete');
});
callback(db);
});
}
var closeMongoDb = (urldb, callback) => {
jsonDataFromApi(urldb, (error, db) => {
if (error){
callback('Close connection - failure');
} else{
db.close();
console.log('MongoDb connections was closed.');
}
});
}
closeMongoDb('mongodb://127.0.0.1:27017/testDb', (err, db) => {
console.log('DONE');
} );
There is definitely a problem with asynchrony there.
You're not waiting for the items to be processed before calling the db.close().
Also, the functions that you have defined have unclear semantics. For example, the function closeMongoDb should basically close the DB and that's it, but here it does another job as well: it fetches the data and then closes the DB.
Also, I'd probably use the async module instead of node-array, as the latter seems to solve a different problem.
I've refactored the code. Please read my comments. I tried to make it as clear as possible.
const MongoClient = require("mongodb").MongoClient;
const request = require("request");
// We are going to use the async module
// This is a classical module to handle async behavior.
const async = require("async");
// As you can see this function accepts a callback
// If there is an error connecting to the DB
// it passes it up to the caller via callback(err)
// This is a general pattern
const connectToDb = function(urldb, callback) {
MongoClient.connect(urldb, (err, db) => {
if (err) {
console.log("MongoDb connection error.");
callback(err);
return;
}
// If everything is OK, pass the db as a data to the caller.
callback(undefined, db);
});
};
// This method fetches the data for a single location.
// The logic with errors/data is absolutely the same.
const getData = (loc, callback) => {
request(
{
url: `https://example.com/${loc}`,
json: true
},
(error, response, body) => {
if (error) {
callback("Error connection to url.");
return;
}
callback(undefined, body.result);
}
);
};
// This function goes over each location, pulls the data and saves it to the DB
// Last parameter is a callback, I called it allDataFetchedCb to make it clear
// that we are calling it after ALL the locations have been processed
// And everything is saved to the DB.
const saveDataFromLocations = function(locations, db, allDataFetchedCb) {
// First param here is an array of items
// The second one is an async function that we want to execute for each item
// When a single item is processed we call the callback. I named it 'locProcessedCB'
// So it's clear what happens.
// The third parameter is a callback that is going to be called when ALL the items
// have been processed.
async.each(
locations,
function(loc, locProcessedCb) {
getData(loc, (apiErr, results) => {
if (apiErr) {
console.log(apiErr);
// Well, we couldn't process the item, pass the error up.
locProcessedCb(apiErr);
return;
}
console.log(
`Obtained the data from the api: ${JSON.stringify(
results,
undefined,
2
)}`
);
db.collection("testCollection").insert(results, function(dbError) {
if (dbError) {
// Also an error, we couldn't process the item.
locProcessedCb(dbError);
return;
}
// Ok the item is processed without errors, after calling this
// So we tell the async.each function: ok, good, go on and process the next one.
locProcessedCb();
});
});
},
function(err) {
// We gonna get here after all the items have been processed or any error happened.
if (err) {
allDataFetchedCb(err);
return;
}
console.log("All the locations have been processed.");
// All good, passing the db object up.
allDataFetchedCb(undefined, db);
}
);
};
// This function is an entry point.
// It calls all the above functions one by one.
const getDataAndCloseDb = function(urldb, locations, callback) {
//Well, let's connect.
connectToDb(urldb, (err, db) => {
if (err) {
callback(err);
return;
}
// Now let's get everything.
saveDataFromLocations(locations, db, (err, db) => {
if (err) {
callback(err);
return;
}
// If somehow there is no db object, or no close method we wanna know about it.
if (!db || !db.close) {
callback(new Error("Unable to close the DB Connection."));
return;
}
// Closing the DB.
db.close(err => {
// If there's no error err === undefined or null
// So this call is equal to callback(undefined);
callback(err);
});
});
});
};
const locationArray = ["location1", "location2", "location3", "location4"];
// Finally calling the function, passing all needed data inside.
getDataAndCloseDb("mongodb://127.0.0.1:27017/testDb", locationArray, err => {
if (err) {
console.error(
`Unable to fetch the data due to the following reason: ${err}`
);
return;
}
console.log("Done successfully.");
});
I didn't run this code as I don't have the URL etc. So please try it yourself and debug if needed.
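For comparison, here is a much shorter sketch of the same flow written with async/await. It assumes the same driver behaviour as the question's code (MongoClient.connect resolving to a db object when no callback is passed, and insert/close returning promises without callbacks), and it processes the locations sequentially for simplicity:
// Promise wrapper around the callback-based request API.
const dataFromLocApi = loc => new Promise((resolve, reject) => {
  request({ url: `https://example.com/${loc}`, json: true }, (error, response, body) => {
    if (error) return reject(new Error('Error connecting to url.'));
    resolve(body.result);
  });
});

async function run(urldb, locations) {
  const db = await MongoClient.connect(urldb);
  try {
    for (const loc of locations) {
      const results = await dataFromLocApi(loc);
      await db.collection('testCollection').insert(results);
      console.log('data saved for', loc);
    }
  } finally {
    await db.close();
    console.log('MongoDb connection was closed.');
  }
}

run('mongodb://127.0.0.1:27017/testDb', locationArray)
  .then(() => console.log('DONE'))
  .catch(err => console.error(err));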

chaining promises to force async

I'm using promises to fetch large albums of images and then pull random samples from those albums. I have managed to request all the albums and then push the links to images into an array of objects.
Now I want to print out that array, but only after I've actually filled it. Whenever I add a .then() on the end, it prints out only the initialized empty array.
What can I do to force the async order and only print the array once it's filled? (I'm printing it out at the bottom.)
let findImagesCatalyst = new Promise(function(resolve, reject) {
//url options
const options = {
url: 'https://api.imgur.com/3/gallery/hot/time/',
headers: {
"Authorization": "Client-ID xxxx"
}
};
//inital request
request(options, function(err, res, body) {
//parse the response
body = JSON.parse(body)
//access the data in the response
const responseData = body.data;
//filter only those with image counts great than 50
const largeAlbums = responseData.filter(findDumps)
//test to see if a dump is present
if (largeAlbums.length > 0) {
largeAlbums.forEach(function(i) {})
resolve(largeAlbums)
} else {
reject()
}
})
})
//if successful in finding a dump, then go through them and find their albumIds
.then(function(largeAlbums) {
let dumpIds = largeAlbums.map(index => index.id)
return dumpIds;
})
//with the album/dump ids, get each of them with a new request
.then(function(dumpIds) {
//for each of the dumpIds create the needed url using ES6 and then request it.
dumpIds.forEach(function(i) {
const albumUrlOptions = {
url: `https://api.imgur.com/3/album/${i}/images`,
headers: {
"Authorization": "Client-ID xxxx"
}
}
//make a request to each of the albums/dumps
request(albumUrlOptions, function(err, res, body) {
body = JSON.parse(body)
const responseData = body.data
//pick one sample image from the album/dump
let sampleImage = responseData[randomSelector(responseData.length)].link;
dumps.push({
"dump": i,
'sample': sampleImage
})
})
})
return dumps;
})
.then(function(dumps) {
console.log(dumps)
})
Your second .then should return Promise.all of the (promisified) requests:
.then(function(dumpIds) {
//for each of the dumpIds create the needed url using ES6 and then request it.
return Promise.all(dumpIds.map(function(i) {
const albumUrlOptions = {
url: `https://api.imgur.com/3/album/${i}/images`,
headers: {
"Authorization": "Client-ID xxxx"
}
};
return new Promise((resolve, reject) => {
//make a request to each of the albums/dumps
request(albumUrlOptions, function(err, res, body) {
body = JSON.parse(body)
const responseData = body.data
//pick one sample image from the album/dump
let sampleImage = responseData[randomSelector(responseData.length)].link;
resolve({
"dump": i,
'sample': sampleImage
});
});
});
}))
})
As you are using Node.js, which has a very good ES2015+ implementation, you can simplify (in my opinion) your code by first creating a "promisified" version of request:
let requestP = (options) => new Promise((resolve, reject) => {
request(options, (err, res, body) => {
if (err) {
return reject(err);
}
resolve({res, body});
});
});
The rest of the code could then be rewritten as follows:
const options = {
url: 'https://api.imgur.com/3/gallery/hot/time/',
headers: {
"Authorization": "Client-ID xxxx"
}
};
//inital request
let findImagesCatalyst = requestP(options)
.then(({res, body}) => {
//parse the response
body = JSON.parse(body)
//access the data in the response
const responseData = body.data;
//filter only those with image counts great than 50
const largeAlbums = responseData.filter(findDumps)
//test to see if a dump is present
if (largeAlbums.length > 0) {
largeAlbums.forEach(function(i) {})
return(largeAlbums)
} else {
return Promise.reject();
}
})
//if successful in finding a dump, then go through them and find their albumIds
.then((largeAlbums) => largeAlbums.map(index => index.id))
//with the album/dump ids, get each of them with a new request
.then((dumpIds) =>
//for each of the dumpIds create the needed url using ES6 and then request it.
Promise.all(dumpIds.map((i) => {
const albumUrlOptions = {
url: `https://api.imgur.com/3/album/${i}/images`,
headers: {
"Authorization": "Client-ID xxxx"
}
};
return requestP(albumUrlOptions)
.then(({res, body}) => {
body = JSON.parse(body)
const responseData = body.data
//pick one sample image from the album/dump
let sampleImage = responseData[randomSelector(responseData.length)].link;
return({
"dump": i,
'sample': sampleImage
});
});
}))
)
.then(function(dumps) {
console.log(dumps)
});
So, you have a few building blocks here:
A request for imgur albums, reflected in the options object.
findDumps — a simple function that you filter the list of albums against.
A function that applies the preceding two and returns an array of large albums. It's an asynchronous function, so it likely employs Promise.
A function that takes every item of the array of large albums and receives a single image. It's an asynchronous function, so, again, a Promise.
You want to wait until all the single images have been received.
Finally, you expect an array of objects of two properties: "dump" and "sample".
Let's try to construct an example.
const findImagesCatalyst = new Promise((resolveImagesCatalyst, rejectImagesCatalyst) => {
const options = {
url: 'https://api.imgur.com/3/gallery/hot/time/',
headers: {
Authorization: 'Client-ID xxxx'
}
};
request(options, (err, res, body) => {
//access the data in the response
const responseData = JSON.parse(body).data;
//filter only those with image counts great than 50
const largeAlbums = responseData.filter(findDumps);
//test to see if a dump is present
if (largeAlbums.length > 0) {
// /!\ The trickiest part here: we won't resolve this promise until an "inner Promise" has been resolved.
// Note that next line declares a new function to resolve inner Promise, resolveLargeAlbum. Now we have two functions:
// - resolveImagesCatalyst - to resolve the main Promise, and
// - resolveLargeAlbum — to resolve every image request, and there can be many of them.
const imagesPromises = largeAlbums.map(largeAlbum => new Promise((resolveLargeAlbum, rejectLargeAlbum) => {
// take id from every large album
const dumpId = largeAlbum.id;
// declare options for inner request
const albumUrlOptions = {
url: `https://api.imgur.com/3/album/${dumpId}/images`,
headers: {
"Authorization": "Client-ID xxxx"
}
};
request(albumUrlOptions, (err, res, body) => {
const responseData = JSON.parse(body).data;
//pick one sample image from the album/dump
const sampleImage = responseData[randomSelector(responseData.length)].link;
if (sampleImage) {
// A-HA!
// It's inner Promise's resolve function. For N albums, there will be N resolveLargeAlbum calls. Just a few lines below, we're waiting for all of them to get resolved.
resolveLargeAlbum({
dump: dumpId,
sample: sampleImage
});
} else {
rejectLargeAlbum('Sorry, could not receive sample image:', dumpId, responseData);
}
});
}));
// Now we have imagePromises, an array of Promises. When you have an array of Promises, you can use Promise.all to wait until all of them are resolved:
Promise.all(imagesPromises).then(responses => {
// Take a look at responses: it has to be an array of objects of two properties: dump and sample.
// Also, note that we finally use outer Promise's resolve function, resolveImagesCatalyst.
resolveImagesCatalyst(responses);
}).catch(errors => {
rejectImagesCatalyst(errors);
});
} else {
rejectImagesCatalyst('Sorry, nope.');
}
});
});
That's a huge one. What you really need to see is that
With Promise.all, you can wait for a collection of Promises to get resolved, and the "then" part won't get executed until all of them have been resolved.
You can put a Promise into a Promise, and resolve outer Promise when inner Promise gets resolved.
The code is really hard to read, because the order of execution is not top-to-bottom. If you use Webpack with Babel, you might want to take a look at async/await. With async/await, the code looks synchronous: you read it from top to bottom, and that's exactly the order in which the results of its execution appear, but under the hood it's all asynchronous. Pretty neat ES2017 feature, imho.
Make sure there is no existing Node module that handles your imgur searching business. Search on npms.io.
If there is no existing module, find one that is close and expand it for your use case (hot images).
If you really can't find a module for imgur to expand, then make your own. All of the imgur request stuff goes in its own module (and own file).
Make sure that module supports promises.
Your code should look something like this:
import {getHotAlbums, getAlbumImages, config} from 'imgur';
config({clientID: 'BLAHXXXX'});
async function getHotImages() {
let hotAlbums = await getHotAlbums();
hotAlbums = hotAlbums.filter(a => a.imageCount > 50);
const sampleImages = [];
let albumIDs = hotAlbums.map(a => a.id);
for (let albumID of albumIDs) {
const images = await getAlbumImages(albumID);
const randomImageNum = Math.floor(Math.random()*images.length);
sampleImages.push(images[randomImageNum].link);
}
return sampleImages;
}
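A hypothetical call site for the sketch above (the imgur module API here is assumed, as the answer itself notes, so adjust it to whatever module you end up using):
getHotImages()
  .then(sampleImages => console.log(sampleImages))
  .catch(err => console.error(err));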
