NodeJS - Map forEach Async/Await - javascript

I am working with Node. I have an array of IDs, and I want to filter them based on the response of a call to another API. For each ID I need to know whether or not it passes the filter I am applying based on that API.
I am using async/await. I found that the best approach is Promise.all, but it is not working as expected. What am I doing wrong?
static async processCSGOUsers (groupId, parsedData) {
  let steamIdsArr = [];
  const usersSteamIds = parsedData.memberList.members.steamID64;
  const filteredUsers = await Promise.all(usersSteamIds.map(async (userId) => {
    return csGoBackpack(userId).then((response) => {
      return response.value > 40;
    })
    .catch((err) => {
      return err;
    });
  }));
  Object.keys(usersSteamIds).forEach(key => {
    steamIdsArr.push({
      steam_group_id_64: groupId,
      steam_id_64: usersSteamIds[key]
    });
  });
  return UsersDao.saveUsers(steamIdsArr);
}
Apart from that, something weird is happening. While debugging this, the parameters coming into the method look fine, but as soon as I reach the Promise.all line I get a "reference error" on each parameter. I do not know why.

Wait for all responses, then filter based on the results:
const responses = await Promise.all(usersSteamIds.map(csGoBackpack));
// responses now contains the array of responses for each user ID
// filter the user IDs based on the corresponding result
const filteredUsers = usersSteamIds.filter((_, index) => responses[index].value > 40);
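Folding that back into the original method, a minimal sketch could look like the following. It assumes, as in the question, that csGoBackpack resolves with an object exposing a numeric value field, and that only the users passing the filter should be saved:

static async processCSGOUsers (groupId, parsedData) {
  const usersSteamIds = parsedData.memberList.members.steamID64;
  // Fire all backpack requests in parallel and wait for every response
  const responses = await Promise.all(usersSteamIds.map(csGoBackpack));
  // Keep only the IDs whose backpack value exceeds the threshold
  const filteredUsers = usersSteamIds.filter((_, index) => responses[index].value > 40);
  // Build the rows to persist from the filtered IDs only
  const steamIdsArr = filteredUsers.map(steamId => ({
    steam_group_id_64: groupId,
    steam_id_64: steamId
  }));
  return UsersDao.saveUsers(steamIdsArr);
}

Note that the original version pushed every ID into steamIdsArr and never used filteredUsers, which is why the filter appeared to have no effect.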

If you don't mind using a module, you can do this kind of stuff in a straightforward way using these utilities

Related

Perform fetch request within a Firestore transaction: receiving "Cannot modify a WriteBatch that has been committed"

I'm trying to perform a fetch request within a transaction but when the code executes I receive the following error.
Error: Cannot modify a WriteBatch that has been committed.
The steps the function is performing are the following:
Compute document references (taken from an external source)
Query the documents available in Firestore
Verify if document exists
Fetch for further details (lazy loading mechanism)
Start populating first level collection
Start populating second level collection
Below is the code I'm using.
await firestore.runTransaction(async (transaction) => {
  // 1. Compute document references
  const docRefs = computeDocRefs(colName, itemsDict);
  // 2. Query the documents available in Firestore
  const snapshots = await transaction.getAll(...docRefs);
  snapshots.forEach(async (snapshot) => {
    // 3. Verify if document exists
    if (!snapshot.exists) {
      console.log(snapshot.id + " does not exists");
      const item = itemsDict[snapshot.id];
      if (item) {
        // 4. Fetch for further details
        const response = await fetchData(item.detailUrl);
        const detailItemsDict = prepareDetailPageData(response);
        // 5. Start populating first level collection
        transaction.set(snapshot.ref, {
          index: item.index,
          detailUrl: item.detailUrl,
          title: item.title,
        });
        // 6. Start populating second level collection
        const subColRef = colRef.doc(snapshot.id).collection(subColName);
        detailItemsDict.detailItems.forEach((detailItem) => {
          const subColDocRef = subColRef.doc();
          transaction.set(subColDocRef, {
            title: detailItem.title,
            pdfUrl: detailItem.pdfUrl,
          });
        });
      }
    } else {
      console.log(snapshot.id + " exists");
    }
  });
});
computeDocRefs is described below
function computeDocRefs(colName, itemsDict) {
  const identifiers = Object.keys(itemsDict);
  const docRefs = identifiers.map((identifier) => {
    const docId = `${colName}/${identifier}`;
    return firestore.doc(docId);
  });
  return docRefs;
}
while fetchData uses axios under the hood
async function fetchData(url) {
  const response = await axios(url);
  if (response.status !== 200) {
    throw new Error('Fetched data failed!');
  }
  return response;
}
prepareMainPageData and prepareDetailPageData are functions that prepare the data by normalizing it.
If I comment out await fetchData(item.detailUrl), the first level collection and all the documents associated with it are stored correctly.
With await fetchData(item.detailUrl) in place, the error happens right after the comment // 5. Start populating first level collection.
The order of the operations is important, since I do not want to make the second call if it is not necessary.
Are you able to guide me towards the correct solution?
The problem is due to the fact that forEach and async/await do not work well together (see, for example, Using async/await with a forEach loop): forEach does not wait for its async callbacks, so the transaction callback resolves and the transaction commits before the awaited fetchData calls return, and the later transaction.set calls then hit a WriteBatch that has already been committed.
I've now completely changed the approach I'm following, and it works smoothly.
The code now looks like the following:
// Read transaction to retrieve the items that are not yet available in Firestore
const itemsToFetch = await readItemsToFetch(itemsDict, colName);
// Merge the items previously retrieved to grab additional details through fetch network calls
const fetchedItems = await aggregateItemsToFetch(itemsToFetch);
// Write transaction (Batched Write) to save items into Firestore
const result = await writeFetchedItems(fetchedItems, colName, subColName);
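For completeness, a hedged sketch of what the third step could look like as a batched write; the shape of fetchedItems (an array of { id, index, detailUrl, title, detailItems } objects) is an assumption, since the question does not show it:

async function writeFetchedItems(fetchedItems, colName, subColName) {
  // No reads are needed at this point, so a plain batched write is enough
  const batch = firestore.batch();
  fetchedItems.forEach((item) => {
    const docRef = firestore.collection(colName).doc(item.id);
    // First level collection
    batch.set(docRef, {
      index: item.index,
      detailUrl: item.detailUrl,
      title: item.title,
    });
    // Second level collection
    item.detailItems.forEach((detailItem) => {
      const subColDocRef = docRef.collection(subColName).doc();
      batch.set(subColDocRef, {
        title: detailItem.title,
        pdfUrl: detailItem.pdfUrl,
      });
    });
  });
  return batch.commit();
}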
A big thanks goes to Doug Stevenson and Renaud Tarnec.

How to iterate a JSON array and add data from an async arrow function?

I'm new to the MEAN stack and also to JS. What I'm trying to accomplish is to adapt the response that I get from the DB by adding another field to it.
I have a mongoose method that gives me all the Courses that exist, and I want to add to that information all the Inscriptions for each one. So I'm trying this:
exports.getAllCourses = async (req, res) => {
  try {
    const rawCourses = await Course.find();
    const courses = await courseAdapter.apply(rawCourses);
    await res.json({courses});
  } catch (error) {
    console.log(error);
    res.status(500).send("Ocurrio un error imprevisto :/");
  }
};
My courseAdapter
exports.apply = (courses) => {
  return courses.map(async course => (
    {
      ...course._doc,
      number: await courseUtils.getNumberOfInscriptions(course._doc._id)
    }
  ));
}
And my courseUtils:
exports.getNumberOfInscriptions = async courseId => {
  return await CourseInscription.countDocuments({courseId: courseId});
}
I think my problem is with the async-await function, because with this code I get this:
{"courses":[
{},
{}
]}
or changing some stuff i get this:
{"courses":[
{"courseInfo":{...},
"number":{}
},
{"courseInfo":{...},
"number":{}
}
]}
But never the number of inscriptions in the response. By the way, I use getNumberOfInscriptions() in another part of my code for a validation and it works there.
After trying a lot of things I got to this:
I changed the way I process the data from the DB in the apply function and treat it like an array.
exports.apply = async (courses) => {
  var response = [];
  for (let c of courses) {
    var doc = c._doc;
    var tmp = [{course: doc, inscriptionNumber: await courseUtils.getNumberOfInscriptions(c._doc._id)}];
    response = response.concat(tmp);
  }
  return response;
}
I don't think this is a very good way to accomplish my goal, but it works. If I find something better, in performance or cleanliness, I will post it.
Anyway, I still don't know what I was doing wrong in my previous map function when calling my async-await function. If anybody knows, please let me know.
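For the record, the reason the map version returned empty objects is that an async callback always returns a promise, so apply was returning an array of pending promises, and a promise serializes to {} in the JSON response. Wrapping the mapped array in Promise.all (which getAllCourses already awaits) is the usual fix. A minimal sketch, keeping the original adapter's shape:

exports.apply = (courses) => {
  // map produces an array of promises; Promise.all waits for all of them
  return Promise.all(courses.map(async course => ({
    ...course._doc,
    number: await courseUtils.getNumberOfInscriptions(course._doc._id)
  })));
}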

Run Mongo find Synchronously

I have a problem where I've got 20k+ rows in a CSV file and I'm trying to update them based on documents with a matching field in a Mongo DB that contains 350k docs.
The trick is that I need to perform some logic on the matches and then re-update the CSV.
I'm using PapaParse to parse/unparse the CSV file.
Doing something like this works to get all my matches:
const file = fs.createReadStream('INFO.csv');
Papa.parse(file, {
  header: true,
  complete: function(row) {
    getMatchesAndSave(row.data.map(el => { return el.fieldToMatchOn }));
  }
});
function getMatchesAndSave(fields) {
  Order.find({fieldToMatchOn: { $in: fields}}, (err, results) => {
    if (err) return console.error(err);
    console.log(results);
  });
}
That gets me matches fast. However, I can't really merge my data back into the CSV because the CSV has a unique key column that Mongo knows nothing about.
So all the data really depends on what's in the CSV.
Therefore I thought of doing something like this:
const jsonToCSV = [];
for (let row of csvRows) {
  db.Collection.find({fieldToMatchOn: row.fieldToMatchOn}, (err, result) => {
    // Add extra data to row based on result
    row.foo = result.foo;
    // push to final output
    jsonToCSV.push(row);
  });
}
papa.unparse(jsonToCSV);
// save csv to file
The issue with the above implementation (as terribly inefficient as it may seem) is that the find calls are asynchronous, so nothing has been pushed to jsonToCSV by the time unparse runs.
Any tips? Solving this with $in would be ideal. Is there any way to access the current element inside the $in (i.e. the iterator)? That way I could do my processing on it.
You can try async/await to iterate the csvRows array, like this:
const search = async () => {
  const jsonToCSV = await Promise.all(csvRows.map(async row => {
    /* findOne returns a promise, so we can await it, but await can only be
       used inside an async function. The map callback therefore returns a
       promise per row, which is why the resulting array is wrapped in
       Promise.all. */
    try {
      // findOne resolves with the single matching document rather than an array
      const result = await db.Collection.findOne({ fieldToMatchOn: row.fieldToMatchOn });
      row.foo = result.foo;
      return row;
    } catch (e) {
      // do something if there is an error
    }
  }));
  papa.unparse(jsonToCSV);
}
// call the search function
search();
// call search function
search();
Check https://flaviocopes.com/javascript-async-await-array-map for a better understanding.
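On the $in idea from the question: 20k+ individual queries will be slow, so an alternative worth considering is a single $in query followed by an in-memory join keyed on fieldToMatchOn. A sketch, assuming Order is the Mongoose model from the first snippet and foo stands in for whatever fields you actually need:

async function mergeCsvWithOrders(csvRows) {
  // One query for all the keys present in the CSV
  const fields = csvRows.map(row => row.fieldToMatchOn);
  const docs = await Order.find({ fieldToMatchOn: { $in: fields } });
  // Index the results by the matching field for O(1) lookups
  const byField = new Map(docs.map(doc => [doc.fieldToMatchOn, doc]));
  // Enrich each CSV row with data from its matching document, if any
  const jsonToCSV = csvRows.map(row => {
    const match = byField.get(row.fieldToMatchOn);
    if (match) {
      row.foo = match.foo;
    }
    return row;
  });
  return papa.unparse(jsonToCSV);
}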

returning mapped array from chained promises

function createDataSet(username, region, champion, amount) {
  var dataArray = []; //what I want to return, if possible with .map()
  return getUserId(username, region) //required for getUserMatchlist()
    .then(userId => {
      getUserMatchlist(userId, region, champion, amount); //returns an array of objects
    })
    .then(matchlist => {
      matchlist.forEach(match => {
        getMatchDetails(match.gameId.toString(), region) //uses the Id from the matchlist objects to make another api request for each object
          .then(res => {
            dataArray.push(res); //every res is also an object fetched individually from the api.
            // I would like to return an array with all the res objects in the order they appear in
          })
          .catch(err => console.log(err));
      });
    });
}
I'm trying to send data that I fetched from multiple APIs to my frontend. Fetching the data isn't a problem; however, using .map() didn't work, and from what I've read it doesn't play well with promises. What is the best way for me to return that object? (The function will be executed when a GET request is received, and dataArray will be sent back.)
Promise.all(listOfPromises) will resolve to an array containing the resolved result of each promise in listOfPromises.
To apply that to your code, you would want something like (pseudocode):
Promise.all(matchlist.map(match => getMatchDetails(...)))
  .then(listOfMatchDetails => {
    // do stuff with your list!
  });
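Applied to createDataSet from the question, a minimal sketch could look like this; it also returns the matchlist from the first .then (the original dropped that return), and it assumes getUserMatchlist resolves with the array of match objects:

function createDataSet(username, region, champion, amount) {
  return getUserId(username, region)
    .then(userId => getUserMatchlist(userId, region, champion, amount))
    .then(matchlist =>
      // One getMatchDetails call per match; Promise.all keeps the results
      // in the same order as the matchlist
      Promise.all(matchlist.map(match => getMatchDetails(match.gameId.toString(), region)))
    )
    .catch(err => console.log(err));
}

The resolved value is the array of match detail objects, so the route handler can await createDataSet(...) and send it back.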

in React, returning a state as an array based on the value of an object in the array

I'm using React to build out an app that returns the weather using OpenWeatherAPI Forecast By Hour.
I am successfully pulling my data from OWA and putting it into my state with the following code:
state = {
  loadedPageForecasts: []
}
componentDidMount () {
  //console.log(this.props);
  if (this.props.match.params.id) {
    axios.get('http://api.openweathermap.org/data/2.5/forecast?q=London,uk&appid=MYAPIKEY')
      .then(response => {
        //const forecastData = response.data.list.;
        this.setState({loadedPageForecasts: response.data.list});
      });
  }
}
console.log(loadedPageForecasts) returns an array with all the data I need, so I'm good there.
Now I need to filter this array into a new array based on the date. To simplify things, I have just manually selected one of the dates.
I use the code below:
const paramsid = this.state.loadedPageForecasts;
console.log(paramsid);
let result = paramsid.filter(obj => {
  return obj.type.includes("2018-06-28");
});
console.log(result);
This returns undefined. So I tried the following:
const paramsid = this.state.loadedPageForecasts;
console.log(paramsid);
let result = paramsid.filter(obj => {
  return paramsid.includes("2018-06-28");
});
console.log(result);
This just returns an empty array.
I think I'm calling the object wrong, but I thought "includes" would scan all objects in the array for this.
I also tried:
let result = paramsid.filter(obj => {
  return this.state.dt_txt.includes("2018-06-28");
});
and
let result = paramsid.filter(obj => {
  return this.props.dt_txt.includes("2018-06-28");
});
These attempts were based on the structure shown when I console.log(paramsid).
I'm still pretty new to React and I'm trying to get as much practice as I can. Any help is appreciated!
You want to check if the obj has a dt_txt that includes the date you are looking for in the function given to filter:
let result = paramsid.filter(obj => {
  return obj.dt_txt.includes("2018-06-28");
});
Also, make sure you are running this after setState has completed (in a setState callback). setState is asynchronous, so if you filter immediately after calling it, loadedPageForecasts may still be the initial empty array.
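A hedged sketch of that, filtering inside the setState callback (filterForecasts is just an illustrative name, not something from the question):

componentDidMount () {
  if (this.props.match.params.id) {
    axios.get('http://api.openweathermap.org/data/2.5/forecast?q=London,uk&appid=MYAPIKEY')
      .then(response => {
        // The second argument runs only after the state update has been applied
        this.setState({loadedPageForecasts: response.data.list}, () => {
          this.filterForecasts("2018-06-28");
        });
      });
  }
}

filterForecasts (date) {
  const result = this.state.loadedPageForecasts.filter(obj => obj.dt_txt.includes(date));
  console.log(result);
  return result;
}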
