Nested Map loop async/await vs nested for loop async/await - javascript

I have a nested map loop with async/await where I use two Promise.all statements.
The data is formatted as one large array which may contain multiple arrays of objects.
The inner loop maps through the objects in an array, and the outer loop maps through the arrays within the main array.
I'm using map since it makes it easy to keep this running in parallel rather than sequentially. I'm not sure if it is worth keeping it parallel or if there is a better way of doing this (maybe a forEach loop).
Here is the code (simplified/summarized) that I am using currently.
const outerPromise = information.map(async order => {
  const innerPromise = order.moreInformation.map(async singleOrder => {
    if (something) {
      const response = await axios({ ... });
      return response.specificDataField;
    }
  });
  const orders = await Promise.all(innerPromise);
  return orders.filter((obj) => obj);
});
const orders = await Promise.all(outerPromise);
return orders;
Sorry if the formatting is slightly off; I couldn't get the indentation to format properly.
Any help will be greatly appreciated. Thank you!
P.S. This is being written in JS/Node.js (Express.js).
Edit:
I don't think the issue is with the filtering afterwards, since it is the objects themselves that may come back null; I would need to filter after getting innerPromise back (to see which are null).
That said, it is really the two map statements that make me feel as if there is a better way of doing this.
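For comparison, the same parallel behaviour can be written with a single Promise.all if the per-order grouping of the results is not actually needed; this is only a minimal sketch (not the original code), and it assumes the something condition can be checked per singleOrder:
const requests = information.flatMap(order =>
  order.moreInformation
    .filter(singleOrder => something) // hypothetical: the original condition, evaluated per singleOrder
    .map(singleOrder => axios({ ... }).then(response => response.specificDataField))
);
// one flat array with every specificDataField across all orders
const flatOrders = await Promise.all(requests);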

Should spread operator behave differently in asynchronous conditions?

I am trying to replace an instance of hardcoding in a web application with two very similar API calls, as I don't have time right now to write a whole new procedure and endpoint to handle the same functionality. I am able to get the data from the two individual calls fine with Promise.all. But the template for this page (it is using Handlebars) expects a single object to loop through and display the options in a dropdown.
I have been trying to use the spread operator to bring the contents of the results together with no duplicates, and it works as expected if I mock it up synchronously in the console, but in practice the results get nested into another object as if it were an array.
This is the code actually running:
Promise.all([
  this.isUsers = GetUsersByRole("Inside Sales", app.user.id),
  this.asmUsers = GetUsersByRole("Area Sales Managers", app.user.id)
]).then((is, asm) => {
  // is and asm contain the correct data here, I just want to merge it
  this.users = {...is, ...asm};
  var dummy = {...is, ...asm};
  console.log("dummy", dummy);
});
In the console, manually copying the data from each successful data grab into its own object and then combining them with the spread operator as above gives the expected result, but this code returns some sort of nesting of it instead.
If the spread operator (or Object.assign, which I have also tried) worked as expected, I believe I would have the result I'm looking for. I don't know how many little hacks I've tried to merge the objects properly, but none seem to have the right behavior in this context.
I may just need some more reading on Promises and async operations, so feel free to link me.
EDIT
Destructuring the results of the Promise with array brackets like so:
Promise.all([
  this.isUsers = GetUsersByRole("Inside Sales", app.user.id),
  this.asmUsers = GetUsersByRole("Area Sales Managers", app.user.id)
]).then(([is, asm]) => {
  this.users = {...is, ...asm};
  var dummy = {...is, ...asm};
  console.log("Dummy", dummy);
});
...leaves the dummy object completely empty as far as I can tell:
[screenshot: empty dummy object]
Trying to spread the data objects with square brackets:
var dummy = [...is, ...asm];
Results in a type error, as they are objects:
Uncaught (in promise) TypeError: is is not iterable
EDIT
Thank you to everyone for their help. I had to convert the function I was using to make the API calls so that it returns a Promise, and then the code worked as expected.
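For anyone hitting the same thing, a minimal sketch of the working shape; the real GetUsersByRole implementation is not shown in the question, so the fetch URL below is purely hypothetical and only illustrates that the helper returns a Promise:
// hypothetical wrapper: the real call only needs to return a Promise of the data
const GetUsersByRole = (role, userId) =>
  fetch(`/api/users?role=${encodeURIComponent(role)}&requestedBy=${userId}`)
    .then(res => res.json());

Promise.all([
  GetUsersByRole("Inside Sales", app.user.id),
  GetUsersByRole("Area Sales Managers", app.user.id)
]).then(([is, asm]) => {
  // Promise.all resolves to an array of results, so destructure with [is, asm]
  this.users = {...is, ...asm};
});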

Nested Object.keys() are printing properties multiple times instead of only once

I have two objects that I need to loop through so I can use their properties later on. However, if I print the variables, each of them is printed twice. I understand that this happens because I have one Object.keys() inside the other Object.keys(). Is there any way to loop through these two objects and only get each variable one time?
My code:
Object.keys(newData.name).map(async key => {
  Object.keys(newData._temp.images).map(async keyImage => {
    console.log(newData.name[key].name, 'printed 2x instead of once');
    console.log(newData._temp.images[keyImage].rawFile.preview, 'printed 2x instead of once');
  });
});
Thank you in advance.
Your logic of nesting the loops here is wrong.
These two objects do not seem to be connected to one another, meaning you do not need the data from the first loop in order to perform the second one. Just split it into two separate loops; that will save you both time and repetition:
let nameKeys = Object.keys(newData.name).map(key => newData.name[key].name);
let imageKeys = Object.keys(newData._temp.images).map(keyImage =>
  newData._temp.images[keyImage].rawFile.preview);
Now you can access nameKeys and imageKeys whenever you want, and they will contain the values you previously logged. My naming might be a bit off though, feel free to change that. :D
Also, as others mentioned, there is no need for the async keyword... you do not perform any async operation inside (yet, at least; if that's what you're planning then go ahead and keep it).
These iterators do not need to be nested. The second iterator is not looping through an item of the first iterator.
Object.keys(newData.name).forEach(key => {
  console.log(newData.name[key].name);
});
Object.keys(newData._temp.images).forEach(keyImage => {
  console.log(newData._temp.images[keyImage].rawFile.preview);
});
If you are only interested in outputting data, then .map() is not the right function to use, because it is meant for when you care about the return value. Use .forEach() if you just want to loop through things.
Also, the async keyword is not needed here... unless you plan to do some async/await stuff in the loops later!
You could iterate over the indices once and then access the values in both arrays:
const names = Object.keys(newData.name);
const images = Object.keys(newData._temp.images);
for (let i = 0; i < Math.min(names.length, images.length); i++) {
  const name = names[i];
  const image = images[i];
  // ...
}

Is there a more elegant way to push and return an array?

I am still figuring out Promises, but while working with them, I've realized it would be nice to reduce an array of fetch calls and put some throttles next to them. While creating my slow querying function, I realized I couldn't think of a more elegant way to push onto an array and return that array than this.
SO. My question is: is there a more elegant way of pushing to an array and returning that array in one step in JavaScript than this?
const mQry = q => fetch(q).then(r => r.json()); // Fetches and returns json
const throttle = t => new Promise(r => setTimeout(r, t)); // adds a promised timeout
const slowQrys = (q, t) => // pass in an array of links, and a number of milliseconds
  Promise.all(q.reduce((r, o) =>
    // Here's the big issue. Is there any more elegant way
    // to push two elements onto an array and return an array?
    [...r, mQry(...o), throttle(t)]
  , []));
And before anyone says it, I am super aware that spreading out the array like this may not be efficient, but I'm probably never using more than 10 items, so it's not a super big deal.
A cleaner and more efficient equivalent of the general operation
q.reduce((r, o) =>
  [...r, f(...o), g(t)])
uses flatMap:
q.flatMap(o =>
  [f(...o), g(t)])
However, in the context of your question, creating a throttle(t) next to each fetch operation in a Promise.all is completely and unambiguously wrong. All of the setTimeout timers will be running in parallel and resolve at the same time, so there’s no point in creating more than one. They don’t interact with the fetch operations, either, just delay the overall fulfilment of the promise slowQrys returns and muddle the array it resolves to.
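To make that concrete, if the intent was only "run every query in parallel but take at least t ms overall", one timer is enough; a minimal sketch reusing mQry and throttle from the question, and assuming each element of q is a single URL:
const slowQrys = (q, t) =>
  Promise.all([...q.map(link => mQry(link)), throttle(t)])
    // drop the timer's undefined result so only the query results remain
    .then(results => results.slice(0, -1));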
I would guess that your intent is to chain your fetches, such that two consecutive fetches are mandatorily spaced by at least t ms.
The chaining is thus:
Promise.all([fetch, wait]), Promise.all([fetch, wait]), ...
The way to write that would thus be:
const slowQrys = (links, t) => links.reduce((p, link) => {
  return p.then(_ => {
    return Promise.all([
      fetch(link),
      wait(t)
    ]);
  });
}, Promise.resolve());
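If the responses themselves are needed, the same chain can carry an accumulator through the reduce; this is only a sketch building on the code above, with wait being the same kind of timeout helper as throttle:
const slowQrys = (links, t) =>
  links.reduce((p, link) =>
    p.then(results =>
      Promise.all([fetch(link).then(r => r.json()), wait(t)])
        // keep only the parsed response, not the timer's result
        .then(([data]) => [...results, data])
    ), Promise.resolve([]));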

Best way to loop through array object

leagueInfo = {"data":[{"tier":"Gold"},{"tier":"Bronze"}]}
So far I have been doing 2 for loops like this:
for (const key of Object.keys(leagueInfo)) {
  console.log('5on5 ranked', leagueInfo[key]);
  // Array (2) is output
  for (const values of leagueInfo[key]) {
    console.log('5on5 ranked', values.tier);
    // Output is:
    // Gold
    // Bronze
  }
}
Do I really need 2 loops or is there a shorter way of doing this?
leagueInfo.data.forEach(item => console.log(item.tier));
There are several ways.
You could use methods from the lodash or underscore libraries, which are replicas of how .forEach or for loops work.
If the data you have is always the same and similar to the one posted, you can do the following to iterate through the data items you have in the array. Keep in mind that the first iteration you are doing is useless, since you could access the property directly.
var leagueInfo = {"data": [{"tier": "Gold"}, {"tier": "Bronze"}]};
leagueInfo.data.forEach((item) => {
  console.log(item);
  console.log(item.tier);
});
There are dozens of ways to iterate over objects or arrays, usually with functions specifically adapted to certain goals. If you only want to console.log the iteration result, you can use .map():
var leagueInfo = {"data": [{"tier": "Gold"}, {"tier": "Bronze"}]};
Object.values(leagueInfo).map(function(dataArray) {
  console.log('5on5 ranked', dataArray);
  dataArray.map(function(values) {
    console.log('5on5 ranked', values.tier);
  });
});
And here's a link to W3Schools where you can find all possible actions with arrays.
https://www.w3schools.com/jsref/jsref_obj_array.asp

Promises and upserting to database in bulk

I am currently parsing a list of js objects that are upserted to the db one by one, roughly like this with Node.js:
return promise.map(list, item =>
  parseItem(item)
    .then(upsertSingleItemToDB)
).then(() => { /* all finished! */ });
The problem is that when the list size grew very big (~3000 items), parsing all the items in parallel was too memory heavy. It was really easy to add a concurrency limit with the promise library and not run out of memory that way (when/guard).
But I'd like to optimize the db upserts as well, since MongoDB offers a bulkWrite function. Since parsing and bulk writing all the items at once is not possible, I would need to split the original object list into smaller sets that are parsed with promises in parallel, and then the result array of that set would be passed to the promisified bulkWrite. This would be repeated for the remaining sets of list items.
I'm having a hard time wrapping my head around how I can structure the smaller sets of promises so that I only do one set of parseSomeItems-BulkUpsertThem at a time (something like Promise.all([set1Bulk], [set2Bulk]), where set1Bulk is another array of parallel parser Promises?). Any pseudo code help would be appreciated (but I'm using when, if that makes a difference).
It can look something like this, if using mongoose and the underlying nodejs-mongodb-driver:
const saveParsedItems = items => ItemCollection.collection.bulkWrite( // accessing underlying driver
  items.map(item => ({
    updateOne: {
      filter: {id: item.id}, // or any compound key that makes your items unique for upsertion
      upsert: true,
      update: {$set: item} // should be a key:value formatted object
    }
  }))
);

const parseAndSaveItems = (items, offset = 0, limit = 3000) => { // the algorithm for retrieving items in batches can be anything you want, basically
  const itemSet = items.slice(offset, offset + limit); // slice takes an end index, not a count
  return Promise.all(
    itemSet.map(parseItem) // parsing all your items first
  )
    .then(saveParsedItems)
    .then(() => {
      const newOffset = offset + limit;
      if (items.length > newOffset) {
        return parseAndSaveItems(items, newOffset, limit);
      }
      return true;
    });
};

return parseAndSaveItems(yourItems);
The first answer looks complete. However, here are some other thoughts that came to mind.
As a hack-around, you could call a timeout function in the callback of your write operation before the next write operation runs. This gives your CPU and memory a break in between calls. Even if you add one millisecond between calls, that only adds 3 seconds in total if you have 3000 objects to write.
Or you could segment your array of insert objects and send each segment to its own bulk writer, as sketched below.
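A rough sketch of that segmenting idea, reusing saveParsedItems from the first answer; the chunk size and delay below are arbitrary placeholders:
const pause = ms => new Promise(resolve => setTimeout(resolve, ms));

// split the items into fixed-size chunks
const chunk = (arr, size) =>
  Array.from({ length: Math.ceil(arr.length / size) },
    (_, i) => arr.slice(i * size, i * size + size));

// bulk write one chunk at a time, with a short pause between writes
const writeInChunks = async (items, size = 500, delayMs = 1) => {
  for (const batch of chunk(items, size)) {
    await saveParsedItems(batch); // bulk writer from the first answer
    await pause(delayMs);
  }
};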
