Mongoose inserting same data three times instead of iterating to next data - javascript

I am trying to seed the following data to my MongoDB server:
const userRole = {
  role: 'user',
  permissions: ['readPost', 'commentPost', 'votePost']
}
const authorRole = {
  role: 'author',
  permissions: ['readPost', 'createPost', 'editPostSelf', 'commentPost', 'votePost']
}
const adminRole = {
  role: 'admin',
  permissions: ['readPost', 'createPost', 'editPost', 'commentPost', 'votePost', 'approvePost', 'approveAccount']
}
const data = [
  {
    model: 'roles',
    documents: [userRole, authorRole, adminRole]
  }
]
When I try to iterate through this object/array and insert the data into the database, I end up with three copies of 'adminRole' instead of the three individual roles. I feel very foolish for being unable to figure out why this is happening.
The code I use to iterate through the object and seed it is below, and I know it's reaching every value, since console.log testing shows all the data correctly:
for (i in data) {
  m = data[i]
  const Model = mongoose.model(m.model)
  for (j in m.documents) {
    var obj = m.documents[j]
    Model.findOne({ 'role': obj.role }, (error, result) => {
      if (error) console.error('An error occurred.')
      else if (!result) {
        Model.create(obj, (error) => {
          if (error) console.error('Error seeding. ' + error)
          console.log('Data has been seeded: ' + obj)
        })
      }
    })
  }
}
Update:
Here is the solution I came up with after reading everyone's responses. Two private functions generate Promise objects, one for deleting any existing data and one for inserting the new data, and then all the Promises are resolved with Promise.all.
// Stores all promises to be resolved
var deletionPromises = []
var insertionPromises = []
// Fetch the model via its name string from mongoose
const Model = mongoose.model(data.model)
// For each object in the 'documents' field of the main object
data.documents.forEach((item) => {
  deletionPromises.push(promiseDeletion(Model, item))
  insertionPromises.push(promiseInsertion(Model, item))
})
console.log('Promises have been pushed.')
// We need to fulfil the deletion promises before the insertion promises.
Promise.all(deletionPromises).then(() => {
  return Promise.all(insertionPromises).catch(() => {})
}).catch(() => {})
I won't include both promiseDeletion and promiseInsertion as they're functionally the same.
const promiseDeletion = function (model, item) {
  console.log('Promise Deletion ' + item.role)
  return new Promise((resolve, reject) => {
    model.findOneAndDelete(item, (error) => {
      if (error) reject()
      else resolve()
    })
  })
}
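(For reference, since promiseInsertion is described only as functionally the same, here is a minimal sketch of what it might look like under that description; the Model.create call is an assumption based on the original seeding code, not the poster's exact function.)
const promiseInsertion = function (model, item) {
  console.log('Promise Insertion ' + item.role)
  return new Promise((resolve, reject) => {
    // assumed to mirror promiseDeletion, but creating the document instead
    model.create(item, (error) => {
      if (error) reject(error)
      else resolve()
    })
  })
}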
Update 2: You should ignore my most recent update. I've modified the code I posted a bit, but even then, half of the time the roles are deleted and not inserted. It's very random as to when it will actually insert the roles into the database. I'm very confused and frustrated at this point.

You ran into a very common problem when using JavaScript: you shouldn't define (async) callbacks inside a regular for(-in) loop. What happens is that while you loop through the three values, the first async findOne is called. Since the call is async, Node.js does not wait for it to finish before it continues to the next loop iteration and counts up to the third value, here the admin role.
Because the loop variables (and your var obj) are shared across iterations rather than scoped to each one, by the time the first async callback actually runs, the loop has already moved on to the last value, which is why admin is inserted three times.
To avoid this, you can move the async functions out of the loop to force a call by value rather than by reference. Still, this can bring up a lot of other problems, so I'd recommend you have a look at promises and how to chain them (e.g. put all mongoose promises in an array and then await them using Promise.all), or use the more modern async/await syntax together with a for-of loop, which gives you both easy readability and sequential async instructions.
Check this very similar question: Calling an asynchronous function within a for loop in JavaScript
Note: for-of is sometimes discussed as being performance-heavy, so check whether that matters for your use case.
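As a rough illustration of the Promise.all suggestion, here is a sketch assuming the same data array and registered mongoose models from the question (not the author's exact code):
const seedPromises = []
data.forEach(({ model, documents }) => {
  const Model = mongoose.model(model)
  documents.forEach(doc => {
    // one findOne/create pair per document; each callback closes over its own doc
    seedPromises.push(
      Model.findOne({ role: doc.role }).exec().then(existing =>
        existing ? null : Model.create(doc)
      )
    )
  })
})
Promise.all(seedPromises)
  .then(() => console.log('Seeding finished'))
  .catch(err => console.error('Seeding failed: ' + err))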

Using async functions in loops can cause problems like this.
You should change the way you work with findOne so that the result is awaited before the loop moves on.
First mark your function as async, then use findOne like so:
async function myFunction() {
  // exec() fires the query and returns a promise, which await can handle
  let res = await Model.findOne({ 'role': obj.role }).exec()
  // do what you need to do here with the result...
}
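For completeness, a minimal sketch of how the seeding loop from the question might look with this pattern (the sequential for...of structure and the seed name are assumptions, not the original code):
async function seed(data) {
  for (const entry of data) {
    const Model = mongoose.model(entry.model)
    for (const obj of entry.documents) {
      // wait for each lookup before deciding whether to create the document
      const existing = await Model.findOne({ role: obj.role }).exec()
      if (!existing) {
        await Model.create(obj)
        console.log('Data has been seeded: ' + obj.role)
      }
    }
  }
}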

Related

Updating async forEach to update every document property based off property from another collection

I have this piece of code that works. However, I am not sure why and I feel like it might behave inconsistently.
await Listing.find({}, (err, listings) => {
  if (err) {
    console.log(err);
  }
  listings.forEach(async (listing) => {
    //console.log(listing);
    let championsUpdate = {};
    for (let key in listing["champions"]) {
      championsUpdate[key] = rankingDB[key];
    }
    await Listing.updateOne(
      { _id: listing._id },
      { $set: { champions: championsUpdate } }
    );
  });
});
Pretty much I am finding all the listings that I need to update, and then for each listing I am updating one of the properties based off the data I retrieved earlier.
So far it's been behaving appropriately but I remember being told to avoid using async await in a forEach loop because it does not behave as we expect. But I can't figure out why this is working and if I should avoid the forEach and use a forOf. I am also worried about having nested async awaits.
Does anyone know if something like this is OK?
Because the callback in the forEach loop is async, code that follows the call to forEach may execute before forEach finishes, and forEach will not wait for each iteration to finish before continuing.
For example, the await in front of the updateOne call is effectively pointless, since the outer async callback is never awaited, which shows that the code is probably not behaving the way you intend it to.
The reason it is recommended not to use async inside a forEach is that it almost never behaves the way you intend, or you don't actually need it, or you may forget that you called it that way and unintentionally cause a race condition later (i.e. it makes the execution order hard to reason about and virtually unpredictable). It only has valid uses if you genuinely do not care about the results of the calls, when they are called, or when they resolve.
Below is a demo showing how this can cause unexpected results. It uses sleep with a random delay so that each .updateOne call resolves at a random time. Notice that the call to console.log(`Will execute before...`) executes before the forEach iterations finish.
// mock data with sequential _id from 0...9
listings = Array(10).fill().map((_, _id) => ({ _id, champions: { 1: 1, 2: 2 } }))
rankingDB = Array(10).fill({ 1: 1.1, 2: 2.2 })
// Promise that resolves in ms milliseconds
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}
// mock Listing
Listing = {
  find(x, fn) {
    console.log('find')
    return new Promise(res => res(fn(undefined, listings)))
  },
  // updateOne resolves after random time < 1000 ms
  async updateOne({ _id }) {
    await sleep(Math.random() * 1000)
    console.log('updateOne with listing _id=', _id)
  }
}

async function runit() {
  await Listing.find({}, (err, listings) => {
    if (err) {
      console.log(err);
    }
    listings.forEach(async (listing) => {
      //console.log(listing);
      let championsUpdate = {};
      for (let key in listing["champions"]) {
        championsUpdate[key] = rankingDB[key];
      }
      await Listing.updateOne(
        { _id: listing._id },
        { $set: { champions: championsUpdate } }
      );
    });
  });
  console.log(`Will execute before the updateOne calls in forEach resolve`)
}
runit()
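For completeness, one hedged way to actually wait for all the updates before moving on (a sketch against the real mongoose Listing model and rankingDB object from the question, not the mock above) is to map the listings to an array of update promises and await Promise.all:
async function updateAllListings() {
  const listings = await Listing.find({});
  await Promise.all(listings.map(listing => {
    const championsUpdate = {};
    for (const key in listing["champions"]) {
      championsUpdate[key] = rankingDB[key];
    }
    return Listing.updateOne(
      { _id: listing._id },
      { $set: { champions: championsUpdate } }
    );
  }));
  console.log('All updateOne calls have finished');
}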

Node.js - How to return callback with array from for loop with MySQL query?

I'm trying to get a list of virtual communities in my Node.js app and then return it with a callback function. When I call the getList() method with a callback, it returns an empty array.
const mysqli = require("../mysqli/connect");
class Communities {
  getList(callback) {
    var list = [];
    mysqli.query("SELECT * FROM communities", (err, communities) => {
      for (let i = 0; i < communities.length; i++) {
        mysqli.query("SELECT name FROM users WHERE id='" + communities[i].host + "'", (err, host) => {
          list.push({
            "id": communities[i].id,
            "name": communities[i].name,
            "hostID": communities[i].host,
            "hostName": host[0].name,
            "verified": communities[i].verified,
            "people": communities[i].people
          });
        });
      }
      callback(list);
    });
  }
}
new Communities().getList((list) => {
  console.log(list);
});
I need the for loop to handle the asynchronous queries and to call the callback only when they have all finished. Please let me know how to do this. Thanks.
Callbacks get really ugly if you have to combine multiple of them; that's why Promises were invented to simplify that. To use Promises in your case, you first have to create a Promise when querying the database¹:
const query = q => new Promise((resolve, reject) => mysqli.query(q, (err, result) => err ? reject(err) : resolve(result)));
Now doing multiple queries will return multiple promises, which can be combined into one single promise using Promise.all²:
async getList() {
  const communities = await query("SELECT * FROM communities");
  const result = await/*³*/ Promise.all(communities.map(async community => {
    const host = await query(`SELECT name FROM users WHERE id='${community.host}'`);/*⁴*/
    return {
      ...community,
      hostName: host[0].name,
    };
  }));
  return result;
}
Now you can easily get the result with:
new Communities().getList().then(list => {
console.log(list);
});
Read on:
Working with Promises - Google Developers
Understanding async / await - Ponyfoo
Notes:
¹: If you do that more often, you should probably use a mysql library that supports promises natively; that saves a lot of work.
²: This way the requests are done in parallel, which means it is much faster than doing one after another (which could be done using a for loop and awaiting inside it).
³: That await is superfluous, but I prefer to keep it to mark the line as an asynchronous action.
⁴: I guess that could also be done with one SQL query (a JOIN), so if it is too slow for your use case (which I doubt) you should optimize the query itself.
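As a hedged illustration of note ¹, here is a sketch of the same method using the mysql2 package's promise API (assuming mysql2 is installed; the connection settings are placeholders to adjust):
// npm install mysql2
const mysql = require("mysql2/promise");

async function getList() {
  // placeholder connection settings; use your own credentials
  const conn = await mysql.createConnection({ host: "localhost", user: "app", database: "mydb" });
  const [communities] = await conn.query("SELECT * FROM communities");
  const list = await Promise.all(communities.map(async community => {
    // parameterized query instead of string concatenation
    const [host] = await conn.query("SELECT name FROM users WHERE id = ?", [community.host]);
    return { ...community, hostName: host[0].name };
  }));
  await conn.end();
  return list;
}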

How to loop async query code of firebase?

I am trying to loop and get different documents from Firestore. The document IDs are provided by an array named 'cart', as you can see in the code below.
The logic I have tried goes like this: on each iteration the while loop gets a document from Firestore; in the first 'then' section it saves the data it has just received, and in the second 'then' it increments 'i' and moves on to the next cycle of the loop.
The problem is that the while loop doesn't wait for the get request to finish. It just keeps looping and crashes.
The other thing is, even if I somehow manage to get the loop part right, how would I manage the overall execution flow so that the code after the loop only runs once the loop has completed, since that code uses the cart array the loop updates?
let i = 0
while (i < cart.length) {
  let element = cart[i]
  db.collection(`products`).doc(element.productID).get().then((doc1) => {
    element.mrp = doc1.data().mrp
    element.ourPrice = doc1.data().ourPrice
    return console.log('added price details')
  }).then(() => {
    i++;
    return console.log(i)
  }).catch((error) => {
    // Re-throwing the error as an HttpsError so that the client gets the error details.
    throw new functions.https.HttpsError('unknown', error.message, error);
  });
}
return db.collection(`Users`).doc(`${uid}`).update({
  orderHistory: admin.firestore.FieldValue.arrayUnion({
    cart,
    status: 'Placed',
    orderPlacedTimestamp: timestamp,
    outForDeliveryTimestamp: '',
    deliveredTimestamp: ''
  })
}).then(() => {
  console.log("Order Placed Successfully");
})
Your question is not really about Firebase; you're asking about looping asynchronously. You can see some Promise examples here, and async/await here.
You can use reduce on the promises.
Note that the promise chain is set up immediately, but the calls to the server are executed one after the other.
cart.reduce(
  (promise, element) =>
    promise.then(() => {
      return db.collection(`products`)
        .doc(element.productID)
        .get()
        .then(doc1 => {
          element.mrp = doc1.data().mrp;
          element.ourPrice = doc1.data().ourPrice;
        });
    }),
  Promise.resolve()
);
If you can, use async/await instead. Here all the promises are being created one after the other.
async function fetchCart() {
  for (const element of cart) {
    // get() returns the document snapshot; without it there is no data() to read
    const doc1 = await db.collection(`products`).doc(element.productID).get();
    element.mrp = doc1.data().mrp;
    element.ourPrice = doc1.data().ourPrice;
    console.log('added price details');
  }
}
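With this helper, the rest of the flow from the question can wait for the prices before writing the order. A rough sketch, reusing fetchCart above and the db, uid, timestamp and admin references from the question:
return fetchCart().then(() => {
  return db.collection(`Users`).doc(`${uid}`).update({
    orderHistory: admin.firestore.FieldValue.arrayUnion({
      cart,
      status: 'Placed',
      orderPlacedTimestamp: timestamp,
      outForDeliveryTimestamp: '',
      deliveredTimestamp: ''
    })
  });
}).then(() => {
  console.log('Order Placed Successfully');
});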
Each call to Cloud Firestore happens asynchronously, so your while loop fires off multiple such requests but doesn't wait for them to complete.
If you have code that needs all the results, you will need to use Promises to control the flow. You're already using a promise in the while loop to get doc1.data().mrp. If cart is an array, you can do the following to gather the promises for when each piece of data is loaded:
var promises = cart.map(function(element) {
  return db.collection(`products`).doc(element.productID).get().then((doc1) => {
    return doc1.data();
  });
});
Now you can wait for all data with:
Promise.all(promises).then(function(datas) {
  datas.forEach(function(data) {
    console.log(data.mrp, data.ourPrice);
  });
});
If you're on a modern environment, you can use async/await to abstract away the then:
datas = await Promise.all(promises);
datas.forEach(function(data) {
  console.log(data.mrp, data.ourPrice);
});
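If the goal is still to write the fetched prices back onto the cart items, as in the question, a small follow-up sketch (names taken from the question):
const datas = await Promise.all(promises);
cart.forEach((element, i) => {
  element.mrp = datas[i].mrp;
  element.ourPrice = datas[i].ourPrice;
});
// only now is it safe to run the orderHistory update from the question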

How to write an arbitrarily long Promise chain

I receive an object bigListFromClient that includes an arbitrary number of objects each of which may have an arbitrary number of children. Every object needs to be entered into my database, but the DB needs to assign each of them a unique ID and child objects need to have the unique ID of their parents attached to them before they are sent off to the DB.
I want to create some sort of Promise or other calling structure that would call itself asynchronously until it reached the last object in bigListFromClient but I'm having trouble figuring out how to write it.
for (let i = 0; i < bigListFromClient.length; i++) {
  makeDbCallAsPromise(bigListFromClient[i].queryString, console.log); //I'm not just accepting anything from a user here, but how I get my queryString is kind of out of scope for this question
  for (let j = 0; j < bigListFromClient[i].children.length; j++) {
    //the line below obviously doesn't work, I'm trying to figure out how to do this with something other than a for loop
    makeDbCallAsPromise(bigListFromClient[i].children[j].queryString + [the uniqueID from the DB to insert this correctly as a child], console.log);
  }
}
//this promise works great
makeDbCallAsPromise = function(queryString) {
  return new Promise((resolve, reject) => {
    connection = mysql.createConnection(connectionCredentials);
    connection.connect();
    query = queryString;
    connection.query(query, function (err, rows, fields) {
      if (!err) {
        resolve(rows);
      } else {
        console.log('Error while performing Query.');
        console.log(err.code);
        console.log(err.message);
        reject(err);
      }
    });
    connection.end();
  })
};
My attempts at solving this on my own are so embarrassingly bad that even describing them to you would be awful.
While I could defer all the calls to creating children until the parents have been created in the DB, I wonder if the approach I've described is possible.
There are essentially two ways to do this. One is making the database calls sequential and the other one is making the calls parallel.
JavaScript has a built-in function for the parallel case called Promise.all: you pass it an array of Promise instances, and it returns a Promise that resolves to an array of their results.
In your case your code would look like this:
const result = Promise.all(
  bigListFromClient.map(item =>
    makeDbCallAsPromise(item.queryString).then(result =>
      Promise.all(
        item.children.map(child =>
          makeDbCallAsPromise(child.queryString + [result.someId])
        )
      )
    )
  )
)
result will now contain a Promise that resolves to an array of arrays. These arrays contain the results of inserting the children.
Using a more modern approach (with async/await), sequential and with all results in a flat array:
const result = await bigListFromClient.reduce(
  async (previous, item) => {
    const previousResults = await previous
    const parentResult = await makeDbCallAsPromise(item.queryString)
    const childResults = await item.children.reduce(
      async (acc, child) =>
        [...(await acc), await makeDbCallAsPromise(child.queryString + [parentResult.someId])],
      []
    )
    return [...previousResults, parentResult, ...childResults]
  },
  []
)
Depending on what you want to achieve and how you want to structure your code you can pick and choose from the different approaches.
For this sort of operation, try looking into bulk inserting. If you are intent on performing a single DB query/transaction per iteration, loop recursively over each parent and/or execute the same procedure for each child.
const dbCall = async (elm) => {
  elm.id = Math.random().toString(36).substring(7)
  if (elm.children) {
    await Promise.all(elm.children.map(child => {
      child.parentId = elm.id
      return dbCall(child)
    }))
  }
  return elm
}
const elms = [
  {
    queryString: '',
    children: [
      {
        queryString: ''
      }
    ]
  }
]
Promise.all(elms.map(dbCall)).then(results => { /* ... */ })

Javascript/NodeJS: Array empty after pushing values in forEach loop

I got a little bit of a problem. Here is the code:
Situation A:
var foundRiders = [];
riders.forEach(function(rider) {
  Rider.findOne({ _id: rider }, function(err, foundRider) {
    if (err) {
      console.log("program tried to look up rider for the forEach loop finalizing the results, but could not find");
    } else {
      foundRiders.push(foundRider);
      console.log(foundRiders);
    }
  });
});
Situation B
var foundRiders = [];
riders.forEach(function(rider) {
  Rider.findOne({ _id: rider }, function(err, foundRider) {
    if (err) {
      console.log("program tried to look up rider for the forEach loop finalizing the results, but could not find");
    } else {
      foundRiders.push(foundRider);
    }
  });
});
console.log(foundRiders);
So in Situation A, when I console.log I see that foundRiders is an array filled with objects. In Situation B, when I put the console.log outside the loop, my foundRiders array is completely empty...
How come?
As others have said, your database code is asynchronous. That means that the callbacks inside your loop are called sometime later, long after your loop has already finished. There are a variety of ways to program an async loop. In your case, it's probably best to move to the promise interface for your database and then use promises to coordinate your multiple database calls. You can do that like this:
Promise.all(riders.map(rider => {
  return Rider.findOne({ _id: rider }).exec();
})).then(foundRiders => {
  // all found riders here
}).catch(err => {
  // error here
});
This uses the .exec() interface of the mongoose database to run your query and return a promise. Then, riders.map() builds and returns an array of these promises. Then, Promise.all() monitors all the promises in the array and calls .then() when they are all done or .catch() when there's an error.
If you want to ignore any riders that aren't found in the database, rather than abort with an error, then you can do this:
Promise.all(riders.map(rider => {
  return Rider.findOne({ _id: rider }).exec().catch(err => {
    // convert error to null result in resolved array
    return null;
  });
})).then(foundRiders => {
  foundRiders = foundRiders.filter(rider => rider !== null);
  console.log(foundRiders);
}).catch(err => {
  // handle error here
});
To help illustrate what's going on here, this is a more old-fashioned way of monitoring when all the database callbacks are done (with a manual counter):
// the counter must live outside the loop so it is shared by all callbacks
let cntr = 0;
riders.forEach(function(rider) {
  Rider.findOne({ _id: rider }, function(err, foundRider) {
    ++cntr;
    if (err) {
      console.log("program tried to look up rider for the forEach loop finalizing the results, but could not find");
    } else {
      foundRiders.push(foundRider);
    }
    // if all DB requests are done here
    if (cntr === riders.length) {
      // put code here that wants to process the finished foundRiders
      console.log(foundRiders);
    }
  });
});
The business of maintaining a counter to track multiple async requests is what Promise.all() has built in.
The code above assumes that you want to parallelize your code and run the queries together to save time. If you want to serialize your queries, then you could use async/await with a for loop to make the loop "wait" for each result (this will probably slow things down). Here's how you would do that:
async function lookForRiders(riders) {
  let foundRiders = [];
  for (let rider of riders) {
    try {
      let found = await Rider.findOne({ _id: rider }).exec();
      foundRiders.push(found);
    } catch (e) {
      console.log(`did not find rider ${rider} in database`);
    }
  }
  console.log(foundRiders);
  return foundRiders;
}
lookForRiders(riders).then(foundRiders => {
  // process results here
}).catch(err => {
  // process error here
});
Note that while this looks more like the synchronous code you may be used to in other languages, it still uses asynchronous concepts, and the lookForRiders() function still returns a promise whose result you access with .then(). This is a newer feature in JavaScript which makes some types of async code easier to write.
