I have two cases that I tried to understand, but they are still not clear enough for me. I read that if I want to run async calls in a for loop or a map, I need to use Promise.all, but let me share what happened to me.
First I used map to update many records in my database. It updated some of the data, but not all of it:
AllAuthToken.map(async singleToken =>{
let deviceIds = uuidv4();
let newDeviceArray = {
[deviceIds]: singleToken.deviceToken,
};
await AuthToken.updateOne(
{ _id: singleToken._id },
{
$set: {
tokensDeviceArray: [newDeviceArray],
deviceId: [deviceIds],
},
},
{ $new: true }
);
});
Then I used a for loop instead, and it updated all of the data:
for (let i = 0; i < AllAuthToken.length; i++) {
let deviceIds = uuidv4();
let newDeviceArray = {
[deviceIds]: AllAuthToken[i].deviceToken,
};
await AuthToken.updateOne(
{ _id: AllAuthToken[i]._id },
{
$set: {
tokensDeviceArray: [newDeviceArray],
deviceId: [deviceIds],
},
},
{ $new: true }
);
}
So what happened, such that the first case failed and the second one worked?
The reason is that .map calls its callback for each element one after the other, without waiting. The await does its job, but it only delays the code that follows it inside that async callback; it does not stop the .map iteration from continuing with the next call of the callback, because that iteration is not part of the async function in which the await occurs.
The for loop, on the other hand, is part of the same async function as the await, so the iteration only continues once the awaited promise has resolved.
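Here is a small standalone sketch of that difference in ordering (it uses a wait helper instead of your database calls):
const wait = ms => new Promise(resolve => setTimeout(resolve, ms));

async function demo() {
  [1, 2, 3].map(async n => {
    await wait(100);
    console.log('map callback finished for', n); // logs ~100ms later
  });
  console.log('.map already returned'); // logs first

  for (const n of [1, 2, 3]) {
    await wait(100);
    console.log('for-loop iteration', n, 'done'); // each iteration waits for the previous one
  }
  console.log('for loop finished last');
}
demo();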
If you prefer to have the database requests executed in "parallel", i.e. where you launch the next request without waiting for the previous one to resolve, then use Promise.all, like this:
let promises = AllAuthToken.map(singleToken => {
let deviceIds = uuidv4();
let newDeviceArray = {
[deviceIds]: singleToken.deviceToken,
};
return AuthToken.updateOne(
{ _id: singleToken._id },
{
$set: {
tokensDeviceArray: [newDeviceArray],
deviceId: [deviceIds],
},
},
{ $new: true }
);
});
await Promise.all(promises);
/* ... now all is done ... */
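A side note on error handling (my addition, not part of the original answer): Promise.all rejects as soon as any single updateOne rejects. If you want every update to be attempted regardless of individual failures, Promise.allSettled (Node 12.9+) is an option. A sketch, continuing with the promises array from above inside the same async function:
const results = await Promise.allSettled(promises);
// collect the failed updates instead of bailing out on the first rejection
const failures = results.filter(r => r.status === 'rejected');
if (failures.length) {
  console.error(`${failures.length} update(s) failed`, failures.map(r => r.reason));
}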
Related
I built a scraper where the scraped data gets compared with already existing data, in order to avoid duplicates, create new entries, and update old entries. I'm doing this with a for loop that loops over a findOne call containing two awaits. The problem is that my for loop ignores the awaits (because the loop itself is synchronous?) and moves on to a part of the code where it is important that all of those awaits have already finished.
async function comparedata(length) {
console.log("Starting comparing entries in data");
for (let x = 0; x < length; x++) {
const model = new dataModel({
link: dataLinks[x],
name: dataGames[x].replace('Download', ' '),
logo: dataLogos[x],
provider: 'data',
});
model.collection.findOne({ "link": dataLinks[x] }, async function (err, found) {
if (err) throw err;
if (found == null) {
await model.save().then((result) => {
console.log(result) // Never happens, because the for loop moves on to the next function and the server gets closed
}).catch((err) => { console.log(err) });
}
else if (found != null) {
if (dataGames[x] != found.name) {
await model.collection.findOneAndUpdate({ link: dataLinks[x] }, { $set: { name: dataGames[x] } });
}
}
})
}
closeServer() // Closing the server happens before the new entries or updates are made.
}
My idea was to work with promises, but even when I tried that, the promise resolved too early and the server was closed again.
You should be able to simplify your logic like this:
async function comparedata(length) {
console.log('Starting comparing entries in data');
try {
for (let x = 0; x < length; x++) {
let found = await dataModel.findOne({ link: dataLinks[x] });
if (!found) {
found = await dataModel.create({
link: dataLinks[x],
name: dataGames[x].replace('Download', ' '),
logo: dataLogos[x],
provider: 'data',
});
} else if (found.name !== dataGames[x]) {
found.name = dataGames[x];
await found.save();
}
console.log(found);
}
} catch (e) {
console.log(e);
}
closeServer();
}
For the first iteration of the for loop, findOne's callback is put on the callback queue, and the loop proceeds to the next iteration without waiting for the awaits inside that callback. This goes on until the last iteration, after which closeServer() is called immediately; only after that does the event loop pick up the queued findOne callbacks and execute them. To understand this, it helps to learn how the event loop executes JavaScript code; there are good YouTube videos on the event loop's working mechanism.
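A minimal sketch of that ordering, with no Mongoose involved (setTimeout stands in for the callback-style findOne):
function callbackStyleFind(query, cb) {
  // the callback goes on the queue and only runs after the current synchronous code is done
  setTimeout(() => cb(null, { link: query.link }), 0);
}

for (let x = 0; x < 2; x++) {
  callbackStyleFind({ link: x }, (err, found) => {
    console.log('findOne callback for', found.link); // runs after the loop AND the line below
  });
}
console.log('closeServer() would already have been called here'); // logs first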
You can use the promise-style execution of findOne and avoid this issue.
Also, in my opinion, mixing await and .then() on the same statement is not good practice.
My suggestion:
async function comparedata(length) {
console.log("Starting comparing entries in data");
for (let x = 0; x < length; x++) {
const model = new dataModel({
link: dataLinks[x],
name: dataGames[x].replace('Download', ' '),
logo: dataLogos[x],
provider: 'data',
});
// using .exec() at the end lets us work with promises rather than callbacks
// assuming that 'dataModel' is the model, so I call findOne on it directly
const found = await dataModel.findOne({ "link": dataLinks[x] }).exec();
if (found == null) {
// wrap the await in try...catch for catching errors while saving
try{
await model.save();
console.log("Document Saved Successfully !");
}catch(err) {
console.log(`ERROR while saving the document. DETAILS: ${model} & ERROR: ${err}`)
}
} else if (found != null) {
if (dataGames[x] != found.name) {
await model.collection.findOneAndUpdate({ link: dataLinks[x] }, { $set: { name: dataGames[x] } });
}
}
if (x > length)
sendErrorMail();
}
closeServer() // Now this runs only after all of the awaits above have finished.
}
NOTE: Please refer to the official Mongoose documentation for updates.
I'm not super familiar with the Mongoose API, but if you are unable to use a promisified version, then you can use the Promise constructor to "jump" out of a callback:
const found = await new Promise((resolve, reject) => {
model.collection.findOne({ "link": dataLinks[x] }, function (err, found) {
if (err) {
reject(err);
return;
}
resolve(found);
});
});
This might help you work things out.
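Another option (just a suggestion on my part, reusing model and dataLinks[x] from the question) is Node's built-in util.promisify, which wraps any function that follows the standard (err, result) callback convention. Inside the async function it would look roughly like this:
const { promisify } = require('util');

// bind keeps the collection as `this` when findOne is called through the wrapper
const findOneAsync = promisify(model.collection.findOne.bind(model.collection));
const found = await findOneAsync({ link: dataLinks[x] });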
I need to create a function that will return a Promise and will call another function that makes an axios.get() call. axios.get() calls an API that returns data with the following structure:
{
count: 87, // total number of records
next: '[api_url]/&page=2', // indicates that the API has a next page
previous: null, // indicates whether the API has a previous page; calling the URL above would return previous: '[api_url]/&page=1'
results: [...] // 10 objects per call, or the remaining objects on the last page
}
Since I know that only 10 results are being returned on every call, I need to check if the returned object has the next key, and if it does - make another call and so on, until no more next. I need to concatenate all the results and eventually resolve the Promise returned from the main function with all the data.
So I tried something like that:
const fetchResource = async ({type, search, page}) => {
const data = {count: 0, results: []}
const request = await performFetch({type, search, page}, data).then((data) => {
console.log('data?', data)
})
console.log('req', request)
}
const performFetch = async({type, search, page}, result) => {
const params = {
page
}
if (search) {
params.search = search
}
await axios.get(`${type}/`, {
params
}).then(async({data}) => {
result.results = [...result.results, ...data.results]
result.count = data.count
if (data.next) {
page += 1
await performFetch({type, search, page}, result)
} else {
console.log('result', result)
return result
}
})
.catch((err) => {
console.error(err)
})
}
Now I see that once I call fetchResource, all the requests go out, and in console.log('result', result) I do see the concatenated data:
{
count: 87,
results: [/*all 87 objects*/]
}
But console.log('data?', data) and console.log('req', request) both print out undefined.
Where I return result, I also tried returning Promise.resolve(result) instead - same outcome.
And I'm not sure how to return a Promise here that will resolve once all the API calls have completed and all the data has been received. What am I missing? How do I make it work?
Couple of observations regarding your code:
There's no need to mix async-await syntax with promise chaining, i.e. then() and catch() method calls
Inside the performFetch function, you need an explicit return statement. Currently, the function implicitly returns a promise that fulfils with the value undefined.
The key point here is that performFetch returns before the results of the HTTP requests to the API have come back; it isn't waiting for those requests to finish before returning.
Following is a simple demo that illustrates how you can make multiple requests and aggregate the data until API has returned all the data.
let counter = 0;
function fakeAPI() {
return new Promise(resolve => {
setTimeout(() => {
if (counter < 5) resolve({ counter: counter++ });
else resolve({ done: true });
}, 1000);
});
}
async function performFetch() {
const results = [];
// keep calling the `fakeAPI` function
// until it returns "{ done = true }"
while (true) {
const result = await fakeAPI();
if (result.done) break;
else results.push(result);
}
return results;
}
performFetch().then(console.log).catch(console.log);
(Wait about 5 seconds for the demo to finish.)
Your code can be rewritten as shown below:
const fetchResource = async ({ type, search, page }) => {
const data = { count: 0, results: [] };
const result = await performFetch({ type, search, page }, data);
console.log(result);
};
const performFetch = async ({ type, search, page }, result) => {
const params = { page };
if (search) params.search = search;
while (true) {
const { data } = await axios.get(`${type}/`, { params });
result.results = [...result.results, ...data.results];
result.count = data.count;
if (data.next) params.page += 1;
else return result;
}
};
Ideally, the code that calls the fetchResource function should do the error handling, in case any of the HTTP requests fails.
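For example, a minimal sketch of such a caller (the argument values here are placeholders, not your real ones):
const load = async () => {
  try {
    await fetchResource({ type: 'items', search: '', page: 1 }); // placeholder arguments
  } catch (err) {
    console.error('Fetching the resource failed:', err);
  }
};

load();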
I have a Node.js AWS Lambda function created via the Serverless Framework, with multiple helper functions inside it. I am having an issue with one of them due to it being async. The function runs and logs out all the parts I put comments next to; however, it doesn't update callDuration. I think the code finishes in the wrong order because of the async calls. My goal is to return callDuration to my main function for further processing. How can I get all of the code to run in the right order so that I can do that?
Here is the function:
const callAggregate = async (billingData, billingDB) => {
const accountSid = process.env.TWILIO_ACCOUNT_SID
const authToken = process.env.TWILIO_AUTH_TOKEN
const client = require('twilio')(accountSid, authToken)
// Setup model
const Billing = billingDB.model('Billing')
await Billing.findOne({_id: billingData._id}).exec().then(bill => {
const callArray = bill.callSid
console.log(bill) // This logs out
let callDuration = 0
for (const call of callArray) {
console.log(call) // This logs out
client.calls(call)
.fetch()
.then(callDetails => {
console.log(callDetails) // This logs out
callDuration += callDetails.duration
})
}
console.log(`Billing for ${callDuration} minutes of voice calling for ${billingData._id}`) // This logs out
Billing.findOneAndUpdate(
{_id: billingData._id},
{ $inc: { call_duration: callDuration }, callSid: []},
(err, doc) => {
if(err) {
console.log(err)
}
}
)
return callDuration
})
}
This is a case of mixing and matching promises with plain callbacks, and mixing await with .then(), both of which make proper flow control and error handling difficult.
Inside your function, which is async and uses await in some places, you also have a promise you are not awaiting (which means it runs open-loop and nothing waits for it), and you have a database call that uses a plain callback rather than the promise interface, so nothing waits for that either.
More specifically, nothing is waiting for this:
client.calls(call).fetch()
So, because of not waiting for the .fetch() to finish, you were attempting to use the variable callDuration before the code was done modifying that variable (giving you the wrong value for it).
Similarly, nothing is waiting for Billing.findOneAndUpdate(...) to complete either.
A clean solution is to switch everything over to promises and await. This involves, using only promises with your database (no plain callbacks) and converting the .then() handlers into await.
async function callAggregate(billingData, billingDB) {
const accountSid = process.env.TWILIO_ACCOUNT_SID
const authToken = process.env.TWILIO_AUTH_TOKEN
const client = require('twilio')(accountSid, authToken)
// Setup model
const Billing = billingDB.model('Billing')
let bill = await Billing.findOne({ _id: billingData._id }).exec();
const callArray = bill.callSid
console.log(bill) // This logs out
let callDuration = 0
for (const call of callArray) {
console.log(call) // This logs out
let callDetails = await client.calls(call).fetch();
console.log(callDetails) // This logs out
callDuration += callDetails.duration
}
console.log(`Billing for ${callDuration} minutes of voice calling for ${billingData._id}`) // This logs out
let doc = await Billing.findOneAndUpdate({ _id: billingData._id }, { $inc: { call_duration: callDuration }, callSid: [] }).exec();
return callDuration
}
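As a side note (my own suggestion, not something the fix above requires): since the individual Twilio lookups are independent of each other, they could also be fetched in parallel with Promise.all. A sketch using the same client and callArray as above; duration may come back as a string, so it is converted before summing:
const detailsList = await Promise.all(callArray.map(call => client.calls(call).fetch()));
const callDuration = detailsList.reduce((sum, details) => sum + Number(details.duration), 0);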
This function takes 2 asynchronous callbacks. I am not sure why, but when these callbacks are called, the promises they await aren't being awaited correctly. I'm thinking it may have something to do with the way I'm calling them. I am not very familiar with the promise API so I kind of just hacked this together. If someone could tell me if I am doing something wrong, I would really appreciate it.
async queryTasks(handleCommand, handleSubmission) {
const D = new Date().getTime();
const promises = [];
while (!this.tasks.isEmpty()) {
const task = this.tasks.dequeue();
// If not a submission
if (task.item.body) {
const command = new Command().test(task.item.body);
if (command) { // If the item received was a command, return the command, the item, and priority
const T = {
command: command,
item: task.item,
priority: task.priority,
time: D
}
console.log("Calling back with handleCommand(task)".bgMagenta.white);
promises.push(handleCommand(T));
}
} else if (task.item.title) { // Task was a submission
console.log("Calling back with handleSubmission".bgMagenta.black);
const T = {
item: task.item,
priority: task.priority,
time: D
}
promises.push(handleSubmission(T));
}
}
return Promise.all(promises);
}
Or maybe it's the way I'm calling queryTasks()?
/* [Snoolicious Run Cycle] */
const INTERVAL = (process.env.INTERVAL * 1000);
async function run() {
console.log("Running Test!!!".green);
await snoolicious.getMentions(2);
console.log("Size of the queue: ", snoolicious.tasks.size());
await snoolicious.queryTasks(handleCommand, handleSubmission);
console.log(`Finished querying tasks. Sleeping for ${INTERVAL/1000} seconds...`.rainbow);
setTimeout(() => {
return run()
}, (INTERVAL));
}
(async () => {
await run();
})();
The output:
Preparing new database...
SELECT count(*) FROM sqlite_master WHERE type='table' AND name='saved';
Preparing statements...
Running Test!!!
MentionBot --Assigning hte FIRST utc...
Size of the queue: 2
Snoolicious Querying Tasks!
Calling back with handleCommand(task)
Bot -- handling a command! { directive: 'positive', args: [] }
Test passed
getting the parent submission...
Calling back with handleCommand(task)
Bot -- handling a command! { directive: 'positive', args: [] }
Test passed
getting the parent submission...
Finished querying tasks. Sleeping for 30 seconds...
Got this parent: Comment {...
Handling commands and getting parent submission: (snoolicious.requester = snoowrap.requester)
async function handleCommand(task) {
let id = `${task.item.parent_id}${task.item.created_utc}${task.item.id}`;
const checkedId = await db.checkID(id);
if (task.item.subreddit.display_name === process.env.MASTER_SUB) {
try {
validateCommand(task.command);
const parent = await getParentSubmission(task.item);
console.log("Got this parent: ", parent);
console.log("Checking against this item: ", task.item);
await checkUserRatingSelf(task.item);
await checkTypePrefix(task.item);
} catch (err) {
await replyWithError(err.message);
}
} else {
console.log("id HAS been seen: id ", checkedId);
}
}
// Get a parent submission:
async function getParentSubmission(item) {
console.log("getting the parent submission...".magenta);
if (item.parent_id.startsWith('t3_')) {
const rep = item.parent_id.replace('t3_', '');
const parent = await snoolicious.requester.getSubmission(rep);
return parent;
} else if (item.parent_id.startsWith('t1_')) {
const rep = item.parent_id.replace('t1_', '');
const parent = await snoolicious.requester.getComment(rep);
return parent;
}
}
Here is a function to build db queries:
const buildDbQueries = async elements => elements.reduce(
async (acc, element) => {
// wait for the previous reducer iteration
const { firstDbQueries, secondDbQueries } = await acc
const asyncStuff = await someApi(element)
// leave if the API does not return anything
if (!asyncStuff) return { firstDbQueries, secondDbQueries }
// async db query, returns a Promise
const firstDbQuery = insertSomethingToDb({
id: asyncStuff.id,
name: asyncStuff.name
})
// another async db query, returns a Promise
// have to run after the first one
const secondDbQuery = insertAnotherthingToDb({
id: element.id,
name: element.name,
somethingId: asyncStuff.id
})
return {
firstDbQueries: [...firstDbQueries, firstDbQuery],
secondDbQueries: [...secondDbQueries, secondDbQuery]
}
},
// initial value of the accumulator is a resolved promise
Promise.resolve({
firstDbQueries: [],
secondDbQueries: []
})
)
This function returns arrays of promises; the intention is that the underlying queries should not run until they are explicitly awaited later.
Now we use that function
const myFunc = async elements => {
const { firstDbQueries, secondDbQueries } = await buildDbQueries(elements)
// we don't want any query to run before this point
await Promise.all(firstDbQueries)
console.log('Done with the first queries')
await Promise.all(secondDbQueries)
console.log('Done with the second queries')
}
The problems are:
the queries are executed as soon as they are created, before we ever call Promise.all;
the firstDbQueries do not finish before the secondDbQueries run, which causes errors.
EDIT
As suggested in a comment, I tried using a for … of loop instead of reduce.
const buildDbQueries = async elements => {
const firstDbQueries = []
const secondDbQueries = []
for (const element of elements) {
const asyncStuff = await someApi(element)
// leave if the API does not return anything
if (!asyncStuff) continue
// async db query, returns a Promise
const firstDbQuery = insertSomethingToDb({
id: asyncStuff.id,
name: asyncStuff.name
})
// another async db query, returns a Promise
// have to run after the first one
const secondDbQuery = insertAnotherthingToDb({
id: element.id,
name: element.name,
somethingId: asyncStuff.id
})
firstDbQueries.push(firstDbQuery)
secondDbQueries.push(secondDbQuery)
}
return { firstDbQueries, secondDbQueries }
}
This still produces the exact same problems as the previous version with reduce.
Don't use an async reducer. Especially not to build an array of promises. Or an array of things to run later. This is wrong on so many levels.
I guess you are looking for something like
function buildDbQueries(elements) {
return elements.map(element =>
async () => {
const asyncStuff = await someApi(element)
// leave if the api doesn't return anything
if (!asyncStuff) return;
await insertSomethingToDb({
id: asyncStuff.id,
name: asyncStuff.name
});
return () =>
insertAnotherthingToDb({
id: element.id,
name: element.name,
somethingId: asyncStuff.id
});
}
);
}
async function myFunc(elements) {
const firstQueries = buildDbQueries(elements)
// we don't want any query to run before this point
const secondQueries = await Promise.all(firstQueries.map(query => query()));
// this call actually runs the query ^^^^^^^
console.log('Done with the first queries');
// some entries are undefined when the API returned nothing, so filter those out;
// calling query() is what actually runs the second query
await Promise.all(secondQueries.filter(Boolean).map(query => query()));
console.log('Done with the second queries')
}