I have a Node.js AWS Lambda function created via the Serverless Framework, with multiple helper functions inside it. I am having an issue with one of them because it is async. The function runs and logs out all the parts I put comments next to, but it doesn't update callDuration. I think the code finishes in the wrong order because of the async calls. My goal is to return callDuration to my main function for further processing. How can I get all the code to run in the right order so I can meet that goal?
Here is the function:
const callAggregate = async (billingData, billingDB) => {
  const accountSid = process.env.TWILIO_ACCOUNT_SID
  const authToken = process.env.TWILIO_AUTH_TOKEN
  const client = require('twilio')(accountSid, authToken)
  // Setup model
  const Billing = billingDB.model('Billing')
  await Billing.findOne({_id: billingData._id}).exec().then(bill => {
    const callArray = bill.callSid
    console.log(bill) // This logs out
    let callDuration = 0
    for (const call of callArray) {
      console.log(call) // This logs out
      client.calls(call)
        .fetch()
        .then(callDetails => {
          console.log(callDetails) // This logs out
          callDuration += callDetails.duration
        })
    }
    console.log(`Billing for ${callDuration} minutes of voice calling for ${billingData._id}`) // This logs out
    Billing.findOneAndUpdate(
      {_id: billingData._id},
      { $inc: { call_duration: callDuration }, callSid: []},
      (err, doc) => {
        if (err) {
          console.log(err)
        }
      }
    )
    return callDuration
  })
}
This is a case of mixing promises with plain callbacks, and mixing await with .then(), both of which make proper flow control and error handling difficult.
Inside your async function, which uses await in some places, you also have a promise you never await (so it runs open-loop and nothing waits for it), and a database call that uses a plain callback rather than the promise interface, so nothing waits for that either.
More specifically, nothing is waiting for this:
client.calls(call).fetch()
So, because of not waiting for the .fetch() to finish, you were attempting to use the variable callDuration before the code was done modifying that variable (giving you the wrong value for it).
Similarly, nothing is waiting for Billing.findOneAndUpdate(...) to complete either.
A clean solution is to switch everything over to promises and await: use only the promise interface with your database (no plain callbacks) and convert the .then() handlers into await.
async function callAggregate(billingData, billingDB) {
  const accountSid = process.env.TWILIO_ACCOUNT_SID
  const authToken = process.env.TWILIO_AUTH_TOKEN
  const client = require('twilio')(accountSid, authToken)
  // Setup model
  const Billing = billingDB.model('Billing')
  let bill = await Billing.findOne({ _id: billingData._id }).exec();
  const callArray = bill.callSid
  console.log(bill)
  let callDuration = 0
  for (const call of callArray) {
    console.log(call)
    let callDetails = await client.calls(call).fetch();
    console.log(callDetails)
    // Twilio reports duration as a string of seconds, so coerce to a number
    // (plain += would concatenate strings)
    callDuration += Number(callDetails.duration)
  }
  console.log(`Billing for ${callDuration} minutes of voice calling for ${billingData._id}`)
  let doc = await Billing.findOneAndUpdate({ _id: billingData._id }, { $inc: { call_duration: callDuration }, callSid: [] }).exec();
  return callDuration
}
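If the individual Twilio lookups don't depend on each other, you can also fire them in parallel with Promise.all and sum afterwards. A sketch, assuming client.calls(sid).fetch() resolves to an object with a duration property, as in the code above:

```javascript
// Fetch every call record in parallel and sum the durations.
// Twilio reports `duration` as a string of seconds, so coerce with Number().
async function totalDuration(client, callSids) {
  const details = await Promise.all(
    callSids.map(sid => client.calls(sid).fetch())
  );
  return details.reduce((sum, d) => sum + Number(d.duration), 0);
}
```

This trades the serial round-trips of the loop above for one batch of concurrent requests; with a very long list of call SIDs you may want to cap concurrency instead.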
Related
I am scraping a bunch of API's and saving the data to a dynamodb table.
Everything works absolutely fine when running serverless invoke local -f runAggregator locally.
However, after I set up the cron, I noticed things were not being saved to the Dynamodb table.
Here is my function:
module.exports.runAggregator = async (event) => {
  await runModules({ saveJobs: true });
  return {
    statusCode: 200,
    body: JSON.stringify(
      {
        message: "Aggregate",
        input: event,
      },
      null,
      2
    ),
  };
};
And the runModules function:
module.exports = async ({ saveJobs }) => {
  if (saveJobs) {
    const flushDb = await flushDynamoDbTable();
    console.log("Flushing Database: Complete");
    console.log(flushDb);
  }
  // pseudo code
  const allJobs = myLongArrayOfJobsFromApis
  const goodJobs = allJobs.filter((job) => {
    if (job.category) {
      if (!job.category.includes("Missing map for")) return job;
    }
  });
  // This runs absolutely fine locally...
  if (saveJobs) goodJobs.forEach(saveJob); // see below for function
  const badJobs = allJobs.filter((job) => {
    if (!job.category) return job; // no role found from API
    if (job.category.includes("Missing map for")) return job;
  });
  console.log("Total Jobs", allJobs.length);
  console.log("Good Jobs", goodJobs.length);
  console.log("Malformed Jobs", badJobs.length);
  return uniqBy(badJobs, "category");
};
saveJob function
// saveJob.js
module.exports = (job) => {
validateJob(job);
dynamoDb
.put({
TableName: tableName,
Item: job,
})
.promise();
};
I am baffled as to why this works fine locally but not when I run a 'test' in the Lambda console. I only found out because the table was empty after the cron had run.
saveJob performs an async operation (ddb.put().promise()) but you are neither awaiting its completion nor returning the promise.
Since the forEach in the runModules function doesn't await anything either, the function completes before the DynamoDB calls have finished (because of how promises vs. synchronous code work), and the process is killed once the Lambda's execution ends.
Locally you are not running lambda but something that looks like it. There are subtle differences, and what happens after the function is done is one of those differences. So it may work locally, but it won't on an actual lambda.
What you need to do is to make sure you await your call to dynamodb. Something like:
// saveJob.js
module.exports = (job) => {
  validateJob(job);
  return dynamoDb
    .put({
      TableName: tableName,
      Item: job,
    })
    .promise();
};
and in your main function:
...
if (saveJobs) await Promise.all(goodJobs.map(job => saveJob(job)))
// or with a promise lib such as Bluebird:
if (saveJobs) await Promise.map(goodJobs, job => saveJob(job))
// (or Promise.each(...) if you need this to happen in sequence and not in parallel)
Note: instead of calling dynamodb.put many times, you could/should call the batchWriteItem operation once (or at least fewer times); it can write up to 25 items per call, saving quite a bit of latency in the process.
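A sketch of that batching idea using the DocumentClient's batchWrite (the DocumentClient counterpart of batchWriteItem), assuming dynamoDb and tableName are the same as in the saveJob module above; a production version should also retry any UnprocessedItems returned by each call:

```javascript
// Split an array into chunks of `size` (batchWrite accepts at most 25 items).
const chunk = (arr, size) =>
  Array.from({ length: Math.ceil(arr.length / size) }, (_, i) =>
    arr.slice(i * size, i * size + size)
  );

// Write all jobs in batches of 25 instead of one put() per job.
async function saveJobsBatch(dynamoDb, tableName, jobs) {
  for (const batch of chunk(jobs, 25)) {
    await dynamoDb
      .batchWrite({
        RequestItems: {
          [tableName]: batch.map((job) => ({ PutRequest: { Item: job } })),
        },
      })
      .promise();
  }
}
```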
So I am creating a module with a class method (schedule) that is an async function awaiting its result.
//SCHEDULER.JS//
class Schedule {
  constructor(project_id, queue_id) {
    this.project_id = project_id;
    this.queue_id = queue_id;
  }

  //ASYNC METHOD 1
  schedule = async (date, rquest) => {
    const project = this.project_id;
    const queue = this.queue_id;
    const location = "us-central1";
    const url = rquest.url;
    const payload = rquest.body;
    // Construct the fully qualified queue name.
    const parent = client.queuePath(project, location, queue);
    const task = {
      httpRequest: {
        httpMethod: rquest.method,
        url,
        headers: rquest.headers,
      },
    };
    try {
      const request = { parent, task }; // a plain object - no need to await it
      const [response] = await client.createTask(request);
      console.log("<THIS IS THE PROJECT ID> :", response.name);
      return `${response.name}`;
    } catch (error) {
      console.log("we have an error amigo!", error);
    }
  };

  //ASYNC METHOD 2
  delete = async (one) => {
    return console.log("delete function", one);
  };
}
I imported my module in main.js and used the method. Once the result returns, I need to use it as a parameter to another method (delete) in the module I created (Scheduler.js).
//main.js//
const task_id = scheduler.schedule(date, request);
scheduler.delete(task_id);
task_id is a promise, and I can't execute scheduler.delete(task_id) because it is still a pending promise.
Important: How can I handle this promise properly as I am only tasked to create the module and not the main.js. The people who would create the main.js would just be expected to run my methods without handling promise returns.
TLDR
task_id is returning a promise
If it's a promise you can await it
//main.js//
async function main () {
  const task_id = await scheduler.schedule(date, request); // <--- THIS!
  scheduler.delete(task_id);
}
main();
Await & promises:
In fact, the await keyword only works on promises (you can await non-promises, but it is a no-op by design). That's the whole point of await: an alternative way to consume promises. Because of this, functions marked with the async keyword always return a promise.
Or if you prefer not to await then just use it as a promise:
//main.js//
scheduler.schedule(date, request)
  .then(task_id => scheduler.delete(task_id));
Alternatively, you can expose another function from your module that main.js calls; inside it, call your actual function and pass the resolved value along in the promise's then handler, so the consumer only ever invokes a single method.
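A sketch of that wrapper idea, with stubbed method bodies (the real ones would call the Cloud Tasks client as in Scheduler.js above). Note that the consumer still receives a promise; an async module cannot hand back a plain value, so main.js must still await or .then() at some point:

```javascript
class Scheduler {
  schedule = async (date, rquest) => {
    // ...create the Cloud Task here; stubbed for illustration
    return "task-123";
  };

  delete = async (taskId) => {
    // ...delete the task here; stubbed for illustration
    return `deleted ${taskId}`;
  };

  // Convenience wrapper: chains the two methods internally, so the
  // consumer calls one method and awaits one promise.
  scheduleAndDelete = async (date, rquest) => {
    const taskId = await this.schedule(date, rquest);
    return this.delete(taskId);
  };
}
```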
Summary of Objective
I need to implement an API queue in my Node.js backend for API calls. The API rate limit I need to adhere to is 1 request every 2 seconds, and it's a hard limit. I'm making my API call inside a forEach loop since I need to do one API call for each user.
I've found a lot of articles online about how to create a queue, but they mostly involve adding API calls to an array, so I'm not sure how to implement a queue in this situation.
Any help would be greatly appreciated, and I can share more code if it helps.
Code
async function refreshStats() {
  try {
    // get list of all fortnite users
    const fnUserList = await Users.find({}, "_id fnUserPlatform"); // my fnUser _id 5cca01ea8f52f40117b2ff51
    fnUserList.forEach(async fnUser => {
      // make API call. apiCall is a function I created to make the API call and format the response
      const { lifeStats, statsEqual } = await apiCall(
        fnUser.fnUserPlatform
      );
      // execute other functions with apiCall response
    });
  } catch (err) {
    console.error("error in refreshStats", err);
  }
}
If I understand correctly, you can take advantage of generator functions and combine them with setInterval: make a queue that hands out its items at the specified interval.
Create a generator function that makes an apiCall and then pauses:
async function* queueGenerator(userList) {
  for (let fnUser of userList) {
    const result = await apiCall(fnUser.fnUserPlatform); // { lifeStats, statsEqual }
    yield result;
  }
}
Then, in your method, create the queue and drain it with setInterval:
async function refreshStats() {
  try {
    // get list of all fortnite users
    let handle;
    const fnUserList = await Users.find({}, "_id fnUserPlatform"); // my fnUser _id 5cca01ea8f52f40117b2ff51
    const queue = queueGenerator(fnUserList);
    const results = [];
    handle = setInterval(async () => {
      const result = await queue.next();
      results.push(result.value);
      if (results.length === fnUserList.length) clearInterval(handle);
    }, 2000);
  } catch (err) {
    console.error("error in refreshStats", err);
  }
}
There is also another way: use setTimeout combined with promises, creating promises that resolve inside setTimeout with increasing delays.
async function refreshStatsV2() {
  const fnUserList = await Users.find({}, "_id fnUserPlatform");
  const promises = fnUserList.map((fnUser, ix) => (
    new Promise(resolve =>
      setTimeout(async () => {
        const result = await apiCall(fnUser.fnUserPlatform); // { lifeStats, statsEqual }
        resolve(result);
      }, ix * 2000) // delay every next item by 2 sec
    )
  ));
  const result = await Promise.all(promises); // wait for all
  console.log(result);
}
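Arguably the simplest variant is a plain for...of loop that awaits each call and then an awaited delay, which guarantees the 2-second spacing without stacking timers. A sketch under the same assumptions (apiCall as in the question; the delay is a parameter here only to make the sketch easy to test):

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Sequential queue: one request, then a pause, then the next request.
async function refreshStatsSequential(fnUserList, delayMs = 2000) {
  const results = [];
  for (const fnUser of fnUserList) {
    results.push(await apiCall(fnUser.fnUserPlatform)); // assumed helper from the question
    await sleep(delayMs); // hard limit: 1 request every 2 seconds
  }
  return results;
}
```

Unlike the setInterval version, a slow API response here simply pushes the next request back instead of letting calls overlap.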
async function getP() {
  var params = {
    Name: 'MY-NAME',
    WithDecryption: true
  };
  var request = await ssm.getParameter(params).promise();
  return request.Parameter.Value;
}

async function getParam() {
  var resp = await getP()
  console.log(resp)
}

getParam()
This is the code inside my Lambda function, which is currently not working, and I'm not sure why.
when I change it to:
const x = getParam()
console.log(x) // it says that this is pending
but I thought the async awaits would have resolved that, any ideas?
edited:
console.log('first') // never logged
const res = await ssm.getParameter(paramUsername).promise(); // paramUsername deffo exists in SSM
console.log(res, 'res') // never logged
console.log('second') // never logged
Rough answer: you have two options, and I need the output from either of them...
1)
function to(promise) {
  return promise.then((data) => {
    return [null, data]
  }).catch(err => [err])
}

// YOUR CODE AMENDED
console.log('first') // never logged
let [err, res] = await to(ssm.getParameter(paramUsername).promise());
if (err) {
  console.log(err)
  return
}
console.log(res, 'res') // never logged
console.log('second') // never logged
OR
2) Enclose that call in a try/catch like so:
try {
  console.log('first') // never logged
  const res = await ssm.getParameter(paramUsername).promise(); // paramUsername deffo exists in SSM
  console.log(res, 'res') // never logged
  console.log('second') // never logged
} catch (e) {
  console.log(e)
}
Let me know what the error is, I'm betting your lambda doesn't have permission to access SSM! Will update!
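If it does turn out to be a permissions issue, the Lambda's execution role needs a statement along these lines (a minimal sketch; scope Resource to your parameter's real ARN, and add kms:Decrypt on the relevant key if the parameter is a SecureString):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ssm:GetParameter"],
      "Resource": "arn:aws:ssm:*:*:parameter/MY-NAME"
    }
  ]
}
```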
Use as below:
import AWS from "aws-sdk";

const ssm = new AWS.SSM()

const params = (name) => {
  return {
    Name: name,
    WithDecryption: true,
  };
};

export const getParameter = async (key) => (await ssm.getParameter(params(key)).promise()).Parameter.Value;
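One refinement worth considering on top of that helper: cache the value at module scope so warm Lambda invocations skip the SSM round-trip entirely. A sketch of the pattern, with the fetch function passed in (in real code it would be the ssm.getParameter(...) call above):

```javascript
// Module-scope cache: survives between warm invocations of the same container.
const cache = new Map();

const getCachedParameter = async (key, fetchValue) => {
  if (!cache.has(key)) {
    cache.set(key, await fetchValue(key)); // only hit SSM on a cold start
  }
  return cache.get(key);
};
```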
Had the same issue - the only combo I found that worked was to do nothing after the awaits other than return the promise once resolved.
So if you changed your code to:
async function getP() {
  var params = {
    Name: 'MY-NAME',
    WithDecryption: true
  };
  // Do not do anything after the await, only the return
  var request = await ssm.getParameter(params).promise();
  return request.Parameter;
}

async function getParam() {
  // Do not do anything after the await, only the return
  var resp = await getP()
  return resp.Value;
}

// getParam() returns a promise, so consume it with .then()
getParam().then(val => console.log(val));
It should work inside Lambda. This seems very quirky: running my version of the original code from the command line or in a debugger worked fine. It was only inside a Lambda container (on AWS or in Docker) that it didn't resolve and simply dropped out, with no error thrown to be caught.
There are a few threads related to this topic (see below) so I hope this helps.
https://github.com/aws/aws-sdk-js/issues/2245
Accessing AWS SSM Parameters in NodeJS Lambas
Why is my async ssm request inside lambda not working?
https://forums.aws.amazon.com/thread.jspa?threadID=258408
I had the same problem, and the solution was to add an egress (outbound) rule to the Lambda's security group.
I'm getting a "deadline-exceeded" error on the frontend when calling a firebase callable cloud function (onCall).
I know that I have to return a promise so the function knows when to clean itself up, but it is still not working.
After 60 seconds, "deadline-exceeded" is thrown to the frontend, but the function keeps running on the server and finishes with success. All batch operations are written to Firestore.
10:37:14.782 AM
syncExchangeOperations
Function execution took 319445 ms, finished with status code: 200
10:36:57.323 AM
syncExchangeOperations
Function execution started
10:36:57.124 AM
syncExchangeOperations
Function execution took 170 ms, finished with status code: 204
10:36:56.955 AM
syncExchangeOperations
Function execution started
async function syncBinanceOperations(
  userId,
  userExchange,
  userExchangeLastOperations,
  systemExchange
) {
  try {
    const client = Binance({
      apiKey: userExchange.apiKey,
      apiSecret: userExchange.privateKey
    });
    const batch = admin.firestore().batch();
    const lastOperations = userExchangeLastOperations
      ? userExchangeLastOperations
      : false;
    const promises = [];
    promises.push(
      syncBinanceTrades(client, lastOperations, userId, systemExchange, batch)
    );
    promises.push(
      syncBinanceDeposits(client, lastOperations, userId, systemExchange, batch)
    );
    promises.push(
      syncBinanceWhitdraws(
        client,
        lastOperations,
        userId,
        systemExchange,
        batch
      )
    );
    promises.push(
      updateUserExchange(userId, userExchange.id, {
        lastSync: moment().format('x')
      })
    );
    await Promise.all(promises);
    return batch.commit();
  } catch (error) {
    return handleErrors(error);
  }
}
exports.syncExchangeOperations = functions.https.onCall(
  async (data, context) => {
    try {
      userAuthenthication(data.userId, context.auth);
      let user = await getUser(data.userId);
      if (!user.plan.benefits.syncExchanges) {
        throw 'Operação não autorizada para o plano contratado';
      }
      let userExchange = await getUserExchange(data.userId, data.exchangeId);
      let response = await Promise.all([
        getUserLastOperations(data.userId, userExchange.exchangeId),
        getSystemExchange(userExchange.exchangeId)
      ]);
      let userExchangeLastOperations = response[0];
      let systemExchange = response[1];
      switch (systemExchange.id) {
        case 'binance':
          return syncBinanceOperations(
            user.id,
            userExchange,
            userExchangeLastOperations,
            systemExchange
          );
      }
    } catch (error) {
      return handleErrors(error);
    }
  }
);
It works fine if I change this function to an HTTP request: it waits for the function to finish and then returns.
exports.syncExchangeOperations = functions
  .runWith(runtimeOpts)
  .https.onRequest((req, res) => {
    return cors(req, res, async () => {
      try {
        let auth = await admin.auth().verifyIdToken(req.get('Authorization').split('Bearer ')[1]);
        let userExchange = await getUserExchange(
          auth.uid,
          req.query.exchangeId
        );
        let response = await Promise.all([
          getUserLastOperations(auth.uid, userExchange.exchangeId),
          getSystemExchange(userExchange.exchangeId)
        ]);
        let userExchangeLastOperations = response[0];
        let systemExchange = response[1];
        switch (systemExchange.id) {
          case 'binance':
            await syncBinanceOperations(
              auth.uid,
              userExchange,
              userExchangeLastOperations,
              systemExchange
            );
        }
        res.status(200).send();
      } catch (error) {
        res.status(401).send(handleErrors(error));
      }
    });
  });
The "deadline-exceeded" you encountered is an error thrown by the Firebase JavaScript library on the client (not by the function itself). The Firebase docs are lacking documentation on how to use runWith() options on a callable function. For some reason, functions().httpsCallable() has a built-in timeout on the client side.
So if you use this on your Node.js function:
exports.testFunction = functions.runWith({ timeoutSeconds: 180 }).https.onCall(async (data, ctx) => {
// Your Function Code that takes more than 60second to run
});
You need to override the built-in JavaScript library timeout on the client like this:
let testFunction = firebase.functions().httpsCallable("testFunction", {timeout: 180000});
I don't know what the purpose of the built-in client timeout is; for me it has no purpose, since it doesn't even stop the execution of the function on the server. But it must be there for some internal reason.
Notice the Node.js timeoutSeconds is in seconds and the timeout option on the client library is in milliseconds.
"Deadline exceeded" means that the function invocation timed out from the perspective of the client. The default is 60 seconds.
Try increasing the timeout on both the client and function so that it has time to complete before the client timeout is reached. You can do this by specifying it in an HttpsCallableOptions object.
Also try returning something other than batch.commit(). Whatever the function returns will be serialized and sent to the client, which could cause problems. Instead, just await batch.commit() and then return something predictable, like a plain JavaScript object.
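That last suggestion, applied to the tail of syncBinanceOperations, would look roughly like this (a sketch; the batch argument stands in for a Firestore WriteBatch):

```javascript
// End of syncBinanceOperations, reworked: await the commit, then return a
// plain serializable object instead of the commit's WriteResult array.
async function commitAndReport(batch, promises) {
  await Promise.all(promises);
  await batch.commit();
  return { status: "ok", committedAt: Date.now() };
}
```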
See the API documentation for information on setting the timeout:
https://firebase.google.com/docs/reference/js/firebase.functions.Functions#https-callable
https://firebase.google.com/docs/reference/js/firebase.functions.HttpsCallableOptions.html#timeout