I have a Node app that uses Express for routing and the pdfmake npm package for generating a PDF document. On the click of a button, I make an HTTP request that retrieves data from a database, generates a PDF document, and saves it to disk. However, my async/await functions only seem to work before I create a write stream using fs.createWriteStream(path); all async/awaits after that seem to be ignored. Also, this only happens on the prod server. When debugging my app locally, ALL async/await functions work. Any ideas as to why this could be happening?
Express route:
router.patch('/:id(\\d+)/approve', async function (req, res) {
  try {
    let id = req.params.id
    const invoice = await db.fetchInvoiceById(id)
    const harvestInvoice = await harvest.getInvoiceById(invoice.harvest_id)
    // generate invoice pdf
    await pdf.generateInvoice(invoice, harvestInvoice)
    res.status(200).json({ id: id })
  } catch (error) {
    res.status(400).json({ error: 'something went wrong' })
  }
})
Functions:
async function SLEEP5() {
  await new Promise((resolve, reject) => {
    setTimeout(() => {
      resolve('DONE');
    }, 5000);
  });
}

function test(doc, invoicePath) {
  return new Promise((resolve, reject) => {
    const writeStream = fs.createWriteStream(invoicePath)
    writeStream.on("finish", () => { resolve(true) })
    writeStream.on("error", () => { reject(false) })
    doc.pipe(writeStream)
    doc.end()
  })
}
exports.generateInvoice = async function generateInvoice(invoice, harvestInvoice) {
  const invoicePath = `${__dirname}\\invoice_${invoice.number}.pdf`
  let printer = new PdfPrinter(fonts)
  let def = { /* pdf document definition here */ }
  // generate invoice PDF
  let doc = printer.createPdfKitDocument(def, {})
  await SLEEP5() // THIS IS AWAITED
  await test(doc, invoicePath)
  await SLEEP5() // THIS IS NOT AWAITED FOR SOME REASON
}
I am using PM2 to run this Node app on an AWS EC2 server, and I'm using version 0.2.4 of pdfmake.
I figured out what my issue was. It turns out that I was running my app with pm2 start appName --watch and writing the PDF to a directory within the app. PM2 detected a change when the PDF was being written and restarted the app (because of the --watch flag), causing all the issues I was seeing.
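If you want to keep --watch, pm2 can be told to ignore the directory the PDFs are written to, either with the --ignore-watch CLI flag or the ignore_watch option in an ecosystem file. A minimal sketch, assuming the PDFs land in an invoices/ folder (the app name, script, and folder names below are placeholders):
// ecosystem.config.js
module.exports = {
  apps: [{
    name: 'appName',
    script: './server.js',
    watch: true,
    // keep watching source files, but don't restart when PDFs are written here
    ignore_watch: ['invoices', 'node_modules'],
  }],
};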
I don't know what printer.createPdfKitDocument(def, {}) does exactly, but
let doc = printer.createPdfKitDocument(def, {})
await sleep(5)
await writeStreamToFile(doc, invoicePath)
certainly looks problematic. If doc is not paused at creation, it might run and finish while you're still sleeping, and then pipe nothing into your write stream, which would then never emit a finish or error event. So remove the await sleep(5): pipe doc into the write stream and attach the event listeners immediately.
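A sketch of that ordering, reusing the question's test helper (renamed writeStreamToFile here) and additionally rejecting if the document stream itself emits an error, which is my addition rather than something in the original code:
function writeStreamToFile(doc, invoicePath) {
  return new Promise((resolve, reject) => {
    const writeStream = fs.createWriteStream(invoicePath)
    writeStream.on('finish', () => resolve(true))
    writeStream.on('error', reject)
    doc.on('error', reject)   // surface errors from the PDF stream itself
    doc.pipe(writeStream)     // pipe right away, no sleeping in between
    doc.end()
  })
}

const doc = printer.createPdfKitDocument(def, {})
await writeStreamToFile(doc, invoicePath)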
If you insist on waiting, either do
let doc = printer.createPdfKitDocument(def, {})
await Promise.all([
  sleep(5),
  writeStreamToFile(doc, invoicePath),
])
or try
const doc = printer.createPdfKitDocument(def, {})
doc.pause()
await sleep(5)
await writeStreamToFile(doc, invoicePath)
(The other explanation, of course, would be that createPdfKitDocument creates a never-ending stream, errors without emitting an error event, or something similar that would leave the promise unresolved.)
Related
I have a Node.js AWS Lambda function created via the Serverless Framework, with multiple helper functions inside it. One of them is giving me trouble because it is async: it runs and logs out all the parts I've marked with comments, but it never updates callDuration. I think the async code is finishing in the wrong order. My goal is to return the callDuration to my main function for further processing. How can I get all of the code to run, and run in the right order, so I can do that?
Here is the function:
const callAggregate = async (billingData, billingDB) => {
  const accountSid = process.env.TWILIO_ACCOUNT_SID
  const authToken = process.env.TWILIO_AUTH_TOKEN
  const client = require('twilio')(accountSid, authToken)
  // Setup model
  const Billing = billingDB.model('Billing')
  await Billing.findOne({_id: billingData._id}).exec().then(bill => {
    const callArray = bill.callSid
    console.log(bill) // This logs out
    let callDuration = 0
    for (const call of callArray) {
      console.log(call) // This logs out
      client.calls(call)
        .fetch()
        .then(callDetails => {
          console.log(callDetails) // This logs out
          callDuration += callDetails.duration
        })
    }
    console.log(`Billing for ${callDuration} minutes of voice calling for ${billingData._id}`) // This logs out
    Billing.findOneAndUpdate(
      {_id: billingData._id},
      { $inc: { call_duration: callDuration }, callSid: []},
      (err, doc) => {
        if(err) {
          console.log(err)
        }
      }
    )
    return callDuration
  })
}
This is a case of mixing promises with plain callbacks and mixing await with .then(), both of which make proper flow control and error handling difficult.
Inside your function, which is async and uses await in some places, you also have a promise you are not awaiting (so it runs open loop and nothing waits for it), and you have a database call that uses a plain callback rather than the promise interface, so nothing waits for it either.
More specifically, nothing is waiting for this:
client.calls(call).fetch()
So, because nothing waits for the .fetch() calls to finish, you end up using callDuration before the code is done modifying it, which gives you the wrong value.
Similarly, nothing is waiting for Billing.findOneAndUpdate(...) to complete either.
A clean solution is to switch everything over to promises and await. This involves using only promises with your database (no plain callbacks) and converting the .then() handlers into await.
async function callAggregate(billingData, billingDB) {
  const accountSid = process.env.TWILIO_ACCOUNT_SID
  const authToken = process.env.TWILIO_AUTH_TOKEN
  const client = require('twilio')(accountSid, authToken)
  // Setup model
  const Billing = billingDB.model('Billing')
  let bill = await Billing.findOne({ _id: billingData._id }).exec();
  const callArray = bill.callSid
  console.log(bill) // This logs out
  let callDuration = 0
  for (const call of callArray) {
    console.log(call) // This logs out
    let callDetails = await client.calls(call).fetch();
    console.log(callDetails) // This logs out
    callDuration += callDetails.duration
  }
  console.log(`Billing for ${callDuration} minutes of voice calling for ${billingData._id}`) // This logs out
  let doc = await Billing.findOneAndUpdate({ _id: billingData._id }, { $inc: { call_duration: callDuration }, callSid: [] }).exec();
  return callDuration
}
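If the per-call lookups don't have to happen one at a time, a parallel variant works as well. A sketch using the same client and callArray as above; the Number() conversion is there in case Twilio returns duration as a string:
const details = await Promise.all(callArray.map(sid => client.calls(sid).fetch()));
const callDuration = details.reduce((sum, d) => sum + Number(d.duration), 0);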
I'm a beginner in JavaScript / Node.js.
I want to read a folder and then, file by file, call a function with await.
I have a folder, and inside that folder I have images.
With readdir I get all of the folder's images, with their extensions.
With the code below I read the folder and split the image name from the .png, so I have only the image name without the extension.
I don't know if there is a better solution.
fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    const split = file.split('.');
    // ... use split[0] here
  });
});
If I then add this call inside the readdir:
storeAsset();
Code from storeAsset
async function storeAsset() {
  const client = new NFTStorage({ token: NFT_STORAGE_API_KEY })
  const metadata = await client.store({
    name: 'ExampleNFT',
    description: 'My ExampleNFT is an awesome artwork!',
    image: new File(
      [await fs.promises.readFile('assets/Test.png')],
      'MyExampleNFT.png',
      { type: 'image/png' }
    ),
  })
  console.log("Metadata stored on Filecoin and IPFS with URL:", metadata.url)
}
The storeAsset() function itself runs without problems, but the loop doesn't wait for it to finish before moving on to the next file.
If I add await so that the loop waits for each file to finish:
await storeAsset();
I get this message:
await storeAsset();
^^^^^
SyntaxError: Unexpected reserved word
    at ESMLoader.moduleStrategy (node:internal/modules/esm/translators:115:18)
    at ESMLoader.moduleProvider (node:internal/modules/esm/loader:289:14)
    at async link (node:internal/modules/esm/module_job:70:21)
So, how can I modify the readdir function with await storeAsset so that every call waits for the previous one to finish before the next one starts?
Thanks for helping.
Edit:
I have this now:
fs.readdir(testFolder, (err, files) => {
  files.forEach(async file => {
    const split = file.split('.');
    //
    await storeAsset();
    //
    console.log(split[0]);
    // process.exit(1);
  });
});
but it's still not quite right:
Metadata stored on Filecoin and IPFS with URL: ipfs://xxx/metadata.json
4
Metadata stored on Filecoin and IPFS with URL: ipfs://xxx/metadata.json
5
Metadata stored on Filecoin and IPFS with URL: ipfs://xxx/metadata.json
1
Metadata stored on Filecoin and IPFS with URL: ipfs://xxx/metadata.json
3
Metadata stored on Filecoin and IPFS with URL: ipfs://xxx/metadata.json
2
I need: image 1 > upload and wait for the response, then image 2, and so on. The uploads shouldn't all start at the same time, because then they finish in a mixed order instead of 1, 2, 3, 4, 5...
To make each iteration wait for the previous one when that previous iteration is doing something asynchronous, use a simple for loop with await.
Array methods like forEach have no mechanism to wait for an async operation to finish before moving on to the next iteration.
Finally, the reason you are getting the SyntaxError: Unexpected reserved word is that the callback passed to forEach must itself be an async function for await to be allowed inside it, though making it async won't solve the ordering problem you're trying to fix here.
const run = async () => {
  for (const x of list) {
    await doWork(x) // some async operation per item
  }
}
run()
This is a generic example to show how asynchronous code can be used along with readdir and a loop
const fs = require("fs")

function pr(f) {
  return new Promise((resolve, _reject) => {
    setTimeout(function () {
      resolve("got file " + f)
    }, 500)
  })
}

const pth = __dirname; // whatever directory path with something in it...

fs.readdir(pth, async (err, files) => {
  // Don't use the forEach syntax here: it doesn't support asynchronous code.
  // It won't throw an error, but the promises won't be awaited inside the loop.
  for (const f of files) {
    const msg = await pr(f);
    console.log(msg)
  }
})
And here is an example that should correspond to your case (I didn't fully understand where you call storeAsset and how you use the files iterator, but it should illustrate the point):
fs.readdir(testFolder, async (err, files) => {
  // ...
  for (const file of files) {
    await storeAsset();
  }
  // ...
});
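A slightly fuller sketch of the same sequential flow using the promise-based fs API. Note that storeAsset in the question takes no arguments and always uploads assets/Test.png, so passing the current file's path in, as below, is an assumption about what you ultimately want:
const fs = require('fs');
const path = require('path');

async function processFolder(testFolder) {
  const files = await fs.promises.readdir(testFolder); // promise-based readdir, no callback needed
  for (const file of files) {
    const name = file.split('.')[0]; // image name without the extension
    // assumes storeAsset is changed to accept the path of the file it should upload
    await storeAsset(path.join(testFolder, file));
    console.log('uploaded', name);
  }
}

processFolder(testFolder).catch(console.error);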
I am scraping a bunch of API's and saving the data to a dynamodb table.
Everything works absolutely fine when running serverless invoke local -f runAggregator locally.
However, after I set up the cron, I noticed things were not being saved to the Dynamodb table.
Here is my function:
module.exports.runAggregator = async (event) => {
  await runModules({ saveJobs: true });
  return {
    statusCode: 200,
    body: JSON.stringify(
      {
        message: "Aggregate",
        input: event,
      },
      null,
      2
    ),
  };
};
And the runModules function:
module.exports = async ({ saveJobs }) => {
  if (saveJobs) {
    const flushDb = await flushDynamoDbTable();
    console.log("Flushing Database: Complete");
    console.log(flushDb);
  }

  // pseudo code
  const allJobs = myLongArrayOfJobsFromApis

  const goodJobs = allJobs.filter((job) => {
    if (job.category) {
      if (!job.category.includes("Missing map for")) return job;
    }
  });

  // This runs absolutely fine locally...
  if (saveJobs) goodJobs.forEach(saveJob); // see below for function

  const badJobs = allJobs.filter((job) => {
    if (!job.category) return job; // no role found from API
    if (job.category.includes("Missing map for")) return job;
  });

  console.log("Total Jobs", allJobs.length);
  console.log("Good Jobs", goodJobs.length);
  console.log("Malformed Jobs", badJobs.length);

  return uniqBy(badJobs, "category");
};
saveJob function
// saveJob.js
module.exports = (job) => {
validateJob(job);
dynamoDb
.put({
TableName: tableName,
Item: job,
})
.promise();
};
I am baffled as to why this works fine locally but not when I run a test in the Lambda console. I only found out because the table was empty after the cron had run.
saveJob performs an async operation (ddb.put().promise()) but you are neither awaiting its completion nor returning the promise.
As the forEach in the runModules function also doesn't await anything, the function completes before the calls to DynamoDB have finished (synchronous code runs to completion before pending promises settle), and the process is frozen or killed as soon as the Lambda handler returns.
Locally you are not running lambda but something that looks like it. There are subtle differences, and what happens after the function is done is one of those differences. So it may work locally, but it won't on an actual lambda.
What you need to do is to make sure you await your call to dynamodb. Something like:
// saveJob.js
module.exports = (job) => {
validateJob(job);
return dynamoDb
.put({
TableName: tableName,
Item: job,
})
.promise();
};
and in your main function:
...
if (saveJobs) await Promise.all(goodJobs.map(job => saveJob(job)))
// or with a Promise lib such as bluebird:
if (saveJobs) await Promise.map(goodJobs, job => saveJob(job))
// (or Promise.each(...) if you need to make sure this happens in sequence and not in parallel)
Note: instead of calling dynamodb.put many times, you could/should use the batchWriteItem operation once (or at least fewer times); it can write up to 25 items in one call, saving quite a bit of latency in the process.
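A sketch of that, assuming the same DocumentClient instance (dynamoDb) and tableName as in saveJob.js; on the AWS SDK v2 DocumentClient the batch operation is exposed as batchWrite:
// batchSaveJobs.js - write jobs in chunks of 25 (the BatchWriteItem limit)
module.exports = async (jobs) => {
  for (let i = 0; i < jobs.length; i += 25) {
    const chunk = jobs.slice(i, i + 25);
    await dynamoDb
      .batchWrite({
        RequestItems: {
          [tableName]: chunk.map((job) => ({ PutRequest: { Item: job } })),
        },
      })
      .promise();
    // a production version should also retry anything returned in UnprocessedItems
  }
};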
Background
I am returning data from AWS Secrets Manager and using the aws-sdk to do so. Earlier I asked a question about how to correctly return the data and export it since the exported object never had the data resolved by the time the export was imported somewhere else. This caused me to get a bunch of undefined.
After solving that problem, it was determined that the way to handle this was to wrap the aws-sdk function in a promise, then await that promise in another file with async/await. This approach is now causing me other issues.
Example
If I request and return the data from AWS like this,
let secrets = {
  jwtHash: 10,
};

const client = new AWS.SecretsManager({
  region: region
});

const promise = new Promise((resolve, reject) => {
  client.getSecretValue({ SecretId: secretName }, async (err, data) => {
    if (err) {
      reject(err);
    } else {
      const res = await JSON.parse(data.SecretString);
      secrets.dbUsername = res.username;
      secrets.dbPassword = res.password;
      secrets.dbHost = res.host;
      secrets.dbPort = res.port;
      secrets.dbDatabase = res.dbname;
      resolve(secrets);
    }
  });
});

module.exports = promise;
Then I can import it in another file and use the data like this,
const promise = require('../secrets');

(async () => {
  const secrets = await promise;
  // use secrets here
})();
Now let's say in that file where I am trying to use secrets I have something like this,
const pool = new Pool({
  user: secrets.dbUsername,
  host: secrets.dbHost,
  database: secrets.dbDatabase,
  password: secrets.dbPassword,
  port: secrets.dbPort
});

pool.on('error', err => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});

module.exports = pool;
If I wrap the pool function in an async self-invoking function, I have trouble exporting it so it can be used anywhere in my app when I need a database connection. Similarly, I have many functions throughout my application that need access to the secret data. If I were to walk through the application wrapping all of my code in async functions, it would continue to cause more of these difficulties.
Question
It seems to me the best solution here would be to return the data asynchronously and once it has resolved, export it synchronously.
How can I achieve such a task in this scenario?
A win here would be,
Make the request in /secrets/index.js
Build the secrets object in the same file
Export secrets as an object that can easily be imported anywhere else in my application without the need for asynchronous functions.
Example of How I Would Like to Use This
const secrets = require('../secrets');

const pool = new Pool({
  user: secrets.dbUsername,
  host: secrets.dbHost,
  database: secrets.dbDatabase,
  password: secrets.dbPassword,
  port: secrets.dbPort
});

pool.on('error', err => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});

module.exports = pool;
Because the needed data is fetched asynchronously, there's no way around making everything that depends on it (somehow) asynchronous as well. With asynchronicity involved, it's usually better to export functions that can be called on demand, rather than exporting objects:
an object that depends on the asynchronous data can't be meaningfully exported before the data comes back
if you export functions rather than objects, you can ensure that control flow starts from your single entry point and heads downstream, rather than every module initializing itself at once (which can be problematic when some modules depend on others to be initialized properly, as you're seeing)
On another note, if you have a single Promise that needs to resolve, it's probably easier to call .then on it than to use an async function. For example, rather than
const promise = require('../secrets');

(async () => {
  // try/catch is needed to handle rejected promises when using await:
  try {
    const secrets = await promise;
    // use secrets here
  } catch (e) {
    // handle errors
  }
})();
you might consider:
const promise = require('../secrets');

promise
  .then((secrets) => {
    // use secrets here
  })
  .catch((err) => {
    // handle errors
  });
It's less wordy and probably easier to make sense of at a glance than an async IIFE. IMO, the place to use await is when you have multiple Promises that need to resolve, and chaining .thens and returned Promises together gets too ugly.
A module that depends on secrets has to have something in its code that effectively waits for secrets to be populated. Although being able to use your const secrets = require('../secrets'); in your lower code example would be nice, it just isn't possible like that. You can export a function that takes secrets as a parameter rather than requiring it, and then (synchronously!) return the instantiated pool:
// note, secrets is *not* imported
function makePool(secrets) {
  const pool = new Pool({
    user: secrets.dbUsername,
    host: secrets.dbHost,
    database: secrets.dbDatabase,
    password: secrets.dbPassword,
    port: secrets.dbPort
  });
  pool.on('error', err => {
    console.error('Unexpected error on idle client', err);
    process.exit(-1);
  });
  return pool;
}
module.exports = makePool;
Then, to use it in another module, once the secrets are created, call makePool with the secrets, and then use / pass around the returned pool:
const secretsProm = require('../secrets');
const makePool = require('./makePool');

secretsProm.then((secrets) => {
  const pool = makePool(secrets);
  doSomethingWithPool(pool);
})
.catch((err) => {
  // handle errors
});
Note that the doSomethingWithPool function can be completely synchronous, as is makePool. The asynchronous nature of secrets, once handled with .then in one module, does not have to be dealt with asynchronously anywhere else, as long as other modules export functions rather than objects.
I would suggest doing everything in one file, and then instead of exporting the object you create, export a function that returns the object. The function will always have access to the most up-to-date version of the object, and you can call it from any file to access the same object.
Example:
Create two files in a folder. In the first file, we will do this:
Define a value.
Set a timeout to change the value after some time
Export the value itself
Export a function that returns the value
values.js
let x = 0; // set initial value

setTimeout(() => { x = 5; }, 2000); // sometime later, value will change

const getValueOfX = () => { return x; };

module.exports = {
  x: x,
  getValueOfX: getValueOfX
};
Now in the other file, we just import the two exports from the previous file (we put them both in an object for easy exporting). We can then log them out, wait for some time to pass, and log them out again.
index.js
let values = require('./values');
console.log(`Single value test. x = ${values.x}`);
console.log(`Function return value test. x = ${values.getValueOfX()}`);
setTimeout(() => { console.log(`Single value test. x = ${values.x}`); }, 4000);
setTimeout(() => { console.log(`Function return value test. x = ${values.getValueOfX()}`); }, 4000);
To run the code, just open your Terminal or Command Prompt and, from the same directory as these two files, run node index.js
You'll see that when just the value (object, array, whatever) is exported, it is exported as-is at the moment the export runs, which is almost always before the API call is finished.
BUT if you export a function that returns the value, then that function will retrieve the up-to-date version of the value at the time it is called! Great for API calls!
So your code might look like this:
let secrets = { jwtHash: 10 };

const client = new AWS.SecretsManager({
  region: region
});

let pool = null;

client.getSecretValue({ SecretId: secretName }, async (err, data) => {
  if (err) {
    console.error(err); // no surrounding promise to reject here, so just report the error
  } else {
    const res = await JSON.parse(data.SecretString);
    pool = new Pool({
      user: res.username,
      host: res.host,
      database: res.dbname,
      password: res.password,
      port: res.port
    });
    pool.on('error', err => {
      console.error('Unexpected error on idle client', err);
      process.exit(-1);
    });
  }
});

module.exports = function () { return pool; };
One thing I do (especially when working with a large application that imports static variables that have been moved to a database) is to load that file via a function that populates an export.
// config.js
const exports = {};

export async function populate() {
  const RUNTIMEVARS = await what_you_do_to_get_vars();
  for (const config of RUNTIMEVARS) {
    exports[config.key] = config.data;
  }
  // for anything needing the config in the bootstrap.
  return exports;
}

export default exports;
Then in the bootstrap:
// bootstrap.js
import './database-connection.js'; // important to have no internal dependencies.

(async () => {
  const { populate } = await import('./config.js');
  await populate();
  import('./application/index.js');
})()
Now any file inside your application can import config from '../config.js' as though it were statically declared, since we populated the object via the populate function during the bootstrap.
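For example, a consumer module inside the application could then read the values synchronously (a sketch; someKey is a placeholder for whatever keys your RUNTIMEVARS actually contain):
// application/some-feature.js - only ever imported after bootstrap.js has run populate()
import config from '../config.js';

console.log(config.someKey); // already populated, no await needed here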
I'd like to test my koa API routes using supertest and check what's in DynamoDB before and after to make sure that the end point did what was intended.
// app related
const pool = require('../../src/common/pool');
const app = require('../../server');

// for testing
const uuid = require('uuid');
const supertest = require('supertest');

// listen on port 4002
const request = supertest.agent(app.listen(4002));

describe('test', () => {
  it.only('should', async (done) => {
    debugger;
    const id = uuid.v4().replace(/-/g, '');
    await pool.add(id, 'data', 30);
    return request
      .get('/api/1')
      .expect(204)
      // .then(async (res) => {
      // .then((res) => {
      .end((res) => {
        // still returns 'data' instead of 'dataNew' after the route is hit
        const record = await pool.get(id);
        debugger;
        done();
      });
  });
});
In the code above, I'm creating a record in the db, then hitting the end point, and I tried both a then() and an end() chained function to check the db once again. The end point just changes data to dataNew, but in the then() function it still returns the original data.
Any ideas on how I can verify the new record in the db?
References:
Supertest: Verify database after request - In the TLDR at the bottom, the solution was to use co. I tried this and had issues, probably because I'm using await instead of generators.
The above was fixed by chaining the supertest request onto pool.add() (which returns a promise) and then awaiting the record to verify it. Sometimes it still fetches the record too quickly, because the pool.update() method is not awaited within the end point that the request hits.
describe('test', () => {
  it.only('should', async () => {
    const id = uuid.v4().replace(/-/g, '');
    await pool.add(id, 'data', 30).then(() => {
      return request
        .get('/api/1')
        .expect(204)
        // check custom headers
        //.expect('pool-before', 'data')
        //.expect('pool-after', 'dataModified')
        .then(async (res) => {
          const record = await pool.get(id);
          debugger;
          expect('dataModified').to.equal(record.fields.S);
        });
    });
  });
});
The only other ways I can think of are to check the value via a custom header, to add a delay, or to use a mock.
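For reference, a sketch of the custom-header idea: a hypothetical koa route (the path, pool methods, and record shape are guesses based on the test above) that awaits the update and exposes the before/after values in response headers, so the commented-out .expect('pool-before', ...) / .expect('pool-after', ...) assertions could work:
// hypothetical route, only to illustrate the header approach
router.get('/api/:id', async (ctx) => {
  const before = await pool.get(ctx.params.id);
  await pool.update(ctx.params.id, 'dataModified'); // awaited, so the write finishes before we respond
  const after = await pool.get(ctx.params.id);
  ctx.set('pool-before', before.fields.S);
  ctx.set('pool-after', after.fields.S);
  ctx.status = 204;
});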
Let me know if anyone has a better solution.