How to await a recursive Promise in JavaScript

I have written a recursive Promise in JavaScript which seems to be working fine, but I wanted to test it using setTimeout() to be sure that I'm awaiting correctly before continuing with the execution. Here is the gist of my code:
try {
  await renameFiles(); // <-- await here
  console.log("do other stuff");
}
catch (err) {
}

const renameFiles = (dirPath) => {
  return new Promise(resolve => {
    console.log("Renaming files...");
    fs.readdirSync(dirPath).forEach(file => {
      // if file is a directory ...
      let newPath = path.join(dirPath, file);
      resolve( renameFiles(newPath) ); // <- recursion here!
      // else rename file ...
    });
    resolve();
  });
};
I've tested it with setTimeout() like this:
const renameFiles = () => {
  return new Promise(resolve => {
    setTimeout(() => {
      // all previous code goes here
    }, 2000);
  });
};
and the output is:
"Renaming files..."
"Renaming files..."
// bunch of renaming files...
"do other stuff"
"Renaming files..."
"Renaming files..."
So it looks like it awaits for a bit, but then execution continues at some point.
I also suspect I might be testing it wrong. Any idea where the problem may be?

As already mentioned, multiple resolve invocations don't make sense. However, that is not the only problem in the code: the root invocation gets resolved as soon as its recursive call starts for the first subdirectory. The rename.js script below processes directories in hierarchical order.
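As an aside, that single-settle behaviour is easy to verify in isolation; a promise simply ignores every resolve call after the first:

```javascript
// A promise settles exactly once: only the first resolve() counts,
// and later resolve()/reject() calls are silently ignored.
const p = new Promise(resolve => {
  resolve("first");
  resolve("second"); // ignored
});

p.then(value => console.log(value)); // logs "first"
```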
rename.js
const fs = require('fs');
const path = require('path');

const inputPath = path.resolve(process.argv[2]);
const newName = 'bar.txt';

async function renameFiles(filePath) {
  for (const file of fs.readdirSync(filePath)) {
    const newPath = path.join(filePath, file);
    const descriptor = fs.lstatSync(newPath);
    if (descriptor.isDirectory()) {
      await renameFiles(newPath)
    } else if (descriptor.isFile()) {
      await renameFile(file);
    }
  }
}

async function renameFile(file) {
  console.log(`Renaming ${file} to ${newName}`)
  return new Promise(resolve => {
    setTimeout(() => {
      console.log(`Renamed ${file} to ${newName}`)
      resolve();
    }, 300)
  });
}

async function main() {
  console.log(`Renaming all files in ${inputPath} to ${newName}`);
  await renameFiles(inputPath);
  console.log('Finished');
}

main();
You can run it like:
node rename.js relativeFolderName
Or, if order doesn't matter, you can use map and Promise.all as mentioned by @Tiago Coelho:
const renameFiles = async dirPath => {
  const renamePromises = fs.readdirSync(dirPath).map(file => {
    if (isDirectory(file)) {
      const newPath = path.join(dirPath, file);
      return renameFiles(newPath);
    } else {
      return renameFile(file);
    }
  });
  await Promise.all(renamePromises);
}

To make this work you need to wait for all the files in the directory to resolve, so you will need to do a Promise.all and use a map instead of a forEach. Something like this:
try {
  await renameFiles(); // <-- await here
  console.log("do other stuff");
}
catch (err) {
}

const renameFiles = (dirPath) => {
  return new Promise(resolve => {
    console.log("Renaming files...");
    const allFilesRenamePromises = fs.readdirSync(dirPath).map(file => {
      if (file.isDirectory()) {
        let newPath = path.join(dirPath, file);
        return renameFiles(newPath); // <- recursion here!
      } else {
        // rename file ...
      }
    });
    resolve(Promise.all(allFilesRenamePromises));
  });
};

Instead of writing one big complicated function, I'll suggest a more decomposed approach.
First we start with a files function that recursively lists all files at a specified path -
const { readdir, stat } =
require ("fs") .promises
const { join } =
require ("path")
const files = async (path = ".") =>
(await stat (path)) .isDirectory ()
? Promise
.all
( (await readdir (path))
.map (f => files (join (path, f)))
)
.then
( results =>
[] .concat (...results)
)
: [ path ]
We have a way to list all files now, but we only wish to rename some of them. We'll write a generic search function to find all files that match a query -
const { basename } =
require ("path")
const search = async (query, path = ".") =>
(await files (path))
.filter (x => basename (x) === query)
Now we can write your renameFiles function as a specialisation of search -
const { rename } =
require ("fs") .promises
const { dirname } =
require ("path")
const renameFiles = async (from = "", to = "", path = ".") =>
Promise
.all
( (await search (from, path))
.map
( f =>
rename
( f
, join (dirname (f), to)
)
)
)
To use it, we simply call renameFiles with its expected parameters -
renameFiles ("foo", "bar", "path/to/someFolder")
.then
( res => console .log ("%d files renamed", res.length)
, console.error
)
// 6 files renamed
Reviewing our programs above, we see some patterns emerging with our use of Promise.all, await, and map. Indeed these patterns can be extracted and our programs can be further simplified. Here's files and renameFiles revised to use a generic Parallel module -
const files = async (path = ".") =>
(await stat (path)) .isDirectory ()
? Parallel (readdir (path))
.flatMap (f => files (join (path, f)))
: [ path ]
const renameFiles = (from = "", to = "", path = "") =>
Parallel (search (from, path))
.map
( f =>
rename
( f
, join (dirname (f), to)
)
)
The Parallel module was originally derived in this related Q&A. For additional insight and explanation, please follow the link.

In my first answer I showed you how to solve your problem using mainly functional techniques. In this answer, we'll see modern JavaScript features like async iterables make this kind of thing even easier -
const files = async function* (path = ".")
{ if ((await stat (path)) .isDirectory ())
for (const f of await readdir (path))
yield* files (join (path, f))
else
yield path
}
const search = async function* (query, path = ".")
{ for await (const f of files (path))
if (query === basename (f))
yield f
}
const renameFiles = async (from = "", to = "", path = ".") =>
{ for await (const f of search (from, path))
await rename
( f
, join (dirname (f), to)
)
}
renameFiles ("foo", "bar", "path/to/someFolder")
.then (_ => console .log ("done"), console.error)
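The yield* delegation above is what flattens the recursion. The same pattern can be seen in a self-contained sketch that walks a nested plain object instead of a directory tree (leaves is a made-up name for this demo):

```javascript
// Recurse into objects, yield the path of every leaf value -
// structurally the same as `files` recursing into directories.
async function* leaves (node, path = "")
{ if (node !== null && typeof node === "object")
    for (const key of Object.keys (node))
      yield* leaves (node[key], path ? `${path}/${key}` : key)
  else
    yield path
}

(async () => {
  const out = []
  for await (const p of leaves ({ a: 1, dir: { b: 2, sub: { c: 3 } } }))
    out.push (p)
  console.log (out.join (",")) // "a,dir/b,dir/sub/c"
})()
```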

For completeness, I'll post the entire solution based on @udalmik's suggestion. The only difference is that I'm wrapping async function renameFile(file) around a Promise.
const fs = require('fs');
const path = require('path');

const inputPath = path.resolve(process.argv[2]);
const newName = 'bar.txt';

async function renameFiles(filePath) {
  for (const file of fs.readdirSync(filePath)) {
    const newPath = path.join(filePath, file);
    const descriptor = fs.lstatSync(newPath);
    if (descriptor.isDirectory()) {
      await renameFiles(newPath)
    } else if (descriptor.isFile()) {
      await renameFile(file);
    }
  }
}

async function renameFile(file) {
  return new Promise(resolve => {
    console.log(`Renaming ${file} to ${newName}`);
    resolve();
  })
}

async function main() {
  console.log(`Renaming all files in ${inputPath} to ${newName}`);
  await renameFiles(inputPath);
  console.log('Finished');
}

main();
The reason for using the Promise is that I want to wait for all the files to be renamed before continuing the execution (i.e. console.log('Finished');).
I've tested it using setTimeout:
return new Promise(resolve => {
  setTimeout(() => {
    console.log(`Renaming ${file} to ${newName}`);
    resolve(); // edited missing part - resolve inside the timeout
  }, 1000);
});
The solution took a different path from my original question but I guess it works for me.
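Worth noting: where resolve() sits relative to the setTimeout callback changes what await actually waits for. A minimal comparison (the names and delays here are invented for the demo):

```javascript
// resolve() outside the timer callback settles the promise immediately,
// so await continues before the "work" has happened.
const resolveOutside = () => new Promise(resolve => {
  setTimeout(() => console.log("renamed (too late)"), 50);
  resolve();
});

// resolve() inside the timer callback settles only after the work is done.
const resolveInside = () => new Promise(resolve => {
  setTimeout(() => {
    console.log("renamed");
    resolve();
  }, 50);
});

(async () => {
  await resolveOutside();
  console.log("continued without waiting");
  await resolveInside();
  console.log("continued after renaming");
})();
```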

Try changing the await code like this; it might help you.
try {
  const renameFilesPromise = renameFiles();
  renameFilesPromise.then(() => { // <-- then runs when the promise is resolved
    console.log("do other stuff");
  });
}
catch (err) {
}

const renameFiles = (dirPath) => {
  return new Promise(resolve => {
    console.log("Renaming files...");
    fs.readdirSync(dirPath).forEach(file => {
      // if file is a directory ...
      let newPath = path.join(dirPath, file);
      resolve( renameFiles(newPath) ); // <- recursion here!
      // else rename file ...
    });
    resolve();
  });
};

Related

How to access an array in an async context

I have this function:
const list = [];
(async () => {
  await fs.readdir(JSON_DIR, async (err, files) => {
    await files.forEach(async filename => {
      const readStream = fs.createReadStream(path.join("output/scheduled", filename));
      const parseStream = json.createParseStream();
      await parseStream.on('data', async (hostlist: HostInfo[]) => {
        hostlist.forEach(async host => {
          list.push(host);
        });
      });
      readStream.pipe(parseStream);
    });
  });
  // here list.length = 0
  console.log(list.length);
})();
The function reads from a directory of large JSON files; for each file it creates a stream that starts parsing the JSON, and the streams can be working at the same time.
At the end of the function I need to save each host variable in the list, but when I check the list at the end, it is empty.
How can I save the content of host to a global variable so it is accessible at the end?
I thought of checking when every file has finished reading using an end event, though to access the list at the end I would need another event that fires when all the other events are finished, and that looks complicated.
I have been using the big-json library:
https://www.npmjs.com/package/big-json
You could use a counter to determine when the streams have finished processing.
You can use readdirSync to execute the directory read synchronously.
const list: HostInfo[] = [];
(() => {
  const files = fs.readdirSync(JSON_DIR);
  let streamFinished = 0;
  let streamCount = files.length;
  files.forEach((filename) => {
    const readStream = fs.createReadStream(
      path.join('output/scheduled', filename)
    );
    const parseStream = json.createParseStream();
    parseStream.on('error', (err) => {
      // Handle errors
    });
    parseStream.on('data', (hostlist: HostInfo[]) => {
      list.push(...hostlist);
    });
    parseStream.on('end', () => {
      streamFinished++;
      if (streamFinished === streamCount) {
        // End of all streams...
      }
      console.log(list.length);
    });
    readStream.pipe(parseStream);
  });
})();

Promise Resolving before Google Cloud Bucket Upload

I am writing some code that loops over a CSV and creates a JSON file based on the CSV. Included in the JSON is an array named photos, which is to contain the returned urls for the images that are being uploaded to Google Cloud Storage within the function. However, having the promise wait for the uploads to finish has me stumped, since everything is running asynchronously, and finishes off the promise and the JSON compilation prior to finishing the bucket upload and returning the url. How can I make the promise resolve after the urls have been retrieved and added to currentJSON.photos?
const csv = require('csvtojson');
const fs = require('fs');
const {Storage} = require('@google-cloud/storage');
var serviceAccount = require("./my-firebase-storage-spot.json");

const testFolder = './Images/';
var csvFilePath = './Inventory.csv';
var dirArr = ['./Images/Subdirectory-A','./Images/Subdirectory-B','./Images/Subdirectory-C'];
var allData = [];

csv()
  .fromFile(csvFilePath)
  .subscribe((json) => {
    return new Promise((resolve, reject) => {
      for (var i in dirArr) {
        if (json['Name'] == dirArr[i]) {
          var currentJSON = {
            "photos": [],
          };
          fs.readdir(testFolder + json['Name'], (err, files) => {
            files.forEach(file => {
              if (file.match(/.(jpg|jpeg|png|gif)$/i)) {
                var imgName = testFolder + json['Name'] + '/' + file;
                bucket.upload(imgName, function (err, file) {
                  if (err) throw new Error(err);
                  // returned uploaded img address is found at file.metadata.mediaLink
                  currentJSON.photos.push(file.metadata.mediaLink);
                });
              } else {
                // do nothing
              }
            });
          });
          allData.push(currentJSON);
        }
      }
      resolve();
    })
  }, onError, onComplete);

function onError() {
  // console.log(err)
}
function onComplete() {
  console.log('finito');
}
I've tried moving the resolve() around, and also tried placing the uploader section into the onComplete() function (which created new promise-based issues).
Indeed, your code is not awaiting the asynchronous invocation of the readdir callback function, nor of the bucket.upload callback function.
Asynchronous coding becomes easier when you use the promise versions of these functions.
bucket.upload returns a promise when you omit the callback function, so that one is easy.
For readdir to return a promise, you need to use the fs Promises API: use fs = require('fs').promises instead of fs = require('fs'), and the promise-based readdir method lets you use promises throughout your code.
With that preparation, your code can be transformed into this:
const testFolder = './Images/';
var csvFilePath = './Inventory.csv';
var dirArr = ['./Images/Subdirectory-A','./Images/Subdirectory-B','./Images/Subdirectory-C'];

(async function () {
  let arr = await csv().fromFile(csvFilePath);
  arr = arr.filter(obj => dirArr.includes(obj.Name));
  let allData = await Promise.all(arr.map(async obj => {
    let files = await fs.readdir(testFolder + obj.Name);
    files = files.filter(file => file.match(/\.(jpg|jpeg|png|gif)$/i));
    let photos = await Promise.all(
      files.map(async file => {
        var imgName = testFolder + obj.Name + '/' + file;
        let result = await bucket.upload(imgName);
        return result.metadata.mediaLink;
      })
    );
    return {photos};
  }));
  console.log('finito', allData);
})().catch(err => { // <-- The above async function runs immediately and returns a promise
  console.log(err);
});
Some remarks:
There is a shortcoming in your regular expression. You intended to match a literal dot, but you did not escape it (fixed in above code).
allData will contain an array of { photos: [......] } objects, and I wonder why you would not want all photo elements to be part of one single array. However, I kept your logic, so the above will still produce them in these chunks. Possibly, you intended to have other properties (next to photos) as well, which would make it actually useful to have these separate objects.
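If you do want one flat array instead, the chunks are easy to merge afterwards (the data below is invented for illustration):

```javascript
// allData as produced above: one { photos: [...] } object per CSV row
const allData = [
  { photos: ['url1', 'url2'] },
  { photos: ['url3'] },
];

// Merge the per-row arrays into a single list of media links
const allPhotos = allData.flatMap(obj => obj.photos);
console.log(allPhotos); // ['url1', 'url2', 'url3']
```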
The problem is that your code is not waiting inside your forEach. I would highly recommend looking into streams and trying to do things in parallel as much as possible. There is one library which is very powerful and does that job for you: etl.
You can read rows from the CSV in parallel and process them in parallel rather than one by one.
I have tried to explain the lines in the code below. Hopefully it makes sense.
const etl = require("etl");
const fs = require("fs");

const csvFilePath = `${__dirname}/Inventory.csv`;
const testFolder = "./Images/";
const dirArr = [
  "./Images/Subdirectory-A",
  "./Images/Subdirectory-B",
  "./Images/Subdirectory-C"
];

fs.createReadStream(csvFilePath)
  .pipe(etl.csv()) // parse the csv file
  .pipe(etl.collect(10)) // this could be any value depending on how many you want to do in parallel.
  .pipe(etl.map(async items => {
    return Promise.all(items.map(async item => { // Iterate through 10 items
      const finalResult = await Promise.all(dirArr.filter(i => i === item.Name).map(async () => { // filter the matching one and iterate
        const files = await fs.promises.readdir(testFolder + item.Name); // read all files
        const filteredFiles = files.filter(file => file.match(/\.(jpg|jpeg|png|gif)$/i)); // filter out only images
        const result = await Promise.all(filteredFiles.map(async file => {
          const imgName = `${testFolder}${item.Name}/${file}`;
          const bucketUploadResult = await bucket.upload(imgName); // upload image
          return bucketUploadResult.metadata.mediaLink;
        }));
        return result; // This contains all the media links for matching files
      }));
      // eslint-disable-next-line no-console
      console.log(finalResult); // Arrays of media links for files
      return finalResult;
    }));
  }))
  .promise()
  .then(() => console.log("finished"))
  .catch(err => console.error(err));
Here's a way to do it where we extract some of the functionality into some separate helper methods, and trim down some of the code. I had to infer some of your requirements, but this seems to match up pretty closely with how I understood the intent of your original code:
const csv = require('csvtojson');
const fs = require('fs');
const {Storage} = require('@google-cloud/storage');
var serviceAccount = require("./my-firebase-storage-spot.json");

const testFolder = './Images/';
var csvFilePath = './Inventory.csv';
var dirArr = ['./Images/Subdirectory-A','./Images/Subdirectory-B','./Images/Subdirectory-C'];
var allData = [];

// Using nodejs 'path' module ensures more reliable construction of file paths than string manipulation:
const path = require('path');

// Helper function to convert bucket.upload into a Promise
// From other responses, it looks like if you just omit the callback then it will be a Promise
const bucketUpload_p = fileName => new Promise((resolve, reject) => {
  bucket.upload(fileName, function (err, file) {
    if (err) reject(err);
    resolve(file);
  });
});

// Helper function to convert readdir into a Promise
// Again, there are other APIs out there to do this, but this is a real simple solution too:
const readdir_p = dirName => new Promise((resolve, reject) => {
  fs.readdir(dirName, function (err, files) {
    if (err) reject(err);
    resolve(files);
  });
});

// Here we're expecting the string that we found in the "Name" property of our JSON from "subscribe".
// It should match one of the strings in `dirArr`, but this function's job ISN'T to check for that,
// we just trust that the code already found the right one.
const getImageFilesFromJson_p = async jsonName => {
  const filePath = path.join(testFolder, jsonName);
  const files = await readdir_p(filePath);
  return files.filter(fileName => fileName.match(/\.(jpg|jpeg|png|gif)$/i));
};

csv()
  .fromFile(csvFilePath)
  .subscribe(async json => {
    // Here we appear to be validating that the "Name" prop from the received JSON matches one of the paths that
    // we're expecting...? If that's the case, this is a slightly more semantic way to do it.
    const nameFromJson = dirArr.find(dirName => json['Name'] === dirName);
    // If we don't find that it matches one of our expecteds, we'll reject the promise.
    if (!nameFromJson) {
      // We can do whatever we want though in this case, I think it's maybe not necessarily an error:
      // return Promise.resolve([]);
      return Promise.reject('Did not receive a matching value in the Name property from \'.subscribe\'');
    }
    // We can use `await` here since `getImageFilesFromJson_p` returns a Promise
    const imageFiles = await getImageFilesFromJson_p(nameFromJson);
    // We're getting just the filenames; map them to build the full path
    const fullPathArray = imageFiles.map(fileName => path.join(testFolder, nameFromJson, fileName));
    // Here we Promise.all, using `.map` to convert the array of strings into an array of Promises;
    // if they all resolve, we'll get the array of file objects returned from each invocation of `bucket.upload`
    return Promise.all(fullPathArray.map(filePath => bucketUpload_p(filePath)))
      .then(fileResults => {
        // So, now we've finished our two asynchronous functions; now that that's done let's do all our data
        // manipulation and resolve this promise
        // Here we just extract the metadata property we want
        const fileResultsMediaLinks = fileResults.map(file => file.metadata.mediaLink);
        // Before we return anything, we'll add it to the global array in the format from the original code
        allData.push({ photos: fileResultsMediaLinks });
        // Returning this array, which is the `mediaLink` value from the metadata of each of the uploaded files.
        return fileResultsMediaLinks;
      });
  }, onError, onComplete);

How to await Function to finish before executing the next one?

// The below code already contains the suggestions from the answers and hence works :)
Within the below script I tried to fully execute the 'createDatabase' function before the .then call at the end starts to run. Unfortunately I couldn't figure out a solution to achieve just that.
In general the flow should be as follows:
GetFiles - fully execute it
CreateDatabase - fully execute it (while awaiting each .map call to finish before starting the next)
Exit the script within the .then call
Thanks a lot for any advice :)
const db = require("../database")
const fsp = require("fs").promises
const root = "./database/migrations/"

const getFiles = async () => {
  let fileNames = await fsp.readdir(root)
  return fileNames.map(fileName => Number(fileName.split(".")[0]))
}

const createDatabase = async fileNumbers => {
  fileNumbers.sort((a, b) => a - b)
  for (let fileNumber of fileNumbers) {
    const length = fileNumber.toString().length
    const x = require(`.${root}${fileNumber.toString()}.js`)
    await x.create()
  }
}

const run = async () => {
  let fileNumbers = await getFiles()
  await createDatabase(fileNumbers)
}

run()
  .then(() => {
    console.log("Database setup successfully!")
    db.end()
    process.exitCode = 0
  })
  .catch(err => {
    console.log("Error creating Database!", err)
  })
The x.create code looks as follows:
const dbQ = (query, message) => {
  return new Promise((resolve, reject) => {
    db.query(query, (err, result) => {
      if (err) {
        console.log(`Error: ${err.sqlMessage}`)
        return reject()
      }
      console.log(`Success: ${message}!`)
      return resolve()
    })
  })
}

x.create = async () => {
  const query = `
    CREATE TABLE IF NOT EXISTS Country (
      Code CHAR(2) UNIQUE NOT NULL,
      Flag VARCHAR(1024),
      Name_de VARCHAR(64) NOT NULL,
      Name_en VARCHAR(64) NOT NULL,
      Primary Key (Code)
    )`
  const result = await dbQ(query, "Created Table COUNTRY")
  return result
}
If you want each x.create to fully execute before the next one starts, i.e. this is what I interpret where you say while awaiting each .map call to finish before starting the next - then you could use async/await with a for loop as follows:
const createDatabase = async fileNumbers => {
  fileNumbers.sort((a, b) => a - b);
  for (let fileNumber of fileNumbers) {
    const x = require(`.${root}${fileNumber.toString()}.js`);
    await x.create();
  }
}
However, this also assumes that x.create() returns a Promise - as you've not shown what the typical content of a .${root}${fileNumber.toString()}.js file is, I'm only speculating.
The other interpretation of your question would simply require you to change createDatabase so that promises is actually an array of promises (at the moment, it's an array of undefined):
const createDatabase = async fileNumbers => {
  fileNumbers.sort((a, b) => a - b);
  const promises = fileNumbers.map(fileNumber => {
    const x = require(`.${root}${fileNumber.toString()}.js`);
    return x.create();
  })
  await Promise.all(promises);
}
Now all .create() calls run in "parallel", but createDatabase only resolves once all the promises returned by x.create() resolve - again, this assumes that x.create() returns a promise.
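The difference between the two variants can be illustrated with stub create functions (the names and delays below are invented for the demo):

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

// Each stub "migration" records its name when it finishes
const makeCreate = (name, ms, log) => async () => {
  await delay(ms);
  log.push(name);
};

(async () => {
  // Sequential (for...of + await): completes in insertion order
  const seq = [];
  for (const create of [makeCreate('a', 30, seq), makeCreate('b', 10, seq)])
    await create();
  console.log(seq.join(',')); // "a,b"

  // Parallel (map + Promise.all): completes in order of delay
  const par = [];
  await Promise.all([makeCreate('a', 30, par)(), makeCreate('b', 10, par)()]);
  console.log(par.join(',')); // "b,a"
})();
```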

fs.readdir recursive search with depth=1

I have to write a function that takes one parameter, i.e. a path to a directory, fetches the files from the given directory, and does the same again for the directories inside it. The whole search should be wrapped in a promise.
But the depth of the recursive search is 1.
The final array should look like: [file1, file2, file3, [file1inDir1, file2inDir1, Dir1inDir1, file3inDir1, Dir2inDir1], file4, file5]
My code is:
const fs = require("fs");
const path = require("path");

function checkfile(files){
  let result = [];
  for(let i=0 ; i<files.length ;i++){
    let newpath = path.join(__dirname,files[i]);
    fs.stat(newpath, (err,stats)=>{
      if(stats.isDirectory()){
        fs.readdir(newpath, (error,files)=>{result.push(files)})
      }
      else{result.push(files[i])}
    })
  }
  return result;
}

let test = (filepath) => {
  return new Promise((resolve, reject) =>{
    fs.readdir(filepath, (error,files) => {
      if (error) {
        reject("Error occured while reading directory");
      } else {
        resolve(checkfile(files));
      }
    });
  });
}

test(__dirname)
  .then(result => {
    console.log(result);
  })
  .catch(er => {
    console.log(er);
  });
When I run it I get the following output: []
How do I correct this?
test correctly returns a promise, but checkfile does not, thus all the async operations happen after the still-empty result array has been synchronously returned.
Fortunately NodeJS already provides utilities that return promises instead of taking callbacks (docs); with those, it is easy to write this code without the callbacks going wrong:
async function checkfile(files){
  const result = [];
  for(let i=0 ; i<files.length ;i++){
    let newpath = path.join(__dirname,files[i]);
    const stats = await fs.promises.stat(newpath);
    if(stats.isDirectory()){
      const files = await fs.promises.readdir(newpath);
      result.push(files);
    } else result.push(files[i]);
  }
  return result;
}

async function test(filepath) {
  const files = await fs.promises.readdir(filepath);
  return checkfile(files);
}

properly using async and await

The function below calls several asynchronous functions in a for loop. It's parsing different CSV files to build a single JavaScript object. I'd like to return the object after the for loop is done. It's returning the empty object right away while it does the asynchronous tasks. Makes sense; however, I have tried various Promise / async / await combinations in hopes of running something once the for loop has completed. I am clearly not understanding what is going on. Is there a better pattern to follow for something like this, or am I thinking about it incorrectly?
async function createFormConfig(files: string[]): Promise<object> {
  return new Promise(resolve => {
    const retConfig: any = {};
    for (const file of files) {
      file.match(matchFilesForFormConfigMap.get('FIELD')) ?
        parseCsv(file).then(parsedData => {
          retConfig.fields = parsedData.data;
        })
        : file.match(matchFilesForFormConfigMap.get('FORM'))
          ? parseCsv(file).then(parsedData => retConfig.formProperties = parsedData.data[0])
          : file.match(matchFilesForFormConfigMap.get('PDF'))
            ? parseCsv(file).then(parsedData => retConfig.jsPdfProperties = parsedData.data[0])
            : file.match(matchFilesForFormConfigMap.get('META'))
              ? parseCsv(file).then(parsedData => {
                  retConfig.name = parsedData.data[0].name;
                  retConfig.imgType = parsedData.data[0].imgType;
                  // console.log(retConfig); <- THIS CONSOLE WILL OUTPUT RETCONFIG LOOKING LIKE I WANT IT
                })
              : file.match(matchFilesForFormConfigMap.get('PAGES'))
                ? parseCsv(file).then(parsedData => retConfig.pages = parsedData.data)
                : console.log('there is an extra file: ' + file);
    }
    resolve(retConfig); // <- THIS RETURNS: {}
  });
}
This is the code I'm using to call the function in hopes of getting my 'retConfig' filled with the CSV data.
getFilesFromDirectory(`${clOptions.directory}/**/*.csv`)
  .then(async (files) => {
    const config = await createFormConfig(files);
    console.log(config);
  })
  .catch(err => console.error(err));
First, an async function returns a Promise, so you don't have to return one explicitly. Here is how you can simplify your code:
async function createFormConfig(files: string[]): Promise<object> {
  // return new Promise(resolve => { <-- remove
  const retConfig: any = {};
  // ...
  // The value returned by an async function is the one you get
  // in the callback passed to the function `.then`
  return retConfig;
  // }); <-- remove
}
Then, your function createFormConfig returns the config before it has finished computing it. Here is how you can have it computed before returning it:
async function createFormConfig(files: string[]): Promise<object> {
  const retConfig: any = {};
  // Return a Promise for each file that has to be parsed
  const parsingCsv = files.map(async file => {
    if (file.match(matchFilesForFormConfigMap.get('FIELD'))) {
      const { data } = await parseCsv(file);
      retConfig.fields = data;
    } else if (file.match(matchFilesForFormConfigMap.get('FORM'))) {
      const { data } = await parseCsv(file);
      retConfig.formProperties = data[0];
    } else if (file.match(matchFilesForFormConfigMap.get('PDF'))) {
      const { data } = await parseCsv(file);
      retConfig.jsPdfProperties = data[0];
    } else if (file.match(matchFilesForFormConfigMap.get('META'))) {
      const { data } = await parseCsv(file);
      retConfig.name = data[0].name;
      retConfig.imgType = data[0].imgType;
    } else if (file.match(matchFilesForFormConfigMap.get('PAGES'))) {
      const { data } = await parseCsv(file);
      retConfig.pages = data;
    } else {
      console.log('there is an extra file: ' + file);
    }
  });
  // Wait for the Promises to resolve
  await Promise.all(parsingCsv);
  return retConfig;
}
async functions already return promises; you don't need to wrap the code in a new one. Just return a value from the function and the caller will receive a promise that resolves to the returned value.
Also, you have made an async function, but you're not actually using await anywhere, so the for loop runs through the whole loop before any of your promises resolve. This is why none of the data is making it into your object.
It will really simplify your code to only use await and get rid of the then() calls. For example you can do this:
async function createFormConfig(files: string[]): Promise<object> {
  const retConfig: any = {};
  for (const file of files) {
    if (file.match(matchFilesForFormConfigMap.get('FIELD'))) {
      // no need for the then here
      let parsedData = await parseCsv(file);
      retConfig.fields = parsedData.data;
    }
    // ...etc
At the end you can just return the value:
return retConfig
