Sequentially loading files in JavaScript

I know there are other answers that are similar to this question, but I'm in a slightly different situation. Consider this block of code:
fileSelected = (e) => {
  const files = e.target.files;
  _.map(files, file => {
    reader.readAsDataURL(file);
    reader.onprogress = () => {...}
    reader.onerror = () => {...}
    reader.onload = () => {
      const resp = await uploadAttachment(file);
      // do something
    }
  });
}
This iterates asynchronously, but I want it to run sequentially: every new instance of FileReader should finish before moving on to the next file... I know it's not ideal, but I'm capping it at 10 files at a time.
I created a separate function that returns a new Promise and used the fileSelected function to loop through the files like so:
readFile = (file) => {
  return new Promise(() => {
    reader.readAsDataURL(file);
    reader.onprogress...
    reader.onerror...
    reader.onload...
    ...
  });
}
fileSelected = async (e) => {
  for (const file of files) {
    await readFile(file);
  }
}
It goes through the first file fine, but it doesn't move on to the next one. What could be the issue here? Why is it returning early?

Use the async keyword in order to use await (if you are not in a context that supports top-level await):
fileSelected = async (e) => {
  for (const file of files) {
    await readFile(file);
  }
}
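Note that readFile also has to actually settle the promise it returns, otherwise the first await will hang forever, which matches the symptom described. A minimal sketch of such a readFile (assuming one fresh FileReader per file):
readFile = (file) => {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onerror = () => reject(reader.error);
    // resolve only once the whole file has been read
    reader.onload = () => resolve(reader.result);
    reader.readAsDataURL(file);
  });
}
The await uploadAttachment(file) call can then live inside the for...of loop, after each file has been read.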

Related

how to access array in async context

I have this function:
const list = [];
(async () => {
  await fs.readdir(JSON_DIR, async (err, files) => {
    await files.forEach(async filename => {
      const readStream = fs.createReadStream(path.join("output/scheduled", filename));
      const parseStream = json.createParseStream();
      await parseStream.on('data', async (hostlist: HostInfo[]) => {
        hostlist.forEach(async host => {
          list.push(host);
        });
      });
      readStream.pipe(parseStream);
    });
  });
  // here list.length = 0
  console.log(list.length);
})();
The function reads from a directory of large JSON files; for each file it creates a stream that starts parsing the JSON, and the streams can all be working at the same time.
At the end of the function I need to save each host into the list, but when I check the list at the end, it is empty.
How can I save the content of host to a global variable so it is accessible at the end?
I thought of a solution: check when every file is finished reading using an end event.
But then, to access the list at the end, I would need another event that starts when all the other events are finished,
and that looks complicated.
I have been using the big-json library:
https://www.npmjs.com/package/big-json
You could use a counter to determine when all the streams have finished processing.
You can use readdirSync to perform the directory listing synchronously.
const list: HostInfo[] = [];
(() => {
  const files = fs.readdirSync(JSON_DIR);
  let streamFinished = 0;
  let streamCount = files.length;
  files.forEach((filename) => {
    const readStream = fs.createReadStream(
      path.join('output/scheduled', filename)
    );
    const parseStream = json.createParseStream();
    parseStream.on('error', (err) => {
      // Handle errors
    });
    parseStream.on('data', (hostlist: HostInfo[]) => {
      list.push(...hostlist);
    });
    parseStream.on('end', () => {
      streamFinished++;
      if (streamFinished === streamCount) {
        // End of all streams...
      }
      console.log(list.length);
    });
    readStream.pipe(parseStream);
  });
})();
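Alternatively, the counter can be replaced with promises: wrap each file's stream lifecycle in a promise and wait for all of them with Promise.all. A sketch, keeping the same big-json parse-stream events used above:
const list: HostInfo[] = [];

// Wrap one file's read-and-parse pipeline in a promise that settles
// when its parse stream ends or errors.
const parseFile = (filename: string) => {
  return new Promise<void>((resolve, reject) => {
    const readStream = fs.createReadStream(path.join('output/scheduled', filename));
    const parseStream = json.createParseStream();
    parseStream.on('data', (hostlist: HostInfo[]) => list.push(...hostlist));
    parseStream.on('error', reject);
    parseStream.on('end', resolve);
    readStream.pipe(parseStream);
  });
};

(async () => {
  const files = fs.readdirSync(JSON_DIR);
  // Wait for every stream to finish before reading the list.
  await Promise.all(files.map(parseFile));
  console.log(list.length);
})();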

chaining promises in functions

I have a small problem: how do I create a promise chain in a sensible way so that the makeZip function first adds all the necessary files, then creates the zip, and finally deletes the previously added files? (The makeZip function also has to return a promise.) In the example below I don't call deleteFile anywhere because I don't know exactly where to call it. When I tried to call it inside the addFile function, to delete each file immediately after adding it, for some unknown reason the console displayed the zip maked! log first and then file deleted.
const deleteFile = (file, result) => {
  new Promise((resolve, reject) => {
    fs.unlink(`./screenshots/${file}`, (err) => {
      if (err) return reject(err);
      console.log(`${file} deleted!`);
      return resolve();
    });
  });
};

const addFile = (file) => {
  new Promise((resolve, reject) => {
    try {
      zip.addLocalFile(`./screenshots/${file}`);
      console.log(`${file} added`);
      return resolve();
    } catch {
      return reject(new Error("failed to add file"));
    }
  });
};

const makeZip = () => {
  Promise.all(fs.readdirSync("./screenshots").map((file) => addFile(file)))
    .then(() => {
      return new Promise((resolve, reject) => {
        try {
          zip.writeZip(`./zip_files/supername.zip`);
          console.log("zip maked!");
          resolve();
        } catch {
          return reject(new Error("failed making zip"));
        }
      });
    })
    .catch((err) => console.log(err));
};
The main cause of this is that you are not returning the promises you instantiate in your function calls. I also have some suggestions that can improve your code's cleanliness.
[TIP]: Have you ever checked out the promisify function in the Node.js util package? It ships with Node and is very convenient for converting functions that take callbacks as arguments into promise-returning functions. I will demonstrate below anyhow.
// so I will work with one function because the problem resonates with the rest, so
// let us look at the add file function.
// so let us get the promisify function first
const promisify = require('util').promisify;

const addFile = (file) => {
  // if addLocalFile is async then you can just return it
  return zip.addLocalFile(`./screenshots/${file}`);
};

// okay, so here is the promisify example (realized it wasn't applicable in the
// function above)
const deleteFile = (file, result) => {
  // so we will return here. Because fs.unlink takes a callback as its second
  // argument, we can use promisify to convert it into a promise-returning
  // function.
  return promisify(fs.unlink)(`./screenshots/${file}`);
  // so from there you can do your error handling.
};
So now let us put it all together in your last function, that is, makeZip
const makeZip = () => {
  // good call on this, very interesting.
  return Promise.all(fs.readdirSync("./screenshots").map((file) => addFile(file)))
    .then(() => {
      return zip.writeZip(`./zip_files/supername.zip`);
    })
    .then(() => {
      // ... in here you can then unlink your files.
    })
    .catch((err) => console.log(err));
};
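For example, the second .then could reuse the promisified deleteFile from above, so the screenshots are removed only after the zip has been written. A sketch, assuming zip and fs are in scope as in the question:
const makeZip = () => {
  return Promise.all(fs.readdirSync("./screenshots").map((file) => addFile(file)))
    .then(() => zip.writeZip(`./zip_files/supername.zip`))
    .then(() => {
      // delete every screenshot only after the zip has been written
      return Promise.all(
        fs.readdirSync("./screenshots").map((file) => deleteFile(file))
      );
    })
    .catch((err) => console.log(err));
};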
Everything should be good with these suggestions, hope it works out...
Thank you all for the hints; the solution turned out to be much simpler: just use the synchronous fs.unlinkSync method instead of the asynchronous fs.unlink.
const deleteFile = (file) => {
  try {
    fs.unlinkSync(`./screenshots/${file}`);
    console.log(`${file} removed`);
  } catch (err) {
    console.error(err);
  }
};

const addFile = (file) => {
  try {
    zip.addLocalFile(`./screenshots/${file}`);
    console.log(`${file} added`);
    deleteFile(file);
  } catch (err) {
    console.error(err);
  }
};

const makeZip = () => {
  fs.readdirSync("./screenshots").map((file) => addFile(file));
  zip.writeZip(`./zip_files/supername.zip`);
  console.log("zip maked!");
};

Node.js how to synchronously read lines from stream.Readable

I'm interacting with a child process through stdio, and I need to wait for a line from childProcess.stdout each time I write some command to childProcess.stdin.
It's easy to wrap an asynchronous method for writing like below:
async function write(data) {
  return new Promise(resolve => {
    childProcess.stdin.write(data, () => resolve());
  });
}
However, it turns out quite difficult when it comes to reading, since data from stdout must be processed using listeners. I've tried below:
const LineReader = require("readline");
const reader = LineReader.createInterface(childProcess.stdout);

async function read() {
  return new Promise(resolve => {
    reader.once("line", line => resolve(line));
  });
}
But it always returns the first line.
I know I could achieve this using setInterval, and I've already implemented the functionality that way. But it obviously has an impact on performance, so now I'm trying to optimize it by wrapping it in an asynchronous method.
Any suggestions and solutions will be appreciated!
Well, I ended up with something pretty similar to what you were trying. It makes some assumptions that are mentioned in the code and needs more complete error handling:
const cp = require('child_process');
const readline = require('readline');

const child = cp.spawn("node", ["./echo.js"]);
child.on('error', err => {
  console.log(err);
}).on('exit', () => {
  console.log("child exited");
});

const reader = readline.createInterface({ input: child.stdout });

// this will miss line events that occurred before this is called
// so this only really works if you know the output comes one line at a time
function nextLine() {
  return new Promise(resolve => {
    reader.once('line', resolve);
  });
}

// this does not check for stdin that is full and wants us to wait
// for a drain event
function write(str) {
  return new Promise(resolve => {
    let ready = child.stdin.write(str, resolve);
    if (!ready) {
      console.log("stream isn't ready yet");
    }
  });
}

async function sendCmd(cmd) {
  // get line reader event handler installed so there's no race condition
  // on missing the return event
  let p = nextLine();
  // send the command
  await write(cmd);
  return p;
}

// send a sequence of commands and get their results
async function run() {
  let result1 = await sendCmd("hi\n");
  console.log(`Got '${result1}'`);
  let result2 = await sendCmd("goodbye\n");
  console.log(`Got '${result2}'`);
  let result3 = await sendCmd("exit\n");
  console.log(`Got '${result3}'`);
}

run().then(() => {
  console.log("done");
}).catch(err => {
  console.log(err);
});
And, for testing purposes, I ran it with this echo app:
process.stdin.on("data", data => {
  let str = data.toString();
  let ready = process.stdout.write("return: " + str, () => {
    if (str.startsWith("exit")) {
      process.exit();
    }
  });
  if (!ready) {
    console.log("echo wasn't ready");
  }
});

Different strings with readStream and promise

I use a function in my project:
function readStream(file) {
  console.log("starte lesen");
  const readStream = fs.createReadStream(file);
  readStream.setEncoding('utf8');
  return new Promise((resolve, reject) => {
    let data = "";
    readStream.on("data", chunk => data += chunk);
    readStream.on("end", () => { resolve(data); });
    readStream.on("error", error => reject(error));
  });
}
It reads an XML file with around 800 lines in it. If I add:
readStream.on("end", () => {console.log(data); resolve(data);});
then the XML data is complete. Everything is fine. But if I now call this readStream from another function:
const dpath = path.resolve(__basedir, 'tests/downloads', 'test.xml');
let xml = await readStream(dpath);
console.log(xml);
then the XML data is cut off. I think 800 lines is nothing big, so what could cause the data to be cut off here but not inside the function itself?
I have tried it the following way, and it seems to be working for me.
For a complete running example, clone the node-cheat xml-streamer and run node main.js.
xml-streamer.js:
const fs = require('fs');

module.exports.readStream = function (file) {
  console.log("read stream started");
  const readStream = fs.createReadStream(file);
  readStream.setEncoding('utf8');
  return new Promise((resolve, reject) => {
    let data = "";
    readStream.on("data", chunk => data += chunk);
    readStream.on("end", () => { console.log(data); resolve(data); });
    readStream.on("error", error => reject(error));
  });
}
main.js:
const path = require('path');
const _streamer = require('./xml-streamer');

async function main() {
  const xml = await _streamer.readStream(path.resolve(__dirname, 'files', 'test.xml'));
  console.log(xml);
}

main();
P.S. In the above-mentioned node-cheat, the test XML file has 1121 lines.
Sometimes sync + async code can hit a race condition when it's called in the same tick. Try using setImmediate(resolve, data) in your event handler, which will resolve on the next process tick.
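Applied to the end handler above, that would look like this (a one-line sketch):
readStream.on("end", () => setImmediate(resolve, data));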
Alternatively, if you're targeting node v12 or higher you can use the stream async iterator interface, which will be much cleaner for your code:
async function readStream(file) {
  console.log("starte lesen");
  const readStream = fs.createReadStream(file);
  readStream.setEncoding('utf8');
  let data = "";
  // accumulate chunks via the stream's async iterator
  for await (const chunk of readStream) {
    data += chunk;
  }
  return data;
}
If you happen to use a modern version of Node, there's fs.promises:
const { promises: fs } = require('fs')
;(async function main() {
  console.log(await fs.readFile('./input.txt', 'utf-8'));
})()

Functions are not waiting until they are resolved

I'm trying to execute functions one at a time, sequentially. Using promises, I believe this should work, but for me it does not. I've researched somewhat and found this question, and one of the answers explains to use promises, which is what I've been trying to do.
Here's the functions:
async function loadCommands () {
  return new Promise((resolve, reject) => {
    let commands = 0;
    readdir('./commands/', (error, files) => {
      if (error) reject(error);
      for (const file of files) {
        if (!file.endsWith('.js')) return;
        commands++;
      }
    });
    resolve(commands); // this is in my code, I forgot to put it - sorry commenters
  });
};
};
async function loadEvents () {
  return new Promise(async (resolve, reject) => {
    let events = 0;
    readdir('./events/', (error, files) => {
      if (error) reject(error);
      for (const file of files) {
        if (!file.endsWith('.js')) return;
        events++;
      }
    });
    resolve(events);
  });
};
I am then using await in an async function to try and make sure each function resolves before going on to the next one:
console.log('started');
const events = await loadEvents();
console.log(events);
console.log('load commands');
const commands = await loadCommands();
console.log(commands);
console.log('end')
In the console, this is logged (keep in mind, I have no files in ./events/ and I have one file in ./commands/):
start
0 // expected
load commands
0 // not expected, it's supposed to be 1
end
What am I doing wrong? I want these functions to run sequentially. I've also tried putting the bare code into the one async function instead of separate functions, but I still ran into the same issue.
You never resolve() the promise that you create in loadCommands, and you resolve() the promise that you create in loadEvents before the readdir callback happened.
Also, don't do any logic in non-promise callbacks. Use the new Promise constructor only to promisify, and call only resolve/reject in the async callback:
function readdirPromise(path) {
  return new Promise((resolve, reject) => {
    readdir(path, (err, files) => {
      if (err) reject(err);
      else resolve(files);
    });
  });
}
or simply
import { promisify } from 'util';
const readdirPromise = promisify(readdir);
Then you can use that promise in your actual logic function:
async function countJsFiles(path) {
  const files = await readdirPromise(path);
  let count = 0;
  for (const file of files) {
    if (file.endsWith('.js'))
      count++;
    // I don't think you really wanted to `return` otherwise
  }
  return count;
}
function loadCommands() {
  return countJsFiles('./commands/');
}

function loadEvents() {
  return countJsFiles('./events/');
}
You're trying to use await outside async. You can await a promise only inside an async function. The functions returning promises (here loadCommands & loadEvents) don't need to be async. Make an async wrapper function like run and put the await statements inside it, like this.
PS: You also need to resolve loadCommands with commands in the callback itself. The same goes for loadEvents. Also, remove the return and simply increment the variable when the condition is true.
function loadCommands() {
  return new Promise((resolve, reject) => {
    let commands = 0;
    readdir('./commands/', (error, files) => {
      if (error) reject(error);
      for (const file of files) {
        if (file.endsWith('.js')) commands++;
      }
      resolve(commands);
    });
  });
};
function loadEvents() {
  return new Promise((resolve, reject) => {
    let events = 0;
    readdir('./events/', (error, files) => {
      if (error) reject(error);
      for (const file of files) {
        if (file.endsWith('.js')) events++;
      }
      resolve(events);
    });
  });
};
async function run() {
  console.log('started');
  const events = await loadEvents();
  console.log(events);
  console.log('load commands');
  const commands = await loadCommands();
  console.log(commands);
  console.log('end');
}

run();
Hope this helps!
