Adding "await" skips all lines after [duplicate] - javascript

I've been trying to solve this for the last few hours and can't come to a conclusion. It seems that adding await before sharp(image).rotate(value).toBuffer() in the code below makes it jump over all the lines after it.
const promises = this.imagesToProcess.map(async (image) => {
    try {
        await sharp(image).rotate(value).toBuffer() // <-- This line here causes the problem
        console.log(this.terminalColors.green, `${image} [Rotate - Processed]`) // <-- This line will not run unless 'await' is removed
    } catch (e) {
        console.error(e)
        this.errors.push(this.terminalColors.red, `Could not process file ${image} [Skipping]`)
    }
})
// Await all promises
Promise.all(promises)
I'm 100% sure this is due to me not understanding asynchronous programming all too well. I just can't figure it out, however.
It's worth mentioning that "sharp" is an image manipulation library for Node.js. The toBuffer() function seems to return a promise, according to the index.d.ts file:
/**
 * Write output to a Buffer. JPEG, PNG, WebP, AVIF, TIFF, GIF and RAW output are supported.
 * By default, the format will match the input image, except SVG input which becomes PNG output.
 * @param options resolve options
 * @param options.resolveWithObject Resolve the Promise with an Object containing data and info properties instead of resolving only with data.
 * @returns A promise that resolves with the Buffer data.
 */
toBuffer(options?: { resolveWithObject: false }): Promise<Buffer>;

Your promises are not returning the buffers, which is what I think you want to do. Try:
const promises = this.imagesToProcess.map(async (image) => {
    try {
        const buffer = await sharp(image).rotate(value).toBuffer() // <-- store return in variable
        console.log(this.terminalColors.green, `${image} [Rotate - Processed]`)
        return buffer // <----------- return the buffer
    } catch (e) {
        console.error(e)
        this.errors.push(this.terminalColors.red, `Could not process file ${image} [Skipping]`)
    }
})
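Since each mapped callback now returns its buffer, the promise from Promise.all resolves to an array of buffers. A minimal sketch of consuming it, assuming the surrounding function is async:

const buffers = await Promise.all(promises) // without await, execution continues before processing finishes
// buffers[i] lines up with this.imagesToProcess[i]; entries are undefined for images that hit the catch block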

Script only prints last file in array instead of all files [duplicate]

I'm reading about Promises in JavaScript: The Definitive Guide by Flanagan, 7th ed. In the book there is a script which shows how to build a Promise chain dynamically for an arbitrary number of URLs. The script is as follows:
function fetchSequentially(urls) {
    // We'll store the URL bodies here as we fetch them
    const bodies = [];
    // Here's a Promise-returning function that fetches one body
    function fetchOne(url) {
        return fetch(url)
            .then(response => response.text())
            .then(body => {
                // We save the body to the array, and we're purposely
                // omitting a return value here (returning undefined)
                bodies.push(body);
            });
    }
    // Start with a Promise that will fulfill right away (with value undefined)
    let p = Promise.resolve(undefined);
    // Now loop through the desired URLs, building a Promise chain
    // of arbitrary length, fetching one URL at each stage of the chain
    for (url of urls) {
        p = p.then(() => fetchOne(url));
    }
    // When the last Promise in that chain is fulfilled, then the
    // bodies array is ready. So let's return a Promise for that
    // bodies array. Note that we don't include any error handlers:
    // we want to allow errors to propagate to the caller.
    return p.then(() => bodies);
}
// The script was run as below.
// I added the line below to declare the urls array:
let urls = ['/data.txt', '/readme.txt', '/textfile.txt'];
// The line below is from the book:
fetchSequentially(urls)
    .then(bodies => {
        console.log(bodies)
    })
    .catch(e => console.error(e));
I added the let urls line to run the script to fetch 3 text files on my PC.
When the script runs it seems to only fetch the last file textfile.txt, and it prints out the contents of the third file 3 times in the console. I thought the script would retrieve the contents of all 3 files, add them to the bodies array, and then log the contents of all 3 files to console.
Can anyone spot why this isn't working?
It looks like this is the section that's causing problems:
for (url of urls) {
    p = p.then(() => fetchOne(url));
}
Here you're creating an implicit global variable url, and since the fetches run asynchronously after the loop has finished, every fetchOne(url) callback sees the last value assigned to it.
Instead you can do something like:
for (let url of urls) {
    p = p.then(() => fetchOne(url));
}
This creates a new binding of url for each iteration, so each callback captures its own value.
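A tiny demo of the difference:

for (var i of [1, 2, 3]) setTimeout(() => console.log(i)); // logs 3, 3, 3
for (let j of [1, 2, 3]) setTimeout(() => console.log(j)); // logs 1, 2, 3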
This style of iterating through arrays with asynchronous callbacks can introduce subtle errors like this one, so I'd recommend a style that unambiguously creates a new variable per iteration. Something like:
urls.forEach(function (url) {
    p = p.then(() => fetchOne(url));
});
Though for this sort of thing with multiple promises, you might just want to do a .map with Promise.all (note that this runs the fetches concurrently rather than one after another, which matters if the request order is significant):
return Promise.all(urls.map(fetchOne)); // instead of promise chaining with p
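If you want the bodies themselves rather than the side effect on the bodies array, a minimal sketch that fetches concurrently while keeping the results in input order (Promise.all preserves the order of its input array regardless of completion order):

function fetchConcurrently(urls) {
    return Promise.all(urls.map(url => fetch(url).then(response => response.text())));
}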

How to handle error from fs readline.Interface async iterator

Based on the example of processLineByLine() I noticed that we cannot catch the error if the given filename does not exist. In that case the program finishes with something like:
UnhandledPromiseRejectionWarning: Error: ENOENT: no such file or directory
So the simplest approach I followed to raise a catchable error was to make 2 modifications to the processLineByLine() function:
- turn it into a generator, i.e. async function*
- await on a file-existence check: await access(filename, fs.constants.F_OK)
Finally I had to convert the readline.Interface instance to an async generator. I particularly dislike this last part. The resulting lines() function looks like:
import fs from 'fs'
import readline from 'readline'
const { access } = fs.promises

export async function* lines(filename) {
    await access(filename, fs.constants.F_OK)
    const lines = readline.createInterface({
        input: fs.createReadStream(filename),
        crlfDelay: Infinity
    })
    for await (const l of lines) {
        yield l
    }
}
Question: Is there a better approach to make lines() either return an async iterator or throw an error if the filename does not exist?
BUG report: regarding @jfriend00's observations I have opened a bug issue on nodejs: https://github.com/nodejs/node/issues/30831
Hmm, this is a tricky one. Even detecting whether the file exists as a pre-flight doesn't guarantee that you can successfully open it (it could be locked or have permission issues) and detecting if it exists before opening is a classic race condition in server development (small window, but still a race condition).
I'm still thinking there must be a better way to get an error out of a fs.createReadStream(), but the only way I could find was to wrap it in a promise that only resolves when the file is successfully open. That lets you get the error from opening the file and propagate it back to the caller of your async function. Here's what that would look like:
const fs = require('fs');
const readline = require('readline');

function createReadStreamSafe(filename, options) {
    return new Promise((resolve, reject) => {
        const fileStream = fs.createReadStream(filename, options);
        fileStream.on('error', reject).on('open', () => {
            resolve(fileStream);
        });
    });
}
async function processLineByLine(f) {
    const fileStream = await createReadStreamSafe(f);
    const rl = readline.createInterface({
        input: fileStream,
        crlfDelay: Infinity
    });
    for await (const line of rl) {
        // Each line in input.txt will be successively available here as `line`.
        console.log(`Line from file: ${line}`);
    }
}

processLineByLine("nofile").catch(err => {
    console.log("caught error");
});
This makes it so that the promise that processLineByLine() returns will reject and you can handle the error there which is what I think you were asking for. If I misunderstood what you were asking for, then please clarify.
FYI, this seems to me to be a bug in readline.createInterface() because it seems like it should reject on the first iteration of for await (const line of rl), but that doesn't appear to be what happens.
So, as a consequence of that, even this work-around won't detect read errors on the stream after it's opened. That really needs to be fixed internal to createInterface(). I agree both a file open error or a read error should show up as a reject on for await (const line of rl).
Another work-around for the file open issue would be to pre-open the file using await fs.promises.open(...) and pass the fd to fs.createReadStream and then you would see the error on the open yourself.
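A minimal sketch of that pre-open approach (the helper name is hypothetical; fs.promises.open() rejects if the file can't be opened, and fs.createReadStream() ignores its path argument when an fd is supplied):

async function createReadStreamPreOpened(filename) {
    const fileHandle = await fs.promises.open(filename, 'r'); // rejects here on a missing file or bad permissions
    return fs.createReadStream(null, { fd: fileHandle.fd, autoClose: true });
}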
A Different Solution - Wrapping the readLine iterator to add error handling
Warning: this ends up looking like a bit of a hack, but it's a really interesting learning project, because I ended up having to wrap the readline asyncIterator with my own in order to reject when I detected an error on the readStream (the error handling that the readline library is missing).
I set out on a mission to figure out how to write a processLineByLine() function that would return an asyncIterator that would properly reject on stream errors (even though the readline code has bugs in this regard) while still using the readline library internally.
The goal was to be able to write code like this:
for await (let line of processLineByLine("somefile1.txt")) {
    console.log(line);
}
that properly handles errors on the readStream used internally, whether the file doesn't exist, exists but can't be opened or even encounters a read error later while reading. Since I'm not changing/fixing the readline interface code internally, I had to install my own error listener on the readStream and when I see an error there, I need to cause any pending or future promises from the readline interface to reject.
Here's what I ended up with:
// This is an experiment to wrap the lines asyncIterator with our own iterator
// so we can reject when there's been an error on the readStream. It's really
// ugly, but does work.
const fs = require('fs');
const readline = require('readline');
function processLineByLine(filename, options = {}) {
const fileStream = fs.createReadStream(filename, options);
let latchedError = null;
let kill = new Set();
fileStream.on('error', (err) => {
latchedError = err;
// any open promises waiting on this stream, need to get rejected now
for (let fn of kill) {
fn(err);
}
});
const lines = readline.createInterface({
input: fileStream,
crlfDelay: Infinity
});
// create our own little asyncIterator that wraps the lines asyncIterator
// so we can reject when we need to
function asyncIterator() {
const linesIterator = lines[Symbol.asyncIterator]();
return {
next: function() {
if (latchedError) {
return Promise.reject(latchedError);
} else {
return new Promise((resolve, reject) => {
// save reject handlers in higher scope so they can be called
// from the stream error handler
kill.add(reject);
let p = linesIterator.next();
// have our higher level promise track the iterator promise
// except when we reject it from the outside upon stream error
p.then((data => {
// since we're resolving now, let's removing our reject
// handler from the kill storage. This will allow this scope
// to be properly garbage collected
kill.delete(reject);
resolve(data);
}), reject);
});
}
}
}
}
var asyncIterable = {
[Symbol.asyncIterator]: asyncIterator
};
return asyncIterable;
}
async function runIt() {
for await (let line of processLineByLine("xfile1.txt")) {
console.log(line);
}
}
runIt().then(() => {
console.log("done");
}).catch(err => {
console.log("final Error", err);
});
Some explanation on how this works...
Our own error monitoring on the stream
First, you can see this:
fileStream.on('error', (err) => {
    latchedError = err;
    // any open promises waiting on this stream need to get rejected now
    for (let fn of kill) {
        fn(err);
    }
});
This is our own error monitoring on the readStream, to make up for the missing error handling inside of readline. Any time we see an error, we save it in a higher-scoped variable for potential later use and, if there are any pending promises registered from readline for this stream, we "kill" them (which rejects them; you will see later how that works).
No special handling for file open errors
Part of the goal here was to get rid of the special handling for file open errors in the previous solution. We want ANY error on the readStream to trigger a rejection of the asyncIterable, so this is a much more general-purpose mechanism. The file open error gets caught by this error handling in just the same way any other read error would.
Our own asyncIterable and asyncIterator
Calling readline.createInterface() returns an asyncIterable. It's basically the same as a regular iterable in that you call a special property on it to get an asyncIterator. That asyncIterator has a .next() method on it just like a regular iterator, except that when asyncIterator.next() is called, it returns a promise that resolves to an object, rather than returning an object directly.
So, that's how for await (let line of lines) works. It first calls lines[Symbol.asyncIterator]() to get an asyncIterator. Then, on that asyncIterator that it gets back, it repeatedly does await asyncIterator.next() waiting on the promise that asyncIterator.next() returns.
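As a rough sketch, the loop desugars to something like this (simplified; it ignores the cleanup that for await performs on early exit):

const it = lines[Symbol.asyncIterator]();
while (true) {
    const { done, value: line } = await it.next();
    if (done) break;
    // ...loop body runs with `line` here...
}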
Now, readline.createInterface() already returns such an asyncIterable. But, it doesn't work quite right. When the readStream gets an error, it doesn't reject the promise returned by .next() on each iteration. In fact, that promise never gets rejected or resolved. So, things get stalled. In my test app, the app would just exit because the readStream was done (after the error) and there was no longer anything keeping the app from exiting, even though a promise was still pending.
So, I needed a way to force the promise that readlineIterator.next() had previously returned, and that for await (...) was currently awaiting, to be rejected. Well, a promise doesn't provide an outward interface for rejecting it, and we don't have access to the internals of the readline implementation where the means to reject it lives.
My solution was to wrap the readlineIterator with my own as a sort of proxy. Then, when my own error detector sees an error and there are promise(s) outstanding from readline, I can use my proxy/wrapper to force a rejection on those outstanding promise(s). This causes the for await (...) to see the rejection and get a proper error. And it works.
It took me a while to learn enough about how asyncIterators work to be able to wrap one. I owe a lot of thanks to this Asynchronous Iterators in JavaScript article, which provided some very helpful code examples for constructing your own asyncIterable and asyncIterator. This is actually where the real learning came about in this exercise, and where others might learn by understanding how this works in the above code.
Forcing a wrapped promise to reject
The "ugliness" in this code comes in forcing a promise to reject from outside the usual scope of the reject handler for that promise. This is done by storing the reject handler in a higher level scope where an error handling for the readStream can call trigger that promise to reject. There may be a more elegant way to code this, but this works.
Making our own asyncIterable
An async iterable is just an object that has one property on it named [Symbol.asyncIterator]. That property must be a function that, when called with no arguments, returns an asyncIterator. So, here's our asyncIterable.
var asyncIterable = {
    [Symbol.asyncIterator]: asyncIterator
};
Making our own asyncIterator
An asyncIterator is a function that, when called, returns an object with a next() property on it. Each time obj.next() is called, it returns a promise that resolves to the usual iterator result object {done, value}. We don't have to worry about the resolved value because we'll just get that from readline's iterator. So, here's our asyncIterator:
// create our own little asyncIterator that wraps the lines asyncIterator
// so we can reject when we need to
function asyncIterator() {
    const linesIterator = lines[Symbol.asyncIterator]();
    return {
        next: function() {
            if (latchedError) {
                return Promise.reject(latchedError);
            } else {
                return new Promise((resolve, reject) => {
                    // save reject handlers in higher scope so they can be called
                    // from the stream error handler
                    kill.add(reject);
                    let p = linesIterator.next();
                    // have our higher level promise track the iterator promise
                    // except when we reject it from the outside upon stream error
                    p.then((data) => {
                        kill.delete(reject);
                        resolve(data);
                    }, reject);
                });
            }
        }
    };
}
First, it gets the asyncIterator from the readline interface (the one we're proxying/wrapping) and stores it locally in scope so we can use it later.
It then returns the mandatory iterator structure of the form {next: fn}. Inside that function is where our wrapping logic unfolds. If we've seen a previously latched error, then we just always return Promise.reject(latchedError). If there's no error, then we return a manually constructed promise.
Inside the executor function for that promise, we register our reject handling by adding it into a higher scoped Set named kill. This allows our higher scoped filestream.on('error', ....) handler to reject this promise if it sees an error by calling that function.
Then, we call linesIterator.next() to get the promise that it returns. We register an interest in both the resolve and reject callbacks for that promise. If that promise is properly resolved, we remove our reject handler from the higher level scope (to enable better garbage collection of our scope) and then resolve our wrap/proxy promise with the same resolved value.
If that linesIterator promise rejects, we just pass the reject right through our wrap/proxy promise.
Our own filestream error handling
So, now the final piece of explanation. We have this error handler watching the stream:
fileStream.on('error', (err) => {
    latchedError = err;
    // any open promises waiting on this stream need to get rejected now
    for (let fn of kill) {
        fn(err);
    }
});
This does two things. First, it stores/latches the error so any future calls to the lines iterator will just reject with this previous error. Second, if there are any pending promises from the lines iterator waiting to be resolved, it cycles through the kill Set and rejects those promises. This is what gets the asyncIterator promise to get properly rejected. This should be happening inside the readline code, but since it isn't doing it properly, we force our wrap/proxy promise to reject so the caller sees the proper rejection when the stream gets an error.
In the end, you can just do this as all the ugly detail is hidden behind the wrapped asyncIterable:
async function runIt() {
    for await (let line of processLineByLine("xfile1.txt")) {
        console.log(line);
    }
}

runIt().then(() => {
    console.log("done");
}).catch(err => {
    console.log("final Error", err);
});
Readline error handling
I also struggled to get readline to surface any error states.
I did manage to get the readStream to throw errors if the file was not found (shown in the code below), but the writeStream never does.
My use case was that I did not want to use up loads of memory reading a whole file and then converting from JSON to an object, just line by line; hence readline.
I now see Node.js has done some work on this in v19.x, but it's not production-ready yet (as of 25 Oct 2022).
I tried to convert readline to promise-based, but it has too many hoops in the current version.
This is a working shell that shows my structure, if it helps others. It needs more work to make it fully async/await, but great care is needed there to avoid readStream or writeStream race conditions.
////////////////////////////////////////////////////////////////////////
// Simple file stream that can be used to find/edit/remove/filter data
// Example: user name, password, email
////////////////////////////////////////////////////////////////////////
const readline = require('readline');
const fs = require('fs');

let obj = {};

// A file called json_users.txt; it has JSON strings terminated with line feeds:
// {"id":1,"username":"","password":"","email":""}\n
const readStream2 = fs.createReadStream("json_users.txt");
const writeStream2 = fs.createWriteStream("update_users.txt", { encoding: "utf8" });

// Some sort of read error handler - works if no file
readStream2.on('error', function (err) {
    console.log("This is a read stream error " + err);
});

// Some sort of write error handler -
// but never called even with a file name like "$$--!!.$$$"
writeStream2.on('error', function (err) {
    console.log("This is a write stream error " + err);
});

// Create readline with input read stream and output write stream
const rl = readline.createInterface({
    input: readStream2,
    output: writeStream2,
    terminal: false,
    crlfDelay: Infinity,
    historySize: 0
});

// readline is event driven on line feed
rl.on('line', (line) => {
    obj = JSON.parse(line); // convert line into an object
    // Any filter work goes here, e.g. remove user / find user / edit user
    if (obj.id == 20) { // if id == 20, set the username
        obj.username = "Douglas Crockford";
    }
    // Write object and \n back to stream
    writeStream2.write(JSON.stringify(obj) + '\n');
});

// A much better way to wait for the stream to close would be:
// await new Promise((res) => rl.once('close', res));
rl.once("close", () => {
    console.log("done");
    writeStream2.end(); // flush and close the output stream (rl is already closed at this point)
});

Creating better version of promise function [duplicate]

I'm struggling a bit to grasp promises. I have a use case where I perform X async actions, and when these are completed I make a call to a REST API. Below is my code:
const promises = defect.images.map(async image => {
    return new Promise((resolve) => {
        this.fileProvider.getDefectImage(image.url)
            .then(binary => {
                images.set(image.url, binary);
                resolve();
            });
    })
});
console.log(images)
return Promise.all(promises)
    .then(() => spinner.dismiss())
but the console.log of images is always empty... What should I change?
Edit: sorry for leading you into a trap; indeed this console.log cannot work properly. But I'm also not getting the data on the backend side, it's empty there too.
You are logging images before any of the promises have been resolved, so it's still empty at that point.
You need to wait for all the promises to be resolved first. And apparently you already know how to do that:
return Promise.all(promises)
    .then(() => {
        console.log(images);
        spinner.dismiss();
    })
Besides, you are mixing async/await syntax and Promise syntax. If you're going to rely on async/await, you might as well go all the way, and then your code will be executed from top to bottom again:
const promises = defect.images.map(async image => {
    let binary = await this.fileProvider.getDefectImage(image.url);
    images.set(image.url, binary);
});
await Promise.all(promises);
console.log(images);
spinner.dismiss();

Is it ok not to use reject in a Promise?

Is it all right to construct a Promise that never rejects? I mean, is this some sort of anti-pattern, or is it acceptable? Let me illustrate this with an example. I have a class ModifyURL that consists of many methods; each method does something with an array of URI strings and returns a Promise. Part of the implementation looks like this.
class ModifyURL {
    constructor() {
    }
    /**
     * Remove empty and invalid urls
     * @param {object} object containing arrays of links
     *
     * @return {object} object containing arrays of valid links
     */
    removeInvalid(data) {
        return new Promise((resolve, reject) => {
            for (let key in data) {
                data[key] = data[key].filter(function (item) {
                    return !(item == '#' || !item || item == ' ');
                });
            }
            resolve(data)
        });
    }
    /**
     * Remove duplicates
     * @param {object} object containing arrays of links
     *
     * @return {object} object containing arrays of unique links
     */
    removeDuplicates(data) {
        return new Promise((resolve, reject) => {
            for (let key in data) {
                data[key] = data[key].filter(function (item, pos) {
                    return data[key].indexOf(item) == pos;
                })
            }
            resolve(data)
        });
    }
    /**
     * Add full hostname to relative urls.
     * @param {object} object containing arrays of links
     * @param {string} hostname to be added if link is relative
     *
     * @return {object} object containing arrays of absolute links
     */
    fixRelativeLinks(data, hostname) {
        if (typeof data === 'object') {
            return new Promise((resolve, reject) => {
                for (let key in data) {
                    data[key].forEach((v, i) => {
                        if (data[key][i]) {
                            data[key][i] = URL.resolve(hostname, data[key][i])
                        }
                    })
                }
                resolve(data)
            })
        }
    }
}
Later I chain these Promises and it works fine.
modifyURL.removeInvalid(data).then(res => {
    return res
})
.then(() => {
    return modifyURL.fixRelativeLinks(data, res.request.href)
})
.then(modifyURL.removeDuplicates).then(res => {
    onSuccess(res)
}).catch(err => { console.log(err) })
As you noticed, I don't use reject, and it feels a bit odd. The reason is that at the end I need to receive some data. Even if some Promise in the chain fails to do its task, I need to finally resolve with my array of URI strings. That's why I don't reject: it breaks my Promise chain. But without reject I lose the ability to track errors properly. What is the proper way to handle this kind of task?
Is it allright to construct a Promise that never rejects?
Yes, it's all right. If there is no possible error in the operation, then it's perfectly fine if the promise only resolves.
I mean is this some sort of anti-pattern or is it acceptable?
It is perfectly acceptable to have a promise that never rejects (assuming there is some code path through which it will eventually resolve). For example, you might write a function that resolves a promise after a particular delay, such as this:
function delay(t, v) {
    return new Promise(resolve => {
        setTimeout(resolve, t, v);
    });
}
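A quick usage sketch (the values are arbitrary):

delay(500, "hello").then(msg => console.log(msg)); // logs "hello" after ~500ms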
I have a class ModifyURL that consists of many methods; each method does something with an array of URI strings and returns a Promise. Part of the implementation looks like this.
These are all synchronous functions. They do not need to be wrapped in promises and should not be wrapped in promises. Promises are for tracking asynchronous operations. They only create more complicated code than necessary when using them with purely synchronous code.
While one can wrap synchronous operations with promises (as you have), I would call that an anti-pattern as it makes the code a lot more complex than just using normal synchronous coding patterns where you just call multiple functions one after another or if they all operate on the same data, make those functions all methods on an object and call them one after the other.
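To illustrate, a minimal sketch of the same pipeline written synchronously, assuming the methods are rewritten to return their data directly instead of a Promise:

const cleaned = modifyURL.removeInvalid(data);
const absolute = modifyURL.fixRelativeLinks(cleaned, hostname);
const unique = modifyURL.removeDuplicates(absolute);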
As you noticed I don't use reject and it feels a bit odd. The reason for that is that at the end I need to receive some data.
First off, you're misusing promises here with synchronous code. But, generically with asynchronous code, your code gets to decide what happens when there's an error. You can let a rejection propagate and stop the chain or you can catch the rejection locally and change it into whatever you want the chain to continue with and the chain will not know any error occurred. That's up to you and your code. You are in full control of that.
Even if some Promise in the chain fails to do its task, I need to finally resolve with my array of URI strings.
This is just about having proper local error handling so you catch and handle any error locally so you can continue processing the rest of your data and return the data that was successfully processed. This would be no different in concept than using a try/catch with synchronous code to catch errors locally so you can catch them, handle them however is appropriate and continue with the rest of the work.
That's why I don't reject because it breaks my Promise chain. But without the reject I lose the ability to track errors properly. What is the proper way to handle this kind of task?
There isn't really a generic answer to this as it really depends upon the particular application and what it's doing and how you want to communicate back both results and errors. Sometimes you abort the chain upon first error (fail fast). Sometimes you skip the errors and just return the successful results. Sometimes you return an array of results and an array of errors. Sometimes you return a single array that has both results and errors and you provide some means in the format of the data to know which is an error and which is a successful result. And, you could invent as many other ways as you want to communicate back both results and errors.
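As one sketch of the "skip the errors, return the successes" option (inside an async function; processOne, items, and errors are hypothetical names):

const errors = [];
const results = await Promise.all(items.map(item =>
    processOne(item).catch(err => {
        errors.push(err); // record the failure locally
        return null;      // resolve with a placeholder so the whole batch still succeeds
    })
));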

NodeJS fs.readdir - return callback internals as an object? [duplicate]

I'm trying to refactor my code a bit to make it a little nicer to use. What I would like to do is take a Node callback and pass back the result or error as a JavaScript object. It appears the scoping rules are preventing me from doing this, so is there a somewhat straightforward way to do this or am I stuck inside of the callback (unless I do something funky like implementing Promises in JavaScript, which Google Search reveals is what all the cool kids are doing these days)?
I'm still not all that comfortable with JavaScript - I think there is probably an easier way to do this and I'm just not getting it because I haven't worked with it enough (I know with jQuery I can set 'this' within a promise to get access to my outer scope, for example, but I'm not sure how to do this with Node, or if I can).
/*
 * Retrieves the full list of only the files, excluding subdirectories, in the directory provided.
 *
 * Parameters:
 *   dir: The directory to get the file list from.
 *
 * Return: An object containing either an error message or the fileList as a String array.
 *   e.g.
 *   { errMsg: undefined,
 *     fileList: ["file1.txt", "file2.xlsx"]
 *   }
 */
var getFileList = function (dir) {
    var filterOutDirectories = function (files) {
        return files.map(function (file) {
            return path.join(dir, file);
        }).filter(function (file) {
            return fs.statSync(file).isFile();
        });
    };
    var result = {
        errMsg: null,
        fileList: null
    };
    fs.readdir(dir, function (err, files) {
        if (err) {
            result.errMsg = "Error: Could not retrieve files...\n\t" + err;
        } else {
            result.fileList = filterOutDirectories(files);
        }
    });
    console.log(result.fileList); // returns null
    console.log(result.errMsg); // also returns null
    return result;
}
fs.readdir() is an async call, so the variables set in the callback are not available directly after the call; the callback only runs later, after getFileList() has already returned.
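A minimal sketch of the promise-based alternative, assuming Node's fs.promises API is available:

const fs = require('fs');
const path = require('path');

// Resolves with only the plain files in dir; rejects if the directory can't be read.
async function getFileList(dir) {
    const files = await fs.promises.readdir(dir);
    return files
        .map(file => path.join(dir, file))
        .filter(file => fs.statSync(file).isFile());
}

getFileList('.').then(list => console.log(list)).catch(err => console.error(err));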
