Retry a promise step - javascript

Suppose I have the following Promise chain:
var result = Promise.resolve(filename)
    .then(unpackDataFromFile)
    .then(transformData)
    .then(compileData)
    .then(writeData);
Now I have not only one transformData function but two or more, stored in an array. I want to try the first one, and if the compileData function fails, try the second one and so on until either compileData succeeds or the array of transformData functions is exhausted.
Can someone give me an example of how to implement this?
Running all the transformData functions and giving the result array to compileData is not an option, since the functions are very expensive and I want to run as few of them as possible.
transformData itself also returns a Promise, if that helps.

I would start by isolating the notion of trying a series of attempts until one succeeds. Make each attempt a function that returns a promise, so an attempt only runs when it is actually needed:
function tryMultiple([makeAttempt, ...rest]) {
    if (!makeAttempt) return Promise.reject(new Error("no more to try"));
    return makeAttempt().catch(() => tryMultiple(rest));
}
Now write a handler which tries each combination of transforming and compiling:
function transformAndCompile(transformers) {
    return function(data) {
        // wrap each transformer in a thunk so the expensive work only runs if needed
        return tryMultiple(transformers.map(t => () => t(data).then(compileData)));
    };
}
Now the top level is just:
var result = Promise.resolve(filename)
    .then(unpackDataFromFile)
    .then(transformAndCompile(transformers))
    .then(writeData);
By the way, Promise.resolve(filename).then(unpackDataFromFile) is just a roundabout way of saying unpackDataFromFile(filename).
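With that shortcut (and assuming unpackDataFromFile doesn't throw synchronously), the pipeline can also be written as:
var result = unpackDataFromFile(filename)
    .then(transformAndCompile(transformers))
    .then(writeData);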

You can do something like this:
// various transformer functions, to be tried in order
var transformers = [f1, f2, f3, f4];

function transformFile(filename) {
    // initialize tIndex to select the next transformer function
    var tIndex = 0;
    var p = unpackDataFromFile(filename);

    function run() {
        return p.then(transformers[tIndex++])
            .then(compileData)
            .catch(function (err) {
                if (tIndex < transformers.length) {
                    // execute the next transformer, returning
                    // a promise so it is linked into the chain
                    return run();
                } else {
                    // out of transformers, so reject and stop
                    throw new Error("No transformer succeeded");
                }
            });
    }

    // writeData runs once, after some transform/compile attempt has succeeded
    return run().then(writeData);
}

transformFile("someData.txt").then(function (finalResult) {
    // succeeded here
}).catch(function (err) {
    // error here
});
Here's how this works:
Sets up a tIndex variable that indexes into the array of transformer functions.
Calls unpackDataFromFile(filename) and saves the resulting promise.
Then it executes the sequence p.then(transformer).then(compileData) using the first transformer. If some attempt succeeds, writeData is called once with the compiled result and the resulting promise is returned.
If either the transformer or compileData fails, it moves on to the next transformer function and starts over. The key to making this work is that the .catch() handler returns a new promise which chains into the originally returned promise. Each new call to run() is chained onto the original promise from unpackDataFromFile(), which lets you reuse that result.
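In isolation, that chaining principle looks like this (a generic sketch; retrySomehow is a hypothetical fallback that returns a promise):
somePromise.catch(function (err) {
    // returning another promise from .catch() makes the outer chain
    // wait for it and adopt its result
    return retrySomehow();
}).then(function (result) {
    // result comes either from somePromise or from retrySomehow()
});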
Here's a more generic implementation: an iterator over an array that keeps going until the iterator callback returns a promise that fulfills.
// Iterate an array using an iterator that returns a promise
// Stop iterating as soon as you get a fulfilled promise from the iterator
// Pass:
//   p     - initial promise (can be just Promise.resolve(data))
//   array - array of items to pass to the iterator one at a time
//   fn    - iterator function that returns a promise
//           iterator called as fn(data, item)
//             data - fulfilled value of promise passed in
//             item - array item for this iteration
function iterateAsyncUntilSuccess(p, array, fn) {
    var index = 0;
    function next() {
        if (index < array.length) {
            var item = array[index++];
            return p.then(function (data) {
                return fn(data, item).catch(function (err) {
                    // if this one fails, try the next one
                    return next();
                });
            });
        } else {
            return Promise.reject(new Error("End of data with no operation successful"));
        }
    }
    return next();
}
// Usage:
// various transformer functions, to be tried in order
var transformers = [f1, f2, f3, f4];

iterateAsyncUntilSuccess(unpackDataFromFile(filename), transformers, function (data, item) {
    return item(data).then(compileData);
}).then(writeData).then(function (result) {
    // successfully completed here
}).catch(function (err) {
    // error here
});

The following should do what you want most idiomatically:
var transformers = [transformData, transformData2];

var result = unpackDataFromFile(filename)
    .then(function transpile(data, i = 0) {
        return transformers[i](data).then(compileData)
            .catch(e => ++i < transformers.length ? transpile(data, i) : Promise.reject(e));
    })
    .then(writeData);
Basically you recurse on the transformers array, using .catch().

Related

Wait for all nested promises to complete, but still react to each individual resolve

Suppose that newsService.getNews() returns a promise that should resolve to a random news entry returned by some service, while translateService.translate() returns a promise that should resolve to the translation of the passed text.
var newsPromises = [];
var translatePromises = [];

for (var i = 0; i < 5; i++) {
    var p1 = this.newsService.getNews();
    newsPromises.push(p1);
    p1.then(function (data) {
        var p2 = this.translateService.translate(data);
        translatePromises.push(p2);
        p2.then(function (translatedData) {
            addNews(`${data} (${translatedData})`);
        }, function (fail) {
            console.log(fail.message);
        });
    }, function (fail) {
        console.log(fail.message);
    });
}
Now the page initially shows a loading spinner that I would like to hide when all the promises (including the nested translation promises) have completed (succeeded or failed):
Promise.all(newsPromises)
    .then(function (results) {
        Promise.all(translatePromises).then(function (results) {
            removeLoading();
        }, function (err) {
            removeLoading();
        });
    }, function (err) {
        Promise.all(translatePromises).then(function (results) {
            removeLoading();
        }, function (err) {
            removeLoading();
        });
    });
This code a) does not work as it should, since the loading spinner sometimes disappears before the promises resolve, and b) is horribly complex.
How is this done properly? (with vanilla JS / ES6)
Remember that promise chains are pipelines, where each handler can transform the chain's result as it passes through. See the comments:
// We only need one array of promises
const promises = [];

// Build the array
for (let i = 0; i < 5; i++) {
    // Add this promise to the array
    promises.push(
        // Get the news...
        this.newsService.getNews().then(
            // ...and translate it...
            data => this.translateService.translate(data)
                .then(translatedData => {
                    // ...and show it as soon as it's available
                    addNews(`${data} (${translatedData})`);
                    // Note that here we're converting the resolution value to
                    // `undefined`, but nothing uses it so...
                    // If you want something to be able to use it,
                    // return `translatedData` (or `data` or...)
                })
        )
        .catch(fail => {
            console.log(fail.message);
            // WARNING: Here you're converting rejection to resolution with `undefined`
        })
    );
}

// Wait until all that is done before removing the loading indicator
Promise.all(promises).then(removeLoading);
Note that the only reason we don't need a catch on the Promise.all promise is that you're ignoring (other than logging) errors that occur, so we know that promise will never reject.
Also note that the above assumes removeLoading doesn't pay any attention to the arguments it receives, and that it doesn't return a promise that may reject. If it does care about arguments and it's important to call it with no arguments, change the Promise.all bit to:
Promise.all(promises).then(() => removeLoading());
If it returns a promise that may reject, you'll need a catch handler as well.
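For instance, a minimal sketch covering that case as well:
Promise.all(promises)
    .then(() => removeLoading())
    .catch(err => console.log(err.message)); // also catches a rejection from removeLoading()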
In such cases I create a global counter, loadersCount = 0.
Each time you call this.newsService.getNews(), call loaderStart(),
and each time you call addNews() or console.log(fail.message), call loaderStop():
function loaderStart() {
    if (loadersCount === 0) {
        addLoading();
    }
    loadersCount++;
}

function loaderStop() {
    if (loadersCount === 1) {
        removeLoading();
    }
    loadersCount--;
}
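A rough sketch of wiring those calls into the question's loop (using loaderStart/loaderStop as defined above and keeping the original error handling):
for (let i = 0; i < 5; i++) {
    loaderStart();
    this.newsService.getNews()
        .then(data => this.translateService.translate(data)
            .then(translatedData => addNews(`${data} (${translatedData})`)))
        .catch(fail => console.log(fail.message))
        .then(() => loaderStop()); // runs after success or failure
}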

Waiting for multiple AJAX requests to finish in a loop [duplicate]

That's how I do it:
function processArray(array, index, callback) {
    processItem(array[index], function () {
        if (++index === array.length) {
            callback();
            return;
        }
        processArray(array, index, callback);
    });
}

function processItem(item, callback) {
    // do some ajax (browser) or request (node) stuff here
    // when done
    callback();
}

var arr = ["url1", "url2", "url3"];
processArray(arr, 0, function () {
    console.log("done");
});
Is it any good? How can I avoid this spaghetti-ish code?
Check out the async library; it's made for control flow (async stuff) and has a lot of methods for array work: each, filter, map. Check the documentation on GitHub. Here's what you probably need:
each(arr, iterator, callback)
Applies an iterator function to each item in an array, in parallel. The iterator is called with an item from the list and a callback for when it has finished. If the iterator passes an error to this callback, the main callback for the each function is immediately called with the error.
eachSeries(arr, iterator, callback)
The same as each only the iterator is applied to each item in the array in series. The next iterator is only called once the current one has completed processing. This means the iterator functions will complete in order.
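For example, the eachSeries version of the loop from the question might look roughly like this (assuming the async package is installed and processItem keeps its (item, callback) signature):
var async = require("async");

async.eachSeries(arr, function (url, done) {
    processItem(url, function () {
        done(); // pass an Error to done() to stop the loop early
    });
}, function (err) {
    console.log(err ? "stopped with error" : "done");
});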
As pointed out in another answer, you can use the "async" library. But sometimes you just don't want to introduce a new dependency into your code. Below is another way to loop and wait for completion of some asynchronous functions.
var items = ["one", "two", "three"];

// This is your async function, which may perform a call to your database or
// whatever...
function someAsyncFunc(arg, cb) {
    setTimeout(function () {
        cb(arg.toUpperCase());
    }, 3000);
}

// cb will be called when each item from arr has been processed and all
// results are available.
function eachAsync(arr, func, cb) {
    var doneCounter = 0,
        results = [];
    arr.forEach(function (item) {
        func(item, function (res) {
            doneCounter += 1;
            results.push(res);
            if (doneCounter === arr.length) {
                cb(results);
            }
        });
    });
}

eachAsync(items, someAsyncFunc, console.log);
Now, running node iterasync.js will wait for about three seconds and then print [ 'ONE', 'TWO', 'THREE' ]. This is a simple example, but it can be extended to handle many situations.
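For instance, one small extension (a sketch, not part of the original snippet) stores each result at its item's index, so the output order matches the input order even when callbacks finish out of order:
function eachAsyncOrdered(arr, func, cb) {
    var doneCounter = 0,
        results = new Array(arr.length);
    arr.forEach(function (item, i) {
        func(item, function (res) {
            results[i] = res; // write by index so order is preserved
            doneCounter += 1;
            if (doneCounter === arr.length) {
                cb(results);
            }
        });
    });
}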
As pointed out elsewhere, you can defer each iteration with setTimeout, for example:
var each_async = function (ary, fn) {
    var i = 0;
    // self-rescheduling step; assumes ary is non-empty
    (function step() {
        fn(ary[i]);
        if (++i < ary.length) {
            setTimeout(step, 0);
        }
    }());
};

each_async([1, 2, 3, 4], function (p) { console.log(p); });
The easiest way to handle async iteration of arrays (or any other iterable) is with the await operator (only available in async functions) and a for...of loop.
(async function () {
    for (let value of [0, 1]) {
        value += await Promise.resolve(1);
        console.log(value);
    }
})();
You can use a utility or library to convert any callback-accepting functions you need into functions that return promises.
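For example, in Node.js the built-in util.promisify wraps an error-first callback API (the file path below is just a placeholder):
const { promisify } = require("util");
const fs = require("fs");
const readFile = promisify(fs.readFile);

(async function () {
    const text = await readFile("./example.txt", "utf8");
    console.log(text.length);
})();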
In modern JavaScript there are interesting ways to extend an Array into an async iterable object.
Here I would like to demonstrate a skeleton of a totally new type, AsyncArray, which extends the Array type, inheriting its goodness, just to become an async iterable array.
This is only available in modern engines. The code below uses newer features like private instance fields and for await...of.
If you are not familiar with them, I would advise you to read up on those features first.
class AsyncArray extends Array {
    #INDEX;

    constructor(...ps) {
        super(...ps);
        if (this.some(p => p.constructor !== Promise)) {
            throw "All AsyncArray items must be a Promise";
        }
    }

    [Symbol.asyncIterator]() {
        this.#INDEX = 0;
        return this;
    }

    next() {
        return this.#INDEX < this.length
            ? this[this.#INDEX++].then(v => ({ value: v, done: false }))
            : Promise.resolve({ done: true });
    }
}
So an async iterable array must contain promises. Only then can it return an iterator object whose every next() call returns a promise that eventually resolves into an object like {value: "whatever", done: false} or {done: true}. So basically everything returned here is a promise; the await abstraction unpacks the value within and hands it to us.
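To make that protocol concrete, here is a minimal sketch of driving the iterator by hand with the class above:
const it = new AsyncArray(Promise.resolve("a"), Promise.resolve("b"))[Symbol.asyncIterator]();
it.next().then(r => console.log(r)); // { value: "a", done: false }
// two more next() calls would yield { value: "b", done: false } and then { done: true }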
Now, as I mentioned before, this AsyncArray type, since it extends Array, lets us use the Array methods we are already familiar with. That should simplify our job.
Let's see what happens when we iterate the AsyncArray defined above with for await...of:
var aa = AsyncArray.from({ length: 10 }, (_, i) =>
    new Promise(resolve => setTimeout(resolve, i * 1000, [i, ~~(Math.random() * 100)])));

async function getAsyncRandoms() {
    for await (let random of aa) {
        console.log(`The Promise at index # ${random[0]} gets resolved with a random value of ${random[1]}`);
    }
}

getAsyncRandoms();
For modern Node.js:
To iterate through a collection truly asynchronously, you can try my tiny package with zero dependencies, compatible with ESM and CJS modules, with .d.ts typings. Check the code; it's really tiny.
https://www.npmjs.com/package/array-to-async-iterable
You can use it just like this:
for await (const el of new AsyncTimeIterator(arrayOfObjects)) {
    ...
}
You can't always just use a plain for await...of loop, because of how JavaScript engines schedule microtasks and macrotasks. In brief, with code like this you won't accept new HTTP requests or let other timers' callbacks execute while the loop runs:
for await (const el of array) {
    ...
}
It forces V8 (or whichever engine) to execute all of the microtasks your loop iterations produce, and only when the loop completes is the event loop unblocked and ready to receive HTTP connections again. So in that situation the code is effectively useless.
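One way to see the difference is to yield to the macrotask queue between iterations explicitly; a rough sketch (doSomethingAsync stands in for whatever per-item work you have):
async function processWithBreathingRoom(array) {
    for (const el of array) {
        await doSomethingAsync(el);            // hypothetical per-item work
        await new Promise(r => setTimeout(r)); // hand control back to the event loop
    }
}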

How to chain functions returning promises or values?

I have an array of functions which I want to execute in order; some of those functions return a promise, while others simply return a value.
I want the functions to be executed one at a time, in order they appear in the array. If the function returns a promise, I want to wait until it resolves to a value. If it returns a value, I want it to simply use that value.
Example of how I want it to work:
function f1() {
    return 1;
}

function f2() {
    return new Promise(function (resolve, reject) {
        // ...
        resolve(4);
    }).then(function (num) { return num / 2; });
}

function f3() {
    return Promise.resolve("3");
}

var array = [f1, f2, f3];

chain(array).then(function (values) {
    // values == [1, 2, "3"];
});
In case any of the promises fail, the chain function should stop execution and pass the error further.
Your current solution is overkill; .then handlers already accept values and/or promises, so you can refactor your code to this:
var queue = Promise.resolve(); // start with an empty queue

var results = array.map(function (el) {
    return (queue = queue.then(function () { // update the queue to wait for the next promise
        return el(); // call the function; return it so the array resolves to it
    }));
});

Promise.all(results).then(function (results) {
    // access all results here
});
Figured this out while making the question and decided to share.
function chain(array) {
    array = array.slice(); // Make a copy
    var result = [];
    return new Promise(function (resolve, reject) {
        (function chainLoop() { // Make an IIFE-based loop
            if (array.length > 0) { // If we have elements in the array...
                Promise.resolve(array.shift()()) // Remove the first element and resolve its value
                    .then(function (value) {
                        result.push(value); // Push onto the result
                        chainLoop(); // Loop - this won't cause a stack overflow; promises reset the stack
                    }, reject);
            } else { // Otherwise, if the array is empty...
                resolve(result); // resolve with our result array.
            }
        }());
    });
}
Usage:
chain([f1, f2, f3]).then(function (values) {
    console.log(values); // [1, 2, "3"] (assuming f1, f2, f3 from the question)
});
This will crash if there are non-functions in the array, so there is room for improvement, but this fits my needs so far.
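For what it's worth, a shorter sketch of the same idea with async/await (not part of the original answer) that also tolerates plain values mixed into the array:
async function chain(array) {
    const results = [];
    for (const entry of array) {
        // Call functions and pass other values through; await unwraps promises
        // and lets a rejection propagate, which stops the loop.
        results.push(await (typeof entry === "function" ? entry() : entry));
    }
    return results;
}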

Recursive Async Looping in NodeJS

I'm trying to do a recursive async loop to trace all the children of a particular object from a third-party lib in nodejs.
Here's the pseudo code:
var tracer = function (nodes) {
    var promises = [];
    nodes.forEach(function (node) {
        // trace returns a promise ...
        var promise = builder.trace(node);
        promises.push(promise);
        promise.then(function (tree) {
            // if we had children, get those
            if (tree.children.length) {
                promises.push.apply(promises, tracer(tree.children));
            }
        });
    });
    return promises;
};

RSVP.all(tracer(myArr)).then(function (allTrees) { ... });
but I can't put my finger on how to get them all to resolve correctly and return the results in one array.
You must not push the recursive promises onto the array in the delayed callback. Instead, push a promise that represents the recursive results (one that resolves with those later-produced values) right away. Luckily, that is exactly what the then call gives you back.
Additionally, I would swap out the forEach for a map, and call RSVP.all inside the function itself, so the caller doesn't have to deal with that.
function tracer(nodes) {
    var promises = nodes.map(function (node) {
        // trace returns a promise ...
        var promise = builder.trace(node);
        var recursivePromise = promise.then(function (tree) {
            // if we had children, get those
            if (tree.children.length)
                return tracer(tree.children);
            else
                return node; // the leaf node itself
        });
        return recursivePromise; // which will resolve with the `tracer(…)` result
                                 // or the leaf
    });
    return RSVP.all(promises);
}

tracer(myArr).then(function (allTrees) { … });
I ended up going with a counter type approach ...
var traceDeps = function (parents, cb) {
    var count = 0,
        trees = [],
        trace = function (nodes) {
            nodes.forEach(function (node) {
                count++;
                builder.trace(node).then(function (tree) {
                    trees.push(tree);
                    if (tree.children.length) {
                        trace(tree.children);
                    }
                    count--;
                    if (count === 0) cb(trees);
                });
            });
        };
    trace(parents);
};

traceDeps(myArr, function (trees) { ... });

