I seem to be having some issues incorporating async/await with .reduce(), like so:
const data = await bodies.reduce(async(accum, current, index) => {
const methodName = methods[index]
const method = this[methodName]
if (methodName == 'foo') {
current.cover = await this.store(current.cover, id)
console.log(current)
return {
...accum,
...current
}
}
return {
...accum,
...method(current.data)
}
}, {})
console.log(data)
The data object is logged before this.store completes...
I know you can utilise Promise.all with async loops, but does that apply to .reduce()?
The problem is that your accumulator values are promises - they're return values of async functions. To get sequential evaluation (and all but the last iteration to be awaited at all), you need to use
const data = await array.reduce(async (accumP, current, index) => {
const accum = await accumP;
…
}, Promise.resolve(…));
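To make the pattern concrete, here is a complete, runnable sketch (my example, not from the answer; fetchScore is a made-up stand-in for whatever async work you're doing):
const fetchScore = async (id) => id * 10; // hypothetical stand-in for a real async call

(async () => {
  const scores = await [1, 2, 3].reduce(async (accumP, id) => {
    const accum = await accumP;         // wait for the previous iteration first
    const score = await fetchScore(id); // then do this iteration's async work
    return { ...accum, [id]: score };
  }, Promise.resolve({}));

  console.log(scores); // { '1': 10, '2': 20, '3': 30 }
})();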
That said, for async/await I would generally recommend using plain loops instead of array iteration methods; they're more performant and often simpler.
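For comparison, a plain-loop version of the same accumulation might look like this (again using the hypothetical fetchScore stand-in):
const fetchScore = async (id) => id * 10; // hypothetical stand-in for a real async call

(async () => {
  const scores = {};
  for (const id of [1, 2, 3]) {
    scores[id] = await fetchScore(id); // each iteration is awaited in order
  }
  console.log(scores); // { '1': 10, '2': 20, '3': 30 }
})();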
I like Bergi's answer, I think it's the right way to go.
I'd also like to mention a library of mine, called Awaity.js, which lets you effortlessly use functions like reduce, map & filter with async/await:
import reduce from 'awaity/reduce';
const posts = await reduce([1,2,3], async (posts, id) => {
const res = await fetch('/api/posts/' + id);
const post = await res.json();
return {
...posts,
[id]: post
};
}, {})
posts // { 1: { ... }, 2: { ... }, 3: { ... } }
[Not addressing the OP's exact problem; this is focused on others who land here.]
Reduce is commonly used when you need the result of the previous steps before you can process the next. In that case, you can string promises together a la:
promise = elts.reduce(
async (promise, elt) => {
return promise.then(async last => {
return await f(last, elt)
})
}, Promise.resolve(0)) // or "" or [] or ...
Here's an example which uses fs.promises.mkdir() (sure, it would be much simpler to use mkdirSync, but in my case it's across a network):
const Path = require('path')
const Fs = require('fs')
async function mkdirs (path) {
return path.split(/\//).filter(d => !!d).reduce(
async (promise, dir) => {
return promise.then(async parent => {
const ret = Path.join(parent, dir);
try {
await Fs.promises.lstat(ret)
} catch (e) {
console.log(`mkdir(${ret})`)
await Fs.promises.mkdir(ret)
}
return ret
})
}, Promise.resolve(""))
}
mkdirs('dir1/dir2/dir3')
Below is another example which adds 100 + 200 + ... + 500 and waits around a bit:
async function slowCounter () {
const ret = await ([100, 200, 300, 400, 500]).reduce(
async (promise, wait, idx) => {
return promise.then(async last => {
const ret = last + wait
console.log(`${idx}: waiting ${wait}ms to return ${ret}`)
await new Promise((res, rej) => setTimeout(res, wait))
return ret
})
}, Promise.resolve(0))
console.log(ret)
}
slowCounter ()
The currently accepted answer advises using Promise.all() instead of an async reduce. However, that does not have the same behavior as an async reduce: it is only relevant when you want an exception to stop all iterations immediately, which is not always the case.
Additionally, in the comments of that answer it's suggested that you should always await the accumulator as the first statement in the reducer, because otherwise you risk unhandled promise rejections. The poster also says that this is what the OP was asking for, which is not the case; the OP just wants to know when everything is done. To know that, you do indeed need to await acc, but it can happen at any point in the reducer.
const reducer = async (acc, key) => {
  const response = await api(key);
  return {
    ...await acc, // <-- this would work just as well for OP
    [key]: response,
  };
};
const result = await ['a', 'b', 'c', 'd'].reduce(reducer, {});
console.log(result); // <-- Will be the final result
How to safely use async reduce
That being said, using a reducer this way does mean that you need to guarantee it does not throw, else you will get "unhandled promise rejections". It's perfectly possible to ensure this by using a try-catch, with the catch block returning the accumulator (optionally with a record for the failed API call).
const reducer = async (acc, key) => {
  try {
    const data = await doSlowTask(key);
    return {...await acc, [key]: data};
  } catch (error) {
    return {...await acc, [key]: {error}};
  }
};
const result = await ['a', 'b', 'c','d'].reduce(reducer, {});
Difference with Promise.allSettled
You can get close to the behavior of an async reduce (with error catching) by using Promise.allSettled. However, this is clunky to use: you need to add another synchronous reduce after it if you want to end up with an object.
The theoretical time complexity is also higher for Promise.allSettled + a regular reduce, though there are probably very few use cases where this makes a difference. An async reduce can start accumulating from the moment the first item is done, whereas a reduce after Promise.allSettled is blocked until all promises are fulfilled. This could make a difference when looping over a very large number of elements.
const responseTime = 200; //ms
function sleep(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
const api = async (key) => {
console.log(`Calling API for ${ key }`);
// Boz is a slow endpoint.
await sleep(key === 'boz' ? 800 : responseTime);
console.log(`Got response for ${ key }`);
if (key === 'bar') throw new Error(`It doesn't work for ${ key }`);
return {
[key]: `API says ${ key }`,
};
};
const keys = ['foo', 'bar', 'baz', 'buz', 'boz'];
const reducer = async (acc, key) => {
let data;
try {
const response = await api(key);
data = {
apiData: response
};
} catch (e) {
data = {
error: e.message
};
}
// OP doesn't care how this works, he only wants to know when the whole thing is ready.
const previous = await acc;
console.log(`Got previous for ${ key }`);
return {
...previous,
[key]: {
...data
},
};
};
(async () => {
const start = performance.now();
const result = await keys.reduce(reducer, {});
console.log(`After ${ performance.now() - start }ms`, result); // <-- OP wants to execute things when it's ready.
})();
Check the order of execution with Promise.allSettled:
const responseTime = 200; //ms
function sleep(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
const api = async (key) => {
console.log(`Calling API for ${ key }`);
// Boz is a slow endpoint.
await sleep(key === 'boz' ? 800 : responseTime);
console.log(`Got response for ${ key }`);
if (key === 'bar') throw new Error(`It doesn't work for ${ key }`);
return {
key,
data: `API says ${ key }`,
};
};
const keys = ['foo', 'bar', 'baz', 'buz', 'boz'];
(async () => {
const start = performance.now();
const apiResponses = await Promise.allSettled(keys.map(api));
const result = apiResponses.reduce((acc, {status, reason, value}) => {
const {key, data} = value || {};
console.log(`Got previous for ${ key }`);
return {
...acc,
[key]: status === 'fulfilled' ? {apiData: data} : {error: reason.message},
};
}, {});
console.log(`After ${ performance.now() - start }ms`, result); // <-- OP wants to execute things when it's ready.
})();
Sometimes the best thing to do is simply put both code versions side by side, sync and async:
Sync version:
const arr = [1, 2, 3, 4, 5];
const syncRev = arr.reduce((acc, i) => [i, ...acc], []); // [5, 4, 3, 2, 1]
Async one:
(async () => {
const asyncRev = await arr.reduce(async (promisedAcc, i) => {
const id = await asyncIdentity(i); // could be id = i, just stubbing async op.
const acc = await promisedAcc;
return [id, ...acc];
}, Promise.resolve([])); // [5, 4, 3, 2, 1]
})();
//async stuff
async function asyncIdentity(id) {
return Promise.resolve(id);
}
const arr = [1, 2, 3, 4, 5];
(async () => {
const asyncRev = await arr.reduce(async (promisedAcc, i) => {
const id = await asyncIdentity(i);
const acc = await promisedAcc;
return [id, ...acc];
}, Promise.resolve([]));
console.log('asyncRev :>> ', asyncRev);
})();
const syncRev = arr.reduce((acc, i) => [i, ...acc], []);
console.log('syncRev :>> ', syncRev);
async function asyncIdentity(id) {
return Promise.resolve(id);
}
For TypeScript, the previous value and the initial value need to have the same type:
const data = await array.reduce(async (accumP: Promise<Tout>, curr: Tin) => {
  const accum: Tout = await accumP;
  // doSomeStuff...
  return accum;
}, Promise.resolve({} as Tout));
You can wrap your entire map/reduce iterator block in its own Promise.resolve and await that to complete. The issue, though, is that the accumulator doesn't contain the resulting data/object you'd expect on each iteration. Because of the internal async/await/Promise chain, the accumulator values are actually Promises themselves, which likely have yet to resolve despite the await keyword before your call to store (which might lead you to believe that the iteration won't return until that call completes and the accumulator is updated).
While this is not the most elegant solution, one option you have is to move your data object variable out of scope and declare it with let so that proper binding and mutation can occur. Then update this data object from inside your iterator as the async/await/Promise calls resolve.
/* allow the result object to be initialized outside of scope
rather than trying to spread results into your accumulator on iterations,
else your results will not be maintained as expected within the
internal async/await/Promise chain.
*/
let data = {};
await Promise.resolve(bodies.reduce(async(accum, current, index) => {
const methodName = methods[index]
const method = this[methodName];
if (methodName == 'foo') {
// note: this extra Promise.resolve may not be entirely necessary
const cover = await Promise.resolve(this.store(current.cover, id));
current.cover = cover;
console.log(current);
data = {
...data,
...current,
};
return data;
}
data = {
...data,
...method(current.data)
};
return data;
}, {}));
console.log(data);
export const addMultiTextData = async(data) => {
const textData = await data.reduce(async(a, {
currentObject,
selectedValue
}) => {
const {
error,
errorMessage
} = await validate(selectedValue, currentObject);
return {
...await a,
[currentObject.id]: {
text: selectedValue,
error,
errorMessage
}
};
}, {});
return textData;
};
Here's how to make an async reduce:
async function asyncReduce(arr, fn, initialValue) {
let temp = initialValue;
for (let idx = 0; idx < arr.length; idx += 1) {
const cur = arr[idx];
temp = await fn(temp, cur, idx);
}
return temp;
}
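For example (a hypothetical usage, assuming the asyncReduce helper above is in scope, with sleep standing in for real async work), it can be called like the built-in reduce:
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

(async () => {
  const sum = await asyncReduce([1, 2, 3], async (total, n) => {
    await sleep(100); // pretend async work
    return total + n;
  }, 0);
  console.log(sum); // 6
})();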
Another classic option with Bluebird
const promise = require('bluebird');
promise.reduce([1,2,3], (agg, x) => Promise.resolve(agg+x),0).then(console.log);
// Expected to produce the sum 6
My solution for .reduce in TypeScript, thanks to this comment: https://dev.to/arnaudcourtecuisse/comment/1el22
const userOrders = await existUsersWithName.reduce(
  async (promise, existUserAndName) => {
    const acc = await promise;
    const {user, name} = existUserAndName;
    // My async function
    acc[user] = await this.users.getOrders(name);
    return acc;
  },
  Promise.resolve({} as Record<string, string[] | undefined>)
);
Introduction
Imagine this method for getting the language of a user:
const getUserLanguage = (userId) => new Promise(
(resolve, reject) => {
if (Math.random() < 0.3) resolve("en");
if (Math.random() < 0.6) resolve("es");
reject("Unexpected error.");
}
);
(async () => {
try {
const language = await getUserLanguage("Mike")
console.log(`Language: ${language}`);
} catch(err) {
console.error(err);
}
})();
Now I am trying to group multiple users by language, performing the requests in parallel:
const getUserLanguage = () => new Promise(
(resolve, reject) => {
if (Math.random() < 0.3) resolve("en");
if (Math.random() < 0.6) resolve("es");
reject("Unexpected error.");
}
);
const groupUsersByLanguage = async (userIds) => {
const promiseResults = await Promise.allSettled(
userIds.reduce(async (acc, userId) => {
const language = await getUserLanguage(userId);
(acc[language] = acc[language] ?? []).push(userId);
return acc;
}, {})
);
console.log({ promiseResults });
// Filter fulfilled promises
const result = promiseResults
.filter(({ status }) => status === "fulfilled")
.map(({ value }) => value);
return result;
}
(async () => {
const userIds = ["Mike", "Walter", "Saul", "Pinkman"];
const usersGroupedByLanguage = await groupUsersByLanguage(userIds);
console.log(usersGroupedByLanguage);
})();
Problem
But my implementation is not working:
const promiseResults = await Promise.allSettled(
userIds.reduce(async (acc, userId) => {
const language = await getUserLanguage(userId);
(acc[language] = acc[language] ?? []).push(userId);
return acc;
}, {})
);
How can I get an output like
{
"es": ["Mike", "Saul"],
"en": ["Walter"],
}
using Promise.allSettled combined with .reduce?
Your .reduce callback is async, so it returns a Promise on every iteration - which means the reduce as a whole produces a single Promise, not the grouped object. That isn't something .allSettled can work with - you must pass it an array of Promises.
I'd create an object outside, which gets mutated inside a .map callback. This way, you'll have an array of Promises that .allSettled can work with, and also have the object in the desired shape.
const getLanguage = () => new Promise(
(resolve, reject) => {
if (Math.random() < 0.3) resolve("en");
if (Math.random() < 0.6) resolve("es");
reject("Unexpected error.");
}
);
const groupUsersByLanguage = async (userIds) => {
const grouped = {};
await Promise.allSettled(
userIds.map(async (userId) => {
const language = await getLanguage(userId);
(grouped[language] = grouped[language] ?? []).push(userId);
})
);
return grouped;
}
(async () => {
const userIds = ["Mike", "Walter", "Saul", "Pinkman"];
const usersGroupedByLanguage = await groupUsersByLanguage(userIds);
console.log(usersGroupedByLanguage);
})();
An option that doesn't rely on side-effects inside a .map would be to instead return both the userId and the language inside the map callback, then filter the allSettled results to include only the good ones, then turn it into an object.
const getLanguage = () => new Promise(
(resolve, reject) => {
if (Math.random() < 0.3) resolve("en");
if (Math.random() < 0.6) resolve("es");
reject("Unexpected error.");
}
);
const groupUsersByLanguage = async (userIds) => {
const settledResults = await Promise.allSettled(
userIds.map(async (userId) => {
const language = await getLanguage(userId);
return [userId, language];
})
);
const grouped = {};
settledResults
.filter(result => result.status === 'fulfilled')
.map(result => result.value)
.forEach(([userId, language]) => {
(grouped[language] = grouped[language] ?? []).push(userId);
});
return grouped;
}
(async () => {
const userIds = ["Mike", "Walter", "Saul", "Pinkman"];
const usersGroupedByLanguage = await groupUsersByLanguage(userIds);
console.log(usersGroupedByLanguage);
})();
I would write a main function using two utility functions for this: one that groups a set of elements according to the result of a function, and one that takes a predicate function and partitions an array into those for which it returns true and those for which it returns false. These two in turn use a push utility function, which simply reifies Array.prototype.push into a plain function.
The main function maps getUserLanguage over the users and calls Promise.allSettled on the results. We then map over the resulting promises to connect the original userId back with each promise result. (If the fake getUserLanguage returned an object with properties for both the userId and the language, this step would be unnecessary.) Then we partition the resulting promises to separate the fulfilled ones from the rejected ones. I do this because your question doesn't say what to do with the rejected language lookups; I choose to add one more entry to the output, so as well as es and en we also get a list of userIds under _errors. If we wanted to ignore these, we could replace the partition with a filter and simplify the last step. That last step takes the successes and the failures, combining the successes into an object with our group helper and appending the _errors by mapping the failures to their userIds.
It might look like this:
// dummy implementation, resolving to random language, or rejecting with error
const getUserLanguage = (userId) => new Promise ((resolve, reject) => {if (Math.random() < 0.3) resolve("en"); if (Math.random() < 0.6) resolve("es"); reject("Unexpected error.");});
// utility functions
const push = (x) => (xs) =>
(xs .push (x), xs)
const partition = (fn) => (xs) =>
xs .reduce (([y, n], x) => fn (x) ? [push (x) (y), n] : [y, push (x) (n)], [[], []])
const group = (getKey, getValue) => (xs) =>
xs .reduce ((a, x, _, __, key = getKey (x)) => ((a [key] = push (getValue (x)) (a[key] ?? [])), a), {})
// main function
const groupUsersByLanguage = (users) => Promise .allSettled (users .map (getUserLanguage))
.then (ps => ps .map ((p, i) => ({...p, user: users [i]})))
.then (partition (p => p .status == 'fulfilled'))
.then (([fulfilled, rejected]) => ({
...group (x => x .value, x => x.user) (fulfilled),
_errors: rejected .map (r => r .user)
}))
// sample data
const users = ['fred', 'wilma', 'betty', 'barney', 'pebbles', 'bambam', 'yogi', 'booboo']
// demo
groupUsersByLanguage (users)
.then (console .log)
This yields output like this (YMMV because of the random calls):
{
en: [
"fred",
"wilma",
"barney"
],
es: [
"bambam",
"yogi",
"booboo"
],
_errors: [
"betty",
"pebbles"
]
}
Note that those utility functions are general-purpose. If we keep our own libraries of such tools handy, we can write functions like this without great effort.
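As a quick illustration of that reusability (my example, not part of the answer), the same group helper can bucket plain numbers by parity:
// Assumes the push and group helpers from the snippet above are in scope.
const byParity = group(n => (n % 2 === 0 ? 'even' : 'odd'), n => n);
console.log(byParity([1, 2, 3, 4, 5])); // => { odd: [ 1, 3, 5 ], even: [ 2, 4 ] }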
Another option would be to first fetch all the languages using:
const languages = await Promise.allSettled(userIds.map(getLanguage));
Then zip them together with the userIds and process them further.
async function getLanguage() {
if (Math.random() < 0.3) return "en";
if (Math.random() < 0.6) return "es";
throw "Unexpected error.";
}
function zip(...arrays) {
if (!arrays[0]) return;
return arrays[0].map((_, i) => arrays.map(array => array[i]));
}
async function groupUsersByLanguage(userIds) {
const languages = await Promise.allSettled(userIds.map(getLanguage));
const groups = {};
for (const [userId, language] of zip(userIds, languages)) {
if (language.status != "fulfilled") continue;
groups[language.value] ||= [];
groups[language.value].push(userId);
}
return groups;
}
(async () => {
const userIds = ["Mike", "Walter", "Saul", "Pinkman"];
const usersGroupedByLanguage = await groupUsersByLanguage(userIds);
console.log(usersGroupedByLanguage);
})();
If you are not interested in creating a zip() helper you can use a "normal" for-loop:
const groups = {};
for (let i = 0; i < userIds.length; i += 1) {
if (languages[i].status != "fulfilled") continue;
groups[languages[i].value] ||= [];
groups[languages[i].value].push(userIds[i]);
}
I used promises as advised in my previous question to get values from 2 async calls.
But I want to filter the results of my first call based on a condition from my second call. I keep getting undefined when I do what I am doing. How do I get my desired result?
First JSON:
let first_json = [
{
"company": "one"
},
{
"company": "two"
},
{
"company": "three"
}
]
The second JSON is dependent on the first one and is of similar format.
Using promises I did:
$.getJSON(first_json)
.then(first_data =>
first_data.map(d => {
return d.company;
})
)
.then(promises => Promise.all(promises))
.then(company => company.map(c => {
let second_json = json_string + c;
$.getJSON(second_json, function(data) {
if (data.length > 0) return c;
});
}))
.then(arr => {
console.log(arr);
});
arr for me is supposed to return ['one', 'three'] but is instead returning:
[undefined, undefined, undefined].
Why is that happening and how do I fix it?
Your $.getJSON call is asynchronous, so unless you wait for it (by returning it and chaining a then), its result won't be available to you right away, and therefore you can't act on it.
Instead, do it like this:
$.getJSON(first_json)
.then(first_data =>
first_data.map(d => {
return d.company;
})
)
.then(promises => Promise.all(promises))
.then(company => company.map(c => {
let second_json = json_string + c;
return $.getJSON(second_json)
.then(data => {
if (data.length > 0) return c;
});
}))
.then(promises => Promise.all(promises))
.then(arr => {
console.log(arr);
});
You're applying Promise.all at the wrong stage:
$.getJSON(first_json).then(first_data => {
const companies = first_data.map(d => {
return d.company;
});
const promises = companies.map(c => {
// ^^^^^^^^
let second_json = json_string + c;
return $.getJSON(second_json).then(data => {
// ^^^^^^
if (data.length > 0) return c;
});
});
return Promise.all(promises);
// ^^^^^^^^^^^
}).then(arr => {
console.log(arr);
});
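For comparison (not part of either answer, with a made-up function name, and assuming the json_string prefix and endpoints from the question), the same flow written with async/await looks like this:
async function getMatchingCompanies() {
  const first_data = await $.getJSON(first_json);
  const companies = first_data.map(d => d.company);
  // Fire all the second requests in parallel and wait for them together.
  const arr = await Promise.all(companies.map(async (c) => {
    const data = await $.getJSON(json_string + c);
    if (data.length > 0) return c; // otherwise this entry resolves to undefined, as above
  }));
  console.log(arr);
  return arr;
}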
If I have an array and a callback function, how do I get the results back asynchronously? Here's what I've been trying:
const fakeAsync = (list = [], cb = '') => {
let map = {};
list.forEach((x) => {
map[x] = cb(x)
})
return map;
}
const list = [
'user1',
'user2',
'user3'
]
const cb = (name) => {
setTimeout(function(){ return 'good ' + name }, 3000);
}
fakeAsync(list, cb)
It prints out:
=> { user1: undefined, user2: undefined, user3: undefined }
I want it to be:
=> { user1: 'good user1', user2: 'good user2', user3: 'good user3' }
Calling cb will return nothing directly, so the return inside the setTimeout doesn't actually return anything to be put inside the map object.
You could use Promises to get it simulated like this:
const fakeAsync = (list = [], cb = ()=>{}) => {
Promise.all( // Wait for all Promises to settle/resolve
list.map((x) => { // Make an array of Promises from the list
return new Promise((resolve) => { // Return Promise to be settled
setTimeout(() => { // Simulated delay before resolving with a return value
return resolve(cb(x))
}, 3000)
})
})
).then((map) => { // Get all returned values from the Promises
console.log(map);
});
}
const list = [
'user1',
'user2',
'user3'
]
const cb = (name) => {
return 'good ' + name; // The actual return value
}
fakeAsync(list, cb);
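For completeness (my addition, not from the answer above), an async/await version of the same simulation could look like this:
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const fakeAsync = async (list = [], cb = () => {}) => {
  const map = {};
  // Run all the simulated delays in parallel, then collect the results.
  await Promise.all(list.map(async (x) => {
    await sleep(3000);
    map[x] = cb(x);
  }));
  return map;
};

(async () => {
  const result = await fakeAsync(['user1', 'user2', 'user3'], name => 'good ' + name);
  console.log(result); // { user1: 'good user1', user2: 'good user2', user3: 'good user3' }
})();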
Some references/documentation:
https://developer.mozilla.org/nl/docs/Web/JavaScript/Reference/Global_Objects/Promise
https://developers.google.com/web/fundamentals/primers/promises