Difference between for loop [duplicate] - javascript

Are there any issues with using async/await in a forEach loop? I'm trying to loop through an array of files and await on the contents of each file.
import fs from 'fs-promise'
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
})
}
printFiles()
This code does work, but could something go wrong with this? I had someone tell me that you're not supposed to use async/await in a higher-order function like this, so I just wanted to ask if there was any issue with this.

Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles function returns immediately after that.
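To see this for yourself, a minimal sketch (assuming the question's code as-is):
printFiles().then(() => console.log('printFiles resolved'))
// 'printFiles resolved' is typically logged before any file contents,
// because forEach starts the callbacks without waiting for them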
Reading in sequence
If you want to read the files in sequence, you cannot use forEach indeed. Just use a modern for … of loop instead, in which await will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
Reading in parallel
If you want to read the files in parallel, you cannot use forEach indeed. Each of the async callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you'll get with Promise.all:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}

With ES2018, you are able to greatly simplify all of the above answers to:
async function printFiles () {
const files = await getFilePaths()
for await (const contents of files.map(file => fs.readFile(file, 'utf8'))) {
console.log(contents)
}
}
See spec: proposal-async-iteration
Simplified:
for await (const results of array) {
await longRunningTask()
}
console.log('I will wait')
2018-09-10: This answer has been getting a lot of attention recently; please see Axel Rauschmayer's blog post for further information about asynchronous iteration.

Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the Promises are resolved), I use Array.prototype.reduce, starting with a resolved Promise:
async function printFiles () {
const files = await getFilePaths();
await files.reduce(async (promise, file) => {
// This line will wait for the last async function to finish.
// The first iteration uses an already resolved Promise
// so, it will immediately continue.
await promise;
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}, Promise.resolve());
}

The p-iteration module on npm implements the Array iteration methods so they can be used in a very straightforward way with async/await.
An example with your case:
const { forEach } = require('p-iteration');
const fs = require('fs-promise');
(async function printFiles () {
const files = await getFilePaths();
await forEach(files, async (file) => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
})();

Picture worth 1000 words - For the Sequential Approach Only
Background: I was in a similar situation last night. I used an async function as the forEach argument. The result was unpredictable. When I tested my code 3 times, it ran without issues 2 times and failed 1 time (something weird).
Finally I got my head around it and did some scratch-pad testing.
Scenario 1 - How out of order it can get with async in forEach
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
myPromiseArray.forEach(async (element, index) => {
let result = await element;
console.log(result);
})
console.log('After For Each Loop')
}
main();
Scenario 2 - Using a for...of loop as #Bergi suggested above
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well
for (const element of myPromiseArray) {
let result = await element;
console.log(result)
}
console.log('After For Each Loop')
}
main();
If you are a little old school like me, you could simply use the classic for loop; that works too :)
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well too - the classic for loop :)
for (let i = 0; i < myPromiseArray.length; i++) {
const result = await myPromiseArray[i];
console.log(result);
}
console.log('After For Each Loop')
}
main();
I hope this helps someone, good day, cheers!

files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
})
The issue is that the promise returned by the iteration function is ignored by forEach(). forEach does not wait to move to the next iteration after each async callback has completed. All of the fs.readFile functions
will be invoked in the same round of the event loop, which means they are started in parallel, not sequentially, and execution continues immediately after invoking forEach(), without
waiting for all of the fs.readFile operations to complete. Since forEach does not wait for each promise to resolve, the loop actually finishes iterating before the promises are resolved. You are expecting that after forEach has completed, all of the async code has already executed, but that is not the case. You may end up trying to access values that are not available yet.
You can test the behaviour with this example code:
const array = [1, 2, 3];
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const delayedSquare = (num) => sleep(100).then(() => num * num);
const testForEach = (numbersArray) => {
const store = [];
// this code here treated as sync code
numbersArray.forEach(async (num) => {
const squaredNum = await delayedSquare(num);
// this will log the correct squaredNum value
console.log(squaredNum);
store.push(squaredNum);
});
// you expect the store array to be populated, but it is not
// this will log []
console.log("store", store);
};
testForEach(array);
// Notice, when you test, "store []" will be logged first,
// then the squaredNums inside forEach will be logged
The solution is to use a for...of loop.
for (const file of files){
const contents = await fs.readFile(file, 'utf8')
}

Here are some forEachAsync prototypes. Note you'll need to await them:
Array.prototype.forEachAsync = async function (fn) {
for (let t of this) { await fn(t) }
}
Array.prototype.forEachAsyncParallel = async function (fn) {
await Promise.all(this.map(fn));
}
Note while you may include this in your own code, you should not include this in libraries you distribute to others (to avoid polluting their globals).
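If you'd prefer not to touch Array.prototype at all, a minimal standalone sketch of the same two helpers (the names are just illustrative) looks like this:
// Sequential: awaits each callback before moving on
async function forEachAsync (array, fn) {
  for (const item of array) { await fn(item) }
}
// Parallel: starts every callback, then waits for all of them
async function forEachAsyncParallel (array, fn) {
  await Promise.all(array.map(fn))
}
// Usage (remember to await them):
// await forEachAsync(files, async (file) => { /* ... */ })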

#Bergi has already given the answer on how to handle this particular case properly, so I won't duplicate it here.
I'd like to address the difference between using forEach and a for loop when it comes to async and await.
How forEach works
Let's look at how forEach works. Following the ECMAScript specification, MDN provides an implementation which can be used as a polyfill. I'll copy and paste it here with the comments removed.
Array.prototype.forEach = function (callback, thisArg) {
if (this == null) { throw new TypeError('Array.prototype.forEach called on null or undefined'); }
var T, k;
var O = Object(this);
var len = O.length >>> 0;
if (typeof callback !== "function") { throw new TypeError(callback + ' is not a function'); }
if (arguments.length > 1) { T = thisArg; }
k = 0;
while (k < len) {
var kValue;
if (k in O) {
kValue = O[k];
callback.call(T, kValue, k, O); // pay attention to this line
}
k++;
}
};
Let's go back to your code and extract the callback as a function.
async function callback(file){
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}
So, basically callback returns a promise since it's declared with async. Inside forEach, callback is just called in the normal way. If the callback itself returns a promise, the JavaScript engine will not wait for it to be resolved or rejected; it schedules the continuation as a job in the job queue and continues executing the loop.
How about await fs.readFile(file, 'utf8') inside the callback?
Basically, when your async callback gets the chance to be executed, the JS engine will pause until fs.readFile(file, 'utf8') is resolved or rejected, and resume execution of the async function after fulfillment. So the contents variable stores the actual result from fs.readFile, not a promise, and console.log(contents) logs the file content, not a Promise.
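A small sketch of that ordering (the file names are just placeholders, assuming the promise-based fs from the question):
['a.txt', 'b.txt'].forEach(async (file) => {
  console.log('started', file)            // runs synchronously during the loop
  const contents = await fs.readFile(file, 'utf8')
  console.log('finished', file)           // runs later, from the job queue
})
console.log('loop done')                  // logged before any 'finished' line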
Why does for...of work?
When we write a generic for...of loop, we gain more control than with forEach. Let's refactor printFiles.
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
for (const file of files) {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
// or await callback(file)
}
}
When the for loop is evaluated, we have await promise inside the async function, so execution will pause until the awaited promise is settled. As a result, you can think of the files as being read one by one, in a determined order.
Execute sequentially
Sometimes, we really need the async functions to be executed in sequential order. For example, I have a few new records stored in an array to be saved to a database, and I want them to be saved in sequential order, which means the first record in the array should be saved first, then the second, until the last one is saved.
Here is an example:
const records = [1, 2, 3, 4];
async function saveRecord(record) {
return new Promise((resolved, rejected) => {
setTimeout(()=> {
resolved(`record ${record} saved`)
}, Math.random() * 500)
});
}
async function forEachSaveRecords(records) {
records.forEach(async (record) => {
const res = await saveRecord(record);
console.log(res);
})
}
async function forofSaveRecords(records) {
for (const record of records) {
const res = await saveRecord(record);
console.log(res);
}
}
(async () => {
console.log("=== for of save records ===")
await forofSaveRecords(records)
console.log("=== forEach save records ===")
await forEachSaveRecords(records)
})()
I use setTimeout to simulate the process of saving a record to a database - it's asynchronous and takes a random amount of time. Using forEach, the records are saved in an undetermined order, but using for...of, they are saved sequentially.
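For what it's worth, a typical run of the snippet above might print something like this (the ordering after the forEach header varies from run to run):
=== for of save records ===
record 1 saved
record 2 saved
record 3 saved
record 4 saved
=== forEach save records ===
record 3 saved
record 1 saved
record 4 saved
record 2 saved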

This solution is also memory-optimized, so you can run it on tens of thousands of data items and requests. Some of the other solutions here will crash the server on large data sets.
In TypeScript:
export async function asyncForEach<T>(array: Array<T>, callback: (item: T, index: number) => Promise<void>) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index);
}
}
How to use?
await asyncForEach(receipts, async (eachItem) => {
await ...
})

A simple drop-in fix for a forEach() await loop that is not working is to replace forEach with map and add Promise.all( to the beginning.
For example:
await y.forEach(async (x) => {
to
await Promise.all(y.map(async (x) => {
An extra ) is needed at the end.
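Spelled out against the question's printFiles (a sketch assuming the same getFilePaths and promise-based fs), the replacement looks like this, inside an async function:
// before (does not wait):
// files.forEach(async (file) => {
//   const contents = await fs.readFile(file, 'utf8')
//   console.log(contents)
// })
// after (waits for every callback):
await Promise.all(files.map(async (file) => {
  const contents = await fs.readFile(file, 'utf8')
  console.log(contents)
}))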

In addition to #Bergi’s answer, I’d like to offer a third alternative. It's very similar to #Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, each of which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
In #Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.

It's pretty painless to pop a couple of methods into a file that will handle asynchronous data in a serialized order and give your code a more conventional flavour. For example:
module.exports = function () {
var self = this;
this.each = async (items, fn) => {
if (items && items.length) {
await Promise.all(
items.map(async (item) => {
await fn(item);
}));
}
};
this.reduce = async (items, fn, initialValue) => {
await self.each(
items, async (item) => {
initialValue = await fn(initialValue, item);
});
return initialValue;
};
};
now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
var myAsync = new MyAsync();
var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
var cleanParams = [];
// FOR EACH EXAMPLE
await myAsync.each(['bork', 'concern', 'heck'],
async (elem) => {
if (elem !== 'heck') {
await doje.update({ $push: { 'noises': elem }});
}
});
var cat = await Cat.findOne({ name: 'Nyan' });
// REDUCE EXAMPLE
var friendsOfNyanCat = await myAsync.reduce(cat.friends,
async (catArray, friendId) => {
var friend = await Friend.findById(friendId);
if (friend.name !== 'Long cat') {
catArray.push(friend.name);
}
}, []);
// Assuming Long Cat was a friend of Nyan Cat...
assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}

Bergi's solution works nicely when fs is promise based.
You can use bluebird, fs-extra or fs-promise for this.
However, a solution for Node's native fs library is as follows:
const result = await Promise.all(filePaths
.map( async filePath => {
const fileContents = await getAssetFromCache(filePath, async function() {
// 1. Wrap with Promise
// 2. Return the result of the Promise
return await new Promise((res, rej) => {
fs.readFile(filePath, 'utf8', function(err, data) {
if (err) return rej(err); // reject on error so the promise always settles
res(data);
});
});
});
return fileContents;
}));
Note:
fs.readFile from require('fs') requires a callback function as its third argument here, otherwise it throws this error:
TypeError [ERR_INVALID_CALLBACK]: Callback must be a function

It is not good to await an asynchronous method inside a loop, because each loop iteration will be delayed until the entire asynchronous operation completes. That is not very performant, and it also forgoes the parallelization benefits of async/await.
A better solution would be to create all promises at once, then get access to the results using Promise.all(). Otherwise, each successive operation will not start until the previous one has completed.
Consequently, the code may be refactored as follows;
const printFiles = async () => {
const files = await getFilePaths();
const results = [];
files.forEach((file) => {
results.push(fs.readFile(file, 'utf8'));
});
const contents = await Promise.all(results);
console.log(contents);
}

One important caveat: the await + for...of method and the forEach + async way actually have different effects.
Having await inside a real for loop will make sure all async calls are executed one by one. The forEach + async way fires off all promises at the same time, which is faster but can sometimes be overwhelming (if you do some DB queries or visit some web services with volume restrictions and do not want to fire 100,000 calls at a time).
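If you do need to cap how many calls are in flight at once, one hedged option (not from the original answer; the helper name and chunk size are just illustrative) is to process the array in fixed-size chunks:
async function forEachInChunks (items, chunkSize, fn) {
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize)
    await Promise.all(chunk.map(fn)) // at most chunkSize calls running at a time
  }
}
// e.g. read at most 10 files at a time:
// await forEachInChunks(files, 10, file => fs.readFile(file, 'utf8'))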
You can also use reduce + promise (less elegant) if you do not use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
lastPromise.then(() =>
fs.readFile(file, 'utf8')
), Promise.resolve()
)
Or you can create a forEachAsync to help but basically use the same for loop underlying.
Array.prototype.forEachAsync = async function(cb){
for(let x of this){
await cb(x);
}
}

Just adding to the original answer.
The parallel-reading syntax in the original answer can be confusing and difficult to read, so maybe we can write it in a different way:
async function printFiles() {
const files = await getFilePaths();
const fileReadPromises = [];
const readAndLogFile = async filePath => {
const contents = await fs.readFile(filePath, "utf8");
console.log(contents);
return contents;
};
files.forEach(file => {
fileReadPromises.push(readAndLogFile(file));
});
await Promise.all(fileReadPromises);
}
For sequential operation, not just for...of: a normal for loop will also work.
async function printFiles() {
const files = await getFilePaths();
for (let i = 0; i < files.length; i++) {
const file = files[i];
const contents = await fs.readFile(file, "utf8");
console.log(contents);
}
}

Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing them all into an array and resolving it in a promise once everything is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))

Like #Bergi's response, but with one difference.
Promise.all rejects as soon as one of the promises gets rejected.
So, use recursion instead.
const readFilesQueue = async (files, index = 0) => {
const contents = await fs.readFile(files[index], 'utf8')
console.log(contents)
// recurse while there are files left to read
return index + 1 < files.length
? readFilesQueue(files, index + 1)
: files
}
const printFiles = async () => {
const files = await getFilePaths();
const printContents = await readFilesQueue(files)
return printContents
}
printFiles()
PS
readFilesQueue is outside of printFiles because of the side effect* introduced by console.log; it's better to mock, test, and/or spy on it, so it's not great to have a function that returns the contents (side note).
Therefore, the code can simply be designed with that in mind: three separate functions that are "pure"** and introduce no side effects, process the entire list, and can easily be modified to handle failure cases.
const files = await getFilesPath()
const printFile = async (file) => {
const content = await fs.readFile(file, 'utf8')
console.log(content)
}
const readFiles = async (files, index = 0) => {
await printFile(files[index])
return index + 1 < files.length
? readFiles(files, index + 1)
: files
}
readFiles(files)
Future edit/current state
Node supports top-level await (this doesn't have a plugin yet, won't have one, and can be enabled via harmony flags). It's cool but doesn't solve one problem (strategically I work only on LTS versions): how do you get the files?
Using composition. Given the code, my sense is that this sits inside a module, so there should be a function to do it. If not, you should use an IIFE to wrap the code in an async function, creating a simple module that does it all for you, or you can go the right way: composition.
// more complex version with IIFE to a single module
(async (files) => readFiles(await files()))(getFilesPath)
Note that the name of the variable changes due to semantics. You pass a functor (a function that can be invoked by another function) and it receives a pointer to memory that contains the initial block of logic of the application.
But what if it's not a module and you need to export the logic?
Wrap the functions in an async function.
export const readFilesQueue = async () => {
// ... the code goes here
}
Or change the names of variables, whatever...
* By side effect I mean any collateral effect of the application that can change the state/behaviour or introduce bugs into the application, like IO.
** "Pure" is in quotes since the functions are not actually pure; the code could be converged to a pure version when there's no console output, only data manipulation.
Aside from this, to be pure you would need to work with monads that handle the side effect, which are error prone and treat that error separately from the application.

You can use Array.prototype.forEach, but async/await is not very compatible with it. This is because an async callback returns a promise, but Array.prototype.forEach does nothing with the promises returned from its callback. So you can use forEach, but you'll have to handle the promise resolution yourself.
Here is a way to read and print each file in series using Array.prototype.forEach
async function printFilesInSeries () {
const files = await getFilePaths()
let promiseChain = Promise.resolve()
files.forEach((file) => {
promiseChain = promiseChain.then(() => {
return fs.readFile(file, 'utf8').then((contents) => { // return it so the chain waits for this read
console.log(contents)
})
})
})
await promiseChain
}
Here is a way (still using Array.prototype.forEach) to print the contents of files in parallel
async function printFilesInParallel () {
const files = await getFilePaths()
const promises = []
files.forEach((file) => {
promises.push(
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
)
})
await Promise.all(promises)
}

Today I came across multiple solutions for this - running async/await functions in a forEach loop. By building a wrapper around it we can make this happen.
A more detailed explanation of how the native forEach works internally, why it cannot wait for an async function call, and other details on the various methods are provided in the link here.
The multiple ways in which it can be done are as follows:
Method 1 : Using the wrapper.
await (()=>{
return new Promise((resolve,reject)=>{
items.forEach(async (item,index)=>{
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
if(index === items.length-1){
resolve('Done')
}
});
});
})();
Method 2: Using the same as a generic function of Array.prototype
Array.prototype.forEachAsync.js
if(!Array.prototype.forEachAsync) {
Array.prototype.forEachAsync = function (fn){
return new Promise((resolve,reject)=>{
this.forEach(async(item,index,array)=>{
await fn(item,index,array);
if(index === array.length-1){
resolve('done');
}
})
});
};
}
Usage :
require('./Array.prototype.forEachAsync');
let count = 0;
let hello = async (items) => {
// Method 1 - Using the Array.prototype.forEach
await items.forEachAsync(async () => {
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
});
console.log("count = " + count);
}
someAPICall = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve("done") // or reject('error')
}, 100);
})
}
hello(['', '', '', '']); // hello([]) with an empty array is also handled by default
Method 3 :
Using Promise.all
await Promise.all(items.map(async (item) => {
await someAPICall();
count++;
}));
console.log("count = " + count);
Method 4 : Traditional for loop or modern for loop
// Method 4 - using for loop directly
// 1. Using the modern for(.. of..) loop
for(const item of items){
await someAPICall();
count++;
}
//2. Using the traditional for loop
for(let i=0;i<items.length;i++){
await someAPICall();
count++;
}
console.log("count = " + count);

Currently Array.prototype.forEach doesn't await async operations, but we can create our own polyfill to meet our needs.
// Example of asyncForEach Array poly-fill for NodeJs
// file: asyncForEach.js
// Define asynForEach function
async function asyncForEach(iteratorFunction){
let indexer = 0
for(let data of this){
await iteratorFunction(data, indexer)
indexer++
}
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}
And that's it! You now have an async forEach method available on any arrays that are defined after these two operations.
Let's test it...
// Nodejs style
// file: someOtherFile.js
const readline = require('readline')
Array = require('./asyncForEach').Array
const log = console.log
// Create a stream interface
function createReader(options={prompt: '>'}){
return readline.createInterface({
input: process.stdin
,output: process.stdout
,prompt: options.prompt !== undefined ? options.prompt : '>'
})
}
// Create a cli stream reader
async function getUserIn(question, options={prompt:'>'}){
log(question)
let reader = createReader(options)
return new Promise((res)=>{
reader.on('line', (answer)=>{
process.stdout.cursorTo(0, 0)
process.stdout.clearScreenDown()
reader.close()
res(answer)
})
})
}
let questions = [
`What's your name`
,`What's your favorite programming language`
,`What's your favorite async function`
]
let responses = {}
async function getResponses(){
// Notice we have to prepend await before calling the async Array function
// in order for it to function as expected
await questions.asyncForEach(async function(question, index){
let answer = await getUserIn(question)
responses[question] = answer
})
}
async function main(){
await getResponses()
log(responses)
}
main()
// Should prompt user for an answer to each question and then
// log each question and answer as an object to the terminal
We could do the same for some of the other array functions like map...
async function asyncMap(iteratorFunction){
let newMap = []
let indexer = 0
for(let data of this){
newMap[indexer] = await iteratorFunction(data, indexer, this)
indexer++
}
return newMap
}
Array.prototype.asyncMap = asyncMap
... and so on :)
Some things to note:
Your iteratorFunction must be an async function or promise
Any arrays created before Array.prototype.<yourAsyncFunc> = <yourAsyncFunc> will not have this feature available

To see how that can go wrong, add a console.log at the end of the method.
Things that can go wrong in general:
Arbitrary order.
printFiles can finish running before printing files.
Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach will result in all but the last of these. It'll call each function without awaiting it, meaning it tells all of the functions to start and then finishes without waiting for them to finish.
import fs from 'fs-promise'
async function printFiles () {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))
for(const file of files)
console.log(await file)
}
printFiles()
This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.
This will:
Initiate all of the file reads to happen in parallel.
Preserve the order via the use of map to map file names to promises to wait for.
Wait for each promise in the order defined by the array.
With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.
It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.
The only drawback of this and the original version is that if multiple reads are started at once then it's more difficult to handle errors, on account of having more errors that can happen at a time.
Versions that read one file at a time will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid failing on the first file while having already read most of the other files as well.
Performance is not always predictable. While many systems will be faster with parallel file reads, some will prefer sequential reads. Some are dynamic and may shift under load; optimisations that favour latency do not always yield good throughput under heavy contention.
There is also no error handling in that example. If something requires them to either all be successfully shown or not at all it won't do that.
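A hedged sketch of one way to get that all-or-nothing behaviour (assuming the promise-based fs and getFilePaths from the question): read everything first, and only log if every read succeeded.
async function printFilesAllOrNothing () {
  const files = await getFilePaths()
  try {
    const contents = await Promise.all(files.map(file => fs.readFile(file, 'utf8')))
    for (const content of contents) console.log(content)
  } catch (err) {
    console.error('at least one read failed; nothing was printed', err)
  }
}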
In depth experimentation is recommended with console.log at each stage and fake file read solutions (random delay instead). Although many solutions appear to do the same in simple cases all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
const start = +new Date();
const mock = () => {
return {
fs: {readFile: file => new Promise((resolve, reject) => {
// Instead of this just make three files and try each timing arrangement.
// IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
const time = Math.round(100 + Math.random() * 4900);
console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
setTimeout(() => {
// Bonus material here if random reject instead.
console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
resolve(file);
}, time);
})},
console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
};
};
const printFiles = (({fs, console, getFilePaths}) => {
return async function() {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));
for(const file of files)
console.log(await file);
};
})(mock());
console.log(`Running at ${new Date() - start}`);
await printFiles();
console.log(`Finished running at ${new Date() - start}`);
})();

The OP's original question
Are there any issues with using async/await in a forEach loop? ...
was covered to an extent in #Bergi's selected answer,
which showed how to process in serial and in parallel. However there are other issues noted with parallelism -
Order -- #chharvey notes that -
For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array.
Possibly opening too many files at once -- A comment by Bergi under another answer
It is also not good to open thousands of files at once to read them concurrently. One always has to do an assessment whether a sequential, parallel, or mixed approach is better.
So let's address these issues showing actual code that is brief and concise, and does not use third party libraries. Something easy to cut, paste, and modify.
Reading in parallel (all at once), printing in serial (as early as possible per file).
The easiest improvement is to perform full parallelism as in #Bergi's answer, but making a small change so that each file is printed as soon as possible while preserving order.
async function printFiles2() {
const readProms = (await getFilePaths()).map((file) =>
fs.readFile(file, "utf8")
);
await Promise.all([
Promise.all(readProms), // branch 1
(async () => { // branch 2
for (const p of readProms) console.log(await p);
})(),
]);
}
Above, two separate branches are run concurrently.
branch 1: Reading in parallel, all at once,
branch 2: Reading in serial to force order, but waiting no longer than necessary
That was easy.
Reading in parallel with a concurrency limit, printing in serial (as early as possible per file).
A "concurrency limit" means that no more than N files will ever being read at the same time.
Like a store that only allows in so many customers at a time (at least during COVID).
First a helper function is introduced -
function bootablePromise(kickMe: () => Promise<any>) {
let resolve: (value: unknown) => void = () => {};
const promise = new Promise((res) => { resolve = res; });
const boot = () => { resolve(kickMe()); };
return { promise, boot };
}
The function bootablePromise(kickMe: () => Promise<any>) takes a
function kickMe as an argument to start a task (in our case readFile), but the task is not started immediately.
bootablePromise returns a couple of properties
promise of type Promise
boot of type function ()=>void
promise has two stages in life
Being a promise to start a task
Being a promise to complete a task it has already started.
promise transitions from the first to the second state when boot() is called.
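A tiny standalone sketch of that two-stage behaviour (the file name is just illustrative):
const { promise, boot } = bootablePromise(() => fs.readFile('a.txt', 'utf8'));
promise.then(() => console.log('read finished')); // nothing has started yet
boot();                                           // now the readFile actually begins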
bootablePromise is used in printFiles --
async function printFiles4() {
const files = await getFilePaths();
const boots: (() => void)[] = [];
const set: Set<Promise<{ pidx: number }>> = new Set<Promise<any>>();
const bootableProms = files.map((file,pidx) => {
const { promise, boot } = bootablePromise(() => fs.readFile(file, "utf8"));
boots.push(boot);
set.add(promise.then(() => ({ pidx })));
return promise;
});
const concurLimit = 2;
await Promise.all([
(async () => { // branch 1
let idx = 0;
boots.slice(0, concurLimit).forEach((b) => { b(); idx++; });
while (idx<boots.length) {
const { pidx } = await Promise.race([...set]);
set.delete([...set][pidx]);
boots[idx++]();
}
})(),
(async () => { // branch 2
for (const p of bootableProms) console.log(await p);
})(),
]);
}
As before there are two branches
branch 1: For running and handling concurrency.
branch 2: For printing
The difference now is that no more than concurLimit Promises are allowed to run concurrently.
The important variables are
boots: The array of functions to call to force its corresponding Promise to transition. It is used only in branch 1.
set: holds the Promises in a random-access container so that they can be easily removed once fulfilled. This container is used only in branch 1.
bootableProms: These are the same Promises as initially in set, but it is an array not a set, and the array is never changed. It is used only in branch 2.
Running with a mock fs.readFile that takes times as follows (filename vs. time in ms).
const timeTable = {
"1": 600,
"2": 500,
"3": 400,
"4": 300,
"5": 200,
"6": 100,
};
Test run times such as these are seen, showing that the concurrency limit is working --
[1]0--0.601
[2]0--0.502
[3]0.503--0.904
[4]0.608--0.908
[5]0.905--1.105
[6]0.905--1.005
Available as executable in the typescript playground sandbox

Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
const files = await getFiles();
List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
.fork( console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';
const future = futurize(Task)
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
.fork( console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
export const arrayToTaskList = (list, taskFn) =>
List(list).traverse( Task.of, taskFn )
export const readFiles = files =>
arrayToTaskList( files, readFile )
export const printFiles = files =>
readFiles(files).fork( console.error, console.log)
Then from the parent function
async function main() {
/* awesome code with side-effects before */
printFiles( await getFiles() );
/* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator )
import { curry, flip } from 'ramda'
export const readFile = fs.readFile
|> future
|> curry
|> flip
export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p

As other answers have mentioned, you're probably wanting it to be executed in sequence rather than in parallel. I.e., run for the first file, wait until it's done, then once it's done run for the second file. That's not what will happen.
I think it's important to address why this doesn't happen.
Think about how forEach works. I can't find the source, but I presume it works something like this:
const forEach = (arr, cb) => {
for (let i = 0; i < arr.length; i++) {
cb(arr[i]);
}
};
Now think about what happens when you do something like this:
forEach(files, async function logFile(file) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
Inside forEach's for loop we're calling cb(arr[i]), which ends up being logFile(file). The logFile function has an await inside it, so maybe the for loop will wait for this await before proceeding to i++?
No, it won't. Confusingly, that's not how await works. From the docs:
An await splits execution flow, allowing the caller of the async function to resume execution. After the await defers the continuation of the async function, execution of subsequent statements ensues. If this await is the last expression executed by its function execution continues by returning to the function's caller a pending Promise for completion of the await's function and resuming execution of that caller.
So if you have the following, the numbers won't be logged before "b":
const delay = (ms) => {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
};
const logNumbers = async () => {
console.log(1);
await delay(2000);
console.log(2);
await delay(2000);
console.log(3);
};
const main = () => {
console.log("a");
logNumbers();
console.log("b");
};
main();
Circling back to forEach, forEach is like main and logFile is like logNumbers. main won't stop just because logNumbers does some awaiting, and forEach won't stop just because logFile does some awaiting.
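If you did want the numbers logged before "b", a minimal tweak is to make main async so it can await the promise that logNumbers returns:
const main = async () => {
  console.log("a");
  await logNumbers(); // now 1, 2, 3 are logged before "b"
  console.log("b");
};
main();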

Here is a great example of using async in a forEach loop.
Write your own asyncForEach
async function asyncForEach(array, callback) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index, array)
}
}
You can use it like this
await asyncForEach(array, async function(item,index,array){
//await here
}
)

Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
// since AsyncAF accepts promises or non-promises, there's no need to await here
const files = getFilePaths();
AsyncAF(files).forEach(async file => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');
const printFiles = () => aaf(getFilePaths())
.map(file => fs.readFile(file, 'utf8'))
.forEach(file => aaf.log(file));
printFiles();
async-af

If you'd like to iterate over all elements concurrently:
async function asyncForEach(arr, fn) {
await Promise.all(arr.map(fn));
}
If you'd like to iterate over all elements non-concurrently (e.g. when your mapping function has side effects, or running the mapper over all array elements at once would be too resource-costly):
Option A: Promises
function asyncForEachStrict(arr, fn) {
return new Promise((resolve) => {
arr.reduce(
(promise, cur, idx) => promise
.then(() => fn(cur, idx, arr)),
Promise.resolve(),
).then(() => resolve());
});
}
Option B: async/await
async function asyncForEachStrict(arr, fn) {
for (let idx = 0; idx < arr.length; idx += 1) {
const cur = arr[idx];
await fn(cur, idx, arr);
}
}
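A brief usage sketch that works with either variant (delayedLog here is just an illustrative callback):
const delayedLog = (item) =>
  new Promise((resolve) => setTimeout(() => { console.log(item); resolve(); }, 100));
// logs 1, 2, 3 one at a time, roughly 100 ms apart, then 'done'
asyncForEachStrict([1, 2, 3], delayedLog).then(() => console.log('done'));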

This does not use async/await, which the OP requested, and only works on the back end with Node.js. It may still be helpful for some people, though, because the example given by the OP is reading file contents, and normally you do file reading on the back end.
Fully asynchronous and non-blocking:
const fs = require("fs")
const async = require("async")
const obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"}
const configs = {}
async.forEachOf(obj, (value, key, callback) => {
fs.readFile(__dirname + value, "utf8", (err, data) => {
if (err) return callback(err)
try {
configs[key] = JSON.parse(data);
} catch (e) {
return callback(e)
}
callback()
});
}, err => {
if (err) console.error(err.message)
// configs is now a map of JSON data
doSomethingWith(configs)
})

For TypeScript users, a Promise.all(array.map(iterator)) wrapper with working types
Using Promise.all(array.map(iterator)) has correct types since TypeScript's stdlib already handles the generics.
However, copy-pasting Promise.all(array.map(iterator)) every time you need an async map is obviously suboptimal, and it doesn't convey the intention of the code very well - so most developers would wrap this into an asyncMap() wrapper function. Doing this requires generics to ensure that values set with const value = await asyncMap() have the correct type.
export const asyncMap = async <ArrayItemType, IteratorReturnType>(
array: Array<ArrayItemType>,
iterator: (
value: ArrayItemType,
index?: number
) => Promise<IteratorReturnType>
): Promise<Array<IteratorReturnType>> => {
return Promise.all(array.map(iterator));
};
And a quick test:
it(`runs 3 items in parallel and returns results`, async () => {
const result = await asyncMap([1, 2, 3], async (item: number) => {
await sleep(item * 100);
return `Finished ${item}`;
});
expect(result.length).toEqual(3);
// Each item takes 100, 200 and 300ms
// So restricting this test to 300ms plus some leeway
}, 320);
sleep() is just:
const sleep = async (timeInMs: number): Promise<void> => {
return new Promise((resolve) => setTimeout(resolve, timeInMs));
};

Related

Why is my timer inside for each not working [duplicate]

Are there any issues with using async/await in a forEach loop? I'm trying to loop through an array of files and await on the contents of each file.
import fs from 'fs-promise'
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
})
}
printFiles()
This code does work, but could something go wrong with this? I had someone tell me that you're not supposed to use async/await in a higher-order function like this, so I just wanted to ask if there was any issue with this.
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles function does immediately return after that.
Reading in sequence
If you want to read the files in sequence, you cannot use forEach indeed. Just use a modern for … of loop instead, in which await will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
Reading in parallel
If you want to read the files in parallel, you cannot use forEach indeed. Each of the async callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you'll get with Promise.all:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}
With ES2018, you are able to greatly simplify all of the above answers to:
async function printFiles () {
const files = await getFilePaths()
for await (const contents of files.map(file => fs.readFile(file, 'utf8'))) {
console.log(contents)
}
}
See spec: proposal-async-iteration
Simplified:
for await (const results of array) {
await longRunningTask()
}
console.log('I will wait')
2018-09-10: This answer has been getting a lot of attention recently, please see Axel Rauschmayer's blog post for further information about asynchronous iteration.
Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the Promises are resolved), I use Array.prototype.reduce, starting with a resolved Promise:
async function printFiles () {
const files = await getFilePaths();
await files.reduce(async (promise, file) => {
// This line will wait for the last async function to finish.
// The first iteration uses an already resolved Promise
// so, it will immediately continue.
await promise;
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}, Promise.resolve());
}
The p-iteration module on npm implements the Array iteration methods so they can be used in a very straightforward way with async/await.
An example with your case:
const { forEach } = require('p-iteration');
const fs = require('fs-promise');
(async function printFiles () {
const files = await getFilePaths();
await forEach(files, async (file) => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
})();
Picture worth 1000 words - For Sequential Approach Only
Background : I was in similar situation last night. I used async function as foreach argument. The result was un-predictable. When I did testing for my code 3 times, it ran without issues 2 times and failed 1 time. (something weird)
Finally I got my head around & did some scratch pad testing.
Scenario 1 - How un-sequential it can get with async in foreach
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
myPromiseArray.forEach(async (element, index) => {
let result = await element;
console.log(result);
})
console.log('After For Each Loop')
}
main();
Scenario 2 - Using for - of loop as #Bergi above suggested
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well
for (const element of myPromiseArray) {
let result = await element;
console.log(result)
}
console.log('After For Each Loop')
}
main();
If you are little old school like me, you could simply use the classic for loop, that works too :)
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well too - the classic for loop :)
for (let i = 0; i < myPromiseArray.length; i++) {
const result = await myPromiseArray[i];
console.log(result);
}
console.log('After For Each Loop')
}
main();
I hope this helps someone, good day, cheers!
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
})
The issue is, the promise returned by the iteration function is ignored by forEach(). forEach does not wait to move to the next iteration after each async code execution is completed. All the fs.readFile functions
will be invoked in the same round of the event loop, which means they are started in parallel, not in sequential, and the execution continues immediately after invoking forEach(), without
waiting for all the fs.readFile operations to complete. Since forEach does not wait for each promise to resolve, the loop actually finishes iterating before promises are resolved. You are expecting that after forEach is completed, all the async code is already executed but that is not the case. You may end up trying to access values that are not available yet.
you can test the behaviour with this example code
const array = [1, 2, 3];
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const delayedSquare = (num) => sleep(100).then(() => num * num);
const testForEach = (numbersArray) => {
const store = [];
// this code here treated as sync code
numbersArray.forEach(async (num) => {
const squaredNum = await delayedSquare(num);
// this will console corrent squaredNum value
console.log(squaredNum);
store.push(squaredNum);
});
// you expect that store array is populated but is not
// this will return []
console.log("store",store);
};
testForEach(array);
// Notice, when you test, first "store []" will be logged
// then squaredNum's inside forEach will log
the solution is using the for-of loop.
for (const file of files){
const contents = await fs.readFile(file, 'utf8')
}
Here are some forEachAsync prototypes. Note you'll need to await them:
Array.prototype.forEachAsync = async function (fn) {
for (let t of this) { await fn(t) }
}
Array.prototype.forEachAsyncParallel = async function (fn) {
await Promise.all(this.map(fn));
}
Note while you may include this in your own code, you should not include this in libraries you distribute to others (to avoid polluting their globals).
#Bergi has already gave the answer on how to handle this particular case properly. I'll not duplicate here.
I'd like to address the difference between using forEach and for loop when it comes to async and await
how forEach works
Let's look at how forEach works. According to ECMAScript Specification, MDN provides an implementation which can be used as a polyfill. I copy it and paste here with comments removal.
Array.prototype.forEach = function (callback, thisArg) {
if (this == null) { throw new TypeError('Array.prototype.forEach called on null or undefined'); }
var T, k;
var O = Object(this);
var len = O.length >>> 0;
if (typeof callback !== "function") { throw new TypeError(callback + ' is not a function'); }
if (arguments.length > 1) { T = thisArg; }
k = 0;
while (k < len) {
var kValue;
if (k in O) {
kValue = O[k];
callback.call(T, kValue, k, O); // pay attention to this line
}
k++;
}
};
Let's back to your code, let's extract the callback as a function.
async function callback(file){
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}
So, basically callback returns a promise since it's declared with async. Inside forEach, callback is just called in a normal way, if the callback itself returns a promise, the javascript engine will not wait it to be resolved or rejected. Instead, it puts the promise in a job queue, and continues executing the loop.
How about await fs.readFile(file, 'utf8') inside the callback?
Basically, when your async callback gets the chance to be executed, the js engine will pause until fs.readFile(file, 'utf8') to be resolved or rejected, and resume execution of the async function after fulfillment. So the contents variable store the actual result from fs.readFile, not a promise. So, console.log(contents) logs out the file content not a Promise
Why for ... of works?
when we write a generic for of loop, we gain more control than forEach. Let's refactor printFiles.
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
for (const file of files) {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
// or await callback(file)
}
}
When evaluate for loop, we have await promise inside the async function, the execution will pause until the await promise is settled. So, you can think of that the files are read one by one in a determined order.
Execute sequentially
Sometimes, we really need the the async functions to be executed in a sequential order. For example, I have a few new records stored in an array to be saved to database, and I want them to be saved in sequential order which means first record in array should be saved first, then second, until last one is saved.
Here is an example:
const records = [1, 2, 3, 4];
async function saveRecord(record) {
return new Promise((resolved, rejected) => {
setTimeout(()=> {
resolved(`record ${record} saved`)
}, Math.random() * 500)
});
}
async function forEachSaveRecords(records) {
records.forEach(async (record) => {
const res = await saveRecord(record);
console.log(res);
})
}
async function forofSaveRecords(records) {
for (const record of records) {
const res = await saveRecord(record);
console.log(res);
}
}
(async () => {
console.log("=== for of save records ===")
await forofSaveRecords(records)
console.log("=== forEach save records ===")
await forEachSaveRecords(records)
})()
I use setTimeout to simulate the process of saving a record to database - it's asynchronous and cost a random time. Using forEach, the records are saved in an undetermined order, but using for..of, they are saved sequentially.
This solution is also memory-optimized so you can run it on 10,000's of data items and requests. Some of the other solutions here will crash the server on large data sets.
In TypeScript:
export async function asyncForEach<T>(array: Array<T>, callback: (item: T, index: number) => Promise<void>) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index);
}
}
How to use?
await asyncForEach(receipts, async (eachItem) => {
await ...
})
A simple drop-in solution for replacing a forEach() await loop that is not working is replacing forEach with map and adding Promise.all( to the beginning.
For example:
await y.forEach(async (x) => {
to
await Promise.all(y.map(async (x) => {
An extra ) is needed at the end.
In addition to #Bergi’s answer, I’d like to offer a third alternative. It's very similar to #Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, each which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
In #Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.
it's pretty painless to pop a couple methods in a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:
module.exports = function () {
var self = this;
this.each = async (items, fn) => {
if (items && items.length) {
await Promise.all(
items.map(async (item) => {
await fn(item);
}));
}
};
this.reduce = async (items, fn, initialValue) => {
await self.each(
items, async (item) => {
initialValue = await fn(initialValue, item);
});
return initialValue;
};
};
now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
var myAsync = new MyAsync();
var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
var cleanParams = [];
// FOR EACH EXAMPLE
await myAsync.each(['bork', 'concern', 'heck'],
async (elem) => {
if (elem !== 'heck') {
await doje.update({ $push: { 'noises': elem }});
}
});
var cat = await Cat.findOne({ name: 'Nyan' });
// REDUCE EXAMPLE
var friendsOfNyanCat = await myAsync.reduce(cat.friends,
async (catArray, friendId) => {
var friend = await Friend.findById(friendId);
if (friend.name !== 'Long cat') {
catArray.push(friend.name);
}
}, []);
// Assuming Long Cat was a friend of Nyan Cat...
assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
Bergi's solution works nicely when fs is promise based.
You can use bluebird, fs-extra or fs-promise for this.
However, solution for node's native fs libary is as follows:
const result = await Promise.all(filePaths
.map( async filePath => {
const fileContents = await getAssetFromCache(filePath, async function() {
// 1. Wrap with Promise
// 2. Return the result of the Promise
return await new Promise((res, rej) => {
fs.readFile(filePath, 'utf8', function(err, data) {
if (err) {
rej(err); // reject so the promise doesn't hang on a read error
} else {
res(data);
}
});
});
});
return fileContents;
}));
Note:
The callback-based fs.readFile from require('fs') compulsorily takes a callback function as its third argument, otherwise it throws an error:
TypeError [ERR_INVALID_CALLBACK]: Callback must be a function
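As an aside, on Node versions that ship the built-in promise API (fs.promises), the manual wrapping above can be avoided entirely; a minimal sketch, assuming such a Node version (the readAll name is just for illustration):
const fsp = require('fs').promises // or require('fs/promises') on newer Node versions
async function readAll (filePaths) {
  // fsp.readFile already returns a promise, so no manual wrapping is needed
  return Promise.all(filePaths.map((filePath) => fsp.readFile(filePath, 'utf8')))
}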
It is not good to await an asynchronous call inside a loop, because each loop iteration is then delayed until the entire asynchronous operation completes. That is not very performant, and it forfeits the parallelization benefits of async/await.
A better solution would be to create all promises at once, then get access to the results using Promise.all(). Otherwise, each successive operation will not start until the previous one has completed.
Consequently, the code may be refactored as follows;
const printFiles = async () => {
const files = await getFilePaths();
const results = [];
files.forEach((file) => {
results.push(fs.readFile(file, 'utf8'));
});
const contents = await Promise.all(results);
console.log(contents);
}
One important caveat is: the await + for .. of method and the forEach + async way actually have different effects.
Having await inside a real for loop will make sure all async calls are executed one by one. The forEach + async way fires off all promises at the same time, which is faster but can sometimes be overwhelming (e.g. if you run DB queries or hit web services with volume restrictions and do not want to fire 100,000 calls at a time).
You can also use reduce + promise (less elegant) if you do not use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
lastPromise.then(() =>
fs.readFile(file, 'utf8')
), Promise.resolve()
)
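If you also want to do something with each file's contents inside that chain, a sketch (still without async/await, assuming the same promise-based fs):
files.reduce((lastPromise, file) =>
  lastPromise
    .then(() => fs.readFile(file, 'utf8'))
    .then((contents) => { console.log(contents) }),
  Promise.resolve()
)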
Or you can create a forEachAsync helper, but it basically uses the same for loop underneath.
Array.prototype.forEachAsync = async function(cb){
for(let x of this){
await cb(x);
}
}
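Usage would then look something like this (remember to await the call, since it returns a promise; this assumes the same getFilePaths and promise-based fs as in the question):
async function printFiles () {
  const files = await getFilePaths()
  await files.forEachAsync(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}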
Just adding to the original answer
The parallel reading syntax in the original answer is sometimes confusing and difficult to read; maybe we can write it in a different way:
async function printFiles() {
const files = await getFilePaths();
const fileReadPromises = [];
const readAndLogFile = async filePath => {
const contents = await fs.readFile(filePath, "utf8");
console.log(contents);
return contents;
};
files.forEach(file => {
fileReadPromises.push(readAndLogFile(file));
});
await Promise.all(fileReadPromises);
}
For sequential operation, not only for...of but a normal for loop will also work:
async function printFiles() {
const files = await getFilePaths();
for (let i = 0; i < files.length; i++) {
const file = files[i];
const contents = await fs.readFile(file, "utf8");
console.log(contents);
}
}
Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing them all into an array and resolving it in a promise after all is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))
Like #Bergi's response, but with one difference.
Promise.all rejects as soon as one of the promises gets rejected.
So, use recursion.
const readFilesQueue = async (files, index = 0) => {
const contents = await fs.readFile(files[index], 'utf8')
console.log(contents)
return index < files.length - 1
? readFilesQueue(files, index + 1)
: files
}
const printFiles = async () => {
const files = await getFilePaths();
const printContents = await readFilesQueue(files)
return printContents
}
printFiles()
PS
readFilesQueue is kept outside of printFiles because of the side effect* introduced by console.log; it's better to mock, test, and/or spy, so it's not cool to have a function that returns the content (sidenote).
Therefore, the code can simply be designed as three separate functions that are "pure"** and introduce no side effects, process the entire list, and can easily be modified to handle failed cases.
const files = await getFilesPath()
const printFile = async (file) => {
const content = await fs.readFile(file, 'utf8')
console.log(content)
}
const readFiles = async (files, index = 0) => {
await printFile(files[index])
return index < files.length - 1
? readFiles(files, index + 1)
: files
}
readFiles(files)
Future edit/current state
Node supports top-level await (this doesn't have a plugin yet, won't have one, and can be enabled via harmony flags); it's cool but doesn't solve one problem (strategically, I work only on LTS versions). How do you get the files?
Using composition. Given the code, it feels like this sits inside a module, so there should be a function to do that. If not, you should use an IIFE to wrap the role code into an async function, creating a simple module that does it all for you, or you can go the right way: composition.
// more complex version with IIFE to a single module
(async (files) => readFiles(await files()))(getFilesPath)
Note that the name of the variable changes due to semantics. You pass a functor (a function that can be invoked by another function) and receive a pointer to memory that contains the initial block of logic of the application.
But what if it's not a module and you need to export the logic?
Wrap the functions in an async function.
export const readFilesQueue = async () => {
// ... the code goes here
}
Or change the names of variables, whatever...
* by side effect I mean any collateral effect of the application that can change the state/behaviour or introduce bugs into the application, like IO.
** "pure" is in quotes since the functions are not truly pure, but the code can be converged to a pure version when there's no console output, only data manipulations.
Aside from this, to be pure you'll need to work with monads that handle the side effect, which are error prone and treat that error separately from the application.
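Coming back to the rejection concern at the start of this answer: if the only issue with Promise.all is that it rejects as soon as one promise rejects, a sketch using ES2020's Promise.allSettled (assuming a runtime that supports it) keeps the parallel approach and inspects every outcome instead:
async function printFilesSettled () {
  const files = await getFilesPath()
  const outcomes = await Promise.allSettled(files.map(file => fs.readFile(file, 'utf8')))
  for (const outcome of outcomes) {
    if (outcome.status === 'fulfilled') console.log(outcome.value)
    else console.error(outcome.reason)
  }
}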
You can use Array.prototype.forEach, but async/await is not very compatible with it. This is because the promise returned from an async callback expects to be resolved, but Array.prototype.forEach does not resolve any promises from the execution of its callback. So then, you can use forEach, but you'll have to handle the promise resolution yourself.
Here is a way to read and print each file in series using Array.prototype.forEach
async function printFilesInSeries () {
const files = await getFilePaths()
let promiseChain = Promise.resolve()
files.forEach((file) => {
promiseChain = promiseChain.then(() => {
return fs.readFile(file, 'utf8').then((contents) => { // return, so the chain actually waits for each read
console.log(contents)
})
})
})
await promiseChain
}
Here is a way (still using Array.prototype.forEach) to print the contents of files in parallel
async function printFilesInParallel () {
const files = await getFilePaths()
const promises = []
files.forEach((file) => {
promises.push(
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
)
})
await Promise.all(promises)
}
Today I came across multiple solutions for this, i.e. running async/await functions inside a forEach loop. By building a wrapper around it, we can make this happen.
A more detailed explanation of how the native forEach works internally, why it is not able to make an async function call, and other details on the various methods are provided in the link here.
The multiple ways in which it can be done are as follows:
Method 1 : Using the wrapper.
await (()=>{
return new Promise((resolve,reject)=>{
items.forEach(async (item,index)=>{
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
if(index === items.length-1){
resolve('Done')
}
});
});
})();
Method 2: Using the same as a generic function of Array.prototype
Array.prototype.forEachAsync.js
if(!Array.prototype.forEachAsync) {
Array.prototype.forEachAsync = function (fn){
return new Promise((resolve,reject)=>{
this.forEach(async(item,index,array)=>{
await fn(item,index,array);
if(index === array.length-1){
resolve('done');
}
})
});
};
}
Usage :
require('./Array.prototype.forEachAsync');
let count = 0;
let hello = async (items) => {
// Method 2 - Using Array.prototype.forEachAsync
await items.forEachAsync(async () => {
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
});
console.log("count = " + count);
}
someAPICall = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve("done") // or reject('error')
}, 100);
})
}
hello(['', '', '', '']); // hello([]) - an empty array is also handled by default
Method 3 :
Using Promise.all
await Promise.all(items.map(async (item) => {
await someAPICall();
count++;
}));
console.log("count = " + count);
Method 4 : Traditional for loop or modern for loop
// Method 4 - using for loop directly
// 1. Using the modern for(.. of..) loop
for(const item of items){
await someAPICall();
count++;
}
//2. Using the traditional for loop
for(let i=0;i<items.length;i++){
await someAPICall();
count++;
}
console.log("count = " + count);
Currently the Array.prototype.forEach method doesn't support async operations, but we can create our own poly-fill to meet our needs.
// Example of asyncForEach Array poly-fill for NodeJs
// file: asyncForEach.js
// Define asyncForEach function
async function asyncForEach(iteratorFunction){
let indexer = 0
for(let data of this){
await iteratorFunction(data, indexer)
indexer++
}
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}
And that's it! You now have an async forEach method available on any arrays that are defined after these two operations.
Let's test it...
// Nodejs style
// file: someOtherFile.js
const readline = require('readline')
Array = require('./asyncForEach').Array
const log = console.log
// Create a stream interface
function createReader(options={prompt: '>'}){
return readline.createInterface({
input: process.stdin
,output: process.stdout
,prompt: options.prompt !== undefined ? options.prompt : '>'
})
}
// Create a cli stream reader
async function getUserIn(question, options={prompt:'>'}){
log(question)
let reader = createReader(options)
return new Promise((res)=>{
reader.on('line', (answer)=>{
process.stdout.cursorTo(0, 0)
process.stdout.clearScreenDown()
reader.close()
res(answer)
})
})
}
let questions = [
`What's your name`
,`What's your favorite programming language`
,`What's your favorite async function`
]
let responses = {}
async function getResponses(){
// Notice we have to prepend await before calling the async Array function
// in order for it to function as expected
await questions.asyncForEach(async function(question, index){
let answer = await getUserIn(question)
responses[question] = answer
})
}
async function main(){
await getResponses()
log(responses)
}
main()
// Should prompt user for an answer to each question and then
// log each question and answer as an object to the terminal
We could do the same for some of the other array functions like map...
async function asyncMap(iteratorFunction){
let newMap = []
let indexer = 0
for(let data of this){
newMap[indexer] = await iteratorFunction(data, indexer, this)
indexer++
}
return newMap
}
Array.prototype.asyncMap = asyncMap
... and so on :)
Some things to note:
Your iteratorFunction must be an async function or promise
Any arrays created before Array.prototype.<yourAsyncFunc> = <yourAsyncFunc> will not have this feature available
To see how that can go wrong, add a console.log at the end of the method.
Things that can go wrong in general:
Arbitrary order.
printFiles can finish running before printing files.
Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach will result in all but the last. It calls each function without awaiting it, meaning it tells all of the functions to start and then finishes without waiting for them to finish.
import fs from 'fs-promise'
async function printFiles () {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))
for(const file of files)
console.log(await file)
}
printFiles()
This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.
This will:
Initiate all of the file reads to happen in parallel.
Preserve the order via the use of map to map file names to promises to wait for.
Wait for each promise in the order defined by the array.
With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.
It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.
The only drawback of this and the original version is that if multiple reads are started at once, it's more difficult to handle errors, on account of having more errors that can happen at a time.
With versions that read one file at a time, they will then stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid failing on the first file while having already read most of the other files as well.
Performance is not always predictable. While many systems will be faster with parallel file reads, some will prefer sequential. Some are dynamic and may shift under load; optimisations that reduce latency do not always yield good throughput under heavy contention.
There is also no error handling in that example. If something requires them to either all be successfully shown or not at all it won't do that.
In depth experimentation is recommended with console.log at each stage and fake file read solutions (random delay instead). Although many solutions appear to do the same in simple cases all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
const start = +new Date();
const mock = () => {
return {
fs: {readFile: file => new Promise((resolve, reject) => {
// Instead of this just make three files and try each timing arrangement.
// IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
const time = Math.round(100 + Math.random() * 4900);
console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
setTimeout(() => {
// Bonus material here if random reject instead.
console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
resolve(file);
}, time);
})},
console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
};
};
const printFiles = (({fs, console, getFilePaths}) => {
return async function() {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));
for(const file of files)
console.log(await file);
};
})(mock());
console.log(`Running at ${new Date() - start}`);
await printFiles();
console.log(`Finished running at ${new Date() - start}`);
})();
The OP's original question
Are there any issues with using async/await in a forEach loop? ...
was covered to an extent in #Bergi's selected answer,
which showed how to process in serial and in parallel. However there are other issues noted with parallelism -
Order -- #chharvey notes that -
For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array.
Possibly opening too many files at once -- A comment by Bergi under another answer
It is also not good to open thousands of files at once to read them concurrently. One always has to do an assessment whether a sequential, parallel, or mixed approach is better.
So let's address these issues showing actual code that is brief and concise, and does not use third party libraries. Something easy to cut, paste, and modify.
Reading in parallel (all at once), printing in serial (as early as possible per file).
The easiest improvement is to perform full parallelism as in #Bergi's answer, but making a small change so that each file is printed as soon as possible while preserving order.
async function printFiles2() {
const readProms = (await getFilePaths()).map((file) =>
fs.readFile(file, "utf8")
);
await Promise.all([
Promise.all(readProms), // branch 1
(async () => { // branch 2
for (const p of readProms) console.log(await p);
})(),
]);
}
Above, two separate branches are run concurrently.
branch 1: Reading in parallel, all at once,
branch 2: Reading in serial to force order, but waiting no longer than necessary
That was easy.
Reading in parallel with a concurrency limit, printing in serial (as early as possible per file).
A "concurrency limit" means that no more than N files will ever being read at the same time.
Like a store that only allows in so many customers at a time (at least during COVID).
First a helper function is introduced -
function bootablePromise(kickMe: () => Promise<any>) {
let resolve: (value: unknown) => void = () => {};
const promise = new Promise((res) => { resolve = res; });
const boot = () => { resolve(kickMe()); };
return { promise, boot };
}
The function bootablePromise(kickMe: () => Promise<any>) takes a function kickMe as an argument to start a task (in our case readFile), but the task is not started immediately.
bootablePromise returns a couple of properties
promise of type Promise
boot of type function ()=>void
promise has two stages in life
Being a promise to start a task
Being a promise to complete a task it has already started.
promise transitions from the first to the second state when boot() is called.
bootablePromise is used in printFiles --
async function printFiles4() {
const files = await getFilePaths();
const boots: (() => void)[] = [];
const set: Set<Promise<{ pidx: number }>> = new Set<Promise<any>>();
const bootableProms = files.map((file,pidx) => {
const { promise, boot } = bootablePromise(() => fs.readFile(file, "utf8"));
boots.push(boot);
set.add(promise.then(() => ({ pidx })));
return promise;
});
const concurLimit = 2;
await Promise.all([
(async () => { // branch 1
let idx = 0;
boots.slice(0, concurLimit).forEach((b) => { b(); idx++; });
while (idx<boots.length) {
const { pidx } = await Promise.race([...set]);
set.delete([...set][pidx]);
boots[idx++]();
}
})(),
(async () => { // branch 2
for (const p of bootableProms) console.log(await p);
})(),
]);
}
As before there are two branches
branch 1: For running and handling concurrency.
branch 2: For printing
The difference now is the no more than concurLimit Promises are allowed to run concurrently.
The important variables are
boots: The array of functions to call to force its corresponding Promise to transition. It is used only in branch 1.
set: There are Promises in a random access container so that they can be easily removed once fulfilled. This container is used only in branch 1.
bootableProms: These are the same Promises as initially in set, but it is an array not a set, and the array is never changed. It is used only in branch 2.
Running with a mock fs.readFile that takes times as follows (filename vs. time in ms).
const timeTable = {
"1": 600,
"2": 500,
"3": 400,
"4": 300,
"5": 200,
"6": 100,
};
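The mock readFile itself is not shown in the answer; a sketch of what it might look like, built only from the timeTable above (the fake content string is an assumption):
const fs = {
  readFile: (file, _encoding) =>
    new Promise((resolve) => {
      // Resolve with a fake content string after the delay listed in timeTable.
      setTimeout(() => resolve(`contents of ${file}`), timeTable[file])
    })
}
const getFilePaths = async () => Object.keys(timeTable)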
Test run times such as these are seen, showing that the concurrency is working --
[1]0--0.601
[2]0--0.502
[3]0.503--0.904
[4]0.608--0.908
[5]0.905--1.105
[6]0.905--1.005
Available as executable in the typescript playground sandbox
Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
const files = await getFiles();
List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
.fork( console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';
const future = futurize(Task)
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
.fork( console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
export const arrayToTaskList = list => taskFn =>
List(list).traverse( Task.of, taskFn )
export const readFiles = files =>
arrayToTaskList( files )( readFile )
export const printFiles = files =>
readFiles(files).fork( console.error, console.log)
Then from the parent function
async function main() {
/* awesome code with side-effects before */
printFiles( await getFiles() );
/* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator )
import { curry, flip } from 'ramda'
export const readFile = fs.readFile
|> future
|> curry
|> flip
export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p
As other answers have mentioned, you're probably wanting it to be executed in sequence rather than in parallel. I.e. run for the first file, wait until it's done, then once it's done run for the second file. That's not what will happen.
I think it's important to address why this doesn't happen.
Think about how forEach works. I can't find the source, but I presume it works something like this:
const forEach = (arr, cb) => {
for (let i = 0; i < arr.length; i++) {
cb(arr[i]);
}
};
Now think about what happens when you do something like this:
forEach(files, async function logFile(file) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
Inside forEach's for loop we're calling cb(arr[i]), which ends up being logFile(file). The logFile function has an await inside it, so maybe the for loop will wait for this await before proceeding to i++?
No, it won't. Confusingly, that's not how await works. From the docs:
An await splits execution flow, allowing the caller of the async function to resume execution. After the await defers the continuation of the async function, execution of subsequent statements ensues. If this await is the last expression executed by its function, execution continues by returning to the function's caller a pending Promise for completion of the await's function and resuming execution of that caller.
So if you have the following, 2 and 3 won't be logged before "b" (1 will be, since it runs before the first await):
const delay = (ms) => {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
};
const logNumbers = async () => {
console.log(1);
await delay(2000);
console.log(2);
await delay(2000);
console.log(3);
};
const main = () => {
console.log("a");
logNumbers();
console.log("b");
};
main();
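For clarity, the output order of that snippet is roughly:
// a
// 1   <- logged synchronously, before the first await suspends logNumbers
// b
// 2   <- about 2 seconds later
// 3   <- about 4 seconds later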
Circling back to forEach, forEach is like main and logFile is like logNumbers. main won't stop just because logNumbers does some awaiting, and forEach won't stop just because logFile does some awaiting.
Here is a great example for using async in forEach loop.
Write your own asyncForEach
async function asyncForEach(array, callback) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index, array)
}
}
You can use it like this
await asyncForEach(array, async function(item,index,array){
//await here
}
)
Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
// since AsyncAF accepts promises or non-promises, there's no need to await here
const files = getFilePaths();
AsyncAF(files).forEach(async file => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');
const printFiles = () => aaf(getFilePaths())
.map(file => fs.readFile(file, 'utf8'))
.forEach(file => aaf.log(file));
printFiles();
async-af
If you'd like to iterate over all elements concurrently:
async function asyncForEach(arr, fn) {
await Promise.all(arr.map(fn));
}
If you'd like to iterate over all elements non-concurrently (e.g. when your mapping function has side effects, or running the mapper over all array elements at once would be too resource-costly):
Option A: Promises
function asyncForEachStrict(arr, fn) {
return new Promise((resolve) => {
arr.reduce(
(promise, cur, idx) => promise
.then(() => fn(cur, idx, arr)),
Promise.resolve(),
).then(() => resolve());
});
}
Option B: async/await
async function asyncForEachStrict(arr, fn) {
for (let idx = 0; idx < arr.length; idx += 1) {
const cur = arr[idx];
await fn(cur, idx, arr);
}
}
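Either variant can then be applied to the OP's example along these lines (a sketch, assuming the same getFilePaths and promise-based fs as in the question):
async function printFiles () {
  const files = await getFilePaths()
  // Concurrent: all reads start at once
  await asyncForEach(files, async (file) => {
    console.log(await fs.readFile(file, 'utf8'))
  })
  // Or non-concurrent: one read at a time, in order
  await asyncForEachStrict(files, async (file) => {
    console.log(await fs.readFile(file, 'utf8'))
  })
}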
This does not use async/await as the OP requested and only works if you are in the back-end with NodeJS. Although it still may be helpful for some people, because the example given by OP is to read file contents, and normally you do file reading in the backend.
Fully asynchronous and non-blocking:
const fs = require("fs")
const async = require("async")
const obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"}
const configs = {}
async.forEachOf(obj, (value, key, callback) => {
fs.readFile(__dirname + value, "utf8", (err, data) => {
if (err) return callback(err)
try {
configs[key] = JSON.parse(data);
} catch (e) {
return callback(e)
}
callback()
});
}, err => {
if (err) console.error(err.message)
// configs is now a map of JSON data
doSomethingWith(configs)
})
For TypeScript users, a Promise.all(array.map(iterator)) wrapper with working types
Using Promise.all(array.map(iterator)) has correct types since the TypeScript's stdlib support already handles generics.
However copy pasting Promise.all(array.map(iterator)) every time you need an async map is obviously suboptimal, and Promise.all(array.map(iterator)) doesn't convey the intention of the code very well - so most developers would wrap this into an asyncMap() wrapper function. However doing this requires use of generics to ensure that values set with const value = await asyncMap() have the correct type.
export const asyncMap = async <ArrayItemType, IteratorReturnType>(
array: Array<ArrayItemType>,
iterator: (
value: ArrayItemType,
index?: number
) => Promise<IteratorReturnType>
): Promise<Array<IteratorReturnType>> => {
return Promise.all(array.map(iterator));
};
And a quick test:
it(`runs 3 items in parallel and returns results`, async () => {
const result = await asyncMap([1, 2, 3], async (item: number) => {
await sleep(item * 100);
return `Finished ${item}`;
});
expect(result.length).toEqual(3);
// Each item takes 100, 200 and 300ms
// So restricting this test to 300ms plus some leeway
}, 320);
sleep() is just:
const sleep = async (timeInMs: number): Promise<void> => {
return new Promise((resolve) => setTimeout(resolve, timeInMs));
};

In foreach maping ids [duplicate]

Are there any issues with using async/await in a forEach loop? I'm trying to loop through an array of files and await on the contents of each file.
import fs from 'fs-promise'
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
})
}
printFiles()
This code does work, but could something go wrong with this? I had someone tell me that you're not supposed to use async/await in a higher-order function like this, so I just wanted to ask if there was any issue with this.
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles function does immediately return after that.
Reading in sequence
If you want to read the files in sequence, you cannot use forEach indeed. Just use a modern for … of loop instead, in which await will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
Reading in parallel
If you want to read the files in parallel, you cannot use forEach indeed. Each of the async callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you'll get with Promise.all:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}
With ES2018, you are able to greatly simplify all of the above answers to:
async function printFiles () {
const files = await getFilePaths()
for await (const contents of files.map(file => fs.readFile(file, 'utf8'))) {
console.log(contents)
}
}
See spec: proposal-async-iteration
Simplified:
for await (const results of array) {
await longRunningTask()
}
console.log('I will wait')
2018-09-10: This answer has been getting a lot of attention recently, please see Axel Rauschmayer's blog post for further information about asynchronous iteration.
Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the Promises are resolved), I use Array.prototype.reduce, starting with a resolved Promise:
async function printFiles () {
const files = await getFilePaths();
await files.reduce(async (promise, file) => {
// This line will wait for the last async function to finish.
// The first iteration uses an already resolved Promise
// so, it will immediately continue.
await promise;
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}, Promise.resolve());
}
The p-iteration module on npm implements the Array iteration methods so they can be used in a very straightforward way with async/await.
An example with your case:
const { forEach } = require('p-iteration');
const fs = require('fs-promise');
(async function printFiles () {
const files = await getFilePaths();
await forEach(files, async (file) => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
})();
Picture worth 1000 words - For Sequential Approach Only
Background : I was in similar situation last night. I used async function as foreach argument. The result was un-predictable. When I did testing for my code 3 times, it ran without issues 2 times and failed 1 time. (something weird)
Finally I got my head around & did some scratch pad testing.
Scenario 1 - How un-sequential it can get with async in foreach
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
myPromiseArray.forEach(async (element, index) => {
let result = await element;
console.log(result);
})
console.log('After For Each Loop')
}
main();
Scenario 2 - Using for - of loop as #Bergi above suggested
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well
for (const element of myPromiseArray) {
let result = await element;
console.log(result)
}
console.log('After For Each Loop')
}
main();
If you are little old school like me, you could simply use the classic for loop, that works too :)
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well too - the classic for loop :)
for (let i = 0; i < myPromiseArray.length; i++) {
const result = await myPromiseArray[i];
console.log(result);
}
console.log('After For Each Loop')
}
main();
I hope this helps someone, good day, cheers!
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
})
The issue is, the promise returned by the iteration function is ignored by forEach(). forEach does not wait to move to the next iteration after each async code execution is completed. All the fs.readFile functions
will be invoked in the same round of the event loop, which means they are started in parallel, not in sequential, and the execution continues immediately after invoking forEach(), without
waiting for all the fs.readFile operations to complete. Since forEach does not wait for each promise to resolve, the loop actually finishes iterating before promises are resolved. You are expecting that after forEach is completed, all the async code is already executed but that is not the case. You may end up trying to access values that are not available yet.
you can test the behaviour with this example code
const array = [1, 2, 3];
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const delayedSquare = (num) => sleep(100).then(() => num * num);
const testForEach = (numbersArray) => {
const store = [];
// this code here treated as sync code
numbersArray.forEach(async (num) => {
const squaredNum = await delayedSquare(num);
// this will console corrent squaredNum value
console.log(squaredNum);
store.push(squaredNum);
});
// you expect that store array is populated but is not
// this will return []
console.log("store",store);
};
testForEach(array);
// Notice, when you test, first "store []" will be logged
// then squaredNum's inside forEach will log
the solution is using the for-of loop.
for (const file of files){
const contents = await fs.readFile(file, 'utf8')
}
Here are some forEachAsync prototypes. Note you'll need to await them:
Array.prototype.forEachAsync = async function (fn) {
for (let t of this) { await fn(t) }
}
Array.prototype.forEachAsyncParallel = async function (fn) {
await Promise.all(this.map(fn));
}
Note while you may include this in your own code, you should not include this in libraries you distribute to others (to avoid polluting their globals).
#Bergi has already gave the answer on how to handle this particular case properly. I'll not duplicate here.
I'd like to address the difference between using forEach and for loop when it comes to async and await
how forEach works
Let's look at how forEach works. According to ECMAScript Specification, MDN provides an implementation which can be used as a polyfill. I copy it and paste here with comments removal.
Array.prototype.forEach = function (callback, thisArg) {
if (this == null) { throw new TypeError('Array.prototype.forEach called on null or undefined'); }
var T, k;
var O = Object(this);
var len = O.length >>> 0;
if (typeof callback !== "function") { throw new TypeError(callback + ' is not a function'); }
if (arguments.length > 1) { T = thisArg; }
k = 0;
while (k < len) {
var kValue;
if (k in O) {
kValue = O[k];
callback.call(T, kValue, k, O); // pay attention to this line
}
k++;
}
};
Let's back to your code, let's extract the callback as a function.
async function callback(file){
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}
So, basically callback returns a promise since it's declared with async. Inside forEach, callback is just called in a normal way, if the callback itself returns a promise, the javascript engine will not wait it to be resolved or rejected. Instead, it puts the promise in a job queue, and continues executing the loop.
How about await fs.readFile(file, 'utf8') inside the callback?
Basically, when your async callback gets the chance to be executed, the js engine will pause until fs.readFile(file, 'utf8') to be resolved or rejected, and resume execution of the async function after fulfillment. So the contents variable store the actual result from fs.readFile, not a promise. So, console.log(contents) logs out the file content not a Promise
Why for ... of works?
when we write a generic for of loop, we gain more control than forEach. Let's refactor printFiles.
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
for (const file of files) {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
// or await callback(file)
}
}
When evaluate for loop, we have await promise inside the async function, the execution will pause until the await promise is settled. So, you can think of that the files are read one by one in a determined order.
Execute sequentially
Sometimes, we really need the the async functions to be executed in a sequential order. For example, I have a few new records stored in an array to be saved to database, and I want them to be saved in sequential order which means first record in array should be saved first, then second, until last one is saved.
Here is an example:
const records = [1, 2, 3, 4];
async function saveRecord(record) {
return new Promise((resolved, rejected) => {
setTimeout(()=> {
resolved(`record ${record} saved`)
}, Math.random() * 500)
});
}
async function forEachSaveRecords(records) {
records.forEach(async (record) => {
const res = await saveRecord(record);
console.log(res);
})
}
async function forofSaveRecords(records) {
for (const record of records) {
const res = await saveRecord(record);
console.log(res);
}
}
(async () => {
console.log("=== for of save records ===")
await forofSaveRecords(records)
console.log("=== forEach save records ===")
await forEachSaveRecords(records)
})()
I use setTimeout to simulate the process of saving a record to database - it's asynchronous and cost a random time. Using forEach, the records are saved in an undetermined order, but using for..of, they are saved sequentially.
This solution is also memory-optimized so you can run it on 10,000's of data items and requests. Some of the other solutions here will crash the server on large data sets.
In TypeScript:
export async function asyncForEach<T>(array: Array<T>, callback: (item: T, index: number) => Promise<void>) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index);
}
}
How to use?
await asyncForEach(receipts, async (eachItem) => {
await ...
})
A simple drop-in solution for replacing a forEach() await loop that is not working is replacing forEach with map and adding Promise.all( to the beginning.
For example:
await y.forEach(async (x) => {
to
await Promise.all(y.map(async (x) => {
An extra ) is needed at the end.
In addition to #Bergi’s answer, I’d like to offer a third alternative. It's very similar to #Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, each which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
In #Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.
it's pretty painless to pop a couple methods in a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:
module.exports = function () {
var self = this;
this.each = async (items, fn) => {
if (items && items.length) {
await Promise.all(
items.map(async (item) => {
await fn(item);
}));
}
};
this.reduce = async (items, fn, initialValue) => {
await self.each(
items, async (item) => {
initialValue = await fn(initialValue, item);
});
return initialValue;
};
};
now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
var myAsync = new MyAsync();
var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
var cleanParams = [];
// FOR EACH EXAMPLE
await myAsync.each(['bork', 'concern', 'heck'],
async (elem) => {
if (elem !== 'heck') {
await doje.update({ $push: { 'noises': elem }});
}
});
var cat = await Cat.findOne({ name: 'Nyan' });
// REDUCE EXAMPLE
var friendsOfNyanCat = await myAsync.reduce(cat.friends,
async (catArray, friendId) => {
var friend = await Friend.findById(friendId);
if (friend.name !== 'Long cat') {
catArray.push(friend.name);
}
}, []);
// Assuming Long Cat was a friend of Nyan Cat...
assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
Bergi's solution works nicely when fs is promise based.
You can use bluebird, fs-extra or fs-promise for this.
However, solution for node's native fs libary is as follows:
const result = await Promise.all(filePaths
.map( async filePath => {
const fileContents = await getAssetFromCache(filePath, async function() {
// 1. Wrap with Promise
// 2. Return the result of the Promise
return await new Promise((res, rej) => {
fs.readFile(filePath, 'utf8', function(err, data) {
if (data) {
res(data);
}
});
});
});
return fileContents;
}));
Note:
require('fs') compulsorily takes function as 3rd arguments, otherwise throws error:
TypeError [ERR_INVALID_CALLBACK]: Callback must be a function
It is not good to call an asynchronous method from a loop. This is because each loop iteration will be delayed until the entire asynchronous operation completes. That is not very performant. It also averts the advantages of parallelization benefits of async/await.
A better solution would be to create all promises at once, then get access to the results using Promise.all(). Otherwise, each successive operation will not start until the previous one has completed.
Consequently, the code may be refactored as follows;
const printFiles = async () => {
const files = await getFilePaths();
const results = [];
files.forEach((file) => {
results.push(fs.readFile(file, 'utf8'));
});
const contents = await Promise.all(results);
console.log(contents);
}
One important caveat is: The await + for .. of method and the forEach + async way actually have different effect.
Having await inside a real for loop will make sure all async calls are executed one by one. And the forEach + async way will fire off all promises at the same time, which is faster but sometimes overwhelmed(if you do some DB query or visit some web services with volume restrictions and do not want to fire 100,000 calls at a time).
You can also use reduce + promise(less elegant) if you do not use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
lastPromise.then(() =>
fs.readFile(file, 'utf8')
), Promise.resolve()
)
Or you can create a forEachAsync to help but basically use the same for loop underlying.
Array.prototype.forEachAsync = async function(cb){
for(let x of this){
await cb(x);
}
}
Just adding to the original answer
The parallel reading syntax in the original answer is sometimes confusing and difficult to read, maybe we can write it in a different approach
async function printFiles() {
const files = await getFilePaths();
const fileReadPromises = [];
const readAndLogFile = async filePath => {
const contents = await fs.readFile(file, "utf8");
console.log(contents);
return contents;
};
files.forEach(file => {
fileReadPromises.push(readAndLogFile(file));
});
await Promise.all(fileReadPromises);
}
For sequential operation, not just for...of, normal for loop will also work
async function printFiles() {
const files = await getFilePaths();
for (let i = 0; i < files.length; i++) {
const file = files[i];
const contents = await fs.readFile(file, "utf8");
console.log(contents);
}
}
Both the solutions above work, however, Antonio's does the job with less code, here is how it helped me resolve data from my database, from several different child refs and then pushing them all into an array and resolving it in a promise after all is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))
Like #Bergi's response, but with one difference.
Promise.all rejects all promises if one gets rejected.
So, use a recursion.
const readFilesQueue = async (files, index = 0) {
const contents = await fs.readFile(files[index], 'utf8')
console.log(contents)
return files.length <= index
? readFilesQueue(files, ++index)
: files
}
const printFiles async = () => {
const files = await getFilePaths();
const printContents = await readFilesQueue(files)
return printContents
}
printFiles()
PS
readFilesQueue is outside of printFiles cause the side effect* introduced by console.log, it's better to mock, test, and or spy so, it's not cool to have a function that returns the content(sidenote).
Therefore, the code can simply be designed by that: three separated functions that are "pure"** and introduce no side effects, process the entire list and can easily be modified to handle failed cases.
const files = await getFilesPath()
const printFile = async (file) => {
const content = await fs.readFile(file, 'utf8')
console.log(content)
}
const readFiles = async = (files, index = 0) => {
await printFile(files[index])
return files.lengh <= index
? readFiles(files, ++index)
: files
}
readFiles(files)
Future edit/current state
Node supports top-level await (this doesn't have a plugin yet, won't have and can be enabled via harmony flags), it's cool but doesn't solve one problem (strategically I work only on LTS versions). How to get the files?
Using composition. Given the code, causes to me a sensation that this is inside a module, so, should have a function to do it. If not, you should use an IIFE to wrap the role code into an async function creating simple module that's do all for you, or you can go with the right way, there is, composition.
// more complex version with IIFE to a single module
(async (files) => readFiles(await files())(getFilesPath)
Note that the name of variable changes due to semantics. You pass a functor (a function that can be invoked by another function) and recieves a pointer on memory that contains the initial block of logic of the application.
But, if's not a module and you need to export the logic?
Wrap the functions in a async function.
export const readFilesQueue = async () => {
// ... to code goes here
}
Or change the names of variables, whatever...
* by side effect menans any colacteral effect of application that can change the statate/behaviour or introuce bugs in the application, like IO.
** by "pure", it's in apostrophe since the functions it's not pure and the code can be converged to a pure version, when there's no console output, only data manipulations.
Aside this, to be pure, you'll need to work with monads that handles the side effect, that are error prone, and treats that error separately of the application.
You can use Array.prototype.forEach, but async/await is not so compatible. This is because the promise returned from an async callback expects to be resolved, but Array.prototype.forEach does not resolve any promises from the execution of its callback. So then, you can use forEach, but you'll have to handle the promise resolution yourself.
Here is a way to read and print each file in series using Array.prototype.forEach
async function printFilesInSeries () {
const files = await getFilePaths()
let promiseChain = Promise.resolve()
files.forEach((file) => {
promiseChain = promiseChain.then(() => {
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
})
})
await promiseChain
}
Here is a way (still using Array.prototype.forEach) to print the contents of files in parallel
async function printFilesInParallel () {
const files = await getFilePaths()
const promises = []
files.forEach((file) => {
promises.push(
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
)
})
await Promise.all(promises)
}
Today I came across multiple solutions for this. Running the async await functions in the forEach Loop. By building the wrapper around we can make this happen.
More detailed explanation on how it works internally, for the native forEach and why it is not able to make a async function call and other details on the various methods are provided in link here
The multiple ways through which it can be done and they are as follows,
Method 1 : Using the wrapper.
await (()=>{
return new Promise((resolve,reject)=>{
items.forEach(async (item,index)=>{
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
if(index === items.length-1){
resolve('Done')
}
});
});
})();
Method 2: Using the same as a generic function of Array.prototype
Array.prototype.forEachAsync.js
if(!Array.prototype.forEachAsync) {
Array.prototype.forEachAsync = function (fn){
return new Promise((resolve,reject)=>{
this.forEach(async(item,index,array)=>{
await fn(item,index,array);
if(index === array.length-1){
resolve('done');
}
})
});
};
}
Usage :
require('./Array.prototype.forEachAsync');
let count = 0;
let hello = async (items) => {
// Method 1 - Using the Array.prototype.forEach
await items.forEachAsync(async () => {
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
});
console.log("count = " + count);
}
someAPICall = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve("done") // or reject('error')
}, 100);
})
}
hello(['', '', '', '']); // hello([]) empty array is also be handled by default
Method 3 :
Using Promise.all
await Promise.all(items.map(async (item) => {
await someAPICall();
count++;
}));
console.log("count = " + count);
Method 4 : Traditional for loop or modern for loop
// Method 4 - using for loop directly
// 1. Using the modern for(.. in..) loop
for(item in items){
await someAPICall();
count++;
}
//2. Using the traditional for loop
for(let i=0;i<items.length;i++){
await someAPICall();
count++;
}
console.log("count = " + count);
Currently the Array.forEach prototype property doesn't support async operations, but we can create our own poly-fill to meet our needs.
// Example of asyncForEach Array poly-fill for NodeJs
// file: asyncForEach.js
// Define asynForEach function
async function asyncForEach(iteratorFunction){
let indexer = 0
for(let data of this){
await iteratorFunction(data, indexer)
indexer++
}
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}
And that's it! You now have an async forEach method available on any arrays that are defined after these to operations.
Let's test it...
// Nodejs style
// file: someOtherFile.js
const readline = require('readline')
Array = require('./asyncForEach').Array
const log = console.log
// Create a stream interface
function createReader(options={prompt: '>'}){
return readline.createInterface({
input: process.stdin
,output: process.stdout
,prompt: options.prompt !== undefined ? options.prompt : '>'
})
}
// Create a cli stream reader
async function getUserIn(question, options={prompt:'>'}){
log(question)
let reader = createReader(options)
return new Promise((res)=>{
reader.on('line', (answer)=>{
process.stdout.cursorTo(0, 0)
process.stdout.clearScreenDown()
reader.close()
res(answer)
})
})
}
let questions = [
`What's your name`
,`What's your favorite programming language`
,`What's your favorite async function`
]
let responses = {}
async function getResponses(){
// Notice we have to prepend await before calling the async Array function
// in order for it to function as expected
await questions.asyncForEach(async function(question, index){
let answer = await getUserIn(question)
responses[question] = answer
})
}
async function main(){
await getResponses()
log(responses)
}
main()
// Should prompt user for an answer to each question and then
// log each question and answer as an object to the terminal
We could do the same for some of the other array functions like map...
async function asyncMap(iteratorFunction){
let newMap = []
let indexer = 0
for(let data of this){
newMap[indexer] = await iteratorFunction(data, indexer, this)
indexer++
}
return newMap
}
Array.prototype.asyncMap = asyncMap
... and so on :)
Some things to note:
Your iteratorFunction must be an async function or promise
Any arrays created before Array.prototype.<yourAsyncFunc> = <yourAsyncFunc> will not have this feature available
To see how that can go wrong, print console.log at the end of the method.
Things that can go wrong in general:
Arbitrary order.
printFiles can finish running before printing files.
Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach will result in all but the last. It'll call each function without awaiting for the function meaning it tells all of the functions to start then finishes without waiting for the functions to finish.
import fs from 'fs-promise'
async function printFiles () {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))
for(const file of files)
console.log(await file)
}
printFiles()
This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.
This will:
Initiate all of the file reads to happen in parallel.
Preserve the order via the use of map to map file names to promises to wait for.
Wait for each promise in the order defined by the array.
With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.
It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.
The only draw back of this and the original version is that if multiple reads are started at once then it's more difficult to handle errors on account of having more errors that can happen at a time.
With versions that read a file at a time then then will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid it failing on the first file but reading most of the other files already as well.
Performance is not always predictable. While many systems will be faster with parallel file reads some will prefer sequential. Some are dynamic and may shift under load, optimisations that offer latency do not always yield good throughput under heavy contention.
There is also no error handling in that example. If something requires them to either all be successfully shown or not at all it won't do that.
In depth experimentation is recommended with console.log at each stage and fake file read solutions (random delay instead). Although many solutions appear to do the same in simple cases all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
const start = +new Date();
const mock = () => {
return {
fs: {readFile: file => new Promise((resolve, reject) => {
// Instead of this just make three files and try each timing arrangement.
// IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
const time = Math.round(100 + Math.random() * 4900);
console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
setTimeout(() => {
// Bonus material here if random reject instead.
console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
resolve(file);
}, time);
})},
console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
};
};
const printFiles = (({fs, console, getFilePaths}) => {
return async function() {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));
for(const file of files)
console.log(await file);
};
})(mock());
console.log(`Running at ${new Date() - start}`);
await printFiles();
console.log(`Finished running at ${new Date() - start}`);
})();
The OP's original question
Are there any issues with using async/await in a forEach loop? ...
was covered to an extent in #Bergi's selected answer,
which showed how to process in serial and in parallel. However there are other issues noted with parallelism -
Order -- #chharvey notes that -
For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array.
Possibly opening too many files at once -- A comment by Bergi under another answer
It is also not good to open thousands of files at once to read them concurrently. One always has to do an assessment whether a sequential, parallel, or mixed approach is better.
So let's address these issues showing actual code that is brief and concise, and does not use third party libraries. Something easy to cut, paste, and modify.
Reading in parallel (all at once), printing in serial (as early as possible per file).
The easiest improvement is to perform full parallelism as in #Bergi's answer, but making a small change so that each file is printed as soon as possible while preserving order.
async function printFiles2() {
const readProms = (await getFilePaths()).map((file) =>
fs.readFile(file, "utf8")
);
await Promise.all([
Promise.all(readProms), // branch 1
(async () => { // branch 2
for (const p of readProms) console.log(await p);
})(),
]);
}
Above, two separate branches are run concurrently.
branch 1: Reading in parallel, all at once,
branch 2: Reading in serial to force order, but waiting no longer than necessary
That was easy.
Reading in parallel with a concurrency limit, printing in serial (as early as possible per file).
A "concurrency limit" means that no more than N files will ever being read at the same time.
Like a store that only allows in so many customers at a time (at least during COVID).
First a helper function is introduced -
function bootablePromise(kickMe: () => Promise<any>) {
let resolve: (value: unknown) => void = () => {};
const promise = new Promise((res) => { resolve = res; });
const boot = () => { resolve(kickMe()); };
return { promise, boot };
}
The function bootablePromise(kickMe:() => Promise<any>) takes a function kickMe as an argument to start a task (in our case readFile), but the task is not started immediately.
bootablePromise returns a couple of properties
promise of type Promise
boot of type function ()=>void
promise has two stages in life
Being a promise to start a task
Being a promise to complete a task it has already started.
promise transitions from the first to the second state when boot() is called.
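To make those two stages concrete, here is a minimal usage sketch (the file name is only a placeholder):
const { promise, boot } = bootablePromise(() => fs.readFile('a.txt', 'utf8'))
// Nothing has been read yet; `promise` is only a promise to start the task.
promise.then(contents => console.log('finished reading', contents.length, 'characters'))
// Calling boot() starts the readFile and ties its result to `promise`.
boot()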
bootablePromise is used in printFiles --
async function printFiles4() {
const files = await getFilePaths();
const boots: (() => void)[] = [];
const set: Set<Promise<{ pidx: number }>> = new Set<Promise<any>>();
const bootableProms = files.map((file,pidx) => {
const { promise, boot } = bootablePromise(() => fs.readFile(file, "utf8"));
boots.push(boot);
set.add(promise.then(() => ({ pidx })));
return promise;
});
const concurLimit = 2;
await Promise.all([
(async () => { // branch 1
let idx = 0;
boots.slice(0, concurLimit).forEach((b) => { b(); idx++; });
while (idx<boots.length) {
const { pidx } = await Promise.race([...set]);
set.delete([...set][pidx]);
boots[idx++]();
}
})(),
(async () => { // branch 2
for (const p of bootableProms) console.log(await p);
})(),
]);
}
As before there are two branches
branch 1: For running and handling concurrency.
branch 2: For printing
The difference now is that no more than concurLimit Promises are allowed to run concurrently.
The important variables are
boots: The array of functions to call to force its corresponding Promise to transition. It is used only in branch 1.
set: Holds the Promises in a random-access container so that they can be easily removed once fulfilled. This container is used only in branch 1.
bootableProms: These are the same Promises as initially in set, but it is an array not a set, and the array is never changed. It is used only in branch 2.
Running with a mock fs.readFile that takes times as follows (filename vs. time in ms).
const timeTable = {
"1": 600,
"2": 500,
"3": 400,
"4": 300,
"5": 200,
"6": 100,
};
Test run times such as these are seen, showing that the concurrency limit is working --
[1]0--0.601
[2]0--0.502
[3]0.503--0.904
[4]0.608--0.908
[5]0.905--1.105
[6]0.905--1.005
Available as executable in the typescript playground sandbox
Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
const files = await getFiles();
List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
.fork( console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';
const future = futurize(Task)
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
.fork( console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
export const arrayToTaskList = list => taskFn =>
List(list).traverse( Task.of, taskFn )
export const readFiles = files =>
arrayToTaskList( files )( readFile )
export const printFiles = files =>
readFiles(files).fork( console.error, console.log)
Then from the parent function
async function main() {
/* awesome code with side-effects before */
printFiles( await getFiles() );
/* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator )
import { curry, flip } from 'ramda'
export const readFile = fs.readFile
|> future
|> curry
|> flip
export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p
As other answers have mentioned, you're probably wanting it to be executed in sequence rather than in parallel, i.e. run for the first file, wait until it's done, and only then run for the second file. That's not what will happen.
I think it's important to address why this doesn't happen.
Think about how forEach works. I can't find the source, but I presume it works something like this:
const forEach = (arr, cb) => {
for (let i = 0; i < arr.length; i++) {
cb(arr[i]);
}
};
Now think about what happens when you do something like this:
forEach(files, async function logFile(file) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
Inside forEach's for loop we're calling cb(arr[i]), which ends up being logFile(file). The logFile function has an await inside it, so maybe the for loop will wait for this await before proceeding to i++?
No, it won't. Confusingly, that's not how await works. From the docs:
An await splits execution flow, allowing the caller of the async function to resume execution. After the await defers the continuation of the async function, execution of subsequent statements ensues. If this await is the last expression executed by its function, execution continues by returning to the function's caller a pending Promise for completion of the await's function and resuming execution of that caller.
So if you have the following, the numbers won't be logged before "b":
const delay = (ms) => {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
};
const logNumbers = async () => {
console.log(1);
await delay(2000);
console.log(2);
await delay(2000);
console.log(3);
};
const main = () => {
console.log("a");
logNumbers();
console.log("b");
};
main();
Circling back to forEach, forEach is like main and logFile is like logNumbers. main won't stop just because logNumbers does some awaiting, and forEach won't stop just because logFile does some awaiting.
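If you do want "b" to be logged after the numbers, the returned promise has to be kept and awaited rather than discarded; a minimal sketch reusing logNumbers from above (mainAwaited is just an illustrative rename):
const mainAwaited = async () => {
  console.log("a");
  await logNumbers(); // keep the promise and await it instead of discarding it
  console.log("b");   // now logged only after 1, 2 and 3
};
mainAwaited();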
Here is a great example for using async/await with a forEach-style loop.
Write your own asyncForEach
async function asyncForEach(array, callback) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index, array)
}
}
You can use it like this
await asyncForEach(array, async function (item, index, array) {
  // await here
})
Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
// since AsyncAF accepts promises or non-promises, there's no need to await here
const files = getFilePaths();
AsyncAF(files).forEach(async file => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');
const printFiles = () => aaf(getFilePaths())
.map(file => fs.readFile(file, 'utf8'))
.forEach(file => aaf.log(file));
printFiles();
async-af
If you'd like to iterate over all elements concurrently:
async function asyncForEach(arr, fn) {
await Promise.all(arr.map(fn));
}
If you'd like to iterate over all elements non-concurrently (e.g. when your mapping function has side effects or running mapper over all array elements at once would be too resource costly):
Option A: Promises
function asyncForEachStrict(arr, fn) {
return new Promise((resolve) => {
arr.reduce(
(promise, cur, idx) => promise
.then(() => fn(cur, idx, arr)),
Promise.resolve(),
).then(() => resolve());
});
}
Option B: async/await
async function asyncForEachStrict(arr, fn) {
for (let idx = 0; idx < arr.length; idx += 1) {
const cur = arr[idx];
await fn(cur, idx, arr);
}
}
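A usage sketch for either variant (delayedDouble is only a stand-in for some real async work):
const delayedDouble = (n) =>
  new Promise((resolve) => setTimeout(() => resolve(n * 2), 100));

(async () => {
  // Runs the callbacks one at a time, in array order.
  await asyncForEachStrict([1, 2, 3], async (cur, idx) => {
    console.log(idx, await delayedDouble(cur));
  });
})();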
This does not use async/await, as the OP requested, and only works if you are in the back end with Node.js. It may still be helpful for some people, because the example given by the OP is to read file contents, and normally you do file reading in the back end.
Fully asynchronous and non-blocking:
const fs = require("fs")
const async = require("async")
const obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"}
const configs = {}
async.forEachOf(obj, (value, key, callback) => {
fs.readFile(__dirname + value, "utf8", (err, data) => {
if (err) return callback(err)
try {
configs[key] = JSON.parse(data);
} catch (e) {
return callback(e)
}
callback()
});
}, err => {
if (err) console.error(err.message)
// configs is now a map of JSON data
doSomethingWith(configs)
})
For TypeScript users, a Promise.all(array.map(iterator)) wrapper with working types
Using Promise.all(array.map(iterator)) has correct types since TypeScript's stdlib already handles generics.
However, copy-pasting Promise.all(array.map(iterator)) every time you need an async map is obviously suboptimal, and it doesn't convey the intention of the code very well - so most developers would wrap this into an asyncMap() wrapper function. Doing this requires the use of generics to ensure that values set with const value = await asyncMap() have the correct type.
export const asyncMap = async <ArrayItemType, IteratorReturnType>(
array: Array<ArrayItemType>,
iterator: (
value: ArrayItemType,
index?: number
) => Promise<IteratorReturnType>
): Promise<Array<IteratorReturnType>> => {
return Promise.all(array.map(iterator));
};
And a quick test:
it(`runs 3 items in parallel and returns results`, async () => {
const result = await asyncMap([1, 2, 3], async (item: number) => {
await sleep(item * 100);
return `Finished ${item}`;
});
expect(result.length).toEqual(3);
// Each item takes 100, 200 and 300ms
// So restricting this test to 300ms plus some leeway
}, 320);
sleep() is just:
const sleep = async (timeInMs: number): Promise<void> => {
return new Promise((resolve) => setTimeout(resolve, timeInMs));
};

Picture worth 1000 words - For Sequential Approach Only
Background : I was in similar situation last night. I used async function as foreach argument. The result was un-predictable. When I did testing for my code 3 times, it ran without issues 2 times and failed 1 time. (something weird)
Finally I got my head around & did some scratch pad testing.
Scenario 1 - How out of order it can get with async in forEach
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
myPromiseArray.forEach(async (element, index) => {
let result = await element;
console.log(result);
})
console.log('After For Each Loop')
}
main();
Scenario 2 - Using the for...of loop as #Bergi suggested above
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well
for (const element of myPromiseArray) {
let result = await element;
console.log(result)
}
console.log('After For Each Loop')
}
main();
If you are a little old school like me, you could simply use the classic for loop; that works too :)
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well too - the classic for loop :)
for (let i = 0; i < myPromiseArray.length; i++) {
const result = await myPromiseArray[i];
console.log(result);
}
console.log('After For Each Loop')
}
main();
I hope this helps someone, good day, cheers!
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
})
The issue is that the promise returned by the iteration function is ignored by forEach(). forEach does not wait to move to the next iteration after each async callback has completed. All the fs.readFile functions
will be invoked in the same round of the event loop, which means they are started in parallel, not sequentially, and execution continues immediately after invoking forEach(), without
waiting for all the fs.readFile operations to complete. Since forEach does not wait for each promise to resolve, the loop actually finishes iterating before the promises are resolved. You are expecting that after forEach has completed, all the async code has already executed, but that is not the case. You may end up trying to access values that are not available yet.
you can test the behaviour with this example code
const array = [1, 2, 3];
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const delayedSquare = (num) => sleep(100).then(() => num * num);
const testForEach = (numbersArray) => {
const store = [];
// this code here treated as sync code
numbersArray.forEach(async (num) => {
const squaredNum = await delayedSquare(num);
// this will log the correct squaredNum value
console.log(squaredNum);
store.push(squaredNum);
});
// you expect that store array is populated but is not
// this will return []
console.log("store",store);
};
testForEach(array);
// Notice, when you test, first "store []" will be logged
// then squaredNum's inside forEach will log
The solution is to use the for...of loop.
for (const file of files){
const contents = await fs.readFile(file, 'utf8')
}
Here are some forEachAsync prototypes. Note you'll need to await them:
Array.prototype.forEachAsync = async function (fn) {
for (let t of this) { await fn(t) }
}
Array.prototype.forEachAsyncParallel = async function (fn) {
await Promise.all(this.map(fn));
}
Note while you may include this in your own code, you should not include this in libraries you distribute to others (to avoid polluting their globals).
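If you prefer not to touch Array.prototype at all, the same helpers work as plain standalone functions; a minimal sketch:
async function forEachAsync (items, fn) {
  for (const item of items) { await fn(item) }
}

async function forEachAsyncParallel (items, fn) {
  await Promise.all(items.map(fn))
}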
#Bergi has already given the answer on how to handle this particular case properly, so I won't duplicate it here.
I'd like to address the difference between using forEach and a for loop when it comes to async and await.
how forEach works
Let's look at how forEach works. According to the ECMAScript specification, MDN provides an implementation which can be used as a polyfill. I copy and paste it here with the comments removed.
Array.prototype.forEach = function (callback, thisArg) {
if (this == null) { throw new TypeError('Array.prototype.forEach called on null or undefined'); }
var T, k;
var O = Object(this);
var len = O.length >>> 0;
if (typeof callback !== "function") { throw new TypeError(callback + ' is not a function'); }
if (arguments.length > 1) { T = thisArg; }
k = 0;
while (k < len) {
var kValue;
if (k in O) {
kValue = O[k];
callback.call(T, kValue, k, O); // pay attention to this line
}
k++;
}
};
Let's go back to your code and extract the callback as a function.
async function callback(file){
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}
So, basically, callback returns a promise since it's declared with async. Inside forEach, callback is just called in a normal way; if the callback itself returns a promise, the JavaScript engine will not wait for it to be resolved or rejected. Instead, it puts the promise in a job queue and continues executing the loop.
How about await fs.readFile(file, 'utf8') inside the callback?
Basically, when your async callback gets the chance to be executed, the JS engine will pause until fs.readFile(file, 'utf8') is resolved or rejected, and resume execution of the async function after fulfillment. So the contents variable stores the actual result from fs.readFile, not a promise, and console.log(contents) logs out the file content, not a Promise.
Why does for ... of work?
When we write a generic for...of loop, we gain more control than with forEach. Let's refactor printFiles.
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
for (const file of files) {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
// or await callback(file)
}
}
When the for loop is evaluated, we await a promise inside the async function, and execution will pause until that promise is settled. So you can think of the files as being read one by one in a determined order.
Execute sequentially
Sometimes, we really need the async functions to be executed in sequential order. For example, I have a few new records stored in an array to be saved to a database, and I want them to be saved in sequential order, which means the first record in the array should be saved first, then the second, until the last one is saved.
Here is an example:
const records = [1, 2, 3, 4];
async function saveRecord(record) {
return new Promise((resolved, rejected) => {
setTimeout(()=> {
resolved(`record ${record} saved`)
}, Math.random() * 500)
});
}
async function forEachSaveRecords(records) {
records.forEach(async (record) => {
const res = await saveRecord(record);
console.log(res);
})
}
async function forofSaveRecords(records) {
for (const record of records) {
const res = await saveRecord(record);
console.log(res);
}
}
(async () => {
console.log("=== for of save records ===")
await forofSaveRecords(records)
console.log("=== forEach save records ===")
await forEachSaveRecords(records)
})()
I use setTimeout to simulate the process of saving a record to the database - it's asynchronous and costs a random amount of time. Using forEach, the records are saved in an undetermined order, but using for..of, they are saved sequentially.
This solution is also memory-optimized, so you can run it on tens of thousands of data items and requests. Some of the other solutions here will crash the server on large data sets.
In TypeScript:
export async function asyncForEach<T>(array: Array<T>, callback: (item: T, index: number) => Promise<void>) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index);
}
}
How to use?
await asyncForEach(receipts, async (eachItem) => {
await ...
})
A simple drop-in solution for replacing a forEach() await loop that is not working is to replace forEach with map and add Promise.all( to the beginning.
For example:
await y.forEach(async (x) => {
to
await Promise.all(y.map(async (x) => {
An extra ) is needed at the end.
In addition to #Bergi’s answer, I’d like to offer a third alternative. It's very similar to #Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, all of which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
In #Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.
It's pretty painless to pop a couple of methods in a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:
module.exports = function () {
var self = this;
this.each = async (items, fn) => {
if (items && items.length) {
await Promise.all(
items.map(async (item) => {
await fn(item);
}));
}
};
this.reduce = async (items, fn, initialValue) => {
await self.each(
items, async (item) => {
initialValue = await fn(initialValue, item);
});
return initialValue;
};
};
now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
var myAsync = new MyAsync();
var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
var cleanParams = [];
// FOR EACH EXAMPLE
await myAsync.each(['bork', 'concern', 'heck'],
async (elem) => {
if (elem !== 'heck') {
await doje.update({ $push: { 'noises': elem }});
}
});
var cat = await Cat.findOne({ name: 'Nyan' });
// REDUCE EXAMPLE
var friendsOfNyanCat = await myAsync.reduce(cat.friends,
async (catArray, friendId) => {
var friend = await Friend.findById(friendId);
if (friend.name !== 'Long cat') {
catArray.push(friend.name);
}
}, []);
// Assuming Long Cat was a friend of Nyan Cat...
assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
Bergi's solution works nicely when fs is promise based.
You can use bluebird, fs-extra or fs-promise for this.
However, a solution for Node's native fs library is as follows:
const result = await Promise.all(filePaths
.map( async filePath => {
const fileContents = await getAssetFromCache(filePath, async function() {
// 1. Wrap with Promise
// 2. Return the result of the Promise
return await new Promise((res, rej) => {
fs.readFile(filePath, 'utf8', function(err, data) {
if (data) {
res(data);
}
});
});
});
return fileContents;
}));
Note:
fs.readFile from require('fs') compulsorily takes a callback function as the 3rd argument, otherwise it throws an error:
TypeError [ERR_INVALID_CALLBACK]: Callback must be a function
It is not good to call an asynchronous method from a loop. This is because each loop iteration will be delayed until the entire asynchronous operation completes. That is not very performant. It also forfeits the parallelization benefits of async/await.
A better solution would be to create all promises at once, then get access to the results using Promise.all(). Otherwise, each successive operation will not start until the previous one has completed.
Consequently, the code may be refactored as follows;
const printFiles = async () => {
const files = await getFilePaths();
const results = [];
files.forEach((file) => {
results.push(fs.readFile(file, 'utf8'));
});
const contents = await Promise.all(results);
console.log(contents);
}
One important caveat is: the await + for .. of method and the forEach + async way actually have different effects.
Having await inside a real for loop will make sure all async calls are executed one by one. And the forEach + async way will fire off all promises at the same time, which is faster but can sometimes be overwhelming (if you do some DB queries or visit some web services with volume restrictions and do not want to fire 100,000 calls at a time).
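If you want parallelism but need to cap how many calls are in flight at once (for example because of those volume restrictions), one simple middle ground is to process the array in fixed-size chunks; a minimal sketch, where chunkSize and doQuery are placeholders for your own values:
async function processInChunks (items, chunkSize, doQuery) {
  const results = []
  for (let i = 0; i < items.length; i += chunkSize) {
    // Each chunk runs in parallel, but we wait for it before starting the next chunk.
    const chunk = items.slice(i, i + chunkSize)
    results.push(...await Promise.all(chunk.map(doQuery)))
  }
  return results
}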
You can also use reduce + promise (less elegant) if you do not use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
lastPromise.then(() =>
fs.readFile(file, 'utf8')
), Promise.resolve()
)
Or you can create a forEachAsync to help but basically use the same for loop underlying.
Array.prototype.forEachAsync = async function(cb){
for(let x of this){
await cb(x);
}
}
Just adding to the original answer
The parallel reading syntax in the original answer is sometimes confusing and difficult to read; maybe we can write it with a different approach:
async function printFiles() {
const files = await getFilePaths();
const fileReadPromises = [];
const readAndLogFile = async filePath => {
const contents = await fs.readFile(filePath, "utf8");
console.log(contents);
return contents;
};
files.forEach(file => {
fileReadPromises.push(readAndLogFile(file));
});
await Promise.all(fileReadPromises);
}
For sequential operation, not just for...of - a normal for loop will also work
async function printFiles() {
const files = await getFilePaths();
for (let i = 0; i < files.length; i++) {
const file = files[i];
const contents = await fs.readFile(file, "utf8");
console.log(contents);
}
}
Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing them all into an array and resolving it in a promise after all is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))
Like #Bergi's response, but with one difference.
Promise.all rejects as soon as one of the promises gets rejected.
So, use recursion instead.
const readFilesQueue = async (files, index = 0) => {
const contents = await fs.readFile(files[index], 'utf8')
console.log(contents)
return index < files.length - 1
? readFilesQueue(files, index + 1)
: files
}
const printFiles = async () => {
const files = await getFilePaths();
const printContents = await readFilesQueue(files)
return printContents
}
printFiles()
PS
readFilesQueue is outside of printFiles because of the side effect* introduced by console.log; it's better to mock, test, and/or spy, so it's not cool to have a function that returns the content (sidenote).
Therefore, the code can simply be designed like this: three separate functions that are "pure"** and introduce no side effects, process the entire list, and can easily be modified to handle failed cases.
const files = await getFilesPath()
const printFile = async (file) => {
const content = await fs.readFile(file, 'utf8')
console.log(content)
}
const readFiles = async (files, index = 0) => {
await printFile(files[index])
return index < files.length - 1
? readFiles(files, index + 1)
: files
}
readFiles(files)
Future edit/current state
Node supports top-level await (it doesn't have a plugin yet, won't have one, and can be enabled via harmony flags); it's cool but doesn't solve one problem (strategically I work only on LTS versions). How do you get the files?
Using composition. Given the code, I get the sense that this is inside a module, so there should be a function to do it. If not, you should use an IIFE to wrap the code in an async function, creating a simple module that does it all for you, or you can go the right way: composition.
// more complex version with IIFE to a single module
(async (files) => readFiles(await files()))(getFilesPath)
Note that the name of the variable changes due to semantics. You pass a functor (a function that can be invoked by another function) and receive a pointer to memory that contains the initial block of logic of the application.
But what if it's not a module and you need to export the logic?
Wrap the functions in an async function.
export const readFilesQueue = async () => {
// ... the code goes here
}
Or change the names of variables, whatever...
* By side effect I mean any collateral effect of the application that can change the state/behaviour or introduce bugs in the application, like IO.
** "Pure" is in quotes since the functions are not pure, and the code can be converged to a pure version when there's no console output, only data manipulation.
Aside from this, to be pure, you'll need to work with monads that handle the side effect, which are error prone and treat that error separately from the application.
You can use Array.prototype.forEach, but async/await is not so compatible with it. This is because the promise returned from an async callback expects to be awaited, but Array.prototype.forEach does not await any promises from the execution of its callback. So then, you can use forEach, but you'll have to handle the promise resolution yourself.
Here is a way to read and print each file in series using Array.prototype.forEach
async function printFilesInSeries () {
const files = await getFilePaths()
let promiseChain = Promise.resolve()
files.forEach((file) => {
promiseChain = promiseChain.then(() => {
return fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
})
})
await promiseChain
}
Here is a way (still using Array.prototype.forEach) to print the contents of files in parallel
async function printFilesInParallel () {
const files = await getFilePaths()
const promises = []
files.forEach((file) => {
promises.push(
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
)
})
await Promise.all(promises)
}
Today I came across multiple solutions for this: running async/await functions in a forEach loop. By building a wrapper around it we can make this happen.
A more detailed explanation of how the native forEach works internally, why it is not able to make an async function call wait, and other details on the various methods are provided in the link here.
The multiple ways through which it can be done are as follows:
Method 1 : Using the wrapper.
await (()=>{
return new Promise((resolve,reject)=>{
items.forEach(async (item,index)=>{
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
if(index === items.length-1){
resolve('Done')
}
});
});
})();
Method 2: Using the same as a generic function of Array.prototype
Array.prototype.forEachAsync.js
if(!Array.prototype.forEachAsync) {
Array.prototype.forEachAsync = function (fn){
return new Promise((resolve,reject)=>{
this.forEach(async(item,index,array)=>{
await fn(item,index,array);
if(index === array.length-1){
resolve('done');
}
})
});
};
}
Usage :
require('./Array.prototype.forEachAsync');
let count = 0;
let hello = async (items) => {
// Method 2 - Using the Array.prototype.forEachAsync
await items.forEachAsync(async () => {
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
});
console.log("count = " + count);
}
someAPICall = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve("done") // or reject('error')
}, 100);
})
}
hello(['', '', '', '']); // hello([]) - an empty array is also handled by default
Method 3 :
Using Promise.all
await Promise.all(items.map(async (item) => {
await someAPICall();
count++;
}));
console.log("count = " + count);
Method 4 : Traditional for loop or modern for loop
// Method 4 - using for loop directly
// 1. Using the for(.. in..) loop
for(item in items){
await someAPICall();
count++;
}
//2. Using the traditional for loop
for(let i=0;i<items.length;i++){
await someAPICall();
count++;
}
console.log("count = " + count);
Currently, the Array.prototype.forEach method doesn't support async operations, but we can create our own poly-fill to meet our needs.
// Example of asyncForEach Array poly-fill for NodeJs
// file: asyncForEach.js
// Define asynForEach function
async function asyncForEach(iteratorFunction){
let indexer = 0
for(let data of this){
await iteratorFunction(data, indexer)
indexer++
}
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}
And that's it! You now have an async forEach method available on any arrays that are defined after these two operations.
Let's test it...
// Nodejs style
// file: someOtherFile.js
const readline = require('readline')
Array = require('./asyncForEach').Array
const log = console.log
// Create a stream interface
function createReader(options={prompt: '>'}){
return readline.createInterface({
input: process.stdin
,output: process.stdout
,prompt: options.prompt !== undefined ? options.prompt : '>'
})
}
// Create a cli stream reader
async function getUserIn(question, options={prompt:'>'}){
log(question)
let reader = createReader(options)
return new Promise((res)=>{
reader.on('line', (answer)=>{
process.stdout.cursorTo(0, 0)
process.stdout.clearScreenDown()
reader.close()
res(answer)
})
})
}
let questions = [
`What's your name`
,`What's your favorite programming language`
,`What's your favorite async function`
]
let responses = {}
async function getResponses(){
// Notice we have to prepend await before calling the async Array function
// in order for it to function as expected
await questions.asyncForEach(async function(question, index){
let answer = await getUserIn(question)
responses[question] = answer
})
}
async function main(){
await getResponses()
log(responses)
}
main()
// Should prompt user for an answer to each question and then
// log each question and answer as an object to the terminal
We could do the same for some of the other array functions like map...
async function asyncMap(iteratorFunction){
let newMap = []
let indexer = 0
for(let data of this){
newMap[indexer] = await iteratorFunction(data, indexer, this)
indexer++
}
return newMap
}
Array.prototype.asyncMap = asyncMap
... and so on :)
Some things to note:
Your iteratorFunction must be an async function or promise
Any arrays created before Array.prototype.<yourAsyncFunc> = <yourAsyncFunc> will not have this feature available
To see how that can go wrong, print console.log at the end of the method.
Things that can go wrong in general:
Arbitrary order.
printFiles can finish running before printing files.
Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach will result in all but the last. It'll call each function without awaiting for the function meaning it tells all of the functions to start then finishes without waiting for the functions to finish.
import fs from 'fs-promise'
async function printFiles () {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))
for(const file of files)
console.log(await file)
}
printFiles()
This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.
This will:
Initiate all of the file reads to happen in parallel.
Preserve the order via the use of map to map file names to promises to wait for.
Wait for each promise in the order defined by the array.
With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.
It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.
The only draw back of this and the original version is that if multiple reads are started at once then it's more difficult to handle errors on account of having more errors that can happen at a time.
With versions that read a file at a time then then will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid it failing on the first file but reading most of the other files already as well.
Performance is not always predictable. While many systems will be faster with parallel file reads some will prefer sequential. Some are dynamic and may shift under load, optimisations that offer latency do not always yield good throughput under heavy contention.
There is also no error handling in that example. If something requires them to either all be successfully shown or not at all it won't do that.
In depth experimentation is recommended with console.log at each stage and fake file read solutions (random delay instead). Although many solutions appear to do the same in simple cases all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
const start = +new Date();
const mock = () => {
return {
fs: {readFile: file => new Promise((resolve, reject) => {
// Instead of this just make three files and try each timing arrangement.
// IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
const time = Math.round(100 + Math.random() * 4900);
console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
setTimeout(() => {
// Bonus material here if random reject instead.
console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
resolve(file);
}, time);
})},
console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
};
};
const printFiles = (({fs, console, getFilePaths}) => {
return async function() {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));
for(const file of files)
console.log(await file);
};
})(mock());
console.log(`Running at ${new Date() - start}`);
await printFiles();
console.log(`Finished running at ${new Date() - start}`);
})();
The OP's original question
Are there any issues with using async/await in a forEach loop? ...
was covered to an extent in #Bergi's selected answer,
which showed how to process in serial and in parallel. However there are other issues noted with parallelism -
Order -- #chharvey notes that -
For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array.
Possibly opening too many files at once -- A comment by Bergi under another answer
It is also not good to open thousands of files at once to read them concurrently. One always has to do an assessment whether a sequential, parallel, or mixed approach is better.
So let's address these issues showing actual code that is brief and concise, and does not use third party libraries. Something easy to cut, paste, and modify.
Reading in parallel (all at once), printing in serial (as early as possible per file).
The easiest improvement is to perform full parallelism as in #Bergi's answer, but making a small change so that each file is printed as soon as possible while preserving order.
async function printFiles2() {
const readProms = (await getFilePaths()).map((file) =>
fs.readFile(file, "utf8")
);
await Promise.all([
await Promise.all(readProms), // branch 1
(async () => { // branch 2
for (const p of readProms) console.log(await p);
})(),
]);
}
Above, two separate branches are run concurrently.
branch 1: Reading in parallel, all at once,
branch 2: Reading in serial to force order, but waiting no longer than necessary
That was easy.
Reading in parallel with a concurrency limit, printing in serial (as early as possible per file).
A "concurrency limit" means that no more than N files will ever being read at the same time.
Like a store that only allows in so many customers at a time (at least during COVID).
First a helper function is introduced -
function bootablePromise(kickMe: () => Promise<any>) {
let resolve: (value: unknown) => void = () => {};
const promise = new Promise((res) => { resolve = res; });
const boot = () => { resolve(kickMe()); };
return { promise, boot };
}
The function bootablePromise(kickMe:() => Promise<any>) takes a
function kickMe as an argument to start a task (in our case readFile) but is not started immediately.
bootablePromise returns a couple of properties
promise of type Promise
boot of type function ()=>void
promise has two stages in life
Being a promise to start a task
Being a promise complete a task it has already started.
promise transitions from the first to the second state when boot() is called.
bootablePromise is used in printFiles --
async function printFiles4() {
const files = await getFilePaths();
const boots: (() => void)[] = [];
const set: Set<Promise<{ pidx: number }>> = new Set<Promise<any>>();
const bootableProms = files.map((file,pidx) => {
const { promise, boot } = bootablePromise(() => fs.readFile(file, "utf8"));
boots.push(boot);
set.add(promise.then(() => ({ pidx })));
return promise;
});
const concurLimit = 2;
await Promise.all([
(async () => { // branch 1
let idx = 0;
boots.slice(0, concurLimit).forEach((b) => { b(); idx++; });
while (idx<boots.length) {
const { pidx } = await Promise.race([...set]);
set.delete([...set][pidx]);
boots[idx++]();
}
})(),
(async () => { // branch 2
for (const p of bootableProms) console.log(await p);
})(),
]);
}
As before there are two branches
branch 1: For running and handling concurrency.
branch 2: For printing
The difference now is the no more than concurLimit Promises are allowed to run concurrently.
The important variables are
boots: The array of functions to call to force its corresponding Promise to transition. It is used only in branch 1.
set: There are Promises in a random access container so that they can be easily removed once fulfilled. This container is used only in branch 1.
bootableProms: These are the same Promises as initially in set, but it is an array not a set, and the array is never changed. It is used only in branch 2.
Running with a mock fs.readFile that takes times as follows (filename vs. time in ms).
const timeTable = {
"1": 600,
"2": 500,
"3": 400,
"4": 300,
"5": 200,
"6": 100,
};
test run times such as this are seen, showing the concurrency is working --
[1]0--0.601
[2]0--0.502
[3]0.503--0.904
[4]0.608--0.908
[5]0.905--1.105
[6]0.905--1.005
Available as executable in the typescript playground sandbox
Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
const files = await getFiles();
List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
.fork( console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';
const future = futurizeP(Task)
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
.fork( console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
export const arrayToTaskList = list => taskFn =>
List(files).traverse( Task.of, taskFn )
export const readFiles = files =>
arrayToTaskList( files, readFile )
export const printFiles = files =>
readFiles(files).fork( console.error, console.log)
Then from the parent function
async function main() {
/* awesome code with side-effects before */
printFiles( await getFiles() );
/* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator )
import { curry, flip } from 'ramda'
export const readFile = fs.readFile
|> future,
|> curry,
|> flip
export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p
As other answers have mentioned, you're probably wanting it to be executed in sequence rather in parallel. Ie. run for first file, wait until it's done, then once it's done run for second file. That's not what will happen.
I think it's important to address why this doesn't happen.
Think about how forEach works. I can't find the source, but I presume it works something like this:
const forEach = (arr, cb) => {
for (let i = 0; i < arr.length; i++) {
cb(arr[i]);
}
};
Now think about what happens when you do something like this:
forEach(files, async logFile(file) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
Inside forEach's for loop we're calling cb(arr[i]), which ends up being logFile(file). The logFile function has an await inside it, so maybe the for loop will wait for this await before proceeding to i++?
No, it won't. Confusingly, that's not how await works. From the docs:
An await splits execution flow, allowing the caller of the async function to resume execution. After the await defers the continuation of the async function, execution of subsequent statements ensues. If this await is the last expression executed by its function execution continues by returning to the function's caller a pending Promise for completion of the await's function and resuming execution of that caller.
So if you have the following, the numbers won't be logged before "b":
const delay = (ms) => {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
};
const logNumbers = async () => {
console.log(1);
await delay(2000);
console.log(2);
await delay(2000);
console.log(3);
};
const main = () => {
console.log("a");
logNumbers();
console.log("b");
};
main();
Circling back to forEach, forEach is like main and logFile is like logNumbers. main won't stop just because logNumbers does some awaiting, and forEach won't stop just because logFile does some awaiting.
Here is a great example for using async in forEach loop.
Write your own asyncForEach
async function asyncForEach(array, callback) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index, array)
}
}
You can use it like this
await asyncForEach(array, async function(item,index,array){
//await here
}
)
Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
// since AsyncAF accepts promises or non-promises, there's no need to await here
const files = getFilePaths();
AsyncAF(files).forEach(async file => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');
const printFiles = () => aaf(getFilePaths())
.map(file => fs.readFile(file, 'utf8'))
.forEach(file => aaf.log(file));
printFiles();
async-af
If you'd like to iterate over all elements concurrently:
async function asyncForEach(arr, fn) {
await Promise.all(arr.map(fn));
}
If you'd like to iterate over all elements non-concurrently (e.g. when your mapping function has side effects or running mapper over all array elements at once would be too resource costly):
Option A: Promises
function asyncForEachStrict(arr, fn) {
return new Promise((resolve) => {
arr.reduce(
(promise, cur, idx) => promise
.then(() => fn(cur, idx, arr)),
Promise.resolve(),
).then(() => resolve());
});
}
Option B: async/await
async function asyncForEachStrict(arr, fn) {
for (let idx = 0; idx < arr.length; idx += 1) {
const cur = arr[idx];
await fn(cur, idx, arr);
}
}
This does not use async/await as the OP requested and only works if you are in the back-end with NodeJS. Although it still may be helpful for some people, because the example given by OP is to read file contents, and normally you do file reading in the backend.
Fully asynchronous and non-blocking:
const fs = require("fs")
const async = require("async")
const obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"}
const configs = {}
async.forEachOf(obj, (value, key, callback) => {
fs.readFile(__dirname + value, "utf8", (err, data) => {
if (err) return callback(err)
try {
configs[key] = JSON.parse(data);
} catch (e) {
return callback(e)
}
callback()
});
}, err => {
if (err) console.error(err.message)
// configs is now a map of JSON data
doSomethingWith(configs)
})
For TypeScript users, a Promise.all(array.map(iterator)) wrapper with working types
Using Promise.all(array.map(iterator)) has correct types since the TypeScript's stdlib support already handles generics.
However copy pasting Promise.all(array.map(iterator)) every time you need an async map is obviously suboptimal, and Promise.all(array.map(iterator)) doesn't convey the intention of the code very well - so most developers would wrap this into an asyncMap() wrapper function. However doing this requires use of generics to ensure that values set with const value = await asyncMap() have the correct type.
export const asyncMap = async <ArrayItemType, IteratorReturnType>(
array: Array<ArrayItemType>,
iterator: (
value: ArrayItemType,
index?: number
) => Promise<IteratorReturnType>
): Promise<Array<IteratorReturnType>> => {
return Promise.all(array.map(iterator));
};
And a quick test:
it(`runs 3 items in parallel and returns results`, async () => {
const result = await asyncMap([1, 2, 3], async (item: number) => {
await sleep(item * 100);
return `Finished ${item}`;
});
expect(result.length).toEqual(3);
// Each item takes 100, 200 and 300ms
// So restricting this test to 300ms plus some leeway
}, 320);
sleep() is just:
const sleep = async (timeInMs: number): Promise<void> => {
return new Promise((resolve) => setTimeout(resolve, timeInMs));
};

Picture worth 1000 words - For Sequential Approach Only
Background : I was in similar situation last night. I used async function as foreach argument. The result was un-predictable. When I did testing for my code 3 times, it ran without issues 2 times and failed 1 time. (something weird)
Finally I got my head around it and did some scratch-pad testing.
Scenario 1 - How out-of-order it can get with async in forEach
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
myPromiseArray.forEach(async (element, index) => {
let result = await element;
console.log(result);
})
console.log('After For Each Loop')
}
main();
Scenario 2 - Using a for...of loop as #Bergi suggested above
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well
for (const element of myPromiseArray) {
let result = await element;
console.log(result)
}
console.log('After For Each Loop')
}
main();
If you are a little old school like me, you could simply use the classic for loop; that works too :)
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well too - the classic for loop :)
for (let i = 0; i < myPromiseArray.length; i++) {
const result = await myPromiseArray[i];
console.log(result);
}
console.log('After For Each Loop')
}
main();
I hope this helps someone, good day, cheers!
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
})
The issue is that the promise returned by the iteration function is ignored by forEach(). forEach does not wait for each async callback to complete before moving to the next iteration. All the fs.readFile functions are invoked in the same round of the event loop, which means they are started in parallel, not sequentially, and execution continues immediately after invoking forEach(), without waiting for all the fs.readFile operations to complete. Since forEach does not wait for each promise to resolve, the loop finishes iterating before the promises are resolved. You are expecting that after forEach completes, all the async code has already executed, but that is not the case. You may end up trying to access values that are not available yet.
You can test the behaviour with this example code:
const array = [1, 2, 3];
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const delayedSquare = (num) => sleep(100).then(() => num * num);
const testForEach = (numbersArray) => {
const store = [];
// the async callback is treated like sync code: its returned promise is ignored
numbersArray.forEach(async (num) => {
const squaredNum = await delayedSquare(num);
// this will log the correct squaredNum value
console.log(squaredNum);
store.push(squaredNum);
});
// you expect that store array is populated but is not
// this will return []
console.log("store",store);
};
testForEach(array);
// Notice, when you test, first "store []" will be logged
// then squaredNum's inside forEach will log
The solution is to use a for...of loop:
for (const file of files){
const contents = await fs.readFile(file, 'utf8')
}
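For comparison, here is a minimal sketch of the same delayedSquare test rewritten with for...of (reusing the array, sleep and delayedSquare helpers from above); this time the store array is fully populated before it is logged:
const testForOf = async (numbersArray) => {
  const store = [];
  for (const num of numbersArray) {
    // each iteration waits for the previous square before continuing
    const squaredNum = await delayedSquare(num);
    store.push(squaredNum);
  }
  // store now holds [1, 4, 9]
  console.log("store", store);
};
testForOf(array);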
Here are some forEachAsync prototypes. Note you'll need to await them:
Array.prototype.forEachAsync = async function (fn) {
for (let t of this) { await fn(t) }
}
Array.prototype.forEachAsyncParallel = async function (fn) {
await Promise.all(this.map(fn));
}
Note while you may include this in your own code, you should not include this in libraries you distribute to others (to avoid polluting their globals).
#Bergi has already given the answer on how to handle this particular case properly, so I won't duplicate it here.
I'd like to address the difference between using forEach and a for loop when it comes to async and await.
How forEach works
Let's look at how forEach works. Following the ECMAScript Specification, MDN provides an implementation which can be used as a polyfill. I've copied it here with the comments removed.
Array.prototype.forEach = function (callback, thisArg) {
if (this == null) { throw new TypeError('Array.prototype.forEach called on null or undefined'); }
var T, k;
var O = Object(this);
var len = O.length >>> 0;
if (typeof callback !== "function") { throw new TypeError(callback + ' is not a function'); }
if (arguments.length > 1) { T = thisArg; }
k = 0;
while (k < len) {
var kValue;
if (k in O) {
kValue = O[k];
callback.call(T, kValue, k, O); // pay attention to this line
}
k++;
}
};
Let's get back to your code and extract the callback as a function.
async function callback(file){
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}
So, basically, callback returns a promise since it's declared with async. Inside forEach, callback is just called in the normal way; if the callback itself returns a promise, the JavaScript engine will not wait for it to be resolved or rejected. Instead, it puts the promise in a job queue and continues executing the loop.
How about await fs.readFile(file, 'utf8') inside the callback?
Basically, when your async callback gets the chance to be executed, the JS engine pauses it until fs.readFile(file, 'utf8') is resolved or rejected, and resumes execution of the async function after fulfillment. So the contents variable stores the actual result from fs.readFile, not a promise, and console.log(contents) logs out the file content, not a Promise.
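A minimal sketch of that behaviour (fakeReadFile and the file names are made up for illustration): the async callback hands back a pending Promise, and forEach simply discards it:
const fakeReadFile = (file) =>
  new Promise((resolve) => setTimeout(() => resolve(`contents of ${file}`), 100));

async function callback (file) {
  const contents = await fakeReadFile(file);
  console.log(contents);
}

['a.txt', 'b.txt'].forEach((file) => {
  const returned = callback(file);
  // forEach never looks at this value; it is still a pending Promise here
  console.log(returned instanceof Promise); // true
});
console.log('forEach already finished'); // logged before any file contents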
Why does for ... of work?
When we write a plain for...of loop, we gain more control than with forEach. Let's refactor printFiles.
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
for (const file of files) {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
// or await callback(file)
}
}
When the for loop is evaluated, the await promise sits inside the async function itself, so execution pauses until that awaited promise is settled. You can think of it as the files being read one by one, in a determined order.
Execute sequentially
Sometimes we really need the async functions to be executed in sequential order. For example, I have a few new records stored in an array to be saved to a database, and I want them to be saved in sequential order, which means the first record in the array should be saved first, then the second, and so on until the last one is saved.
Here is an example:
const records = [1, 2, 3, 4];
async function saveRecord(record) {
return new Promise((resolved, rejected) => {
setTimeout(()=> {
resolved(`record ${record} saved`)
}, Math.random() * 500)
});
}
async function forEachSaveRecords(records) {
records.forEach(async (record) => {
const res = await saveRecord(record);
console.log(res);
})
}
async function forofSaveRecords(records) {
for (const record of records) {
const res = await saveRecord(record);
console.log(res);
}
}
(async () => {
console.log("=== for of save records ===")
await forofSaveRecords(records)
console.log("=== forEach save records ===")
await forEachSaveRecords(records)
})()
I use setTimeout to simulate the process of saving a record to a database - it's asynchronous and takes a random amount of time. Using forEach, the records are saved in an undetermined order, but using for...of, they are saved sequentially.
This solution is also memory-optimized, so you can run it on tens of thousands of data items and requests. Some of the other solutions here will crash the server on large data sets.
In TypeScript:
export async function asyncForEach<T>(array: Array<T>, callback: (item: T, index: number) => Promise<void>) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index);
}
}
How to use?
await asyncForEach(receipts, async (eachItem) => {
await ...
})
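Applied to the original question, a minimal sketch using this helper might look like the following (getFilePaths and the promise-based fs are assumptions carried over from the question):
async function printFiles () {
  const files = await getFilePaths();
  await asyncForEach(files, async (file) => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
}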
A simple drop-in fix for a forEach() await loop that is not working is to replace forEach with map and add Promise.all( at the beginning.
For example:
await y.forEach(async (x) => {
to
await Promise.all(y.map(async (x) => {
An extra ) is needed at the end.
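Here is a small sketch of the full before/after transformation (items and fetchItem are made-up names used only for illustration; both snippets are assumed to sit inside an async function):
// Before: forEach starts the calls, but nothing is actually awaited
await items.forEach(async (item) => {
  await fetchItem(item);
});

// After: map collects the promises and Promise.all awaits them
await Promise.all(items.map(async (item) => {
  await fetchItem(item);
}));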
In addition to #Bergi’s answer, I’d like to offer a third alternative. It's very similar to #Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, each which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
In #Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.
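A small self-contained sketch of that ordering guarantee, using timers instead of files:
const delay = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

(async () => {
  const results = await Promise.all([
    delay(300, 'slow'), // finishes last
    delay(100, 'fast'), // finishes first
  ]);
  // The results follow the input array order, not completion order
  console.log(results); // ['slow', 'fast']
})();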
It's pretty painless to put a couple of methods in a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:
module.exports = function () {
var self = this;
this.each = async (items, fn) => {
if (items && items.length) {
await Promise.all(
items.map(async (item) => {
await fn(item);
}));
}
};
this.reduce = async (items, fn, initialValue) => {
await self.each(
items, async (item) => {
initialValue = await fn(initialValue, item);
});
return initialValue;
};
};
now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
var myAsync = new MyAsync();
var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
var cleanParams = [];
// FOR EACH EXAMPLE
await myAsync.each(['bork', 'concern', 'heck'],
async (elem) => {
if (elem !== 'heck') {
await doje.update({ $push: { 'noises': elem }});
}
});
var cat = await Cat.findOne({ name: 'Nyan' });
// REDUCE EXAMPLE
var friendsOfNyanCat = await myAsync.reduce(cat.friends,
async (catArray, friendId) => {
var friend = await Friend.findById(friendId);
if (friend.name !== 'Long cat') {
catArray.push(friend.name);
}
// return the accumulator, otherwise it becomes undefined on the next step
return catArray;
}, []);
// Assuming Long Cat was a friend of Nyan Cat...
assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
Bergi's solution works nicely when fs is promise-based.
You can use bluebird, fs-extra or fs-promise for this.
However, a solution for Node's native fs library is as follows:
const result = await Promise.all(filePaths
.map( async filePath => {
const fileContents = await getAssetFromCache(filePath, async function() {
// 1. Wrap with Promise
// 2. Return the result of the Promise
return await new Promise((res, rej) => {
fs.readFile(filePath, 'utf8', function(err, data) {
if (err) return rej(err); // reject on error instead of leaving the promise pending
res(data);
});
});
});
return fileContents;
}));
Note:
Node's callback-based fs.readFile requires a function as its third argument; otherwise it throws an error:
TypeError [ERR_INVALID_CALLBACK]: Callback must be a function
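As a side note, on Node versions that ship the promise-based API (fs.promises), the manual Promise wrapper can usually be avoided; a minimal sketch:
const fsp = require('fs').promises;

async function readAll (filePaths) {
  // each readFile call returns a Promise; Promise.all preserves input order
  return Promise.all(filePaths.map((filePath) => fsp.readFile(filePath, 'utf8')));
}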
It is not good to await an asynchronous method in each iteration of a loop when the operations are independent, because every iteration is then delayed until the previous asynchronous operation completes. That is not very performant, and it forgoes the parallelization benefits of async/await.
A better solution would be to create all promises at once, then get access to the results using Promise.all(). Otherwise, each successive operation will not start until the previous one has completed.
Consequently, the code may be refactored as follows;
const printFiles = async () => {
const files = await getFilePaths();
const results = [];
files.forEach((file) => {
results.push(fs.readFile(file, 'utf8'));
});
const contents = await Promise.all(results);
console.log(contents);
}
One important caveat: the await + for...of method and the forEach + async way actually have different effects.
Having await inside a real for loop makes sure all async calls are executed one by one. The forEach + async way fires off all promises at the same time, which is faster but can be overwhelming (for example, if you run DB queries or hit web services with volume restrictions and do not want to fire 100,000 calls at a time).
You can also use reduce + promise (less elegant) if you do not use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
lastPromise.then(() =>
fs.readFile(file, 'utf8')
), Promise.resolve()
)
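If you also want to do something with each file's contents (the chain above discards them), a minimal variant of the same reduce approach might be:
files.reduce((lastPromise, file) =>
  lastPromise.then(() =>
    fs.readFile(file, 'utf8').then((contents) => {
      console.log(contents);
    })
  ), Promise.resolve()
);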
Or you can create a forEachAsync helper, which basically uses the same for loop underneath.
Array.prototype.forEachAsync = async function(cb){
for(let x of this){
await cb(x);
}
}
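Usage would then look something like this (inside an async function, since forEachAsync itself returns a Promise that should be awaited; getFilePaths and the promise-based fs are assumed from the question):
const files = await getFilePaths();
await files.forEachAsync(async (file) => {
  const contents = await fs.readFile(file, 'utf8');
  console.log(contents);
});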
Just adding to the original answer:
The parallel-reading syntax in the original answer can be confusing and difficult to read, so maybe we can write it in a different way.
async function printFiles() {
const files = await getFilePaths();
const fileReadPromises = [];
const readAndLogFile = async filePath => {
const contents = await fs.readFile(filePath, "utf8");
console.log(contents);
return contents;
};
files.forEach(file => {
fileReadPromises.push(readAndLogFile(file));
});
await Promise.all(fileReadPromises);
}
For sequential operation, not just for...of; a normal for loop will also work:
async function printFiles() {
const files = await getFilePaths();
for (let i = 0; i < files.length; i++) {
const file = files[i];
const contents = await fs.readFile(file, "utf8");
console.log(contents);
}
}
Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing it all into an array and resolving the promise after everything is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))
Like #Bergi's response, but with one difference.
Promise.all rejects as soon as any one of the promises rejects.
So, use recursion instead.
const readFilesQueue = async (files, index = 0) => {
const contents = await fs.readFile(files[index], 'utf8')
console.log(contents)
return index < files.length - 1
? readFilesQueue(files, index + 1)
: files
}
const printFiles = async () => {
const files = await getFilePaths();
const printContents = await readFilesQueue(files)
return printContents
}
printFiles()
PS
readFilesQueue is outside of printFiles because of the side effect* introduced by console.log; it's better to mock, test, and/or spy on it, so it's not great to have a function that both returns the content and logs it (side note).
Therefore, the code can simply be designed as three separate functions that are "pure"**, introduce no side effects, process the entire list, and can easily be modified to handle failure cases.
const files = await getFilesPath()
const printFile = async (file) => {
const content = await fs.readFile(file, 'utf8')
console.log(content)
}
const readFiles = async (files, index = 0) => {
await printFile(files[index])
return index < files.length - 1
? readFiles(files, index + 1)
: files
}
readFiles(files)
Future edit/current state
Node supports top-level await (it doesn't have a plugin yet, won't have one, and can be enabled via harmony flags); it's cool but doesn't solve one problem (strategically, I work only on LTS versions). How do we get the files?
Using composition. Given the code, I get the sense that it lives inside a module, so there should be a function to do it. If not, you should use an IIFE to wrap the code into an async function, creating a simple module that does it all for you, or you can go the right way: composition.
// more complex version with an IIFE acting as a single module
(async (files) => readFiles(await files()))(getFilesPath)
Note that the variable name changes due to semantics. You pass a functor (a function that can be invoked by another function) and receive a reference to the initial block of the application's logic.
But what if it's not a module and you need to export the logic?
Wrap the functions in an async function.
export const readFilesQueue = async () => {
// ... the code goes here
}
Or change the names of the variables, whatever...
* "side effect" means any collateral effect of the application that can change its state/behaviour or introduce bugs, like IO.
** "pure" is in quotes since the functions are not truly pure; the code can be converged to a pure version when there's no console output, only data manipulations.
Aside from this, to be pure you'd need to work with monads that handle the side effect, which are error-prone and treat that error separately from the application.
You can use Array.prototype.forEach, but async/await does not play well with it. The promise returned from an async callback expects to be awaited, but Array.prototype.forEach does not do anything with the promises returned by its callback. So you can use forEach, but you'll have to handle the promise resolution yourself.
Here is a way to read and print each file in series using Array.prototype.forEach
async function printFilesInSeries () {
const files = await getFilePaths()
let promiseChain = Promise.resolve()
files.forEach((file) => {
promiseChain = promiseChain.then(() => {
return fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
})
})
await promiseChain
}
Here is a way (still using Array.prototype.forEach) to print the contents of files in parallel
async function printFilesInParallel () {
const files = await getFilePaths()
const promises = []
files.forEach((file) => {
promises.push(
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
)
})
await Promise.all(promises)
}
Today I came across multiple solutions for running async/await functions in a forEach loop. By building a wrapper around it, we can make this happen.
A more detailed explanation of how the native forEach works internally, why it cannot await an async function call, and other details on the various methods are provided in the link here.
The multiple ways in which it can be done are as follows:
Method 1 : Using the wrapper.
await (()=>{
return new Promise((resolve,reject)=>{
items.forEach(async (item,index)=>{
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
if(index === items.length-1){
resolve('Done')
}
});
});
})();
Method 2: Using the same as a generic function of Array.prototype
Array.prototype.forEachAsync.js
if(!Array.prototype.forEachAsync) {
Array.prototype.forEachAsync = function (fn){
return new Promise((resolve,reject)=>{
this.forEach(async(item,index,array)=>{
await fn(item,index,array);
if(index === array.length-1){
resolve('done');
}
})
});
};
}
Usage :
require('./Array.prototype.forEachAsync');
let count = 0;
let hello = async (items) => {
// Method 1 - Using the Array.prototype.forEach
await items.forEachAsync(async () => {
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
});
console.log("count = " + count);
}
someAPICall = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve("done") // or reject('error')
}, 100);
})
}
hello(['', '', '', '']); // hello([]) - an empty array is also handled by default
Method 3 :
Using Promise.all
await Promise.all(items.map(async (item) => {
await someAPICall();
count++;
}));
console.log("count = " + count);
Method 4 : Traditional for loop or modern for loop
// Method 4 - using for loop directly
// 1. Using the for...in loop (iterates over the indexes)
for (const item in items) {
await someAPICall();
count++;
}
//2. Using the traditional for loop
for(let i=0;i<items.length;i++){
await someAPICall();
count++;
}
console.log("count = " + count);
Currently the Array.prototype.forEach method doesn't support async operations, but we can create our own polyfill to meet our needs.
// Example of asyncForEach Array poly-fill for NodeJs
// file: asyncForEach.js
// Define asynForEach function
async function asyncForEach(iteratorFunction){
let indexer = 0
for(let data of this){
await iteratorFunction(data, indexer)
indexer++
}
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}
And that's it! You now have an async forEach method available on any arrays that are defined after these two operations.
Let's test it...
// Nodejs style
// file: someOtherFile.js
const readline = require('readline')
Array = require('./asyncForEach').Array
const log = console.log
// Create a stream interface
function createReader(options={prompt: '>'}){
return readline.createInterface({
input: process.stdin
,output: process.stdout
,prompt: options.prompt !== undefined ? options.prompt : '>'
})
}
// Create a cli stream reader
async function getUserIn(question, options={prompt:'>'}){
log(question)
let reader = createReader(options)
return new Promise((res)=>{
reader.on('line', (answer)=>{
process.stdout.cursorTo(0, 0)
process.stdout.clearScreenDown()
reader.close()
res(answer)
})
})
}
let questions = [
`What's your name`
,`What's your favorite programming language`
,`What's your favorite async function`
]
let responses = {}
async function getResponses(){
// Notice we have to prepend await before calling the async Array function
// in order for it to function as expected
await questions.asyncForEach(async function(question, index){
let answer = await getUserIn(question)
responses[question] = answer
})
}
async function main(){
await getResponses()
log(responses)
}
main()
// Should prompt user for an answer to each question and then
// log each question and answer as an object to the terminal
We could do the same for some of the other array functions like map...
async function asyncMap(iteratorFunction){
let newMap = []
let indexer = 0
for(let data of this){
newMap[indexer] = await iteratorFunction(data, indexer, this)
indexer++
}
return newMap
}
Array.prototype.asyncMap = asyncMap
... and so on :)
Some things to note:
Your iteratorFunction must be an async function or promise
Any arrays created before Array.prototype.<yourAsyncFunc> = <yourAsyncFunc> will not have this feature available
To see how that can go wrong, add a console.log at the end of the method.
Things that can go wrong in general:
Arbitrary order.
printFiles can finish running before printing files.
Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach will result in all but the last. It calls each function without awaiting it, meaning it tells all of the functions to start and then finishes without waiting for them to complete.
import fs from 'fs-promise'
async function printFiles () {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))
for(const file of files)
console.log(await file)
}
printFiles()
This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.
This will:
Initiate all of the file reads to happen in parallel.
Preserve the order via the use of map to map file names to promises to wait for.
Wait for each promise in the order defined by the array.
With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.
It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.
The only drawback of this and the original version is that if multiple reads are started at once, it's more difficult to handle errors, because more errors can happen at a time.
Versions that read one file at a time will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid failing on the first file while having already read most of the other files as well.
Performance is not always predictable. While many systems will be faster with parallel file reads, some will prefer sequential reads. Some are dynamic and may shift under load; optimisations that offer low latency do not always yield good throughput under heavy contention.
There is also no error handling in that example. If something requires them all to be shown successfully or not at all, it won't do that.
In-depth experimentation is recommended, with console.log at each stage and fake file-read solutions (random delay instead). Although many solutions appear to do the same thing in simple cases, they all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
const start = +new Date();
const mock = () => {
return {
fs: {readFile: file => new Promise((resolve, reject) => {
// Instead of this just make three files and try each timing arrangement.
// IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
const time = Math.round(100 + Math.random() * 4900);
console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
setTimeout(() => {
// Bonus material here if random reject instead.
console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
resolve(file);
}, time);
})},
console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
};
};
const printFiles = (({fs, console, getFilePaths}) => {
return async function() {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));
for(const file of files)
console.log(await file);
};
})(mock());
console.log(`Running at ${new Date() - start}`);
await printFiles();
console.log(`Finished running at ${new Date() - start}`);
})();
The OP's original question
Are there any issues with using async/await in a forEach loop? ...
was covered to an extent in #Bergi's selected answer,
which showed how to process in serial and in parallel. However there are other issues noted with parallelism -
Order -- #chharvey notes that -
For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array.
Possibly opening too many files at once -- A comment by Bergi under another answer
It is also not good to open thousands of files at once to read them concurrently. One always has to do an assessment whether a sequential, parallel, or mixed approach is better.
So let's address these issues showing actual code that is brief and concise, and does not use third party libraries. Something easy to cut, paste, and modify.
Reading in parallel (all at once), printing in serial (as early as possible per file).
The easiest improvement is to perform full parallelism as in #Bergi's answer, but making a small change so that each file is printed as soon as possible while preserving order.
async function printFiles2() {
const readProms = (await getFilePaths()).map((file) =>
fs.readFile(file, "utf8")
);
await Promise.all([
await Promise.all(readProms), // branch 1
(async () => { // branch 2
for (const p of readProms) console.log(await p);
})(),
]);
}
Above, two separate branches are run concurrently.
branch 1: Reading in parallel, all at once,
branch 2: Printing in serial to force order, but waiting no longer than necessary
That was easy.
Reading in parallel with a concurrency limit, printing in serial (as early as possible per file).
A "concurrency limit" means that no more than N files will ever be read at the same time.
Like a store that only allows in so many customers at a time (at least during COVID).
First a helper function is introduced -
function bootablePromise(kickMe: () => Promise<any>) {
let resolve: (value: unknown) => void = () => {};
const promise = new Promise((res) => { resolve = res; });
const boot = () => { resolve(kickMe()); };
return { promise, boot };
}
The function bootablePromise(kickMe: () => Promise<any>) takes a function kickMe as an argument to start a task (in our case readFile), but the task is not started immediately.
bootablePromise returns two properties:
promise of type Promise
boot of type function () => void
promise has two stages in life
Being a promise to start a task
Being a promise to complete a task it has already started
promise transitions from the first to the second state when boot() is called.
bootablePromise is used in printFiles --
async function printFiles4() {
const files = await getFilePaths();
const boots: (() => void)[] = [];
const set: Map<number, Promise<{ pidx: number }>> = new Map();
const bootableProms = files.map((file,pidx) => {
const { promise, boot } = bootablePromise(() => fs.readFile(file, "utf8"));
boots.push(boot);
set.set(pidx, promise.then(() => ({ pidx })));
return promise;
});
const concurLimit = 2;
await Promise.all([
(async () => { // branch 1
let idx = 0;
boots.slice(0, concurLimit).forEach((b) => { b(); idx++; });
while (idx<boots.length) {
const { pidx } = await Promise.race([...set.values()]);
set.delete(pidx); // remove the settled promise so the race only sees pending ones
boots[idx++]();
}
})(),
(async () => { // branch 2
for (const p of bootableProms) console.log(await p);
})(),
]);
}
As before there are two branches
branch 1: For running and handling concurrency.
branch 2: For printing
The difference now is that no more than concurLimit Promises are allowed to run concurrently.
The important variables are:
boots: The array of functions to call to force the corresponding Promise to transition. It is used only in branch 1.
set: The pending Promises, kept in a random-access container so that they can be easily removed once fulfilled. This container is used only in branch 1.
bootableProms: The underlying read Promises, in an array that is never changed. It is used only in branch 2.
Running with a mock fs.readFile that takes times as follows (filename vs. time in ms).
const timeTable = {
"1": 600,
"2": 500,
"3": 400,
"4": 300,
"5": 200,
"6": 100,
};
test run times such as this are seen, showing the concurrency is working --
[1]0--0.601
[2]0--0.502
[3]0.503--0.904
[4]0.608--0.908
[5]0.905--1.105
[6]0.905--1.005
Available as executable in the typescript playground sandbox
Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
const files = await getFiles();
List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
.fork( console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';
const future = futurize(Task)
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
.fork( console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, so making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
export const arrayToTaskList = list => taskFn =>
List(list).traverse( Task.of, taskFn )
export const readFiles = files =>
arrayToTaskList( files )( readFile )
export const printFiles = files =>
readFiles(files).fork( console.error, console.log)
Then from the parent function
async function main() {
/* awesome code with side-effects before */
printFiles( await getFiles() );
/* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator )
import { curry, flip } from 'ramda'
export const readFile = fs.readFile
|> future
|> curry
|> flip
export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p
As other answers have mentioned, you probably want it to be executed in sequence rather than in parallel. I.e., run for the first file, wait until it's done, and only then run for the second file. That's not what will happen.
I think it's important to address why this doesn't happen.
Think about how forEach works. I can't find the source, but I presume it works something like this:
const forEach = (arr, cb) => {
for (let i = 0; i < arr.length; i++) {
cb(arr[i]);
}
};
Now think about what happens when you do something like this:
forEach(files, async function logFile(file) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
Inside forEach's for loop we're calling cb(arr[i]), which ends up being logFile(file). The logFile function has an await inside it, so maybe the for loop will wait for this await before proceeding to i++?
No, it won't. Confusingly, that's not how await works. From the docs:
An await splits execution flow, allowing the caller of the async function to resume execution. After the await defers the continuation of the async function, execution of subsequent statements ensues. If this await is the last expression executed by its function execution continues by returning to the function's caller a pending Promise for completion of the await's function and resuming execution of that caller.
So if you have the following, the numbers won't be logged before "b":
const delay = (ms) => {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
};
const logNumbers = async () => {
console.log(1);
await delay(2000);
console.log(2);
await delay(2000);
console.log(3);
};
const main = () => {
console.log("a");
logNumbers();
console.log("b");
};
main();
Circling back to forEach, forEach is like main and logFile is like logNumbers. main won't stop just because logNumbers does some awaiting, and forEach won't stop just because logFile does some awaiting.
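If main were supposed to wait, it would itself have to be async and await the promise that logNumbers returns; a minimal sketch of that change (mainAwaited is just an illustrative rename):
const mainAwaited = async () => {
  console.log("a");
  // awaiting the returned promise is what makes the caller wait
  await logNumbers();
  console.log("b"); // now logged after 1, 2 and 3
};
mainAwaited();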
Here is a great example of using async in a forEach-style loop.
Write your own asyncForEach:
async function asyncForEach(array, callback) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index, array)
}
}
You can use it like this
await asyncForEach(array, async function(item,index,array){
//await here
}
)
Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
// since AsyncAF accepts promises or non-promises, there's no need to await here
const files = getFilePaths();
AsyncAF(files).forEach(async file => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');
const printFiles = () => aaf(getFilePaths())
.map(file => fs.readFile(file, 'utf8'))
.forEach(file => aaf.log(file));
printFiles();
async-af
If you'd like to iterate over all elements concurrently:
async function asyncForEach(arr, fn) {
await Promise.all(arr.map(fn));
}
If you'd like to iterate over all elements non-concurrently (e.g. when your mapping function has side effects, or running the mapper over all array elements at once would be too resource-costly):
Option A: Promises
function asyncForEachStrict(arr, fn) {
return new Promise((resolve) => {
arr.reduce(
(promise, cur, idx) => promise
.then(() => fn(cur, idx, arr)),
Promise.resolve(),
).then(() => resolve());
});
}
Option B: async/await
async function asyncForEachStrict(arr, fn) {
for (let idx = 0; idx < arr.length; idx += 1) {
const cur = arr[idx];
await fn(cur, idx, arr);
}
}
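Either helper could then be applied to the original example roughly like this (a sketch; getFilePaths and the promise-based fs are assumptions carried over from the question):
async function printFiles () {
  const files = await getFilePaths();

  // all files read concurrently
  await asyncForEach(files, async (file) => {
    console.log(await fs.readFile(file, 'utf8'));
  });

  // or: one file at a time, in array order
  await asyncForEachStrict(files, async (file) => {
    console.log(await fs.readFile(file, 'utf8'));
  });
}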
This does not use async/await, as the OP requested, and only works if you are in the back end with Node.js. It may still be helpful for some people, because the example given by the OP is reading file contents, and normally you do file reading in the back end.
Fully asynchronous and non-blocking:
const fs = require("fs")
const async = require("async")
const obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"}
const configs = {}
async.forEachOf(obj, (value, key, callback) => {
fs.readFile(__dirname + value, "utf8", (err, data) => {
if (err) return callback(err)
try {
configs[key] = JSON.parse(data);
} catch (e) {
return callback(e)
}
callback()
});
}, err => {
if (err) console.error(err.message)
// configs is now a map of JSON data
doSomethingWith(configs)
})
For TypeScript users, a Promise.all(array.map(iterator)) wrapper with working types
Using Promise.all(array.map(iterator)) has correct types since the TypeScript's stdlib support already handles generics.
However copy pasting Promise.all(array.map(iterator)) every time you need an async map is obviously suboptimal, and Promise.all(array.map(iterator)) doesn't convey the intention of the code very well - so most developers would wrap this into an asyncMap() wrapper function. However doing this requires use of generics to ensure that values set with const value = await asyncMap() have the correct type.
export const asyncMap = async <ArrayItemType, IteratorReturnType>(
array: Array<ArrayItemType>,
iterator: (
value: ArrayItemType,
index?: number
) => Promise<IteratorReturnType>
): Promise<Array<IteratorReturnType>> => {
return Promise.all(array.map(iterator));
};
And a quick test:
it(`runs 3 items in parallel and returns results`, async () => {
const result = await asyncMap([1, 2, 3], async (item: number) => {
await sleep(item * 100);
return `Finished ${item}`;
});
expect(result.length).toEqual(3);
// Each item takes 100, 200 and 300ms
// So restricting this test to 300ms plus some leeway
}, 320);
sleep() is just:
const sleep = async (timeInMs: number): Promise<void> => {
return new Promise((resolve) => setTimeout(resolve, timeInMs));
};

How to correctly use async await in a loop [duplicate]

Are there any issues with using async/await in a forEach loop? I'm trying to loop through an array of files and await on the contents of each file.
import fs from 'fs-promise'
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
})
}
printFiles()
This code does work, but could something go wrong with this? I had someone tell me that you're not supposed to use async/await in a higher-order function like this, so I just wanted to ask if there was any issue with this.
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles function does immediately return after that.
Reading in sequence
If you want to read the files in sequence, you cannot use forEach indeed. Just use a modern for … of loop instead, in which await will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
Reading in parallel
If you want to read the files in parallel, you cannot use forEach indeed. Each of the async callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you'll get with Promise.all:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}
With ES2018, you are able to greatly simplify all of the above answers to:
async function printFiles () {
const files = await getFilePaths()
for await (const contents of files.map(file => fs.readFile(file, 'utf8'))) {
console.log(contents)
}
}
See spec: proposal-async-iteration
Simplified:
for await (const results of array) {
await longRunningTask()
}
console.log('I will wait')
2018-09-10: This answer has been getting a lot of attention recently, please see Axel Rauschmayer's blog post for further information about asynchronous iteration.
Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the Promises are resolved), I use Array.prototype.reduce, starting with a resolved Promise:
async function printFiles () {
const files = await getFilePaths();
await files.reduce(async (promise, file) => {
// This line will wait for the last async function to finish.
// The first iteration uses an already resolved Promise
// so, it will immediately continue.
await promise;
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}, Promise.resolve());
}
The p-iteration module on npm implements the Array iteration methods so they can be used in a very straightforward way with async/await.
An example with your case:
const { forEach } = require('p-iteration');
const fs = require('fs-promise');
(async function printFiles () {
const files = await getFilePaths();
await forEach(files, async (file) => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
})();
Picture worth 1000 words - For Sequential Approach Only
Background : I was in similar situation last night. I used async function as foreach argument. The result was un-predictable. When I did testing for my code 3 times, it ran without issues 2 times and failed 1 time. (something weird)
Finally I got my head around & did some scratch pad testing.
Scenario 1 - How un-sequential it can get with async in foreach
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
myPromiseArray.forEach(async (element, index) => {
let result = await element;
console.log(result);
})
console.log('After For Each Loop')
}
main();
Scenario 2 - Using for - of loop as #Bergi above suggested
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well
for (const element of myPromiseArray) {
let result = await element;
console.log(result)
}
console.log('After For Each Loop')
}
main();
If you are little old school like me, you could simply use the classic for loop, that works too :)
const getPromise = (time) => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(`Promise resolved for ${time}s`)
}, time)
})
}
const main = async () => {
const myPromiseArray = [getPromise(1000), getPromise(500), getPromise(3000)]
console.log('Before For Each Loop')
// AVOID USING THIS
// myPromiseArray.forEach(async (element, index) => {
// let result = await element;
// console.log(result);
// })
// This works well too - the classic for loop :)
for (let i = 0; i < myPromiseArray.length; i++) {
const result = await myPromiseArray[i];
console.log(result);
}
console.log('After For Each Loop')
}
main();
I hope this helps someone, good day, cheers!
files.forEach(async (file) => {
const contents = await fs.readFile(file, 'utf8')
})
The issue is, the promise returned by the iteration function is ignored by forEach(). forEach does not wait to move to the next iteration after each async code execution is completed. All the fs.readFile functions
will be invoked in the same round of the event loop, which means they are started in parallel, not in sequential, and the execution continues immediately after invoking forEach(), without
waiting for all the fs.readFile operations to complete. Since forEach does not wait for each promise to resolve, the loop actually finishes iterating before promises are resolved. You are expecting that after forEach is completed, all the async code is already executed but that is not the case. You may end up trying to access values that are not available yet.
you can test the behaviour with this example code
const array = [1, 2, 3];
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
const delayedSquare = (num) => sleep(100).then(() => num * num);
const testForEach = (numbersArray) => {
const store = [];
// this code here treated as sync code
numbersArray.forEach(async (num) => {
const squaredNum = await delayedSquare(num);
// this will console corrent squaredNum value
console.log(squaredNum);
store.push(squaredNum);
});
// you expect that store array is populated but is not
// this will return []
console.log("store",store);
};
testForEach(array);
// Notice, when you test, first "store []" will be logged
// then squaredNum's inside forEach will log
the solution is using the for-of loop.
for (const file of files){
const contents = await fs.readFile(file, 'utf8')
}
Here are some forEachAsync prototypes. Note you'll need to await them:
Array.prototype.forEachAsync = async function (fn) {
for (let t of this) { await fn(t) }
}
Array.prototype.forEachAsyncParallel = async function (fn) {
await Promise.all(this.map(fn));
}
Note while you may include this in your own code, you should not include this in libraries you distribute to others (to avoid polluting their globals).
#Bergi has already gave the answer on how to handle this particular case properly. I'll not duplicate here.
I'd like to address the difference between using forEach and for loop when it comes to async and await
how forEach works
Let's look at how forEach works. According to ECMAScript Specification, MDN provides an implementation which can be used as a polyfill. I copy it and paste here with comments removal.
Array.prototype.forEach = function (callback, thisArg) {
if (this == null) { throw new TypeError('Array.prototype.forEach called on null or undefined'); }
var T, k;
var O = Object(this);
var len = O.length >>> 0;
if (typeof callback !== "function") { throw new TypeError(callback + ' is not a function'); }
if (arguments.length > 1) { T = thisArg; }
k = 0;
while (k < len) {
var kValue;
if (k in O) {
kValue = O[k];
callback.call(T, kValue, k, O); // pay attention to this line
}
k++;
}
};
Let's back to your code, let's extract the callback as a function.
async function callback(file){
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}
So, basically callback returns a promise since it's declared with async. Inside forEach, callback is just called in a normal way, if the callback itself returns a promise, the javascript engine will not wait it to be resolved or rejected. Instead, it puts the promise in a job queue, and continues executing the loop.
How about await fs.readFile(file, 'utf8') inside the callback?
Basically, when your async callback gets the chance to be executed, the js engine will pause until fs.readFile(file, 'utf8') to be resolved or rejected, and resume execution of the async function after fulfillment. So the contents variable store the actual result from fs.readFile, not a promise. So, console.log(contents) logs out the file content not a Promise
Why for ... of works?
when we write a generic for of loop, we gain more control than forEach. Let's refactor printFiles.
async function printFiles () {
const files = await getFilePaths() // Assume this works fine
for (const file of files) {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
// or await callback(file)
}
}
When evaluate for loop, we have await promise inside the async function, the execution will pause until the await promise is settled. So, you can think of that the files are read one by one in a determined order.
Execute sequentially
Sometimes, we really need the the async functions to be executed in a sequential order. For example, I have a few new records stored in an array to be saved to database, and I want them to be saved in sequential order which means first record in array should be saved first, then second, until last one is saved.
Here is an example:
const records = [1, 2, 3, 4];
async function saveRecord(record) {
return new Promise((resolved, rejected) => {
setTimeout(()=> {
resolved(`record ${record} saved`)
}, Math.random() * 500)
});
}
async function forEachSaveRecords(records) {
records.forEach(async (record) => {
const res = await saveRecord(record);
console.log(res);
})
}
async function forofSaveRecords(records) {
for (const record of records) {
const res = await saveRecord(record);
console.log(res);
}
}
(async () => {
console.log("=== for of save records ===")
await forofSaveRecords(records)
console.log("=== forEach save records ===")
await forEachSaveRecords(records)
})()
I use setTimeout to simulate the process of saving a record to database - it's asynchronous and cost a random time. Using forEach, the records are saved in an undetermined order, but using for..of, they are saved sequentially.
This solution is also memory-optimized so you can run it on 10,000's of data items and requests. Some of the other solutions here will crash the server on large data sets.
In TypeScript:
export async function asyncForEach<T>(array: Array<T>, callback: (item: T, index: number) => Promise<void>) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index);
}
}
How to use?
await asyncForEach(receipts, async (eachItem) => {
await ...
})
A simple drop-in solution for replacing a forEach() await loop that is not working is replacing forEach with map and adding Promise.all( to the beginning.
For example:
await y.forEach(async (x) => {
to
await Promise.all(y.map(async (x) => {
An extra ) is needed at the end.
In addition to #Bergi’s answer, I’d like to offer a third alternative. It's very similar to #Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, each which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
In #Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.
it's pretty painless to pop a couple methods in a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:
module.exports = function () {
var self = this;
this.each = async (items, fn) => {
if (items && items.length) {
await Promise.all(
items.map(async (item) => {
await fn(item);
}));
}
};
this.reduce = async (items, fn, initialValue) => {
await self.each(
items, async (item) => {
initialValue = await fn(initialValue, item);
});
return initialValue;
};
};
now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
var myAsync = new MyAsync();
var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
var cleanParams = [];
// FOR EACH EXAMPLE
await myAsync.each(['bork', 'concern', 'heck'],
async (elem) => {
if (elem !== 'heck') {
await doje.update({ $push: { 'noises': elem }});
}
});
var cat = await Cat.findOne({ name: 'Nyan' });
// REDUCE EXAMPLE
var friendsOfNyanCat = await myAsync.reduce(cat.friends,
async (catArray, friendId) => {
var friend = await Friend.findById(friendId);
if (friend.name !== 'Long cat') {
catArray.push(friend.name);
}
}, []);
// Assuming Long Cat was a friend of Nyan Cat...
assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
Bergi's solution works nicely when fs is promise based.
You can use bluebird, fs-extra or fs-promise for this.
However, solution for node's native fs libary is as follows:
const result = await Promise.all(filePaths
.map( async filePath => {
const fileContents = await getAssetFromCache(filePath, async function() {
// 1. Wrap with Promise
// 2. Return the result of the Promise
return await new Promise((res, rej) => {
fs.readFile(filePath, 'utf8', function(err, data) {
if (data) {
res(data);
}
});
});
});
return fileContents;
}));
Note:
require('fs') compulsorily takes function as 3rd arguments, otherwise throws error:
TypeError [ERR_INVALID_CALLBACK]: Callback must be a function
It is not good to call an asynchronous method from a loop. This is because each loop iteration will be delayed until the entire asynchronous operation completes. That is not very performant. It also averts the advantages of parallelization benefits of async/await.
A better solution would be to create all promises at once, then get access to the results using Promise.all(). Otherwise, each successive operation will not start until the previous one has completed.
Consequently, the code may be refactored as follows;
const printFiles = async () => {
const files = await getFilePaths();
const results = [];
files.forEach((file) => {
results.push(fs.readFile(file, 'utf8'));
});
const contents = await Promise.all(results);
console.log(contents);
}
One important caveat is: The await + for .. of method and the forEach + async way actually have different effect.
Having await inside a real for loop will make sure all async calls are executed one by one. And the forEach + async way will fire off all promises at the same time, which is faster but sometimes overwhelmed(if you do some DB query or visit some web services with volume restrictions and do not want to fire 100,000 calls at a time).
You can also use reduce + promise(less elegant) if you do not use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
lastPromise.then(() =>
fs.readFile(file, 'utf8')
), Promise.resolve()
)
Or you can create a forEachAsync to help but basically use the same for loop underlying.
Array.prototype.forEachAsync = async function (cb) {
for (const x of this) {
await cb(x);
}
}
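With that helper, the original example could look something like this (a sketch, using the same promise-returning fs.readFile as in the question):
async function printFiles () {
  const files = await getFilePaths()
  // Files are read and logged strictly one after another
  await files.forEachAsync(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}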
Just adding to the original answer
The parallel-reading syntax in the original answer can be confusing and difficult to read, so here is a different approach:
async function printFiles() {
const files = await getFilePaths();
const fileReadPromises = [];
const readAndLogFile = async filePath => {
const contents = await fs.readFile(filePath, "utf8");
console.log(contents);
return contents;
};
files.forEach(file => {
fileReadPromises.push(readAndLogFile(file));
});
await Promise.all(fileReadPromises);
}
For sequential operation, not just for...of; a normal for loop will also work.
async function printFiles() {
const files = await getFilePaths();
for (let i = 0; i < files.length; i++) {
const file = files[i];
const contents = await fs.readFile(file, "utf8");
console.log(contents);
}
}
Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing them all into an array and resolving the promise after all is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))
Like #Bergi's response, but with one difference.
Promise.all rejects as soon as one of the promises gets rejected.
So, use recursion instead.
const readFilesQueue = async (files, index = 0) => {
if (index >= files.length) return files
const contents = await fs.readFile(files[index], 'utf8')
console.log(contents)
return readFilesQueue(files, index + 1)
}
const printFiles = async () => {
const files = await getFilePaths();
const printContents = await readFilesQueue(files)
return printContents
}
printFiles()
PS
readFilesQueue is outside of printFiles because of the side effect* introduced by console.log; it's better to mock, test, and/or spy on it, so it's not great to have a function that returns the contents (side note).
Therefore, the code can simply be designed as three separate functions that are "pure"**, introduce no side effects, process the entire list, and can easily be modified to handle failure cases.
const files = await getFilesPath()
const printFile = async (file) => {
const content = await fs.readFile(file, 'utf8')
console.log(content)
}
const readFiles = async (files, index = 0) => {
if (index >= files.length) return files
await printFile(files[index])
return readFiles(files, index + 1)
}
readFiles(files)
Future edit/current state
Node supports top-level await (it doesn't have a plugin, won't have one, and can be enabled via harmony flags); it's cool but doesn't solve one problem (strategically, I work only on LTS versions): how do we get the files?
Using composition. Looking at the code, my impression is that it lives inside a module, so there should be a function to do this. If not, you should use an IIFE to wrap the role code into an async function, creating a simple module that does everything for you, or you can go the right way: composition.
// more complex version with IIFE to a single module
(async (files) => readFiles(await files()))(getFilesPath)
Note that the variable name changes due to semantics. You pass a functor (a function that can be invoked by another function) and receive a reference in memory that contains the initial block of logic of the application.
But what if it's not a module and you need to export the logic?
Wrap the functions in an async function.
export const readFilesQueue = async () => {
// ... the code goes here
}
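For instance, the exported wrapper could compose the earlier pieces like this (a sketch, assuming the getFilesPath and readFiles functions from above are in scope):
export const readFilesQueue = async () => {
  // Compose: fetch the paths, then process them sequentially
  const files = await getFilesPath()
  return readFiles(files)
}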
Or change the names of variables, whatever...
* "side effect" means any collateral effect of the application that can change the state/behaviour or introduce bugs in the application, like IO.
** "pure" is in quotes since the functions are not pure; the code can be converged to a pure version when there's no console output, only data manipulation.
Aside from this, to be pure you'll need to work with monads that handle the side effect, which is error prone, and treat that error separately from the application.
You can use Array.prototype.forEach, but async/await is not very compatible with it. This is because the promise returned from an async callback expects to be resolved, but Array.prototype.forEach does not resolve any promises from the execution of its callback. So you can use forEach, but you'll have to handle the promise resolution yourself.
Here is a way to read and print each file in series using Array.prototype.forEach
async function printFilesInSeries () {
const files = await getFilePaths()
let promiseChain = Promise.resolve()
files.forEach((file) => {
promiseChain = promiseChain.then(() => {
// Return the read so the chain actually waits for this file before the next
return fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
})
})
await promiseChain
}
Here is a way (still using Array.prototype.forEach) to print the contents of files in parallel
async function printFilesInParallel () {
const files = await getFilePaths()
const promises = []
files.forEach((file) => {
promises.push(
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
)
})
await Promise.all(promises)
}
Today I came across multiple solutions for running async/await functions in a forEach loop. By building a wrapper around it, we can make this happen.
A more detailed explanation of how the native forEach works internally, why it is not able to await an async function call, and other details on the various methods are provided in the link here.
The multiple ways in which it can be done are as follows:
Method 1 : Using the wrapper.
await (()=>{
return new Promise((resolve,reject)=>{
items.forEach(async (item,index)=>{
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
if(index === items.length-1){
resolve('Done')
}
});
});
})();
Method 2: Using the same as a generic function of Array.prototype
Array.prototype.forEachAsync.js
if(!Array.prototype.forEachAsync) {
Array.prototype.forEachAsync = function (fn){
return new Promise((resolve,reject)=>{
this.forEach(async(item,index,array)=>{
await fn(item,index,array);
if(index === array.length-1){
resolve('done');
}
})
});
};
}
Usage :
require('./Array.prototype.forEachAsync');
let count = 0;
let hello = async (items) => {
// Method 2 - Using Array.prototype.forEachAsync
await items.forEachAsync(async () => {
try{
await someAPICall();
} catch(e) {
console.log(e)
}
count++;
});
console.log("count = " + count);
}
const someAPICall = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve("done") // or reject('error')
}, 100);
})
}
hello(['', '', '', '']); // an empty array, hello([]), is also handled by default
Method 3 :
Using Promise.all
await Promise.all(items.map(async (item) => {
await someAPICall();
count++;
}));
console.log("count = " + count);
Method 4 : Traditional for loop or modern for loop
// Method 4 - using for loop directly
// 1. Using the modern for(.. of ..) loop
for(const item of items){
await someAPICall();
count++;
}
//2. Using the traditional for loop
for(let i=0;i<items.length;i++){
await someAPICall();
count++;
}
console.log("count = " + count);
Currently the Array.prototype.forEach method doesn't support async operations, but we can create our own polyfill to meet our needs.
// Example of asyncForEach Array poly-fill for NodeJs
// file: asyncForEach.js
// Define asynForEach function
async function asyncForEach(iteratorFunction){
let indexer = 0
for(let data of this){
await iteratorFunction(data, indexer)
indexer++
}
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}
And that's it! You now have an async forEach method available on any arrays that are defined after these two operations.
Let's test it...
// Nodejs style
// file: someOtherFile.js
const readline = require('readline')
Array = require('./asyncForEach').Array
const log = console.log
// Create a stream interface
function createReader(options={prompt: '>'}){
return readline.createInterface({
input: process.stdin
,output: process.stdout
,prompt: options.prompt !== undefined ? options.prompt : '>'
})
}
// Create a cli stream reader
async function getUserIn(question, options={prompt:'>'}){
log(question)
let reader = createReader(options)
return new Promise((res)=>{
reader.on('line', (answer)=>{
process.stdout.cursorTo(0, 0)
process.stdout.clearScreenDown()
reader.close()
res(answer)
})
})
}
let questions = [
`What's your name`
,`What's your favorite programming language`
,`What's your favorite async function`
]
let responses = {}
async function getResponses(){
// Notice we have to prepend await before calling the async Array function
// in order for it to function as expected
await questions.asyncForEach(async function(question, index){
let answer = await getUserIn(question)
responses[question] = answer
})
}
async function main(){
await getResponses()
log(responses)
}
main()
// Should prompt user for an answer to each question and then
// log each question and answer as an object to the terminal
We could do the same for some of the other array functions like map...
async function asyncMap(iteratorFunction){
let newMap = []
let indexer = 0
for(let data of this){
newMap[indexer] = await iteratorFunction(data, indexer, this)
indexer++
}
return newMap
}
Array.prototype.asyncMap = asyncMap
... and so on :)
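As a quick illustration, asyncMap could collect the answers into an array instead of mutating a shared object (a sketch reusing the questions and getUserIn from the example above):
async function getResponsesAsList () {
  // Resolves to an array of answers, in the same order as the questions
  return questions.asyncMap(question => getUserIn(question))
}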
Some things to note:
Your iteratorFunction must be an async function or a function that returns a promise
Any code that runs before Array.prototype.<yourAsyncFunc> = <yourAsyncFunc> has executed will not have this feature available
To see how that can go wrong, add a console.log at the end of the method.
Things that can go wrong in general:
Arbitrary order.
printFiles can finish running before printing files.
Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach will result in all but the last. It'll call each function without awaiting it, meaning it tells all of the functions to start and then finishes without waiting for them to finish.
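To make that concrete, here is a sketch of the forEach version with a console.log added at the end; 'done' is typically printed before any file contents, because the function returns as soon as every read has merely been started:
async function printFilesWithForEach () {
  const files = await getFilePaths()
  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
  console.log('done') // logged before any of the contents above
}
The version below avoids both the ordering problem and the premature return.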
import fs from 'fs-promise'
async function printFiles () {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))
for(const file of files)
console.log(await file)
}
printFiles()
This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.
This will:
Initiate all of the file reads to happen in parallel.
Preserve the order via the use of map to map file names to promises to wait for.
Wait for each promise in the order defined by the array.
With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.
It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.
The only drawback of this and the original version is that if multiple reads are started at once, it's more difficult to handle errors, since more errors can happen at a time.
Versions that read one file at a time will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system, it can be hard to avoid failing on the first file while having already read most of the other files as well.
Performance is not always predictable. While many systems will be faster with parallel file reads, some will prefer sequential. Some are dynamic and may shift under load; optimisations that improve latency do not always yield good throughput under heavy contention.
There is also no error handling in that example. If something requires them to either all be successfully shown or not at all it won't do that.
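If all-or-nothing output is actually a requirement, one way to adapt the pattern is to settle every read first and only print when none failed, at the cost of no longer printing the first file early. A sketch, assuming Promise.allSettled is available (Node 12.9+):
async function printFilesAllOrNothing () {
  const reads = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))
  // Settle everything first so a late rejection can't become an unhandled rejection.
  const results = await Promise.allSettled(reads)
  const failure = results.find(result => result.status === 'rejected')
  if (failure) {
    console.error('Not printing anything, at least one read failed:', failure.reason)
    return
  }
  for (const result of results) console.log(result.value)
}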
In depth experimentation is recommended with console.log at each stage and fake file read solutions (random delay instead). Although many solutions appear to do the same in simple cases all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
const start = +new Date();
const mock = () => {
return {
fs: {readFile: file => new Promise((resolve, reject) => {
// Instead of this just make three files and try each timing arrangement.
// IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
const time = Math.round(100 + Math.random() * 4900);
console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
setTimeout(() => {
// Bonus material here if random reject instead.
console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
resolve(file);
}, time);
})},
console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
};
};
const printFiles = (({fs, console, getFilePaths}) => {
return async function() {
const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));
for(const file of files)
console.log(await file);
};
})(mock());
console.log(`Running at ${new Date() - start}`);
await printFiles();
console.log(`Finished running at ${new Date() - start}`);
})();
The OP's original question
Are there any issues with using async/await in a forEach loop? ...
was covered to an extent in #Bergi's selected answer,
which showed how to process in serial and in parallel. However there are other issues noted with parallelism -
Order -- #chharvey notes that -
For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array.
Possibly opening too many files at once -- A comment by Bergi under another answer
It is also not good to open thousands of files at once to read them concurrently. One always has to do an assessment whether a sequential, parallel, or mixed approach is better.
So let's address these issues with actual code that is brief and concise, and does not use third-party libraries. Something easy to cut, paste, and modify.
Reading in parallel (all at once), printing in serial (as early as possible per file).
The easiest improvement is to perform full parallelism as in #Bergi's answer, but making a small change so that each file is printed as soon as possible while preserving order.
async function printFiles2() {
const readProms = (await getFilePaths()).map((file) =>
fs.readFile(file, "utf8")
);
await Promise.all([
Promise.all(readProms), // branch 1
(async () => { // branch 2
for (const p of readProms) console.log(await p);
})(),
]);
}
Above, two separate branches are run concurrently.
branch 1: Reading in parallel, all at once,
branch 2: Printing in serial to force order, but waiting no longer than necessary
That was easy.
Reading in parallel with a concurrency limit, printing in serial (as early as possible per file).
A "concurrency limit" means that no more than N files will ever being read at the same time.
Like a store that only allows in so many customers at a time (at least during COVID).
First a helper function is introduced -
function bootablePromise(kickMe: () => Promise<any>) {
let resolve: (value: unknown) => void = () => {};
const promise = new Promise((res) => { resolve = res; });
const boot = () => { resolve(kickMe()); };
return { promise, boot };
}
The function bootablePromise(kickMe: () => Promise<any>) takes a function kickMe as an argument to start a task (in our case readFile), but the task is not started immediately.
bootablePromise returns a couple of properties
promise of type Promise
boot of type function ()=>void
promise has two stages in life
Being a promise to start a task
Being a promise to complete a task it has already started.
promise transitions from the first to the second state when boot() is called.
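As a tiny standalone illustration of those two stages (slowTask is a made-up example task, not part of the original answer):
// Hypothetical example: nothing runs until boot() is called.
const slowTask = () => new Promise((resolve) => setTimeout(() => resolve("done"), 100));

const { promise, boot } = bootablePromise(slowTask);
promise.then((value) => console.log("finished with", value)); // stays pending...
boot(); // ...until boot() starts slowTask and wires its result into promise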
bootablePromise is used in printFiles --
async function printFiles4() {
const files = await getFilePaths();
const boots: (() => void)[] = [];
const set: Set<Promise<{ pidx: number }>> = new Set<Promise<any>>();
const bootableProms = files.map((file,pidx) => {
const { promise, boot } = bootablePromise(() => fs.readFile(file, "utf8"));
boots.push(boot);
set.add(promise.then(() => ({ pidx })));
return promise;
});
const concurLimit = 2;
await Promise.all([
(async () => { // branch 1
let idx = 0;
boots.slice(0, concurLimit).forEach((b) => { b(); idx++; });
while (idx<boots.length) {
const { pidx } = await Promise.race([...set]);
set.delete([...set][pidx]);
boots[idx++]();
}
})(),
(async () => { // branch 2
for (const p of bootableProms) console.log(await p);
})(),
]);
}
As before there are two branches
branch 1: For running and handling concurrency.
branch 2: For printing
The difference now is that no more than concurLimit Promises are allowed to run concurrently.
The important variables are
boots: The array of functions to call to force its corresponding Promise to transition. It is used only in branch 1.
set: The Promises are kept in a random-access container so that they can be easily removed once fulfilled. This container is used only in branch 1.
bootableProms: These are the same Promises as initially in set, but it is an array not a set, and the array is never changed. It is used only in branch 2.
Running with a mock fs.readFile that takes times as follows (filename vs. time in ms).
const timeTable = {
"1": 600,
"2": 500,
"3": 400,
"4": 300,
"5": 200,
"6": 100,
};
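The exact mock isn't shown in the original; one possible shape for it, driven by the timeTable above (an assumption, not the author's code):
// A possible mock: resolves each "file" after the delay listed in timeTable.
const fs = {
  readFile: (file) =>
    new Promise((resolve) =>
      setTimeout(() => resolve(`contents of ${file}`), timeTable[file])
    ),
};
const getFilePaths = async () => Object.keys(timeTable); // "1" .. "6"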
Test run times such as these are seen, showing that the concurrency limit is working --
[1]0--0.601
[2]0--0.502
[3]0.503--0.904
[4]0.608--0.908
[5]0.905--1.105
[6]0.905--1.005
Available as an executable example in the TypeScript playground sandbox.
Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
const files = await getFiles();
List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
.fork( console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';
const future = futurize(Task)
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
.fork( console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
export const arrayToTaskList = list => taskFn =>
List(list).traverse( Task.of, taskFn )
export const readFiles = files =>
arrayToTaskList( files )( readFile )
export const printFiles = files =>
readFiles(files).fork( console.error, console.log)
Then from the parent function
async function main() {
/* awesome code with side-effects before */
printFiles( await getFiles() );
/* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator )
import { curry, flip } from 'ramda'
export const readFile = fs.readFile
|> future
|> curry
|> flip
export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p
As other answers have mentioned, you're probably wanting it to be executed in sequence rather than in parallel. I.e., run for the first file, wait until it's done, and then once it's done run for the second file. That's not what will happen.
I think it's important to address why this doesn't happen.
Think about how forEach works. I can't find the source, but I presume it works something like this:
const forEach = (arr, cb) => {
for (let i = 0; i < arr.length; i++) {
cb(arr[i]);
}
};
Now think about what happens when you do something like this:
forEach(files, async function logFile(file) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
Inside forEach's for loop we're calling cb(arr[i]), which ends up being logFile(file). The logFile function has an await inside it, so maybe the for loop will wait for this await before proceeding to i++?
No, it won't. Confusingly, that's not how await works. From the docs:
An await splits execution flow, allowing the caller of the async function to resume execution. After the await defers the continuation of the async function, execution of subsequent statements ensues. If this await is the last expression executed by its function execution continues by returning to the function's caller a pending Promise for completion of the await's function and resuming execution of that caller.
So if you have the following, the numbers won't be logged before "b":
const delay = (ms) => {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
};
const logNumbers = async () => {
console.log(1);
await delay(2000);
console.log(2);
await delay(2000);
console.log(3);
};
const main = () => {
console.log("a");
logNumbers();
console.log("b");
};
main();
Circling back to forEach, forEach is like main and logFile is like logNumbers. main won't stop just because logNumbers does some awaiting, and forEach won't stop just because logFile does some awaiting.
Here is a great example of using async in a forEach loop.
Write your own asyncForEach
async function asyncForEach(array, callback) {
for (let index = 0; index < array.length; index++) {
await callback(array[index], index, array)
}
}
You can use it like this
await asyncForEach(array, async function (item, index, array) {
// await here
})
Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
// since AsyncAF accepts promises or non-promises, there's no need to await here
const files = getFilePaths();
AsyncAF(files).forEach(async file => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');
const printFiles = () => aaf(getFilePaths())
.map(file => fs.readFile(file, 'utf8'))
.forEach(file => aaf.log(file));
printFiles();
async-af
If you'd like to iterate over all elements concurrently:
async function asyncForEach(arr, fn) {
await Promise.all(arr.map(fn));
}
If you'd like to iterate over all elements non-concurrently (e.g. when your mapping function has side effects or running mapper over all array elements at once would be too resource costly):
Option A: Promises
function asyncForEachStrict(arr, fn) {
return new Promise((resolve) => {
arr.reduce(
(promise, cur, idx) => promise
.then(() => fn(cur, idx, arr)),
Promise.resolve(),
).then(() => resolve());
});
}
Option B: async/await
async function asyncForEachStrict(arr, fn) {
for (let idx = 0; idx < arr.length; idx += 1) {
const cur = arr[idx];
await fn(cur, idx, arr);
}
}
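For example, either variant could drive the original use case (a sketch, using a promise-returning fs.readFile as in the question):
async function printFiles () {
  const files = await getFilePaths();

  // Sequential: one file at a time
  await asyncForEachStrict(files, async (file) => {
    console.log(await fs.readFile(file, 'utf8'));
  });

  // Or fully concurrent: all reads start at once
  await asyncForEach(files, async (file) => {
    console.log(await fs.readFile(file, 'utf8'));
  });
}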
This does not use async/await, as the OP requested, and only works on the back-end with Node.js. It may still be helpful for some people, because the example given by the OP is to read file contents, and normally you do file reading in the back-end.
Fully asynchronous and non-blocking:
const fs = require("fs")
const async = require("async")
const obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"}
const configs = {}
async.forEachOf(obj, (value, key, callback) => {
fs.readFile(__dirname + value, "utf8", (err, data) => {
if (err) return callback(err)
try {
configs[key] = JSON.parse(data);
} catch (e) {
return callback(e)
}
callback()
});
}, err => {
if (err) console.error(err.message)
// configs is now a map of JSON data
doSomethingWith(configs)
})
For TypeScript users, a Promise.all(array.map(iterator)) wrapper with working types
Using Promise.all(array.map(iterator)) has correct types since TypeScript's stdlib already handles the generics.
However, copy-pasting Promise.all(array.map(iterator)) every time you need an async map is obviously suboptimal, and it doesn't convey the intention of the code very well - so most developers would wrap this into an asyncMap() wrapper function. Doing this requires generics to ensure that values set with const value = await asyncMap() have the correct type.
export const asyncMap = async <ArrayItemType, IteratorReturnType>(
array: Array<ArrayItemType>,
iterator: (
value: ArrayItemType,
index?: number
) => Promise<IteratorReturnType>
): Promise<Array<IteratorReturnType>> => {
return Promise.all(array.map(iterator));
};
And a quick test:
it(`runs 3 items in parallel and returns results`, async () => {
const result = await asyncMap([1, 2, 3], async (item: number) => {
await sleep(item * 100);
return `Finished ${item}`;
});
expect(result.length).toEqual(3);
// Each item takes 100, 200 and 300ms
// So restricting this test to 300ms plus some leeway
}, 320);
sleep() is just:
const sleep = async (timeInMs: number): Promise<void> => {
return new Promise((resolve) => setTimeout(resolve, timeInMs));
};
