I'm writing bot code for a game running on Node.js, and what this function is supposed to do is loop through an array of vectors and make the bot go to each vector in turn.
However, what it actually does is tell the bot to run to all the vectors at the same time, so it spazzes out and then runs to the last vector in the array:
function digSchedule() {
var arrayLength = blocksToMine.length;
for (var i = 0; i < blocksToMine.length; i++) {
console.log(i);
scaffoldTo(blocksToMine[i]);
}
...
}
The function scaffoldTo() needs to be run, and then the next call should wait until the bot has finished before running it for the next element in the array, but I can't figure out how to do that.
There are several ways to achieve this. The first is probably to pass a callback with the "next function to be called" (probably scaffoldTo()). You can use .bind() to create a reference that carries the iterator index i.
Alternatively, you could set up a chain of Promises, which by definition have a .then() method that executes once the promise is resolved.
Finally, the async/await pattern is similar to Promises, but some find it clearer, and it seems to be winning the Hype Wars: https://hackernoon.com/6-reasons-why-javascripts-async-await-blows-promises-away-tutorial-c7ec10518dd9.
Callbacks (solution 1) are available in any version of JS. Promises are generally available with a library and have native support in ES6. Async/await was standardized in ES2017 and is generally well supported.
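As a rough sketch of the Promise-chaining idea (scaffoldToAsync is a made-up wrapper here; it assumes scaffoldTo can accept a completion callback that fires once the bot has reached the vector):
// Hypothetical wrapper: turns the callback-style scaffoldTo into a Promise.
function scaffoldToAsync(vector) {
    return new Promise(function (resolve) {
        scaffoldTo(vector, resolve);
    });
}

function digSchedule() {
    // Chain one promise per vector so each move starts only after
    // the previous one has finished.
    return blocksToMine.reduce(function (chain, vector) {
        return chain.then(function () {
            return scaffoldToAsync(vector);
        });
    }, Promise.resolve());
}

digSchedule().then(function () {
    console.log("finished visiting vectors");
});
Each .then() only starts the next move after the previous one resolves, which is the sequencing you're after.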
Here's another way that you could play with this. It focuses more on the callback style, although it does assume that you can modify the scaffoldTo function, as well as the parameters required for digSchedule.
function digSchedule(i, callback) {
    if (!i) {
        i = 0;
    }
    if (i < blocksToMine.length) {
        // Pass a function that schedules the next step; calling digSchedule(i++)
        // directly would recurse immediately with the same index.
        scaffoldTo(blocksToMine[i], function () {
            digSchedule(i + 1, callback);
        });
    } else {
        callback();
    }
}
Then inside of scaffoldTo you would need something like this:
function scaffoldTo(block, callback){
//do what ever you need to have the bot go to the vector
//after bot goes to vector call callback
callback();
}
In order to start it all you would just need to call digSchedule with something like this:
digSchedule(0, function () {
console.log("finished visiting vectors");
});
This does change the pattern from using a for loop like you have, but I think it's a fun way to accomplish the goal as well.
That is a good use case for the async functions of ES2017. Try the following:
async function digSchedule() {
var arrayLength = blocksToMine.length;
for (var i = 0; i < blocksToMine.length; i++) {
console.log(i);
await scaffoldTo(blocksToMine[i]);
}
...
}
Note that this assumes scaffoldTo returns a Promise. If ES2017 is out of the question, your best choice would be to make a recursive function that only calls itself again once the promise from scaffoldTo is resolved.
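For example, a minimal sketch of that recursive fallback, assuming scaffoldTo returns a Promise that resolves once the bot has arrived:
function digSchedule(i) {
    i = i || 0;
    if (i >= blocksToMine.length) {
        return Promise.resolve();
    }
    // Move to the current vector, then recurse for the next one.
    return scaffoldTo(blocksToMine[i]).then(function () {
        return digSchedule(i + 1);
    });
}

digSchedule().then(function () {
    console.log("finished visiting vectors");
});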
You may use the async module to achieve this. Alternatively, you may try something like this:
function forEachAsync(array, fun, cb) {
var index = 0;
if (index == array.length) {
cb(null);
return;
}
var next = function () {
fun(array[index], function(err) {
if (err) {
cb(err);
return;
}
index++;
if (index < array.length) {
setImmediate(next);
return;
}
//We are done
cb(null);
});
};
next();
}
forEachAsync([1,2,3,4,5], function(e, cb) {
console.log(e);
cb();
}, function(err) {
console.log('done');
});
Here is how to do it with the help of promises.
let numbers = new Array(1,3,2,1);
function asyncFunction(number){
return new Promise((resolve,reject)=>{
setTimeout(()=>{
console.log("Number : ",number);
return resolve();
},number*1000);
})
}
let promise = Promise.resolve();
// Here we are using forEach to chain promise one after another.
numbers.forEach(number=>{
promise = promise.then(()=>{
return asyncFunction(number);
});
})
promise.then(()=>{
console.log("Every thing is done!");
})
Related
I've been learning promises using bluebird for two weeks now. I have them mostly understood, but I went to solve a few related problems and it seems my knowledge has fallen apart. I'm trying to do this simple code:
var someGlobal = true;
promiseWhile(function() {
return someGlobal;
}, function(result) { // possibly even use return value of 1st parm?
// keep running this promise code
return new Promise(....).then(....);
});
as a concrete example:
// This is some very contrived functionality, but let's pretend this is
// doing something external: ajax call, db call, filesystem call, etc.
// Simply return a number between 0-999 after a 0-999 millisecond
// fake delay.
function getNextItem() {
return new Promise.delay(Math.random()*1000).then(function() {
Promise.cast(Math.floor(Math.random() * 1000));
});
}
promiseWhile(function() {
// this will never return false in my example so run forever
return getNextItem() !== false;
}, // how to have result == return value of getNextItem()?
function(result) {
result.then(function(x) {
// do some work ...
}).catch(function(err) {
console.warn("A nasty error occured!: ", err);
});
}).then(function(result) {
console.log("The while finally ended!");
});
Now I've done my homework! There is the same question, but geared toward Q.js here:
Correct way to write loops for promise.
But the accepted answers, as well as additional answers:
Are geared toward Q.js or RSVP
The only answer geared toward bluebird uses recursion. This seems likely to cause a huge stack overflow in an infinite loop such as mine, or at best to be very inefficient and create a very large stack for nothing. If I'm wrong, then fine! Let me know.
Don't allow you to use the result of the condition. This isn't a requirement -- I'm just curious if it's possible. Of the code I'm writing, one use case needs it and the other doesn't.
Now, there is an answer regarding RSVP that uses an async() method. What really confuses me is that bluebird documents, and I even see code for, a Promise.async() call in the repository, but I don't see it in my latest copy of bluebird. Is it in the git repository only or something?
It's not 100% clear what you're trying to do, but I'll write an answer that does the following things you mention:
Loops until some condition in your code is met
Allows you to use a delay between loop iterations
Allows you to get and process the final result
Works with Bluebird (I'll code to the ES6 promise standard which will work with Bluebird or native promises)
Does not have stack build-up
First, let's assume you have some async function that returns a promise whose result is used to determine whether to continue looping or not.
function getNextItem() {
return Promise.delay(Math.random()*1000).then(function() {
return(Math.floor(Math.random() * 1000));
});
}
Now, you want to loop until the value returned meets some condition
function processLoop(delay) {
return new Promise(function(resolve, reject) {
var results = [];
function next() {
getNextItem().then(function(val) {
// add to result array
results.push(val);
if (val < 100) {
// found a val < 100, so be done with the loop
resolve(results);
} else {
// run another iteration of the loop after delay
setTimeout(next, delay);
}
}, reject);
}
// start first iteration of the loop
next();
});
}
processLoop(100).then(function(results) {
// process results here
}, function(err) {
// error here
});
If you wanted to make this more generic so you could pass in the function and comparison, you could do this:
function processLoop(mainFn, compareFn, delay) {
return new Promise(function(resolve, reject) {
var results = [];
function next() {
mainFn().then(function(val) {
// add to result array
results.push(val);
if (compareFn(val)) {
// comparison condition met, so be done with the loop
resolve(results);
} else {
// run another iteration of the loop after delay
if (delay) {
setTimeout(next, delay);
} else {
next();
}
}
}, reject);
}
// start first iteration of the loop
next();
});
}
processLoop(getNextItem, function(val) {
return val < 100;
}, 100).then(function(results) {
// process results here
}, function(err) {
// error here
});
Your attempts at a structure like this:
return getNextItem() !== false;
Can't work, because getNextItem() returns a promise, and a promise is an object, so it is always !== false. If you want to test a promise's value, you have to use .then() to get it, and you have to do the comparison asynchronously, so you can't directly return a value like that.
Note: While these implementations use a function that calls itself, this does not cause stack build-up because they call themselves asynchronously. That means the stack has already completely unwound before the function calls itself again, thus there is no stack build-up. This will always be the case from a .then() handler since the Promise specification requires that a .then() handler is not called until the stack has returned to "platform code" which means it has unwound all regular "user code" before calling the .then() handler.
Using async and await in ES7
In ES7, you can use async and await to "pause" a loop. That can make this type of iteration a lot simpler to code. This looks structurally more like a typical synchronous loop. It uses await to wait on promises and because the function is declared async, it always returns a promise:
function delay(t) {
return new Promise(resolve => {
setTimeout(resolve, t);
});
}
async function processLoop(mainFn, compareFn, timeDelay) {
var results = [];
// loop until condition is met
while (true) {
let val = await mainFn();
results.push(val);
if (compareFn(val)) {
return results;
} else {
if (timeDelay) {
await delay(timeDelay);
}
}
}
}
processLoop(getNextItem, function(val) {
return val < 100;
}, 100).then(function(results) {
// process results here
}, function(err) {
// error here
});
So I am trying to transfer my code into the "Promise world", and in many places where I had to "loop" with async functionality, I simply used recursion in such a way:
function doRecursion(idx,callback){
if(idx < someArray.length){
doAsync(function(){
doRecursion(++idx,callback)
});
}else{
callback('done!')
}
}
doRecursion(0,function(msg){
//...
});
Now I am trying to make the change into the Promise world, and I am quite stuck:
var Promise = require('bluebird')
function doRecursion(idx){
return new Promise(function(resolve){
if(idx < someArray.length){
doAsync(function(){
//... doRecursion(++idx)
// how do I call doRecursion here?
});
}else{
resolve('done!')
}
});
}
doRecursion(0).then(function(msg){
//...
});
Thanks.
I'd go with the Promise.all approach.
What this does is wait until all the promises in the array have resolved. The map call applies the async wrapper to each item in the array, producing an array of promises.
function doAsyncP() {
return new Promise((resolve) => {
doAsync(function() {
resolve();
});
});
}
Promise.all(
someArray.map(doAsyncP)
).then((msg) => {
//we're done.
});
In your recursive function, you can do this:
...
if (idx < someArray.length) {
doAsync(function() {
resolve(doRecursion(idx + 1));
});
} else {
...
In other words, while idx is less than someArray.length, your promise will resolve to another promise, this time the one returned by calling doRecursion() with idx incremented by one. The then callback at the bottom will not be called until doRecursion resolves to some value other than a promise; in this case, it will eventually resolve with a value of 'done!'.
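Put together with the code from the question, the whole function might look like this (a sketch; doAsync and someArray are assumed to exist as in the original):
function doRecursion(idx) {
    return new Promise(function (resolve) {
        if (idx < someArray.length) {
            doAsync(function () {
                // Resolve with the promise for the next step; the outer
                // promise only settles once the whole chain has finished.
                resolve(doRecursion(idx + 1));
            });
        } else {
            resolve('done!');
        }
    });
}

doRecursion(0).then(function (msg) {
    console.log(msg); // 'done!'
});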
That said, if you are using promises, you probably don't need to use recursion at all. You might have to refactor your code a bit more, but I would suggest considering #BenFortune's answer as an alternative.
I'm using Underscore.js with Node.js and need the _.times() method; times() will invoke a function X number of times.
This works as expected; however, I need to iterate in series instead of in parallel, which is what this appears to be doing.
Any idea if there's a way to use this in series with callback methods?
Given something like this:
function f() {
some_async_call({ callback: function(err, results) {...})
}
_(3).times(f);
Then the three f calls will happen in series but the some_async_call calls won't necessarily happen in series because they're asynchronous.
If you want to force your calls to run in series then you need to use the callback on the async call to launch the next one in the series:
function f(times, step) {
step = step || 0;
some_async_call({
callback: function(err, results) {
// Do something with `err` and `results`...
if(step + 1 < times)
f(times, step + 1);
}
});
}
f(3);
That approach will execute the three some_async_calls in series but, alas, the initial f(3) will return immediately. One solution to that problem is, of course, another callback:
function f(from_n, upto, and_finally) {
some_async_call({
callback: function(err, results) {
// Do something with `err` and `results`...
if(from_n + 1 < upto)
f(from_n + 1, upto, and_finally);
else
and_finally();
}
});
}
f(0, 3, function() { console.log('all done') });
Where does _.times fit in with all this? Nowhere, really. _.times is just a for loop:
_.times = function(n, iterator, context) {
for (var i = 0; i < n; i++) iterator.call(context, i);
};
_.times exists for completeness and to allow you to add a for loop when using _.chain. You could probably shoehorn it in if you really wanted to, but you would be making a big ugly mess instead of simplifying your code.
You could use 250R's async idea, but you'd have to build an array of three functions, and _.range and _.map would be more appropriate for that than _.times:
// Untested off the top of my head code...
function f(callback) {
some_async_call({
callback: function(err, results) {
// Deal with `err` and `results`...
callback();
}
});
}
var three_fs = _(3).range().map(function() { return f });
async.series(three_fs);
But you still have to modify f to have a callback function and if you're always calling f three times then:
async.series([f, f, f]);
might be better than dynamically building the array with _.range and _.map.
The real lesson here is that once you get into asynchronous function calls, you end up implementing all your logic as callbacks calling callbacks calling callbacks, callbacks all the way down.
This async library might get you started:
https://github.com/caolan/async#series
Or if you want to do it yourself, the idea is to make recursive calls after each function's callback has been called; here's the source code: https://github.com/caolan/async/blob/master/lib/async.js#L101
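A minimal sketch of async.series with callback-style tasks (the task bodies here are made up for illustration):
var async = require('async');

// Each task signals completion through its `done` callback.
async.series([
    function (done) { setTimeout(function () { console.log('first');  done(); }, 100); },
    function (done) { setTimeout(function () { console.log('second'); done(); }, 100); },
    function (done) { setTimeout(function () { console.log('third');  done(); }, 100); }
], function (err) {
    // Called after all tasks have run in order, or on the first error.
    console.log(err || 'all done');
});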
That's how I do it:
function processArray(array, index, callback) {
processItem(array[index], function(){
if(++index === array.length) {
callback();
return;
}
processArray(array, index, callback);
});
};
function processItem(item, callback) {
// do some ajax (browser) or request (node) stuff here
// when done
callback();
}
var arr = ["url1", "url2", "url3"];
processArray(arr, 0, function(){
console.log("done");
});
Is it any good? How do I avoid this spaghetti-ish code?
Check out the async library; it's made for control flow (async stuff) and it has a lot of methods for array work: each, filter, map. Check the documentation on GitHub. Here's what you probably need:
each(arr, iterator, callback)
Applies an iterator function to each item in an array, in parallel. The iterator is called with an item from the list and a callback for when it has finished. If the iterator passes an error to this callback, the main callback for the each function is immediately called with the error.
eachSeries(arr, iterator, callback)
The same as each only the iterator is applied to each item in the array in series. The next iterator is only called once the current one has completed processing. This means the iterator functions will complete in order.
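For the URL example above, an eachSeries call might look like this (a sketch that reuses the processItem helper from the question):
var async = require('async');

var arr = ["url1", "url2", "url3"];

// The next item is only handed to processItem after the previous
// callback has fired, so processing happens strictly in series.
async.eachSeries(arr, function (item, done) {
    processItem(item, function () {
        done();
    });
}, function (err) {
    if (err) { console.error(err); return; }
    console.log("done");
});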
As pointed out in another answer, you can use the async library. But sometimes you just don't want to introduce a new dependency in your code. Below is another way to loop and wait for the completion of some asynchronous functions.
var items = ["one", "two", "three"];
// This is your async function, which may perform call to your database or
// whatever...
function someAsyncFunc(arg, cb) {
setTimeout(function () {
cb(arg.toUpperCase());
}, 3000);
}
// cb will be called when each item from arr has been processed and all
// results are available.
function eachAsync(arr, func, cb) {
var doneCounter = 0,
results = [];
arr.forEach(function (item) {
func(item, function (res) {
doneCounter += 1;
results.push(res);
if (doneCounter === arr.length) {
cb(results);
}
});
});
}
eachAsync(items, someAsyncFunc, console.log);
Now, running node iterasync.js will wait for about three seconds and then print [ 'ONE', 'TWO', 'THREE' ]. This is a simple example, but it can be extended to handle many situations.
As correctly pointed out, you have to use setTimeout, for example:
each_async = function(ary, fn) {
    var i = 0;
    (function step() {
        fn(ary[i]);
        if (++i < ary.length)
            setTimeout(step, 0);
    })();
};
each_async([1,2,3,4], function(p) { console.log(p) })
The easiest way to handle async iteration of arrays (or any other iterable) is with the await operator (only in async functions) and a for...of loop.
(async function() {
for(let value of [ 0, 1 ]) {
value += await(Promise.resolve(1))
console.log(value)
}
})()
You can use a library to convert any callback-accepting functions you may need into ones that return promises.
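For example, in Node you can use the built-in util.promisify for that conversion; a minimal sketch using fs.readFile (the file names are made up):
const util = require('util');
const fs = require('fs');

// fs.readFile uses the usual error-first callback convention,
// so util.promisify turns it into a promise-returning function.
const readFileAsync = util.promisify(fs.readFile);

(async function () {
    for (const name of ['a.txt', 'b.txt']) {
        // Each file is read only after the previous await has settled.
        const contents = await readFileAsync(name, 'utf8');
        console.log(name, contents.length);
    }
})();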
In modern JavaScript there are interesting ways to extend an Array into an async iterable object.
Here I would like to demonstrate the skeleton of a totally new type, AsyncArray, which extends the Array type, inheriting its goodness, to become an async iterable array.
This is only available in modern engines. The code below uses recent features like private instance fields and for await...of.
If you are not familiar with them, I would advise you to have a look at the above-linked topics in advance.
class AsyncArray extends Array {
#INDEX;
constructor(...ps){
super(...ps);
if (this.some(p => p.constructor !== Promise)) {
throw "All AsyncArray items must be a Promise";
}
}
[Symbol.asyncIterator]() {
this.#INDEX = 0;
return this;
};
next() {
return this.#INDEX < this.length ? this[this.#INDEX++].then(v => ({value: v, done: false}))
: Promise.resolve({done: true});
};
};
So an async iterable array must contain promises. Only then can it return an iterator object which, with every next() call, returns a promise that eventually resolves into an object like {value: "whatever", done: false} or {done: true}. So basically everything returned is a promise here; the await abstraction unpacks the value within and hands it to us.
Now, as I mentioned before, this AsyncArray type, since it extends Array, allows us to use the Array methods we are familiar with. That should simplify our job.
Let's see what happens:
var aa = AsyncArray.from({length:10}, (_,i) => new Promise(resolve => setTimeout(resolve,i*1000,[i,~~(Math.random()*100)])));
async function getAsyncRandoms(){
for await (let random of aa){
console.log(`The Promise at index # ${random[0]} gets resolved with a random value of ${random[1]}`);
};
};
getAsyncRandoms();
For modern Node.js:
To iterate through a collection truly asynchronously, you can try my tiny package with zero dependencies, compatible with ESM and CJS modules and shipping .d.ts typings. Check the code; it's really tiny.
https://www.npmjs.com/package/array-to-async-iterable
You can use it just like this:
for await(const el of new AsyncTimeIterator(arrayOfObjects)){
...
}
You can't just use a for await...of loop because of the way JavaScript engines handle microtasks and macrotasks.
In brief, you won't receive new HTTP requests or let other timers' callbacks execute with this code:
for await(const el of array){
...
}
You force V8 or whichever engine you use to execute all the microtasks (your loop iterations), and only when the loop completes will you unblock the event loop and be ready to receive HTTP connections again. So this code is completely useless here.
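If you would rather stay dependency-free, one way to give the event loop a chance between iterations is to await a macrotask explicitly; a rough sketch (the helper name is made up):
// Hypothetical helper: resolves on a macrotask, so timers and I/O
// callbacks queued in the meantime get a chance to run.
function nextMacrotask() {
    return new Promise(resolve => setTimeout(resolve, 0));
}

async function processAll(array, handler) {
    for (const el of array) {
        await handler(el);
        await nextMacrotask(); // unblock the event loop before the next item
    }
}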
I have a question regarding the native Array.forEach implementation of JavaScript: Does it behave asynchronously?
For example, if I call:
[many many elements].forEach(function () {lots of work to do})
Will this be non-blocking?
No, it is blocking. Have a look at the specification of the algorithm.
However, a perhaps easier-to-understand implementation is given on MDN:
if (!Array.prototype.forEach)
{
Array.prototype.forEach = function(fun /*, thisp */)
{
"use strict";
if (this === void 0 || this === null)
throw new TypeError();
var t = Object(this);
var len = t.length >>> 0;
if (typeof fun !== "function")
throw new TypeError();
var thisp = arguments[1];
for (var i = 0; i < len; i++)
{
if (i in t)
fun.call(thisp, t[i], i, t);
}
};
}
If you have to execute a lot of code for each element, you should consider using a different approach:
function processArray(items, process) {
var todo = items.concat();
setTimeout(function() {
process(todo.shift());
if(todo.length > 0) {
setTimeout(arguments.callee, 25);
}
}, 25);
}
and then call it with:
processArray([many many elements], function () {lots of work to do});
This would be non-blocking then. The example is taken from High Performance JavaScript.
Another option might be web workers.
If you need an asynchronous-friendly version of Array.forEach and similar, they're available in the Node.js 'async' module: http://github.com/caolan/async ...as a bonus this module also works in the browser.
async.each(openFiles, saveFile, function(err){
// if any of the saves produced an error, err would equal that error
});
There is a common pattern for doing a really heavy computation in Node that may be applicable to you...
Node is single-threaded (as a deliberate design choice, see What is Node.js?); this means that it can only utilize a single core. Modern boxes have 8, 16, or even more cores, so this could leave 90+% of the machine idle. The common pattern for a REST service is to fire up one node process per core, and put these behind a local load balancer like http://nginx.org/.
Forking a child -
For what you are trying to do, there is another common pattern, forking off a child process to do the heavy lifting. The upside is that the child process can do heavy computation in the background while your parent process is responsive to other events. The catch is that you can't / shouldn't share memory with this child process (not without a LOT of contortions and some native code); you have to pass messages. This will work beautifully if the size of your input and output data is small compared to the computation that must be performed. You can even fire up a child node.js process and use the same code you were using previously.
For example:
var child_process = require('child_process');
function run_in_child(array, cb) {
var process = child_process.exec('node libfn.js', function(err, stdout, stderr) {
var output = JSON.parse(stdout);
cb(err, output);
});
process.stdin.write(JSON.stringify(array), 'utf8');
process.stdin.end();
}
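The child side (libfn.js in this example) would then read the JSON array from stdin, do the heavy work, and write a JSON result to stdout; a rough sketch with a placeholder computation:
// libfn.js (sketch): read JSON from stdin, compute, write JSON to stdout.
var input = '';
process.stdin.setEncoding('utf8');
process.stdin.on('data', function (chunk) { input += chunk; });
process.stdin.on('end', function () {
    var array = JSON.parse(input);
    // Placeholder for the actual heavy computation.
    var output = array.map(function (x) { return x * 2; });
    process.stdout.write(JSON.stringify(output));
});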
Array.forEach is meant for computing things, not waiting, and there is nothing to be gained by making computations asynchronous in an event loop (web workers add multiprocessing, if you need multi-core computation). If you want to wait for multiple tasks to end, use a counter, which you can wrap in a semaphore class.
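A minimal sketch of that counter idea (the timeouts stand in for whatever asynchronous tasks you are waiting on):
// Call `done` once `total` asynchronous tasks have reported completion.
function makeCounter(total, done) {
    var remaining = total;
    return function () {
        remaining -= 1;
        if (remaining === 0) {
            done();
        }
    };
}

var finished = makeCounter(3, function () {
    console.log('all three tasks finished');
});

// Each task calls `finished` when it completes.
setTimeout(finished, 100);
setTimeout(finished, 200);
setTimeout(finished, 300);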
Edit 2018-10-11:
It looks like there is a good chance the standard described below may not go through; consider pipelining as an alternative (it does not behave exactly the same, but the methods could be implemented in a similar manner).
This is exactly why I am excited about ES7: in the future you will be able to do something like the code below (some of the specs are not complete, so use with caution; I will try to keep this up to date). Basically, using the new :: bind operator, you will be able to run a method on an object as if the object's prototype contained the method, e.g. [Object]::[Method] where normally you would call [Object].[ObjectsMethod].
Note: to do this today (24-July-16) and have it work in all browsers, you will need to transpile your code for the following functionality: import/export, arrow functions, Promises, async/await and, most importantly, function bind. The code below could be modified to use only function bind if necessary; all of this functionality is neatly available today by using Babel.
YourCode.js (where 'lots of work to do' must simply return a promise, resolving it when the asynchronous work is done.)
import { asyncForEach } from './ArrayExtensions.js';
await [many many elements]::asyncForEach(() => lots of work to do);
ArrayExtensions.js
export function asyncForEach(callback)
{
return Promise.resolve(this).then(async (ar) =>
{
for(let i=0;i<ar.length;i++)
{
await callback.call(ar, ar[i], i, ar);
}
});
};
export function asyncMap(callback)
{
return Promise.resolve(this).then(async (ar) =>
{
const out = [];
for(let i=0;i<ar.length;i++)
{
out[i] = await callback.call(ar, ar[i], i, ar);
}
return out;
});
};
This is a short asynchronous function to use without requiring third-party libs:
Array.prototype.each = function (iterator, callback) {
var iterate = function () {
pointer++;
if (pointer >= this.length) {
callback();
return;
}
iterator.call(iterator, this[pointer], iterate, pointer);
}.bind(this),
pointer = -1;
iterate(this);
};
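Usage might look like this (the iterator receives the item, a next callback to advance, and the index, matching the iterator.call above):
["a", "b", "c"].each(function (item, next, index) {
    // Simulate asynchronous work, then move on to the next item.
    setTimeout(function () {
        console.log(index, item);
        next();
    }, 100);
}, function () {
    console.log("all items processed");
});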
There is a package on npm for easy asynchronous for each loops.
var forEachAsync = require('futures').forEachAsync;
// waits for one request to finish before beginning the next
forEachAsync(['dogs', 'cats', 'octocats'], function (next, element, index, array) {
getPics(element, next);
// then after all of the elements have been handled
// the final callback fires to let you know it's all done
}).then(function () {
console.log('All requests have finished');
});
Also another variation forAllAsync
It is even possible to code a solution like this, for example:
var loop = function(i, data, callback) {
if (i < data.length) {
//TODO("SELECT * FROM stackoverflowUsers;", function(res) {
//data[i].meta = res;
console.log(i, data[i].title);
return loop(i + 1, data, callback);
//});
} else {
return callback(data);
}
};
loop(0, [{"title": "hello"}, {"title": "world"}], function(data) {
console.log("DONE\n"+data);
});
On the other hand, it is much slower than a "for".
Otherwise, the excellent Async library can do this: https://caolan.github.io/async/docs.html#each
These code snippets will give you a better understanding of the difference between forEach and for...of:
/* eslint-disable no-console */
async function forEachTest() {
console.log('########### Testing forEach ################ ')
console.log('start of forEachTest func')
let a = [1, 2, 3]
await a.forEach(async (v) => {
console.log('start of forEach: ', v)
await new Promise(resolve => setTimeout(resolve, v * 1000))
console.log('end of forEach: ', v)
})
console.log('end of forEachTest func')
}
forEachTest()
async function forOfTest() {
await new Promise(resolve => setTimeout(resolve, 10000)) //just see console in proper way
console.log('\n\n########### Testing forOf ################ ')
console.log('start of forOfTest func')
let a = [1, 2, 3]
for (const v of a) {
console.log('start of forOf: ', v)
await new Promise(resolve => setTimeout(resolve, v * 1000))
console.log('end of forOf: ', v)
}
console.log('end of forOfTest func')
}
forOfTest()
Here is a small example you can run to test it:
[1,2,3,4,5,6,7,8,9].forEach(function(n){
var sum = 0;
console.log('Start for:' + n);
for (var i = 0; i < ( 10 - n) * 100000000; i++)
sum++;
console.log('Ended for:' + n, sum);
});
It will produce something like this (if it takes too little or too much time, increase or decrease the number of iterations):
(index):48 Start for:1
(index):52 Ended for:1 900000000
(index):48 Start for:2
(index):52 Ended for:2 800000000
(index):48 Start for:3
(index):52 Ended for:3 700000000
(index):48 Start for:4
(index):52 Ended for:4 600000000
(index):48 Start for:5
(index):52 Ended for:5 500000000
(index):48 Start for:6
(index):52 Ended for:6 400000000
(index):48 Start for:7
(index):52 Ended for:7 300000000
(index):48 Start for:8
(index):52 Ended for:8 200000000
(index):48 Start for:9
(index):52 Ended for:9 100000000
(index):45 [Violation] 'load' handler took 7285ms
Although Array.forEach is not asynchronous, you can still get an asynchronous "end result". Example below:
function delayFunction(x) {
return new Promise(
(resolve) => setTimeout(() => resolve(x), 1000)
);
}
[1, 2, 3].forEach(async(x) => {
console.log(x);
console.log(await delayFunction(x));
});
Use Promise.each from the bluebird library.
Promise.each(
Iterable<any>|Promise<Iterable<any>> input,
function(any item, int index, int length) iterator
) -> Promise
This method iterates over an array, or a promise of an array, which contains promises (or a mix of promises and values) with the given iterator function with the signature (value, index, length) where the value is the resolved value of a respective promise in the input array. Iteration happens serially. If the iterator function returns a promise or a thenable, then the result of the promise is awaited before continuing with next iteration. If any promise in the input array is rejected, then the returned promise is rejected as well.
If all of the iterations resolve successfully, Promise.each resolves to the original array unmodified. However, if one iteration rejects or errors, Promise.each ceases execution immediately and does not process any further iterations. The error or rejected value is returned in this case instead of the original array.
This method is meant to be used for side effects.
var Promise = require("bluebird");
var fs = Promise.promisifyAll(require("fs")); // provides fs.readFileAsync

var fileNames = ["1.txt", "2.txt", "3.txt"];
Promise.each(fileNames, function(fileName) {
return fs.readFileAsync(fileName).then(function(val){
// do stuff with 'val' here.
});
}).then(function() {
console.log("done");
});