I'm experimenting with ES6 generators with the help of Babel, and I have trouble understanding how (or if!) I can effectively use callback-based async functions to output an iterator.
Let's say I want to be able to write a function that takes a number of URLs, downloads them asynchronously, and returns them as soon as they are downloaded.
I would like to be able to write something like the following:
let urls = ['http://www.google.com', 'http://www.stackoverflow.com' ];
for ( let {url, data} of downloadUrls(urls) ) {
  console.log("Content of url", url, "is");
  console.log(data);
}
How can I implement downloadUrls ?
Ideally I would like to be able to write the following:
var downloadUrls = function*(urls) {
  for ( let url of urls ) {
    $.ajax(url).done( function(data) {
      yield data;
    });
  }
};
This of course doesn't work, since `yield` is being invoked inside a callback and not directly inside the generator.
I can find many examples online of people trying the same thing, but they are either not very transparent, require enabling browser/node flags, or use node-specific features/libraries.
The library closest to what I need seems to be task.js, but I'm unable to have even the simplest example run on current Chrome.
Is there a way to get the intended behaviour using standard and current features (by current I mean usable with transpilers like Babel, but without the need to enable extra flags on the browser), or do I have to wait for async/await?
2019 update
Yielding via callbacks is actually pretty simple. Since you can only call yield directly from the generator function* where it appears (and not from callbacks), you need to yield a Promise instead, which will be resolved from the callback:
async function* fetchUrls(urls) {
  for (const url of urls)
    yield new Promise((resolve, reject) => {
      // note: for brevity, fetch errors are not forwarded to reject here
      fetch(url, { mode: 'no-cors' }).then(response => resolve(response.status));
    });
}

(async function main() {
  const urls = ['https://www.ietf.org/rfc/rfc2616.txt', 'https://www.w3.org/TR/PNG/iso_8859-1.txt'];
  // for-await-of syntax
  for await (const status of fetchUrls(urls))
    console.log(status);
}());
If the example doesn't work in the browser (it may return 0 instead of 200 due to Cross-Origin Read Blocking), try it live on repl.it.
Is there a way to get the intended behaviour using standard and current features
Yes, use promises and generators. Many promise libraries, and some standalone ones, feature the use of generator "coroutines".
But notice that you cannot mix iteration with asynchrony: generators give you one or the other, not both at once. Your example seems to confuse them a bit; it looks like you expect the for ( {url, data} of downloadUrls(urls) ) loop to work synchronously, which cannot work.
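For example, here is a minimal sketch of how downloadUrls could look using Bluebird's Promise.coroutine (any co-style "spawn" helper works the same way; Bluebird and jQuery's $.ajax are assumptions here, not part of the question's setup). Note that the consumption happens inside the coroutine, and the caller gets back a promise rather than a synchronously iterable sequence:

var Promise = require('bluebird');

var downloadUrls = Promise.coroutine(function* (urls) {
  var results = [];
  for (var url of urls) {
    // $.ajax returns a thenable, so it can be wrapped and yielded
    var data = yield Promise.resolve($.ajax(url));
    results.push({ url: url, data: data });
  }
  return results; // the returned promise resolves once every download finished
});

downloadUrls(['http://www.google.com', 'http://www.stackoverflow.com'])
  .then(function (pages) {
    pages.forEach(function (page) {
      console.log("Content of url", page.url, "is");
      console.log(page.data);
    });
  });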
do I have to wait for async/await?
No, you don't have to wait, Babel already supports them!
Here is a clean way to use a generator / iterator to flatten asynchronous code which works for me in node.js:
var asyncProcedureGenerator1 = function*() {
  var it = yield(0); //get a reference to the iterator
  try {
    var a = yield (asyncPart1.bind(it))(0); //call the function, set this = it
    var b = yield (asyncPart2.bind(it))(a);
    var c = yield (asyncPart3.bind(it))(b);
    console.log("c = ", c);
  }
  catch(err) {
    console.log("Something went wrong: ", err);
  }
};

var runAsyncGenerator = function(generator) {
  var asyncProcedureIterator = generator(); //create an iterator
  asyncProcedureIterator.next(); //start the iterator
  asyncProcedureIterator.next(asyncProcedureIterator); //pass a reference of the iterator to itself
};
var asyncPart1 = function(param1) {
  var it = this; //the iterator will be equal to this.
  console.log("Starting asyncPart1 with param1 = ", param1);
  setTimeout(function() {
    console.log("Done with asyncPart1");
    var returnValue = 42 + param1;
    console.log("asyncPart1 returned ", returnValue);
    it.next(returnValue); //when we are done, resume the iterator which has yielded to us.
  }, 2000);
};

var asyncPart2 = function(param1) {
  var it = this; //the iterator will be equal to this.
  console.log("Starting asyncPart2 with param1 = ", param1);
  setTimeout(function() {
    console.log("Done with asyncPart2");
    var returnValue = param1 / 2;
    console.log("asyncPart2 returned ", returnValue);
    //it.throw("Uh oh.");
    it.next(returnValue);
  }, 2000);
};

var asyncPart3 = function(param1) {
  var it = this; //the iterator will be equal to this.
  console.log("Starting asyncPart3 with param1 = ", param1);
  setTimeout(function() {
    console.log("Done with asyncPart3");
    var returnValue = param1 / 3;
    console.log("asyncPart3 returned ", returnValue);
    it.next(returnValue);
  }, 2000);
};

runAsyncGenerator(asyncProcedureGenerator1);
The idea is to run the generator, create an iterator, and then pass a reference to that iterator to itself.
Then the iterator can call asynchronous functions (with yield), passing them a reference to itself, which allows those functions to either report success and resume execution by calling iterator.next(result), or report failure by calling iterator.throw(error).
I just came up with this pattern, so there may be some gotchas I haven't found yet, but it seems to work and allows very flat code with minimal additions.
Related
This question is somewhat academic in that I don't have a real need to do this.
I'm wondering if I can force the resolution of a promise into a returned value from a function such that the function callers are not aware that the functions contain promised async operations.
In .NET I can do things like this by using functions on Task[] or returning Task.Result, which causes the caller to await the completion of the task; callers won't know or care that the work has been done using tasks.
If you're using ES6 you can use a generator to make code like this. It essentially comes close to 'blocking' on the promise, so you have the appearance of a long-running method that just returns the value you want, but async/promises live under the covers.
let asyncTask = () =>
  new Promise(resolve => {
    let delay = Math.floor(Math.random() * 100);
    setTimeout(function () {
      resolve(delay);
    }, delay);
  });

let makeMeLookSync = fn => {
  let iterator = fn();
  let loop = result => {
    !result.done && result.value.then(res =>
      loop(iterator.next(res)));
  };
  loop(iterator.next());
};

makeMeLookSync(function* () {
  let result = yield asyncTask();
  console.log(result);
});
More explanation and the source available here: http://www.tivix.com/blog/making-promises-in-a-synchronous-manner/
Here is the code compiled on Babeljs.io
I am trying to learn how to use generators and yield, so I tried the following, but it doesn't seem to be working.
I am using the following function, which contains 2 async calls:
var client = require('mongodb').MongoClient;
$db = function*(collection, obj){
  var documents;
  yield client.connect('mongodb://localhost/test', function*(err, db){
    var c = db.collection(collection);
    yield c.find(obj).toArray(function(err, docs){
      documents = docs;
      db.close();
    });
  });
  return documents.length;
};
Then to make the original call, I am doing this:
var qs = require("querystring");
var query = qs.parse("keywords[]=abc&keywords[]=123");
var total = $db("ads", {"details.keywords": {$in: query["keywords[]"]}});
console.log(total);
When I get my output back in the console, I get this:
{}
I was expecting a number such as 200. What is it that I am doing wrong?
TL;DR
For the short answer, you're looking for a helper like co.
var co = require("co");
co(myGen( )).then(function (result) { });
But Why?
There is nothing inherently asynchronous about ES6 iterators, or the generators which define them.
function * allIntegers ( ) {
  var i = 1;
  while (true) {
    yield i;
    i += 1;
  }
}

var ints = allIntegers();
ints.next().value; // 1
ints.next().value; // 2
ints.next().value; // 3
The .next( ) method, though, actually lets you send data back in to the iterator.
function * exampleGen ( ) {
  var a = yield undefined;
  var b = yield a + 1;
  return b;
}

var exampleIter = exampleGen();
exampleIter.next().value;     // undefined
exampleIter.next(12).value;   // 13 (I passed 12 back in, which is assigned to a)
exampleIter.next("Hi").value; // "Hi" is assigned to b, and then returned
It might be confusing to think about, but when you yield, it's like a return statement: the left-hand side hasn't been assigned a value yet. More importantly, if you had written var y = (yield x) + 1;, the parentheses are resolved before the rest of the expression: you return x, and the + 1 is put on hold until a value comes back.
Then when it arrives (passed in via .next( )), the rest of the expression is evaluated (and then assigned to the left-hand side).
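A tiny example of that evaluation order (names here are just for the demonstration):

function * plusOne () {
  var y = (yield "x went out") + 1; // pauses mid-expression; the + 1 waits
  return y;
}

var it = plusOne();
it.next().value;   // "x went out" (the generator is paused inside the expression)
it.next(41).value; // 42 (41 comes back in as the value of the yield, then + 1 runs)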
The object that's returned from each call has two properties: { value: ..., done: false }.
value is what you've returned/yielded, and done is whether or not it's hit the actual return statement at the end of the function (including implicit returns).
This is the part that can then be used to make this async magic happen.
function * asyncGen ( id ) {
  var key = yield getKeyPromise( id );
  var values = yield getValuesPromise( key );
  return values;
}

var asyncProcess = asyncGen( 123 );
var getKey = asyncProcess.next( ).value;
getKey.then(function (key) {
  return asyncProcess.next( key ).value;
}).then(function (values) {
  doStuff(values);
});
There's no magic.
Instead of returning a value, I'm returning a promise.
When the promise completes, I'm pushing the result back in, using .next( result ), which gets me another promise.
When that promise resolves, I push that back in, using .next( newResult ), et cetera, until I'm done.
Can we do better?
We know now that we're just waiting for promises to resolve, then calling .next on the iterator with the result.
Do we have to know, ahead of time what the iterator looks like, to know when we're done?
Not really.
function coroutine (iterator) {
  return new Promise(function (resolve, reject) {
    function turnIterator (value) {
      var result = iterator.next( value );
      if (result.done) {
        resolve(result.value);
      } else {
        result.value.then(turnIterator, reject); // reject if a yielded promise fails
      }
    }
    turnIterator();
  });
}

coroutine( myGen() ).then(function (result) { });
This isn't complete and perfect. co covers extra bases: making sure all yields get treated like promises, so you don't blow up by passing a non-promise value; allowing arrays of promises to be yielded, which become one promise that resolves to the array of results for that yield; and try/catch around the promise handling, to throw the error back into the iterator (yes, try/catch works perfectly with yield statements, done this way, thanks to the .throw(err) method on the iterator).
These things aren't hard to implement, but they make the example muddier than it needs to be.
This is exactly why co or some other "coroutine" or "spawn" method is perfect for this stuff.
The guys behind the Express server built KoaJS, using Co as a library, and Koa's middleware system just takes generators in its .use method and does the right thing.
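For instance, a Koa 1.x middleware is just a generator function; Koa runs it through co, so you can yield promises (or other generators) directly inside it. A minimal sketch, assuming Koa 1.x:

var koa = require('koa'); // Koa 1.x
var app = koa();

app.use(function * (next) {
  var start = Date.now();
  yield next; // run the downstream middleware, waiting for it to finish
  console.log('%s %s took %s ms', this.method, this.url, Date.now() - start);
});

app.listen(3000);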
But Wait, there's more!
As of ES7, it's very likely that the spec will add language for this exact use-case.
async function doAsyncProcess (id) {
  var key = await getKeyPromise(id);
  var values = await getValuesPromise(key);
  return values;
}

doAsyncProcess(123).then(values => doStuff(values));
The async and await keywords are used together, to achieve the same functionality as the coroutine-wrapped promise-yielding generator, without all of the external boilerplate (and with engine-level optimizations, eventually).
You can try this today, if you're using a transpiler like BabelJS.
I hope this helps.
Yield and generators have nothing to do with asynchrony; their primary purpose is to produce iterable sequences of values, like this:
function * gen() {
  var i = 0;
  while (i < 10) {
    yield i++;
  }
}

for (var i of gen()) {
  console.log(i);
}
Just calling a function marked with a star (a generator function) merely creates a generator object (that is why you see {} in the console), which can be interacted with using its next function.
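You can see this with your own $db: nothing in its body runs until you call next on the object it returns. A small demonstration (the query value is just an example):

// This only creates the generator object; no mongo code runs yet:
var it = $db("ads", {"details.keywords": {$in: ["abc"]}});
console.log(it);        // {}  <- the generator object itself
console.log(it.next()); // { value: undefined, done: false } - runs up to the first yield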
That said, you can use generator functions as an analogue of asynchronous functions, but you need a special runner, like co.
var client = require('mongodb').MongoClient;

$db = function*(collection, obj){
  var documents;
  yield client.connect('mongodb://localhost/test', function*(err, db){
    var c = db.collection(collection);
    yield c.find(obj).toArray(function(err, docs){
      documents = docs;
      db.close();
    });
  });
  return documents.length;
};
var qs = require("querystring");
var query = qs.parse("keywords[]=abc&keywords[]=123");
var total = $db("ads", {"details.keywords": {$in: query["keywords[]"]}});
console.log(total);
As is, total is the iterator for the $db generator function. You would retrieve its yield values via total.next().value. However, the mongodb library is callback-based and, as such, its functions do not return values, so the yield expressions produce undefined.
You mentioned you were using promises elsewhere; I would suggest taking a look at bluebird, in particular its promisify functionality. Promisification inverts the callback model so that the arguments to the callback are now used to resolve the promisified function. Even better, promisifyAll will convert an entire callback-based API.
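For example, promisifying a single callback-style function looks like this (fs.readFile is used purely as an illustration):

var Promise = require('bluebird');
var fs = require('fs');

// callback style: fs.readFile(name, cb) calls cb(err, data)
// promisified: readFileAsync(name) resolves with data or rejects with err
var readFileAsync = Promise.promisify(fs.readFile);

readFileAsync('notes.txt')
  .then(function (data) { console.log(data.toString()); })
  .catch(function (err) { console.error(err); });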
Finally, bluebird provides coroutine functionality as well; however its coroutines must return promises. So, your code may be rewritten as follows:
var mongo = require('mongodb');
var Promise = require('bluebird');

//here we convert the mongodb callback based API to a promise based API
Promise.promisifyAll(mongo);

$db = Promise.coroutine(function*(collection, obj){
  //existing functions are converted to promise based versions which have
  //the same name with 'Async' appended to them
  return yield mongo.MongoClient.connectAsync('mongodb://localhost/test')
    .then(function(db){
      return db.collectionAsync(collection);
    })
    .then(function(collection) {
      return collection.countAsync();
    });
});
var qs = require("querystring");
var query = qs.parse("keywords[]=abc&keywords[]=123");
$db('ads', {"details.keywords": {$in: query["keywords[]"]}})
  .then(console.log);
I understand how to use generators to make async code look nice. I have a simple generator *all that takes a page and returns a single value.
Then I have another generator *allDo that uses *all for pages 1 to 30 and, for each result, does some async task.
Then I have another generator *allBatchDo that batches 3 pages and does some async task.
function mockPromise(value) {
  return new Promise(function(resolve, reject) {
    resolve(value);
  });
}
function *all(page) {
  var ls = yield mockPromise("page " + page);
  // do all kinds of promises
  return yield ls;
}

function *allDo(task) {
  var page = 1;
  while (true) {
    var res = yield* all(page);
    res = yield task(res);
    if (page == 30) {
      break;
    }
    page++;
  }
}
function *allBatchDo(task) {
  var page = 1;
  var arr = [];
  while (true) {
    var res = yield* all(page);
    arr.push(res);
    if (arr.length >= 3) {
      yield task(arr);
      arr = [];
    }
    if (page == 30) {
      break;
    }
    page++;
  }
}
function logTask(res) {
  return mockPromise(res).then(function(v) {
    console.log(v);
  });
}
Example use of these generators would be:
// return a single page promise
async(all(1)).then(function(value) { console.log(value); });

// do `logTask` for all pages 1 thru 30
async(allDo(logTask));

// do `logTask` for all pages in batches of 3
async(allBatchDo(logTask));
The question is, is this a legitimate use of es6 async features, or is there an abstract built-in solution for my use case?
If you want to use generators to write async code, then your code is valid. ES6 itself contains only promises for async operations; ES7 will have async/await. You can also use a good library like https://github.com/kriskowal/q, or use plain native promises with Promise.all, without generators.
I would say this code might be quite slow: since you are using yield*, all the tasks will run sequentially, potentially taking much more time than necessary (assuming mockPromise does some I/O). You might be better off yielding Promise.all, as sketched below, or just using promises directly.
Also, your usage of while (true) is odd; a plain for loop over the page numbers would express the same thing.
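For example, a rough sketch of fetching all 30 pages concurrently with Promise.all instead of one at a time (mockPromise stands in for the real per-page fetch here):

// Start all 30 page promises at once and wait for them together,
// instead of yielding each one sequentially.
function allPages() {
  var pages = [];
  for (var page = 1; page <= 30; page++) {
    pages.push(mockPromise("page " + page)); // stand-in for the real fetch
  }
  return Promise.all(pages); // resolves with an array of all 30 results
}

allPages().then(function (results) {
  results.forEach(function (res) { console.log(res); });
});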
Below are some links that can help you with asynquence's runner:
http://davidwalsh.name/concurrent-generators And
http://spion.github.io/posts/analysis-generators-and-other-async-patterns-node.html
From what I understand, the future style for writing async code in JS is to use generators instead of callbacks, at least (or especially) in the V8/Node.js community. Is that right? (But that might be debatable and is not my main question here.)
To write async code with generators, I have found a few libraries:
gen-run (What I'm currently using.)
co
task.js
Galaxy
They all look kind of similar and I'm not that sure which of them to use (or if that even matters). (However, that might again be debatable and is also not my main question here -- but I still would be very happy about any advice.)
(I'm anyway only using pure V8 - if that matters. I'm not using Nodejs but I use pure V8 in my custom C++ app. However, I already have a few node-style elements in my code, including my custom require().)
Now I have some function X written in callback-style, which itself calls other async functions with callback arguments, e.g.:
function X(v, callback) {
  return Y(onGotY);

  function onGotY(err, res) {
    if (err) return callback(err);
    return Z(onGotZ);
  }

  function onGotZ(err, res, resExtended) {
    if (err) return callback(err);
    return callback(null, v + res + resExtended);
  }
}
And I want to turn X into a generator, e.g. I guess function* X(v) { ... }. What would that look like?
I went with my very simple own lib which works quite well for my small V8 environment and also makes it easy to debug because the JS callstack stays intact. To make it work with Nodejs or on the web, it would need some modifications, though.
Rationales here:
For debugging, we don't really want async code - we want to have nice understandable call stacks in debuggers, esp. in the node-inspector debugger.
Also note that so far, all async code is completely artificial and we execute everything completely in sync, somewhat emulated via V8 Microtasks.
Callback-style async code is already hard to understand in call stacks.
Generator-style async code loses the call stack information completely in conventional debuggers; that includes the current Chrome Beta Developer Tools V8 debugger used with node-inspector. That is very much the nature of generators (and coroutines in general). Later versions of the debugger might handle this, but that is not the case today. We even need some special C++ code to get the info. Example code can be found here:
https://github.com/bjouhier/galaxy-stack/blob/master/src/galaxy-stack.cc
So if we want useful debuggers today, we cannot use generators, at least not in the way they are usually used.
We still want to use generator-style async code because it makes the code much more readable, compared to callback-style code.
We introduce a new function async to overcome this. Imagine the following callback-style async code:
function doSthX(a, b, callback) { ... }

function doSthY(c, callback) {
  doSthX(c/2, 42, function(err, res) {
    if (err) return callback(err);
    callback(null, c + res);
  });
}
Now, the same code with the async function and generator-style code:
function* doSthX(a, b) { ... }

function* doSthY(c) {
  var res = yield async(doSthX(c/2, 42));
  return c + res;
}
We can provide two versions of async(iter):
1. Run the iterator iter on top of the stack. That will keep a sane call stack, and everything is run serially.
2. Yield the iterator down to some lower handler and make it really async.
Note that there are a few notable libraries which can be used for the second approach:
https://github.com/visionmedia/co
http://taskjs.org/
https://github.com/creationix/gen-run
https://github.com/bjouhier/galaxy
For now, we just implement the first approach - to make debugging easier.
If we later want both, we can introduce a debug flag to switch between the two implementations.
global.async = async;
function async(iter) {
  // must be an iterator
  assert(iter.next);
  var gotValue;
  var sendValue;
  while (true) {
    var next = iter.next(sendValue);
    gotValue = next.value;
    if (!next.done) {
      // We expect gotValue as a value returned from this function `async`.
      assert(gotValue.getResult);
      var res = gotValue.getResult();
      sendValue = res;
    }
    if (next.done) break;
  }
  return {
    getResult: function() {
      return gotValue;
    }
  };
}

// Like `async`, but wraps a callback-style function.
global.async_call_cb = async_call_cb;
function async_call_cb(f, thisArg /* , ... */) {
  assert(f.apply && f.call);
  var args = Array.prototype.slice.call(arguments, 2);
  return async((function*() {
    var gotCalled = false;
    var res;
    // This expects that the callback is run on top of the stack.
    // We will get this if we always use the wrapped enqueueMicrotask().
    // If we have to force this somehow else at some point, we could
    // call runMicrotasks() here - or some other waiter function,
    // to wait for our callback.
    args.push(callback);
    f.apply(thisArg, args);
    function callback(err, _res) {
      assert(!gotCalled);
      if (err) throw err;
      gotCalled = true;
      res = _res;
    }
    assert(gotCalled);
    return res;
  })());
}
// get the result synchronously from async
global.sync_from_async = sync_from_async;
function sync_from_async(s) {
  assert(s.getResult); // see async()
  return s.getResult();
}

// creates a node.js callback-style function from async
global.callback_from_async = callback_from_async;
function callback_from_async(s) {
  return function(callback) {
    var res;
    try { res = sync_from_async(s); }
    catch(err) {
      return callback(err);
    }
    return callback(null, res);
  };
}

global.sync_get = sync_get;
function sync_get(iter) {
  return sync_from_async(async(iter));
}

// This is like in gen-run.
// It's supposed to run the main function, which is expected to be a generator.
// f must be a generator function.
// Returns the result.
global.run = run;
function run(f) {
  return sync_get(f());
}
I assume that by "turning X into a generator" you mean "turning X into something you can yield", because that's how these libs work.
If you are using co (which I recommend), you need to make your function return a thunk, that is, a function which accepts only a callback. Quite simple:
function coX (v) {
  return function (cb) {
    X(v, cb);
  };
}
And then just:
co(function * () {
  yield coX('v');
})();
I have a question regarding the native Array.forEach implementation of JavaScript: Does it behave asynchronously?
For example, if I call:
[many many elements].forEach(function () {lots of work to do})
Will this be non-blocking?
No, it is blocking. Have a look at the specification of the algorithm.
However, a perhaps easier-to-understand implementation is given on MDN:
if (!Array.prototype.forEach)
{
  Array.prototype.forEach = function(fun /*, thisp */)
  {
    "use strict";

    if (this === void 0 || this === null)
      throw new TypeError();

    var t = Object(this);
    var len = t.length >>> 0;
    if (typeof fun !== "function")
      throw new TypeError();

    var thisp = arguments[1];
    for (var i = 0; i < len; i++)
    {
      if (i in t)
        fun.call(thisp, t[i], i, t);
    }
  };
}
If you have to execute a lot of code for each element, you should consider using a different approach:
function processArray(items, process) {
  var todo = items.concat();
  setTimeout(function() {
    process(todo.shift());
    if (todo.length > 0) {
      // arguments.callee refers to this anonymous function; note that it
      // is disallowed in strict mode, where a named function expression
      // would be needed instead
      setTimeout(arguments.callee, 25);
    }
  }, 25);
}
and then call it with:
processArray([many many elements], function () {lots of work to do});
This would be non-blocking then. The example is taken from High Performance JavaScript.
Another option might be web workers.
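A minimal sketch of the web worker approach (the file names are made up for the example):

// main.js - hand the heavy loop to a worker so the UI thread stays responsive
var worker = new Worker('heavy-work.js');
worker.onmessage = function (e) {
  console.log('worker processed element', e.data);
};
worker.postMessage([/* many many elements */]);

// heavy-work.js - runs on its own thread
onmessage = function (e) {
  e.data.forEach(function (element) {
    // lots of work to do with element...
    postMessage(element);
  });
};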
If you need an asynchronous-friendly version of Array.forEach and similar, they're available in the Node.js 'async' module: http://github.com/caolan/async ...as a bonus this module also works in the browser.
async.each(openFiles, saveFile, function(err){
  // if any of the saves produced an error, err would equal that error
});
There is a common pattern for doing a really heavy computation in Node that may be applicable to you...
Node is single-threaded (as a deliberate design choice, see What is Node.js?); this means that it can only utilize a single core. Modern boxes have 8, 16, or even more cores, so this could leave 90+% of the machine idle. The common pattern for a REST service is to fire up one node process per core, and put these behind a local load balancer like http://nginx.org/.
Forking a child -
For what you are trying to do, there is another common pattern, forking off a child process to do the heavy lifting. The upside is that the child process can do heavy computation in the background while your parent process is responsive to other events. The catch is that you can't / shouldn't share memory with this child process (not without a LOT of contortions and some native code); you have to pass messages. This will work beautifully if the size of your input and output data is small compared to the computation that must be performed. You can even fire up a child node.js process and use the same code you were using previously.
For example:
var child_process = require('child_process');

function run_in_child(array, cb) {
  // named `child` to avoid shadowing the Node global `process`
  var child = child_process.exec('node libfn.js', function(err, stdout, stderr) {
    var output = JSON.parse(stdout);
    cb(err, output);
  });
  child.stdin.write(JSON.stringify(array), 'utf8');
  child.stdin.end();
}
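The child side is not shown above; here is a guess at what libfn.js might look like under those assumptions. It reads the JSON array from stdin, does the heavy work, and writes a JSON result to stdout for the parent to parse:

// libfn.js - hypothetical child process
var input = '';
process.stdin.setEncoding('utf8');
process.stdin.on('data', function (chunk) { input += chunk; });
process.stdin.on('end', function () {
  var array = JSON.parse(input);
  // stand-in for the real heavy computation:
  var result = array.map(function (x) { return x * 2; });
  process.stdout.write(JSON.stringify(result));
});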
Array.forEach is meant for computing things, not waiting, and there is nothing to be gained by making computations asynchronous in an event loop (web workers add multiprocessing, if you need multi-core computation). If you want to wait for multiple tasks to end, use a counter, which you can wrap in a semaphore class, as sketched below.
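A minimal sketch of such a counter (not a library API, just an illustration):

// A tiny countdown "semaphore": fires done() once n tasks have signalled.
function Countdown(n, done) {
  this.count = n;
  this.done = done;
}
Countdown.prototype.signal = function () {
  if (--this.count === 0) this.done();
};

// usage: wait for three async tasks to end
var latch = new Countdown(3, function () { console.log('all tasks finished'); });
[100, 200, 300].forEach(function (ms) {
  setTimeout(function () { latch.signal(); }, ms);
});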
Edit 2018-10-11:
It looks like there is a good chance the standard described below may not go through; consider pipelining as an alternative (it does not behave exactly the same, but the methods could be implemented in a similar manner).
This is exactly why I am excited about ES7: in the future you will be able to do something like the code below (some of the specs are not complete, so use with caution; I will try to keep this up to date). Basically, using the new :: bind operator, you will be able to run a method on an object as if the object's prototype contained the method, e.g. [Object]::[Method] where normally you would call [Object].[ObjectsMethod].
Note: to do this today (24-July-16) and have it work in all browsers, you will need to transpile your code for the following functionality: import/export, arrow functions, promises, async/await and, most importantly, function bind. The code below could be modified to use only function bind if necessary; all this functionality is neatly available today by using Babel.
YourCode.js (where 'lots of work to do' must simply return a promise, resolving it when the asynchronous work is done.)
import { asyncForEach } from './ArrayExtensions.js';
await [many many elements]::asyncForEach(() => lots of work to do);
ArrayExtensions.js
export function asyncForEach(callback) {
  return Promise.resolve(this).then(async (ar) => {
    for (let i = 0; i < ar.length; i++) {
      await callback.call(ar, ar[i], i, ar);
    }
  });
}
export function asyncMap(callback) {
  return Promise.resolve(this).then(async (ar) => {
    const out = [];
    for (let i = 0; i < ar.length; i++) {
      out[i] = await callback.call(ar, ar[i], i, ar);
    }
    return out;
  });
}
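Usage of asyncMap mirrors the asyncForEach example above (same transpiler setup assumed; delay is a helper defined just for this sketch):

import { asyncMap } from './ArrayExtensions.js';

const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

// doubles each element, awaiting a short delay per item
const doubled = await [1, 2, 3]::asyncMap(async (n) => {
  await delay(10);
  return n * 2;
});
console.log(doubled); // [2, 4, 6]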
This is a short asynchronous function to use without requiring third-party libs:
Array.prototype.each = function (iterator, callback) {
  var iterate = function () {
        pointer++;
        if (pointer >= this.length) {
          callback();
          return;
        }
        iterator.call(iterator, this[pointer], iterate, pointer);
      }.bind(this),
      pointer = -1;
  iterate(this);
};
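For example, calling it could look like this; the iterator receives the element, a next function to advance, and the index, matching the signature above:

['dogs', 'cats', 'octocats'].each(function (element, next, index) {
  // simulate async work, then move on to the next element
  setTimeout(function () {
    console.log(index, element);
    next();
  }, 100);
}, function () {
  console.log('all done');
});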
There is a package on npm for easy asynchronous for each loops.
var forEachAsync = require('futures').forEachAsync;

// waits for one request to finish before beginning the next
forEachAsync(['dogs', 'cats', 'octocats'], function (next, element, index, array) {
  getPics(element, next);
  // then after all of the elements have been handled
  // the final callback fires to let you know it's all done
}).then(function () {
  console.log('All requests have finished');
});
There is also another variation, forAllAsync.
It is even possible to code a solution like this, for example:
var loop = function(i, data, callback) {
  if (i < data.length) {
    //TODO("SELECT * FROM stackoverflowUsers;", function(res) {
    //  data[i].meta = res;
    console.log(i, data[i].title);
    return loop(i + 1, data, callback);
    //});
  } else {
    return callback(data);
  }
};

loop(0, [{"title": "hello"}, {"title": "world"}], function(data) {
  console.log("DONE\n" + data);
});
On the other hand, it is much slower than a plain for loop.
Otherwise, the excellent Async library can do this: https://caolan.github.io/async/docs.html#each
These code snippets will give you a better understanding of how forEach and for...of compare.
/* eslint-disable no-console */
async function forEachTest() {
  console.log('########### Testing forEach ################ ')
  console.log('start of forEachTest func')
  let a = [1, 2, 3]
  await a.forEach(async (v) => {
    console.log('start of forEach: ', v)
    await new Promise(resolve => setTimeout(resolve, v * 1000))
    console.log('end of forEach: ', v)
  })
  console.log('end of forEachTest func')
}
forEachTest()

async function forOfTest() {
  await new Promise(resolve => setTimeout(resolve, 10000)) // just to see the console output in the proper order
  console.log('\n\n########### Testing forOf ################ ')
  console.log('start of forOfTest func')
  let a = [1, 2, 3]
  for (const v of a) {
    console.log('start of forOf: ', v)
    await new Promise(resolve => setTimeout(resolve, v * 1000))
    console.log('end of forOf: ', v)
  }
  console.log('end of forOfTest func')
}
forOfTest()
Here is a small example you can run to test it:
[1,2,3,4,5,6,7,8,9].forEach(function(n) {
  var sum = 0;
  console.log('Start for:' + n);
  for (var i = 0; i < (10 - n) * 100000000; i++)
    sum++;
  console.log('Ended for:' + n, sum);
});
It will produce something like this (if it takes too little/much time, increase/decrease the number of iterations):
(index):48 Start for:1
(index):52 Ended for:1 900000000
(index):48 Start for:2
(index):52 Ended for:2 800000000
(index):48 Start for:3
(index):52 Ended for:3 700000000
(index):48 Start for:4
(index):52 Ended for:4 600000000
(index):48 Start for:5
(index):52 Ended for:5 500000000
(index):48 Start for:6
(index):52 Ended for:6 400000000
(index):48 Start for:7
(index):52 Ended for:7 300000000
(index):48 Start for:8
(index):52 Ended for:8 200000000
(index):48 Start for:9
(index):52 Ended for:9 100000000
(index):45 [Violation] 'load' handler took 7285ms
Although Array.forEach is not asynchronous, you can get an asynchronous "end result". Example below:
function delayFunction(x) {
  return new Promise(
    (resolve) => setTimeout(() => resolve(x), 1000)
  );
}

[1, 2, 3].forEach(async (x) => {
  console.log(x);
  console.log(await delayFunction(x));
});
Use Promise.each from the bluebird library.
Promise.each(
  Iterable<any>|Promise<Iterable<any>> input,
  function(any item, int index, int length) iterator
) -> Promise
This method iterates over an array, or a promise of an array, which contains promises (or a mix of promises and values), calling the given iterator function with the signature (value, index, length), where value is the resolved value of the respective promise in the input array. Iteration happens serially. If the iterator function returns a promise or a thenable, the result of that promise is awaited before continuing with the next iteration. If any promise in the input array is rejected, then the returned promise is rejected as well.
If all of the iterations resolve successfully, Promise.each resolves to the original array unmodified. However, if one iteration rejects or errors, Promise.each ceases execution immediately and does not process any further iterations. The error or rejected value is returned in this case instead of the original array.
This method is meant to be used for side effects.
var fileNames = ["1.txt", "2.txt", "3.txt"];

Promise.each(fileNames, function(fileName) {
  return fs.readFileAsync(fileName).then(function(val) {
    // do stuff with 'val' here.
  });
}).then(function() {
  console.log("done");
});