Promise chaining over async/await - JavaScript [duplicate]

I know that async/await is the new Promise in town, a new way to write asynchronous code, and I also know that:
we don't have to write .then or create an anonymous function to handle the response;
async/await finally makes it possible to handle both synchronous and asynchronous errors with the same construct, good old try/catch;
the error stack returned from a promise chain gives no clue where the error happened, while the error stack from async/await points to the function that contains the error;
and so on.
But here I have run a simple benchmark: https://repl.it/repls/FormalAbandonedChimpanzee
In the benchmark I run two loops of 1 million iterations each. In the first loop I call a function that returns 1; in the second I call a function that throws 1 as an exception.
The time taken by the first loop (the function that returns 1) is almost half that of the second (the function that throws 1), which shows that a throw takes almost twice as long as a return.
node v7.4 linux/amd64
return takes 1.233 seconds
1000000
throw takes 2.128 seconds
1000000
Benchmark Code Below
function f1() {
  return 1;
}

function f2() {
  throw 1;
}

// Convert a process.hrtime() [seconds, nanoseconds] tuple into seconds
function parseHrtimeToSeconds(hrtime) {
  var seconds = (hrtime[0] + (hrtime[1] / 1e9)).toFixed(3);
  return seconds;
}

var sum = 0;
var start = 0;
var i = 0;

start = process.hrtime();
for (i = 0; i < 1e6; i++) {
  try {
    sum += f1();
  } catch (e) {
    sum += e;
  }
}
var seconds = parseHrtimeToSeconds(process.hrtime(start));
console.log('return takes ' + seconds + ' seconds');
console.log(sum);

sum = 0;
start = process.hrtime();
for (i = 0; i < 1e6; i++) {
  try {
    sum += f2();
  } catch (e) {
    sum += e;
  }
}
seconds = parseHrtimeToSeconds(process.hrtime(start));
console.log('throw takes ' + seconds + ' seconds');
console.log(sum);

Your benchmark has nothing to do with the performance of async/await vs raw promises. All it shows is that throwing an error takes longer to compute than returning a value. This is expected.
Back to the main question: should you use async/await rather than .then with raw promises?
Keep in mind that async/await is merely syntactic sugar over raw promises, so there shouldn't be much impact on overall performance. However, it does make your code more linear, which removes a lot of cognitive overhead for the developer.
The conclusion is: use what you prefer. Promises can be polyfilled but new syntax cannot, so you might want to keep that in mind when deciding which style to use.
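To illustrate the "more linear" point, here is a minimal sketch of the same logic in both styles; fetchUser and fetchPosts are hypothetical promise-returning functions, not part of any real API:
// Sketch: the same logic with .then chaining and with async/await.
// fetchUser and fetchPosts are hypothetical promise-returning functions.
function withThen() {
  return fetchUser()
    .then(user => fetchPosts(user.id))
    .then(posts => posts.length)
    .catch(err => console.error(err));
}

async function withAwait() {
  try {
    const user = await fetchUser();
    const posts = await fetchPosts(user.id);
    return posts.length;
  } catch (err) {
    console.error(err);
  }
}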
One misunderstanding:
The error stack returned from a promise chain gives no clue of where the error happened
That is not true. A quick check with:
function error() {
  return new Promise(function (res, rej) {
    res(undefined()); // uh oh
  });
}

error().then(console.log, e => console.log("Uh oh!", e.stack));
shows the entire error stack including the location.

As most things go, the answer is "it depends".
Before talking about performance, the more important aspects are the maintainability of the code and the limitations of async/await vs raw promises.
async/await is a great way to execute asynchronous code sequentially, while Promise enables you to run asynchronous code concurrently.
async function foo() {
  const a = await backend.doSomething()
  const b = await backend.doAnotherThing()
  return a + b
}
In the code above, backend.doAnotherThing() will not be executed until backend.doSomething() has returned. On the other hand:
function foo() {
  return Promise.all([backend.doSomething(), backend.doAnotherThing()])
    .then(([a, b]) => {
      return a + b
    })
}
will execute both calls, and wait for both to complete.
As you mentioned the benefits of async/await, I personally use it extensively, except for cases like the above.
If you need performance, and the performance difference between async/await and raw promises matters more to you than the readability benefit of async/await, by all means go ahead.
As long as it is a conscious choice, you should be fine.
UPDATE: as mentioned by Derek 朕會功夫
You can get parallel execution with async/await by:
async function foo() {
  const p1 = backend.doSomething()
  const p2 = backend.doAnotherThing()
  return await p1 + await p2
}

Building on unional's answer:
You can achieve the same behavior as Promise.all with async/await
function foo() {
  return Promise.all([backend.doSomething(), backend.doAnotherThing()])
    .then(([a, b]) => {
      return a + b
    })
}

async function foo() {
  const a = backend.doSomething()
  const b = backend.doAnotherThing()
  return await a + await b
}
Backend tasks happen concurrently and we wait for both to finish before we return. See also the MDN example I wrote.
Based on this I am not sure if there is any performance advantage to directly using Promises over async/await.
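If you want to measure it yourself, a rough sketch of a micro-benchmark might look like the following; results vary by engine and version, so treat the numbers as a starting point rather than a verdict:
// Rough micro-benchmark sketch comparing an await loop to a .then chain.
async function viaAwait(n) {
  let sum = 0;
  for (let i = 0; i < n; i++) {
    sum += await Promise.resolve(i); // suspend/resume on every iteration
  }
  return sum;
}

function viaThen(n) {
  let sum = 0;
  let chain = Promise.resolve();
  for (let i = 0; i < n; i++) {
    chain = chain.then(() => { sum += i; }); // one promise per iteration
  }
  return chain.then(() => sum);
}

async function main() {
  console.time('await');
  await viaAwait(1e5);
  console.timeEnd('await');

  console.time('then');
  await viaThen(1e5);
  console.timeEnd('then');
}
main();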

Related

Node Async practical benefit?

I am new to Node's async/sync concepts. These days I am confused by the sequence of async function executions.
I find it hard to see the benefit of async calls in single-process Node.
Here is an example:
1. Sequence of async/await
Assume an async function:
async function readPathInFile(filePath) {
  // open filePath, read the content, and return the file path it contains
}
We have several calls like:
var a = 'a.txt';
var b = await readPathInFile(a); // call_a
var c = await readPathInFile(b); // call_b
var d = await readPathInFile(c); // call_c
As we know, readPathInFile is defined as asynchronous, but the calls call_a, call_b, and call_c rely on the previous ones, sequentially.
For these calls, there is no difference from synchronous calls.
So, what is the benefit of the asynchronous definition?
The callbacks concept has the same concern. For example:
function readPathInFile(filePath, callback) {
  // open filePath, read the content, and hand the path it contains to the callback
  callback(pathReadFromFile)
}
var a = 'a.txt';
readPathInFile(a, (b) => {
  // call_a
  readPathInFile(b, (c) => {
    // call_b
    readPathInFile(c, (d) => {
      // call_c
    });
  });
});
call_z();
// Only if call_z does not rely on a, b, or c can it finish without waiting for them. This seems to be the only benefit.
Please correct me if my assumption is wrong.
As the term implies, asynchronous means that functions can run out of order; in this respect your understanding is correct. What this implies, however, is not at all obvious.
To answer your question in a few words: the async/await keywords are syntactic sugar, meaning they are not so much intrinsic qualities of the language as shortcuts for saying something else. Specifically, the await keyword translates the expression into the equivalent Promise syntax.
function someFunction() {
  return new Promise(function (resolve) {
    resolve('Hello World');
  });
}

console.log('Step 1');
console.log(await someFunction());
console.log('Step 3');
is the same as saying
...
console.log('Step 1');
someFunction().then(function (res) {
  console.log(res);
  console.log('Step 3');
});
Note the 3rd console.log is inside the then callback.
When you redefine someFunction using the async syntax, you'd get the following:
async function someFunction() {
  return 'Hello World';
}
The two are equivalent.
To answer your question:
I'd like to reiterate that async/await is syntactic sugar; its effect can be replicated by existing functionality.
Its purpose, therefore, is to combine the worlds of asynchronous and synchronous code. If you've ever seen an older, larger JS file, you'll quickly notice how many callbacks are used throughout the script. await is the mechanism for using asynchronous functions while avoiding callback hell, but fundamentally a function marked async will always be asynchronous no matter how many awaits it contains.
If you're struggling to understand how an await call works, just remember that the callback you pass to .then(...) can be called at any time. This is the essence of asynchronicity. await just adds a clean way to use it.
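The same desugaring applies to error handling; a small sketch, reusing someFunction from above:
// Sketch: a try/catch around await corresponds to .catch on the promise.
async function withTryCatch() {
  try {
    const res = await someFunction();
    console.log(res);
  } catch (err) {
    console.error('failed:', err);
  }
}

// ...is roughly equivalent to:
function withCatch() {
  return someFunction()
    .then(res => console.log(res))
    .catch(err => console.error('failed:', err));
}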
A visual example
I like analogies. Here's my favourite;
Picture yourself at a cafe. There are 3 people waiting in line. The barista takes your order and informs you that your coffee will be ready in 5 minutes. This means you can sit down at a table and read a magazine or whatever, instead of waiting in line for the barista to make your coffee and accept payment.
The advantage of that is that the people behind you in the queue can get their own things done, instead of waiting in line for the coffees to be ready.
This is fundamentally how asynchronous code works: the barista can only do one thing at a time, but their time can be optimised by allowing everyone in the queue to do other things while they wait.
Hope that helps
This article (linked below in the comments) has an awesome illustration: Here it is
What is the benefit of the asynchronous definition ?
To enable standard logical-flow code to run asynchronously. Contrast your examples:
var a = 'a.txt';
var b = await readPathInFile(a); // call_a
var c = await readPathInFile(b); // call_b
var d = await readPathInFile(c); // call_c
vs.
var a = 'a.txt';
readPathInFile(a, (b) => {
  // call_a
  readPathInFile(b, (c) => {
    // call_b
    readPathInFile(c, (d) => {
      // call_c
    });
  });
});
The latter is often called "callback hell." It's much harder to reason about even in that case, but as soon as you add branching and such, it gets extremely hard to reason about. In contrast, branching with async functions and await is just like branching in synchronous code.
Now, you could write readPathInFileSync (a synchronous version of readPathInFile), which would not be an async function and would let you do this:
var a = 'a.txt';
var b = readPathInFileSync(a); // call_a
var c = readPathInFileSync(b); // call_b
var d = readPathInFileSync(c); // call_c
So what's the difference between that and the async version at the beginning of the answer? The async version lets other things happen on the one main JavaScript thread while waiting for those I/O calls to complete. The synchronous version doesn't: it blocks the main thread, waiting for I/O to complete, preventing anything else from being handled.
Here's an example, using setTimeout to stand in for I/O:
const logElement = document.getElementById("log");

function log(msg) {
  const entry = document.createElement("pre");
  entry.textContent = Date.now() + ": " + msg;
  logElement.appendChild(entry);
  entry.scrollIntoView();
}

// Set up "other things" to happen
const things = [
  "this", "that", "the other", "something else", "yet another thing", "yada yada"
];
// We need this flag so we can show the "Doing X work" message before we
// actually start doing it, without giving the false impression in rare
// cases that other things happened during sync work
let starting = false;
otherThing();
function otherThing() {
  if (!starting) {
    log(things[Math.floor(Math.random() * things.length)]);
  }
  setTimeout(otherThing, 250);
}

// Stand-in for a synchronous task that takes n milliseconds
function doSomethingSync(n) {
  const done = Date.now() + n;
  while (Date.now() < done) {
    // Busy-wait. NEVER DO THIS IN REAL CODE,
    // THIS IS SIMULATING BLOCKING I/O.
  }
}

// Stand-in for an asynchronous task that takes n milliseconds
function doSomething(n) {
  return new Promise(resolve => {
    setTimeout(resolve, n);
  });
}

function doWorkSync() {
  doSomethingSync(200);
  doSomethingSync(150);
  doSomethingSync(50);
  doSomethingSync(400);
}

async function doWorkAsync() {
  await doSomething(200);
  await doSomething(150);
  await doSomething(50);
  await doSomething(400);
}

// Do the work synchronously
document.getElementById("run-sync").addEventListener("click", (e) => {
  const btn = e.currentTarget;
  btn.disabled = true;
  log(">>>> Doing sync work");
  starting = true;
  setTimeout(() => {
    starting = false;
    doWorkSync();
    log("<<<< Done with sync work");
    btn.disabled = false;
  }, 50);
});

// Do the work asynchronously
document.getElementById("run-async").addEventListener("click", (e) => {
  const btn = e.currentTarget;
  btn.disabled = true;
  log(">>>> Doing async work");
  starting = true;
  setTimeout(() => {
    starting = false;
    doWorkAsync().then(() => {
      log("<<<< Done with async work");
      btn.disabled = false;
    });
  }, 50);
});
html {
  font-family: sans-serif;
}
#log {
  max-height: 5em;
  min-height: 5em;
  overflow: auto;
  border: 1px solid grey;
}
#log * {
  margin: 0;
  padding: 0;
}
<div>Things happening:</div>
<div id="log"></div>
<input type="button" id="run-sync" value="Run Sync">
<input type="button" id="run-async" value="Run Async">
There are times you don't care about that blocking, which is what the xyzSync functions in the Node.js API are for. For instance, if you're writing a shell script and you just need to do things in order, it's perfectly reasonable to write synchronous code to do that.
In contrast, if you're running a Node.js-based server of any kind, it's crucial not to allow the one main JavaScript thread to be blocked on I/O, unable to handle any other requests that are coming in.
So, in summary:
It's fine to write synchronous I/O code if you don't care about the JavaScript thread doing other things (for instance, a shell script).
If you do care about blocking the JavaScript thread, writing code with async/await makes it much easier (vs. callback functions) to avoid doing that with simple, straightforward code that follows the logic of your operation rather than its temporality.
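As a concrete Node.js illustration of that summary, a minimal sketch contrasting the fs module's two styles (assuming an a.txt exists alongside the script):
const fs = require('fs');

// Blocking: nothing else on the main JavaScript thread runs while the
// file is read. Fine for a one-off shell script, bad for a server.
const dataSync = fs.readFileSync('a.txt', 'utf8');
console.log(dataSync.length);

// Non-blocking: the thread is free to handle other events while the
// read is in flight.
fs.promises.readFile('a.txt', 'utf8')
  .then(data => console.log(data.length));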
JavaScript promises let asynchronous functions return values like synchronous methods: instead of immediately returning the final value, the asynchronous function returns a promise to provide the data in the future.
Promises also solve the callback hell issue.
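A quick sketch of that flattening, reusing the question's readPathInFile (assumed here to return a promise, as in the first example):
// Nested callbacks...
readPathInFile(a, (b) => {
  readPathInFile(b, (c) => {
    readPathInFile(c, (d) => {
      // use d
    });
  });
});

// ...flatten into a chain once readPathInFile returns a promise:
readPathInFile(a)
  .then((b) => readPathInFile(b))
  .then((c) => readPathInFile(c))
  .then((d) => { /* use d */ });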

How do I use Javascript Async-Await as an alternative to polling for a statechange?

I'd like to accomplish the following using promises: only execute further once the state of something is ready. I.e. like polling for an external state-change.
I've tried using promises and async-await but am not getting the desired outcome. What am I doing wrong here, and how do I fix it?
The MDN docs have something similar, but their setTimeout is called within the promise; that's not exactly what I'm looking for, though.
I expect the console.log to show "This function is now good to go!" after 5 seconds, but instead execution seems to stop after calling await promiseForState();
var state = false;

function stateReady() {
  state = true;
}

function promiseForState() {
  var msg = "good to go!";
  var promise = new Promise(function (resolve, reject) {
    if (state) {
      resolve(msg);
    }
  });
  return promise;
}

async function waiting(intro) {
  var result = await promiseForState();
  console.log(intro + result)
}

setTimeout(stateReady, 5000);
waiting("This function is now ");
What you're doing wrong: the promise constructor's executor function executes immediately when the promise is created, and then never again. At that point, state is false, so nothing happens.
Promises (and async/await) are not a replacement for polling. You still need to poll somewhere.
The good news: async functions make it easy to do conditional code with loops and promises.
But don't put code inside promise constructor executor functions, because of their poor error handling characteristics. They are meant to wrap legacy code.
Instead, try this:
var state = false;

function stateReady() {
  state = true;
}

const wait = ms => new Promise(resolve => setTimeout(resolve, ms));

async function promiseForState() {
  while (!state) {
    await wait(1000);
  }
  return "good to go!";
}

async function waiting(intro) {
  var result = await promiseForState();
  console.log(intro + result)
}

setTimeout(stateReady, 5000);
waiting("This function is now ");
Based on your comments that you are waiting for messages from a server it appears you are trying to solve an X/Y problem. I am therefore going to answer the question of "how do I wait for server messages" instead of waiting for global variable to change.
If your network API accepts a callback
Plenty of networking APIs, such as XMLHttpRequest and node's Http.request(), are callback-based. If the API you are using is callback- or event-based then you can do something like this:
function myFunctionToFetchFromServer() {
  // example is jQuery's ajax but it can easily be replaced with other API
  return new Promise(function (resolve, reject) {
    $.ajax('http://some.server/somewhere', {
      success: resolve,
      error: reject
    });
  });
}

async function waiting(intro) {
  var result = await myFunctionToFetchFromServer();
  console.log(intro + result);
}
If your network API is promise based
If on the other hand you are using a more modern promise-based networking API such as fetch(), you can simply await the promise:
function myFunctionToFetchFromServer() {
  return fetch('http://some.server/somewhere');
}

async function waiting(intro) {
  var result = await myFunctionToFetchFromServer();
  console.log(intro + result);
}
Decoupling network access from your event handler
Note that the following is only my opinion, but it is also standard practice in the JavaScript community:
In either case above, once you have a promise it is possible to decouple your network API from your waiting() event handler. You just need to save the promise somewhere else. Evert's answer shows one way you can do this.
However, in my not-so-humble opinion, you should not do this. In projects of significant size it leads to difficulty in tracing the source of a state change. This is what we did in the 90s and early 2000s with JavaScript: we had lots of events in our code like onChange, onReady, or onData instead of callbacks passed as function parameters, and the result was that it sometimes took a long time to figure out what code was triggering what event.
Callback parameters and promises force the event generator to be in the same place in the code as the event consumer:
let this_variable_consumes_result_of_a_promise = await generate_a_promise();
this_function_generate_async_event((consume_async_result) => { /* ... */ });
From the wording of your question you seem to want to do this instead:
..somewhere in your code:
this_function_generate_async_event(() => { set_global_state() });
..somewhere else in your code:
let this_variable_consumes_result_of_a_promise = await global_state();
I would consider this an anti-pattern.
Calling asynchronous functions in class constructors
This is not only an anti-pattern but an impossibility (as you've no doubt discovered when you found that you cannot return the asynchronous result).
There are however design patterns that can work around this. The following is an example of exposing a database connection that is created asynchronously:
class MyClass {
  constructor() {
    // constructor logic
  }

  db() {
    if (this.connection) {
      return Promise.resolve(this.connection);
    }
    else {
      // arrow functions keep `this` bound to the instance here
      return new Promise((resolve, reject) => {
        createDbConnection((error, conn) => {
          if (error) {
            reject(error);
          }
          else {
            this.connection = conn; // cache the connection
            resolve(this.connection);
          }
        });
      });
    }
  }
}
Usage:
const myObj = new MyClass();

async function waiting(intro) {
  const db = await myObj.db();
  db.doSomething(); // you can now use the database connection.
}
You can read more about asynchronous constructors from my answer to this other question: Async/Await Class Constructor
The way I would solve this is as follows. I am not 100% certain it solves your problem, but the assumption here is that you have control over stateReady().
let state = false;
let stateResolver;

const statePromise = new Promise((res, rej) => {
  stateResolver = res;
});

function stateReady() {
  state = true;
  stateResolver();
}

async function promiseForState() {
  await statePromise;
  const msg = "good to go!";
  return msg;
}

async function waiting(intro) {
  const result = await promiseForState();
  console.log(intro + result)
}

setTimeout(stateReady, 5000);
waiting("This function is now ");
Some key points:
The way this is written currently, the 'state' can only transition to true once. If you want to allow it to fire many times, some of those const will need to be let and the promise needs to be re-created (see the sketch after this list).
I created the promise once, globally and always return the same one because it's really just one event that every caller subscribes to.
I needed a stateResolver variable to lift the res argument out of the promise constructor into the global scope.
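If you do need the state to fire more than once (point 1 above), a sketch of what re-creating the promise might look like; armStatePromise is a helper introduced here for illustration:
// Sketch of a deferred that is re-armed after each firing.
let stateResolver;

function armStatePromise() {
  return new Promise((res) => { stateResolver = res; });
}

let statePromise = armStatePromise();

function stateReady() {
  stateResolver();                  // release everyone awaiting the promise
  statePromise = armStatePromise(); // re-arm for the next transition
}

async function promiseForState() {
  await statePromise;
  return "good to go!";
}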
Here is an alternative using requestAnimationFrame() (browser-only).
It provides a clean interface that is simple to understand.
var serverStuffComplete = false;

// mock the server delay of 5 seconds
setTimeout(() => serverStuffComplete = true, 5000);

// continue until serverStuffComplete is true
function waitForServer(now) {
  if (serverStuffComplete) {
    doSomethingElse();
  } else {
    // check again on the next frame
    requestAnimationFrame(waitForServer);
  }
}

console.log("Waiting for server...");

// starts the process off
requestAnimationFrame(waitForServer);

// resolve the promise or whatever
function doSomethingElse() {
  console.log('Done baby!');
}

Chaining a lot of (>40,000) native promises together eats up too much memory

I have a project where I need to write a function that calculates several things sequentially and then writes the results to a SQL DB. Unfortunately, I need to repeat this over 40,000 times.
I use Node.js and promises to accomplish this, but the memory usage goes to almost 2 GB and a lot of the time the program just dies after 10,000 or so calculations. I developed my own promiseEach function which takes an array's items sequentially and chains them with promises.
What am I doing wrong here?
function promiseEach(array, promiseFn) {
  return new Promise(function (resolve, reject) {
    try {
      var promiseArray = [];
      var json = { array: array, counter: 0 };
      for (var i = 0; i < array.length; i++) {
        promiseArray.push(promiseFn);
      }
      promiseArray.reduce(function (preFn, curFn, index, pArray) {
        return preFn
          .then(function (z) { return json.array[json.counter++]; })
          .then(curFn);
      },
      Promise.resolve(json.array[json.counter]))
        .then(resolve, reject);
    } catch (err) {
      console.log("promiseEach ERROR:");
      reject(err);
    }
  });
}
Well, you're overcomplicating the function for a start. From what I can tell, you're calling promiseFn for each of the contents of array, in sequence.
The following code is identical in function to your code:
function promiseEach(array, promiseFn) {
  return array.reduce(function (prev, item) {
    return prev.then(function () {
      return promiseFn(item);
    });
  }, Promise.resolve());
}
No guarantee that this will fix the actual issue, though.
For completeness, as you are coding in Node.js, here is the same code written in ES2015+:
let promiseEach = (array, promiseFn) =>
  array.reduce((prev, item) =>
    prev.then(() => promiseFn(item)),
  Promise.resolve());
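Usage might look like this; saveToDb is a hypothetical stand-in for whatever promise-returning work you do per item:
// Hypothetical per-item work: resolve after a short delay.
const saveToDb = item =>
  new Promise(resolve => setTimeout(() => {
    console.log('saved', item);
    resolve(item);
  }, 100));

promiseEach([1, 2, 3], saveToDb)
  .then(() => console.log('all done'))
  .catch(err => console.error('failed:', err));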
Two lines in the posted code,
.then( function(z){ return json.array[json.counter++] })
.then(curFn)
seem to indicate that promiseFn is to be called with a new parameter after the operation performed by a previous call to it completes, and that the fulfilled value of the previous call is not made use of in the promise chain.
To avoid creating (39,999 or so) intermediate chained promises before returning from the each function, one suggestion is to use the intermediate promises returned by promiseFn to call a promise factory function when they become fulfilled, and to return a final promise which is only fulfilled by the last intermediate promise.
The concept code which follows contains no call to reduce, because no reduce operation is performed:
function promiseEach(array, promiseFn) {
  var resolve, reject;
  var counter = 0;
  var final = new Promise(function (r, j) { resolve = r; reject = j; });

  // Call promiseFn on the next item each time the previous call settles,
  // resolving the final promise after the last item
  function nextPromise() {
    promiseFn(array[counter++])
      .then((counter < array.length ? nextPromise : resolve), reject);
  }

  if (array.length) {
    nextPromise();
  }
  else {
    reject(new Error("promiseEach called on empty array"));
  }
  return final;
}
Added Notes:
The promiseEach function above was tested in Node.js using both a synchronous and an asynchronous promiseFn, with no concurrent updates to the data array.
Synchronous Test
function promiseSynch(z) {
  return new Promise(function (resolve, reject) { resolve(z); });
}
was able to process an array with 1 million (1,000,000) entries in about 10 seconds on a Win7 i86 notebook with 1 GB memory and a browser, editor and task manager open at the same time. Memory usage was flat at around 80% of available memory.
Asynchronous Test
function promiseAsynch(z) {
  var resolve;
  function delayResolve() {
    resolve(z);
  }
  return new Promise((r, j) => {
    resolve = r;
    setTimeout(delayResolve, 1);
  });
}
managed an array of 100,000 entries in a little over 20 minutes on the same notebook, again with flat memory usage of around 80%.
Conclusion
The testing suggests that promises are not causing memory leaks simply because they are being used, and that they should be capable of sequentially passing lengthy array data into a promise-returning function.
This implies there is some other cause of the memory allocation failures, or mysterious processing outages, which needs to be found before the problem can be fully solved. As a suggestion, you might start with a search for memory-leak issues related to the DB library being used.

Promise not calling "then" callback

C#/Dart's async/await feature has spoiled me a little. I noticed that ES7 has a proposal for similar syntax, and that there is a library that adds that feature to Node.js apps.
This library doesn't work in the browser. I thought that trying to write my own mini-solution might help me see why, so I decided to try it out, just to educate myself. This is my attempt so far, on a GitHub Gist. I've included snippets below.
In the await function:
function await(promise) {
  /* PromiseState declared here */
  var self = {
    result: null,
    state: PromiseState.pending
  };

  function onPromiseUpdate(context, newState) {
    return function (value) {
      console.log("Promise Updated!");
      context.result = value;
      context.state = newState;
    }
  }

  console.log("awaiting");
  // this never shows the Promise to be pending (using example below)
  console.log(promise);
  promise
    .then(onPromiseUpdate(self, PromiseState.resolved)) // this is never called
    .catch(onPromiseUpdate(self, PromiseState.rejected));

  // Shouldn't this not block the app if it's inside a Promise?
  while (self.state == PromiseState.pending) ;

  console.log("delegating");
  /* just returning the value here */
}
Example:
// is there another way to pass 'await' without a parameter?
unstableFunc = async(function (await) {
console.log("running unstable");
if(Math.random() > 0.5) return Math.random() * 15 + 5;
else throw "fail";
});
expensiveFunc = async(function (await, x, y) {
result = await(unstableFunc())
for (var i = y * 8; i >= 0; i--) {
result *= i ** y / x;
console.log(result);
}
return result;
});
window.addEventListener('load', function () {
console.log("about to do something expensive");
// 'expensiveFunc' returns a Promise. Why does this block the webpage?
expensiveFunc(10, 2).then(function (val) {
console.log("Result: " + val.toString());
});
console.log("called expensive function");
});
When running this, the browser doesn't finish loading. It's to do with the loop I set up to check for the state of a Promise being resolved, but that's not the focus of my question. What I'm wondering is why neither the then nor the catch callbacks are being called. When logging, the console never logs a pending Promise, and I always thought that then and catch execute their callbacks immediately if the future is not pending. Why isn't that so in this case?
The moment this line of code is reached/executed:
while (self.state == PromiseState.pending) ;
your script becomes blocked forever (or until the tab crashes or the browser kills it). While that loop is running, callbacks cannot run (nor can anything else), and therefore your promise's state will never change from pending, causing an infinite loop. Having promises involved doesn't change any of the above.
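The only non-blocking way out is to let the consumer itself be asynchronous, which is what native async/await (the ES7 proposal mentioned in the question) does. A minimal sketch of the example rewritten that way, with unstableFunc returning a promise instead of taking an injected await:
// unstableFunc is now a plain async function (it always returns a promise).
async function unstableFunc() {
  console.log("running unstable");
  if (Math.random() > 0.5) return Math.random() * 15 + 5;
  else throw "fail";
}

async function expensiveFunc(x, y) {
  let result = await unstableFunc(); // suspends here without blocking the page
  for (let i = y * 8; i >= 0; i--) {
    result *= i ** y / x;
  }
  return result;
}

expensiveFunc(10, 2)
  .then(val => console.log("Result: " + val))
  .catch(err => console.log("Failed: " + err));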

JavaScript, Node.js: is Array.forEach asynchronous?

I have a question regarding the native Array.forEach implementation of JavaScript: Does it behave asynchronously?
For example, if I call:
[many many elements].forEach(function () {lots of work to do})
Will this be non-blocking?
No, it is blocking. Have a look at the specification of the algorithm.
However, a perhaps easier-to-understand implementation is given on MDN:
if (!Array.prototype.forEach)
{
  Array.prototype.forEach = function(fun /*, thisp */)
  {
    "use strict";

    if (this === void 0 || this === null)
      throw new TypeError();

    var t = Object(this);
    var len = t.length >>> 0;
    if (typeof fun !== "function")
      throw new TypeError();

    var thisp = arguments[1];
    for (var i = 0; i < len; i++)
    {
      if (i in t)
        fun.call(thisp, t[i], i, t);
    }
  };
}
If you have to execute a lot of code for each element, you should consider using a different approach:
function processArray(items, process) {
  var todo = items.concat();
  // Named function expression replaces the deprecated arguments.callee,
  // so this also works in strict mode
  setTimeout(function step() {
    process(todo.shift());
    if (todo.length > 0) {
      setTimeout(step, 25);
    }
  }, 25);
}
and then call it with:
processArray([many many elements], function () {lots of work to do});
This would then be non-blocking. The example is taken from High Performance JavaScript.
Another option might be web workers.
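For comparison, a more modern sketch of the same chunking idea using async/await; wait is a small helper defined here, not part of any library:
// Process one item per timer tick so the event loop stays responsive.
const wait = ms => new Promise(resolve => setTimeout(resolve, ms));

async function processArrayAsync(items, process) {
  for (const item of items) {
    process(item);
    await wait(25); // yield back to the event loop between items
  }
}

processArrayAsync([1, 2, 3], x => console.log('processed', x));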
If you need an asynchronous-friendly version of Array.forEach and similar, they're available in the Node.js 'async' module: http://github.com/caolan/async ...as a bonus this module also works in the browser.
async.each(openFiles, saveFile, function (err) {
  // if any of the saves produced an error, err would equal that error
});
There is a common pattern for doing a really heavy computation in Node that may be applicable to you...
Node is single-threaded (as a deliberate design choice, see What is Node.js?); this means that it can only utilize a single core. Modern boxes have 8, 16, or even more cores, so this could leave 90+% of the machine idle. The common pattern for a REST service is to fire up one node process per core, and put these behind a local load balancer like http://nginx.org/.
Forking a child -
For what you are trying to do, there is another common pattern, forking off a child process to do the heavy lifting. The upside is that the child process can do heavy computation in the background while your parent process is responsive to other events. The catch is that you can't / shouldn't share memory with this child process (not without a LOT of contortions and some native code); you have to pass messages. This will work beautifully if the size of your input and output data is small compared to the computation that must be performed. You can even fire up a child node.js process and use the same code you were using previously.
For example:
var child_process = require('child_process');

function run_in_child(array, cb) {
  // renamed from `process` to avoid shadowing the global process object
  var child = child_process.exec('node libfn.js', function (err, stdout, stderr) {
    var output = JSON.parse(stdout);
    cb(err, output);
  });
  child.stdin.write(JSON.stringify(array), 'utf8');
  child.stdin.end();
}
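The answer doesn't show libfn.js; a minimal sketch of what the child side might look like, under the assumption that it reads a JSON array from stdin and writes a JSON result to stdout:
// libfn.js (sketch): read the array from stdin, do the heavy work,
// write the result to stdout for the parent to parse.
let input = '';
process.stdin.setEncoding('utf8');
process.stdin.on('data', chunk => { input += chunk; });
process.stdin.on('end', () => {
  const array = JSON.parse(input);
  const output = array.map(x => x * 2); // stand-in for the heavy computation
  process.stdout.write(JSON.stringify(output));
});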
Array.forEach is meant for computing stuff, not waiting, and there is nothing to be gained by making computations asynchronous in an event loop (web workers add multiprocessing, if you need multi-core computation). If you want to wait for multiple tasks to end, use a counter, which you can wrap in a semaphore class.
Edit 2018-10-11:
It looks like there is a good chance the standard described below may not go through; consider pipelining as an alternative (it does not behave exactly the same, but the methods could be implemented in a similar manner).
This is exactly why I am excited about ES7. In the future you will be able to do something like the code below (some of the specs are not complete, so use with caution; I will try to keep this up to date). Basically, using the new :: bind operator, you will be able to run a method on an object as if the object's prototype contained the method, e.g. [Object]::[Method] where normally you would call [Object].[ObjectsMethod].
Note: to do this today (24-July-16) and have it work in all browsers, you will need to transpile your code for the following functionality: import/export, arrow functions, promises, async/await and, most importantly, function bind. The code below could be modified to use only function bind if necessary; all this functionality is neatly available today by using Babel.
YourCode.js (where 'lots of work to do' must simply return a promise, resolving it when the asynchronous work is done.)
import { asyncForEach } from './ArrayExtensions.js';
await [many many elements]::asyncForEach(() => lots of work to do);
ArrayExtensions.js
export function asyncForEach(callback) {
  return Promise.resolve(this).then(async (ar) => {
    for (let i = 0; i < ar.length; i++) {
      await callback.call(ar, ar[i], i, ar);
    }
  });
}

export function asyncMap(callback) {
  return Promise.resolve(this).then(async (ar) => {
    const out = [];
    for (let i = 0; i < ar.length; i++) {
      out[i] = await callback.call(ar, ar[i], i, ar);
    }
    return out;
  });
}
This is a short asynchronous function to use without requiring third-party libs:
Array.prototype.each = function (iterator, callback) {
  var iterate = function () {
      pointer++;
      if (pointer >= this.length) {
        callback();
        return;
      }
      iterator.call(iterator, this[pointer], iterate, pointer);
    }.bind(this),
    pointer = -1;
  iterate(this);
};
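Usage might look like this; note the iterator receives the current item, a next callback to advance, and the index:
// Log each item asynchronously, advancing only when next() is called.
[1, 2, 3].each(function (item, next, index) {
  setTimeout(function () {
    console.log(index, item);
    next();
  }, 100);
}, function () {
  console.log('all items processed');
});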
There is a package on npm for easy asynchronous for each loops.
var forEachAsync = require('futures').forEachAsync;

// waits for one request to finish before beginning the next
forEachAsync(['dogs', 'cats', 'octocats'], function (next, element, index, array) {
  getPics(element, next);
  // then after all of the elements have been handled
  // the final callback fires to let you know it's all done
}).then(function () {
  console.log('All requests have finished');
});
There is also another variation, forAllAsync.
It is even possible to code a solution like this, for example:
var loop = function (i, data, callback) {
  if (i < data.length) {
    //TODO("SELECT * FROM stackoverflowUsers;", function(res) {
    //data[i].meta = res;
    console.log(i, data[i].title);
    return loop(i + 1, data, callback);
    //});
  } else {
    return callback(data);
  }
};

loop(0, [{"title": "hello"}, {"title": "world"}], function (data) {
  console.log("DONE\n" + data);
});
On the other hand, it is much slower than a plain for loop.
Otherwise, the excellent async library can do this: https://caolan.github.io/async/docs.html#each
These code snippets will give you a better understanding of how forEach and for...of compare:
/* eslint-disable no-console */
async function forEachTest() {
  console.log('########### Testing forEach ################ ')
  console.log('start of forEachTest func')
  let a = [1, 2, 3]
  await a.forEach(async (v) => {
    console.log('start of forEach: ', v)
    await new Promise(resolve => setTimeout(resolve, v * 1000))
    console.log('end of forEach: ', v)
  })
  console.log('end of forEachTest func')
}
forEachTest()

async function forOfTest() {
  await new Promise(resolve => setTimeout(resolve, 10000)) //just see console in proper way
  console.log('\n\n########### Testing forOf ################ ')
  console.log('start of forOfTest func')
  let a = [1, 2, 3]
  for (const v of a) {
    console.log('start of forOf: ', v)
    await new Promise(resolve => setTimeout(resolve, v * 1000))
    console.log('end of forOf: ', v)
  }
  console.log('end of forOfTest func')
}
forOfTest()
Here is a small example you can run to test it:
[1,2,3,4,5,6,7,8,9].forEach(function (n) {
  var sum = 0;
  console.log('Start for:' + n);
  for (var i = 0; i < (10 - n) * 100000000; i++)
    sum++;
  console.log('Ended for:' + n, sum);
});
It will produce something like the following (if it takes too little or too much time, increase or decrease the number of iterations):
(index):48 Start for:1
(index):52 Ended for:1 900000000
(index):48 Start for:2
(index):52 Ended for:2 800000000
(index):48 Start for:3
(index):52 Ended for:3 700000000
(index):48 Start for:4
(index):52 Ended for:4 600000000
(index):48 Start for:5
(index):52 Ended for:5 500000000
(index):48 Start for:6
(index):52 Ended for:6 400000000
(index):48 Start for:7
(index):52 Ended for:7 300000000
(index):48 Start for:8
(index):52 Ended for:8 200000000
(index):48 Start for:9
(index):52 Ended for:9 100000000
(index):45 [Violation] 'load' handler took 7285ms
Although Array.forEach is not asynchronous, you can still get an asynchronous "end result": the forEach call itself returns immediately, and each async callback then runs on its own. Example below:
function delayFunction(x) {
  return new Promise(
    (resolve) => setTimeout(() => resolve(x), 1000)
  );
}

[1, 2, 3].forEach(async (x) => {
  console.log(x);
  console.log(await delayFunction(x));
});
Use Promise.each from the bluebird library.
Promise.each(
  Iterable<any>|Promise<Iterable<any>> input,
  function(any item, int index, int length) iterator
) -> Promise
This method iterates over an array, or a promise of an array, containing promises (or a mix of promises and values), calling the given iterator function with the signature (value, index, length), where value is the resolved value of the respective promise in the input array. Iteration happens serially. If the iterator function returns a promise or a thenable, the result of that promise is awaited before continuing with the next iteration. If any promise in the input array is rejected, the returned promise is rejected as well.
If all of the iterations resolve successfully, Promise.each resolves to the original array unmodified. However, if one iteration rejects or errors, Promise.each ceases execution immediately and does not process any further iterations. The error or rejected value is returned in this case instead of the original array.
This method is meant to be used for side effects.
var fileNames = ["1.txt", "2.txt", "3.txt"];

Promise.each(fileNames, function (fileName) {
  return fs.readFileAsync(fileName).then(function (val) {
    // do stuff with 'val' here.
  });
}).then(function () {
  console.log("done");
});
