Newer versions of Node.js have async/await, which is really nice because it makes the code look cleaner.
I was wondering if it's a good idea to make every class method async, even if it doesn't need to return a promise?
In my use case I actually sort of need that because I'm trying to share dependencies between multiple child processes, and I'm using a combination of Proxy and child process communication to achieve this. And obviously I need promises because I need to wait for the processes to respond or send messages.
But does this have any potential side effects, maybe in the long term?
To be more clear, I want to do this solely for the benefit of cool syntax.
const database = CreateProxy('database');
await database.something();
From another process.
versus some code that just requests something from the parent process like
process.send('getSomethingFromDb');
Under the hood both use messaging, but the first one makes it look like it doesn't on the surface.
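For context, a rough sketch of the kind of proxy I mean (illustrative only: CreateProxy, the message shape and the id counter here are my own simplification, and it assumes the code runs in a child process forked with an IPC channel):

let nextId = 0;

function CreateProxy(target) {
  return new Proxy({}, {
    get(_obj, method) {
      // every property access becomes an async "remote call"
      return (...args) => new Promise((resolve, reject) => {
        const id = ++nextId;
        const onMessage = (msg) => {
          if (msg && msg.id === id) {
            process.removeListener('message', onMessage);
            msg.error ? reject(new Error(msg.error)) : resolve(msg.result);
          }
        };
        process.on('message', onMessage);
        process.send({ id, target, method, args });
      });
    }
  });
}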
This is a case for Occam's razor: there are no benefits, while there may be disadvantages.
Extra overhead (both CPU and time) is one of the problems. Promises take some time and resources to be resolved. This may take much less than a millisecond, yet a lag can accumulate.
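If you want to get a feel for that overhead, here is a quick and dirty comparison (only a sketch; absolute numbers vary a lot between machines and Node versions):

function benchSync(n = 1e6) {
  const fn = () => 1;
  const start = process.hrtime.bigint();
  for (let i = 0; i < n; i++) fn();
  return Number(process.hrtime.bigint() - start) / 1e6; // ms
}

async function benchAsync(n = 1e6) {
  const fn = async () => 1;
  const start = process.hrtime.bigint();
  for (let i = 0; i < n; i++) await fn();
  return Number(process.hrtime.bigint() - start) / 1e6; // ms
}

(async () => {
  console.log('plain calls:', benchSync().toFixed(1), 'ms');
  console.log('async calls:', (await benchAsync()).toFixed(1), 'ms');
})();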
Another problem is that asynchronicity is contagious: once it reaches module scope, an async IIFE has to be used everywhere, because top-level await isn't supported yet:
module.exports = (async () => {
  await require('foo');

  // await was accidentally dropped here;
  // this results in a race condition and incorrect error handling
  require('foo');
  ...
})();
And here's a good example of a disadvantage that complicates error handling:
async function foo() {
throw new Error('foo');
}
async function bar() {
try {
return foo();
} catch (err) {
console.log('caught with bar');
}
}
bar(); // UnhandledPromiseRejectionWarning: Error: foo
Although control flow in an async function looks synchronous, errors are handled differently: foo returns a rejected promise, and a returned promise isn't handled by try..catch in an async function, so the rejection won't be handled in bar.
This wouldn't happen with regular function:
function foo() {
throw new Error('foo');
}
function bar() {
try {
return foo();
} catch (err) {
console.log('caught with bar');
}
}
bar(); // caught with bar
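And if bar really should handle the error, awaiting the promise before returning it restores the expected behaviour. A minimal sketch of that fix:

async function foo() {
  throw new Error('foo');
}

async function bar() {
  try {
    return await foo(); // awaiting keeps the rejection inside this try..catch
  } catch (err) {
    console.log('caught with bar');
  }
}

bar(); // caught with bar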
Things may become more complicated with third-party libraries that were designed to work synchronously.
I was wondering if it's a good idea to make every class method async,
even if it doesn't need to return a promise?
Your code will work, but I would not recommend it, for two reasons:
Unnecessary memory/CPU usage
It will make your code harder to understand. Knowing which functions are asynchronous and which are synchronous is important for understanding how a system works and what it is doing.
The result of calling an async function will always be a Promise, regardless of if the function implements any asynchronous behavior.
const asyncTest = async () => 3;
console.log(asyncTest()); // logs 'Promise {<resolved>: 3}'
Because of that, you always have to be sure to call such a function with await. But that is purely a comfort issue, if even that, for you. Also, creating and resolving Promises adds a little bit of time to each function call, so if performance is critical, you should avoid calling async functions in large numbers where it can be avoided.
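To make the difference concrete, a tiny sketch (assumed to run in Node):

const asyncTest = async () => 3;

(async () => {
  console.log(asyncTest());       // Promise { 3 } - the wrapper, not the value
  console.log(await asyncTest()); // 3
})();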
Related
TLDR: must I use async and await through a complicated call stack if most functions are not actually async? Are there alternative programming patterns?
Context
This question is probably more about design patterns and overall software architecture than a specific syntax issue. I am writing an algorithm in node.js that is fairly involved. The program flow involves an initial async call to fetch some data, and then moves on to a series of synchronous calculation steps based on the data. The calculation steps are iterative, and as they iterate, they generate results. But occasionally, if certain conditions are met by the calculations, more data will need to be fetched. Here is a simplified version in diagram form:
The calcStep loop runs thousands of times synchronously, pushing to the results. But occasionally it kicks back to getData, and the program must wait for more data to come in before continuing on again to the calcStep loop.
In code
A boiled-down version of the above might look like this in JS code:
let data;
async function init() {
data = await getData();
processData();
calcStep1();
}
function calcStep1() {
// do some calculations
calcStep2();
}
function calcStep2() {
// do more calculations
calcStep3();
}
function calcStep3() {
// do more calculations
pushToResults();
if (some_condition) {
getData(); // <------ question is here
}
if (stop_condition) {
finish();
} else {
calcStep1();
}
}
Here pushToResults and finish are also simple synchronous functions. I write the calcStep functions as separate functions here because in the real code they are actually class methods, from classes defined based on separation of concerns.
The problem
The obvious problem arises that if some_condition is met, and I need to wait to get more data before continuing the calcStep loop, I need to use the await keyword before calling getData in calcStep3, which means calcStep3 must be called async, and we must await that as well in calcStep2, and all the way up the chain, even synchronous functions must be labeled async and awaited.
In this simplified example, it would not be so offensive to do that. But in reality, my algorithm is far more complicated, with a much deeper call stack involving many class methods, iterations, etc. Is there a better way to manage awaiting functions in this type of scenario? Other tools I can use, like generators, or event emitters? I'm open to simple solutions or paradigm shifts.
If you don't want to make the function async and propagate that up the chain, use .then(). You'd need the trailing code (the stop-condition check) both inside the .then() callback and in the non-fetching path; you can avoid duplicating it by putting it in its own function.
function maybeRepeat() {
if (stop_condition) {
finish();
} else {
calcStep1();
}
}
function calcStep3() {
// do more calculations
pushToResults();
if (some_condition) {
getData().then(maybeRepeat);
} else {
maybeRepeat()
}
}
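If restructuring is an option (the question mentions being open to paradigm shifts), here is another sketch: flatten the recursion into a single async driver loop, so only the driver is async and the calcStep functions stay synchronous. This assumes calcStep3 can be changed to report what should happen next instead of calling getData()/calcStep1() itself:

async function run() {
  data = await getData();   // module-level `data` from the question
  processData();

  while (true) {
    calcStep1();
    calcStep2();
    // calcStep3 still calculates and pushes results, but now *returns* flags
    const { needsData, done } = calcStep3();
    if (needsData) data = await getData(); // only the driver ever awaits
    if (done) return finish();
  }
}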
I'm having trouble understanding control flow with asynchronous programming in JS. I come from classic OOP background. eg. C++. Your program starts in the "main" -- top level -- function and it calls other functions to do stuff, but everything always comes back to that main function and it retains overall control. And each sub-function retains control of what they're doing even when they call sub functions. Ultimately the program ends when that main function ends. (That said, that's about as much as I remember of my C++ days so answers with C++ analogies might not be helpful lol).
This makes control flow relatively easy. But I get how that's not designed to handle event driven programming as needed on something like a web server. While Javascript (let's talk node for now, not browser) handles event-driven web servers with callbacks and promises, with relative ease... apparently.
I think I've finally got my head around the idea that with event-driven programming the entry point of the app might do little more than set up a bunch of listeners and then get out of the way (effectively end itself). The listeners pick up all the action and respond.
But sometimes stuff still has to be synchronous, and this is where I keep getting unstuck.
With callbacks, promises, or async/await, we can effectively build synchronous chains of events. eg with Promises:
doSomething()
  .then(result => doSomethingElse(result))
  .then(newResult => doThirdThing(newResult))
  .then(finalResult => {
    console.log(`Got the final result: ${finalResult}`);
  })
  .catch(failureCallback);
Great. I've got a series of tasks I can do in order -- kinda like more traditional synchronous programming.
My question is: sometimes you need to deviate from the chain. Ask some questions and act differently depending on the answers. Perhaps conditionally there's some other function you need to call to get something else you need along the way. You can't continue without it. But what if it's an async function and all it's going to give me back is a promise? How do I get the actual result without the control flow running off and eloping with that function and never coming back?
Example:
I want to call an API in a database, get a record, do something with the data in that record, then write something back to the database. I can't do any of those steps without completing the previous step first. Let's assume there aren't any sync functions that can handle this API. No problem. A Promise chain (like the above) seems like a good solution.
But... Let's say when I call the database the first time, the authorization token I picked up earlier for it has expired and I have to get a new one. I don't know that until I make that first call. I don't want to get (or even test for the need for) a new auth token every time. I just want to be able to respond when a call fails because I need one.
Ok... In synchronous pseudo-code that might look something like this:
let Token = X
Step 1: Call the database(Token). Wait for the response.
Step 2: If response says need new token, then:
Token = syncFunctionThatGetsAndReturnsNewToken().
// here the program waits till that function is done and I've got my token.
Repeat Step 1
End if
Step 3: Do the rest of what I need to do.
But now we need to do it in Javascript/node with only async functions, so we can use a promise (or callback) chain?
let Token = X
CallDatabase(Token)
.then(check if response says we need new token, and if so, get one)
.then(...
Wait a sec. That "if so, get one" is the part that's screwing me. All this asynchronicity in JS/node isn't going to wait around for that. That function is just going to "promise" me a new token sometime in the future. It's an IOU. Great. Can't call the database with an IOU. Well ok, I'd be happy to wait, but node and JS won't let me, because that's blocking.
That's it in a (well, ok, rather large) nutshell. What am I missing? How do I do something like the above with callbacks or Promises?
I'm sure there's a stupid "duh" moment in my near future here, thanks to one or more of you wonderful people. I look forward to it. 😉 Thanks in advance!
What you do with the .then call is to attach a function which will run when the Promise resolves in a future task. The processing of that function is itself synchronous, and can use all the control flows you'd want:
getResponse()
  .then(response => {
    if (response.needsToken)
      return getNewToken().then(getResponse);
  })
  .then(() => {
    // either runs if the token was not expired, or after it was renewed
  })
If the token is expired, instead of the Promise returned by .then being resolved directly, a new asynchronous action gets started to retrieve a new token. Once that asynchronous action is done, it resolves the Promise it returned in a new task, and as that Promise was returned from the .then callback, this also resolves the outer Promise and the Promise chain continues.
Note that these Promise chains can get complicated very quick, and with async functions this can be written more elegantly (though under the hood it is about the same):
let response;
do {
  response = await getResponse();
  if (response.needsToken)
    await renewToken();
} while (response.needsToken);
First of all, I would recommend against using the then and catch methods to listen for Promise results. They tend to create deeply nested code that is hard to read and maintain.
I worked up a prototype for your case which makes use of async/await. It also features a mechanism to keep track of the attempts we make to authenticate to the database. If we reach the maximum number of attempts, it would be viable to send an emergency alert to an administrator, etc., for notification purposes. This avoids an endless loop of trying to authenticate and instead helps you take proper action.
'use strict'

const maxAttempts = 3; // pick whatever limit makes sense
var token;

async function getBooks() {
  // In case you are not using an ORM (Sequelize, TypeORM), I would suggest using
  // at least a query builder like Knex
  const query = generateQuery(options);
  const books = await executeQuery(query);
}

async function executeQuery(query) {
  let attempts = 0;

  if (!token) {
    await getDbAuthToken();
  }

  while (attempts < maxAttempts) {
    try {
      attempts++;
      // call database
      // return result
    }
    catch (err) {
      // token expired
      if (err.code == 401) {
        await getDbAuthToken();
      }
      else {
        // not an auth problem, let the caller deal with it
        throw err;
      }
    }
  }

  throw new Error('Critical error! After several attempts, authentication to db failed. Take immediate steps to fix this');
}
// This can be sync or async depending on the flow
// how the auth token is retrieved
async function getDbAuthToken() {
}
I'm aware of the power of promises, however I have several old functions that are synchronous:
function getSomething() {
return someExternalLibrary.functionReturnsAValue()
}
console.log(getSomething()); // eg prints 'foo'
Unfortunately, when someExternalLibrary was updated, it removed functionReturnsAValue() and lumped me with functionReturnsAPromise():
function getSomething() {
return someExternalLibrary.functionReturnsAPromise()
}
console.log(getSomething()); // now prints '[object]'
This of course, breaks absolutely everything written that depends on what used to be a simple value.
Obviously, I'd prefer two things:
ask the original library to keep a synchronous return value. (Not going to happen -- b/c they have refused)
A way to actually wait for a value
I have read numerous articles on why promises are great, ad nauseam, but the simple fact is: If I embrace promises, all I really do is shuffle promises onto some other part of the code, which then must deal with the promise of a value...
Is there a way (in nodejs) to actually wait for a promise to get itself together?
The best I can find is to use coroutines and yield, but really, it's still passing the buck. To be clear, I want the function getSomething to continue to return a value. Is there a way to do it?
Clearly, I fear I've misunderstood something about Promises...
The app is for non-browser implementations and runs purely from the command line. I've been trying to understand how bluebird's reflect() might help, to no avail.
(Yes, I'm aware this question has been asked many times in various formats, but I can't find a suitable answer to the core issue. If anything, I'm looking for the opposite of this question. The closest related (but unhelpful) question I can find is: Managing promise dependencies.)
There's the concept of generator functions. These are a special kind of function, in both syntax (the asterisk notation) and semantics. Unlike regular functions, generator functions return something that's also new to ECMAScript: iterators. Iterators are objects made specifically to be iterated on, e.g. with the new for...of loop. They can also be iterated on manually by calling their next method. Each such call produces an object containing two properties: value (the iterator's current value) and done (a boolean indicating whether we've reached the last value of the iterable).
However, the best thing about generator functions is their ability to suspend their execution each time the yield keyword is encountered. Let's have a glimpse of how it all works together:
'use strict';
let asyncTask = () =>
new Promise((resolve, reject) => {
if (Math.random() > 0.5) {
resolve(1);
} else {
reject(new Error('Something went wrong'));
}
});
let makeMeLookSync = fn => {
let iterator = fn();
let loop = result => {
!result.done && result.value.then(
res => loop(iterator.next(res)),
err => loop(iterator.throw(err))
);
};
loop(iterator.next());
};
makeMeLookSync(function* () {
try {
let result = yield asyncTask();
console.log(result);
} catch (err) {
console.log(err.message);
}
});
The short answer
I am told repeatedly: You can't undo functions that have been promisified.
Edit: An upcoming solution
It appears that ES2017 (although still a draft) goes a long way in making promisified code easier to work with:
https://ponyfoo.com/articles/understanding-javascript-async-await
It seems that there is also a node library ready for this support too: https://github.com/normalize/mz.
Using this methodology, having APIs converted to Promises won't be so bad (although it appears that promises still poison the rest of the codebase):
const fs = require('mz/fs')
async function doSomething () {
if (await fs.exists(__filename)) // do something
}
The rest of this answer is just a general commentary on the problem.
Why we need a solution
Let's start with a sample piece of traditional synchronous code, in 3 flavours from more 'older-fashioned' to 'newer':
This is the traditional javascript way, requiring exception based programming to handle unexpected errors:
function getSomething() {
if (someproblem) throw new Error('There is a problem');
return 'foo';
}
However, adding try/catch statements becomes very laborious and tedious, very quickly.
With the advent of node.js, callbacks were made popular, which nicely circumvented the issue, since each caller was explicitly forced to deal with error conditions in the same callback. This meant fewer errors in the caller's code:
function getSomething(callback) {
if (callback) {
if (someproblem)
callback(new Error('There is a problem'), null);
else
callback(null, 'foo');
}
return 'foo';
}
Then, after some teething issues, node.js quickly proved itself for server-side communications, and people were amazed at the speed that asynchronous solutions provided. Node application frameworks like Express and Meteor, which focused on this, grew.
Unfortunately, using the same callback scheme quickly became troublesome and the developers dealing in asynchronous code started using Promises in an effort to linearize the code, to make it readable, like the traditional (try/catch) code was.
The problem is that it got evangelized too much. Everyone started thinking that Promises are the way to go. Personally, I call it a poison on a codebase. Once you have anything that uses Promises, your whole codebase must become asynchronous. This is not always a sensible nor a practical solution, IMHO.
The worst of all side effects is that the above function, even though it is completely synchronous, can be written in Promises too:
var bluebird = require('bluebird');
function getSomething() {
// IMHO, this is ridiculous code, but is increasingly popular.
if (someproblem) return Promise.reject(new Error('There is a problem'));
return Promise.resolve('foo');
}
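With the async/await syntax mentioned earlier, the same synchronous check reads a little more naturally, though callers still get a promise back, so the "poisoning" doesn't go away (just a sketch):

async function getSomething() {
  // still completely synchronous logic, but callers receive a Promise
  if (someproblem) throw new Error('There is a problem');
  return 'foo';
}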
Those who doubt this is a problem should perhaps look at the SO question: How do I convert an existing callback API to promises?. Pay particular attention to #3, Node-style callback.
So, for anyone who cares, I would like to suggest that there needs to be a 'pill' for Promises. I urge that we need more than promises: we need results, and sometimes in a timely manner.
Take a look at the default node.js api. It does not use Promises. It also provides both synchronous and asynchronous calls to appropriate parts of the api (eg File System).
For those of you who feel tempted to downvote this answer: that is your prerogative, but there are clear issues on when Promises are not the answer, and I feel strongly that there are cases when we need to be able to re-synchronize decoupled code.
I also apologize for this 'blog-post' styled answer.
As cowboy says down in the comments here, we all want to "write [non-blocking JavaScript] asynchronous code in a style similar to this:
try
{
var foo = getSomething(); // async call that would normally block
var bar = doSomething(foo);
console.log(bar);
}
catch (error)
{
console.error(error);
}
"
So people have come up with solutions to this problem, like
callback libraries (eg async)
promises
event patterns
streamline
domains and
generators.
But none of these lead to code as simple and easy to understand as the sync-style code above.
So why isn't it possible for JavaScript compilers/interpreters to just NOT block on the statements we currently know as "blocking"? Why isn't it possible for them to handle the sync syntax above AS IF we'd written it in an async style?
For example, upon processing getSomething() above, the compiler/interpreter could just say "this statement is a call to [file system/network resource/...], so I'll make a note to listen to responses from that call and in the meantime get on with whatever's in my event loop". When the call returns, execution can proceed to doSomething().
You would still maintain all of the basic features of popular JavaScript runtime environments:
single threaded
event loop
blocking operations (I/O, network, wait timers) handled "asynchronously"
This would be simply a tweak to the syntax, that would allow the interpreter to pause execution on any given bit of code whenever IT DETECTS an async operation, and instead of needing callbacks, code just continues from the line after the async call when the call returns.
As Jeremy says
there is nothing in the JavaScript runtime that will preemptively
pause the execution of a given task, permit some other code to execute
for a while, and then resume the original task
Why not? (As in, "why couldn't there be?"... I'm not interested in a history lesson)
Why does a developer have to care about whether a statement is blocking or not? Computers are for automating stuff that humans are bad at (eg writing non-blocking code).
You could perhaps implement it with
a statement like "use noblock"; (a bit like "use strict";) to turn this "mode" on for a whole page of code. EDIT: "use noblock"; was a bad choice, and misled some answerers that I was trying to change the nature of common JavaScript runtimes altogether. Something like 'use syncsyntax'; might better describe it.
some kind of parallel(fn, fn, ...); statement allowing you to run things in parallel while in "use syncsyntax"; mode - eg to allow multiple async activities to be kicked off at once
EDIT: a simple sync-style syntax wait(), which would be used instead of setTimeout() in "use syncsyntax"; mode
EDIT:
As an example, instead of writing (standard callback version)
function fnInsertDB(myString, fnNextTask) {
fnDAL('insert into tbl (field) values (' + myString + ');', function(recordID) {
fnNextTask(recordID);
});
}
fnInsertDB('stuff', fnDeleteDB);
You could write
'use syncsyntax';
function fnInsertDB(myString) {
return fnDAL('insert into tbl (field) values (' + myString + ');'); // returns recordID
}
var recordID = fnInsertDB('stuff');
fnDeleteDB(recordID);
The syncsyntax version would process exactly the same way as the standard version, but it's much easier to understand what the programmer intended (as long as you understand that syncsyntax pauses execution on this code as discussed).
So why isn't it possible for JavaScript compilers/interpreters to just NOT block on the statements we currently know as "blocking"?
Because of concurrency control. We want them to block, so that (in JavaScript's single-threaded nature) we are safe from race conditions that alter the state of our function while we still are executing it. We must not have an interpreter that suspends the execution of the current function at any arbitrary statement/expression and resumes with some different part of the program.
Example:
function Bank() {
this.savings = 0;
}
Bank.prototype.transfer = function(howMuch) {
var savings = this.savings;
this.savings = savings + +howMuch(); // we expect `howMuch()` to be blocking
}
Synchronous code:
var bank = new Bank();
setTimeout(function() {
bank.transfer(prompt); // Enter 5
alert(bank.savings); // 5
}, 0);
setTimeout(function() {
bank.transfer(prompt); // Enter 3
alert(bank.savings); // 8
}, 100);
Asynchronous, arbitrarily non-blocking code:
function guiPrompt() {
"use noblock";
// open form
// wait for user input
// close form
return input;
}
var bank = new Bank();
setTimeout(function() {
bank.transfer(guiPrompt); // Enter 5
alert(bank.savings); // 5
}, 0);
setTimeout(function() {
bank.transfer(guiPrompt); // Enter 3
alert(bank.savings); // 3 // WTF?!
}, 100);
See https://glyph.twistedmatrix.com/2014/02/unyielding.html for a longer (and language-agnostic) explanation.
there is nothing in the JavaScript runtime that will preemptively pause the execution of a given task, permit some other code to execute for a while, and then resume the original task
Why not?
For simplicity and security, see above. (And, for the history lesson: That's how it just was done)
However, this is no longer true. With ES6 generators, there is something that lets you explicitly pause execution of the current generator function: the yield keyword.
As the language evolves, there are also async and await keywords planned for ES7.
generators [… don't …] lead to code as simple and easy to understand as the sync code above.
But they do! It's even right in that article:
suspend(function* () {
// ^ "use noblock" - this "function" doesn't run continuously
try {
var foo = yield getSomething();
// ^^^^^ async call that does not block the thread
var bar = doSomething(foo);
console.log(bar);
} catch (error) {
console.error(error);
}
})
There is also a very good article on this subject here: http://howtonode.org/generators-vs-fibers
Why not? No reason, it just hadn't been done.
And here in 2017, it has been done in ES2017: async functions can use await to wait, non-blocking, for the result of a promise. You can write your code like this if getSomething returns a promise (note the await) and if this is inside an async function:
try
{
var foo = await getSomething();
var bar = doSomething(foo);
console.log(bar);
}
catch (error)
{
console.error(error);
}
(I've assumed there that you only intended getSomething to be asynchronous, but they both could be.)
Live Example (requires up-to-date browser like recent Chrome):
function getSomething() {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (Math.random() < 0.5) {
reject(new Error("failed"));
} else {
resolve(Math.floor(Math.random() * 100));
}
}, 200);
});
}
function doSomething(x) {
return x * 2;
}
(async () => {
try
{
var foo = await getSomething();
console.log("foo:", foo);
var bar = doSomething(foo);
console.log("bar:", bar);
}
catch (error)
{
console.error(error);
}
})();
The first promise fails half the time, so click Run repeatedly to see both failure and success.
You've tagged your question with NodeJS. If you wrap the Node API in promises (for instance, with promisify), you can write nice straight-forward synchronous-looking code that runs asynchronously.
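For example (a sketch using util.promisify from Node 8+, not specific to your code):

const { promisify } = require('util');
const fs = require('fs');
const readFile = promisify(fs.readFile); // callback API -> promise API

(async () => {
  try {
    const contents = await readFile(__filename, 'utf8');
    console.log(contents.length, 'characters read');
  } catch (error) {
    console.error(error);
  }
})();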
Because JavaScript interpreters are single-threaded and event-driven. This is how the language was initially developed.
You can't do "use noblock" because no other work can occur during that phase. This means your UI will not update. You cannot respond to mouse or other input event from the user. You cannot redraw the screen. Nothing.
So you want to know why? Because javascript can cause the display to change. If you were able to do both simultaneously you'd have all these horrible race conditions with your code and the display. You might think you've moved something on the screen, but it hasn't drawn, or it drew and you moved it after it drew and now it's gotta draw again, etc. This asynchronous nature allows, for any given event in the execution stack to have a known good state -- nothing is going to modify the data that is being used while this is being executed.
That is not to say what you want doesn't exist, in some form.
The async library allows you to do things like your parallel idea (amongst others).
Generators/async/await will allow you to write code that LOOKS like what you want (although it'll be asynchronous by nature).
Although you are making a false claim here -- humans are NOT bad at writing asynchronous code.
The other answers talked about the problems multi-threading and parallelism introduce. However, I want to address your answer directly.
Why not? (As in, "why couldn't there be?"... I'm not interested in a history lesson)
Absolutely no reason. ECMAScript, the JavaScript specification, says nothing about concurrency; it does not specify the order code runs in, it does not specify an event loop or events at all, and it does not specify anything about blocking or not blocking.
The way concurrency works in JavaScript is defined by its host environment - in the browser for example that's the DOM and the DOM specifies the semantics of the event loop. "async" functions like setTimeout are only the concern of the DOM and not the JavaScript language.
Moreover there is nothing that says JavaScript runtimes have to run single threaded and so on. If you have sequential code the order of execution is specified, but there is nothing stopping anyone from embedding the JavaScript language in a multi threaded environment.
Suppose I have an async function in Node.js, basically something such as:
var addAsync = function (first, second, callback) {
setTimeout(function () {
callback(null, first + second);
}, 1 * 1000);
};
Now of course I can call this function in an asynchronous style:
addAsync(23, 42, function (err, result) {
console.log(result); // => 65
});
What I am wondering about is whether you can make it somehow to call this function synchronously. For that, I'd like to have a wrapper function sync, which basically does the following thing:
var sync = function (fn, params) {
var res,
finished = false;
fn.call(null, params[0], params[1], function (err, result) {
res = result;
finished = true;
});
while (!finished) {}
return res;
};
Then, I'd be able to run addAsync synchronously, by calling it this way:
var sum = sync(addAsync, [23, 42]);
Note: Of course you wouldn't work using params[0] and params[1] in reality, but use the arguments array accordingly, but I wanted to keep things simple in this example.
Now, the problem is, that the above code does not work. It just blocks, as the while loop blocks and does not release the event loop.
My question is: Is it possible in any way to make this sample run as intended?
I have already experimented with setImmediate and process.nextTick and various other things, but none of them helped. Basically, what you'd need is a way to tell Node.js to please pause the current function, continue running the event loop, and get back to it at a later point in time.
I know that you can achieve something similar using yield and generator functions, at least in Node.js 0.11.2 and above. But, I'm curious whether it works even without?
Please note that I am fully aware of how to do asynchronous programming in Node.js, of the event loop and all the related stuff. I am also fully aware that writing code like this is a bad idea, especially in Node.js. And I am also fully aware that an 'active wait' is a stupid idea, as well. So please don't give the advice to learn how to do it asynchronously or something like that. I know that.
The reason why I am asking is just out of curiosity and for the wish to learn.
I've recently created a simpler abstraction, wait.for, to call async functions in sync mode (based on Fibers). It's at an early stage but it works. Please try it: https://github.com/luciotato/waitfor
using WaitFor your code will be:
console.log(wait.for(addAsync, 23, 42));
You can call any standard nodejs async function, as if it were a sync function.
wait.for(fn,args...) fulfills the same need as the "sync" function in your example, but inside a Fiber (without blocking node's event loop)
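A slightly fuller sketch, based on the project's README (the launchFiber/wait.for API may have changed since, so treat this as illustrative):

var wait = require('wait.for');

var addAsync = function (first, second, callback) {
  setTimeout(function () {
    callback(null, first + second);
  }, 1000);
};

// wait.for only works inside a fiber, so the call is wrapped in launchFiber
wait.launchFiber(function () {
  var sum = wait.for(addAsync, 23, 42); // reads synchronously, event loop keeps running
  console.log(sum); // => 65
});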
Blocking:
You can use npm fibers (a C++ add-on project) & node-sync,
or implement a blocking call in C(++) and provide it as a library
(Yes I know - you know - BUT EVER-EVER-EVER ;))
Non-blocking:
Use a control flow library.
Standard promises and yield (generator functions) will make this straightforward:
http://blog.alexmaccaw.com/how-yield-will-transform-node
http://jlongster.com/A-Study-on-Solving-Callbacks-with-JavaScript-Generators
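To give a flavour of what those links describe, here is a rough sketch: a tiny runner drives a generator and resumes it when each yielded promise settles, so the body reads synchronously (libraries like co do this more robustly):

// a minimal runner: resume the generator whenever the yielded promise settles
function run(gen) {
  const it = gen();
  const step = (value) => {
    const { value: promise, done } = it.next(value);
    if (!done) Promise.resolve(promise).then(step, (err) => it.throw(err));
  };
  step();
}

const addAsyncPromise = (a, b) =>
  new Promise((resolve) => setTimeout(() => resolve(a + b), 1000));

run(function* () {
  const sum = yield addAsyncPromise(23, 42);
  console.log(sum); // => 65
});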