How to wrangle Node.JS async - javascript

I am struggling with getting my head around how to overcome and handle the async nature of Node.JS. I have done quite a bit of reading on it and tried to make Node do what I want by either using a message passing solution or callback functions.
My problem is I have an object whose constructor I want to load a file and populate an array. Then I want all calls to this function to use that loaded data. So I need the original call to wait for the file to be loaded, and all subsequent calls to use the already loaded private member.
My issue is that the function to load the data and get the data is being executed async even if it returns a function with a callback.
Anyway, is there something simple I am missing? Or is there an easier pattern I could use here? This function should return part of the loaded file but returns undefined. I have checked that the file is actually being loaded and reads correctly.
function Song() {
    this.verses = undefined;
    this.loadVerses = function(verseNum, callback) {
        if (this.verses === undefined) {
            var fs = require('fs'),
                filename = 'README.md';
            fs.readFile(filename, 'utf8', function(err, data) {
                if (err) {
                    console.log('error throw opening file: %s, err: %s', filename, err);
                    throw err;
                } else {
                    this.verses = data;
                    return callback(verseNum);
                }
            });
        } else {
            return callback(verseNum);
        }
    };
    this.getVerse = function(verseNum) {
        return this.verses[verseNum + 1];
    };
}
Song.prototype = {
    verse: function(input) {
        return this.loadVerses(input, this.getVerse);
    }
};
module.exports = new Song();
Update:
This is how I am using the song module from another module:
var song = require('./song');
return song.verse(1);

"My issue is that the function to load the data and get the data is being executed async even if it return a function with a callback."
#AlbertoZaccagni what I mean by that scentence is that this line return this.loadVerses(input, this.getVerse); returns before the file is loaded when I expect it to wait for the callback.
That is how node works, I will try to clarify it with an example.
var fs = require('fs');

function readFile(path, callback) {
    console.log('about to read...');
    fs.readFile(path, 'utf8', function(err, data) {
        callback();
    });
}

console.log('start');
readFile('/path/to/the/file', function() {
    console.log('...read!');
});
console.log('end');
You are reading a file, and in the console you will likely see:
start
about to read...
end
...read!
You can try that separately to see it in action and tweak it to understand the point. What's important to notice here is that your code keeps running, skipping the execution of the callback, until the file has been read.
Just because you declared a callback does not mean that the execution will halt until the callback is called and then resumed.
So this is how I would change that code:
function Song() {
    var self = this; // keep a reference: `this` inside the readFile callback is NOT the Song instance
    this.verses = undefined;
    this.loadVerses = function(verseNum, callback) {
        if (self.verses === undefined) {
            var fs = require('fs'),
                filename = 'README.md';
            fs.readFile(filename, 'utf8', function(err, data) {
                if (err) {
                    console.log('error thrown opening file: %s, err: %s', filename, err);
                    throw err;
                } else {
                    self.verses = data;
                    return callback(verseNum);
                }
            });
        } else {
            return callback(verseNum);
        }
    };
}
Song.prototype = {
    verse: function(input, callback) {
        // I've removed the returns here.
        // I think they were confusing you; feel free to add them back in,
        // but they are not actually returning your value, which is instead an
        // argument of the callback function.
        var self = this;
        this.loadVerses(input, function(verseNum) {
            callback(self.verses[verseNum + 1]);
        });
    }
};
module.exports = new Song();
To use it:
var song = require('./song');
song.verse(1, function(verse) {
    console.log(verse);
});
I've ignored:
- the fact that we're not treating the error as the first argument of the callback
- the fact that calling this fast enough will create race conditions, but I believe this is another question

[Collected into an answer and expanded from my previous comments]
TL;DR You need to structure your code such that the result of any operation is only used inside that operation's callback, since you do not have access to it anywhere else.
And while assigning it to an external global variable will certainly work as expected, doing so will only take effect after the callback has fired, which happens at a time you cannot predict.
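For instance, here is a minimal sketch of that pitfall (the file name is just for illustration):

var fs = require('fs');
var contents; // outer variable we hope to fill in

fs.readFile('README.md', 'utf8', function (err, data) {
    contents = data; // only assigned once the callback fires
});

console.log(contents); // undefined: the callback has not fired yet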
Commentary
Callbacks do not return values because by their very nature, they are executed sometime in the future.
Once you pass a callback function into a controlling asynchronous function, it will be executed when the surrounding function decides to call it. You do not control this, and so waiting for a returned result won't work.
Your example code, song.verse(1);, cannot be expected to return anything useful because it returns immediately and, since the callback hasn't yet fired, will simply return the only value it can: undefined.
I'm afraid this reliance on asynchronous functions with passed callbacks is an irremovable feature of how NodeJS operates; it is at the very core of it.
Don't be disheartened though. A quick survey of all the NodeJS questions here shows quite clearly that this idea that one must work with the results of async operations only in their callbacks is the single greatest impediment to anyone understanding how to program in NodeJS.
For a truly excellent explanation/tutorial on the various ways to correctly structure NodeJS code, see Managing Node.js Callback Hell with Promises, Generators and Other Approaches.
I believe it clearly and succinctly describes the problem you face and provides several ways to refactor your code correctly.
Two of the features mentioned there, Promises and Generators, are programming features/concepts, the understanding of which would I believe be of great use to you.
Promises (or as some call them, Futures) are a programming abstraction that allows one to write code a little more linearly, in an "if this then that" style, like:
fs.readFileAsync(path).then(function(data){
    /* do something with data here */
    return result;
}).catch(function(err){
    /* deal with errors from readFileAsync here */
}).then(function(result_of_last_operation){
    /* do something with result_of_last_operation here */
    if(there_is_a_problem) throw new Error('there is a problem');
    return final_result;
}).catch(function(err){
    /* deal with errors when there_is_a_problem here */
}).done(function(final_result){
    /* do something with the final result */
});
In reality, Promises are simply a means of marshaling the standard callback pyramid in a more linear fashion. (Personally I believe they need a new name, since the idea of "a promise of some value that might appear in the future" is not an easy one to wrap one's head around, at least it wasn't for me.)
Promises do this by (behind the scenes) restructuring "callback hell" such that:
asyncFunc(args, function callback(err, result){
    if(err) throw err;
    /* do something with the result here */
});
becomes something more akin to:
var p = function(){
    return new Promise(function(resolve, reject){
        asyncFunc(args, function callback(err, result){
            if(err) reject(err);
            resolve(result);
        });
    });
};
p();
where any value you provide to resolve() becomes the only argument to the next "then-able" callback, and any error is passed via reject(), so it can be caught by any .catch(function(err){ ... }) handlers you define.
Promises also do all the things you'd expect from the (somewhat standard) async module, like running callbacks in series or in parallel and operating over the elements of an array, returning their collected results to a callback once all the results have been gathered.
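For instance, here is a minimal sketch (assuming bluebird's promisifyAll, with hypothetical file names) of collecting results over an array, much as the async module's map would:

var Promise = require('bluebird');
var fs = Promise.promisifyAll(require('fs'));
var paths = ['a.txt', 'b.txt']; // hypothetical file names

Promise.all(paths.map(function (path) {
    return fs.readFileAsync(path, 'utf8');
})).then(function (contents) {
    console.log(contents); // array of file contents, in the same order as paths
});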
But you will note that Promises don't quite do what you want, because everything is still in callbacks.
(See bluebird for what I believe is the simplest and thus, best Promises package to learn first.)
(And note that fs.readFileAsync is not a typo. One useful feature of bluebird is that it can be made to add this and other Promises-based versions of fs's existing functions to the standard fs object. It also understands how to "promisify" other modules such as request and mkdirp).
Generators are the other feature described in the tutorial above, but are only available in the new, updated but not yet officially released version of JavaScript (codenamed "Harmony").
Using generators would also allow you to write code in a more linear manner, since one of the features they provide is the ability to wait on the results of an asynchronous operation in a way that doesn't wreak havoc with the JavaScript event loop. (But as I said, it's not a feature in general use yet.)
You can, however, use generators in the current release of node if you'd like: simply add "--harmony" to the node command line to tell it to turn on the newest features of the next version of JavaScript.
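To give a flavour of what that looks like, here is a minimal sketch; run() is a toy stand-in for generator-runner libraries like co (not a standard API), and readFileAsync is assumed to be a promisified readFile:

// drive a generator by resolving each yielded promise and feeding the
// result back in; errors are left unhandled to keep the sketch short
function run(genFn) {
    var gen = genFn();
    function step(value) {
        var next = gen.next(value);
        if (next.done) return Promise.resolve(next.value);
        return Promise.resolve(next.value).then(step);
    }
    return step();
}

run(function* () {
    var data = yield readFileAsync('README.md', 'utf8'); // reads linearly, runs asynchronously
    console.log(data.length);
});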

Related

How does Javascript know if there has been an error when executing a function (callbacks)

So I am reading about callbacks because I'm learning backend development in Node.js, and on several websites they mention this good practice of writing callbacks with the error argument as the first argument:
For example:
fs.readFile('/foo.txt', function(err, data) {
    // If an error occurred, handle it (throw, propagate, etc)
    if (err) {
        console.log('Unknown Error');
        return;
    }
    // Otherwise, log the file contents
    console.log(data);
});
OK sure, I think I understand clearly what is happening. If, once the fs module finishes reading the file "foo.txt", there is an error, then the callback function executes console.log('Unknown Error'). But how does JavaScript / Node know that the variable err corresponds to an error in the code?
Because if I name it error instead of err, I imagine it still works, right? And what if I put it as the second argument? I imagine then it wouldn't work. Is that it? If so, why is it called a good practice when there is no other way to put the error argument but in first place?
"but how does JavaScript / Node know that the variable err corresponds to an error in the code?"
By convention. The way readFile (and other Node.js callback-style functions) are written, they call their callback with the error as the first argument, or null as the first argument. The name of that parameter in the function signature is irrelevant (you can call it anything you like; err, e, and error are all common). It's the fact it's the first parameter that matters, because it will receive the first argument when called.
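For example, here is a minimal sketch of following that convention in a function of your own (readJson is a hypothetical helper):

var fs = require('fs');

function readJson(path, callback) {
    fs.readFile(path, 'utf8', function(err, data) {
        if (err) return callback(err);        // the error always goes in the first argument
        try {
            callback(null, JSON.parse(data)); // null in the first slot signals "no error"
        } catch (parseErr) {
            callback(parseErr);
        }
    });
}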
In these modern times, though, things are moving away from Node.js callback-style APIs and toward APIs using Promises, which make the error and success paths much more distinct. Then async/await syntax is layered on top of promises to make it possible to write asynchronous code using the standard logical flow control structures.
Node.js callback style (like your code):
const fs = require("fs");
// ...
fs.readFile('/foo.txt', function(err, data) {
// If an error occurred, handle it (throw, propagate, etc)
if (err) {
// It failed
console.log(err);
return;
}
// It worked
console.log(data);
});
With promises via the fs.promises API:
const fsp = require("fs").promises;
// ...
fsp.readFile('/foo.txt')
.then(data => {
// It worked
console.log(data);
})
.catch(err => {
console.log(err);
});
Of course, you may not handle errors at that level; you might instead return the result of calling then so that the caller can chain off it and handle errors (or pass it off to its caller, etc.).
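A minimal sketch of that propagating style (readFoo is a hypothetical wrapper):

const fsp = require("fs").promises;

function readFoo() {
    // no .catch here; return the chain so the caller decides what to do with errors
    return fsp.readFile('/foo.txt').then(data => data.toString().trim());
}

readFoo()
    .then(text => console.log(text))
    .catch(err => console.log(err)); // failures from readFile or trim land here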
With async/await (this must be inside an async function, although in modules top-level await is coming):
const fsp = require("fs").promises;
// ...inside an `async` function:
try {
const data = await fsp.readFile('/foo.txt');
} catch (err) {
console.log(err);
}
And again, you might not handle errors at that level; you might let them propagate and have the caller handle them (or the caller might let them propagate to its caller to handle them, etc.).
Not all of Node.js's API has promises yet. You can wrap a single callback-style API function with promises via util.promisify; you can wrap an entire API via various npm modules like promisify.
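For example, wrapping fs.readFile by hand with util.promisify looks like this:

const util = require("util");
const fs = require("fs");
const readFile = util.promisify(fs.readFile);

readFile("/foo.txt", "utf8")
    .then(data => console.log(data))
    .catch(err => console.log(err));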

Node.js: Why should you return the result of a callback during error handling?

Newbie to Node.js here. I'm learning Node.js via the tutorials at NodeSchool.io, and in one tutorial where we learned about modules, we were required to write something like this code:
// Some code...
function filteredLs(dir, ext, callback) {
    fs.readdir(dir, function(err, files) {
        if (err)
            return callback(err); // return statement necessary here...
        callback(null, withExtension(files, ext)); // ...but not here
    });
}
module.exports = filteredLs;
My question is, in examples like these, why is it necessary to include the return statement when handling the error, but OK to omit it when the error is null? I don't see what use the return value of the function could be to readdir anyhow, since it happens after it finishes its work. Why does it make a difference?
The use of return when calling a callback function is typically there to prevent the code that follows from running. The returned value is typically irrelevant.
That's why it's needed in the error case, so the callback call for the non-error case isn't also called.
It's not needed in the non-error case because it's already the last line of the function.
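To see why, here is a minimal sketch of what happens without that return, using the same names as the question's code: both lines run, so the callback fires twice, once with the error and once with files still undefined:

fs.readdir(dir, function(err, files) {
    if (err)
        callback(err);                             // no return...
    callback(null, withExtension(files, ext));     // ...so this also runs, with files undefined
});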
The error handling boilerplate you posted uses a maximally terse way of expressing the code, but it is indeed confusing, and you are right that the return value is discarded. Thus my preferred boilerplate, for clarity, is:
if (error) {
    callback(error)
    return
}
which I feel is slightly clearer, and the reduced concision is not that important to me (I type it with a macro anyway).
I find this to make it clearer that there are two distinct intentions being expressed here:
1. Bubble the error back up to the caller.
2. Exit the function, as there's nothing else useful to be done. No return value, because the callback protocol does not require one and the calling code does not need to (and probably will not) capture the return value into a variable.

How to get around the asynchronous Node.js behaviour?

I'm working on a script that pings websites and returns the results in a web UI. However, I've run into a problem which I am trying to figure out the best solution for.
This block of code needs to return an array of statuses, but due to the asynchronous behaviour of Node.js it returns an empty array, because the code takes time to execute.
Here is what I have:
var ping = require('ping');

function checkConnection(hosts) {
    var results = [];
    hosts.forEach(function (host) {
        ping.sys.probe(host, function (isAlive) {
            results.push({"host": host, "status": isAlive});
        });
    });
    return {results: results, timestamp: new Date().getTime()};
}

module.exports.checkConnection = checkConnection;
I know that you could solve this problem with the use of timers, but what would be the simplest and most ideal solution here?
How to get around the asynchronous Node.js behaviour?
Don't. Instead, embrace it, by having your checkConnection accept a callback or return a promise.
Callback example:
function checkConnection(hosts, callback) {
    var results = [];
    hosts = hosts.slice(0); // Copy
    hosts.forEach(function (host) {
        ping.sys.probe(host, function (isAlive) {
            results.push({"host": host, "status": isAlive});
            if (results.length === hosts.length) {
                callback({results: results, timestamp: new Date().getTime()});
            }
        });
    });
}
Note the defensive shallow copy of hosts. If you don't do that, then since this code runs asynchronously, the calling code could add to or remove from the hosts array while you were processing responses, and the lengths would never match.
An alternate way to handle that without copying is to simply count how many requests you've initiated:
function checkConnection(hosts, callback) {
    var results = [];
    var requests = hosts.length;
    hosts.forEach(function (host) {
        ping.sys.probe(host, function (isAlive) {
            results.push({"host": host, "status": isAlive});
            if (results.length === requests) {
                callback({results: results, timestamp: new Date().getTime()});
            }
        });
    });
}
That looks like it sets up a race condition (what if something modifies hosts after you set requests but before you're done initiating your probe queries?) but it doesn't, because Node runs your JavaScript on a single thread, so no other code can reach in and modify hosts between the requests = hosts.length and hosts.forEach lines.
Like T.J. said, you will need to embrace asynchronous behavior if you are going to program in node.js as that is a fundamental tenet of how it works and how you code a responsive, scalable server using node.js.
T.J.'s answer is a straightforward way of solving this particular problem. But, since async issues will arise over and over again in node.js, promises can be a very useful tool for managing asynchronous behavior and they quickly become indispensable for more complicated multi-operation sequences with robust error handling.
So, here's a solution to your coding issue using Promises:
var ping = require('ping');
var Promise = require('bluebird');

// make a version of ping.sys.probe that returns a promise when done
ping.sys.probeAsync = function(host) {
    return new Promise(function(resolve, reject) {
        ping.sys.probe(host, function(isAlive) {
            resolve({"host": host, "status": isAlive});
        });
    });
};

function checkConnection(hosts) {
    var promises = hosts.map(function(host) {
        return ping.sys.probeAsync(host);
    });
    return Promise.all(promises).then(function(results) {
        return {results: results, timestamp: new Date().getTime()};
    });
}

module.exports.checkConnection = checkConnection;
Sample Usage:
myModule.checkConnection(myArrayOfHosts).then(function(results) {
    // results is the {results: results, timestamp: time} object
});
Step-by-step, here's how this works:
1. Load the Bluebird promise library.
2. Create a promisified version of ping.sys.probe, called ping.sys.probeAsync, that returns a promise that will be resolved when the underlying call is done.
3. Using .map() on your array, create an array of promises by calling ping.sys.probeAsync on each item in the array.
4. Using Promise.all(), create a new promise that is the aggregation of all the promises in the array. It will call its .then() handler only when all the promises in the array have been resolved (e.g. have finished).
5. Add a .then() handler to Promise.all() so the timestamp can be added to the results.
6. Return the Promise.all() promise so the caller of checkConnection() gets a promise back they can use.
7. When calling checkConnection(), use a .then() handler to know when all the operations are done and to obtain the results.
Hopefully you can see that once you have a promisified version of your function and you understand how promises work, you can write the actual async code much more simply. And if you also had error handling, or a sequence of async operations that had to run one after the other (something you don't have here), the advantages of using promises are even greater.
P.S. I think Bluebird's Promise.map() can be used to combine the hosts.map() and Promise.all() into a single call, but I've not used that function myself so I didn't offer it here.
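For reference, a minimal sketch of that combination (an untested assumption, per the note above):

function checkConnection(hosts) {
    // Promise.map maps each host to a promise and aggregates, in one step
    return Promise.map(hosts, ping.sys.probeAsync).then(function(results) {
        return {results: results, timestamp: new Date().getTime()};
    });
}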

NodeJS Callback Scoping Issue

I am quite new (just started this week) to Node.js and there is a fundamental piece that I am having trouble understanding. I have a helper function which makes a MySQL database call to get a bit of information. I then use a callback function to get that data back to the caller which works fine but when I want to use that data outside of that callback I run into trouble. Here is the code:
/** Helper Function **/
function getCompanyId(token, callback) {
    var query = db.query('SELECT * FROM companies WHERE token = ?', token, function(err, result) {
        var count = Object.keys(result).length;
        if (count == 0) {
            return;
        } else {
            callback(null, result[0].api_id);
        }
    });
}
/*** Function which uses the data from the helper function ***/
api.post('/alert', function(request, response) {
    var data = JSON.parse(request.body.data);
    var token = data.token;
    getCompanyId(token, function(err, result) {
        // this works
        console.log(result);
    });
    // the problem is that I need result here so that I can use it elsewhere in this function.
});
As you can see, I have access to the return value from getCompanyId() as long as I stay within the scope of the callback, but I need to use that value outside of the callback. I was able to get around this in another function by just sticking all the logic inside of that callback, but that will not work in this case. Any insight on how to better structure this would be most appreciated. I am really enjoying Node.js thus far, but obviously I have a lot of learning to do.
Short answer - you can't do that without violating the asynchronous nature of Node.js.
Think about the consequences of trying to access result outside of your callback - if you need to use that value, and the callback hasn't run yet, what will you do? You can't sleep and wait for the value to be set - that is incompatible with Node's single threaded, event-driven design. Your entire program would have to stop executing whilst waiting for the callback to run.
Any code that depends on result should be inside the getCompanyId callback:
api.post('/alert', function(request, response) {
    var data = JSON.parse(request.body.data);
    var token = data.token;
    getCompanyId(token, function(err, result) {
        // Any logic that depends on result has to be nested in here
    });
});
One of the hardest parts about learning Node.js (and async programming in general) is learning to think asynchronously. It can be difficult at first, but it is worth persisting. You can try to fight it and code procedurally, but it will inevitably result in unmaintainable, convoluted code.
If you don't like the idea of multiple nested callbacks, you can look into promises, which let you chain methods together instead of nesting them. This article is a good introduction to Q, one implementation of promises.
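For instance, a minimal sketch of wrapping the question's getCompanyId with Q:

var Q = require('q');

// Q.nfcall adapts a node-style (err, result) function into a promise
Q.nfcall(getCompanyId, token)
    .then(function (companyId) {
        // use companyId here, chaining further .then calls as needed
    })
    .catch(function (err) {
        // handle the error here
    });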
If you are concerned about having everything crammed inside the callback function, you can always name the function, move it out, and then pass the function as the callback:
getCompanyId(token, doSomethingAfter); // Pass the function in

function doSomethingAfter(err, result) {
    // Code here
}
My "aha" moment came when I began thinking of these as "fire and forget" methods. Don't look for return values coming back from the methods, because they don't come back. The calling code should move on, or just end. Yes, it feels weird.
As @joews says, you have to put everything that depends on that value inside the callback(s).
This often requires passing down an extra parameter or two. For example, if you are handling a typical HTTP request/response cycle, plan on passing the response down every step of the callback chain. The final callback will (hopefully) set data on the response, or set an error code, and then send it back to the user.
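A minimal sketch of that style, reusing the question's route (lookupAlerts is a hypothetical next step, and the status/send calls assume an Express-like response):

api.post('/alert', function(request, response) {
    var token = JSON.parse(request.body.data).token;
    getCompanyId(token, function(err, companyId) {
        if (err) return response.status(500).send('lookup failed');
        lookupAlerts(companyId, response); // pass the response along
    });
});

function lookupAlerts(companyId, response) {
    // ...do more async work, then finish the request here
    response.send({companyId: companyId});
}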
If you want to avoid callback smells, you can use Node's EventEmitter class, like so:
At the top of the file, require the events module:
var EventEmitter = require('events').EventEmitter;
var emitter = new EventEmitter();
then in your callback:
api.post('/alert', function(request, response) {
    var data = JSON.parse(request.body.data);
    var token = data.token;
    getCompanyId(token, function(err, result) {
        // this works
        console.log(result);
        emitter.emit('company:id:returned', result);
    });
    // the problem is that I need result here so that I can use it elsewhere in this function.
});
then, after your function, you can use the on method anywhere, like so:
getCompanyId(token, function(err, result) {
    // this works
    console.log(result);
    emitter.emit('company:id:returned', result);
});

emitter.on('company:id:returned', function(results) {
    // do what you need with results
});
Just be careful to set up good namespacing conventions for your events so you don't end up with a mess of on handlers, and watch the number of listeners you attach. Here is a good link for reference:
http://www.sitepoint.com/nodejs-events-and-eventemitter/

exception handling, thrown errors, within promises

I am running external code as a 3rd party extension to a node.js service. The API methods return promises. A resolved promise means the action was carried out successfully, a failed promise means there was some problem carrying out the operation.
Now here's where I'm having trouble.
Since the 3rd party code is unknown, there could be bugs, syntax errors, type issues, any number of things that could cause node.js to throw an exception.
However, since all the code is wrapped up in promises, these thrown exceptions are actually coming back as failed promises.
I tried to put the function call within a try/catch block, but it's never triggered:
// worker process
var mod = require('./3rdparty/module.js');
try {
mod.run().then(function (data) {
sendToClient(true, data);
}, function (err) {
sendToClient(false, err);
});
} catch (e) {
// unrecoverable error inside of module
// ... send signal to restart this worker process ...
});
In the above pseudo-code example, when an error is thrown it turns up in the failed-promise handler, and not in the catch.
From what I read, this is a feature, not an issue, with promises. However, I'm having trouble wrapping my head around why you'd always want to treat exceptions and expected rejections exactly the same.
One case is about actual bugs in the code, possibly irrecoverable -- the other is just possibly missing configuration information, or a parameter, or something recoverable.
Thanks for any help!
Crashing and restarting a process is not a valid strategy to deal with errors, not even bugs. It would be fine in Erlang, where a process is cheap and does one isolated thing, like serving a single client. That doesn't apply in node, where a process costs orders of magnitude more and serves thousands of clients at once.
Let's say that you have 200 requests per second being served by your service. If 10% of those hit a throwing path in your code, you would get 20 process shutdowns per second, roughly one every 50ms. If you have 4 cores with 1 process per core, you would lose them in 200ms. So if a process takes more than 200ms to start and prepare to serve requests (the minimum cost is around 50ms for a node process that doesn't load any modules), we now have a successful total denial of service. Not to mention that users hitting an error tend to do things like repeatedly refresh the page, thereby compounding the problem.
Domains don't solve the issue because they cannot ensure that resources are not leaked.
Read more at issues #5114 and #5149.
Now you can try to be "smart" about this and have a process-recycling policy of some sort based on a certain number of errors, but whatever strategy you adopt, it will severely change the scalability profile of node. We're talking several dozen requests per second per process, instead of several thousand.
However, promises catch all exceptions and then propagate them in a manner very similar to how synchronous exceptions propagate up the stack. Additionally, they often provide a method finally, which is meant to be an equivalent of try...finally. Thanks to those two features, we can encapsulate that clean-up logic by building "context-managers" (similar to with in Python, using in C#, or try-with-resources in Java) that always clean up resources.
Let's assume our resources are represented as objects with acquire and dispose methods, both of which return promises. No connections are being made when the function is called; we only return a resource object. This object will be handled by using later on:
function connect(url) {
    return {acquire: () => pg.connect(url), dispose: conn => conn.dispose()};
}
We want the API to work like this:
using(connect(process.env.DATABASE_URL), async (conn) => {
    await conn.query(...);
    // do other things
    return someResult;
});
We can easily achieve this API:
function using(resource, fn) {
    return Promise.resolve()
        .then(() => resource.acquire())
        .then(item =>
            Promise.resolve(item).then(fn).finally(() =>
                // bail if disposing fails, for any reason (sync or async)
                Promise.resolve()
                    .then(() => resource.dispose(item))
                    .catch(terminate)
            )
        );
}
The resources will always be disposed of after the promise chain returned within using's fn argument completes, even if an error was thrown within that function (e.g. from JSON.parse) or its inner .then closures, or if a promise in the chain was rejected (the equivalent of callbacks being called with an error). This is why it's so important for promises to catch errors and propagate them.
If, however, disposing of the resource really fails, that is indeed a good reason to terminate. It's extremely likely that we've leaked a resource in this case, and it's a good idea to start winding down the process. But now our chances of crashing are isolated to a much smaller part of our code: the part that actually deals with leakable resources!
Note: terminate is basically throwing out-of-band so that promises cannot catch it, e.g. process.nextTick(() => { throw e; }). Which implementation makes sense might depend on your setup; a nextTick-based one bails in much the same way callbacks do.
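A minimal sketch of such a terminate:

function terminate(e) {
    // rethrow outside of any promise chain, so nothing can catch it
    process.nextTick(function () { throw e; });
}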
How about using callback-based libraries? They could potentially be unsafe. Let's look at an example to see where those errors could come from and which ones could cause problems:
function unwrapped(arg1, arg2, done) {
    var resource = allocateResource();
    mayThrowError1();
    resource.doesntThrow(arg1, (err, res) => {
        mayThrowError2(arg2);
        done(err, res);
    });
}
mayThrowError2() is within an inner callback and will still crash the process if it throws, even if unwrapped is called within another promise's .then. These kinds of errors aren't caught by typical promisify wrappers and will continue to cause a process crash as per usual.
However, mayThrowError1() will be caught by the promise if called within .then, and the inner allocated resource might leak.
We can write a paranoid version of promisify that makes sure that any thrown errors are unrecoverable and crash the process:
function paranoidPromisify(fn) {
    return function(...args) {
        return new Promise((resolve, reject) => {
            try {
                fn(...args, (err, res) => err != null ? reject(err) : resolve(res));
            } catch (e) {
                process.nextTick(() => { throw e; });
            }
        });
    };
}
Using the promisified function within another promise's .then callback now results in a process crash if unwrapped throws, falling back to the throw-crash paradigm.
It's the general hope that as you use more and more promise-based libraries, they will use the context-manager pattern to manage their resources, and therefore you will have less need to let the process crash.
None of these solutions are bulletproof, not even crashing on thrown errors. It's very easy to accidentally write code that leaks resources despite not throwing. For example, this node-style function will leak resources even though it doesn't throw:
function unwrapped(arg1, arg2, done) {
    var resource = allocateResource();
    resource.doSomething(arg1, function(err, res) {
        if (err) return done(err);
        resource.doSomethingElse(res, function(err, res) {
            resource.dispose();
            done(err, res);
        });
    });
}
Why? Because when doSomething's callback receives an error, the code forgets to dispose of the resource.
This sort of problem doesn't happen with context-managers. You cannot forget to call dispose: you don't have to, since using does it for you!
References: why I am switching to promises, context managers and transactions
It is almost the most important feature of promises. If it wasn't there, you might as well use callbacks:
var fs = require("fs");
fs.readFile("myfile.json", function(err, contents) {
if( err ) {
console.error("Cannot read file");
}
else {
try {
var result = JSON.parse(contents);
console.log(result);
}
catch(e) {
console.error("Invalid json");
}
}
});
(Before you say that JSON.parse is the only thing that throws in JS: did you know that even coercing a variable to a number, e.g. +a, can throw a TypeError?)
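For instance, a small illustration of that coercion case (not from the original answer):

var a = { valueOf: function () { throw new TypeError('boom'); } };
+a; // throws: the unary plus invokes valueOf while coercing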
However, the above code can be expressed much more clearly with promises because there is just one exception channel instead of 2:
var Promise = require("bluebird");
var readFile = Promise.promisify(require("fs").readFile);
readFile("myfile.json").then(JSON.parse).then(function(result){
console.log(result);
}).catch(SyntaxError, function(e){
console.error("Invalid json");
}).catch(function(e){
console.error("Cannot read file");
});
Note that catch is sugar for .then(null, fn). If you understand how the exception flow works, you will see it is kind of an anti-pattern to generally use .then(fnSuccess, fnFail).
The point is not at all to write .then(success, fail) instead of , function(err, result) (i.e. it is not an alternative way to attach your callbacks) but to make the written code look almost the same as it would look when writing synchronous code:
try {
    var result = JSON.parse(readFileSync("myjson.json"));
    console.log(result);
}
catch (SyntaxError e) {
    console.error("Invalid json");
}
catch (Error e) {
    console.error("Cannot read file");
}
(The sync code will actually be uglier in reality because javascript doesn't have typed catches)
Promise rejection is simply a form of failure abstraction. So are node-style callbacks (err, res) and exceptions. Since promises are asynchronous, you can't use try-catch to actually catch anything, because errors are likely to happen in a different tick of the event loop.
A quick example:
function test(callback) {
    throw 'error';
    callback(null);
}

try {
    test(function () {});
} catch (e) {
    console.log('Caught: ' + e);
}
Here we can catch the error, as the function is synchronous (though callback-based). Another example:
function test(callback) {
    process.nextTick(function () {
        throw 'error';
        callback(null);
    });
}

try {
    test(function () {});
} catch (e) {
    console.log('Caught: ' + e);
}
Now we can't catch the error! The only option is to pass it in the callback:
function test(callback) {
    process.nextTick(function () {
        callback('error', null);
    });
}

test(function (err, res) {
    if (err) return console.log('Caught: ' + err);
});
Now it works just like the first example. The same applies to promises: you can't use try-catch, so you use rejections for error handling.
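For completeness, here is the same test sketched with a promise rejection instead:

function test() {
    return new Promise(function (resolve, reject) {
        process.nextTick(function () {
            reject('error');
        });
    });
}

test().catch(function (err) {
    console.log('Caught: ' + err);
});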
