I need to abstract a set of REST APIs into one simple-to-use API. I was planning to create a simple Node.js/Express API that makes the individual callouts asynchronously and then returns all of the results at once.
The JS scene changes rapidly, and a lot of the information I've seen seems to be outdated. I was hoping someone could give me some advice and point me toward best practices or frameworks that are set up for a scenario like this.
This just sounds like a simple Express app - nothing complicated. I'd use the request-promise module to give you a nice promise-based interface for making requests to other hosts, and then use promises to coordinate the multiple requests into one response.
Other than that, you'd have to show us more details on exactly what you're trying to do for us to offer more specifics.
Here's a rough outline example if you just wanted to make three simultaneous requests and then combine the results:
const rp = require('request-promise');
const express = require('express');
const app = express();
app.get('/getAll', (req, res) => {
    // construct the target URLs (url1..url3 defined elsewhere)
    let p1 = rp(url1);
    let p2 = rp(url2);
    let p3 = rp(url3);
    Promise.all([p1, p2, p3]).then(results => {
        // construct fullResponse from the results array
        res.send(fullResponse);
    }).catch(err => {
        res.status(500).send(err.message);
    });
});

app.listen(80);
EDIT Jan, 2020 - request() module in maintenance mode
FYI, the request module and its derivatives like request-promise are now in maintenance mode and will not be actively developed to add new features. You can read more about the reasoning here. There is a list of alternatives in this table with some discussion of each one. I have been using got() myself and it's built from the beginning to use promises and is simple to use.
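Whichever client you choose, the coordination pattern stays the same. Here is a minimal sketch of that pattern with stub fetchers standing in for the real HTTP calls (the function names and data are hypothetical; with got() each stub would be something like got(url).json()):

```javascript
// Stand-ins for real HTTP calls; each returns a promise, like got(url) would.
const fetchUser   = () => Promise.resolve({ id: 1, name: 'Ada' });
const fetchOrders = () => Promise.resolve([{ sku: 'X1' }]);
const fetchPrefs  = () => Promise.resolve({ theme: 'dark' });

async function getAll() {
  // All three "requests" start immediately; await resolves when the slowest finishes.
  const [user, orders, prefs] = await Promise.all([
    fetchUser(), fetchOrders(), fetchPrefs(),
  ]);
  // Combine the three results into one response object.
  return { user, orders, prefs };
}

getAll().then(full => console.log(full));
```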
Personally, I use the async module for Node.js; its async.parallel method takes an array of async calls, each with its own optional callback, as well as a final callback for when all of them are done.
Related
I am trying to read multiple JSON files simultaneously and create a single array using the data available in the files and do some processing with the created data array in the Node.js server.
I would like to read these files and do the processing tasks simultaneously using web workers.
I read a few interesting tutorials and articles on the subject, but none of them clearly explains how to process simultaneous tasks using web workers.
They talk about running a single task separated from the main thread, but I need to do multiple tasks at once.
I also know that creating multiple workers is not recommended according to the documentation of Node.js.
Maybe I have a misunderstanding of how web workers function, or of how to implement them to perform multiple tasks.
I also tried the great library threads.js - https://threads.js.org/ - but its documentation is still unclear about running multiple tasks.
Can anyone explain the best-practice way of implementing this kind of work, along with the pros and cons?
I would prefer a vanilla JS solution over using a library, so that the explanation can also serve as a reference for readers.
Also, if possible, an explanation of the usage of the threads.js library would be welcome for future reference.
Thank you very much.
As I'm sure you have read, Node is single-threaded, so running your I/O-bound transactions in parallel threads is not going to help; worker threads are not designed for that kind of work.
A worker thread is meant for longer, more processor-intensive functions that you want to hand off so they don't block the main event loop. Think of it in terms of uploading and processing an image: we don't really want to hold up the entire event loop while the image is processed, so we can pass the work off to a worker thread, which will tell the event loop when it's done and return the response.
I think what you may be looking for is just a promise. Given an array of JSON file names like ["file1.json", "file2.json"], you would loop over it, read each file's contents, parse the JSON, and push or concat the result onto the main array variable.
Once the promise resolves, you would do your processing of the full array in its handler:
.then(() => { // do your processing of the full array })
Here's an example with a library (node-worker-threads-pool).
Thread/worker management is a complex endeavor, and I would not recommend trying to build a generic solution. Even the library I'm suggesting may not be the right fit for your use case.
// sample.js
const { StaticPool } = require('node-worker-threads-pool');

const start = async function () {
    const staticPool = new StaticPool({
        size: 4,
        task: async function (n) {
            const sleep = function (ms) {
                return new Promise(resolve => setTimeout(resolve, ms));
            };
            console.log(`thread ${n} started`);
            await sleep(1000 * n);
            return n + 1;
        }
    });

    // start 4 workers; each runs asynchronously and takes progressively longer to finish
    for (let index = 0; index < 4; index++) {
        staticPool.exec(index)
            .then((result) => {
                console.log(`result from thread pool for thread ${index}: ${result}`);
            })
            .catch((err) => console.error(`Error: ${err}`));
    }
};

start();
I ran this with node sample.js.
As discussed in the other answer, it may not be useful (in terms of performance) to do this, but this example shows how it can be done.
The library also has examples where you give the tasks specific work.
I've written a script to deploy a web project. It first uploads a bunch of files via FTP and then sends a request to a chat bot, posting a message to https://chat.stackexchange.com/.
I'm new to JavaScript and Node.js, and didn't know about promises when I first wrote the code. I'm now in the process of converting it from using nested callbacks to promises with Node's built-in Promise.
For making the HTTP request to the bot I've been using request. There's another library called request-promise using Bluebird promises. Are these compatible with the built-in promise implementation? Are there any gotchas I have to look out for?
There's a site listing Conformant Promise/A+ Implementations, but neither Node.js nor Chromium is listed there. Does this mean that I can't use them together?
You will have to trust the claim that request-promise is a drop-in replacement for request.
bluebird is a superset of the current built in Promise implementation in node. That is to say that you can use them interchangeably except that bluebird has more features/methods. Rather than try to mix them I would just use bluebird everywhere.
If you really don't want to, though, it shouldn't make any difference in terms of chaining promises together. The following still logs hello as expected.
let bluebird = require("bluebird");
new bluebird(resolver => resolver())
.then(() => new Promise(resolver => resolver()))
.then(() => console.log("hello"));
Using Promise = require("bluebird") is pretty common as well.
They are compatible. Some implementation details probably differ a little, but the main Promise flow is the same. Bluebird even seems to be faster than the native Node.js implementation.
In the book https://pragprog.com/book/tbajs/async-javascript, I found this:
Node’s early iterations used Promises in its nonblocking API. However,
in February 2010, Ryan Dahl made the decision to switch to the
now-familiar callback(err, results...) format, on the grounds that
Promises are a higher-level construct that belongs in “userland.”
It looks quite confusing to me, because as an API to read files, this
fs.readFile('/etc/passwd')
.onSuccess(function(data){console.log(data)})
.onError(function(err){throw err})
looks much better than this:
fs.readFile('/etc/passwd', function (err, data) {
if (err) throw err;
console.log(data);
});
Does anyone have ideas about why being "a higher-level construct" should stop Promises from being used in the NodeJS API?
Node v8 ships with util.promisify, which converts callback APIs to promises, and Node v10 ships with a native promise-based fs API (experimental at the time):
const fs = require('fs').promises;
// in an async function:
let data = await fs.readFile('/etc/passwd');
console.log(data);
The future is promises:
NodeJS will use promises for its new APIs; in fact, how to do so is currently being discussed. An earlier attempt to use Promises in node 0.2, years ago, failed because of friction and performance issues.
What has to happen first:
Now promises are a native language feature, but the following has to happen before they make it to the core APIs:
Promises have to be a native language construct - this has already happened.
The recently announced NodeJS and io.js merger has to happen - the time frame is probably a few short months.
The v8 (JavaScript engine) team has to finish working on private symbols, which will enable fast promise creation. At the moment the promise constructor is the only way to create native promises, and it allocates a closure, which is relatively expensive. This is being worked on now, with Domenic coordinating tightly between the io.js and v8 teams to ensure it is done properly.
The v8 team has to optimize the promise implementation; currently, native promises lose consistently to userland implementations like bluebird. This is also happening now.
Once all of these happen, the API will be forked and a version containing promises will be integrated into core. Here is a long and uninteresting discussion about it - there is a better one at the io.js/NG repo, but neither is really very informative.
What can be done today
Libraries like bluebird give you tools to instantly convert a callback API to promises in a fast and efficient way. You can use them today and get that functionality.
Historically callbacks are the default for performance reasons, but...
Update 2017 / Node 8: Promises are now supported by the core!
Node.js supports promises since Node v8.x. The APIs are all still written in callback style (for backwards compatibility etc.), but there now is a utility class in node core to convert the callback-based APIs to promise-based APIs (similarly to bluebird):
https://nodejs.org/api/util.html#util_util_promisify_original
From the Node.js docs:
For example:
const util = require('util');
const fs = require('fs');
const stat = util.promisify(fs.stat);
stat('.').then((stats) => {
// Do something with `stats`
}).catch((error) => {
// Handle the error.
});
Or, equivalently using async functions:
const util = require('util');
const fs = require('fs');
const stat = util.promisify(fs.stat);
async function callStat() {
const stats = await stat('.');
console.log(`This directory is owned by ${stats.uid}`);
}
Update 2018 / Node 10: New fs.promises API
The fs.promises API provides an alternative set of asynchronous file system methods that return Promise objects rather than using callbacks. The API is accessible via require('fs').promises.
https://nodejs.org/api/fs.html#fs_fs_promises_api
(experimental at this moment, but working perfectly on the latest node)
Originally, promises came from libraries: using them requires a function to construct and return a Promise object, whereas the same thing is achievable with plain callback chaining. That is the sense in which "Promises are a higher-level construct".
Reference: Promises in node js
I am developing an application using NodeJS where two queries depend on each other. Here is an explanation of my situation.
I have to query database for some values say A then I have to query
database for some other value B only if there is an A in the database.
I know I can do this in NodeJS's natural way: execute the query for A, and when its result is ready, execute the query for B in A's callback, then finally render the response in B's callback.
Here is my question: is there a design issue, perhaps around nested callbacks, in the solution I just mentioned?
Second, is there any method by which I can make NodeJS I/O blocking instead of non-blocking?
I was made aware of a library called Async, which I wish I'd found when I started my NodeJS projects. It can take care of this and much more.
Async provides around 20 functions that include the usual 'functional'
suspects (map, reduce, filter, each…) as well as some common patterns
for asynchronous control flow (parallel, series, waterfall…). All
these functions assume you follow the node.js convention of providing
a single callback as the last argument of your async function.
A basic example from the site that would help in your scenario:
async.series([
    function(callback) {
        // do db call one
        callback(null, 'one');
    },
    function(callback) {
        // do db call two
        callback(null, 'two');
    }
],
function(err, results) {
    // results is now equal to ['one', 'two']
});
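The same two-step dependency can also be expressed with plain promises; here is a sketch in which findA and findB are hypothetical stand-ins for the two database calls:

```javascript
// Stand-ins for the real database queries.
const findA = () => Promise.resolve({ id: 1 });
const findB = (a) => Promise.resolve({ parent: a.id, value: 'B' });

function getAThenB() {
  return findA().then(a => {
    if (!a) return null;   // no A in the database: skip query B
    return findB(a);       // query B runs only after A has resolved
  });
}

getAThenB().then(b => console.log(b)); // { parent: 1, value: 'B' }
```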
Here is my question: is there a design issue, perhaps around nested callbacks, in the solution I just mentioned?
No, your solution is perfectly fine.
Second, is there any method by which I can make NodeJS I/O blocking instead of non-blocking?
Yes. You can write your own C++ extension to provide blocking calls.
So I started a little project in Node.js to learn a bit about it. It's a simple caching proxy for Arch Linux's package system, as node provides most of the heavy lifting.
This has two "main" phases, server setup and serving.
Then serving has two main phases, response setup and response.
The "main" setup involves checking some files, loading some config from files, and loading some JSON from a web address, then launching the http server and proxy instance with this info.
setup logger/options - read config - read mirrors - read webmirror
start serving
Serving involves checking the request to see if the file exists, creating directories if needed, then providing a response.
check request - check dir - check file
proxy request or serve file
I keep referring to these as synchronization points, but searches don't lead to many results. They are points where a set of async tasks has to be finished before the process can move on to the next step. Perl's AnyEvent has condition variables, which I guess are what I'm trying to reproduce, without the blocking.
To start with, I found I was "cheating" by using the synchronous versions of functions where provided, but that had to stop with the web requests, so I started restructuring things. Most searches immediately led to using async or Step to control the flow. At first I tried lots of series/parallel setups, but I kept running into issues: if there were any async calls underneath, the functions would "complete" straight away and the series would finish early.
After much wailing and gnashing of teeth, I ended up with a "waiter" function using async.until that tests for some program state to be set by all the tasks finishing before launching the next function.
// wait for "test" to be true, then execute "run";
// bail after "count" tries, waiting "sleep" ms between tries
function waiter( test, run, count, sleep, message ) {
    var i = 0;
    async.until(
        function () {
            if ( i > count ) { return true; }
            logger.debug('waiting for', message, test() );
            return test();
        },
        function (callback) {
            i++;
            setTimeout( callback, sleep );
        },
        function (err) {
            if ( i > count ) {
                logger.error('timeout for', message, count * sleep );
                return;
            }
            run();
        }
    );
}
It struck me as being rather large and ugly and requiring a module to implement for something that I thought was standard, so I am wondering what's a better way. Am I still thinking in a non-async way? Is there something simple in Node I have overlooked? Is there a standard way of doing this?
I imagine that with this setup, if the program gets complex, there are going to be a lot of nested functions describing the flow of the program, and I'm struggling to see a good way to lay it all out.
Any tips would be appreciated.
You can't really make everything synchronous. Node.js is designed to perform asynchronously (which may of course torment you at times). But there are a few techniques to make it work in a sequential way (provided the pseudo-code is well thought out and the code is designed carefully):
Using callbacks
Using events
Using promises
Callbacks and events are easy to use and understand, but with them the code can sometimes get really messy and hard to debug.
But with promises, you can avoid all that. You can build dependency chains of promises (for instance, perform promise B only when promise A is complete).
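A tiny sketch of such a chain (the step names and delay are made up):

```javascript
// stepB starts only after stepA has resolved.
const stepA = () => new Promise(resolve => setTimeout(() => resolve('A'), 10));
const stepB = prev => Promise.resolve(prev + 'B');

stepA()
  .then(stepB)                          // runs only once stepA is complete
  .then(result => console.log(result)); // 'AB'
```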
Earlier versions of node.js had an implementation of promises: they promised to do some work and then had separate callbacks that would be executed for success and failure, as well as for handling timeouts.
But in later versions that was removed. Removing them from core node.js created the possibility of building modules with different implementations of promises on top of the core. Some of these are node-promise, futures, and promises.
See these links for more info:
Framework
Promises and Futures
Deferred Promise - jQuery