Node server async call to backend service - javascript

I am new to Node and I am writing my very first Node server. It should answer a simple GET request with a simple page after calling a backend REST service.
I am using Express to manage the request and the axios package to make the backend request. The problem is that the server is blocking the event loop, and I have trouble understanding how to make the call to the backend asynchronous.
As of now the frontend server can only manage one request at a time! I expected that if the backend service takes 10 seconds to answer every time, the frontend server could answer two concurrent requests in 10 seconds, not 20.
Where am I wrong?
Here is an extract of the frontend node code:
app.get('/', function(req, res) {
    // Making the call to the backend service. This should be asynchronous...
    axios.post(env.get("BACKEND_SERVICE"), { "user": "some kind of input" })
        .then(function(response) {
            // do something with the data returned from the backend...
            res.render('homepage');
        })
        .catch(function(err) {
            res.status(500).send(err.message);
        });
});
And here is an extract of the backend node code:
app.post('/api/getTypes', jsonParser, function (req, res) {
    console.log("> API request for 'api/getTypes' SLEEP");
    var now = new Date().getTime();
    while (new Date().getTime() < now + 10000) { /* do nothing */ }
    console.log("> API request for 'api/getTypes' WAKE-UP");
    res.json({ "types": "1" });
});

The problem is your busy-wait ties up the backend server such that it can't even begin to process the second request.
I assume you're trying to simulate the process of getting the types taking a while. Odds are what you're going to be doing to get the types will be async and I/O-bound (reading files, querying a database, etc.). To simulate that, just use setTimeout:
app.post('/api/getTypes', jsonParser, function (req, res) {
    console.log("> API request for 'api/getTypes' SLEEP");
    setTimeout(function() {
        console.log("> API request for 'api/getTypes' WAKE-UP");
        res.json({ "types": "1" });
    }, 10000);
});
That avoids hogging the backend server's only thread, leaving it free to start overlapping handling for the second (third, fourth, ...) request.
This is one of the key principles of Node: Don't do things synchronously if you can avoid it. :-) That's why the API is so async-oriented.
If you do find at some point that you have heavy CPU-burning crunching you need to do to process a request, you might spin it off as a child process of the server rather than doing it in the server process. Node is single-threaded by design, achieving very high throughput via an emphasis on asynchronous I/O. Which works great for most of what you need to do...until it doesn't. :-)
Re your comment:
The backend process will be written in another technology other than node, it will call a DB and it could take a while. I wrote that simple node rest service to simulate that. What I would like to understand is how the frontend server will react if the backend takes time to process the requests.
There's a big difference between taking time to process the requests and tying up the only server thread busy-waiting (or doing massive CPU-heavy work). Your busy-wait models doing massive CPU-heavy work, but if getting the types is going to be external to Node, you won't be busy-waiting on it, you'll be queuing a callback for an asynchronous completion (waiting for I/O from a child process, or I/O from a socket connected to a third server process, or waiting on I/O from the DB, etc.). So the setTimeout example above is a better model for what you'll really be doing.
The busy-wait keeps the front-end from completing because it goes like this:
Time     Frontend              Backend queue            Backend
----     -------------------   ----------------------   ----------------------------
 0 sec   Request #1 ------->   Receive request #1 -->   Pick up job for request #1
 0 sec   Request #2 ------->   Receive request #2
                                                        Busy-wait 10 seconds
10 sec   Got #1 back <------------------------------    Send response #1
                                                 -->    Pick up job for request #2
                                                        Busy-wait 10 seconds
20 sec   Got #2 back <------------------------------    Send response #2
So even though the front-end isn't busy-waiting, it sees 20 seconds go by because the backend busy-waits (unable to do anything else) for 10 seconds for each request.
But that's not what your real setup will do, unless the other technology you're using is also single-threaded. (If it is, you may want to have more than one of them run in parallel.)

Related

Handle Multiple Concurrent Requests for Express Server on Same Endpoint API

This question might be a duplicate, but I am still not getting the answer. I am fairly new to Node.js, so I might need some help. Many have said that Node.js is perfectly free to run incoming requests asynchronously, but the code below shows that if multiple requests hit the same endpoint, say /test3, the callback function will:
Print "test3"
Call setTimeout() to prevent blocking of event loop
Wait for 5 seconds and send a response of "test3" to the client
My question here is: if client 1 and client 2 call the /test3 endpoint at the same time, and the assumption is that client 1 hits the endpoint first, does client 2 have to wait for client 1 to finish before entering the event loop?
Can anybody tell me whether it is possible for multiple clients to call a single endpoint and run concurrently, not sequentially, something like a one-thread-per-connection analogy?
Of course, if I were to call another endpoint such as /test1 or /test2 while the code is still executing on /test3, I would still get a response from /test2 ("test2") immediately.
app.get("/test1", (req, res) => {
    console.log("test1");
    setTimeout(() => res.send("test1"), 5000);
});

app.get("/test2", async (req, res, next) => {
    console.log("test2");
    res.send("test2");
});

app.get("/test3", (req, res) => {
    console.log("test3");
    setTimeout(() => res.send("test3"), 5000);
});
For those who have visited: it has nothing to do with blocking of the event loop.
I have found something interesting. The answer to the question can be found here.
When I was using Chrome, the requests kept getting blocked after the first request. However, with Safari, I was able to hit the endpoint concurrently. For more details, look at the following link below.
GET requests from Chrome browser are blocking the API to receive further requests in NODEJS
Run your application in cluster mode. Look up PM2.
This question needs more details to be answered and is clearly an opinion-based question, but since it rests on a strawman argument I will answer it.
First of all, we need to define "run concurrently". The phrase is ambiguous: if we take the literal meaning, in strict theory nothing runs concurrently.
A CPU core can only carry out one instruction at a time.
The speed at which the CPU can carry out instructions is called the clock speed. This is controlled by a clock. With every tick of the clock, the CPU fetches and executes one instruction. The clock speed is measured in cycles per second, and 1 cycle per second is known as 1 hertz. This means that a CPU with a clock speed of 2 gigahertz (GHz) can carry out 2,000 million (two billion) cycles per second.
Can a CPU run multiple tasks "concurrently"?
Yes, you're right that nowadays computers and even cell phones come with multiple cores, which means the number of tasks running at the same time depends on the number of cores. But if you ask an expert such as an Associate Staff Engineer (AKA me), they will tell you that it is very, very rare to find a server with more than one core: why would you spend 500 USD on a multi-core server if you can spawn a whole bunch of nano instances (or whatever option is available in the free tier) with Kubernetes?
Another thing: why would you configure Node to be in charge of the routing? Let Apache and/or nginx worry about that.
As you mentioned, there is one thing called the event loop, which is a fancy name for a FIFO queue data structure.
So, in other words: no. Neither Node.js nor any other programming language out there will run requests truly concurrently on a single core.
But it definitely depends on your infrastructure.

NodeJS Returning data to client browser

I think we need some help here. Thanks in advance.
I have been programming desktop applications in .NET and have used Timer objects to wait for a task to complete before the task results are shown in a data grid. Recently, we switched over to Node.js and find it pretty interesting. We could design a small application that executes some tasks using PowerShell scripts and returns the data to the client browser. However, I would have to run a Timer in the client browser (when someone clicks a button) to check whether the file received from the server contains "ENDOFDATA" or not. Once the Timer sees ENDOFDATA, it triggers another function to populate a DIV with the data that was received from the server.
Is this the right way to get data from a server? We really don't want to block the event loop. We run PowerShell scripts on Node.js to collect users from Active Directory and then send the data back to the client browser. The PowerShell scripts are executed as a Job, so the event loop is not blocked.
Here is an example of the code at NodeJs:
In the code below, can we insert something that won't block the event loop but will still respond once the task is completed? As you can see in the code below, we would like to send the ADUsers.CSV file to the client browser once GetUsers.PS1 has finished executing. Since GetUsers.PS1 takes about five minutes to complete, the event loop is blocked and the server can no longer accept any other requests.
app.post("/LoadDomUsers", (request, response) => {
    // we check if the request is an AJAX one and if it accepts JSON
    if (request.xhr || request.accepts("json, html") === "json") {
        var ThisAD = request.body.ThisAD;
        console.log(ThisAD);
        ps.addCommand("./public/ps/GetUsers.PS1", [{
            name: 'AllParaNow',
            value: ScriptPara
        }]);
        ps.addCommand(`$rc = gc ` + __dirname + "/public/TestData/AD/ADUsers.CSV");
        ps.addCommand(`$rc`);
        ps.invoke().then((output) => {
            response.send({ message: output });
            console.log(output);
        });
    }
});
Thank you.
The way you describe your problem isn't that clear; I had to read some of the comments on your initial question just to be sure I understood the issue. Honestly, you could just use one of the various CSV npm packages to read and write your Active Directory data with Node.js.
I/O is non-blocking in Node.js, so you're not actually blocking the event loop. Node can handle multiple I/O requests because the I/O work is handed off (to the OS or to libuv's thread pool) while execution continues on the main thread; when an I/O operation completes, its callback is placed on the call stack and program execution resumes from there with the data. After you get the I/O data, you just send it back to the client through the response object. There should be no timers needed.
So is the issue that once the PowerShell script runs, you have to wait for that initial script to complete before being able to handle pending requests? I'm still a bit unclear...

How to make express Node.JS reply a request during heavy workload?

I'm creating a Node.js web processor. It runs a process that takes ~1 minute. I POST to my server and then get the status by using GET.
This is my simplified code:
// Configure Express
const app = express();
app.listen(8080);

app.post('/clean', async function(req, res, next) {
    // start process
    let result = await worker.process(data);
    // send result when finished
    res.send(result);
});

// reply with status when asked
app.get('/clean', async function(req, res, next) {
    res.send(worker.status);
});
The problem is that the server is working so hard on the POST /clean process that GET /clean requests are not replied to in time.
All GET /clean requests are replied to only after the worker finishes its task and frees the processor to respond.
In other words, the application is unable to respond during the workload.
How can I get around this situation?
Because node.js runs your Javascript as single threaded (only one piece of Javascript ever running at once) and does not time slice, as long as your worker.process() is running its synchronous code, no other requests can be processed by your server. This is why worker.process() has to finish before any of the http requests that arrived while it was running get serviced. The node.js event loop is busy until worker.process() is done, so it can't service any other events (like incoming http requests).
These are some of the ways to work around that:
Cluster your app with the built-in cluster module so that you have a bunch of processes that can either work on worker.process() code or handle incoming http requests.
When it's time to call worker.process(), fire up a new node.js process, run the processing there and communicate back the result with standard interprocess communication. Then, your main node.js process stays ready to handle incoming http requests near instantly as they arrive.
Create a work queue of a group of additional node.js processes that run jobs that are put in the queue and configure these processes to be able to run your worker.process() code from the queue. This is a variation of #2 that bounds the number of processes and serializes the work into a queue (better controlled than #2).
Rework the way worker.process() does its work so that it can do a few ms of work at a time, then return back to the event loop so other events can run (like incoming http requests), and then resume its work afterwards for a few more ms at a time. This usually requires building some sort of stateful object that can do a little bit of work at a time each time it is called, but is often a pain to program effectively.
Note that #1, #2 and #3 all require that the work be done in other processes. That means that the process.status() will need to get the status from those other processes. So, you will either need some sort of interprocess way of communicating with the other processes or you will need to store the status as you go in some storage that is accessible from all processes (such as redis) so it can just be retrieved from there.
There's no working around the single-threaded nature of JS short of converting your service to a cluster of processes or to use something experimental like Worker Threads.
If neither of these options work for you, you'll need to yield up the processing thread periodically to give other tasks the ability to work on things:
function workPart1() {
    // Do a bunch of stuff
    setTimeout(workPart2, 10);
}

function workPart2() {
    // More stuff
    setTimeout(workPart3, 10); // etc.
}

How to write nodejs service, running all time

I am new to Node.js (and JS), so can you explain to me (or give a link) how to write a simple Node.js service which runs permanently?
I want to write a service which sends a request every second to a foreign API and stores the results in a DB.
So, does Node.js have some simple module to run a JS method (or methods) over and over again?
Or do I just have to write a while loop and do it there?
setInterval() is what you would use to run something every second in node.js. This will call a callback every NN milliseconds where you pass the value of NN.
var interval = setInterval(function() {
    // execute your request here
}, 1000);
If you want this to run forever, you will need to also handle the situation where the remote server you are contacting is off-line or having issues and is not as responsive as normal (for example, it may take more than a second for a troublesome request to timeout or fail). It might be safer to repeatedly use setTimeout() to schedule the next request 1 second after one finishes processing.
function runRequest() {
    issueRequest(..., function(err, data) {
        // process request results here

        // schedule next request
        setTimeout(runRequest, 1000);
    });
}

// start the repeated requests
runRequest();
In this code block, issueRequest() is just a placeholder for whatever networking operation you are doing (I personally would probably use the request() module and request.get()).
Because your request processing is asynchronous, this will not actually be recursion and will not cause a stack buildup.
FYI, a node.js process with an active incoming server or an active timer or an in-process networking operation will not exit (it will keep running) so as long as you always have a timer running or a network request running, your node.js process will keep running.

Explaining nodejs style of development to a PHP developer in a specific circumstance

I'm a PHP web application developer that has built several large projects in PHP w/ CodeIgniter. PHP has always gotten the job done, but now I'm working on a new project that I'm building with the javascript extjs4 framework on the client-side. And I have some questions for experienced nodejs developers.
In my most recent PHP project, a user-login request required my server to make an API call to Facebook. The way I handled this, to improve scalability, was: the client would make the initial login request, the server would pass the request to a 'gearman' job-queuing server, and a background worker process would grab the job and perform the API call. In the meantime, the server would reply to the client, and then the client's browser would start polling the server with AJAX to see if the job had completed. (Oh, and I passed the results of the Facebook API call from the worker to the application server with memcached.) I did this to free up my application servers to handle more concurrent requests from users, since PHP is blocking and a Facebook API call takes several seconds.
My question is: does this whole model of app servers, a gearman job-queuing server, and background workers make sense for Node.js development, since Node.js is non-blocking? Would I simply accept an AJAX request from the client to log in, call the Facebook API from the application server and wait for its response (while handling other users' requests, since Node.js is non-blocking), and then reply to the user?
I'm also considering getting into nodejs development for the sake of being able to take advantage of the awesome heroku environment.
The short answer is yes, the way you would typically handle this in a node system is exactly how you describe it.
Because node is non-blocking, the event-loop is constantly on the lookout for actionable requests. Here's an example using node-facebook-client (one of many npm modules built to be used with facebook APIs)
console.log('start');

client.graphCall(path, params, method)(function(result) {
    // fires whenever the request is completed
    console.log('facebook returned');
});

console.log('end');
Outputs
start
end
facebook returned
As you can imagine, this is basically what all the fuss is about with node (plus it's really really fast). That said, it's also where the learning curve is with node -- asynchronous execution. If 'end' needs to come after 'facebook returned', then you'd have to put it in the callback:
console.log('start');

client.graphCall(path, params, method)(function(result) {
    // fires whenever the request is completed
    console.log('facebook returned');
    console.log('end');
});
Additionally, it's elementary to integrate dynamic child processes into your application when it is needed, including additional node processes. From the official docs for child_process.fork:
var cp = require('child_process');
var n = cp.fork(__dirname + '/sub.js');

n.on('message', function(m) {
    console.log('PARENT got message:', m);
});

n.send({ hello: 'world' });
And then the child script, 'sub.js' might look like this:
process.on('message', function(m) {
    console.log('CHILD got message:', m);
});

process.send({ foo: 'bar' });
In the child the process object will have a send() method, and process will emit objects each time it receives a message on its channel.
