Understanding Event-Driven in NodeJS

Consider this simple example:
var BinaryServer = require('../../').BinaryServer;
var fs = require('fs');

// Start Binary.js server
var server = BinaryServer({port: 9000});

// Wait for new user connections
server.on('connection', function(client) {
    // Stream a flower image!
    var file = fs.createReadStream(__dirname + '/flower.png');
    client.send(file);
    sleep_routine(5); // in seconds (a placeholder for blocking work)
});
When a client connects to the server, I block the event handler for about 5 seconds (imagine that time is spent on some complex operations). What is expected to happen if another client connects in the meantime? One thing that I read about NodeJS is non-blocking I/O. But in this case the second client only receives the flower after the first client's sleep, right?

One thing that I read about NodeJS is non-blocking I/O. But in this case the second client only receive the flower after the sleeping of the first, right?
That's correct, assuming that you are doing blocking synchronous operations for five seconds straight. If you do any file system I/O, or any I/O for that matter, or use a setTimeout, then the other client will get its opportunity to use the thread and receive the flower image. So if you're doing really heavy CPU-intensive processing, you have a few choices:
1. Fire it off in a separate process that runs asynchronously, e.g. using the built-in child_process module.
2. Keep track of how long you've been processing and, every 100 ms or so, give up the thread by saving your state and then using setTimeout to continue processing where you left off.
3. Have multiple node processes already running, so that if one is busy there is another that can serve the second user (e.g. behind a load balancer, or using the cluster module).
I would recommend a combination of 1 and 3 if this is ever a problem; but so much of node can be made asynchronous that it rarely is. Even things like computing password hashes can be done asynchronously.

No - the two requests will be handled independently. So if the first request had to wait 5 seconds, and for some reason the second request only took 2 seconds, the second would return before the first.
In practice your connection handler would be called a second time before the first call had finished. But since each call has its own state, you would normally be unaware of that.
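A scaled-down simulation of this (milliseconds instead of seconds, with timers standing in for the two requests' work) shows the completion order:

```javascript
// Two simulated requests handled independently: the one that
// finishes sooner responds first, regardless of arrival order.
setTimeout(() => console.log('request 1 done (50 ms)'), 50);
setTimeout(() => console.log('request 2 done (20 ms)'), 20);
// prints "request 2 done (20 ms)" first
```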

Related

Handle Multiple Concurrent Requests for Express Server on Same Endpoint API

This question might be a duplicate, but I am still not getting the answer. I am fairly new to node.js, so I might need some help. Many have said that node.js is perfectly free to run incoming requests asynchronously, but the code below shows that if multiple requests hit the same endpoint, say /test3, the callback function will:
Print "test3"
Call setTimeout() to prevent blocking of event loop
Wait for 5 seconds and send a response of "test3" to the client
My question here is: if client 1 and client 2 call the /test3 endpoint at the same time, and the assumption is that client 1 hits the endpoint first, does client 2 have to wait for client 1 to finish before entering the event loop?
Can anybody tell me if it is possible for multiple clients to call a single endpoint and run concurrently rather than sequentially, something like a one-thread-per-connection model?
Of course, if I call another endpoint such as /test1 or /test2 while the code is still executing on /test3, I still get a response straight from /test2, which is "test2", immediately.
app.get("/test1", (req, res) => {
    console.log("test1");
    setTimeout(() => res.send("test1"), 5000);
});

app.get("/test2", async (req, res, next) => {
    console.log("test2");
    res.send("test2");
});

app.get("/test3", (req, res) => {
    console.log("test3");
    setTimeout(() => res.send("test3"), 5000);
});
For those who have visited: it has nothing to do with blocking of the event loop.
I have found something interesting, and the answer to the question can be found in the link below.
When I was using Chrome, the requests kept getting blocked after the first request. However, with Safari, I was able to hit the endpoint concurrently. For more details, look at the following link:
GET requests from Chrome browser are blocking the API to receive further requests in NODEJS
Run your application in a cluster. Look up PM2.
This question needs more details to be answered and is clearly opinion-based, but since it rests on a strawman argument I will answer it.

First of all, we need to define "run concurrently". It is ambiguous: if we take the literal meaning, in strict theory nothing runs concurrently.

CPUs can only carry out one instruction at a time.

The speed at which the CPU can carry out instructions is called the clock speed. This is controlled by a clock: with every tick of the clock, the CPU fetches and executes one instruction. Clock speed is measured in cycles per second, and one cycle per second is known as 1 hertz. This means that a CPU with a clock speed of 2 gigahertz (GHz) can carry out two billion (2000 million) cycles per second.

(image: CPU running multiple tasks "concurrently")

Yes, you're right that nowadays computers, and even cell phones, come with multiple cores, which means the number of tasks running at the same time depends on the number of cores. But any expert, such as this Associate Staff Engineer (AKA me), will tell you that it is very, very rare to find a server with more than one core. Why would you spend 500 USD on a multi-core server if you can spawn a whole bunch of ...nano or whatever option is available in the free trial... instances with Kubernetes?

Another thing: why would you configure node to be in charge of the routing? Let Apache and/or Nginx worry about that.

As you mentioned, there is one thing called the event loop, which is a fancy way of naming a FIFO queue data structure.

So, in other words: no. Neither nodejs nor any other programming language out there will run truly concurrently on a single core. But it definitely depends on your infrastructure.

NodeJS Returning data to client browser

I think we need some help here. Thanks in advance.
I have been programming in .NET for desktop applications and have used Timer objects to wait for a task to complete before the task results are shown in a data grid. Recently we switched over to NodeJs and find it pretty interesting. We designed a small application that executes some tasks using PowerShell scripts and returns the data to the client browser. However, I have to run a Timer on the client browser (when someone clicks a button) to check whether the file received from the server contains "ENDOFDATA" or not. Once the Timer sees ENDOFDATA, it triggers another function to populate a DIV with the data that was received from the server.
Is this the right way to get the data from a server? We really don't want to block EventLoop. We run PowerShell scripts on NodeJS to collect users from Active Directory and then send the data back to the client browser. The PowerShell scripts are executed as a Job so EventLoop is not blocked.
Here is an example of the code at NodeJs:
In the code below, can we insert something that won't block the EventLoop but will still respond to the client once the task is completed? As you can see, we would like to send the ADUsers.CSV file to the client browser once GetUsers.PS1 has finished executing. Since GetUsers.PS1 takes about five minutes to complete, the Event Loop is blocked and the server can no longer accept any other requests.
app.post("/LoadDomUsers", (request, response) => {
    // we check if the request is an AJAX one and if it accepts JSON
    if (request.xhr || request.accepts("json, html") === "json") {
        var ThisAD = request.body.ThisAD;
        console.log(ThisAD);
        ps.addCommand("./public/ps/GetUsers.PS1", [{
            name: 'AllParaNow',
            value: ScriptPara
        }]);
        ps.addCommand(`$rc = gc ` + __dirname + "/public/TestData/AD/ADUsers.CSV");
        ps.addCommand(`$rc`);
        ps.invoke().then((output) => {
            response.send({ message: output });
            console.log(output);
        });
    }
});
Thank you.
The way you describe your problem isn't that clear; I had to read some of the comments on your initial question just to be sure I understood the issue. Honestly, you could just utilize one of the various CSV NPM packages to read and write from your Active Directory with NodeJS.
I/O is non-blocking with NodeJS, so you're not actually blocking the EventLoop. NodeJS can handle multiple I/O requests because it hands each one off to a background thread and continues execution on the main thread; when the I/O operations complete, their callbacks are added to the call stack and program execution resumes from those functions. After you get the I/O data, you just send it back to the client through the response object. There should be no timers needed.
So is the issue that once the PowerShell script runs, you have to wait for that initial script to complete before being able to handle pending requests? I'm still a bit unclear...

How to make express Node.JS reply a request during heavy workload?

I'm creating a nodejs web processor. It runs a process that takes ~1 minute. I POST to my server and get the status by using GET.
This is my simplified code:
// Configure Express
const app = express();
app.listen(8080);

// Start processing
app.post('/clean', async function(req, res, next) {
    // start process
    let result = await worker.process(data);
    // send result when finished
    res.send(result);
});

// Reply with status when asked
app.get('/clean', async function(req, res, next) {
    res.send(worker.status);
});
The problem is that the server is working so hard on the POST /clean process that GET /clean requests are not replied to in time.
All GET /clean requests are replied to only after the worker finishes its task and frees the processor to respond.
In other words, the application is unable to respond during the workload.
How can I get around this situation?
Because node.js runs your Javascript as single threaded (only one piece of Javascript ever running at once) and does not time slice, as long as your worker.process() is running its synchronous code, no other requests can be processed by your server. This is why worker.process() has to finish before any of the http requests that arrived while it was running get serviced. The node.js event loop is busy until worker.process() is done, so it can't service any other events (like incoming http requests).
These are some of the ways to work around that:
1. Cluster your app with the built-in cluster module so that you have a bunch of processes that can either work on worker.process() code or handle incoming http requests.
2. When it's time to call worker.process(), fire up a new node.js process, run the processing there, and communicate back the result with standard interprocess communication. Then your main node.js process stays ready to handle incoming http requests near-instantly as they arrive.
3. Create a work queue and a group of additional node.js processes that run jobs put in the queue, and configure those processes to run your worker.process() code from the queue. This is a variation of #2 that bounds the number of processes and serializes the work into a queue (better controlled than #2).
4. Rework the way worker.process() does its work so that it can do a few ms of work at a time, then return to the message loop so other events can run (like incoming http requests), and then resume its work afterwards for a few more ms at a time. This usually requires building some sort of stateful object that can do a little bit of work each time it is called, but is often a pain to program effectively.
Note that #1, #2 and #3 all require that the work be done in other processes. That means the worker's status will need to come from those other processes. So you will either need some sort of interprocess way of communicating with the other processes, or you will need to store the status as you go in some storage that is accessible from all processes (such as redis) so it can just be retrieved from there.
There's no working around the single-threaded nature of JS short of converting your service to a cluster of processes or using something experimental like Worker Threads.
If neither of these options works for you, you'll need to yield the processing thread periodically to give other tasks the ability to work on things:
function workPart1() {
    // do a bunch of stuff
    setTimeout(workPart2, 10); // yield the thread, then continue
}

function workPart2() {
    // more stuff
    setTimeout(workPart3, 10); // etc.
}

Node JS Single Threaded vs MultiThreading (CPU Utilization: Any Difference?)

I have started reading a lot about Node JS lately, and one thing I am not able to clearly understand is the real difference between how I/O is handled by an asynchronous versus a synchronous call.
As I understand it, in a multi-threaded synchronous environment, if I/O is started, the running thread is preempted and moves back to a waiting state. So essentially this is the same as what happens with a NodeJS asynchronous I/O call. In Node JS, when I/O is called, the I/O operation is moved out of the current running thread and sent to an event demultiplexer for completion and notification. As soon as the I/O is complete, the callback method is pushed onto the event queue for further processing.
So the only difference I see is that in Node JS we are saving memory (due to the multiple call stacks owned by each thread) and CPU (saved because of no context switching). If I just assume that I have enough memory to buy, does the saving of CPU due to context switching alone make the huge performance difference?
If my understanding above is not correct, how is I/O handling any different between a Java thread and Node JS with respect to keeping the CPU busy and not wasting CPU cycles? Are we saving only context-switching CPU cycles with Node JS, or is there more to it?
Based on the responses, I would like to add another scenario:
Request A and Request B come to a J2EE server at the same time. Each request takes 10 ms to complete in this multi-threaded environment. Out of the 10 ms, 5 ms is spent executing code to compute some logic and 5 ms is spent in I/O, pulling a large dataset from a DBMS. The call to the DBMS is the last line of the code, after which the response should be sent to the client.
If the same application were converted to a Node JS application, this is what might happen:
1. Request A comes in; 5 ms is used for processing the request.
2. The DBMS call is made from the code, but it's non-blocking, so a callback method is pushed onto the event queue.
3. After 5 ms, Request B is served; again, request B's callback is pushed onto the event queue for I/O completion. Request B also takes 5 ms of processing.
4. The event loop runs and picks up the callback handler for request A, which then sends the response to the client. So the response is sent after 10 ms, because Req A and Req B both took 5 ms of synchronous code-block processing.
Now where is the time saved in such a scenario, apart from context switching and not creating 2 threads? Req A and Req B both took 10 ms anyway with Node JS?
As I understand, in a multi-threaded synchronous environment, if I/O is started, the running thread is preempted and moves back to a waiting state. So essentially this is the same as what happens with a NodeJS asynchronous I/O call.
Nope: in NodeJS an asynchronous I/O call is non-blocking I/O, which means that once the thread has made an I/O call it doesn't wait for the I/O to complete and instead moves on to the next statement/task to execute.
Once the I/O completes, it picks up the next task from the event loop queue and eventually executes the callback handler that was given to it when the I/O call was made.
If I just consider that I have enough memory to buy, does the saving of CPU due to context switching alone make the huge performance difference?
Apart from this, savings also come from these two things:
1. Not-Having-To-Wait for the I/O to complete.
2. Not-Having-To-Make-Threads: since threads are limited, the system's capacity is no longer limited by how many threads it can make.
Apart from context switching and creating 2 threads, Req A & Req B both took 10 ms anyway with Node JS?
You are discounting one thing here: the thread gets the two requests one after the other, a specific interval apart. So if one thread is going to be busy for 10 ms, it will take a new thread to execute the second request. Extrapolate this to thousands of requests, and your OS has to make thousands of threads to deal with that many concurrent users.
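One part of the scenario that the 10 ms accounting glosses over is that the two I/O waits overlap. A sketch with setTimeout standing in for the 5 ms DBMS calls (the timings are illustrative):

```javascript
// Two 5 ms "DB calls" issued back to back overlap, so the I/O
// portion costs ~5 ms of wall time in total, not 10 ms.
function fakeDbCall(label, cb) {
  setTimeout(() => cb(label + ' done'), 5); // stands in for non-blocking I/O
}

const start = Date.now();
let pending = 2;
function finished(msg) {
  console.log(msg);
  if (--pending === 0) {
    console.log('both I/O waits finished after ~' + (Date.now() - start) + ' ms');
  }
}

fakeDbCall('request A', finished);
fakeDbCall('request B', finished); // issued immediately, does not wait for A
```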

How to write nodejs service, running all time

I am new to NodeJs (and JS), so can you explain to me (or give a link) how to write a simple NodeJs service which runs permanently?
I want to write a service which sends a request every second to a foreign API and stores the results in a DB.
So, does NodeJs have some simple module to run a JS method (or methods) over and over again?
Or do I just have to write a while loop and do it there?
setInterval() is what you would use to run something every second in node.js. It will call a callback every NN milliseconds, where NN is the value you pass.
var interval = setInterval(function() {
    // execute your request here
}, 1000);
If you want this to run forever, you will also need to handle the situation where the remote server you are contacting is offline or having issues and is not as responsive as normal (for example, it may take more than a second for a troublesome request to time out or fail). It might be safer to repeatedly use setTimeout() to schedule the next request 1 second after one finishes processing.
function runRequest() {
    issueRequest(..., function(err, data) {
        // process request results here

        // schedule next request
        setTimeout(runRequest, 1000);
    });
}

// start the repeated requests
runRequest();
In this code block, issueRequest() is just a placeholder for whatever networking operation you are doing (I personally would probably use the request() module and request.get()).
Because your request processing is asynchronous, this will not actually be recursion and will not cause stack buildup.
FYI, a node.js process with an active incoming server, an active timer, or an in-progress networking operation will not exit (it will keep running), so as long as you always have a timer running or a network request in flight, your node.js process will keep running.
