NodeJS Returning data to client browser - javascript

I think we need some help here. Thanks in advance.
I have been doing .NET programming for desktop applications and have used Timer objects to wait for a task to complete before the task results are shown in a data grid. Recently we switched over to NodeJS and find it pretty interesting. We were able to design a small application that executes some tasks using PowerShell scripts and returns the data to the client browser. However, I currently have to run a Timer on the client browser (when someone clicks a button) to check whether the file received from the server contains "ENDOFDATA" or not. Once the Timer sees ENDOFDATA, it triggers another function to populate a DIV with the data that was received from the server.
Is this the right way to get data from a server? We really don't want to block the EventLoop. We run PowerShell scripts on NodeJS to collect users from Active Directory and then send the data back to the client browser. The PowerShell scripts are executed as a Job so the EventLoop is not blocked.
Here is an example of the code on the NodeJS side:
In the code below, can we insert something that won't block the EventLoop but still responds once the task is completed? As you can see, we would like to send the ADUsers.CSV file to the client browser once GetUsers.PS1 has finished executing. Since GetUsers.PS1 takes about five minutes to complete, the EventLoop is blocked and the server can no longer accept any other requests.
app.post("/LoadDomUsers", (request, response) => {
    // check whether the request is an AJAX one and accepts JSON
    if (request.xhr || request.accepts("json, html") === "json") {
        var ThisAD = request.body.ThisAD;
        console.log(ThisAD);
        ps.addCommand("./public/ps/GetUsers.PS1", [{
            name: 'AllParaNow',
            value: ScriptPara
        }]);
        ps.addCommand(`$rc = gc ` + __dirname + "/public/TestData/AD/ADUsers.CSV");
        ps.addCommand(`$rc`);
        ps.invoke().then((output) => {
            response.send({ message: output });
            console.log(output);
        });
    }
});
Thank you.

The way you describe your problem isn't that clear; I had to read some of the comments on your initial question just to be sure I understood the issue. Honestly, you could just use one of the various CSV NPM packages to read and write your Active Directory data with NodeJS.
I/O is non-blocking in NodeJS, so the I/O itself doesn't block the EventLoop. Node can handle many I/O requests at once: it hands them off to the operating system (or, for some operations, to libuv's small thread pool) and continues executing on the main thread. When an I/O operation completes, its callback is queued, and the event loop pushes it onto the call stack, resuming your program from that function. After you get the I/O data, you just send it back to the client through the response object. There should be no timers needed.
So is the issue once the powershell script runs, you have to wait for that initial script to complete before being able to handle pending requests? I'm still a bit unclear...

Related

How to make express Node.JS reply a request during heavy workload?

I'm creating a NodeJS web processor. It runs a process that takes ~1 minute. I POST to my server to start the job and get its status by using GET.
this is my simplified code
// Configure Express
const app = express();
app.listen(8080);

app.post('/clean', async function (req, res, next) {
    // start the long-running process
    let result = await worker.process(data);
    // send the result when it finishes
    res.send(result);
});

// reply with the current status when asked
app.get('/clean', async function (req, res, next) {
    res.send(worker.status);
});
The problem is that the server is working so hard in the POST /clean process that GET /clean requests are not answered in time. All GET /clean requests are only answered after the worker finishes its task and frees the processor. In other words, the application is unable to respond during the workload. How can I get around this situation?
Because node.js runs your Javascript as a single thread (only one piece of Javascript ever running at once) and does not time-slice, no other requests can be processed by your server as long as your worker.process() is running its synchronous code. This is why worker.process() has to finish before any of the http requests that arrived while it was running get serviced. The node.js event loop is busy until worker.process() is done, so it can't service any other events (like incoming http requests).
These are some of the ways to work around that:
Cluster your app with the built-in cluster module so that you have a bunch of processes that can either work on worker.process() code or handle incoming http requests.
When it's time to call worker.process(), fire up a new node.js process, run the processing there and communicate back the result with standard interprocess communication. Then, your main node.js process stays ready to handle incoming http requests near-instantly as they arrive.
Create a work queue of a group of additional node.js processes that run jobs that are put in the queue and configure these processes to be able to run your worker.process() code from the queue. This is a variation of #2 that bounds the number of processes and serializes the work into a queue (better controlled than #2).
Rework the way worker.process() does its work so that it can do a few ms of work at a time, then return back to the message loop so other events can run (like incoming http requests), and then resume its work afterwards for a few more ms at a time. This usually requires building some sort of stateful object that can do a little bit of work each time it is called, but is often a pain to program effectively.
Note that #1, #2 and #3 all require that the work be done in other processes. That means that worker.status will need to get the status from those other processes. So, you will either need some sort of interprocess way of communicating with the other processes, or you will need to store the status as you go in some storage that is accessible from all processes (such as redis) so it can just be retrieved from there.
There's no working around the single-threaded nature of JS short of converting your service to a cluster of processes or to use something experimental like Worker Threads.
If neither of these options work for you, you'll need to yield up the processing thread periodically to give other tasks the ability to work on things:
function workPart1() {
// Do a bunch of stuff
setTimeout(workPart2, 10);
}
function workPart2() {
// More stuff
setTimeout(workPart3, 10); // etc.
}

With Electron/Node.js, how do I implement simple sequential code asynchronously?

I am working on a project where my Electron App interacts with a physical device using serial commands, via serialport. The app sends a string to the device, the device executes the command (which can take ~30s) and then sends back a string to signify completion and results from that operation.
My goal is to automate a series of actions. For that, basically the following needs to be done asynchronously, so that the render thread doesn't get blocked:
Start a loop
Send a string to the device
Wait until a specific response comes back
Tell the render thread about the response, so it can update the UI
Afterwards, repeat with the next string.
Actually, multiple different commands need to be sent in each loop cycle, and between each one the app has to wait for a specific string from the device.
This is kind of related to my last question, What's the correct way to run a function asynchronously in Electron?. From that, I know I should use web workers to run something asynchronously. However, my plan turned out to involve more problems than I anticipated, and I wanted to ask what would be a good way to implement this, having the whole plan in mind and not just a certain aspect of it.
I am especially not sure how to make the worker work with serialport. The serial device it needs to interact with is a child of the render process, so sending commands will probably be done over web worker messages. But I have no idea on how to make the worker wait for a specific response from the device.
(Since this question is of a more general nature, I am unsure whether I should provide some code snippets. If this is too general, I can try to write some pseudo code to make my problem clearer.)
I would go for a promise-based approach like this:
let promiseChain = Promise.resolve();

function waitForEvent() {
    return new Promise(resolve => {
        // use once() so each wait attaches a single, self-removing listener
        event.once("someEvent", eventData => {
            resolve(eventData);
        });
    });
}

while (someLoopCondition) {
    promiseChain = promiseChain
        // wrap in an arrow so the send happens when the chain reaches it,
        // not immediately while the chain is being built
        .then(() => sendToSerialPort(someString))
        .then(waitForEvent)
        .then(result => {
            updateUI(result);
        });
}

Node.js API to spawn off a call to another API

I created a Node.js API.
When this API gets called I return to the caller fairly quickly. Which is good.
But now I also want this API to call or launch a different API or function or something that will go off and run on its own, kind of like calling a child process with child.unref(). In fact, I would use child.spawn() but I don't see how to have spawn() call another API. Maybe that alone would be my answer?
Of this other process, I don't care if it crashes or finishes without error.
So it doesn't need to be attached to anything. But if it does remain attached to the Node.js console, that's icing on the cake.
I'm still thinking about how to identify, and what to do, if the spawned process somehow gets caught up running for a really long time. But I'm not ready to cross that bridge yet.
Your thoughts on what I might be able to do?
I guess I could child.spawn('node', [somescript])
What do you think?
I would have to explore if my cloud host will permit this too.
You need to specify exactly what the other spawned thing is supposed to do. If it is calling an HTTP API, with Node.js you should not launch a new process to do that. Node is built to run HTTP requests asynchronously.
The normal pattern, if you really need some work to happen in a different process, is to use something like a message queue, the cluster module, or other messaging/queuing between processes that a worker will monitor; the worker is usually set up to handle some particular task or set of tasks this way. It is pretty unusual to spawn another process after receiving an HTTP request: launching new processes is heavy-weight and can use up all of your server resources if you aren't careful, and thanks to Node's async capabilities it usually isn't necessary, especially for things mainly involving I/O.
This is from a test API I built some time ago. Note I'm even passing a value into the script as a parameter.
router.put('/test', function (req, res, next) {
    var u = req.body.u;
    var cp = require('child_process');
    // note: the option is spelled `detached`; spawn does not go through a
    // shell, so the argument needs no extra quoting
    var c = cp.spawn('node', ['yourtest.js', u], { detached: true, stdio: 'inherit' });
    c.unref();
    res.sendStatus(200);
});
The yourtest.js script can be just about anything you want it to be. But I found I learned more by first treating the script as a Node.js console app. FIRST get your yourtest.js script to run without error by manually running/testing it from your console's command line: node yourtest.js yourparametervalue. THEN integrate it into the child.spawn() call.
var u = process.argv[2];
console.log('f2u', u);

function f1() {
    console.log('f1-hello');
}

function f2() {
    console.log('f2-hello');
}

setTimeout(f2, 3000); // wait 3 seconds before executing f2(). I do this just for troubleshooting: you can watch node.exe open and then close in Task Manager if node.exe runs long enough.
f1();

understanding of node js performance

I recently discovered Node.js, and I have read in various articles that Node.js is fast and can handle more requests than a Java server even though Node.js uses a single thread.
I understand that Node is based on an event loop: each call to a remote API or a database is done with an async call, so the main thread is never blocked and the server can continue to handle other clients' requests.
If I understand correctly, each portion of code that can take time should be executed with an async call, otherwise the server will be blocked and it won't be able to handle other requests?
var server = http.createServer(function (request, response) {
    // CALL A METHOD WHICH CAN TAKE A LONG TIME TO EXECUTE
    slowSyncMethod();
    // WILL THE SERVER STILL BE ABLE TO HANDLE OTHER REQUESTS ??
    response.writeHead(200, { "Content-Type": "text/plain" });
    response.end("");
});
So if my understanding is correct, the above code is bad because the synchronous call to the slow method blocks the Node.js main thread? Is Node.js only fast on the condition that all code that can take time is executed in an async manner?
NodeJS is as fast as your hardware (or VM) and the V8 engine running it. That being said, any heavy-duty task, such as processing media files (music, images, video), will definitely lock your application, as will computation over large collections. That's why the async model is leveraged through events and deferred invocations. Nothing stops you from spawning child processes to delegate heavy work and asynchronously get back the result. But if you find yourself needing to do this for many tasks, maybe you should revisit your architecture.
I hope this helps.
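The locking described above is easy to demonstrate: a synchronous busy loop keeps even an already-scheduled timer from firing until it returns. Here slowSyncMethod is just a stand-in for any CPU-bound work:

```javascript
// A busy synchronous loop that burns CPU on the main thread.
function slowSyncMethod(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) {} // nothing else can run during this loop
}

const started = Date.now();
setTimeout(() => {
  // scheduled for 10 ms, but it cannot run until the sync work is done
  console.log("timer fired after " + (Date.now() - started) + " ms");
}, 10);

slowSyncMethod(200); // the event loop is stuck here for ~200 ms
```

The timer reports roughly 200 ms instead of 10: an incoming http request would be delayed in exactly the same way.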

Explaining nodejs style of development to a PHP developer in a specific circumstance

I'm a PHP web application developer that has built several large projects in PHP w/ CodeIgniter. PHP has always gotten the job done, but now I'm working on a new project that I'm building with the javascript extjs4 framework on the client-side. And I have some questions for experienced nodejs developers.
In my most recent PHP project, a user-login request required my server to make an API call to Facebook. The way I handled this, to improve scalability, was: the client would make the initial login request, the server would pass the request to a 'gearman' job-queuing server, and a background worker process would grab the job and perform the API call. In the meantime, the server would reply to the client, and then the client's browser would start polling the server with AJAX to see if the job had completed. (Oh, and I passed the results of the Facebook API call from the worker to the application server with memcached.) I did this to free up my application servers to handle more concurrent requests from users, since PHP is blocking and a Facebook API call takes several seconds.
My question is: does this whole model of app servers, a gearman job-queuing server, and background workers make sense for NodeJS development, since NodeJS is non-blocking? Would I simply accept an AJAX request from the client to log in, call the Facebook API from the application server and wait for its response (while handling other users' requests, since NodeJS is non-blocking) and then reply to the user?
I'm also considering getting into nodejs development for the sake of being able to take advantage of the awesome heroku environment.
The short answer is yes, the way you would typically handle this in a node system is exactly how you describe it.
Because node is non-blocking, the event-loop is constantly on the lookout for actionable requests. Here's an example using node-facebook-client (one of many npm modules built to be used with facebook APIs)
console.log('start');

client.graphCall(path, params, method)(function (result) {
    // fires whenever the request is completed
    console.log('facebook returned');
});

console.log('end');
Outputs
start
end
facebook returned
As you can imagine, this is basically what all the fuss is about with node (plus it's really, really fast). That said, it's also where the learning curve is with node: asynchronous execution. If 'end' needs to come after 'facebook returned', then you'd have to put it in the callback:
console.log('start');

client.graphCall(path, params, method)(function (result) {
    // fires whenever the request is completed
    console.log('facebook returned');
    console.log('end');
});
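In newer Node code the same ordering is usually expressed by wrapping the callback in a Promise and using async/await. The graphCall below is a stub that only mimics the call shape of the node-facebook-client example (a call that returns a function taking a completion callback), not the real client:

```javascript
// Stub mimicking the client.graphCall(...) shape from the example above.
function graphCall(path, params, method) {
  // returns a function that takes the completion callback
  return (cb) => setImmediate(() => cb({ path: path, ok: true }));
}

// Promise wrapper: resolves with whatever the callback receives.
function graphCallAsync(path, params, method) {
  return new Promise((resolve) => graphCall(path, params, method)(resolve));
}

async function main() {
  console.log('start');
  const result = await graphCallAsync('/me', {}, 'GET');
  console.log('facebook returned');
  console.log('end'); // now guaranteed to come after the response
  return result;
}

main();
```

The await suspends main() without blocking the event loop, so other requests are still serviced while the "API call" is in flight.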
Additionally, it's straightforward to integrate dynamic child processes into your application when needed, including additional node processes. From the official docs for child_process.fork:
var cp = require('child_process');
var n = cp.fork(__dirname + '/sub.js');

n.on('message', function (m) {
    console.log('PARENT got message:', m);
});

n.send({ hello: 'world' });
And then the child script, 'sub.js' might look like this:
process.on('message', function (m) {
    console.log('CHILD got message:', m);
});

process.send({ foo: 'bar' });
In the child the process object will have a send() method, and process will emit objects each time it receives a message on its channel.
