nodejs without stopping - javascript

Is there a way I can make Node.js reload every time it serves a page?
I want to do this during the dev cycle so I can avoid having to shut down and start up the server on each code change.

Edit: Try nodules and their require.reloadable() function.
My former answer explained why you should not restart the Node.js process, and so it does not really apply here. But I think it is still important, so I leave it below.
Node.js uses evented I/O and was crafted specifically to avoid multiple threads or processes. The famous C10k problem asks how to serve ten thousand clients simultaneously; this is where threads don't scale well. Node.js can serve ten thousand clients with a single thread. If you restarted Node.js on every request, you would severely cripple it.
What does evented I/O mean?
Take your example: serving a page. Each time Node.js is about to serve a page, the event loop calls a callback. The event loop is inherent to every Node.js application and starts running once initialization has completed. Node.js on the server side works exactly like client-side JavaScript in the browser: whenever an event (a mouse click, a timeout, etc.) happens, a callback (an event handler) is called.
And on the server side? Let's have a look at a simple HTTP server (source code example taken from the Node.js documentation):
// Load the HTTP module.
var http = require('http');

// Create a server; the callback runs once for every incoming request.
http.createServer(function (request, response) {
  response.writeHead(200, {'Content-Type': 'text/plain'});
  response.end('Hello World\n');
}).listen(8124); // Start accepting connections on port 8124.

console.log('Server running at http://127.0.0.1:8124/');
This first loads the http module, then creates an HTTP server that invokes the inner function starting with function (request, response) every time an HTTP request comes in, and finally makes the server listen on port 8124. All of this completes almost immediately, so console.log is executed right afterwards.
Now the Node.js event loop takes over. The application does not end but waits for requests, and voilà: each request is answered with Hello World\n.
In summary: don't restart Node.js; let its event loop decide when your code has to run.

Found nodemon, which is exactly what I wanted: https://github.com/remy/nodemon. You run nodemon server.js instead of node server.js, and it watches your source files and restarts the process whenever one of them changes.

I personally use spark2 (the fork) and will switch to cluster as soon as I find the time to test it. Among other things, both of them listen for file changes and reload the server when appropriate, which seems to be what you're looking for.

There are many tools for doing this, but I think the best is Cluster, since it gives you zero downtime for your server. You can also run multiple workers with it, or manually start/stop/restart it and show stats through its CLI REPL functionality.

No, Node.js should always keep running. I think you may have misunderstood the concept of Node.js: in order for Node.js to serve pages, it has to be its own server. Once you start a server instance and it is listening on its port, it will run until you close it.
Is it possible that you've called a function that closes the server instead of just closing a given stream?
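To illustrate the difference, here is a minimal sketch (the port number is arbitrary):

var http = require('http');

var server = http.createServer(function (req, res) {
  // This ends only the current response stream; the server keeps
  // running and will serve the next request.
  res.end('done\n');

  // This, by contrast, would stop the server from accepting new
  // connections, and the process would exit once existing
  // connections have finished:
  // server.close();
});

server.listen(8124);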

Related

server.close() vs process.disconnect()

I am currently using cluster in my Node.js application, but I am a bit confused about how to gracefully exit the server.
Previously I was using just a single process, so in the 'uncaughtException' handler I called server.close(), which stops all incoming requests, and I also set a timer for about 10 seconds, after which I call process.exit() to kill the process.
Now with cluster, each child process is created with an IPC channel, and calling process.disconnect() seems to do exactly the same thing as what I described for the single Node process.
My question is: when using cluster, what should I do to gracefully exit the service? Is process.disconnect() good enough?
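For reference, a minimal sketch of that shutdown pattern inside a cluster worker; the port, the 10-second timeout, and the fork-on-exit policy are illustrative:

var cluster = require('cluster');
var http = require('http');

if (cluster.isMaster) {
  cluster.fork();
  cluster.on('exit', function () {
    cluster.fork(); // replace a dead worker so the service stays up
  });
} else {
  var server = http.createServer(function (req, res) {
    res.end('ok\n');
  }).listen(8124);

  process.on('uncaughtException', function (err) {
    console.error(err);
    server.close();       // stop accepting new connections
    process.disconnect(); // close the IPC channel so the master notices
    setTimeout(function () {
      process.exit(1);    // force the exit if connections linger
    }, 10000);
  });
}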

Run process remotely from an http page

I'm using OBS-WebSocket.js as a component (available on the server) to switch OBS scenes through the OBS WebSocket interface. This approach is compatible with all platforms: I can load the main HTML page from the server and switch scenes by clicking a button.
Now I would like a way to kill and reload OBS from the same HTTP page in case it hangs. I'm asking if there is a similar way to start a process on the server, meaning:
a process running on the server, listening on a TCP port and ready to start a batch file (or Python script)
a .js file invoked from the HTTP page, able to send a proper command to that listening port.
Is there something like this already? Thanks a lot.
You can set up a simple Express server that listens for HTTP POST requests on e.g. /restart, and use the child_process module to spawn a process and later kill it.
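A minimal sketch of that idea, assuming Express is installed; the route, the port, and restart-obs.bat are placeholders for whatever restarts OBS on your machine:

var express = require('express');
var spawn = require('child_process').spawn;

var app = express();
var child = null;

app.post('/restart', function (req, res) {
  if (child) {
    child.kill(); // stop the previous instance if it is still running
  }
  // Placeholder: run the batch file (or Python script) that restarts OBS.
  child = spawn('cmd.exe', ['/c', 'restart-obs.bat']);
  res.send('restarting\n');
});

app.listen(3000);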

Nginx and Node.js server - multiple tasks

I have a few questions about the combination of Nginx and Node.js.
I've used Node.js to create my server, and now I'm facing an issue with how the server handles actions (writing, removing, etc.).
We are using Redis to lock the server while a request is being handled; for example, if a new user is signing up, all other requests have to wait until that process is done, and if some other (longer) process is running, all other requests will wait even longer.
We thought about creating a load balancer (using Nginx) that checks whether the server is locked; if it is locked, the balancer would open a new task instead of waiting until the first process is done.
I used this tutorial and created a dummy server, but then I struggled with the idea of implementing this functionality of opening new ports.
I'm new to load-balancing implementations and would be happy to hear your thoughts.
Thank you.
The gist of it is that your server needs to not crash when more than one connection attempt is made to it. Even if you use Nginx as a load balancer and have five different instances of your server running... what happens when six clients try to access your app at once?
I think you are thinking about load balancers slightly wrong. There are different load-balancing methods, but the simplest one to reason about is "round robin", in which each connection is forwarded to the next server in the list (the rest are just more robust and complicated versions of this one). When there are no more servers to forward to, the next connection is forwarded to the first server again (whether or not it is done with its last connection) and the cycle starts over. Thus, load balancers aren't supposed to manage "unique connections" from clients; they are supposed to distribute connections among servers.
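As an illustration, here is a minimal round-robin sketch in Node itself, assuming the node-http-proxy package is installed; the ports and the list of backends are placeholders:

var http = require('http');
var httpProxy = require('http-proxy'); // assumed: the node-http-proxy package

// Two backend instances of your app (placeholders).
var targets = ['http://127.0.0.1:8001', 'http://127.0.0.1:8002'];
var proxy = httpProxy.createProxyServer({});
var next = 0;

http.createServer(function (req, res) {
  // Round robin: forward to the next server in the list, wrapping
  // around to the first once the list is exhausted.
  proxy.web(req, res, { target: targets[next] });
  next = (next + 1) % targets.length;
}).listen(8000);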
Your server doesn't necessarily need to accept all connections and handle them at once, but it needs to at least allow connections to queue up without crashing, and then accept and deal with them one by one.
You can go the route you are discussing. That is, you can fire up a unique instance of your server, via Heroku or otherwise, for every single connection that is made to your app. But this is not efficient and will ultimately create more work for you in trying to architect a system that can do that well. Why not just fix your server?

Node web server crash

I'm working with Node + Mongo + Express to create a REST API. There are cases when the Node server crashes and I have to restart it. I'm using forever to do the restarting, but I am unable to find a solution for the requests that are lost during a crash.
Example: I am handling 10 HTTP requests at a given moment and my Node server crashes on one of them. In this case the other 9 in-flight requests will be lost.
Is there any fallback mechanism to prevent this?
The nearest thing I see right now would be a Node.js cluster: I believe the master will handle the crash of one of its children and take over its HTTP requests. If that does not work, put nginx in front of several Node processes at the same time; it will handle that kind of thing.
Hope it helps.
The server crashes if there is an unhandled exception, so you need to add error handling to your functions with try/catch. There are also several events emitted on the process object that can help you with your problem. You could listen for uncaughtException and then follow a strategy like the one described here: http://blog.argteam.com/coding/hardening-node-js-for-production-part-3-zero-downtime-deployments-with-nginx/, just replace 'SIGTERM'.
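A minimal sketch of that strategy, adapted for uncaughtException; the port and the timeout value are illustrative:

var http = require('http');

var server = http.createServer(function (req, res) {
  res.end('ok\n');
}).listen(8080);

process.on('uncaughtException', function (err) {
  console.error('Uncaught exception, shutting down:', err.stack || err);
  // Stop accepting new connections; in-flight requests may still finish.
  server.close(function () {
    process.exit(1);
  });
  // If open connections keep the server from closing, force an exit so
  // the process manager (forever, cluster, ...) can restart us.
  setTimeout(function () {
    process.exit(1);
  }, 10000);
});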

Better to have a single, global connection to memcached/mongo from node or connections on every request?

I am (re)writing, from PHP to Node, an API that has extremely high usage. It uses memcached, and it currently connects to memcached, does its business, and closes the connection every time the PHP file runs, i.e. every time the API page is accessed.
Now with Node, we are building the HTTP server itself. I could define a global memcached connection and use that:
var http = require('http');
var memcache = require('memcache'); // whichever memcached client is in use

var mclient = new memcache.Client(server, port); // one global, long-lived connection

// This would only run after mclient has connected
http.createServer(function (req, res) {
  // stuff involving mclient
}).listen(8888);
or I could put it inside the createServer callback. The same question also applies to MongoDB connections. Which should I do?
I would suggest using a pool of more than one connection. This allows multiple in-flight operations at once while saving you the overhead of the TCP handshake / rate limiting and the application-level handshaking (such as logging in) that come with creating new connections.
I don't know about memcached, but the native Node driver for MongoDB (https://github.com/christkv/node-mongodb-native) is working on built-in connection pooling, so you don't have to implement it yourself. There is a note in the docs saying it's coming soon, but it looks like the code is already there, so it should be very soon.
Update: I just checked with the author of the driver, and connection pooling is fully implemented; he just forgot to remove that line from the docs.
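For illustration, a minimal sketch of how the built-in pool is enabled in that driver via the poolSize option; the host, port, database name, and pool size are placeholders, and the exact constructor signature may differ between driver versions:

var mongodb = require('mongodb'); // node-mongodb-native

// poolSize asks the driver to keep several connections open and
// spread operations across them.
var server = new mongodb.Server('127.0.0.1', 27017, { poolSize: 5 });
var db = new mongodb.Db('mydb', server);

db.open(function (err, db) {
  if (err) throw err;
  // Keep this one db object global and reuse it inside every
  // http.createServer request handler, as in the snippet above.
});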
One connection is better because you don't need to go through authorization and initialization each time you make a request to the db. Almost everything you want fast access to should be kept in RAM: not only db connections, but also frequently used templates and other files, configuration, and some queues.
