Node.js version: 4.5.0 - PM2 version: 2.0.18
I have an in-house Node.js app under test. I'm using pm2 for clustering and I assumed it would take care of load balancing. But when I run my app on Windows and look at the processes (using the pm2 monit command), I don't see an even distribution of load. Sometimes one process spikes to 70% while the others do nothing.
So I suspect pm2's load-balancing mechanism is at fault. I read somewhere that it doesn't use round-robin on Windows machines?
Could someone share their experience with this issue?
Also, I thought Nginx might be another solution, configured as a load balancer in front of the pm2 Node.js cluster, but I'm not quite sure about the configuration.
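For what it's worth, Node's cluster module (which pm2's cluster mode relies on) defaults to round-robin scheduling on every platform except Windows, where connection distribution is left to the operating system - which can produce exactly this kind of skew. A minimal sketch of the Nginx alternative, assuming three Node instances listening on ports 3001-3003 (the ports and the node_app name are illustrative assumptions):

    # inside the http {} block of nginx.conf; upstream ports are assumptions
    upstream node_app {
        server 127.0.0.1:3001;
        server 127.0.0.1:3002;
        server 127.0.0.1:3003;
    }
    server {
        listen 80;
        location / {
            # nginx distributes requests round-robin by default
            proxy_pass http://node_app;
        }
    }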
Using
Angular CLI: 1.5.2
Node: 6.10.1
on macOS Sierra,
I have installed several web applications so far. Any time I run ng serve, all of the applications that I have installed in the past are served.
I guess this is the normal behavior, but I don't need that...
How do I "uninstall" old applications that I no longer want to be served?
Edited to answer comments:
I run npm start from one of the applications that get served. Actually, I run it from the only application that I'd like to be started. However, as I said, all of the applications get started up.
Some other facts:
They run under the same port (4200).
The first application I installed runs without any context in the URI. (http://localhost:4200)
The rest of the applications are started under the same port 4200, but I need to append the actual context path to the URI.
I want to deploy my Node application as a single executable file. Is it possible using systemd or containers? I don't have enough knowledge of systemd and containers. Please help me if anybody knows about this.
Where I work, we use pm2 to run Node.js applications.
It allows you to run multiple instances on one server, monitors them, and restarts them if needed (on failure, or when they exceed a memory limit you provide).
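As a sketch (app.js, the instance count and the memory limit are illustrative, not our actual setup):

    # start 4 instances in cluster mode; restart any that exceeds 300 MB
    pm2 start app.js -i 4 --max-memory-restart 300M
    # live per-process CPU/memory view
    pm2 monit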
If you insist on going with systemd, you will have to create a unit: a file that describes to systemd how to execute your application.
It would usually live in /usr/lib/systemd/system.
You would have to create a file ending with ".service", for example:
    [Unit]
    Description=My NodeJS App

    [Service]
    ExecStart=/usr/bin/node /path/to/my/app/index.js

    [Install]
    WantedBy=multi-user.target
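The [Install] section lets the unit be enabled at boot. Assuming you saved it as myapp.service (a name chosen just for illustration), you would register and start it with:

    # reload unit definitions, then enable at boot and start now
    sudo systemctl daemon-reload
    sudo systemctl enable myapp.service
    sudo systemctl start myapp.service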
I am writing a Node.js application with Angular as my front end.
For this, I am using Git as my code-management server.
For the client, I am doing minification and it is ready for production.
But I am not sure how to prepare the server-side files for production.
Do we just need to copy all the Git folders onto the production server?
Let me know the best way to deploy a Node.js server application.
You could use pm2 as your daemon to keep your Node.js app up all the time.
Try not to include node_modules in the repo, because different machines have different setups/installations; you cannot tell whether a package will work before you run it unless you npm install it there.
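For example, a single .gitignore entry keeps installed dependencies out of the repo, and each machine then runs npm install itself:

    # .gitignore
    node_modules/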
If you are familiar with Docker, use it: pre-bundle all files (including node_modules) into the Docker image. You do not need pm2 here, since Docker itself can restart the container automatically. This is the ideal approach.
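A minimal sketch of such an image, assuming the entry point is index.js and the app listens on port 3000 (both assumptions):

    # Sketch of a Dockerfile; index.js and port 3000 are assumptions
    FROM node:6
    WORKDIR /usr/src/app
    COPY package.json ./
    RUN npm install --production
    COPY . .
    EXPOSE 3000
    CMD ["node", "index.js"]

Built and run with a restart policy, Docker takes over pm2's supervision role:

    # my-node-app is an illustrative image tag
    docker build -t my-node-app .
    docker run -d --restart always -p 3000:3000 my-node-app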
It really depends on how you (or your company) want to organize the workflow and the size of the project.
Sometimes I too use a Git repository, because then it is really simple to update: just a git pull and (if the server files changed) a pm2 restart N command.
This way, you don't have to install the whole development stack on the server in order to compile (and minify) the bundles - I assume you work on your local machine, where all the development tools are installed.
Keep in mind to use the --save-dev flag when installing packages that are only required in development mode, so you can keep the production server as slim as possible.
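For instance (uglify-js is just an illustrative package):

    # on the dev machine: record a build tool as a devDependency
    npm install --save-dev uglify-js
    # on the server: skip devDependencies entirely
    npm install --production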
A good practice I found is to add a random token to the final bundle filenames (both JS and CSS), which are then injected into the final static HTML files, so browsers don't keep serving a stale cached bundle after a deploy.
Once you have the bundle files on your dev machine, just upload them to the server (FTP, Git, rsync, an sshfs mount, whatever you like) and (if the server files changed) restart/reload the Node process (I'm using pm2 for this, it's really great). If you only edited client files, no reload is needed.
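A sketch of such an update over Git, assuming the app is registered with pm2 under the name api (a hypothetical name):

    # pull the new code, refresh production dependencies, reload without downtime
    git pull
    npm install --production
    pm2 reload api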
Starting from here, there are a lot of more or less sophisticated ways to do the job, Git pipelines for example... but it depends on the situation.
Edit: this is a good article about task runners (gulp vs grunt vs vanilla npm); while it may be a little off topic, it analyzes some aspects of the common deployment process.
What type of server/service supports Node.js applications?
Do we need to install Node/npm on the server?
Does it need to be a dedicated server?
Thank you in advance.
I don't think you need anything special; even a random Raspberry Pi with Linux can host a Node.js app.
Since Node.js has executables for Linux, Windows and Mac, the hardware will not be limited by what can run Node.js, but by what your script needs and the workload you expect.
If you run a basic website with little traffic, an RPi will be enough; if you were to port Facebook or Google to Node.js, you would still need a complete data center.
So the only limit is third party utilities and your own knowledge of the platform you use.
A Node.js application can be hosted on Linux, Windows or any other OS. Only a basic minimum setup is required: Node.js, Git Bash, npm, etc.
You can follow this link.
My question is more on the lines of strategy than actual implementation.
Basically, I'm wondering: why do we build our MEAN apps on the server? And by build I mean fetching components (npm install && bower install) and doing all the concat and minify work.
I'm trying to create my build system, and up until now I've been using a version of John Papa's build system, but my build is taking longer and longer on the server. So doesn't it make sense to just build everything locally and deploy it to the server? Or am I missing something?
Thanks
The build shouldn't happen at runtime. You have it partially right: build up front, then deploy the created artifact.
But the crucial idea is to have Continuous Integration in place - meaning a build server that is not your local machine, which takes the code from SCM, builds it, runs the tests, creates a deployable artifact and stores it in some artifact repository (e.g. an npm registry).
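A sketch of what such a build server runs on each commit (the test and build scripts are assumed to exist in the project's package.json):

    # typical CI steps for a Node project
    npm install
    npm test
    npm run build
    # push the versioned artifact to the registry
    npm publish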
If you take it further and also automatically deploy the artifact into non-PROD environments, you are starting to dig into the Continuous Delivery space.
If this build-and-deploy pipeline installs the artifact into PROD on every commit, you have Continuous Deployment working.
EDIT - reaction on comment:
The main idea is to have it continuous, meaning the full build is kicked off on a regular basis, optimally on every commit/git push.
If this is configured on your local machine and you are a one-man shop, that is probably fine. But as I played with various projects in my free time, I found that building on every commit can be resource-intensive for my local machine, and it was convenient to leave this responsibility to some third-party service (especially when it's free).
There are plenty of online solutions for CI servers.
I have used http://codeship.com and http://drone.io with success. http://cloudbees.com gives you hosted Jenkins. For open-source projects they are free.
If your projects are not open source you will need to spend some bucks on it, but it should be cheap for one-man projects.