NodeJS server side application deployment considerations - javascript

I am writing a nodejs application with Angular as my front end.
For this I am using Git as my code management server.
For the client side, I am doing minification and it is ready for production.
But I am not sure how to prepare the server-side files for production.
Do we just need to copy all the Git folders onto the production server?
Let me know the best way to deploy a nodejs server application.

You could use pm2 as your daemon to keep your nodejs app up all the time.
Try not to include node_modules in the repo, because different machines have different setups/installations; you cannot tell whether a package will work on the target machine until you npm install it there.
If you are familiar with Docker, use it: pre-bundle all files (including node_modules) into the Docker image. You do not need pm2 in that case, since Docker itself can restart the container automatically. This is the ideal approach.
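If you go the pm2 route, a minimal sketch of a process file could look like this (the app name and entry script are hypothetical; adjust them to your project):

    // ecosystem.config.js - pm2 process file (sketch)
    module.exports = {
      apps: [{
        name: 'my-app',            // process name shown by pm2 list
        script: './server.js',     // server entry point
        instances: 1,
        env_production: {
          NODE_ENV: 'production'   // applied when started with --env production
        }
      }]
    };

You would start it on the server with pm2 start ecosystem.config.js --env production, and pm2 startup plus pm2 save can bring the process back after a reboot.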

It really depends on how you (or your company) want to organize the workflow and the size of the project.
Sometimes I too use a Git repository, because then it is really simple to update: just a git pull and (if server files were edited) a pm2 restart N command.
This way you don't have to install the whole development stack on the server in order to compile (and minify) the bundles - I guess you work on your local machine, where all the development tools are installed.
Keep in mind to use the --save-dev flag when installing packages that are only required during development, so they end up in devDependencies and can be left out on the production server (npm install --production), keeping it as slim as possible.
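As a sketch, that split in package.json might look like this (the package names and version ranges are just placeholders):

    {
      "dependencies": {
        "express": "^4.0.0"
      },
      "devDependencies": {
        "webpack": "^5.0.0",
        "webpack-cli": "^5.0.0"
      }
    }

npm install --production (or npm install --omit=dev on newer npm versions) then skips everything under devDependencies.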
A good practice I found is to add a random token (a hash) to the final bundle filenames (both js and css) and inject those names into the final static html files, so browsers don't keep serving a stale cached bundle and force users into a refresh-the-page loop.
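If webpack is what produces your bundles, a minimal sketch of that cache-busting could look like this (the paths and the html-webpack-plugin usage are assumptions about your setup):

    // webpack.config.js - cache-busting sketch; paths are hypothetical
    const path = require('path');
    const HtmlWebpackPlugin = require('html-webpack-plugin');

    module.exports = {
      mode: 'production',
      entry: './src/index.js',
      output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'bundle.[contenthash].js',  // name changes whenever the content changes
        clean: true                           // webpack 5: wipe old hashed bundles from dist
      },
      plugins: [
        // writes the hashed filename into the generated index.html
        new HtmlWebpackPlugin({ template: './src/index.html' })
      ]
    };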
Once you have the bundle files on your dev machine, just upload them to the server (ftp, git, rsync, sshfs mount, whatever you like) and (if server files were edited) restart/reload the node process (I'm using pm2 for this, it's really great). If you only edited client files, no reload is needed.
Starting from here, there are plenty of more or less sophisticated ways to do the job, git pipelines for example, but it depends on the situation.
Edit: this is a good article about task runners (gulp vs grunt vs vanilla npm); while it may be a little off topic, it analyzes some aspects of the common deployment process.

Related

VertX Webserver static content webroot

I've got two projects which I've created:
- A web UI built using webpack
- A Vert.x webserver written in Java, built using Gradle
I want to find a way to bring the resulting build dir contents of the first project into the second as the webroot, which will be served up using the StaticHandler.
Is anyone aware of a clean way to do this? I want to preserve the two git projects as they are because I like using the webpack dev server for development of the UI and it generally feels cleaner to have them separated.
I was looking at potentially using the bitbucket pipelines build on my repo, however bringing the assets generated by the first project into the build of the second is where I'm facing issues.
You could create a Gradle task that the jar task depends on (so it runs before it) and that executes the webpack build into the resources directory. Then, when your jar task runs, it bundles the compiled webpack code.
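A rough sketch of that wiring in the Vert.x project's build.gradle (the relative path to the UI project, the npm script name and the output folder are assumptions about your setup):

    // build.gradle - run the webpack build and copy its output into the webroot (sketch)
    task buildWebUi(type: Exec) {
        workingDir '../web-ui'            // hypothetical path to the webpack project
        commandLine 'npm', 'run', 'build' // assumes the UI has a "build" npm script
    }

    task copyWebroot(type: Copy, dependsOn: buildWebUi) {
        from '../web-ui/dist'             // webpack output dir (assumed)
        into 'src/main/resources/webroot' // StaticHandler's default web root
    }

    processResources.dependsOn copyWebroot

On Windows the Exec task may need 'npm.cmd' instead of 'npm'.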

NodeWebkit - deploy the application

I have one code base for both Web and NodeWebkit (NW) application.
I use the following stack:
- React
- Hapi
- Sequelize
- Windows environment
The web version of the application uses MySQL, while the NW version uses SQLite. It all works fine. I have a config file that compiles the application for whichever target I need (web or NW).
The problem I face now is how to deploy the NW application. The idea is to provide the NW application to a client, who will open it by clicking an icon.
Since I use Node for the NW version, and the application uses many modules stored in node_modules, I face the challenge of how to pack it all up.
My idea is to make a Windows installer. The user will click it and the installer will extract all files to the destination, and also put an icon on the user's desktop to run it.
The problem is the Windows path length limitation. Inside node_modules there are many subdirectories that simply exceed the Windows limit. I can't even copy the node_modules folder, or even delete it. Well, sure, I can copy it if I zip it first... or manually remove the deeply nested folders.
I have not yet started working on the installer, but I suspect I will hit a wall with this approach.
Does anyone have an idea how to make this deployment?
How can I integrate npm3 into NW?
My plan now is to make a Windows installer. The installer will install the application files normally. node_modules will be zipped beforehand and placed inside the installer, which will then simply unzip it into the destination folder.
I will post my progress here.
Some update here.
The main issue was the depth of node_modules. I have many modules in node_modules, and after some thinking I figured out there is a simple split: some modules are server-side modules, while others are only used by React.
And since Webpack already creates huge bundle files in which all of those front-end modules are included, I simply do not need them at all.
So I removed all the front-end modules (babel modules, react-*) and left only the server-side ones (Hapi, Sequelize...). A miracle happened: the application ran, and startup was much faster.
I am going to use Inno Setup to make a manifest file, and it should be good to go.
I am still not out of the danger zone, as a developer might need a server-side module with a deeply nested tree. But I will think about that if it happens.
More to follow...
Actually, in nodejs you can do the following:
1. Create another folder inside your project folder, for example "server_modules".
2. In the created folder, create another package.json file and install any modules needed only for the server there.
3. All these modules will be accessible like normal node_modules using require('module_name'), and you can delete the "server_modules" folder when you package your desktop version if you don't need it (see the sketch below).
Note: this approach is used by some developers to achieve micro services in nodejs, but it is useful in your case too.
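A minimal sketch of that layout and how the modules get resolved (the folder and module names are just examples):

    // sketch of the layout the answer describes
    //
    //   project/
    //     server_modules/
    //       package.json      <- lists only the server deps (hapi, sequelize, ...)
    //       node_modules/     <- created by running npm install inside server_modules
    //     src/ ...            <- front-end code, bundled by webpack
    //
    // If the server entry file lives inside server_modules, a plain
    // require('hapi') resolves against server_modules/node_modules as usual.
    // From outside that folder you can still reach a module by explicit path:
    const Hapi = require('./server_modules/node_modules/hapi');

When packaging the desktop build, the whole server_modules folder can simply be left out (or zipped separately, as planned above).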

Do I need to keep a copy of a js library in a lib or vendor folder even though it is already installed using npm?

Question 1 :
I am installing my project's dependency libraries using npm and they get stored in the node_modules folder. Is it necessary to keep a copy of libraries like angular.js and angular-route.js in a lib or vendor folder? I have seen a few people using lib or vendor folders to store the libraries permanently, and this confuses me.
Question 2:
Do I need to copy the node_modules folder to production, or just run the npm install command in the project folder to install all the dependencies in production? How does a dependency library get promoted to production?
Thank you kindly for your advice.
It all depends on how you need to deploy your site to production, really. Ultimately, you will probably want to bundle all your JS files into one or a few files, which are minified and sent with gzip compression.
How you bundle them is up to you. There are quite a few options:
Browserify
Webpack
Grunt / gulp build process
And many more besides
As to whether you need to keep a copy of these bundled javascript files under version control, I think that boils down to one key question: can you run a build process (such as one of the NodeJS-based tools above) on the production server, or on a build server that creates a zip file or installer? If so, then you don't need to include them; just have the build server or production server check out the latest copy from version control, npm install, and then run the build process.
But if the best you can do is have the production server check files out from source control, then you would want to include the final versions of the files in the repository.
Keeping generated files, such as your bundled javascript files, in your source control repo should be avoided where possible. Otherwise every commit has to contain the changes to the source files plus the corresponding change to the generated files, and the latter is just noise that has to be ignored by every developer looking at a diff/patch for that commit.
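A sketch of what "let the server run the build" can look like, using hypothetical npm scripts (the script names and the webpack command are placeholders for whatever your build actually does):

    {
      "scripts": {
        "build": "webpack --mode production",
        "start": "node server.js"
      }
    }

With something like this in place, the build server or production server only needs to check out the code and run npm install followed by npm run build, and the generated bundles never have to live in the repository.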

Why build a MEAN app on the server?

My question is more along the lines of strategy than actual implementation.
Basically I'm wondering: why do we build our MEAN apps on the server? And by build I mean fetching components (npm install && bower install) and doing all the concat and minify work.
I'm trying to create my build system, and up until now I've been using a version of John Papa's build system, but my build is taking longer and longer on the server. So doesn't it make sense to just build everything locally and deploy it to the server? Or am I missing something?
Thanks
The build shouldn't happen at runtime. You have it partially right: build upfront, then deploy the created artifact.
But the crucial idea is to have Continuous Integration in place, meaning a build server that is not your local machine, which takes the code from SCM, builds it, runs the tests, creates a deployable artifact, and stores it in some artifact repository (e.g. an npm registry).
If you take it further and also automatically deploy the artifact into non-PROD environments, you are starting to dig into the Continuous Delivery space.
If this build-and-deploy pipeline installs the artifact into PROD on every commit, you have Continuous Deployment working.
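As a rough illustration of the "build once, store the artifact in a registry" step, a package.json could wire the build into publishing like this (the name, version, file list and gulp command are placeholders):

    {
      "name": "my-app",
      "version": "1.2.3",
      "files": ["dist", "server.js"],
      "scripts": {
        "build": "gulp build",
        "prepublishOnly": "npm run build"
      }
    }

npm publish (run by the CI server) would then rebuild the dist output via the prepublishOnly hook and push the versioned package to the registry, so every environment installs that exact artifact instead of rebuilding from source.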
EDIT - in reaction to a comment:
The main idea is to have it continuous, meaning the full build is kicked off on a regular basis, ideally on every commit/git push.
If this is configured on your local machine and you are a one-man shop, that is probably fine. But as I played with various projects in my free time, I found that building on every commit can be resource intensive for my local machine, and it was convenient to leave this responsibility to some third-party service (especially when it's free).
There are plenty of online solutions for CI servers.
I have used http://codeship.com and http://drone.io with success; http://cloudbees.com gives you hosted Jenkins. For open source projects they are free.
If your projects are not open source you will need to spend some bucks on it, but it should be cheap for one-man projects.

Deploying JS projects

I would like you to share your way of deploying complicated JS projects, where Grunt or Gulp are used.
For example, a grunt build command concatenates css and js files and puts minified bower dependencies into the dist folder. As far as I know, we should not store build results in the version control repo, should we? Also, the development environment is not needed on the production server.
That is why a flow like git push production, then grunt build on the production server, then restarting, is not good practice, is it?
The purpose of the question is to find out how I should deploy complicated JS projects when:
Building is necessary.
Building should not be done on the production server. (Or is that normal practice?)
Build results should not be tracked by the version control system.
Deployment should not be done manually.
