My question is more along the lines of strategy than actual implementation.
Basically I'm wondering: why do we build our MEAN apps on the server? And by build I mean getting components (npm install && bower install) and doing all the concat and minify work.
I'm trying to create my build system, and up until now I've been using a version of John Papa's build system, but my build is taking longer and longer on the server. So doesn't it make sense to just build everything locally and deploy it to the server? Or am I missing something?
Thanks
The build shouldn't happen at runtime. You have it partially right: build upfront, then deploy the created artifact.
But the crucial idea is to have Continuous Integration in place. Meaning a build server that is not your local machine, which takes the code from SCM, builds it, runs the tests, creates a deployable artifact and stores it in some artifact repository (e.g. an npm registry).
If you take it further and also automatically deploy the artifact into non-PROD environments, you are starting to move into Continuous Delivery territory.
If this build-and-deploy pipeline installs the artifact into PROD on every commit, you have Continuous Deployment working.
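As a minimal sketch of what such an upfront build can look like, here is a hypothetical Node script (scripts/ci-build.js) that a build server might run on every push; the npm script names ("test", "build") and the use of npm pack for the artifact are assumptions, not a prescribed setup.

// Hypothetical scripts/ci-build.js - the steps a build server runs on every push,
// assuming package.json defines "test" and "build" scripts.
const { execSync } = require('child_process');
const run = (cmd) => execSync(cmd, { stdio: 'inherit' });

run('npm ci');         // clean, reproducible install from the lock file
run('npm test');       // fail the pipeline if the tests fail
run('npm run build');  // concat/minify into the deployable output (e.g. dist/)
run('npm pack');       // produce a versioned tarball - the artifact you store and later deploy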
EDIT - in reaction to a comment:
The main idea is to have it continuous. Meaning the full build is kicked off on a regular basis, ideally on every commit/git push.
If this is configured on your local machine and you are a one-man shop, that is probably fine. But as I played with various projects in my free time, I found that building on every commit can be resource-intensive for my local machine, and it was convenient to leave this responsibility to some third-party service (especially when it's free).
There are plenty of online solutions for CI servers.
I have used http://codeship.com and http://drone.io with success. http://cloudbees.com gives you hosted Jenkins. For open-source projects they are free.
If your projects are not open source you will need to spend some money on it, but it should be cheap for one-man projects.
Some things to know:
I understand how to make an HTML / CSS / JS website.
I understand how to make a Node JS app and host it on Heroku
I encountered a problem that was very confusing and I am still working it out. I am making a Firebase project using their latest tree-shakeable V9 SDK:
import { initializeApp } from 'firebase/app';
I understand Node JS is meant for the backend (it can't be run in a browser), so how am I supposed to "build" this into something that I can reference in a script tag? I've heard of webpack, but that requires you to run npm run build, so how is that practical in any way? That would mean every change I make I would have to build it?
Developer Testing:
How would one live-preview this Node JS app on localhost if Node JS can't be run in a browser? I would love to be able to preview this Node JS app and quickly make changes, if that's possible.
I've heard of webpack but that requires you to run npm run build so how is that practical in any way? That would mean every change I make I would have to build it?
Yes, that's the general idea. You make changes to script files on your local machine (or network), and then in order to see the changes take effect, you rebuild the app so that a new bundle is generated. This bundle can then be hosted on your web server or development server, and once it's there, you can refresh the page to see the differences.
But almost all of this can be automated, resulting in it not really being much of a chore at all.
Nodemon is a popular tool that watches a directory for changes and can run Node scripts when a change is detected. With it, you could, for example, make a change to a file, and then it'll automatically rebuild your app.
For Webpack in particular, there's webpack-dev-server, which can automatically refresh a page when the app gets rebuilt.
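As a rough sketch of how that fits together (assuming webpack 5 with webpack-dev-server 4; the entry and output names are just examples), the dev server piece is a few lines of webpack.config.js:

// webpack.config.js - minimal example with a dev server that rebuilds and reloads on change
const path = require('path');

module.exports = {
  mode: 'development',
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js',
  },
  devServer: {
    static: path.resolve(__dirname, 'dist'),  // serve the built files on localhost
    port: 8080,
    hot: true,                                // swap in rebuilt modules when a file changes
  },
};

Running npx webpack serve (with webpack-cli and webpack-dev-server installed) then gives you the edit, rebuild, refresh loop without any manual steps.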
So, during development, the workflow can be as simple as: make a change, then alt-tab to your browser (viewing the app on localhost) to look at the changes.
Bundles built for the purpose of eventually hosting them on the front-end can easily incorporate Node modules - the build process can take the code from the module, and the bundle produced can include that code. This sort of thing is extraordinarily common.
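For the Firebase example above, a browser-bound entry file might look like the sketch below (the config values are placeholders); webpack pulls the code for 'firebase/app' out of node_modules and folds it into the bundle, and the generated dist/bundle.js is what the script tag in your HTML points at.

// src/index.js - gets bundled for the browser, e.g. into dist/bundle.js,
// which is then referenced from HTML via <script src="bundle.js"></script>
import { initializeApp } from 'firebase/app';

// Placeholder config - the real values come from your Firebase console.
const app = initializeApp({ apiKey: '...', projectId: '...' });

console.log('Firebase initialized:', app.name);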
I have a simple CLI application written in Javascript using Node that is for internal use by a small team. It runs in the Linux terminal as a CLI app. The app consists of a single ".js" file and requires a few Node packages. The problem I face now is how to deploy it to our internal team using a simple method that fits with our routine process of keeping end user computers updated.
Our app needs to be installed once per workstation / laptop and to be available to all users on that computer. Any user should be able to open a terminal and enter the command to run the app.
It seems a lot of people have discussed using Javascript for shell programming, but this issue of deploying the completed app is not widely discussed. I have not found anything on the topic. So far I have been recommended solutions that are appropriate for either development environments or web servers.
This app is not a web app and it is not deployed on a server. It needs to run offline. I am also not asking about developing or maintaining the app on a development workstation.
The installation process should ideally be about as simple as installing a shell script in /usr/local/bin and setting permissions so all permitted users on a computer can run it. We are looking for an installation method like this:
copy the Javascript file only once to each computer (to a location on the $PATH) and make sure the Node packages are available globally on that computer.
I specifically want to avoid having to do an npm install for each user account on each computer.
I also want to avoid having to update Node packages for each user account on each computer.
A developer will keep the app updated so it is always compatible with the latest version of the Node packages, and all computers it is deployed on will always have the latest versions of those packages installed.
One specific problem I encountered is discussed here, but the answers assume a different set of requirements (such as the need for "multiple applications running on different package versions").
For those requirements, if the actual problem is solving the EACCES error (you should edit the question to include that information), then you should look at the permissions of all directories, and make sure that the user account that manages Node packages on each computer has the correct permissions.
One way to do that is to give /usr/local a dedicated group, set the setgid bit on its directories with chmod so new files inherit the group (see man chmod), and use chgrp -R on the existing tree.
Then make the installing account a member of that group, and don't use sudo for npm install -g.
(Never using sudo for installations into /usr/local has the additional advantage that you can't accidentally install something somewhere else, for example because the paths for this local package setup weren't configured correctly.)
We are using these two approaches for similar deployments:
the programs live on a specific network mount. All users can run the same package from there. The developer only updates this package. No copying to local machines.
we use a simple deployment script which runs on all machines at logon. It copies the latest version to the local machine.
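A minimal sketch of that second approach, assuming a Node-based logon script with write access to the target directory; every path and name here is just an example:

// Hypothetical deploy-cli.js, run at logon on each machine.
const fs = require('fs');

const SOURCE = '/mnt/tools/mycli/mycli.js';  // network share the developer keeps up to date
const TARGET = '/usr/local/bin/mycli';       // a location on every user's $PATH

// The first line of the source file should be "#!/usr/bin/env node"
// so the copied file can be executed directly from the shell.
fs.copyFileSync(SOURCE, TARGET);
fs.chmodSync(TARGET, 0o755);                 // readable and executable by all users
console.log(`Updated ${TARGET} from ${SOURCE}`);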
In some projects, I have seen developers who didn't link to node_modules files in webpack.config.js (e.g. "./node_modules/bootstrap/dist/js/bootstrap.bundle.js"); instead, they copied the file to assets/js and linked it there. Some of my friends also told me that they prefer this option because they never feel safe linking to node_modules (I guess because somebody may run npm update...?).
What would you call good practice? Is it totally fine to link to node_modules? If not, what can go wrong?
I have linked to node_modules in small projects, as I don't think there is a need to duplicate files, but in larger ones - for peace of mind - I have used the path to assets.
It can be okay to do it. Purely from the build step perspective, it doesn't make a difference.
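To make that concrete, both of the following end up in the bundle the same way; only the place the file is read from differs (the assets path is just an example):

// Option A: let the bundler resolve the copy npm installed under node_modules
import 'bootstrap/dist/js/bootstrap.bundle.js';

// Option B: resolve a vendored copy that was committed to the repository
import './assets/js/bootstrap.bundle.js';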
The trade-offs you are making between using the node modules as npm provides them (node_modules) and storing your own copies in an assets or vendors folder are about:
security
source code management & development efficiency
storage space
When the thousands of developers around the world create little pet projects and push them to GitHub, it wouldn't make sense for all of them to store their own copy of jQuery and push it into their GitHub repo. Instead we push a package.json file that lists it as a dependency; we do this for every third-party dependency and avoid creating a repository where a lot (even most) of the code is not application code but dependencies. That is good.
On the other hand, if a developer downloads dependencies every time a new project is started/cloned/forked, you risk, with every module download, installing a compromised package version. We address this with vulnerability scanners, semantic versioning and lock files (package-lock.json), which give you control over how and when you get updates.
Another problem with always downloading is the bandwidth it consumes. This is addressed with a local cache: even if you uninstall a module from one project, npm doesn't really delete it from your drive; it keeps a copy in a cache folder. This works really well for most developers, but not so much in an enterprise environment with massive applications.
A problem that has already impacted the world severely is that if a module author decides to delete the code, lots of apps stop working because they can't find the dependency anymore. See how left-pad broke Node, Babel and more. (It also broke things at my work.)
The issue with moving things out of node_modules into assets is that if your app has 100 dependencies, you are not going to want to do that 100 times. You might as well check the complete source code found in node_modules into your source control system. That comes at a price, of course: that folder can be huge.
A good balance can be found by using different tools and approaches. Whether you vendor third-party dependencies (store your own copy) or not depends on which has the better cost/risk ratio in your situation.
I am using Visual Studio Code to run Angular projects.
While running the server through npm start, my laptop takes a lot of time compared to others. Is it related to my PC specifications, or can I do something to improve it?
I think my PC specs are suitable for npm to run smoothly:
64-bit, 2 TB HDD, i5 processor
If you are trying to just run the project to develop it, you should be using ng serve (you must have the Angular CLI installed). ng serve builds the project into RAM and watches your files for changes, so it is very fast.
If you are trying to run your project for other people to use, you should be using ng build and/or ng build --prod to build your project and then a web server to serve it from there on.
You can even add additional parameters to ng build such as --build-optimizer=true or --sourcemaps=false to optimize the output.
You can find a more detailed guide and all the parameters here.
Regarding the build time, afaik there is no way of speeding it up, but builds are usually only done every once in a while, so you should have no problems with a couple of minutes of build time.
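For the "web server to serve it" part mentioned above, one possible sketch is a tiny Express static server (express is an assumption here, and dist/my-app stands for whatever folder ng build produced):

// Hypothetical serve.js - serves the output of `ng build --prod`
const express = require('express');
const path = require('path');

const app = express();
const dist = path.join(__dirname, 'dist', 'my-app');  // adjust to your project's output folder

app.use(express.static(dist));
// Return index.html for any other path so the Angular router can take over.
app.get('*', (req, res) => res.sendFile(path.join(dist, 'index.html')));

app.listen(8080, () => console.log('Production build available at http://localhost:8080'));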
I am writing a Node.js application with Angular as my front end.
For this I am using Git as my code management server.
For the client, I am doing minification and it is ready for production.
But I am not sure how to prepare the server-side files for production.
Do we need to just copy all the Git folders onto the production server?
Let me know the best way to deploy a Node.js server application.
You could use pm2 as your daemon to keep your Node.js app up all the time.
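If you go the pm2 route, a minimal sketch of an ecosystem file could look like this (the app name and entry script are placeholders); pm2 start ecosystem.config.js then starts and supervises the process:

// ecosystem.config.js - example pm2 process definition
module.exports = {
  apps: [
    {
      name: 'my-app',                    // placeholder name
      script: './server.js',             // your server entry point
      instances: 1,
      autorestart: true,                 // bring the process back up if it crashes
      env: { NODE_ENV: 'production' },
    },
  ],
};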
Try not to include node_modules in the repo, because different machines have different setups/installations; you cannot tell whether a package will work until you npm install it on that machine.
If you are familiar with Docker, use it: pre-bundle all files (including node_modules) into the Docker image, and you do not need pm2 here - Docker itself can restart the container automatically. This is the ideal approach.
It really depends on how you (or your company) want to organize the workflow and the size of the project.
Sometimes I too use a Git repository, because then it is really simple to update: just a git pull and (if server files were edited) a pm2 restart N command.
This way, you don't have to install the whole development stack on the server in order to compile (and minify) the bundles - I assume you work on your local machine, where all the development tools are installed.
Keep in mind to use the --save-dev flag when installing packages that are only required during development, so you can keep the production server as slim as possible (npm install --production skips devDependencies).
A good practice I found is to add a random token (e.g. a content hash) to the final bundle filenames (both for JS and CSS), which is then injected into the final static HTML files, so browsers don't keep serving a stale cached bundle after a deploy.
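With webpack, for example, that token can be a content hash in the output filename (a sketch, not a full config); a plugin such as html-webpack-plugin can then inject the generated name into the static HTML for you:

// webpack.config.js (excerpt) - hashed bundle names for cache busting
module.exports = {
  output: {
    filename: 'bundle.[contenthash].js',  // e.g. bundle.3a9f0c12.js - changes whenever the content changes
  },
};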
Once you have the bundle files on your dev machine, just upload them to the server (FTP, git, rsync, sshfs mount, whatever you like) and (if server files were edited) restart/reload the Node process (I'm using pm2 for this, it's really great). If you only edited client files, no reload is needed.
Starting from here, there are a lot of more or less sophisticated ways to do the job - Git pipelines, for example - but it depends on the situation.
Edit: this is a good article about task runners (gulp vs grunt vs vanilla npm); while it may be a little off topic, it analyzes some aspects of the common deployment process.