I've got two projects which I've created:
A web UI built using webpack
A Vert.x web server written in Java, built using Gradle
I want to find a way to bring the resulting build directory contents of the first project into the second as the webroot, which will be served up using the StaticHandler.
Is anyone aware of a clean way to do this? I want to preserve the two git projects as they are because I like using the webpack dev server for development of the UI and it generally feels cleaner to have them separated.
I was looking at potentially using the Bitbucket Pipelines build on my repo, but bringing the assets generated by the first project into the build of the second is where I'm facing issues.
You could create a Gradle task that runs the webpack build and outputs the result into the resources directory, and make the jar task depend on it (so the webpack step runs first). That way, when your jar task runs, it bundles the compiled webpack code.
My web development is principally intranet sites and web front-ends for embedded devices using NodeJS.
My current structure is to have everything in one NPM package. I run NodeJS behind Nginx, and let Nginx serve css/image/client-side-javascript files directly from public.
I'm starting to use VueJS and Vuelidate, both of which use the now-standard ES6 module system - e.g. import { required, minLength } from 'vuelidate/lib/validators'.
While I've (rather hackily) made these work with my current structure, I think the time has come to get into the world of JavaScript build systems/bundlers.
If I use VueJS's preferred option of webpack, how should I change the structure of my code?
Should I have one NPM package for the frontend (generated by vue-cli init) and another for the Express backend app?
Should I put my Express App into the generated Vue frontend package?
Should I use Browserify to do the job of webpack and stay with my existing structure?
Or something else entirely?
I’m not sure why you’re intent on putting your JavaScript code in other packages. If you have an application then you can keep your raw JavaScript files in there, along with the build script(s). Someone should be able to check your application out and build it.
If you’re looking to get started with a build system, then a nice “bridge” might be to use Mix, a project created by Laravel for building front-end assets such as Sass and JavaScript. It uses webpack under the hood, but in turn exposes a more user-friendly, fluent API.
If you go down this route, then you could put your raw JavaScript files in a lib/ or src/ directory. You could then use Mix to compile these components like this:
mix.js('lib/your-entry-point-script.js', 'public/js/app.js');
Your entry-point script would just be the script that requires all your other scripts and components, i.e. the one script that you want “built”. Mix would then compile that and place the resulting script at public/js/app.js.
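For instance, a hypothetical entry-point script for the Vue/Vuelidate setup described in the question might look roughly like this (the file name matches the Mix example above; the element id and field names are assumptions):

// lib/your-entry-point-script.js (hypothetical) - the single file that Mix compiles
import Vue from 'vue';
import Vuelidate from 'vuelidate';
import { required, minLength } from 'vuelidate/lib/validators'; // same import as in the question

Vue.use(Vuelidate);

new Vue({
  el: '#app', // assumes an element with id="app" in your HTML
  data: { name: '' },
  validations: {
    name: { required, minLength: minLength(3) },
  },
});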
Mix itself is just an npm package, so all you need to do is npm install laravel-mix --save-dev.
You can read more about Mix at https://laravel.com/docs/5.7/mix
I am writing a nodejs application with Angular as my front end.
For this, I am using Git as my code management server.
For the client, I am doing minification and it is ready for production.
But I am not sure how to prepare the server-side files for production.
Do we need to just copy all the Git folders onto the production server?
Let me know the best way to deploy a Node.js server application.
You could use pm2 as your daemon to keep your nodejs app up all the time.
Try not to include node_modules in the repo, because different machines have different setups/installations; you cannot tell whether a package will work until you npm install it on that machine.
If you are familiar with Docker, use it: pre-bundle all the files (including node_modules) into the Docker image, and you do not need pm2 here, since Docker itself can restart the container automatically. This is the ideal approach.
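If you do go the pm2 route instead, a minimal process file might look something like this (just a sketch; the app name and entry script are assumptions):

// ecosystem.config.js (hypothetical)
module.exports = {
  apps: [
    {
      name: 'my-app',          // assumed app name
      script: './server.js',   // assumed entry point
      instances: 1,
      autorestart: true,       // restart the app if it crashes
      env_production: { NODE_ENV: 'production' },
    },
  ],
};

You would then start it with pm2 start ecosystem.config.js --env production, and pm2 keeps the process up and restarts it if it crashes (pm2 startup and pm2 save cover reboots).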
It really depends on how you (or your company) want to organize the workflow, and on the size of the project.
Sometimes I too use a Git repository, because then it is really simple to update: just a git pull and (if the server files changed) a pm2 restart N command.
This way, you don't have to install the whole development stack on the server in order to compile (and minify) the bundles - I guess you work on your local machine, where all the development tools are installed.
Remember to use the --save-dev flag when installing packages that are only required in development, so you can keep the production server as slim as possible.
A good practice I found is to add a random token to the final bundle filenames (both for JS and CSS) that then gets injected into the final static HTML files, so that browsers don't keep serving a stale cached bundle and users don't have to keep refreshing the page.
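With webpack, for instance, this kind of cache-busting hash can be generated automatically; a rough sketch (the entry point and HTML template paths are assumptions, and html-webpack-plugin takes care of injecting the hashed name):

// webpack.config.js (sketch)
const path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
  mode: 'production',                  // enables minification
  entry: './src/index.js',             // assumed entry point
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'app.[contenthash].js',  // name changes whenever the bundle content changes
  },
  plugins: [
    // writes dist/index.html with the hashed script name already injected
    new HtmlWebpackPlugin({ template: './src/index.html' }),
  ],
};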
Once you have the bundle files on your dev machine, just upload them to the server (ftp, git, rsync, sshfs mount, whatever you like) and (if the server files changed) restart/reload the node process (I'm using pm2 for this, it's really great). If you only edited client files, no reload is needed.
Starting from here, there are a lot of more or less sophisticated ways to do the job, like Git pipelines for example, but it depends on the situation.
Edit: this is a good article about task runners (Gulp vs Grunt vs vanilla npm); while it may be a little off topic, it analyzes some aspects of the common deployment process.
Question 1:
I am installing my project's dependency libraries using npm, and they get stored in the node_modules folder. Is it necessary to keep a copy of libraries like angular.js and angular-route.js in a lib or vendor folder? I have seen a few people using a lib or vendor folder to store the libraries permanently, and I am confused by this.
Question 2:
Do I need to copy/paste the node_modules folder to production, or just run the npm install command in the project folder's command prompt to install all the dependencies in production? How does a dependency library get promoted to production?
Thank you kindly for your advice.
It all depends on how you need to deploy your site to production, really. Ultimately, you will probably want to bundle all your JS files into one or a few files, which are minified and sent with gzip compression.
How you bundle them is up to you. There are quite a few options:
Browserify
Webpack
Grunt / gulp build process
And many more besides
As to whether you need to keep a copy of these bundled JavaScript files under version control, I think that boils down to one key question: can you run a build process (such as one of the Node.js-based tools above) on the production server, or on a build server that creates a zip file or installer? If so, then you don't need to include them; just get the build server or production server to check out the latest copy from version control, npm install, and then run the build process.
But if the best you can do is have the production server check files out from source control, then you would want to include the final, built versions of the files in the repository.
Keeping generated files, such as your bundled JavaScript files, in your source control repo should be avoided where possible, because otherwise every commit has to contain both the changes to the source files and the corresponding changes to the generated files. The latter is just noise that has to be ignored by every developer looking at a diff/patch for a commit.
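To make question 1 concrete: once a bundler is involved, your source code can pull libraries straight out of node_modules, so there is no need for a permanent lib or vendor copy. A small sketch, assuming angular and angular-route were installed via npm:

// app.js (hypothetical) - the bundler resolves both imports from node_modules at build time
import angular from 'angular';
import 'angular-route'; // registers the ngRoute module as a side effect

const app = angular.module('app', ['ngRoute']);

export default app;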
I would like you to share your way of deploying complicated JS projects, where Grunt or Gulp are used.
For example, the grunt build command concatenates CSS and JS files and puts minified Bower dependencies into the dist folder. As far as I know, we should not store build results in the version control repo, should we? Also, the development environment is not needed on the production server.
That is why a flow like git push production, then grunt build on the production server, then restarting it, is not good practice, is it?
The purpose of the question is to find out how I should deploy complicated JS projects when:
Building is necessary.
Building should not be done on the production server. (Or is that normal practice?)
Build results should not be tracked by the version control system.
Deployment should not be done manually.
I have a Spring Boot project that is built with Gradle. All my front-end code lives under src/main/resources/static. This also includes my bower_components, node_modules (for Grunt tasks), etc.
Right now, I have my main Gradle build script exec the Grunt build, which concatenates/minifies all my JavaScript, and they go under src/main/resources/static/dist. Then, when processResources gets executed in Gradle, the entire src/main/resources/static/dist gets copied to the build directory. This doesn't seem right to me - the only stuff that should end up in the build directory are the concatenated/minified JS, CSS, HTML and images.
I've searched high and low but haven't found any clear direction here. Is there a best practice for building a project in this way?
Files generated as part of the build shouldn't go into src, but somewhere under the build directory (say build/grunt). What's left is to include them in archives as necessary. For example:
war.dependsOn(grunt)
war {
    from "build/grunt"
}
Or, if the Grunt task declares its outputs:
war {
    from grunt
}