Production build is too large - JavaScript

In my React app created with create-react-app, the final production build comes out to a 2.3 MB JS file and a 300 KB CSS file (with very little actual CSS).
When I run npm run build, which uses react-scripts to produce the build, the output says it is creating an optimized production build.
I even tried to force it with npm run build --env=production, and still get the same unoptimised output.
What I feel is missing is UglifyJS; minification of the JS files does not seem to be happening, as I see my JS files are not .min.js.
Also, the final output says it has been gzip-compressed and its size is around 400 KB of JS. If that is so, how do I serve my files to get that optimisation?
P.S.: I also tried a suggested solution of excluding source map generation to reduce size, but I feel that only reduces the overall build size, not the JS files individually.
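For context on the gzip figure: the sizes create-react-app prints after a build are post-gzip sizes, i.e. what the browser downloads once the server compresses responses. A minimal sketch of serving the build folder with compression enabled, assuming an Express server with the compression middleware installed (file name and port are illustrative):

// server.js (sketch) - serve the CRA build output with gzip compression
const express = require('express');
const compression = require('compression');
const path = require('path');

const app = express();
app.use(compression());                                   // gzip every compressible response
app.use(express.static(path.join(__dirname, 'build')));   // CRA's production output folder

app.listen(3000, () => console.log('Serving build/ on http://localhost:3000'));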

If you really want a smaller bundle size, you could try code splitting with webpack. For this you need control over your webpack configuration, so you would have to eject your create-react-app setup.
After that you can configure webpack to use code splitting. Code splitting simply produces multiple bundles instead of one big bundle containing all of your code and its dependencies. You can then send only the bundle that is necessary for a certain page and load the remaining bundles in the background.
When done properly, this can significantly lower the initial load time of your React application.
Here you can find how to implement code splitting.
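To illustrate the idea: with webpack, code splitting is usually driven by dynamic import() calls, each of which is emitted as a separate chunk that is only fetched when needed. A minimal React-flavoured sketch (the component path and names are illustrative):

// App.js (sketch) - a page loaded lazily so it lands in its own chunk
import React, { Suspense, lazy } from 'react';

// import() tells webpack to split ./pages/Dashboard (a default export) into a separate bundle
const Dashboard = lazy(() => import('./pages/Dashboard'));

export default function App() {
  return (
    // The fallback renders while the Dashboard chunk is being fetched
    <Suspense fallback={<div>Loading...</div>}>
      <Dashboard />
    </Suspense>
  );
}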

Related

Remove source path comment from __webpack_require__ in Laravel Mix v5 dev build

When doing a development build in a Laravel project with Laravel Mix 5.0.9 (npm run dev), the .js output files are full of __webpack_require__ function calls that seem to import files from the "resources/" folder.
But along with the actual path string, relative to the project root, there's a comment showing the current computer's absolute directory path.
This is making my CI/CD pipeline think there are changes and perform unnecessary steps even though no actual code has changed, for example when a build is triggered on the same commit that is already deployed.
How can I disable those absolute path comments and make the build use only paths relative to the project root, so the build stays consistent across different machines?
I've tried:
no variation of the Mix uglify option to strip comments worked
the Mix option terser.extractComments: true (or terser.terserOptions.output.comments: false) does nothing
inserting TerserPlugin via mix.webpackConfig without any config effectively does a prod build, plus it extracts doc comments to separate .txt files, and I could not figure out how to only extract the comments and disable everything else
setting resourceRoot didn't change the output (I don't even know if it should, but I'm an hour past working hours and out of ideas :D )
P.S. It doesn't happen on prod builds, but I do need to build as dev for the staging environment.
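For reference, the Terser-related options mentioned above are usually set in webpack.mix.js roughly like this (a sketch assuming Laravel Mix 5 and its default Laravel paths; note that Terser only runs when minification is enabled, which is why these options have no visible effect on a dev build):

// webpack.mix.js (sketch) - the Terser options referenced in the question
const mix = require('laravel-mix');

mix.js('resources/js/app.js', 'public/js')
   .options({
       terser: {
           extractComments: false,          // don't write *.LICENSE.txt files
           terserOptions: {
               output: { comments: false }, // strip comments from minified output
           },
       },
   });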

Sass output style: automate with a minified file

I'm doing some tests to learn to use Sass correctly, and I want to create a file style.css and minify it to style.min.css.
sass --watch sass/style:css --style compressed
That works well, but I need to automate the process, compiling and minifying at the same time.
I found this code in another Stack Overflow answer:
sass --watch sass/style.scss:css/style.css --watch css/style.css:css/style.min.css --style compressed --scss
but it doesn't work the way I want it to.
Also, the code on the Sass webpage is different now in 2020:
https://sass-lang.com/documentation/cli/dart-sass#style
I'm not using the gulp tool because I don't think it's necessary for WordPress projects.
Can anybody help me?
Irrespective of what you're building your site in, if you want total control over compilation, minification and so on, that's exactly what tooling such as gulp is there for.
It looks like you were trying to compile SCSS to a CSS file, then take that compiled CSS file and minify it. The CLI tool won't chain outputs like that, but it can absolutely do the compilation and compression in one step, directly from the source SCSS.
Using the binary you're using, the following will compile and compress, taking ./sass/style.scss and outputting the result to ./css/style.min.css:
sass sass/style.scss:css/style.min.css --style compressed
Add --watch if you want it to react to file changes in your SCSS file.
Or perhaps you were trying to get an unminified and a minified version side by side. In that case, you'll simply have to run two commands. Again, gulp is there to automate this kind of process.
Other binaries will have different flags and options, and of course there is the gulp option, which I'd certainly recommend given that you then don't have to remember any lengthy commands and you can share your chosen structure/tasks across projects.
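For completeness, a minimal gulpfile sketch that produces both an expanded and a minified stylesheet from the same source, assuming gulp, gulp-sass (v5, wrapping Dart Sass) and gulp-rename are installed; the paths mirror the ones in the question:

// gulpfile.js (sketch) - compile sass/style.scss to css/style.css and css/style.min.css
const gulp = require('gulp');
const sass = require('gulp-sass')(require('sass'));
const rename = require('gulp-rename');

// Expanded build: sass/style.scss -> css/style.css
function cssExpanded() {
  return gulp.src('sass/style.scss')
    .pipe(sass({ outputStyle: 'expanded' }).on('error', sass.logError))
    .pipe(gulp.dest('css'));
}

// Compressed build: sass/style.scss -> css/style.min.css
function cssMinified() {
  return gulp.src('sass/style.scss')
    .pipe(sass({ outputStyle: 'compressed' }).on('error', sass.logError))
    .pipe(rename({ suffix: '.min' }))
    .pipe(gulp.dest('css'));
}

// `npx gulp` builds both once; `npx gulp watch` rebuilds whenever the SCSS changes
const build = gulp.parallel(cssExpanded, cssMinified);
exports.default = build;
exports.watch = function watchScss() {
  gulp.watch('sass/**/*.scss', build);
};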

Vert.x web server static content webroot

I've got two projects which I've created:
A web UI built using webpack
A Vert.x web server written in Java, built using Gradle
I want to find a way to bring the resulting build directory contents of the first project into the second as the webroot, which will be served up using the StaticHandler.
Is anyone aware of a clean way to do this? I want to preserve the two git projects as they are because I like using the webpack dev server for development of the UI and it generally feels cleaner to have them separated.
I was looking at potentially using Bitbucket Pipelines on my repo; however, bringing the assets generated by the first project into the build of the second is where I'm facing issues.
You could create a Gradle task that the jar task depends on (so it runs before it) and have it run the webpack compile into the resources directory. Then, when your jar task runs, it bundles the compiled webpack output.
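On the webpack side of that, one way to sketch it (assuming webpack 5 and a hypothetical layout with the two repositories checked out side by side; Vert.x's StaticHandler serves src/main/resources/webroot by default) is to point webpack's output directly at the server project's webroot:

// webpack.config.js (sketch) - emit the UI build into the Vert.x project's webroot
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './src/index.js',
  output: {
    // Hypothetical layout: ../vertx-server is the Gradle project;
    // StaticHandler picks up src/main/resources/webroot by default.
    path: path.resolve(__dirname, '../vertx-server/src/main/resources/webroot'),
    filename: 'bundle.js',
    clean: true, // webpack 5: wipe stale assets before each build
  },
};

The Gradle task described above would then just run the webpack build (for example via an Exec task) before jar, so the emitted files are already in place when the jar is assembled.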

Do I need to keep a copy of a JS library in a lib or vendor folder even though it is already installed using npm?

Question 1:
I am installing my project's dependency libraries using npm, and they get stored in the node_modules folder. Is it necessary to keep a copy of libraries like angular.js and angular-route.js in a lib or vendor folder? I have seen a few people using lib or vendor folders to store libraries permanently, and this confuses me.
Question 2:
Do I need to copy the node_modules folder to production, or just run npm install in the project folder on the production machine to install all the dependencies there? How does a dependency library get promoted to production?
Thank you kindly for your advice.
It all depends on how you need to deploy your site to production, really. Ultimately, you will probably want to bundle all your JS files into one or a few files, which are minified and sent with gzip compression.
How you bundle them is up to you. There are quite a few options:
Browserify
Webpack
Grunt / gulp build process
And many more besides
As to whether you need to keep a copy of these bundled JavaScript files under version control, I think that boils down to one key question: can you run a build process (such as one of the NodeJS-based tools) on the production server, or on a build server that creates a zip file or installer? If so, then you don't need to include them; just have the build server or production server check out the latest copy from version control, run npm install and then run the build process.
But if the best you can do is have the production server check files out from source control, then you would want to include the final versions of the files in the repository.
Keeping generated files, such as your bundled JavaScript files, in your source control repo should be avoided where possible. Otherwise, every commit has to contain the changes to the source files and the corresponding change to the generated files as well, and the latter is just noise that every developer has to ignore when looking at a diff/patch for a commit.
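As one concrete example of the bundling step, here is a minimal webpack production config, which bundles your app code and its npm dependencies into a single minified file (the entry and output paths are illustrative):

// webpack.config.js (sketch) - one minified bundle from app code plus npm dependencies
const path = require('path');

module.exports = {
  // 'production' mode enables minification (Terser) out of the box
  mode: 'production',
  entry: './src/main.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.min.js',
  },
};

With something like this checked in, the build or production server only needs to run npm install followed by the bundler, and the dist/ output never has to live in version control.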

Bundler for JavaScript, or how to source control external JavaScript files

I am in the process of converting an existing Rails 3.1 app I made for a client into a Backbone.js app with the Rails app only as a backend server extension. This is only a personal project of mine, to learn more about Backbone.js.
While setting up Backbone.js (using Backbone-on-Rails), I noticed I have some dependencies (like backbone-forms) that come from external sources and are frequently updated.
I've grown accustomed to using Bundler to manage my Ruby gems, but I haven't found anything similar for JavaScript files. I'm wondering if there is any way to do the same for Javascript (and possibly css) files.
Basically I can see three possibilities to solve this issue:
Simply write down all the sources for each JS file and check these sources from time to time to see what has changed.
Use some kind of existing "Bundler for JavaScript" type of tool; I've been looking for something like this but have yet to find anything (good).
Since most of these JS files will be coming from Git anyway, use Git to get the files directly and use checkout to get the latest version from time to time.
I prefer the last option, but was hoping for some more input from other people who have gone this route or preferred some other way to tackle this issue (or is this even an issue?).
I figure the Git way seems easy, but I am not quite sure yet how I could make this work nicely with Rails 3.1 and Sprockets. I guess I'd try to checkout a single file using Git and have it be cloned in a directory that is accessible to Sprockets, but I haven't tried this yet.
Any thoughts?
You don't mention it in your alternatives, but ideally you should use something like Maven to manage your dependencies. Unfortunately, there are no public repositories for JavaScript files. This discussion lists some other options which might be of help to you: JQuery Availability on Maven Repositories
For now I've settled on using the Git solution combined with some guard-shell magic.
The steps I follow:
Create a dependencies directory somewhere on your local drive
Clone the repositories with javascript (or css) files you want to use in the app
Set up a custom guard-shell command to do the following:
group 'dependencies' do
  guard 'shell' do
    dependencies = '~/path/to/dependencies/'
    watch(%r{backbone-forms/src/(backbone\-forms\.js)}) { |m| `cp #{dependencies + m[0]} vendor/assets/javascripts/#{m[1]}` }
  end
end
Place the Guardfile at the root of the app directory
It takes some time to set things up, but after that, when you have Guard running and you pull changes into your dependencies, the required files are automatically copied to your application directory, where they become part of your repository.
It seems to work great. You need to do a little work for each new file you want to include in the asset pipeline, but all that is required is cloning the repository into your dependencies directory and adding a single line to your Guardfile, for example for the backbone-forms CSS:
watch(%r{backbone-forms/src/(backbone\-forms\.css)}) {|m| `cp #{dependencies + m[0]} vendor/assets/stylesheets/#{m[1]}` }
Also, the reason I added this Guard to a group is that I keep my dependencies outside the main application directory, which means Guard normally doesn't watch my dependencies directory. To make this work, I start my main Guard processes using bundle exec guard -g main and run bundle exec guard -w ~/path/to/dependencies -g dependencies in a new terminal window/tab to specify the -w(atchdir).
