Webpack leaves JS in cache file when compiling multiple JS libraries - javascript

I'm trying to configure Webpack so that it compiles multiple JS files. But when I do this, most of the JS is left in a shared "cache" file whose contents aren't included in my theme files.
Generated files:
js
-- theme--01.js (134kb)
-- theme--02.js (134kb)
-- theme--03.js (134kb)
-- theme--04.js (134kb)
-- themevendors-lib_common_scripts_global_libraries_js-lib_common_scripts_global_smoothscroll_min_js--8bace3.js (753kb)
When I generate one file it works correctly:
js
-- theme--01.js (879kb)
How my entry points are configured:
entry: {
  "--01": path.resolve(process.cwd(), 'src', 'theme--01.ts'),
  "--02": path.resolve(process.cwd(), 'src', 'theme--02.ts'),
  "--03": path.resolve(process.cwd(), 'src', 'theme--03.ts'),
  "--04": path.resolve(process.cwd(), 'src', 'theme--04.ts'),
},
Here's my complete Webpack config on JSfiddle as StackOverflow won't let me paste the whole thing here: https://jsfiddle.net/charlievaughan/rpeztbLj/
I need to generate multiple files from the same codebase for different locales and to optimise page-load speeds / Lighthouse scores.
Using Webpack v5.62.1

I fixed this issue by removing the following from my Webpack config:
splitChunks: {
  chunks: 'all'
}
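For reference, a sketch of where that option sat and why it produced the extra file (the mode and output block below are assumptions based on the generated file names, not the asker's exact config): with chunks: 'all', webpack pulls modules shared by several entries out into a separate vendors chunk (the long themevendors-... file), which every page then has to load alongside its own bundle. Dropping the option, or leaving splitChunks at its defaults, keeps each theme--XX.js self-contained.
// webpack.config.js (sketch; TypeScript loader config omitted)
const path = require('path');

module.exports = {
  mode: 'production',
  entry: {
    "--01": path.resolve(process.cwd(), 'src', 'theme--01.ts'),
    "--02": path.resolve(process.cwd(), 'src', 'theme--02.ts'),
  },
  output: {
    path: path.resolve(process.cwd(), 'js'),
    filename: 'theme[name].js', // produces theme--01.js, theme--02.js, ...
  },
  // The setting that was removed: with it, shared code is split into a
  // separate chunk instead of being duplicated into each theme bundle.
  // optimization: {
  //   splitChunks: {
  //     chunks: 'all'
  //   }
  // },
};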

Related

Webpack minify js and compile scss in multiple subdirectories

I am trying to create a build system using webpack for a better development workflow with a PHP project.
The source repos are all cloned in a /vendor/ directory, but to be accessible to PHP, they need to land in a /source/ directory. My plan is to work in the vendor dir and have webpack automatically copy all files over when they are changed.
This is working out pretty well with the following webpack.config.js:
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  mode: "development",
  entry: [], // empty so far, since only the CopyPlugin does something atm
  output: { path: __dirname },
  plugins: [
    new CopyPlugin({
      patterns: [
        {
          from: "**/*",
          to: ".",
          context: "vendor/myVendor/",
          transformPath(targetPath, absolutePath) {
            return "source/modules/myVendor/" + targetPath;
          },
          globOptions: {
            ignore: ["**/resources/**/*"],
          },
        },
      ],
    })
  ],
};
As you can see, this copies all files in a module that are not in a resources folder; the resources folder is what this question is about.
The folder structure is like so:
/vendor
  /myvendor
    /mymodule1                 <- just one of many modules - this should work for all of them
      composer.json
      /Application
        /Controller
          TestController.php
      /resources
        /js/minifyme.js        <- this should be processed...
        /css/coolcss.sass      <- ...also this one...
      /out
        /src
          /js/minifyme.min.js  <- ...and copied here
          /css/coolcss.css     <- ...or here, respectively
My plan is to have JavaScript and CSS compiled/transpiled and put into a relative folder /out/src/, from where they are then copied to the /source folder by the CopyPlugin.
What is the best way to do this?
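One way to read that plan as config (a sketch only, not a vetted answer, assuming webpack 4+ with mini-css-extract-plugin, css-loader and sass-loader installed; the entry names and paths are assumptions):
// webpack.resources.config.js (sketch of the plan described above)
const path = require("path");
const MiniCssExtractPlugin = require("mini-css-extract-plugin");

module.exports = {
  mode: "development",
  entry: {
    minifyme: "./vendor/myvendor/mymodule1/resources/js/minifyme.js",
    coolcss: "./vendor/myvendor/mymodule1/resources/css/coolcss.sass",
  },
  output: {
    // emit processed assets into the module's out/src folder, from where
    // the CopyPlugin step above can then move them into /source
    path: path.resolve(__dirname, "vendor/myvendor/mymodule1/out/src"),
    filename: "js/[name].min.js", // note: the CSS entry also emits a small JS stub
  },
  module: {
    rules: [
      {
        test: /\.s[ac]ss$/,
        use: [MiniCssExtractPlugin.loader, "css-loader", "sass-loader"],
      },
    ],
  },
  plugins: [new MiniCssExtractPlugin({ filename: "css/[name].css" })],
};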

How to stop purging tailwind components

I am using TailwindCSS and Laravel Mix. I am trying to set up PurgeCSS, and I have got it reading my template files (I'm working with WordPress) and purging any CSS not used within the template files.
However, as part of Tailwind, I am using @apply in my scss files, and the utilities I am applying are also being purged, which leaves me with a non-functioning site.
My sass files are in css/dev; there is an app.scss, and then directories (base, utilities, custom, components) with more files within them.
My webpack.mix.js file configuration is as follows:
mix.scripts(['js/dev/app.js', 'js/dev/navigation.js', 'js/dev/skip-link-focus-fix.js'],
    'js/build/app.js')
  .sass('css/dev/app.scss', 'css/build')
  .options({
    processCssUrls: false,
    postCss: [tailwindcss('./tailwind.config.js')],
  })
  .purgeCss({
    enabled: mix.inProduction(),
    // Your custom globs are merged with the default globs. If you need to
    // fully replace the globs, use the underlying `paths` option instead.
    globs: [
      path.join(__dirname, 'template-parts/*.php'),
      path.join(__dirname, '*.php'),
      path.join(__dirname, 'css/dev/*.scss'),
      path.join(__dirname, 'css/dev/**/*.scss'),
    ],
    extensions: ['html', 'js', 'php', 'scss', 'css'],
  });
As you can see, I tried setting the purgeCss globs to look inside the CSS paths, but that has not worked.
Does anyone know how to achieve this?
You are compiling your scss to css before Purge runs, so there should be no need to purge your .scss files, only your main.css (or whatever the output file is called).
Do your compiled class names actually exist in full in your template files? If they are not a 100% match for the classes in your templates then they will, quite correctly, be purged.
The issue was that WordPress classes were not included in the template files and so were being purged. The solution was switching to UnCSS, which let me set up URLs for UnCSS to visit; it won't purge any classes used on those pages. I also ignored some standard WordPress classes from a list I found online.
My final config is:
const uncss = require('postcss-uncss');

mix.js('js/dev/app.js', 'js/build/app.js')
  .sass('css/dev/app.scss', 'css/build')
  .options({
    processCssUrls: false,
    postCss: [
      tailwindcss('./tailwind.config.js'),
      ...process.env.NODE_ENV === 'production' ? [uncss({
        html: [
          './*.php',
          './template-parts/*.php',
          'https://www.example.com',
          'https://www.example.com/work/',
          'https://www.example.com/work/example-project/',
          'https://www.example.com/contact/',
          'https://www.example.com/blog/',
          'https://www.example.com/blog/laravel-php-framework/',
        ],
        ignore: [
          '.rtl',
          '.home',
          '.blog',
          '.archive',
          '.date',
          '.error404',
          '.logged-in',
          '.admin-bar',
          '.no-customize-support',
          '.custom-background',
          '.wp-custom-logo',
          '.alignnone',
          '.alignright',
          '.alignleft',
          '.wp-caption',
          '.wp-caption-text',
          '.screen-reader-text',
          '.comment-list',
          '.grecaptcha-badge',
          /^search(-.*)?$/,
          /^(.*)-template(-.*)?$/,
          /^(.*)?-?single(-.*)?$/,
          /^postid-(.*)?$/,
          /^attachmentid-(.*)?$/,
          /^attachment(-.*)?$/,
          /^page(-.*)?$/,
          /^(post-type-)?archive(-.*)?$/,
          /^author(-.*)?$/,
          /^category(-.*)?$/,
          /^tag(-.*)?$/,
          /^tax-(.*)?$/,
          /^term-(.*)?$/,
          /^(.*)?-?paged(-.*)?$/,
          '.animate',
          '.animated',
          '.bounce',
          '.fadeInDown',
          '.fadeIn',
          '.fadeInUp',
          '.jackInTheBox',
        ]
      })] : [],
    ]
  });
I also made use of the UnCSS comments that exclude a block of CSS from purging:
/* uncss:ignore start */
my css goes here
/* uncss:ignore end */
I ended up using this on all my custom sass files except for the tailwind files so that the only selectors that are purged are tailwind utilities, which saved me about 300 KB.

How to minify a source file (JavaScript) into a minified file using uglifyjs-webpack

I want to minify my files using uglifyjs-webpack.
For example, I have a source file core/js/test and want to minify it and output it to min/js/test.
Given just a source file and its respective output, how do I use it with webpack?
Since webpack 4, I often just use the optimization: { minimize: true } option (see this).
const path = require("path");

module.exports = {
  context: __dirname,
  entry: "/origin/main.js",
  output: {
    path: path.resolve("./min/js/"),
    filename: "test.js"
  },
  optimization: {
    minimize: true // enable minification (on by default in production mode)
  }
};
However, webpack still allows you to override the default minimizer using the UglifyjsWebpackPlugin. I'd advise you to have a look at the docs here; they explain it quite well.
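For reference, a minimal sketch of what that override looks like (assuming uglifyjs-webpack-plugin is installed; the options shown are illustrative, not required):
const UglifyJsPlugin = require("uglifyjs-webpack-plugin");

module.exports = {
  // ...entry/output as above...
  optimization: {
    minimize: true,
    // replace webpack's default minimizer with UglifyJS
    minimizer: [
      new UglifyJsPlugin({
        sourceMap: true,
        uglifyOptions: { compress: { drop_console: true } }
      })
    ]
  }
};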
You might like this website; it will help you minify your files:
minifier

Extracting common chunks amongst multiple compiler configurations in webpack?

I'm trying out the multi-compiler option in webpack and am following the example in their GitHub repo. However, I can't seem to understand how to split out the common code amongst the multiple configurations.
For example, I may have the same vendor libraries used in the different sets of configurations. I would like these shared libraries to be bundled into one single common file.
I tried the following, but it ended up creating an individual bundle of the vendors entry for each compiler configuration.
var path = require("path");
var webpack = require("webpack");

module.exports = [
  {
    name: "app-mod1",
    entry: {
      vendors: ['jquery', 'react', 'react-dom'],
      pageA: ['./mod1/pageA'],
      pageB: ['./mod1/pageB']
    },
    output: {
      path: path.join(__dirname, "/mod1/js"),
      filename: "[name].bundle.js"
    },
    plugins: [
      new webpack.optimize.CommonsChunkPlugin({
        names: ['vendors'],
        minChunks: Infinity
      })
    ]
  },
  {
    name: "app-mod2",
    entry: {
      vendors: ['lodash', 'react', 'react-dom'],
      pageA: ['./mod2/pageA'],
      pageB: ['./mod2/pageB']
    },
    output: {
      path: path.join(__dirname, "/mod2/js"),
      filename: "[name].bundle.js"
    },
    plugins: [
      new webpack.optimize.CommonsChunkPlugin({
        names: ['vendors'],
        minChunks: Infinity
      })
    ]
  }
];
Since react and react-dom are shared between the two compilations, my intention is for them to be bundled as a single file that can be shared, instead of creating the same bundle for each compilation.
How can I extract the common chunks out of multiple compiler configurations?
Brief answer
You can't do that job in the way you want.
TL;DR
@Carven, I am afraid you can't achieve your goal via Webpack's MultiCompiler; MultiCompiler is not meant to do that job, at least not in the near future.
See the source code for initiating the MultiCompiler instance: it actually initiates separate Compiler instances, and these compilers share no data between them.
See also the source for running a MultiCompiler instance: the compilers also run separately, without sharing data.
The only thing these compilers share is the Stats instances they produce, which are merged into a MultiStats.
By the way, there is no indication in the example you mentioned that any modules are shared between the compilers.
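To illustrate (a sketch using webpack's Node API; the config file names are placeholders): running an array of configs gives you a single callback with a merged MultiStats, but each config is still compiled by its own independent Compiler.
const webpack = require("webpack");
const configA = require("./webpack.mod1.config");
const configB = require("./webpack.mod2.config");

// An array of configs creates a MultiCompiler: each config gets its own
// Compiler instance; only the resulting stats are merged into a MultiStats.
webpack([configA, configB], (err, multiStats) => {
  if (err) throw err;
  console.log(multiStats.toString({ colors: true }));
});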
Alternative
As described by @Tzook-Bar-Noy, IMHO you have to use multiple entries to achieve MPA (multi-page application) bundling.
Also worth mentioning
I noticed a library called webpack-multi-configurator that uses the multi-compiler feature, but I don't think it shares common chunks.
You can extract the shared code into another compilation and bundle it with DllBundlesPlugin.
Later, consume this DLL via DllReferencePlugin and add it to your page either manually or via HtmlWebpackPlugin's add-asset-html-webpack-plugin.
Boilerplate can be reduced by using autodll-webpack-plugin; a minimal sketch of the DLL setup follows.
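A sketch using webpack's built-in DllPlugin and DllReferencePlugin (the entry contents and file names are assumptions; DllBundlesPlugin and autodll-webpack-plugin mostly automate these two steps):
// webpack.dll.config.js - build the shared vendor bundle once
const path = require("path");
const webpack = require("webpack");

module.exports = {
  entry: { vendor: ["react", "react-dom"] },
  output: {
    path: path.join(__dirname, "dll"),
    filename: "[name].dll.js",
    library: "[name]_dll"
  },
  plugins: [
    new webpack.DllPlugin({
      name: "[name]_dll",
      path: path.join(__dirname, "dll", "[name]-manifest.json")
    })
  ]
};

// In each app config, reference the prebuilt DLL instead of rebundling react:
// plugins: [
//   new webpack.DllReferencePlugin({
//     context: __dirname,
//     manifest: require("./dll/vendor-manifest.json")
//   })
// ]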
I only just learned about this, and the topic seems quite hard to understand from the webpack docs. I managed to create something that works: it creates two separate page files and extracts the common dependencies into another file.
Here is my webpack config:
var path = require("path");
var webpack = require("webpack");

module.exports = {
  entry: {
    pageA: "./first/first",
    pageB: "./second/second"
  },
  output: {
    path: path.join(__dirname, "js"),
    filename: "[name].js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      names: ["vendor", "common"],
    })
  ]
};
the output of this will be:
./js/
common.js
vendor.js
pageA.js
pageB.js
I created a repo with the example I worked on: https://github.com/tzookb/webpack-common-vendor-chunks
When I open a new HTML file I load these files:
first page:
common.js
vendor.js
pageA.js
second page:
common.js
vendor.js
pageB.js

Simple solution to share modules loaded via NPM across multiple Browserify or Webpack bundles

I'm pulling my hair out here looking for a simple solution to share code, required via NPM, across multiple Browserify or Webpack bundles. Is there such a thing as a file "bridge"?
This isn't about compile time (I'm aware of watchify) but rather the desire to extract all of my vendor-specific libs into vendor.js, so as to keep my app.js file size down and not crash the browser with massive sourcemaps. Plus, I find it way cleaner should the need to view the compiled js arise. And so:
// vendor.js
require('react');
require('lodash');
require('other-npm-module');
require('another-npm-module');
It's very important that the code be loaded from NPM, as opposed to Bower or being saved into some 'vendor' directory to be imported via a relative path and identified via a shim. I'd like every library reference to be pulled via NPM except for my actual application source.
In app.js I keep all of my source code and, via the externals array, exclude the vendor libraries listed above from compilation:
// app.js
var React = require('react');
var _ = require('lodash');
var Component = React.createClass()
// ...
And then in index.html, I require both files
// index.html
<script src='vendor.js'></script>
<script src='app.js'></script>
Using Browserify or Webpack, how can I make it so that app.js can "see" into those modules loaded via npm? I'm aware of creating a bundle with externals and then referencing the direct file (in, say, node_modules) via an alias, but I'm hoping to find a solution that is more automatic and less "Require.js"-like.
Basically, I'm wondering if it is possible to bridge the two so that app.js can look inside vendor.js in order to resolve dependencies. This seems like a simple, straightforward operation but I can't seem to find an answer anywhere on this wide, wide web.
Thanks!
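For clarity, a sketch of the externals approach the question mentions (not an answer; it assumes vendor.js exposes the libraries as globals such as window.React and window._, and the mapping below is illustrative):
// webpack config for app.js only; vendor.js is built and loaded separately
module.exports = {
  entry: "./app.js",
  output: { filename: "app.js" },
  externals: {
    // require('react') in app.js resolves to the global React provided
    // by vendor.js instead of being bundled into app.js again
    react: "React",
    lodash: "_"
  }
};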
Listing all the vendor files/modules and using CommonsChunkPlugin is indeed the recommended way. This gets pretty tedious though, and error-prone.
Consider these NPM modules: fastclick and mprogress. Since they have not adopted the CommonJS module format, you need to give webpack a hand, like this:
require('imports?define=>false!fastclick')(document.body);
require('mprogress/mprogress.min.css');
var Mprogress = require('mprogress/mprogress.min.js');
Now assuming you would want both fastclick and mprogress in your vendor chunk, you would probably try this:
module.exports = {
  entry: {
    app: "./app.js",
    vendor: ["fastclick", "mprogress", ...]
  }
};
Alas, it doesn't work. You need to match the calls to require():
module.exports = {
  entry: {
    app: "./app.js",
    vendor: [
      "imports?define=>false!fastclick",
      "mprogress/mprogress.min.css",
      "mprogress/mprogress.min.js",
      ...
    ]
  }
};
It gets old, even with some resolve.alias trickery. Here is my workaround. CommonsChunkPlugin lets you specify a callback that returns whether or not you want a module to be included in the vendor chunk. If your own source code is in a specific src directory, and the rest is in the node_modules directory, just reject modules based on their path:
var path = require('path'),
    webpack = require('webpack');

var node_modules_dir = path.join(__dirname, 'node_modules'),
    app_dir = path.join(__dirname, 'src');

module.exports = {
  entry: {
    app: "./app.js",
  },
  output: {
    filename: "bundle.js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin(
      /* chunkName= */"vendor",
      /* filename= */"vendor.bundle.js",
      function (module, count) {
        return module.resource && module.resource.indexOf(app_dir) === -1;
      }
    )
  ]
};
Where module.resource is the path to the module being considered. You could also do the opposite, and include only the module if it is inside node_modules_dir, i.e.:
return module.resource && module.resource.indexOf(node_modules_dir) === 0;
but in my situation, I'd rather say: "put everything that is not in my source tree in a vendor chunk".
Hope that helps.
With webpack you'd use multiple entry points and the CommonsChunkPlugin.
Taken from the webpack docs:
To split your app into 2 files, say app.js and vendor.js, you can require the vendor files in vendor.js. Then pass this name to the CommonsChunkPlugin as shown below.
var webpack = require("webpack");

module.exports = {
  entry: {
    app: "./app.js",
    vendor: ["jquery", "underscore", ...],
  },
  output: {
    filename: "bundle.js"
  },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin(
      /* chunkName= */"vendor",
      /* filename= */"vendor.bundle.js"
    )
  ]
};
This will remove all modules in the vendor chunk from the app chunk. bundle.js will now contain just your app code, without any of its dependencies. These are in vendor.bundle.js.
In your HTML page load vendor.bundle.js before bundle.js.
<script src="vendor.bundle.js"></script>
<script src="bundle.js"></script>
// vendor anything coming from node_modules
minChunks: module => /node_modules/.test(module.resource)
Source: https://github.com/webpack/webpack/issues/2372#issuecomment-213149173
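Placed in context, that one-liner would look something like this (a sketch for the pre-webpack-4 CommonsChunkPlugin options form; the entry and output names are placeholders):
const webpack = require("webpack");

module.exports = {
  entry: { app: "./app.js" },
  output: { filename: "[name].bundle.js" },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: "vendor",
      // vendor anything coming from node_modules
      minChunks: module => /node_modules/.test(module.resource)
    })
  ]
};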
