In my project, I need to do cache busting, since after a new deploy, the browser often only reloads the HTML but not the JS & CSS files.
Currently, I am not building the HTML in any way, it just already sits in the public directory.
The simplest approach seems to be to add a timestamp to the JS reference:
<script type="module" src="bundle/index.js?ts=20201026-102300"></script>
Now, what is the best way to achieve this in a project that already uses rollup.js?
I have seen @rollup/plugin-html, yet I'm puzzled by the example in its documentation, as it takes a JS file as input:
input: 'src/index.js',
What JS file should that be?
Instead, I expect that I need to define:
an input HTML file
some space for code to set the timestamp variable
an output HTML file
So what's the best way to do this, be it with @rollup/plugin-html or with another approach?
I came here looking for an answer to this question myself; a few moments later, after a bit of regex fiddling, I got it to work.
Note: this solution edits your HTML file each time you build it. There is no input (template) HTML and output HTML.
Install rollup-plugin-replace-html-vars
npm install rollup-plugin-replace-html-vars --save-dev
Add this piece of config to your rollup.config.js file
// rollup.config.js
// ...
plugins: [
  replaceHtmlVars({
    files: '**/index.html',
    from: /\.\/\w+\/\w+\.\w+\.\w+\?v=\d+/g,
    to: './dist/app.min.js?v=' + Date.now(),
  }),
]
In your index.html, add this reference to the app.js:
<script type="module" src="./dist/app.min.js?v=1630086943272"></script>
Run rollup, and the app.js reference in your index.html will carry the build-time timestamp each time you run it.
Bonus:
If you don't have a .min in your filename, use this regex instead:
/\.\/\w+\/\w+\.\w+\?v=\d+/g
Full disclosure: I'm no regex wizard, I just managed to hack this one together. I bet someone here will have a better way of capturing ./dist/app.min.js?v=1630086943272 with a regex, but this works for my solution.
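If you'd rather not rely on the generic \w+ groups, a pattern anchored on the literal path (an untested sketch, assuming the bundle always lives at ./dist/app.min.js) also matches the reference before any ?v= suffix has been added:
// rollup.config.js -- anchored on the exact file reference (assumed path)
replaceHtmlVars({
  files: '**/index.html',
  // match the literal path, with or without an existing ?v= suffix
  from: /\.\/dist\/app\.min\.js(\?v=\d+)?/g,
  to: './dist/app.min.js?v=' + Date.now(),
}),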
I went with file hashes instead, which means a file is only re-fetched when there is actually a new version of it.
For that, I wrote my own utility:
const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

// Escapes a string so it can be embedded literally in a RegExp
// (same logic as the escape-string-regexp package).
function escapeStringRegexp(string) {
  if (typeof string !== 'string') {
    throw new TypeError('Expected a string');
  }
  return string
    .replace(/[|\\{}()[\]^$+*?.]/g, '\\$&')
    .replace(/-/g, '\\x2d');
}

function insertHashToFile(options) {
  return {
    writeBundle(outputOptions) {
      const outputDir = outputOptions.dir ? outputOptions.dir : path.dirname(outputOptions.file);
      let indexHtml = fs.readFileSync(options.htmlFile, 'utf8');
      for (const sourceFile of options.sourceFiles) {
        // Hash the emitted file and use the first 8 hex chars as the version tag.
        const fb = fs.readFileSync(path.join(outputDir, sourceFile));
        const hash = crypto.createHash('sha1');
        hash.update(fb);
        const hexHash = hash.digest('hex');
        // Match the file name with or without an existing ?h=... suffix.
        const replacePattern = new RegExp(escapeStringRegexp(sourceFile) + '(?:\\?h=[^"]+)?', 'g');
        indexHtml = indexHtml.replaceAll(replacePattern, `${sourceFile}?h=${hexHash.substring(0, 8)}`);
      }
      fs.writeFileSync(options.htmlFile, indexHtml);
    },
  };
}
and then
plugins: [
  production && insertHashToFile({
    sourceFiles: [
      "bundle.js",
      "bundle.css",
    ],
    htmlFile: "public/index.html",
  }),
]
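After a production build, the references in public/index.html then look something like this (the hashes are illustrative):
<script type="module" src="bundle.js?h=3f2a9c1b"></script>
<link rel="stylesheet" href="bundle.css?h=a41d0e77">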
I'm working on a React app being served by Flask.
I need the app to present some things that are filled in by the Flask app using a Jinja template. The simplest way I could find is to use an external JS file that I run through a render_template command and have the rest of the code reference.
I used the WebpackCopyPlugin to make sure that file is available to import, and I explicitly exclude it from babel-loader to make sure it doesn't get compiled.
However, when it is copied to the output dir by npm run build, it changes its contents!
This is the original file:
var is_admin = "{{is_admin}}" == "True";
var is_user = "{{is_user}}" == "True";
var is_debug = "{{is_debug}}" == "True";
var username = "{{request.remote_user}}";
(Yes, I know I shouldn't be keeping stuff like that in a JavaScript file; it doesn't actually grant permission to do anything, it's just for display purposes. The actual permission checking and access granting is all done in the backend.)
But WebpackCopyPlugin copies it to look like this:
var is_admin=!1,is_user=!1,is_debug=!1,username="{{request.remote_user}}";
Why is it doing that?
Can I tell it to just copy the file as is without modifying it?
Thanks!
Modify the default minimizer options so the copied file is excluded from minification.
webpack.config.js
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        exclude: /static/,
      }),
    ],
  },
};
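Some context: optimization.minimize runs the minimizer over every emitted .js asset, including ones emitted by the copy plugin, which is why "{{is_admin}}" == "True" gets constant-folded to !1. As an alternative to the exclude above (a sketch based on copy-webpack-plugin's pattern options; the path is hypothetical), you can mark the copied file as already minimized so the minimizer skips it:
// webpack.config.js (sketch; assumes copy-webpack-plugin v6+)
const CopyPlugin = require('copy-webpack-plugin');

module.exports = {
  plugins: [
    new CopyPlugin({
      patterns: [
        {
          from: 'src/static/template_vars.js', // hypothetical path to the Jinja-templated file
          info: { minimized: true }, // tell webpack the asset is already minimized
        },
      ],
    }),
  ],
};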
I have the following gulp task, which I want to use to change the filenames and contents of files, replacing any matching strings with the replacement.
The matching strings in the file contents get changed, but the files' names do not. I thought they would, as my code appears to match the examples on https://www.npmjs.com/package/gulp-replace
What am I doing wrong?
function renameFileContents() {
  return gulp.src([
    '**/*',
    '!.github/**',
    '!languages/**',
    '!node_modules/**',
    '!.babelrc',
    '!.editconfig',
    '!.gitignore',
    '!.travis.yml',
    '!CHANGELOG.md',
    '!codesniffer.ruleset.xml',
    '!composer.json',
    '!composer.lock',
    '!config.yml',
    '!config-default.yml',
    '!gulpfile.babel.js',
    '!MIT-LICENSE.txt',
    '!package-lock.json',
    '!package.json',
    '!phpunit.xml.dist',
    '!README.md',
    '!webpack.config.js'
  ])
    .pipe($.replace('BigTest', 'Tester'))
    .pipe($.replace('Bigtest', 'Tester'))
    .pipe($.replace('bigtest', 'tester'))
    .pipe(gulp.dest('./'));
}
Use gulp-rename to alter filenames. Add:
const rename = require('gulp-rename');
and before .pipe(gulp.dest('./'));:
.pipe(
  rename(function(path) {
    path.basename = path.basename.replace(/BigTest|Bigtest|bigtest/, function(matched) {
      return { BigTest: 'Tester', Bigtest: 'Tester', bigtest: 'tester' }[matched];
    });
  })
)
You asked in a comment why new files are created (with the new names) but the original files still remain. Why does gulp-rename not actually rename the original files as you might expect?
Gulp-rename is not working with the original files. This can be a little confusing.
It's called gulp-rename because it renames an in-memory gulp file
object. gulp is like functional programming, each plugin takes in
input and produces output in-memory without causing side effects. [emphasis added]
gulp works like this:
read file (gulp.src)
do some stuff, modify the file in-memory (plugins)
commit file changes back to fs (gulp.dest/or others)
From gulp-rename issues: not renaming the original files.
The suggested fix (from the gulp recipe on deleting files from a pipeline), which I tested, is:
const del = require('del');
const vinylPaths = require('vinyl-paths');
and add this pipe before the replace pipe:
.pipe(vinylPaths(del))
.pipe(
rename(function(path) { ......
and your original files will be deleted, leaving only the newly named files. Obviously, make sure you test this on good test cases before deleting any of your files!
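Putting it all together, the complete task would look something like this (a sketch assembled from the snippets above; it deletes the original files, so try it on a scratch copy first):
const del = require('del');
const rename = require('gulp-rename');
const vinylPaths = require('vinyl-paths');

function renameFileContents() {
  return gulp.src([
    '**/*',
    '!node_modules/**' // ...plus the other excludes from the question
  ])
    .pipe($.replace('BigTest', 'Tester'))
    .pipe($.replace('Bigtest', 'Tester'))
    .pipe($.replace('bigtest', 'tester'))
    .pipe(vinylPaths(del)) // remove the originals from disk
    .pipe(rename(function(path) { // rename the in-memory files
      path.basename = path.basename.replace(/BigTest|Bigtest|bigtest/, function(matched) {
        return { BigTest: 'Tester', Bigtest: 'Tester', bigtest: 'tester' }[matched];
      });
    }))
    .pipe(gulp.dest('./'));
}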
Webpack has been very useful to us in writing isomorphic Javascript, and swapping out npm packages for browser globals when bundling.
So, if I want to use the node-fetch npm package on Node.js but exclude it when bundling and just use the native browser fetch global, I can just mention it in my webpack.config.js:
{
  externals: {
    'node-fetch': 'fetch',
    'urlutils': 'URL',
    'webcrypto': 'crypto', // etc
  }
}
And then my CommonJS require const fetch = require('node-fetch') will be compiled to use the browser global fetch instead of the bundled package.
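Under the hood, an external simply replaces the module's body with a reference to the configured global, roughly like this (illustrative; the exact wrapper differs between webpack versions):
// what the bundled 'node-fetch' module effectively becomes
module.exports = fetch; // i.e. the browser global window.fetch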
So far so good. Here's my question: This is easy enough when requiring entire modules, but what about when I need to require a submodule / individual property of an exported module?
For example, say I want to use the WhatWG URL standard, isomorphically. I could use the urlutils npm module, which module.exports the whole URL class, so my requires look like:
const URL = require('urlutils')
And then I can list urlutils in my externals section, no prob. But the moment I want to use a more recent (and more supported) npm package, say, whatwg-url, I don't know how to Webpack it, since my requires look like:
const { URL } = require('whatwg-url')
// or, if you don't like destructuring assignment
const URL = require('whatwg-url').URL
How do I tell Webpack to replace occurrences of require('whatwg-url').URL with the browser global URL?
First, I would like to highlight that I am not a webpack expert. I think there is a better way of bundling at build time. Anyway, here is my idea:
webpack.config.js
module.exports = {
  target: "web",
  entry: "./entry.js",
  output: {
    path: __dirname,
    filename: "bundle.js"
  }
};
entry.js
var URL = require("./content.js");
document.write('Check console');
console.log('URL function from content.js', URL);
content.js
let config = require('./webpack.config.js');
let urlutils = require('urlutils');
let whatwgUrl = require('whatwg-url');

console.log('urlutils:', urlutils);
console.log('whatwgUrl', whatwgUrl);

module.exports = {
  URL: undefined
};

if (config.target === 'web') {
  module.exports.URL = urlutils;
} else {
  module.exports.URL = whatwgUrl.URL;
}
index.html
<html>
  <head>
    <meta charset="utf-8">
  </head>
  <body>
    <script type="text/javascript" src="bundle.js" charset="utf-8"></script>
  </body>
</html>
As I said in the comment, it's going to bundle both libraries into the web bundle, which is a waste of space.
Now, for NodeJS, you change the target from web to node and it should take the other library. https://webpack.github.io/docs/configuration.html#target
I've found a module for 'isomorphic' apps: https://github.com/halt-hammerzeit/universal-webpack
I think you could try to use two separate intermediate content.js files as parameters for the module: one containing urlutils and the second whatwg-url. Then it would dynamically recognize what it is compiling your files for and use the proper module.
Hope it helps.
For example, I use AMD definitions in my project, and webpack for the project build. Is it possible to create a loader which will take dependencies in array format?
define(
  [
    'mySuperLoader![./path/dependency-1, ./path/dependency-2, ...]'
  ],
  function() {
    // ... some logic here
  }
)
Project example: GitHub
If you want to port the load-plugin's behavior to webpack, you need to do this:
1. Create a custom resolver
This is because mySuperLoader![./path/dependency-1, ./path/dependency-2, ...] does not point to a single file. When webpack tries to load a file, it first:
resolves the file path
loads the file content
matches and resolves all loaders
passes the file content to the loader chain
Since [./path/dependency-1, ./path/dependency-2, ...] is not a proper file path, there is some work to do. It is not even proper JSON.
So, our first goal is to turn this into mySuperLoader!some/random/file?["./path/dependency-1", "./path/dependency-2", ...]. This is usually done by creating a custom resolver:
// webpack.config.js
var customResolverPlugin = {
  apply: function (resolver) {
    resolver.plugin("resolve", function (context, request) {
      const matchLoadRequest = /^\[(.+)]$/.exec(request.path);
      if (matchLoadRequest) {
        request.query = '?' + JSON.stringify(
          matchLoadRequest[1].split(", ")
        );
        request.path = __filename;
      }
    });
  }
};

module.exports = {
  ...
  plugins: [
    {
      apply: function (compiler) {
        compiler.resolvers.normal.apply(customResolverPlugin);
      }
    }
  ]
};
Notice request.path = __filename;? We just need to give webpack an existing file so that it does not throw an error. We will generate all the content anyway. Probably not the most elegant solution, but it works.
2. Create our own load-loader (yeah!)
// loadLoader.js
const path = require("path");

function loadLoader() {
  return JSON.parse(this.request.match(/\?(.+?)$/)[1])
    .map(module =>
      `exports['${path.basename(module, '.js')}'] = require('${module}');`
    )
    .join('\n');
}

module.exports = loadLoader;
This loader parses the request's query that we rewrote with our custom resolver and creates a CommonJS module that looks like this:
exports['dependency-1'] = require('path/to/dependency-1');
exports['dependency-2'] = require('path/to/dependency-2');
3. Alias our own load-loader
// webpack.config.js
...
resolveLoader: {
  alias: {
    load: require.resolve('./loadLoader.js')
  }
},
4. Configure root
Since /path/to/dependency-1 is root-relative, we need to add the root to the webpack config
// webpack.config.js
resolve: {
  root: '/absolute/path/to/root' // usually just __dirname
},
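With all four pieces in place, a request like the one from the question should work roughly like this (an illustrative sketch using the load alias defined above):
// application code (AMD)
define(['load![./path/dependency-1, ./path/dependency-2]'], function (deps) {
  // deps['dependency-1'] and deps['dependency-2'] hold the two required modules
});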
This is neither a beautiful nor an ideal solution, but should work as a makeshift until you've ported your modules.
I don't think that you should use a loader for that. Why don't you just write:
require("./path/dependency-1");
require("./path/dependency-2");
require("./path/dependency-3");
It accomplishes the same thing, is much more expressive and requires no extra code/loader/hack/configuration.
If you're still not satisfied, you might be interested in webpack contexts, which allow you to require a batch of files that match a given filter. So, if you write:
require("./template/" + name + ".jade");
webpack includes all modules that could be accessed by this expression without accessing parent directories. It's basically the same as writing:
require("./table.jade");
require("./table-row.jade");
require("./directory/folder.jade")
You can also create contexts manually, like this:
var myRequire = require.context(
  "./template", // search inside this directory
  false,        // false excludes sub-directories
  /\.jade$/     // use this regex to filter files
);
var table = myRequire("./table.jade");
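The function returned by require.context also exposes a keys() method, so you can enumerate and load everything the context matched:
// require every template that matched the filter
myRequire.keys().forEach(function (key) {
  var template = myRequire(key); // e.g. key === "./table.jade"
  console.log('loaded', key, template);
});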
I'm trying to use webpack to include showdown. The problem is that showdown calls require("fs") and checks the return value. This makes webpack throw an error.
It seems like it should be possible to configure webpack to generate a shim so that the call to require("fs") returns false.
Maybe one of these techniques might work: http://webpack.github.io/docs/shimming-modules.html
Here's the Showdown.js code. If I comment out this code inside the node_modules directory, the problem is solved. However, there should be a better way.
//
// Automatic Extension Loading (node only):
//
if (typeof module !== 'undefind' && typeof exports !== 'undefined' && typeof require !== 'undefind') {
  var fs = require('fs');
  if (fs) {
    // Search extensions folder
    var extensions = fs.readdirSync((__dirname || '.')+'/extensions').filter(function(file){
      return ~file.indexOf('.js');
    }).map(function(file){
      return file.replace(/\.js$/, '');
    });
    // Load extensions into Showdown namespace
    Showdown.forEach(extensions, function(ext){
      var name = stdExtName(ext);
      Showdown.extensions[name] = require('./extensions/' + ext);
    });
  }
}
The solution was to switch to marked: https://www.npmjs.org/package/marked. The showdown library is problematic as far as modules go.
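For completeness, the switch itself is small; with marked's classic API (a sketch, assuming the pre-1.0 marked that was current at the time), rendering looks like:
// render a markdown string to HTML with marked
var marked = require('marked');
var html = marked('# Title\n\nSome **markdown** text.');
console.log(html); // roughly: <h1>Title</h1><p>Some <strong>markdown</strong> text.</p>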
Add it to noParse, e.g.:
var config = {
  output: {
    path: buildPath,
    filename: 'bundle.js'
  },
  module: {
    noParse: [
      /showdown/,
    ],
  },
};
And webpack will assume it does not contain any useful calls to require 🌹
This issue should be fixed in showdown v >=1.0.0
This seems to be a Showdown problem rather than a webpack one. The code that requires fs is intended to run only in a Node environment. Unfortunately, there are some typos in the code that checks whether it's running in Node (the first if statement in the code you posted: undefind instead of undefined). Because typeof never returns the string 'undefind', those comparisons are always true, so the Node-only block runs in the browser as well.
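For reference, the guard was presumably meant to read:
if (typeof module !== 'undefined' && typeof exports !== 'undefined' && typeof require !== 'undefined') {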
There is a pull request that fixes this, but it hasn't been merged yet.
To be honest, it looks like the Showdown library is no longer maintained (last commit November 2012), so you may be better off looking for an alternative if possible.