I'm using Webpack's [hash] for cache-busting locale files. But I also need to hard-code the locale file path to load it from the browser. Since the file path is altered with [hash], I need to inject this value to get the right path.
I don't know how to get the Webpack [hash] value programmatically in the config so I can inject it using webpack.DefinePlugin.
module.exports = (env) => {
  return {
    entry: 'app/main.js',
    output: {
      filename: '[name].[hash].js'
    },
    // ...
    plugins: [
      new webpack.DefinePlugin({
        HASH: ***???***
      })
    ]
  };
};
In case you want to dump the hash to a file and load it in your server's code, you can define the following plugin in your webpack.config.js:
const fs = require('fs');

class MetaInfoPlugin {
  constructor(options) {
    this.options = { filename: 'meta.json', ...options };
  }

  apply(compiler) {
    // tapPromise makes webpack wait for the async file write to finish
    compiler.hooks.done.tapPromise(this.constructor.name, stats => {
      const metaInfo = {
        // add any other information if necessary
        hash: stats.hash
      };
      const json = JSON.stringify(metaInfo);
      return new Promise((resolve, reject) => {
        fs.writeFile(this.options.filename, json, 'utf8', error => {
          if (error) {
            reject(error);
            return;
          }
          resolve();
        });
      });
    });
  }
}
module.exports = {
  // ... your webpack config ...
  plugins: [
    // ... other plugins ...
    new MetaInfoPlugin({ filename: 'dist/meta.json' }),
  ]
};
Example content of the output meta.json file:
{"hash":"64347f3b32969e10d80c"}
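Then, in your server's code, you can read the hash back and build the fingerprinted path from it. A minimal sketch; the file path and locale URL scheme here are illustrative assumptions:
const fs = require('fs');

// read the hash that the plugin dumped at build time
const { hash } = JSON.parse(fs.readFileSync('dist/meta.json', 'utf8'));
// hypothetical locale URL scheme using the hash
const localeUrl = `/locales/en.${hash}.json`;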
I've just created a dumpmeta-webpack-plugin package for this plugin. So you might use it instead:
const { DumpMetaPlugin } = require('dumpmeta-webpack-plugin');

module.exports = {
  ...
  plugins: [
    ...
    new DumpMetaPlugin({
      filename: 'dist/meta.json',
      prepare: stats => ({
        // add any other information you need to dump
        hash: stats.hash,
      })
    }),
  ]
}
Please refer to the Webpack documentation for all available properties of the Stats object.
Seems like it should be a basic feature, but apparently it's not that simple to do.
You can accomplish what you want by using wrapper-webpack-plugin.
plugins: [
  new WrapperPlugin({
    header: '(function (BUILD_HASH) {',
    footer: function (fileName) {
      const rx = /^.+?\.([a-z0-9]+)\.js$/;
      const hash = fileName.match(rx)[1];
      return `})('${hash}');`;
    },
  })
]
A bit hacky, but it works, if you don't mind the entire chunk being wrapped in an anonymous function.
Alternatively you can just add var BUILD_HASH = ... in the header option, though that could cause problems if it becomes a global.
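A hedged sketch of that variant; the header option also accepts a function of the output filename (the plugin calls it the same way as the footer above), so the hash can be extracted there:
new WrapperPlugin({
  header: function (fileName) {
    // illustrative: pull the hash out of "name.<hash>.js"
    const match = fileName.match(/^.+?\.([a-z0-9]+)\.js$/);
    return `var BUILD_HASH = '${match ? match[1] : ''}';\n`;
  }
})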
I created this plugin a while back, I'll try to update it so it provides the chunk hash naturally.
On server, you can get the hash by reading the filenames (example: web.bundle.f4771c44ee57573fabde.js) from your bundle folder.
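For example, a small Node helper that scans the bundle folder; this is a sketch only, and the directory and filename pattern are assumptions:
const fs = require('fs');

// assumes bundles are named like "web.bundle.<hash>.js" inside ./dist
function getBundleHash(dir = 'dist') {
  const file = fs.readdirSync(dir).find(f => /^web\.bundle\.[a-f0-9]+\.js$/.test(f));
  return file ? file.split('.')[2] : null;
}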
You can pass the version to your build using webpack.DefinePlugin
If you have a package.json with a version, you can extract it like this:
const version = require("./package.json").version;
For example (we stringified the version):
new webpack.DefinePlugin({
  'process.env.VERSION': JSON.stringify(version)
}),
then in your JavaScript, the version will be available as:
process.env.VERSION
The WebpackManifestPlugin is officially recommended in the output management guide. It writes a JSON to the output directory mapping the input filenames to the output filenames. Then you can inject those mapped values into your server template.
It's similar to Dmitry's answer, except Dmitry's doesn't appear to support multiple chunks.
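A minimal sketch, assuming webpack-manifest-plugin v3+ (older versions expose a default export instead of the named one):
const { WebpackManifestPlugin } = require('webpack-manifest-plugin');

module.exports = {
  // ...
  plugins: [
    // writes manifest.json to the output dir, e.g. {"main.js": "main.64347f3b.js"}
    new WebpackManifestPlugin()
  ]
};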
That can be done with the Webpack Stats Plugin. It gives you a nice and neat output file with all the data you want, and it's easy to incorporate into the webpack config files where needed.
For example, to get the hash generated by Webpack and use it elsewhere:
# installation
npm install --save-dev webpack-stats-plugin
yarn add --dev webpack-stats-plugin
# generating stats file
const { StatsWriterPlugin } = require("webpack-stats-plugin")

module.exports = {
  plugins: [
    // Everything else **first**.
    // Write out stats file to build directory.
    new StatsWriterPlugin({
      stats: {
        all: false,
        hash: true,
      },
      filename: "stats.json" // Default and goes straight to your output folder
    })
  ]
}
# usage
const stats = require("YOUR_PATH_TO/stats.json");
console.log("Webpack's hash is - ", stats.hash);
More usage examples can be found in their repo.
Hope that helps!
As per the documentation, I am not using worker-loader; I am trying to use the native way suggested by the webpack 5 documentation.
Below is the usage of the worker script in the main thread.
const worker = new window.Worker(
  new URL("../workers/listOperation.worker.js", import.meta.url),
  {
    type: "module",
  },
);
worker.postMessage({ list: hugeList, params: reqData });
worker.onerror = err => console.error(err);
worker.onmessage = e => {
  const { list } = e.data;
  // Usage of `list` from the response
  worker.terminate();
};
return worker;
It works fine if there are no imports used in the script. But when I import any node modules (e.g. lodash/get) or any other function/constants from local files, it does not work: the output web worker bundle doesn't transpile and bundle the imported code. It keeps the import statement as it is.
Below is the worker script (listOperation.worker.js)
import get from "lodash/get";
import { ANY_CONSTANT } from "../constants"; // This is some local constant

addEventListener("message", e => {
  const { list, params } = e.data;
  // Here I have some usage of `get` method from `lodash/get` and ANY_CONSTANT
  self.postMessage({
    list: list,
  });
});
Webpack outputs the bundle file like the one below, which won't be usable by the browser, if I put the /\.worker.js$/ pattern in the exclude of the babel-loader rule.
import get from "lodash/get";import{ANY_CONSTANT}from"../constants";addEventListener("message",(e=>{const{list:s,params:t,.......
And even if I don't put the /\.worker.js$/ pattern in the exclude of the babel-loader rule, the output bundle still doesn't include the implementation of get from lodash/get or the value of the constant. It just outputs it in CJS using require.
Also, I made use of asset modules so that I can put the output file inside a directory, not directly in the root of the dist folder. The configuration changes in my webpack.config look like this.
module.exports = {
  entry: {...},
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: [/(node_modules)/, /\.worker.js$/],
        use: {
          loader: "babel-loader", // This uses the config defined in babel.config.js
        },
      },
      {
        test: /\.worker.js$/,
        exclude: /(node_modules)/,
        type: "asset/resource",
        generator: {
          filename: "js/workers/[hash][ext][query]",
        },
      },
    ],
  },
}
Dependencies: "@vue/cli-plugin-unit-jest": "^4.5.13", "@vue/test-utils": "^1.2.1", "vue-jest": "^3.0.7"
I have an app which uses an alias (say "foo") being set in vue.config.js:
module.exports = {
  chainWebpack: (config) => {
    // Add project name as alias
    config.resolve.alias.set('foo', __dirname);
  },
};
I need this to work for both import statements and HTML tag src attributes.
In main.js:
...
import App from 'foo/src/components/core/App';
...
In src/components/core/App/index.vue:
<script src="foo/src/components/core/App/script.js" />
<style module src="foo/src/components/core/App/style.css" />
<template src="foo/src/components/core/App/template.html" />
I know I can use a moduleNameMapper in jest.config.js, something like:
'^foo(.*)$': '<rootDir>$1',
However, this doesn't map aliases that appear in the src attribute of my HTML tags. Is there any way to have vue-jest interpret these attribute paths via a config setting or some other means?
Any recommendations will be greatly appreciated.
URL parsing in SFCs
vue-jest doesn't resolve src URLs for the top-level block tags in SFCs, so you'll have to use un-aliased relative paths in src/components/core/App/index.vue:
<script src="./script.js" />
<style module src="./style.css" />
<template src="./template.html" />
URL parsing in <template> contents
vue-jest uses @vue/component-compiler-utils to compile the template, but URL parsing requires the transformAssetUrls option. vue-jest 3.x does not support passing options to @vue/component-compiler-utils, but that now works in 4.0.0-rc.1 via a templateCompiler.transformAssetUrls config.
Even with this URL parsing enabled, Vue CLI configures jest to return an empty string for require-ed media, including images. If your tests need to work with the normally resolved URLs in production, you'll need a Jest transform that mimics url-loader. Vue CLI configures the loader to return the resolved filename if greater than 4KB; or the base64 data URL otherwise.
To enable the URL parsing:
Update to vue-jest 4:
npm i -D vue-jest@4
Create the following file for the custom my-jest-url-loader, which we'll use later below:
// <rootDir>/tests/my-jest-url-loader.js
const urlLoader = require('url-loader')

module.exports = {
  process(src, filename) {
    const urlLoaderOptions = {
      esModule: false,
      limit: 4096,
      fallback: {
        loader: 'file-loader',
        options: {
          esModule: false,
          emitFile: false,
          name: filename,
        },
      },
    }
    const results = urlLoader.call({
      query: urlLoaderOptions,
      resourcePath: filename,
    }, src)
    // strip leading Webpack prefix from file path if it exists
    return results.replace(/^module.exports = __webpack_public_path__ \+ /, 'module.exports = ')
  }
}
To avoid accidentally overwriting Vue CLI's default Jest presets, use a merge utility (e.g., lodash.merge) to insert a custom config in jest.config.js.
Add a vue-jest config in a Jest global, setting templateCompiler.transformAssetUrls.
Modify the merged preset's transform property to use our my-jest-url-loader transform for images. This requires removing other image transforms from the default Jest preset to avoid conflicts.
// jest.config.js
const vueJestPreset = require('@vue/cli-plugin-unit-jest/presets/default/jest-preset')
const merge = require('lodash.merge')

const newJestPreset = merge(vueJestPreset, {
  globals: {
    'vue-jest': {
      templateCompiler: {
        transformAssetUrls: {
          video: ['src', 'poster'],
          source: 'src',
          img: 'src',
          image: ['xlink:href', 'href'],
          use: ['xlink:href', 'href']
        }
      }
    }
  },
  moduleNameMapper: {
    '^foo/(.*)$': '<rootDir>/$1',
  },
})
function useUrlLoaderForImages(preset) {
  const imageTypes = ['jpg', 'jpeg', 'png', 'svg', 'gif', 'webp']
  const imageTypesRegex = new RegExp(`(${imageTypes.join('|')})\\|?`, 'ig')
  // remove the image types from the transforms
  Object.entries(preset.transform).filter(([key]) => {
    const regex = new RegExp(key)
    return imageTypes.some(ext => regex.test(`filename.${ext}`))
  }).forEach(([key, value]) => {
    delete preset.transform[key]
    const newKey = key.replace(imageTypesRegex, '')
    preset.transform[newKey] = value
  })
  preset.transform = {
    ...preset.transform,
    [`.+\\.(${imageTypes.join('|')})$`]: '<rootDir>/tests/my-jest-url-loader',
  }
}

useUrlLoaderForImages(newJestPreset)

module.exports = newJestPreset
GitHub demo
I'm trying to concatenate two js files after the webpack build process. The goal is to provide a single js file with the ES6 modules and the legacy code in it.
I already tried plugins like webpack-concat-files-plugin, without success. That makes sense to me, because the output files are not there yet when the plugin gets executed.
Another thought would be a small script executed in the afterCompile hook, which handles the concatenation of the two files. Is this the common way to do something like this or are there other ways to achieve the goal?
Thanks for your help.
Basic example:
module.exports = {
  entry: {
    app: 'app.js',
    legacy: [
      'legacy-1.js',
      'legacy-2.js',
      // ...
    ]
  },
  output: {
    filename: path.join('dist/js', '[name].js'),
  }
}
Solved this as suggested:
FileMergeWebpackPlugin.js
const fs = require('fs');

class FileMergeWebpackPlugin {
  constructor({ files, destination, removeSourceFiles }) {
    this.files = files;
    this.destination = destination;
    this.removeSourceFiles = removeSourceFiles;
  }

  apply(compiler) {
    compiler.hooks.afterEmit.tap('FileMergeWebpackPlugin', () => {
      const fileBuffers = this.files
        .filter(file => fs.existsSync(file))
        .map(file => fs.readFileSync(file));
      // join the source buffers into a single output file
      fs.writeFileSync(this.destination, Buffer.concat(fileBuffers));
      if (this.removeSourceFiles) {
        this.files.forEach(file => fs.unlinkSync(file));
      }
    });
  }
}

module.exports = FileMergeWebpackPlugin;
webpack.config.js
const FileMergeWebpackPlugin = require('./FileMergeWebpackPlugin');

module.exports = {
  entry: {
    app: 'app.js',
    legacy: [
      'legacy-1.js',
      'legacy-2.js',
    ]
  },
  output: {
    filename: path.join('dist/js', '[name].js'),
  },
  plugins: [
    new FileMergeWebpackPlugin({
      destination: 'dist/js/bundle.js',
      removeSourceFiles: true,
      files: [
        'dist/js/app.js',
        'dist/js/legacy.js',
      ]
    })
  ]
}
Eventually I will release this as an npm package; I will update the post when I do so.
Yes.
You can create your own plugin that is activated in the emit hook; the parameter for this hook is the compilation. You can get the chunks you want to concatenate from this object and create a new chunk with the concatenated value (or just append one to the other).
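A minimal sketch of that approach, using webpack 4's assets API; the plugin name and the hard-coded asset names are illustrative:
class ConcatAssetsPlugin {
  constructor({ assets, output }) {
    this.assets = assets; // e.g. ['app.js', 'legacy.js']
    this.output = output; // e.g. 'bundle.js'
  }

  apply(compiler) {
    compiler.hooks.emit.tap('ConcatAssetsPlugin', compilation => {
      // read the sources of the chunks to merge and join them
      const source = this.assets
        .map(name => compilation.assets[name].source())
        .join('\n');
      // register the concatenated result as a new asset
      compilation.assets[this.output] = {
        source: () => source,
        size: () => source.length
      };
    });
  }
}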
I have been trying to get this to work, but maybe I'm missing something. I am using ng-constant and setting up different environment endpoints as mentioned in the ng-constants issue.
However, I am using gulp, and the configuration looks like this:
gulp.task('environmentsapi', function () {
  return ngConstant({
    stream: true,
    development: {
      constants: {
        "ENV": {"api": "http://1.1.1.1:8082/"}
      }
    },
    production: {
      constants: {
        "ENV": {"api": "https://productionapplink/"}
      }
    }
  })
  // Writes config.js to dist/ folder
  .pipe(gulp.dest('dist/scripts/config'));
});
I can't figure out how to call the different endpoints in the different gulp tasks, like ngconstant:development in the linked example. How can I run this within the environmentsapi task, since that task is shared by all environment builds? Please let me know how to do this.
gulp.task('build', function () {
  runSequence('clean', ['sass', 'scripts', 'bower_components', 'environmentsapi'], 'wiredep'); // How can I run ngconstant:development here?
});
Simply create new tasks that set flags!
Here I'm using the development flag that defaults to true.
var development = true;

gulp.task('prod', function () {
  development = false;
});

gulp.task('environmentsapi', function () {
  const apiEndpoint = development ? 'http://1.1.1.1:8082/' : 'https://productionapplink/';
  return ngConstant({
    stream: true,
    constants: {
      'ENV': {api: apiEndpoint}
    }
  });
});
Now, using gulp build will build your application with the ENV.api set to 'http://1.1.1.1:8082/', your development endpoint.
And calling gulp prod build will make your output use an ENV.api set to 'https://productionapplink/'.
As discussed in the comments section, the solution above works well when you only have two environments, but it quickly gets out of hand as the number of environments grows.
In that case, I suggest using a different approach, the Pirate way, using yargs.
Here would be your new gulpfile.js:
const argv = require('yargs').argv;

const endpoints = {
  'dev': 'http://1.1.1.1:8082/',
  'prod-org': 'https://productionapplink.org/',
  'prod-com': 'https://productionapplink.com/',
  'prod-gov': 'https://productionapplink.gov/'
};

gulp.task('environmentsapi', function () {
  const apiEndpoint = typeof argv.env === 'undefined' ? endpoints.dev : endpoints[argv.env];
  return ngConstant({
    stream: true,
    constants: {
      ENV: { api: apiEndpoint }
    }
  }).pipe(gulp.dest('dist/scripts/config'));
});
Use it as follows:
gulp build uses the default api URL: 'http://1.1.1.1:8082/'
gulp build --env=prod-org uses 'https://productionapplink.org/'
gulp build --env=prod-com uses 'https://productionapplink.com/'
I hope this could work for you this time!
I'm trying to use webpack to compile an in-memory string of valid JavaScript code. I'm using memory-fs as outlined here: https://webpack.github.io/docs/node.js-api.html#compile-to-memory.
So I'm taking a string containing raw JavaScript, writing that to memory-fs, and then webpack resolves to that entry point. But the compilation fails on the first require statement, presumably because it's not able to look in the real fs for node_modules.
Any ideas on how can I accomplish this?
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';

function* compile(code) {
  const fs = new MemoryFS();
  fs.writeFileSync('/file.js', code);
  const compiler = webpack({
    entry: { file: '/file.js' },
    output: {
      path: '/build',
      filename: '[name].js'
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ],
    }
  });
  compiler.run = thenify(compiler.run);
  compiler.inputFileSystem = fs;
  compiler.resolvers.normal.fileSystem = fs; //this is needed for memfs
  compiler.outputFileSystem = fs;
  const stats = yield compiler.run();
  //retrieve the output of the compilation
  const res = stats.compilation.assets['file.js'].source();
  return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //should be a bundle containing the underscore source.
The error is
ModuleNotFoundError: Module not found: Error: Cannot resolve module underscore in /
This question indicates that others have tried the same thing: https://github.com/webpack/webpack/issues/1562. There's a gist referenced at https://gist.github.com/DatenMetzgerX/2a96ebf287b4311f4c18 that I believe was intended to do what I'm hoping to accomplish, but in its current form I don't see how. It assigns an instance of MemoryFS to all of the resolvers. I've tried assigning node's fs module, but no dice.
So in short, I'm trying to set an entry point to an in memory string of raw javascript, but still have require and import statements resolved to node_modules on disk.
UPDATE
I've been able to get the result I'm looking for, but it's not pretty. I'm basically overriding the implementation of #stat and #readFile in MemoryFS to check the real filesystem if it gets any request for a file that doesn't exist in memory. I could clean this up a bit by subclassing MemoryFS instead of swapping method implementations at runtime, but the idea would still be the same.
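That subclass would look roughly like this (an untested sketch of the same fall-through idea; it assumes Node's fs is imported as in the solution below, and the class name is made up):
class FallthroughMemoryFS extends MemoryFS {
  // fall through to the real filesystem when the in-memory lookup fails
  stat(_path, cb) {
    super.stat(_path, (err, result) => (err ? fs.stat(_path, cb) : cb(null, result)));
  }

  readFile(_path, cb) {
    super.readFile(_path, (err, result) => (err ? fs.readFile(_path, cb) : cb(null, result)));
  }
}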
Working solution
import webpack from 'webpack';
import JsonLoader from 'json-loader';
import MemoryFS from 'memory-fs';
import UglifyJS from "uglify-js";
import thenify from 'thenify';
import path from 'path';
import fs from 'fs';
import root from 'app-root-path';

/*
 * Provide webpack with an instance of MemoryFS for
 * in-memory compilation. We're currently overriding
 * #stat and #readFile. Webpack will ask MemoryFS for the
 * entry file, which it will find successfully. However,
 * all dependencies are on the real filesystem, so any require
 * or import statements will fail. When that happens, our wrapper
 * functions will then check fs for the requested file.
 */
const memFs = new MemoryFS();
const statOrig = memFs.stat.bind(memFs);
const readFileOrig = memFs.readFile.bind(memFs);

memFs.stat = function (_path, cb) {
  statOrig(_path, function (err, result) {
    if (err) {
      return fs.stat(_path, cb);
    } else {
      return cb(err, result);
    }
  });
};

memFs.readFile = function (path, cb) {
  readFileOrig(path, function (err, result) {
    if (err) {
      return fs.readFile(path, cb);
    } else {
      return cb(err, result);
    }
  });
};

export default function* compile(code) {
  // Setup webpack
  //create a directory structure in MemoryFS that matches
  //the real filesystem
  const rootDir = root.toString();
  //write code snippet to memoryfs
  const outputName = `file.js`;
  const entry = path.join(rootDir, outputName);
  const rootExists = memFs.existsSync(rootDir);
  if (!rootExists) {
    memFs.mkdirpSync(rootDir);
  }
  memFs.writeFileSync(entry, code);
  //point webpack to memoryfs for the entry file
  const compiler = webpack({
    entry: entry,
    output: {
      filename: outputName
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ]
    }
  });
  compiler.run = thenify(compiler.run);
  //direct webpack to use memoryfs for file input
  compiler.inputFileSystem = memFs;
  compiler.resolvers.normal.fileSystem = memFs;
  //direct webpack to output to memoryfs rather than to disk
  compiler.outputFileSystem = memFs;
  const stats = yield compiler.run();
  //remove entry from memory. we're done with it
  memFs.unlinkSync(entry);
  const errors = stats.compilation.errors;
  if (errors && errors.length > 0) {
    //if there are errors, throw the first one
    throw errors[0];
  }
  //retrieve the output of the compilation
  const res = stats.compilation.assets[outputName].source();
  return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //is a valid js bundle containing the underscore source and a log statement logging _.
If there's not a better way, then I'll definitely encapsulate this into a subclass of MemoryFS, but I'm hoping there's a more sane way to accomplish this with Webpack's api.
Instead of memory-fs, the combination of unionfs/memfs/linkfs should help.
https://npmjs.com/unionfs
https://npmjs.com/memfs
https://npmjs.com/linkfs
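A minimal sketch of the wiring, untested and assuming webpack 5 (whose inputFileSystem and outputFileSystem accept any fs-compatible object such as a memfs volume):
const realFs = require('fs');
const webpack = require('webpack');
const { Volume } = require('memfs');
const { ufs } = require('unionfs');

// the entry lives in memory; node_modules resolution falls through to disk
const volume = Volume.fromJSON({ '/file.js': "var _ = require('underscore'); console.log(_);" });
ufs.use(realFs).use(volume);

const compiler = webpack({
  mode: 'none',
  entry: '/file.js',
  output: { path: '/build', filename: 'file.js' }
});
compiler.inputFileSystem = ufs;
compiler.outputFileSystem = volume; // keep the bundle in memory too
compiler.run((err, stats) => {
  if (err || stats.hasErrors()) throw err || new Error(stats.toString());
  console.log(volume.readFileSync('/build/file.js', 'utf8').length);
});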
I have created this snippet, untested. I think you want the input FS to be the real one and the output FS to be the in-memory one. On the other hand, you want all the dependencies of file.js to be built separately. For that I figured the webpack.optimize.CommonsChunkPlugin plugin could help. I expect webpack to write everything to memory. I hope it works.
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';
import realFS from 'fs';

function* compile(code) {
  const fs = new MemoryFS();
  const compiler = webpack({
    entry: {
      file: '/file.js',
      vendor: [
        'underscore',
        'other-package-name'
      ]
    },
    output: {
      path: '/build',
      filename: '[name].js'
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ],
    },
    plugins: [
      new webpack.optimize.CommonsChunkPlugin('vendor', 'vendor.js')
    ]
  });
  compiler.run = thenify(compiler.run);
  compiler.inputFileSystem = realFS;
  compiler.resolvers.normal.fileSystem = fs; //this is needed for memfs
  compiler.outputFileSystem = fs;
  const stats = yield compiler.run();
  //retrieve the output of the compilation
  const res = stats.compilation.assets['file.js'].source();
  return res;
}
You're using MemoryFS, which is a JavaScript reimplementation of a feature normally handled by the Operating System. I wonder, could you mount a directory using tmpfs at the Operating System level, then use that? webpack would then not know or care that the input file is actually stored in memory.
Assuming that you have mounted a memory-based filesystem at /media/memory, the webpack configuration code could be as simple as this:
resolve: {
  root: ['/media/memory', /* ...other paths... */],
},
output: {
  path: '/wherever/you/want/the/output/files'
}
This approach also has a hidden benefit: If you want to debug the input code, you just mount /media/memory with a non-RAM-based filesystem and you can see what's being generated.
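For reference, creating such a mount on Linux might look like this (illustrative; the size and mount point are arbitrary, and root privileges are required):
mkdir -p /media/memory
mount -t tmpfs -o size=64m tmpfs /media/memory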
I know it's late, but for the record, here is a code snippet.
import * as fs from 'fs';
import { resolve } from 'path';
import { Volume } from 'memfs';
import { ufs } from 'unionfs';

const volume = Volume.fromJSON({
  [resolve(process.cwd(), 'test.js')]: 'this file is on memory not on disk'
});

ufs.use(fs).use(volume);

// Reads from memory
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.js'), 'utf8'));
// Reads from disk
console.log(ufs.readFileSync(resolve(process.cwd(), 'package.json'), 'utf8'));

// Writing into memory
volume.writeFileSync(resolve(process.cwd(), 'test.memory'), 'This should be on memory');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.memory'), 'utf8'));

// Writing into disk
ufs.writeFileSync(resolve(process.cwd(), 'test.disk'), 'This should be on disk');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.disk'), 'utf8'));
Here's the console output:
user1@pc playground % node inMem.mjs
this file is on memory not on disk
{
  "name": "playground",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "dependencies": {
    "memfs": "^3.3.0",
    "unionfs": "^4.4.0"
  }
}
This should be on memory
This should be on disk
user1@pc playground % ls .
inMem.mjs node_modules package.json yarn.lock