I'm trying to use webpack to compile an in-memory string of valid JavaScript code. I'm using memory-fs as outlined here: https://webpack.github.io/docs/node.js-api.html#compile-to-memory.
So I'm taking a string containing raw JavaScript, writing it to memory-fs, and then having webpack resolve to that entry point. But the compilation fails on the first require statement, presumably because webpack isn't able to look in the real fs for node_modules.
Any ideas on how I can accomplish this?
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';

function* compile(code) {
  const fs = new MemoryFS();
  fs.writeFileSync('/file.js', code);
  const compiler = webpack({
    entry: { file: '/file.js' },
    output: {
      path: '/build',
      filename: '[name].js'
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ],
    }
  });
  compiler.run = thenify(compiler.run);
  compiler.inputFileSystem = fs;
  compiler.resolvers.normal.fileSystem = fs; // this is needed for memfs
  compiler.outputFileSystem = fs;
  const stats = yield compiler.run();
  // retrieve the output of the compilation
  const res = stats.compilation.assets['file.js'].source();
  return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //should be a bundle containing the underscore source.
The error is
ModuleNotFoundError: Module not found: Error: Cannot resolve module underscore in /
This question indicates that others have tried the same thing: https://github.com/webpack/webpack/issues/1562. There's a gist referenced at https://gist.github.com/DatenMetzgerX/2a96ebf287b4311f4c18 that I believe was intended to do what I'm hoping to accomplish, but in its current form I don't see how: it assigns an instance of MemoryFS to all of the resolvers. I've tried assigning node's fs module instead, but no dice.
So in short, I'm trying to set an entry point to an in memory string of raw javascript, but still have require and import statements resolved to node_modules on disk.
UPDATE
I've been able to get the result I'm looking for, but it's not pretty. I'm basically overriding the implementations of #stat and #readFile in MemoryFS to check the real filesystem whenever it gets a request for a file that doesn't exist in memory. I could clean this up a bit by subclassing MemoryFS instead of swapping method implementations at runtime, but the idea would still be the same.
Working solution
import webpack from 'webpack';
import JsonLoader from 'json-loader';
import MemoryFS from 'memory-fs';
import UglifyJS from "uglify-js";
import thenify from 'thenify';
import path from 'path';
import fs from 'fs';
import root from 'app-root-path';

/*
 * Provide webpack with an instance of MemoryFS for
 * in-memory compilation. We're currently overriding
 * #stat and #readFile. Webpack will ask MemoryFS for the
 * entry file, which it will find successfully. However,
 * all dependencies are on the real filesystem, so any require
 * or import statements will fail. When that happens, our wrapper
 * functions will then check fs for the requested file.
 */
const memFs = new MemoryFS();
const statOrig = memFs.stat.bind(memFs);
const readFileOrig = memFs.readFile.bind(memFs);

memFs.stat = function (_path, cb) {
  statOrig(_path, function (err, result) {
    if (err) {
      return fs.stat(_path, cb);
    } else {
      return cb(err, result);
    }
  });
};

memFs.readFile = function (path, cb) {
  readFileOrig(path, function (err, result) {
    if (err) {
      return fs.readFile(path, cb);
    } else {
      return cb(err, result);
    }
  });
};

export default function* compile(code) {
  // Setup webpack
  // create a directory structure in MemoryFS that matches
  // the real filesystem
  const rootDir = root.toString();
  // write the code snippet to memoryfs
  const outputName = `file.js`;
  const entry = path.join(rootDir, outputName);
  const rootExists = memFs.existsSync(rootDir);
  if (!rootExists) {
    memFs.mkdirpSync(rootDir);
  }
  memFs.writeFileSync(entry, code);
  // point webpack to memoryfs for the entry file
  const compiler = webpack({
    entry: entry,
    output: {
      filename: outputName
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ]
    }
  });
  compiler.run = thenify(compiler.run);
  // direct webpack to use memoryfs for file input
  compiler.inputFileSystem = memFs;
  compiler.resolvers.normal.fileSystem = memFs;
  // direct webpack to output to memoryfs rather than to disk
  compiler.outputFileSystem = memFs;
  const stats = yield compiler.run();
  // remove the entry from memory; we're done with it
  memFs.unlinkSync(entry);
  const errors = stats.compilation.errors;
  if (errors && errors.length > 0) {
    // if there are errors, throw the first one
    throw errors[0];
  }
  // retrieve the output of the compilation
  const res = stats.compilation.assets[outputName].source();
  return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //is a valid js bundle containing the underscore source and a log statement logging _.
If there's not a better way, then I'll definitely encapsulate this into a subclass of MemoryFS, but I'm hoping there's a more sane way to accomplish this with Webpack's api.
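For reference, the subclass version would probably look something like this (an untested sketch of the same fall-through-to-disk idea):

import MemoryFS from 'memory-fs';
import fs from 'fs';

// Untested sketch: the same idea as the runtime overrides above, but as a subclass.
// Any file that isn't found in memory is looked up on the real filesystem.
class FallbackMemoryFS extends MemoryFS {
  stat(_path, cb) {
    super.stat(_path, (err, result) => {
      if (err) return fs.stat(_path, cb);
      cb(null, result);
    });
  }

  readFile(_path, cb) {
    super.readFile(_path, (err, result) => {
      if (err) return fs.readFile(_path, cb);
      cb(null, result);
    });
  }
}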
Instead of memory-fs, the combination of unionfs/memfs/linkfs should help.
https://npmjs.com/unionfs
https://npmjs.com/memfs
https://npmjs.com/linkfs
I have created this snippet, untested. I think you want the input FS to be the real one and the output FS to be the in-memory one. On the other hand, you want all the dependencies of file.js to be bundled separately. For that I figured the webpack.optimize.CommonsChunkPlugin could help. I expect webpack to write everything to memory. I hope it works.
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';
import realFS from 'fs';

function* compile(code) {
  const fs = new MemoryFS();
  const compiler = webpack({
    entry: {
      file: '/file.js',
      vendor: [
        'underscore',
        'other-package-name'
      ]
    },
    output: {
      path: '/build',
      filename: '[name].js'
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ],
    },
    plugins: [
      new webpack.optimize.CommonsChunkPlugin('vendor', 'vendor.js')
    ]
  });
  compiler.run = thenify(compiler.run);
  compiler.inputFileSystem = realFS;
  compiler.resolvers.normal.fileSystem = fs; // this is needed for memfs
  compiler.outputFileSystem = fs;
  const stats = yield compiler.run();
  // retrieve the output of the compilation
  const res = stats.compilation.assets['file.js'].source();
  return res;
}
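If the vendor split behaves as intended, both bundles should then be readable from the in-memory output; a small sketch, assuming the asset names produced by the config above:

// Assuming the CommonsChunkPlugin emitted 'vendor.js' alongside 'file.js',
// both can be pulled out of the in-memory compilation assets:
const appSource = stats.compilation.assets['file.js'].source();
const vendorSource = stats.compilation.assets['vendor.js'].source();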
You're using MemoryFS, which is a JavaScript reimplementation of a feature normally handled by the operating system. I wonder: could you mount a directory using tmpfs at the operating-system level and then use that? webpack would then not know or care that the input file is actually stored in memory.
Assuming that you have mounted a memory-based filesystem at /media/memory, the webpack configuration code could be as simple as this:
resolve: {
  root: ['/media/memory', ...other paths...],
},
output: {
  path: '/wherever/you/want/the/output/files'
}
This approach also has a hidden benefit: If you want to debug the input code, you just mount /media/memory with a non-RAM-based filesystem and you can see what's being generated.
I know it's late but for the record here comes a code snippet.
import * as fs from 'fs';
import { resolve } from 'path';
import { Volume } from 'memfs';
import { ufs } from 'unionfs';

const volume = Volume.fromJSON({
  [resolve(process.cwd(), 'test.js')]: 'this file is on memory not on disk'
});
ufs.use(fs).use(volume);

// Reads from memory
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.js'), 'utf8'));
// Reads from disk
console.log(ufs.readFileSync(resolve(process.cwd(), 'package.json'), 'utf8'));

// Writing into memory
volume.writeFileSync(resolve(process.cwd(), 'test.memory'), 'This should be on memory');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.memory'), 'utf8'));

// Writing into disk
ufs.writeFileSync(resolve(process.cwd(), 'test.disk'), 'This should be on disk');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.disk'), 'utf8'));
Here's the console output:
user1@pc playground % node inMem.mjs
this file is on memory not on disk
{
  "name": "playground",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "dependencies": {
    "memfs": "^3.3.0",
    "unionfs": "^4.4.0"
  }
}
This should be on memory
This should be on disk
user1@pc playground % ls .
inMem.mjs    node_modules    package.json    yarn.lock
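To connect this back to the original question, the same union filesystem could be handed to webpack as its input filesystem; a sketch, untested, assuming the older compiler API (inputFileSystem / resolvers) used in the question:

import webpack from 'webpack';
import * as fs from 'fs';
import { resolve } from 'path';
import { Volume } from 'memfs';
import { ufs } from 'unionfs';

// The entry file exists only in memory; node_modules and everything else
// are found on the real disk through the union filesystem.
const volume = Volume.fromJSON({
  [resolve(process.cwd(), 'file.js')]: "var _ = require('underscore'); console.log(_);"
});
ufs.use(fs).use(volume);

const compiler = webpack({
  entry: resolve(process.cwd(), 'file.js'),
  output: { path: resolve(process.cwd(), 'build'), filename: 'bundle.js' }
});
compiler.inputFileSystem = ufs;
compiler.resolvers.normal.fileSystem = ufs; // older webpack API, as in the question
// Output handling is left as in the question (memory-fs) or written to disk.
compiler.run((err, stats) => {
  if (err) throw err;
  console.log(stats.toString());
});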
Related
As per the documentation, I am not using worker-loader; I am trying to use the native way suggested by the webpack 5 documentation.
Below is the usage of the worker script in the main thread.
const worker = new window.Worker(
  new URL("../workers/listOperation.worker.js", import.meta.url),
  {
    type: "module",
  },
);
worker.postMessage({ list: hugeList, params: reqData });
worker.onerror = err => console.error(err);
worker.onmessage = e => {
  const { list } = e.data;
  // Usage of `list` from the response
  worker.terminate();
};
return worker;
It works fine if there are no imports used in the script. But when I import any node modules (e.g. lodash/get) or any other function/constants from local files, it does not work: the output web worker bundle file doesn't transpile and bundle the imported code. It keeps the import statement as it is.
Below is the worker script (listOperation.worker.js)
import get from "lodash/get";
import { ANY_CONSTANT } from "../constants"; // This is some local constant

addEventListener("message", e => {
  const { list, params } = e.data;
  // Here I have some usage of the `get` method from `lodash/get` and ANY_CONSTANT
  self.postMessage({
    list: list,
  });
});
Webpack outputs the bundle file like below, which won't be usable by the browser, if I put the /\.worker.js$/ pattern in the exclude of the babel-loader rule.
import get from "lodash/get";import{ANY_CONSTANT}from"../constants";addEventListener("message",(e=>{const{list:s,params:t,.......
And even if I don't put the /\.worker.js$/ pattern in the exclude of the babel-loader rule, the output bundle still doesn't include the implementation of get from lodash/get or the value of the constant. It just outputs it in CJS using require.
Also, I made use of the asset module so that I can put the output file inside a directory, not directly in the root of the dist folder. The configuration changes in my webpack.config look like this.
module.exports = {
  entry: {...},
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: [/(node_modules)/, /\.worker.js$/],
        use: {
          loader: "babel-loader", // This uses the config defined in babel.config.js
        },
      },
      {
        test: /\.worker.js$/,
        exclude: /(node_modules)/,
        type: "asset/resource",
        generator: {
          filename: "js/workers/[hash][ext][query]",
        },
      },
    ],
  },
}
The problem
My project consists of a Yarn monorepo in which, among various packages, there is a NextJS app and a configuration package (which is also shared with the server and a react-native application). What the configuration module does is export the production keys if the environment is production, and the development keys if the project is under development.
import { merge } from "lodash";
import { IConfig, RecursivePartial } from "./interfaces";
import { defaultKeys } from "./default.keys";
import { existsSync } from "fs";

const secretKeys: RecursivePartial<IConfig> = (function (env) {
  switch (env) {
    case "production":
      return require("./production.keys").productionKeys;
    default:
      try {
        if (!existsSync("./development.keys")) {
          return require("./production.keys").productionKeys;
        } else {
          return require("./development.keys").developmentKeys;
        }
      } catch (e) {
      }
  }
})(process.env.NODE_ENV);

export const keys = merge(defaultKeys, secretKeys) as IConfig;
Of course, the development configuration file is included in the .gitignore file and therefore does not exist during production.
This has always worked perfectly (with the server and the mobile app), and indeed, there was never the need to check with the fs module if the development.keys file exists (check which I added later).
However, I recently added a NextJs application to the project. I encountered no problems during development, however when I tried to deploy the app on Heroku I encountered this error:
ModuleNotFoundError: Module not found: Error: Can't resolve './development.keys' in '/tmp/build_43d652bc/packages/config/dist/keys'
What I did
Initially, I thought it was a problem with the environment variables and that the development file was wrongly required in production.
However, I later realized that this was not the problem. Indeed, even placing the import of the development configuration file anywhere in the code, even after a return, the error continues to occur.
import { merge } from "lodash";
import { IConfig, RecursivePartial } from "./interfaces";
import { defaultKeys } from "./default.keys";
import { existsSync } from "fs";

const secretKeys: RecursivePartial<IConfig> = (function (env) {
  switch (env) {
    case "production":
      return require("./production.keys").productionKeys;
    default:
      try {
        if (!existsSync("./development.keys")) {
          return require("./production.keys").productionKeys; // <-------
        } else {
          return require("./production.keys").productionKeys; // <-------
        }
        require("./development.keys").developmentKeys; // <------- This line should not be executed
      } catch (e) {
        return require("./production.keys").productionKeys;
      }
  }
})(process.env.NODE_ENV);

export const keys = merge(defaultKeys, secretKeys) as IConfig;
It is as if, during the build, NextJS (or probably webpack) resolves all the imports without following the "flow" of the code.
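To illustrate what I mean (this is just my understanding, not verified): requires seem to be collected statically at build time, regardless of whether the branch can ever run.

// Both requires below appear to be resolved during the build,
// even though the first branch can never execute at runtime.
if (false) {
  require("./development.keys"); // still resolved by the bundler
}
require("./production.keys");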
I hope someone can show me where I am going wrong, because at the moment I am stuck. Thank you!
Update
Thanks to the ideas in this discussion, I changed my file, which now looks like this:
const getConfigPath = (env?: string): string => {
  console.log(env);
  if (env === "production") return "production.keys";
  else return "development.keys";
};

const secretKeys: RecursivePartial<IConfig> = require("./" +
  getConfigPath(process.env.NODE_ENV)).keys;

export const keys = merge(defaultKeys, secretKeys) as IConfig;
However, now I'm running into another webpack error:
Module parse failed: Unexpected token (2:7)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
| import { IConfig, RecursivePartial } from "../interfaces";
> export declare const keys: RecursivePartial<IConfig>;
It is as if webpack does not recognize declaration files generated by typescript. However, the error is new and I have never encountered it. I believe it's due to the behavior of webpack pointed out in the linked discussion.
I'm hoping for some help, since I know little about webpack.
Edit
This is my next.config.js:
const path = require("path");

module.exports = {
  distDir: '../../.next',
  sassOptions: {
    includePaths: [path.join(__dirname, 'styles')],
    prependData: `@import "variables.module.scss";`
  },
  webpack: (config, { isServer }) => {
    if (!isServer) {
      config.resolve.fallback.fs = false;
    }
    return config;
  },
};
Basically these are the default configurations. The only things I have changed are the path for the build, making a Sass file global, and momentarily adding a piece of code to allow the use of the fs module, which, as you can see above, I no longer use. So I could take this configuration out.
You need the loader for TypeScript configured in your next.config.js:
npm install || yarn add awesome-typescript-loader
const path = require("path");

module.exports = {
  distDir: '../../.next',
  sassOptions: {
    includePaths: [path.join(__dirname, 'styles')],
    prependData: `@import "variables.module.scss";`
  },
  webpack: (config, options) => {
    const { dir, defaultLoaders, isServer } = options;
    if (!isServer) {
      config.resolve.fallback.fs = false;
    }
    config.pageExtensions.push(".ts", ".tsx");
    config.resolve.extensions.push(".ts", ".tsx");
    config.module.rules.push({
      test: /\.tsx?$/,
      include: [dir],
      exclude: /node_modules/,
      use: [
        defaultLoaders.babel,
        {
          loader: "awesome-typescript-loader",
        },
      ],
    });
    return config;
  }
}
I utilize webpack with webpack-import-glob-loader to import files using a glob pattern. So in one of my files (src/store/resources/module.ts) I have the line:
import '../../modules/resources/providers/**/*.resource.ts';
When I run a test with ts-jest it fails and reports the following message:
Cannot find module '../../modules/resources/providers/**/*.resource.ts' from 'src/store/resources/module.ts'
I assume it complains because it can't recognize this import syntax.
How do I make Jest work for a project with glob imports?
I solved this by manually handling the globs inside of a jest preprocessor. Since you need to control processing of the files to handle globs in this approach, you have to manually initialize your processor.
// config.js
module.exports = {
  transform: {
    '.': `./path/to/your/processor.js`
  }
};
// processor.js
const path = require(`path`);
const glob = require(`glob`).sync;
const yourProcessor = // require your processor code - ts-jest, babel-jest, esbuild-jest, etc

module.exports = {
  process(src, filename, config, opts) {
    const dir = path.dirname(filename);
    src = processGlob(src, dir);
    return yourProcessor(src, filename, config, opts);
  },
};

function processGlob(src, dir) {
  // This will match things like
  // import './**/*';
  // import '../../modules/resources/providers/**/*.resource.ts';
  // Takes inspiration from https://github.com/excid3/esbuild-rails/blob/main/src/index.js
  return src.replace(/^import\s'(.*\*.*)';$/m, (_match, pathCapture) => {
    const matcher = /.+\..+/; // Handles '.' results
    const files = glob(pathCapture, {
      cwd: dir,
    })
      .sort()
      .filter((path) => matcher.test(path));
    return `${files.map((module, index) => `import * as module${index} from '${module}'`).join(`;`)}`;
  });
}
In this approach you can only glob once per file (because it uses the .map indexes to add numbers to each imported module name). You could keep track of a global count variable instead if you wanted to have several glob imports in one file, as sketched below.
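A sketch of that variant, reusing the same requires as the processor above (untested):

// processor.js (variant): a module-level counter avoids name collisions,
// so several glob imports can live in the same source file.
let moduleCount = 0;

function processGlob(src, dir) {
  // Note the added 'g' flag so every matching import line is rewritten.
  return src.replace(/^import\s'(.*\*.*)';$/gm, (_match, pathCapture) => {
    const matcher = /.+\..+/;
    const files = glob(pathCapture, { cwd: dir })
      .sort()
      .filter((p) => matcher.test(p));
    return files
      .map((module) => `import * as module${moduleCount++} from '${module}'`)
      .join(`;`);
  });
}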
I'm trying to concatenate two JS files after the webpack build process. The goal is to provide a single JS file containing both the ES6 modules and the legacy code.
I already tried plugins like webpack-concat-files-plugin without success, which makes sense to me, because the output files are not there yet when the plugin gets executed.
Another thought would be a small script executed in the afterCompile hook, which handles the concatenation of the two files. Is this the common way to do something like this, or are there other ways to achieve the goal?
Thanks for your help.
Basic example:
const path = require('path');

module.exports = {
  entry: {
    app: 'app.js',
    legacy: [
      'legacy-1.js',
      'legacy-2.js',
      // ...
    ]
  },
  output: {
    filename: path.join('dist/js', '[name].js'),
  }
}
Solved this as suggested:
FileMergeWebpackPlugin.js
const fs = require('fs');

class FileMergeWebpackPlugin {
  constructor({ files, destination, removeSourceFiles }) {
    this.files = files;
    this.destination = destination;
    this.removeSourceFiles = removeSourceFiles;
  }

  apply(compiler) {
    compiler.hooks.afterEmit.tap('FileMergeWebpackPlugin', () => {
      const fileBuffers = [];
      this.files
        .filter(file => fs.existsSync(file))
        .forEach(file => fileBuffers.push(fs.readFileSync(file)));
      fs.writeFileSync(this.destination, Buffer.concat(fileBuffers));
      if (this.removeSourceFiles) {
        this.files.forEach(file => fs.unlinkSync(file));
      }
    });
  }
}

module.exports = FileMergeWebpackPlugin;
webpack.config.js
const path = require('path');
const FileMergeWebpackPlugin = require('./FileMergeWebpackPlugin');

module.exports = {
  entry: {
    app: 'app.js',
    legacy: [
      'legacy-1.js',
      'legacy-2.js',
    ]
  },
  output: {
    filename: path.join('dist/js', '[name].js'),
  },
  plugins: [
    new FileMergeWebpackPlugin({
      destination: 'dist/js/bundle.js',
      removeSourceFiles: true,
      files: [
        'dist/js/app.js',
        'dist/js/legacy.js',
      ]
    })
  ]
}
Eventually I will release this as an npm package; I'll update the post when I do so.
Yes.
You can create your own plugin that is activated in the emit hook; the parameter for this hook is the compilation. You can get the chunks you want to concatenate from this object and create a new chunk with the concatenated value (or just append one to the other).
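A minimal sketch of that idea (untested; the asset names are assumptions based on the question's config, and it uses webpack 4-style raw asset objects; newer webpack versions would prefer compilation.emitAsset with webpack.sources.RawSource):

// ConcatChunksPlugin.js: joins two emitted assets into a third one
// inside the emit hook, before anything is written to disk.
class ConcatChunksPlugin {
  constructor({ sources, destination }) {
    this.sources = sources;         // e.g. ['dist/js/app.js', 'dist/js/legacy.js'] (assumed names)
    this.destination = destination; // e.g. 'dist/js/bundle.js'
  }

  apply(compiler) {
    compiler.hooks.emit.tap('ConcatChunksPlugin', (compilation) => {
      const merged = this.sources
        .map((name) => compilation.assets[name].source())
        .join('\n');
      compilation.assets[this.destination] = {
        source: () => merged,
        size: () => merged.length,
      };
    });
  }
}

module.exports = ConcatChunksPlugin;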
I'm using webpack's [hash] for cache-busting locale files. But I also need to hard-code the locale file path to load it from the browser. Since the file path is altered with [hash], I need to inject this value to get the right path.
I don't know how I can get webpack's [hash] value programmatically in the config so I can inject it using webpack.DefinePlugin.
module.exports = (env) => {
  return {
    entry: 'app/main.js',
    output: {
      filename: '[name].[hash].js'
    },
    ...
    plugins: [
      new webpack.DefinePlugin({
        HASH: ***???***
      })
    ]
  }
}
In case you want to dump the hash to a file and load it in your server's code, you can define the following plugin in your webpack.config.js:
const fs = require('fs');

class MetaInfoPlugin {
  constructor(options) {
    this.options = { filename: 'meta.json', ...options };
  }

  apply(compiler) {
    compiler.hooks.done.tap(this.constructor.name, stats => {
      const metaInfo = {
        // add any other information if necessary
        hash: stats.hash
      };
      const json = JSON.stringify(metaInfo);
      return new Promise((resolve, reject) => {
        fs.writeFile(this.options.filename, json, 'utf8', error => {
          if (error) {
            reject(error);
            return;
          }
          resolve();
        });
      });
    });
  }
}

module.exports = {
  // ... your webpack config ...
  plugins: [
    // ... other plugins ...
    new MetaInfoPlugin({ filename: 'dist/meta.json' }),
  ]
};
Example content of the output meta.json file:
{"hash":"64347f3b32969e10d80c"}
I've just created a dumpmeta-webpack-plugin package for this plugin. So you might use it instead:
const { DumpMetaPlugin } = require('dumpmeta-webpack-plugin');

module.exports = {
  ...
  plugins: [
    ...
    new DumpMetaPlugin({
      filename: 'dist/meta.json',
      prepare: stats => ({
        // add any other information you need to dump
        hash: stats.hash,
      })
    }),
  ]
}
Please refer to the Webpack documentation for all available properties of the Stats object.
Seems like it should be a basic feature but apparently it's not that simple to do.
You can accomplish what you want by using wrapper-webpack-plugin.
plugins: [
  new WrapperPlugin({
    header: '(function (BUILD_HASH) {',
    footer: function (fileName) {
      const rx = /^.+?\.([a-z0-9]+)\.js$/;
      const hash = fileName.match(rx)[1];
      return `})('${hash}');`;
    },
  })
]
A bit hacky, but it works, if you don't mind the entire chunk being wrapped in an anonymous function.
Alternatively you can just add var BUILD_HASH = ... in the header option, though it could cause problems if it becomes a global.
I created this plugin a while back; I'll try to update it so it provides the chunk hash naturally.
On the server, you can get the hash by reading the filenames (example: web.bundle.f4771c44ee57573fabde.js) from your bundle folder.
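For example (a sketch; the bundle folder and filename pattern are assumptions based on the example name above):

const fs = require('fs');

// Find a file like "web.bundle.f4771c44ee57573fabde.js" and pull the hash out of its name.
const bundleDir = './dist'; // hypothetical bundle folder
const bundleFile = fs.readdirSync(bundleDir).find((f) => /^web\.bundle\.[0-9a-f]+\.js$/.test(f));
const hash = bundleFile && bundleFile.match(/^web\.bundle\.([0-9a-f]+)\.js$/)[1];
console.log(hash); // e.g. "f4771c44ee57573fabde"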
You can pass the version to your build using webpack.DefinePlugin
If you have a package.json with a version, you can extract it like this:
const version = require("./package.json").version;
For example (we stringified the version):
new webpack.DefinePlugin({
  'process.env.VERSION': JSON.stringify(version)
}),
Then in your JavaScript, the version will be available as:
process.env.VERSION
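Putting the pieces together, the relevant part of webpack.config.js might look like this (a sketch based on the snippets above):

// webpack.config.js (excerpt)
const webpack = require("webpack");
const version = require("./package.json").version;

module.exports = {
  // ... rest of your config ...
  plugins: [
    new webpack.DefinePlugin({
      'process.env.VERSION': JSON.stringify(version)
    })
  ]
};

// Anywhere in the bundled code:
// console.log(`Running version ${process.env.VERSION}`);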
The WebpackManifestPlugin is officially recommended in the output management guide. It writes a JSON to the output directory mapping the input filenames to the output filenames. Then you can inject those mapped values into your server template.
It's similar to Dmitry's answer, except Dmitry's doesn't appear to support multiple chunks.
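A minimal configuration sketch (assuming webpack-manifest-plugin v3+, which exports a named WebpackManifestPlugin; the output shape shown is illustrative):

// webpack.config.js (excerpt): writes manifest.json mapping entry names
// to their hashed output filenames.
const { WebpackManifestPlugin } = require("webpack-manifest-plugin");

module.exports = {
  output: { filename: "[name].[contenthash].js" },
  plugins: [new WebpackManifestPlugin()]
};

// Example manifest.json (illustrative):
// { "main.js": "main.64347f3b32969e10d80c.js" }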
That can be done with the Webpack Stats Plugin. It gives you a nice and neat output file with all the data you want, and it's easy to incorporate into webpack config files where needed. E.g., to get the hash generated by webpack and use it elsewhere, it could be achieved like this:
# installation
npm install --save-dev webpack-stats-plugin
yarn add --dev webpack-stats-plugin

# generating stats file
const { StatsWriterPlugin } = require("webpack-stats-plugin")

module.exports = {
  plugins: [
    // Everything else **first**.
    // Write out stats file to build directory.
    new StatsWriterPlugin({
      stats: {
        all: false,
        hash: true,
      },
      filename: "stats.json" // Default and goes straight to your output folder
    })
  ]
}

# usage
const stats = require("YOUR_PATH_TO/stats.json");
console.log("Webpack's hash is - ", stats.hash);
More usage examples in their repo
Hope that helps!