Using jest with files that have glob imports - javascript

I utilize webpack with webpack-import-glob-loader to import files using a glob pattern. So in one of my files (src/store/resources/module.ts) I have the line:
import '../../modules/resources/providers/**/*.resource.ts';
When I run a test with ts-jest it fails and reports the following message:
Cannot find module '../../modules/resources/providers/**/*.resource.ts' from 'src/store/resources/module.ts'
I assume it complains because it can't recognize this import syntax.
How to make jest work for project with glob imports?

I solved this by manually handling the globs inside of a jest preprocessor. Since you need to control processing of the files to handle globs in this approach, you have to manually initialize your processor.
// config.js
module.exports = {
  transform: {
    '.': `./path/to/your/processor.js`,
  },
};

// processor.js
const path = require(`path`);
const glob = require(`glob`).sync;
// require your processor's process function here - ts-jest, babel-jest, esbuild-jest, etc.
const yourProcessor = require(`your-processor`).process;
module.exports = {
  process(src, filename, config, opts) {
    const dir = path.dirname(filename);
    src = processGlob(src, dir);
    return yourProcessor(src, filename, config, opts);
  },
};
function processGlob(src, dir) {
  // This will match things like
  // import './**/*';
  // import '../../modules/resources/providers/**/*.resource.ts';
  // Takes inspiration from https://github.com/excid3/esbuild-rails/blob/main/src/index.js
  return src.replace(/^import\s'(.*\*.*)';$/m, (_match, pathCapture) => {
    const matcher = /.+\..+/; // Handles '.' results
    const files = glob(pathCapture, {
      cwd: dir,
    })
      .sort()
      .filter((path) => matcher.test(path));
    return `${files.map((module, index) => `import * as module${index} from '${module}'`).join(`;`)}`;
  });
}
In this approach you can only glob once per file: the regex has no g flag, so only the first matching import line is rewritten, and the .map index is what numbers each imported module name. If you want several glob imports in one file, add the g flag and keep a module-level counter instead, as in the sketch below.
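A minimal sketch of that counter variant (untested, reusing the same glob and matcher setup from above):

// processor.js (variant allowing several glob imports per file)
let moduleCount = 0;
function processGlob(src, dir) {
  // `g` flag so every glob import line in the file is rewritten
  return src.replace(/^import\s'(.*\*.*)';$/gm, (_match, pathCapture) => {
    const matcher = /.+\..+/; // Handles '.' results
    const files = glob(pathCapture, { cwd: dir })
      .sort()
      .filter((p) => matcher.test(p));
    return files
      .map((module) => `import * as module${moduleCount++} from '${module}'`)
      .join(`;`);
  });
}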

Related

Webpack - Bundling multiple js files with reusable methods and variables from one another

I am trying to bundle around 10+ javascript files of my application which are loaded as scripts in the index.html file. They are properly sequenced as per their dependencies with one another.
<script src="/js/main.js"></script>
<script src="/js/user.js"></script>
<script src="/js/contact.js"></script>
...
The code inside each file looks like this:
// main.js
const _main = {};
_main.MethodOne = function(){
  ...
}

// user.js
const _user = {};
_user.MethodTwo = function(){
  // Can access the method from main.js file
  _main.MethodOne();
}

// contact.js
const _contact = {};
_contact.MethodThree = function(){
  // Can access the methods from main.js & user.js file
  _user.MethodTwo();
  _main.MethodOne();
}
Now, when bundling them through webpack, the constant variables _main, _contact & _user get renamed to some random letters. However, the names used to invoke those methods from other files remain the same in each case, i.e. _user.MethodTwo() and _main.MethodOne() don't change.
This is my webpack.config.js
entry: {
  vendors: [
    './public/js/colorpicker.js',
    ...
  ],
  app: [
    './public/js/main.js',
    './public/js/user.js',
    './public/js/contact.js'
  ]
},
mode: 'production',
output: {
  filename: 'js/[name].[contenthash].bundle.js',
  path: path.resolve(__dirname, 'dist'),
  clean: true,
  publicPath: 'auto',
  assetModuleFilename: 'img/[hash][ext][query]'
},
I read the webpack documentation but didn't find any clue about this problem.
Your best bet is to move to the module approach by using import and export.
So instead of defining global variables like you do, each file would export the corresponding variable. If a file needs a variable from another file, it simply has to import it.
// main.js
export const _main = {};
_main.MethodOne = function(){
  ...
}

// user.js
import {_main} from "./main.js";
export const _user = {};
_user.MethodTwo = function(){
  // Can access the method from main.js file
  _main.MethodOne();
}

// contact.js
import {_main} from "./main.js";
import {_user} from "./user.js";
const _contact = {};
_contact.MethodThree = function(){
  // Can access the methods from main.js & user.js file
  _user.MethodTwo();
  _main.MethodOne();
}
Read more about import/export here
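As a side note, once the files are proper modules, webpack can follow the import graph on its own, so the app entry could be reduced to a single file. An untested sketch, assuming contact.js sits at the top of the dependency chain:

// webpack.config.js (sketch)
entry: {
  app: './public/js/contact.js' // pulls in user.js and main.js via imports
},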

Yarn Workspaces, workspace does not emit errors or warnings

I have followed the following post in order to create a monorepo using yarn workspaces and craco.
It works really well except for one thing: the errors/warnings of the common (components) library are not emitted to the console.
The structure is very simple:
monorepo
|- packages
   |- components
   |- fe
fe is the main web app that uses the components library.
fe emits all warnings correctly; components does not.
How to make the shared component emit warnings/errors?
Updated:
Steps to reproduce in this repo:
https://github.com/sofoklisM/my-monorepo.git
What you need to change is the context option of the underlying ESLint Webpack plugin that is used by Create React App.
In this case I changed the context of ESLint to the root of the monorepo (yarn workspace root).
Here is an updated craco.config.js that should do the trick:
// craco.config.js
const path = require("path");
const { getLoader, loaderByName } = require("@craco/craco");
const { getPlugin, pluginByName } = require("@craco/craco/lib/webpack-plugins");

const absolutePath = path.join(__dirname, "../components");

module.exports = {
  webpack: {
    alias: {},
    plugins: [],
    configure: (webpackConfig, { env, paths }) => {
      const { isFound, match } = getLoader(
        webpackConfig,
        loaderByName("babel-loader")
      );
      if (isFound) {
        const include = Array.isArray(match.loader.include)
          ? match.loader.include
          : [match.loader.include];
        match.loader.include = include.concat([absolutePath]);
      }

      // Change the context of the ESLint Webpack Plugin
      const { match: eslintPlugin } = getPlugin(webpackConfig, pluginByName("ESLintWebpackPlugin"));
      eslintPlugin.options['context'] = path.join(__dirname, "../..");

      return webpackConfig;
    }
  }
};
I've also made an updated fork of your reproduction repo here: https://github.com/ofhouse/stackoverflow-65447779

How to inject Webpack build hash to application code

I'm using Webpack's [hash] for cache busting locale files. But I also need to hard-code the locale file path to load it from the browser. Since the file path is altered with [hash], I need to inject this value to get the right path.
I don't know how to get the Webpack [hash] value programmatically in the config so I can inject it using webpack.DefinePlugin.
module.exports = (env) => {
  return {
    entry: 'app/main.js',
    output: {
      filename: '[name].[hash].js'
    },
    ...
    plugins: [
      new webpack.DefinePlugin({
        HASH: ***???***
      })
    ]
  }
}
In case you want to dump the hash to a file and load it in your server's code, you can define the following plugin in your webpack.config.js:
const fs = require('fs');

class MetaInfoPlugin {
  constructor(options) {
    this.options = { filename: 'meta.json', ...options };
  }

  apply(compiler) {
    compiler.hooks.done.tap(this.constructor.name, stats => {
      const metaInfo = {
        // add any other information if necessary
        hash: stats.hash
      };
      const json = JSON.stringify(metaInfo);
      return new Promise((resolve, reject) => {
        fs.writeFile(this.options.filename, json, 'utf8', error => {
          if (error) {
            reject(error);
            return;
          }
          resolve();
        });
      });
    });
  }
}

module.exports = {
  // ... your webpack config ...
  plugins: [
    // ... other plugins ...
    new MetaInfoPlugin({ filename: 'dist/meta.json' }),
  ]
};
Example content of the output meta.json file:
{"hash":"64347f3b32969e10d80c"}
I've just created a dumpmeta-webpack-plugin package for this plugin. So you might use it instead:
const { DumpMetaPlugin } = require('dumpmeta-webpack-plugin');

module.exports = {
  ...
  plugins: [
    ...
    new DumpMetaPlugin({
      filename: 'dist/meta.json',
      prepare: stats => ({
        // add any other information you need to dump
        hash: stats.hash,
      })
    }),
  ]
}
Please refer to the Webpack documentation for all available properties of the Stats object.
Seems like it should be a basic feature but apparently it's not that simple to do.
You can accomplish what you want by using wrapper-webpack-plugin.
plugins: [
  new WrapperPlugin({
    header: '(function (BUILD_HASH) {',
    footer: function (fileName) {
      const rx = /^.+?\.([a-z0-9]+)\.js$/;
      const hash = fileName.match(rx)[1];
      return `})('${hash}');`;
    },
  })
]
A bit hacky, but it works, as long as you don't mind the entire chunk being wrapped in an anonymous function.
Alternatively you can just add var BUILD_HASH = ... in the header option, though that could cause problems if it becomes a global.
I created this plugin a while back; I'll try to update it so it provides the chunk hash naturally.
On the server, you can get the hash by reading the filenames (for example: web.bundle.f4771c44ee57573fabde.js) in your bundle folder.
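A minimal sketch of that idea (untested; it assumes the bundle lives in dist and follows the web.bundle.<hash>.js naming from the example above):

// sketch: recover the build hash from the emitted filename
const fs = require('fs');

const bundleName = fs.readdirSync('dist').find((f) => /^web\.bundle\.[a-f0-9]+\.js$/.test(f));
const hash = bundleName.match(/^web\.bundle\.([a-f0-9]+)\.js$/)[1];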
You can pass the version to your build using webpack.DefinePlugin
If you have a package.json with a version, you can extract it like this:
const version = require("./package.json").version;
For example (we stringified the version):
new webpack.DefinePlugin({
  'process.env.VERSION': JSON.stringify(version)
}),
then in your javascript, the version will be available as:
process.env.VERSION
The WebpackManifestPlugin is officially recommended in the output management guide. It writes a JSON to the output directory mapping the input filenames to the output filenames. Then you can inject those mapped values into your server template.
It's similar to Dmitry's answer, except Dmitry's doesn't appear to support multiple chunks.
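A minimal configuration sketch (assuming webpack-manifest-plugin v4+, where the plugin is a named export):

// webpack.config.js (sketch)
const { WebpackManifestPlugin } = require('webpack-manifest-plugin');

module.exports = {
  // ... your webpack config ...
  plugins: [
    // writes manifest.json (input name -> hashed output name) to the output directory
    new WebpackManifestPlugin(),
  ],
};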
That can be done with the Webpack Stats Plugin. It gives you a nice and neat output file with all the data you want, and it's easy to incorporate into the webpack config files where needed.
E.g. to get the hash generated by Webpack and use it elsewhere:
# installation
npm install --save-dev webpack-stats-plugin
yarn add --dev webpack-stats-plugin
# generating stats file
const { StatsWriterPlugin } = require("webpack-stats-plugin")

module.exports = {
  plugins: [
    // Everything else **first**.
    // Write out stats file to build directory.
    new StatsWriterPlugin({
      stats: {
        all: false,
        hash: true,
      },
      filename: "stats.json" // Default and goes straight to your output folder
    })
  ]
}
# usage
const stats = require("YOUR_PATH_TO/stats.json");
console.log("Webpack's hash is - ", stats.hash);
More usage examples in their repo
Hope that helps!

How to whitelist all subdependencies with webpack-node-externals

I'm using webpack to bundle server assets using the target property.
This results in a usable client bundle and a usable server, which is working great. However, it seems that even for the server code, webpack is bundling everything within node_modules. I am attempting to use webpack-node-externals to solve this problem, as seen below:
module.exports = [
  {
    name: "server code, output to ./server",
    entry: "./servertest.js",
    output: {
      filename: "./server/index.js"
    },
    target: "node",
    externals: [
      nodeExternals({
        includeClientPackages: false
      })
    ]
  },
  {
    name: "client side, output to ./public",
    entry: "./app.js",
    output: {
      filename: "./dist/app.js"
    }
  }
]
This doesn't work, however, as its default behavior is to exclude all node_modules from bundling, thus rendering the server useless. There is a whitelist option, in which I have included express, the only dependency of my small test case. It doesn't fail on express itself, but it fails on a dependency of express, merge-descriptors. And of course if I add merge-descriptors to the whitelist, trying to start the server fails on another dependency of express. I surely cannot add every dependency and sub-dependency (etc etc) to this whitelist array.
How can I ensure all dependencies of a given requirement are bundled by webpack during a target: 'node' build?
To deal with this, I created a small helper, datwd, to get a list of all subdependencies for specific packages:
// webpack.config.js
const nodeExternals = require('webpack-node-externals')
const includeSubdependencies = require('datwd')

module.exports = {
  // ...
  externals: [
    nodeExternals({
      // Will include "cookies" and its dependencies; for example:
      // `['cookies', 'depd', 'keygrip', 'tsscmp']`
      allowlist: includeSubdependencies(['cookies'])
    })
  ]
}
It relies on npm ls under the hood. Source code:
/* eslint-disable global-require, import/no-dynamic-require */
const { execSync } = require('child_process')

/**
 * Returns a flat array of all Node module dependency names for the entire
 * dependency tree, optionally filtered by top-level modules. Requires NPM
 * and for dependencies to be installed.
 *
 * @param {Array} moduleFilter - An optional array of top-level module names
 *   whose dependencies should be included. If specified, any other modules'
 *   dependencies will be excluded.
 * @return {String[]} An array of module names
 */
const getAllDependencies = (moduleFilterInput = []) => {
  const moduleFilter = Array.isArray(moduleFilterInput)
    ? moduleFilterInput
    : [moduleFilterInput]

  // Get the full dependency tree using NPM, excluding dev dependencies
  // and peer dependencies.
  const dependencyTree = JSON.parse(execSync('npm ls --prod --json').toString())

  // Only get dependencies for specific top-level modules, if specified.
  const dependencyTreeFiltered = moduleFilter.length
    ? {
        ...dependencyTree,
        dependencies: Object.keys(dependencyTree.dependencies)
          .filter((key) => moduleFilter.includes(key))
          .reduce((obj, key) => {
            // eslint-disable-next-line no-param-reassign
            obj[key] = dependencyTree.dependencies[key]
            return obj
          }, {}),
      }
    : dependencyTree

  const allChildDeps = []
  const getAllChildDependencies = (depTree) => {
    const nextDeps = depTree.dependencies
    if (!nextDeps || !Object.keys(nextDeps).length) {
      return []
    }
    Object.entries(nextDeps).forEach(([childDep, childDepTree]) => {
      allChildDeps.push(childDep)
      getAllChildDependencies(childDepTree)
    })
    return allChildDeps
  }
  return getAllChildDependencies(dependencyTreeFiltered)
}

// (assumed) export so the helper can be required from webpack.config.js
module.exports = getAllDependencies

Compiling Webpack in memory but resolving to node_modules on disk

I'm trying to use webpack to compile an in-memory string of valid javascript code. I'm using memory-fs as outlined here: https://webpack.github.io/docs/node.js-api.html#compile-to-memory.
So I'm taking a string containing raw javascript, writing it to memory-fs, and then webpack resolves that as the entry point. But the compilation fails on the first require statement, presumably because it's not able to look in the real fs for node_modules.
Any ideas on how I can accomplish this?
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';

function* compile(code) {
  const fs = new MemoryFS();
  fs.writeFileSync('/file.js', code);
  const compiler = webpack({
    entry: { file: '/file.js' },
    output: {
      path: '/build',
      filename: '[name].js'
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ],
    }
  });
  compiler.run = thenify(compiler.run);
  compiler.inputFileSystem = fs;
  compiler.resolvers.normal.fileSystem = fs; //this is needed for memfs
  compiler.outputFileSystem = fs;
  const stats = yield compiler.run();
  //retrieve the output of the compilation
  const res = stats.compilation.assets['file.js'].source();
  return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //should be a bundle containing the underscore source.
The error is
ModuleNotFoundError: Module not found: Error: Cannot resolve module 'underscore' in /
This question indicates that others have tried the same thing: https://github.com/webpack/webpack/issues/1562. There's a gist referenced at https://gist.github.com/DatenMetzgerX/2a96ebf287b4311f4c18 that I believe was intended to do what I'm hoping to accomplish, but in its current form I don't see how. It assigns an instance of MemoryFs to all of the resolvers. I've tried assigning node's fs module, but no dice.
So in short, I'm trying to set an entry point to an in memory string of raw javascript, but still have require and import statements resolved to node_modules on disk.
UPDATE
I've been able to get the result I'm looking for, but it's not pretty. I'm basically overriding the implementation of #stat and #readFile in MemoryFS to check the real filesystem if it gets any request for a file that doesn't exist in memory. I could clean this up a bit by subclassing MemoryFS instead of swapping method implementations at runtime, but the idea would still be the same.
Working solution
import webpack from 'webpack';
import JsonLoader from 'json-loader';
import MemoryFS from 'memory-fs';
import UglifyJS from "uglify-js";
import thenify from 'thenify';
import path from 'path';
import fs from 'fs';
import root from 'app-root-path';

/*
 * Provide webpack with an instance of MemoryFS for
 * in-memory compilation. We're currently overriding
 * #stat and #readFile. Webpack will ask MemoryFS for the
 * entry file, which it will find successfully. However,
 * all dependencies are on the real filesystem, so any require
 * or import statements will fail. When that happens, our wrapper
 * functions will then check fs for the requested file.
 */
const memFs = new MemoryFS();
const statOrig = memFs.stat.bind(memFs);
const readFileOrig = memFs.readFile.bind(memFs);
memFs.stat = function (_path, cb) {
  statOrig(_path, function(err, result) {
    if (err) {
      return fs.stat(_path, cb);
    } else {
      return cb(err, result);
    }
  });
};
memFs.readFile = function (path, cb) {
  readFileOrig(path, function (err, result) {
    if (err) {
      return fs.readFile(path, cb);
    } else {
      return cb(err, result);
    }
  });
};
export default function* compile(code) {
  // Setup webpack
  // create a directory structure in MemoryFS that matches
  // the real filesystem
  const rootDir = root.toString();
  // write the code snippet to memoryfs
  const outputName = `file.js`;
  const entry = path.join(rootDir, outputName);
  const rootExists = memFs.existsSync(rootDir);
  if (!rootExists) {
    memFs.mkdirpSync(rootDir);
  }
  memFs.writeFileSync(entry, code);

  // point webpack to memoryfs for the entry file
  const compiler = webpack({
    entry: entry,
    output: {
      filename: outputName
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ]
    }
  });
  compiler.run = thenify(compiler.run);

  // direct webpack to use memoryfs for file input
  compiler.inputFileSystem = memFs;
  compiler.resolvers.normal.fileSystem = memFs;
  // direct webpack to output to memoryfs rather than to disk
  compiler.outputFileSystem = memFs;
  const stats = yield compiler.run();

  // remove the entry from memory. we're done with it
  memFs.unlinkSync(entry);
  const errors = stats.compilation.errors;
  if (errors && errors.length > 0) {
    // if there are errors, throw the first one
    throw errors[0];
  }

  // retrieve the output of the compilation
  const res = stats.compilation.assets[outputName].source();
  return res;
}
Usage
var code = "var _ = require('underscore'); console.log(_);";
var bundle = yield compile(code); //is a valid js bundle containing the underscore source and a log statement logging _.
If there's not a better way, then I'll definitely encapsulate this into a subclass of MemoryFS, but I'm hoping there's a more sane way to accomplish this with Webpack's api.
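For reference, that subclass could look roughly like this (an untested sketch of the same fallback idea; FallbackMemoryFS is a hypothetical name):

// sketch: MemoryFS subclass that falls back to the real filesystem
import MemoryFS from 'memory-fs';
import fs from 'fs';

class FallbackMemoryFS extends MemoryFS {
  stat(path, cb) {
    super.stat(path, (err, result) => (err ? fs.stat(path, cb) : cb(null, result)));
  }
  readFile(path, cb) {
    super.readFile(path, (err, result) => (err ? fs.readFile(path, cb) : cb(null, result)));
  }
}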
Instead of memory-fs, the combination of unionfs/memfs/linkfs should help.
https://npmjs.com/unionfs
https://npmjs.com/memfs
https://npmjs.com/linkfs
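A minimal sketch of how they could slot into the code above (untested; the union lets webpack read the in-memory entry while require statements fall through to node_modules on the real disk):

// sketch: layer an in-memory volume over the real filesystem
import fs from 'fs';
import { Volume } from 'memfs';
import { ufs } from 'unionfs';

const volume = Volume.fromJSON({ '/file.js': code }); // `code` is the raw javascript string
ufs.use(fs).use(volume);
compiler.inputFileSystem = ufs; // hand the union to webpack as its input filesystem
compiler.resolvers.normal.fileSystem = ufs;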
I have created this snippet, untested. I think you want the input FS to be the real one and the output FS to be the in-memory one. On the other hand, you want all the dependencies of file.js to be built separately. For that I figured the webpack.optimize.CommonsChunkPlugin plugin could help. I expect webpack to write everything to memory. I hope it works.
import webpack from 'webpack';
import MemoryFS from 'memory-fs';
import thenify from 'thenify';
import realFS from 'fs';

function* compile(code) {
  const fs = new MemoryFS();
  const compiler = webpack({
    entry: {
      file: '/file.js',
      vendor: [
        'underscore',
        'other-package-name'
      ]
    },
    output: {
      path: '/build',
      filename: '[name].js'
    },
    module: {
      loaders: [
        { test: /\.json$/, loader: 'json' }
      ],
    },
    plugins: [
      new webpack.optimize.CommonsChunkPlugin('vendor', 'vendor.js')
    ]
  });
  compiler.run = thenify(compiler.run);
  compiler.inputFileSystem = realFS;
  compiler.resolvers.normal.fileSystem = fs; //this is needed for memfs
  compiler.outputFileSystem = fs;
  const stats = yield compiler.run();
  //retrieve the output of the compilation
  const res = stats.compilation.assets['file.js'].source();
  return res;
}
You're using MemoryFS, which is a JavaScript reimplementation of a feature normally handled by the Operating System. I wonder, could you mount a directory using tmpfs at the Operating System level, then use that? webpack would then not know or care that the input file is actually stored in memory.
Assuming that you have mounted a memory-based filesystem at /media/memory, the webpack configuration code could be as simple as this:
{
  resolve: {
    root: ['/media/memory', ...other paths...],
  },
  output: {
    path: '/wherever/you/want/the/output/files'
  }
}
This approach also has a hidden benefit: If you want to debug the input code, you just mount /media/memory with a non-RAM-based filesystem and you can see what's being generated.
I know it's late, but for the record, here is a code snippet.
import * as fs from 'fs';
import { resolve } from 'path';
import { Volume } from 'memfs';
import { ufs } from 'unionfs';

const volume = Volume.fromJSON({
  [resolve(process.cwd(), 'test.js')]: 'this file is on memory not on disk'
});
ufs.use(fs).use(volume);

// Reads from memory
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.js'), 'utf8'));
// Reads from disk
console.log(ufs.readFileSync(resolve(process.cwd(), 'package.json'), 'utf8'));

// Writing into memory
volume.writeFileSync(resolve(process.cwd(), 'test.memory'), 'This should be on memory');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.memory'), 'utf8'));

// Writing into disk
ufs.writeFileSync(resolve(process.cwd(), 'test.disk'), 'This should be on disk');
console.log(ufs.readFileSync(resolve(process.cwd(), 'test.disk'), 'utf8'));
Here's the console output:
user1@pc playground % node inMem.mjs
this file is on memory not on disk
{
  "name": "playground",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "dependencies": {
    "memfs": "^3.3.0",
    "unionfs": "^4.4.0"
  }
}
This should be on memory
This should be on disk
user1@pc playground % ls .
inMem.mjs  node_modules  package.json  yarn.lock
