Does Node.js provide an option like resolve.alias in webpack? - javascript

I use resolve.alias to eliminate long relative paths.
// webpack.config.js
module.exports = {
  // ...
  resolve: {
    alias: {
      services: __dirname + '/src/services',
      components: __dirname + '/src/components'
    }
  }
};

// componentFoo.js
import ServiceBar from 'services/serviceBar'
But when I try to run tests with AVA, Node cannot find the module 'services/serviceBar'.
My folder structure:
src
--components
----componentFoo.js
--services
----serviceBar.js
test
--index.js

Node.js does not have any built-in option for aliases, but you can use the Babel plugin babel-plugin-module-resolver to define aliases, which is convenient since AVA already uses Babel.
You need to add it to your babel plugins:
"plugins": [
["module-resolver", {
"alias": {
"services": "./src/services",
"components": "./src/components"
}
}]
]
The paths are relative to the Babel config file, unless you specify the cwd option (see the list of options). Another possibility is to use the root option instead of aliases, which is similar to webpack's resolve.modules:
"root": ["./src"]

Related

Public file is missing from build/public in Webpack [duplicate]

I'm trying to move from Gulp to Webpack. In Gulp I have a task that copies all files and folders from the /static/ folder to the /build/ folder. How do I do the same with Webpack? Do I need some plugin?
Requiring assets using the file-loader module is the way webpack is intended to be used (source). However, if you need greater flexibility or want a cleaner interface, you can also copy static files directly using my copy-webpack-plugin (npm, Github). For your static to build example:
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  context: path.join(__dirname, 'your-app'),
  plugins: [
    new CopyWebpackPlugin({
      patterns: [
        { from: 'static' }
      ]
    })
  ]
};
Compatibility note: If you're using an old version of webpack like webpack#4.x.x, use copy-webpack-plugin#6.x.x. Otherwise use latest.
You don't need to copy things around; webpack works differently than gulp. Webpack is a module bundler, and everything you reference in your files will be included. You just need to specify a loader for that.
So if you write:
var myImage = require("./static/myImage.jpg");
Webpack will first try to parse the referenced file as JavaScript (because that's the default). Of course, that will fail. That's why you need to specify a loader for that file type. The file-loader or url-loader, for instance, takes the referenced file, puts it into webpack's output folder (which should be build in your case) and returns the hashed URL for that file.
var myImage = require("./static/myImage.jpg");
console.log(myImage); // '/build/12as7f9asfasgasg.jpg'
Usually loaders are applied via the webpack config:
// webpack.config.js
module.exports = {
  ...
  module: {
    loaders: [
      { test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/, loader: "file" }
    ]
  }
};
Of course, you need to install the file-loader first (npm install --save-dev file-loader) to make this work.
If you want to copy your static files, you can use the file-loader in this way:
For HTML files:
In webpack.config.js:
module.exports = {
  ...
  module: {
    loaders: [
      {
        test: /\.(html)$/,
        loader: "file?name=[path][name].[ext]&context=./app/static"
      }
    ]
  }
};
In your JS file:
require.context("./static/", true, /^\.\/.*\.html/);
./static/ is relative to the location of your JS file.
You can do the same with images or whatever else.
The context is a powerful method to explore!
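If you also need to work with the matched files from JavaScript, note that require.context returns a function: .keys() lists the matched paths, and calling the function with one of those keys runs the file through the configured loader and returns its export (with file-loader, that is the emitted output path). A minimal sketch, reusing the HTML example above:
const htmlContext = require.context("./static/", true, /^\.\/.*\.html/);

htmlContext.keys().forEach(function (key) {
  // Each call emits the file via the configured loader and returns its output path
  const emittedPath = htmlContext(key);
  console.log(key, '->', emittedPath);
});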
One advantage of the aforementioned copy-webpack-plugin that hasn't been explained yet is that all the other methods mentioned here still bundle the resources into your bundle files (and require you to require or import them somewhere). If I just want to move some images around or some template partials, I don't want to clutter up my JavaScript bundle with useless references to them; I just want the files emitted in the right place. I haven't found any other way to do this in webpack. Admittedly it's not what webpack was originally designed for, but it's definitely a current use case.
(@BreakDS I hope this answers your question - it's only a benefit if you want it)
Webpack 5 adds Asset Modules which are essentially replacements for common file loaders. I've copied a relevant portion of the documentation below:
asset/resource emits a separate file and exports the URL. Previously achievable by using file-loader.
asset/inline exports a data URI of the asset. Previously achievable by using url-loader.
asset/source exports the source code of the asset. Previously achievable by using raw-loader.
asset automatically chooses between exporting a data URI and emitting a separate file. Previously achievable by using url-loader with asset size limit.
To add one in, you can make your config look like this:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource"
      }
    ]
  }
};
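With the asset/resource rule in place, importing a matching file gives you its emitted URL, much like file-loader did. A small usage sketch (the file names are just examples):
// header.js (example)
import logoUrl from './images/logo.png';

const img = document.createElement('img');
img.src = logoUrl; // e.g. '151cfcfa1bd74779aadb.png', depending on your output settings
document.body.appendChild(img);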
To control how the files get output, you can use templated paths.
In the config you can set the global template here:
// webpack.config.js
module.exports = {
  ...
  output: {
    ...
    assetModuleFilename: '[path][name].[hash][ext][query]'
  }
}
To override for a specific set of assets, you can do this:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource",
        generator: {
          filename: '[path][name].[hash][ext][query]'
        }
      }
    ]
  }
};
The provided templating will result in filenames that look like build/images/img.151cfcfa1bd74779aadb.png. The hash can be useful for cache busting, etc. You should modify it to your needs.
The above suggestions are good. But to answer your question directly, I'd suggest using cpy-cli in a script defined in your package.json.
This example expects node to be somewhere on your path. Install cpy-cli as a development dependency:
npm install --save-dev cpy-cli
Then create a couple of Node.js files: one to do the copy and the other to display a check mark and a message. (They also use shelljs and chalk, so install those too.)
copy.js
#!/usr/bin/env node
var shelljs = require('shelljs');
var addCheckMark = require('./helpers/checkmark');
var path = require('path');
var cpy = path.join(__dirname, '../node_modules/cpy-cli/cli.js');
shelljs.exec(cpy + ' /static/* /build/', addCheckMark.bind(null, callback));
function callback() {
process.stdout.write(' Copied /static/* to the /build/ directory\n\n');
}
checkmark.js
var chalk = require('chalk');

/**
 * Adds a check mark symbol
 */
function addCheckMark(callback) {
  process.stdout.write(chalk.green(' ✓'));
  callback();
}

module.exports = addCheckMark;
Add the script to package.json, assuming the scripts are in <project-root>/scripts/:
...
"scripts": {
"copy": "node scripts/copy.js",
...
To run the script:
npm run copy
The way I load static images and fonts:
module: {
  rules: [
    ....
    {
      test: /\.(jpe?g|png|gif|svg)$/i,
      /* Exclude fonts while working with images, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/fonts'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'images/'
        }
      }]
    },
    {
      test: /\.(woff(2)?|ttf|eot|svg|otf)(\?v=\d+\.\d+\.\d+)?$/,
      /* Exclude images while working with fonts, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/images'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'fonts/'
        }
      }]
    }
  ]
}
Don't forget to install file-loader to have that working.
You can write bash in your package.json:
# package.json
{
  "name": ...,
  "version": ...,
  "scripts": {
    "build": "NODE_ENV=production npm run webpack && cp -v <this> <that> && echo ok",
    ...
  }
}
Most likely you should use CopyWebpackPlugin, which was mentioned in kevlened's answer. Alternatively, for some kinds of files like .html or .json, you can also use raw-loader or json-loader. Install it via npm install -D raw-loader, and then all you need to do is add another loader to your webpack.config.js file.
Like:
{
  test: /\.html/,
  loader: 'raw'
}
Note: Restart the webpack-dev-server for any config changes to take effect.
And now you can require HTML files using relative paths; this makes it much easier to move folders around.
template: require('./nav.html')
I was stuck here too. copy-webpack-plugin worked for me.
However, copy-webpack-plugin was not necessary in my case (I learned later).
webpack ignores root-relative paths, for example:
<img src="/images/logo.png">
Hence, to make this work without copy-webpack-plugin, use '~' in paths:
<img src="~images/logo.png">
'~' tells webpack to resolve 'images' as a module.
Note: you might have to add the parent directory of the images directory to
resolve: {
  modules: [
    'parent-directory of images',
    'node_modules'
  ]
}
Visit https://vuejs-templates.github.io/webpack/static.html
The webpack config file (in webpack 2) allows you to export a promise chain, so long as the last step returns a webpack config object. See promise configuration docs. From there:
webpack now supports returning a Promise from the configuration file. This allows you to do async processing in your configuration file.
You could create a simple recursive copy function that copies your files, and only trigger webpack after that. E.g.:
module.exports = function () {
  return copyTheFiles(inpath, outpath).then(result => {
    return { entry: "..." } // etc.
  });
};
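copyTheFiles above is just a placeholder. As one possible sketch (assuming Node 16.7+, where fs.promises.cp supports recursive copies), it could look like this:
// webpack.config.js
const path = require('path');
const fs = require('fs');

function copyTheFiles(inpath, outpath) {
  // Recursively copy the static folder before webpack starts bundling
  return fs.promises.cp(inpath, outpath, { recursive: true });
}

module.exports = function () {
  return copyTheFiles(path.resolve(__dirname, 'static'), path.resolve(__dirname, 'build'))
    .then(() => ({
      entry: './index.js' // etc.
    }));
};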
Let's say all your static assets are in a folder "static" at the root level and you want to copy them to the build folder, maintaining the subfolder structure. Then, in your entry file, just put:
//index.js or index.jsx
require.context("!!file?name=[path][name].[ext]&context=./static!../static/", true, /^\.\/.*\.*/);
In my case I used webpack for a WordPress plugin to compress JS files, where some plugin files are already compressed and need to be skipped from the process.
// webpack.config.js (relevant parts)
const path = require('path');
const glob = require('glob');

module.exports = {
  optimization: {
    minimize: false,
  },
  externals: {
    "jquery": "jQuery",
  },
  // One entry per file in ./js/plugin, keyed by its file name
  entry: glob.sync('./js/plugin/**.js').reduce(function (obj, el) {
    obj[path.parse(el).name] = el;
    return obj;
  }, {}),
  output: {
    path: path.resolve(__dirname, './js/dist/plugin'),
    filename: "[name].js",
    clean: true,
  },
};
This copies the JS files as-is to the build folder; other methods like file-loader and copy-webpack-plugin created issues with that setup.
Hope it will help someone.

How to use Jest with node_modules using ES6 within an NX project

I have a project with an NX structure (apps + libs), and I am writing tests for a React + TypeScript lib. I face this issue when I try to use suneditor + suneditor-react:
Jest encountered an unexpected token
Jest failed to parse a file. This happens e.g. when your code or its dependencies use non-standard JavaScript syntax, or when Jest is not configured to support such syntax.
Out of the box Jest supports Babel, which will be used to transform your files into valid JS based on your Babel configuration.
By default "node_modules" folder is ignored by transformers.
Here's what you can do:
• If you are trying to use ECMAScript Modules, see https://jestjs.io/docs/ecmascript-modules for how to enable it.
• To have some of your "node_modules" files transformed, you can specify a custom "transformIgnorePatterns" in your config.
• If you need a custom transformation specify a "transform" option in your config.
• If you simply want to mock your non-JS modules (e.g. binary assets) you can stub them out with the "moduleNameMapper" config option.
You'll find more details and examples of these config options in the docs:
https://jestjs.io/docs/configuration
For information about custom transformations, see:
https://jestjs.io/docs/code-transformation
Details:
...\node_modules\suneditor\src\plugins\index.js:4
import blockquote from './command/blockquote';
I found the files are not transformed when they are in node_modules and made this change to jest config:
transformIgnorePatterns: [
"<rootDir>/node_modules/(?!suneditor|suneditor-react)"
]
After this I am continuously getting the error for all tests:
● Test suite failed to run
Cannot find module 'babel-preset-es2015'
at resolveStandardizedName (../../node_modules/@babel/core/lib/config/files/plugins.js:100:7)
at resolvePreset (../../node_modules/@babel/core/lib/config/files/plugins.js:48:10)
at loadPreset (../../node_modules/@babel/core/lib/config/files/plugins.js:67:20)
at createDescriptor (../../node_modules/@babel/core/lib/config/config-descriptors.js:154:9)
at ../../node_modules/@babel/core/lib/config/config-descriptors.js:109:50
at Array.map (<anonymous>)
at createDescriptors (../../node_modules/@babel/core/lib/config/config-descriptors.js:109:29)
at createPresetDescriptors (../../node_modules/@babel/core/lib/config/config-descriptors.js:101:10)
Jest config:
module.exports = {
  displayName: 'component',
  preset: '../../jest.preset.js',
  transform: {
    '^.+\\.[tj]sx?$': 'babel-jest',
  },
  moduleFileExtensions: ['ts', 'tsx', 'js', 'jsx'],
  coverageDirectory: '../../coverage/libs/pbc-client-journal',
  transformIgnorePatterns: [
    "<rootDir>/node_modules/(?!suneditor|suneditor-react)"
  ]
};
And jest.preset.js:
const nxPreset = require('@nrwl/jest/preset');
module.exports = { ...nxPreset };
.babelrc:
{
  "presets": [
    [
      "@nrwl/react/babel",
      {
        "runtime": "automatic",
        "useBuiltIns": "usage"
      }
    ]
  ]
}
Could someone please help me with this issue?
You need to switch from .babelrc to babel.config.js in your lib to be able to transform files from node_modules: a .babelrc only applies to files inside its own package, while babel.config.js is a project-wide config that Babel also applies to transformed dependencies. Here is what worked in my project:
babel.config.js
module.exports = {
  presets: [
    [
      "@nrwl/react/babel",
      {
        runtime: "automatic",
        useBuiltIns: "usage"
      }
    ]
  ],
  plugins: []
}
jest.config.js
module.exports = {
  displayName: 'mylib',
  preset: '../../jest.preset.js',
  transform: {
    '^.+\\.[tj]sx?$': ['babel-jest', { cwd: __dirname }],
  },
  transformIgnorePatterns: [
    "node_modules/(?!(@asyncapi)/)"
  ],
  moduleFileExtensions: ['ts', 'tsx', 'js', 'jsx'],
  coverageDirectory: '../../coverage/libs/mylib',
};

Use an ES6 library's ES5 build without compiling it in my own project with Rollup

I need to use an ES6 library (Luxon) and want to compile down the files to ES5, but Rollup adds the files as ES6.
The library has a special /build folder with different output formats.
How can I configure Rollup to make use of that build folder instead?
First of all, you have two options here:
Either compile the library in your project with Rollup and
@rollup/plugin-babel
Or reference the /build directory of the package instead of its ES6
version, using an alias with @rollup/plugin-alias
I'll go with the second approach because it is the one you asked for (a sketch of the first approach follows the config below):
Install the plugin: npm i @rollup/plugin-alias
In rollup.config.js import it: import alias from '@rollup/plugin-alias';
Finally, add it to plugins:
const path = require('path');

module.exports = {
  input: 'src/index.js',
  output: {
    dir: 'output',
    format: 'cjs'
  },
  plugins: [
    alias({
      entries: [
        { find: 'luxon', replacement: path.resolve(process.cwd(), 'node_modules/luxon/build') }
      ]
    })
  ]
};
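For completeness, a rough sketch of the first approach (compiling luxon yourself) could look like the following. The include pattern, the Babel preset and the use of @rollup/plugin-node-resolve (so Rollup can locate the package in node_modules) are assumptions to adapt to your setup:
// rollup.config.js
const { babel } = require('@rollup/plugin-babel');
const { nodeResolve } = require('@rollup/plugin-node-resolve');

module.exports = {
  input: 'src/index.js',
  output: {
    dir: 'output',
    format: 'cjs'
  },
  plugins: [
    nodeResolve(),
    babel({
      babelHelpers: 'bundled',
      // node_modules is excluded by default, so opt luxon in explicitly
      include: ['src/**', 'node_modules/luxon/**'],
      presets: [['@babel/preset-env', { targets: 'defaults' }]]
    })
  ]
};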

Node module cross build targets for deployment in npm

I have my ES6 module that I want to build to target different environments. I don't know how to do this or if I should use webpack or rollup.
Build targets
ES6 environments like Vue.
CommonJS, like a Node backend or some webpack builds.
The browser, via a script tag.
Project directory structure
src
--Dog.js
index.js
package.json
Project Files
Dog.js
class Dog{
//Typical Dog stuff
}
export default Dog;
index.js
import Dog from "./src/Dog";
export {
Dog
}
webpack.config.js
module.exports = {
target: 'node',
mode: 'production',
};
package.json (relevant parts)
"files": [
"dist",
"src"
]
Is there any way to automate this process, or should I just write a new library for each target manually? And if there is, how do the projects that import my module know which build is right for their environment?
Just use different webpack configs for different environments and targets. It is a common approach.
This is what I think works with a minimal webpack configuration. The dist folder and its contents are created by webpack.
Project directory structure
src
--Dog.js
dist
--dog.common.js
--dog.lib.min.js
index.js
package.json
webpack.config.js
webpack.config.js
const path = require("path");

var commonJsConfig = {
  target: 'node',
  mode: 'production',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'dog.common.js',
    libraryTarget: 'commonjs'
  }
};

var browserConfig = {
  target: 'web',
  mode: 'production',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'dog.lib.min.js'
  }
};

module.exports = [commonJsConfig, browserConfig];
package.json (relevant parts)
main tells backend and legacy webpack projects to use the CommonJS format.
module is for the ES6 format.
unpkg is for the browser.
files tells npm which files to publish.
(A short consumer example follows the snippet below.)
"main": "dist/dog.common.js",
"module": "index.js",
"unpkg": "dist/dog.lib.min.js",
"files": [
"dist",
"src"
],
"scripts": {
"build": "webpack"
},
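Consumers then pick the right build via those fields. A rough illustration, assuming the package is published under the name dog and each build re-exports Dog:
// server.js in a Node backend – npm resolves "main" (dist/dog.common.js)
const { Dog } = require('dog');

// app.js bundled with webpack or Rollup – the bundler prefers "module" (index.js, ES6 source)
import { Dog } from 'dog';
In the browser, <script src="https://unpkg.com/dog"></script> serves the file referenced by "unpkg".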

Typescript can't find modules which are imported with webpack alias

I am currently setting up my project to be a bit cleaner, especially in the frontend part with references.
In hindsight I noticed that I was very generous with the folder structure for my frontend files and ended up with lots of layers. That's why I decided to look into what webpack can do for this case, and found out about the alias functionality.
This is how I set it up:
resolve: {
  alias: {
    components: path.resolve(__dirname, "Scripts/Views/Components"),
    data: path.resolve(__dirname, "Scripts/Data"),
    definitions: path.resolve(__dirname, "Scripts/Definitions"),
    helper: path.resolve(__dirname, "Scripts/Helper"),
    scripts: path.resolve(__dirname, "Scripts"),
    views: path.resolve(__dirname, "Scripts/Views"),
  },
  extensions: [".tsx", ".ts", ".js", ".jsx"],
  modules: ["node_modules"]
}
As you can see, I created aliases for various folders here.
This is my folder structure:
Now, let's hop into e.g. the LoginDialog.tsx. Here I am trying to import like this:
import { IErrorAttachedProperty } from "definitions/formHelper";
However, all I end up with here is an error that no module could be found this way.
What am I doing wrong here?
If it is of any significance - The webpack.config.js resides in the same directory as the Scripts folder.
You have to configure tsconfig.json for TypeScript:
"baseUrl": "./",
"paths": {
"components/*": [
"./src(or any other path)/Scripts/Views/Components"
]
},
Here is a nice example: ts alias.
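For the alias list in the question, a matching tsconfig.json block might look roughly like this (a sketch; the paths are resolved relative to baseUrl, so adjust them to wherever your tsconfig.json and the Scripts folder actually live):
{
  "compilerOptions": {
    "baseUrl": "./",
    "paths": {
      "components/*": ["Scripts/Views/Components/*"],
      "data/*": ["Scripts/Data/*"],
      "definitions/*": ["Scripts/Definitions/*"],
      "helper/*": ["Scripts/Helper/*"],
      "scripts/*": ["Scripts/*"],
      "views/*": ["Scripts/Views/*"]
    }
  }
}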
Ok, so to avoid confusion for others I'm posting my solution/findings:
Yes, you can just use tsconfig.json without needing resolve/alias in webpack. You only need to set it up once on the TypeScript side.
EDIT: Nope, it turns out you do need the resolve/alias section in webpack.config.js. TypeScript will be happy without it, but then you will get webpack errors when it builds. Do both to make it work.
TIP: Make sure the paths you provide in the paths section of tsconfig.json are relative to the baseUrl entry point. Don't make them relative to the tsconfig.json file; baseUrl acts like the project root for the non-relative module imports defined with paths.
From the TypeScript docs, non-relative module names (import * from "package-a") are resolved relative to baseUrl ~ https://www.typescriptlang.org/docs/handbook/module-resolution.html#base-url
All module imports with non-relative names are assumed to be relative to the baseUrl.
Relative imports (import * from "./packages") are resolved relative to the current file, as stated:
Note that relative module imports are not impacted by setting the baseUrl, as they are always resolved relative to their importing files.
So if you have:
./packages
./package-a
./package-b
./index.ts
./tsconfig.json
Your tsconfig.json would look like:
{
  "compilerOptions": {
    "baseUrl": "./packages",
    "paths": {
      "package-a/*": ["./package-a/*"]
    }
  },
  "include": [
    "./packages/**/*"
  ]
}
Then your webpack.config.js would look like:
{
  resolve: {
    alias: {
      'package-a': path.resolve(__dirname, 'packages/package-a/')
    }
  }
}
Then you can import from index.ts like this:
import { pkgAThing } from 'package-a';
// or
import { otherPkgAThing } from 'package-a/dir/dir';
Which is an alternative to the relative style:
import { pkgAThing } from './packages/package-a';
