I'm still confused how to resolve module paths with webpack. Now I write:
myfile = require('../../mydir/myfile.js')
but I'd like to write
myfile = require('mydir/myfile.js')
I was thinking that resolve.alias might help, since I see a similar example using { xyz: "/some/dir" } as an alias so that I can then require("xyz/file.js").
But if I set my alias to { mydir: '/absolute/path/mydir' }, require('mydir/myfile.js') won't work.
I feel dumb because I've read the doc many times and I feel I'm missing something.
What is the right way to avoid writing all the relative requires with ../../ etc?
Webpack >2.0
See wtk's answer.
Webpack 1.0
A more straightforward way to do this would be to use resolve.root.
http://webpack.github.io/docs/configuration.html#resolve-root
resolve.root
The directory (absolute path) that contains your modules. May also be an array of directories. This setting should be used to add individual directories to the search path.
In your case:
webpack config
var path = require('path');
// ...
resolve: {
root: path.resolve('./mydir'),
extensions: ['', '.js']
}
consuming module
require('myfile')
or
require('myfile.js')
see also: http://webpack.github.io/docs/configuration.html#resolve-modulesdirectories
For future reference, webpack 2 removed everything but modules as a way to resolve paths. This means root will not work.
https://gist.github.com/sokra/27b24881210b56bbaff7#resolving-options
The example configuration starts with:
{
modules: [path.resolve(__dirname, "app"), "node_modules"]
// (was split into `root`, `modulesDirectories` and `fallback` in the old options)
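For the original question, a minimal webpack 2+ sketch along those lines would be (the directory name mydir is taken from the question; adjust to your layout):
// webpack.config.js (webpack 2+) -- a minimal sketch, not the full gist example
const path = require('path');
module.exports = {
  resolve: {
    // search the project root before node_modules, so that
    // require('mydir/myfile.js') resolves to <project root>/mydir/myfile.js
    modules: [path.resolve(__dirname), 'node_modules']
  }
};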
resolve.alias should work exactly the way you described, so I'm providing this as an answer to help clear up any confusion caused by the original question's suggestion that it does not work.
A resolve configuration like the one below will give you the desired results:
// used to resolve the absolute path to the project's root directory (where webpack.config.js should be located)
var path = require( 'path' );
...
{
...
resolve: {
// add alias for application code directory
alias:{
mydir: path.resolve( __dirname, 'path', 'to', 'mydir' )
},
extensions: [ '', '.js' ]
}
}
require( 'mydir/myfile.js' ) will work as expected. If it does not, there must be some other issue.
If you have multiple modules that you want to add to the search path, resolve.root makes sense, but if you just want to be able to reference components within your application code without relative paths, alias seems to be the most straight-forward and explicit.
An important advantage of alias is that it gives you the opportunity to namespace your requires which can add clarity to your code; just like it is easy to see from other requires what module is being referenced, alias allows you to write descriptive requires that make it obvious you're requiring internal modules, e.g. require( 'my-project/component' ). resolve.root just plops you into the desired directory without giving you the opportunity to namespace it further.
In case anyone else runs into this problem, I was able to get it working like this:
var path = require('path');
// ...
resolve: {
root: [path.resolve(__dirname, 'src'), path.resolve(__dirname, 'node_modules')],
extensions: ['', '.js']
}
where my directory structure is:
.
├── dist
├── node_modules
├── package.json
├── README.md
├── src
│ ├── components
│ ├── index.html
│ ├── main.js
│ └── styles
├── webpack.config.js
Then from anywhere in the src directory I can call:
import MyComponent from 'components/MyComponent';
I resolved it with Webpack 2 like this:
module.exports = {
resolve: {
modules: ["mydir", "node_modules"]
}
}
You can add more directories to the array...
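One caveat: a bare string such as "mydir" is scanned the way node_modules is (the current directory and its ancestors), while an absolute path is searched only in that one place, which is usually what you want for a project directory. A sketch with an absolute path:
const path = require('path');
module.exports = {
  resolve: {
    // 'mydir' here is just the directory name from the question
    modules: [path.resolve(__dirname, 'mydir'), 'node_modules']
  }
};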
My biggest headache was working without a namespaced path. Something like this:
./src/app.js
./src/ui/menu.js
./node_modules/lodash/
Before I used to set my environment to do this:
require('app.js')
require('ui/menu')
require('lodash')
I found it far more convenient to avoid an implicit src path, which hides important context information.
My aim is to require like this:
require('src/app.js')
require('src/ui/menu')
require('test/helpers/auth')
require('lodash')
As you see, all my app code lives within a mandatory path namespace. This makes quite clear which require call takes a library, app code or a test file.
For this, I make sure that my resolve paths are just node_modules and the current app folder (unless you namespace your app inside your source folder, like src/my_app).
This is my default with webpack
resolve: {
extensions: ['', '.jsx', '.js', '.json'],
root: path.resolve(__dirname),
modulesDirectories: ['node_modules']
}
It is even better if you set the environment variable NODE_PATH to your current project folder. This is a more universal solution and it will help if you want to use other tools without webpack: testing, linting, etc.
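For example, NODE_PATH can be set per script in package.json so other tools pick it up as well (a sketch; the script commands are placeholders, and on Windows you would need something like cross-env to set the variable):
{
  "scripts": {
    "start": "NODE_PATH=./ node src/main.js",
    "test": "NODE_PATH=./ mocha"
  }
}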
If you're using create-react-app, you can simply add a .env file containing
NODE_PATH=src/
Source: https://medium.com/@ktruong008/absolute-imports-with-create-react-app-4338fbca7e3d
Got this solved using Webpack 2:
resolve: {
extensions: ['', '.js'],
modules: [__dirname, 'node_modules']
}
This thread is old but since no one posted about require.context I'm going to mention it:
You can use require.context to set the folder to look through like this:
var req = require.context('../../mydir/', true)
// true here means include subdirectories; you can also specify a regex as the third param
return req('./myfile.js')
Simply use babel-plugin-module-resolver:
$ npm i babel-plugin-module-resolver --save-dev
Then create a .babelrc file under root if you don't have one already:
{
"plugins": [
[
"module-resolver",
{
"root": ["./"]
}
]
]
}
And everything under root will be treated as absolute import:
import { Layout } from 'components'
For VSCode/Eslint support, see here.
I don't get why nobody suggested including mydir's parent directory in webpack's modulesDirectories; that does the trick easily:
resolve: {
modulesDirectories: [
'parentDir',
'node_modules',
],
extensions: ['', '.js', '.jsx']
},
Related
I'm trying to move from Gulp to Webpack. In Gulp I have a task which copies all files and folders from the /static/ folder to the /build/ folder. How do I do the same with Webpack? Do I need some plugin?
Requiring assets using the file-loader module is the way webpack is intended to be used (source). However, if you need greater flexibility or want a cleaner interface, you can also copy static files directly using my copy-webpack-plugin (npm, Github). For your static to build example:
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');
module.exports = {
context: path.join(__dirname, 'your-app'),
plugins: [
new CopyWebpackPlugin({
patterns: [
{ from: 'static' }
]
})
]
};
Compatibility note: If you're using an old version of webpack like webpack@4.x.x, use copy-webpack-plugin@6.x.x. Otherwise use the latest.
You don't need to copy things around; webpack works differently than gulp. Webpack is a module bundler and everything you reference in your files will be included. You just need to specify a loader for that.
So if you write:
var myImage = require("./static/myImage.jpg");
Webpack will first try to parse the referenced file as JavaScript (because that's the default). Of course, that will fail. That's why you need to specify a loader for that file type. The file-loader or url-loader, for instance, takes the referenced file, puts it into webpack's output folder (which should be build in your case) and returns the hashed url for that file.
var myImage = require("./static/myImage.jpg");
console.log(myImage); // '/build/12as7f9asfasgasg.jpg'
Usually loaders are applied via the webpack config:
// webpack.config.js
module.exports = {
...
module: {
loaders: [
{ test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/, loader: "file" }
]
}
};
Of course you need to install the file-loader first to make this work.
If you want to copy your static files you can use the file-loader in this way:
For html files:
In webpack.config.js:
module.exports = {
...
module: {
loaders: [
{ test: /\.(html)$/,
loader: "file?name=[path][name].[ext]&context=./app/static"
}
]
}
};
in your js file :
require.context("./static/", true, /^\.\/.*\.html/);
./static/ is relative to where your js file is.
You can do the same with images or whatever.
require.context is a powerful method worth exploring!
One advantage that the aforementioned copy-webpack-plugin brings that hasn't been explained before is that all the other methods mentioned here still bundle the resources into your bundle files (and require you to "require" or "import" them somewhere). If I just want to move some images around or some template partials, I don't want to clutter up my javascript bundle file with useless references to them, I just want the files emitted in the right place. I haven't found any other way to do this in webpack. Admittedly it's not what webpack originally was designed for, but it's definitely a current use case.
(#BreakDS I hope this answers your question - it's only a benefit if you want it)
Webpack 5 adds Asset Modules which are essentially replacements for common file loaders. I've copied a relevant portion of the documentation below:
asset/resource emits a separate file and exports the URL. Previously achievable by using file-loader.
asset/inline exports a data URI of the asset. Previously achievable by using url-loader.
asset/source exports the source code of the asset. Previously achievable by using raw-loader.
asset automatically chooses between exporting a data URI and emitting a separate file. Previously achievable by using url-loader with asset size limit.
To add one in you can make your config look like so:
// webpack.config.js
module.exports = {
...
module: {
rules: [
{
test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
type: "asset/resource"
}
]
}
};
To control how the files get output, you can use templated paths.
In the config you can set the global template here:
// webpack.config.js
module.exports = {
...
output: {
...
assetModuleFilename: '[path][name].[hash][ext][query]'
}
}
To override for a specific set of assets, you can do this:
// webpack.config.js
module.exports = {
...
module: {
rules: [
{
test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
type: "asset/resource"
generator: {
filename: '[path][name].[hash][ext][query]'
}
}
]
}
};
The provided templating will result in filenames that look like build/images/img.151cfcfa1bd74779aadb.png. The hash can be useful for cache busting etc. You should modify to your needs.
The above suggestions are good. But to answer your question directly, I'd suggest using cpy-cli in a script defined in your package.json.
This example expects node to be somewhere on your path. Install cpy-cli as a development dependency:
npm install --save-dev cpy-cli
Then create a couple of Node.js files: one to do the copy and the other to display a checkmark and a message (they also use shelljs and chalk, so install those as dev dependencies too).
copy.js
#!/usr/bin/env node
var shelljs = require('shelljs');
var addCheckMark = require('./helpers/checkmark');
var path = require('path');
var cpy = path.join(__dirname, '../node_modules/cpy-cli/cli.js');
shelljs.exec(cpy + ' /static/* /build/', addCheckMark.bind(null, callback));
function callback() {
process.stdout.write(' Copied /static/* to the /build/ directory\n\n');
}
checkmark.js
var chalk = require('chalk');
/**
* Adds mark check symbol
*/
function addCheckMark(callback) {
process.stdout.write(chalk.green(' ✓'));
callback();
}
module.exports = addCheckMark;
Add the script in package.json. Assuming scripts are in <project-root>/scripts/
...
"scripts": {
"copy": "node scripts/copy.js",
...
To run the script:
npm run copy
The way I load static images and fonts:
module: {
rules: [
....
{
test: /\.(jpe?g|png|gif|svg)$/i,
/* Exclude fonts while working with images, e.g. .svg can be both image or font. */
exclude: path.resolve(__dirname, '../src/assets/fonts'),
use: [{
loader: 'file-loader',
options: {
name: '[name].[ext]',
outputPath: 'images/'
}
}]
},
{
test: /\.(woff(2)?|ttf|eot|svg|otf)(\?v=\d+\.\d+\.\d+)?$/,
/* Exclude images while working with fonts, e.g. .svg can be both image or font. */
exclude: path.resolve(__dirname, '../src/assets/images'),
use: [{
loader: 'file-loader',
options: {
name: '[name].[ext]',
outputPath: 'fonts/'
},
}
]
}
Don't forget to install file-loader to have that working.
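For example:
npm install --save-dev file-loader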
You can write bash in your package.json:
# package.json
{
"name": ...,
"version": ...,
"scripts": {
"build": "NODE_ENV=production npm run webpack && cp -v <this> <that> && echo ok",
...
}
}
Most likely you should use CopyWebpackPlugin, which was mentioned in kevlened's answer. Alternatively, for some kinds of files like .html or .json, you can also use raw-loader or json-loader. Install it via npm install -D raw-loader and then all you need to do is add another loader to your webpack.config.js file.
Like:
{
test: /\.html/,
loader: 'raw'
}
Note: Restart the webpack-dev-server for any config changes to take effect.
And now you can require html files using relative paths, this makes it much easier to move folders around.
template: require('./nav.html')
I was stuck here too. copy-webpack-plugin worked for me.
However, 'copy-webpack-plugin' was not necessary in my case (I learned later).
webpack ignores root paths
Example:
<img src="/images/logo.png">
Hence, to make this work without using 'copy-webpack-plugin', use '~' in paths:
<img src="~images/logo.png">
'~' tells webpack to consider 'images' as a module
Note: you might have to add the parent directory of the images directory in
resolve: {
modules: [
'parent-directory of images',
'node_modules'
]
}
Visit https://vuejs-templates.github.io/webpack/static.html
The webpack config file (in webpack 2) allows you to export a promise chain, so long as the last step returns a webpack config object. See promise configuration docs. From there:
webpack now supports returning a Promise from the configuration file. This allows you to do async processing in your configuration file.
You could create a simple recursive copy function that copies your files, and only after that triggers webpack. E.g.:
module.exports = function(){
return copyTheFiles( inpath, outpath).then( result => {
return { entry: "..." } // Etc etc
} )
}
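copyTheFiles is not a real API; here is one possible minimal implementation (a sketch that assumes fs-extra as an extra dependency and placeholder static/build paths):
// webpack.config.js
const path = require('path');
const fse = require('fs-extra');
module.exports = function () {
  // copy the static files first, then resolve to the actual webpack config
  return fse
    .copy(path.resolve(__dirname, 'static'), path.resolve(__dirname, 'build'))
    .then(() => ({
      entry: './src/index.js',
      output: { path: path.resolve(__dirname, 'build'), filename: 'bundle.js' }
    }));
};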
Let's say all your static assets are in a folder "static" at the root level and you want to copy them to the build folder, maintaining the structure of subfolders. Then in your entry file just put:
//index.js or index.jsx
require.context("!!file?name=[path][name].[ext]&context=./static!../static/", true, /^\.\/.*\.*/);
In my case I used webpack for a WordPress plugin to compress js files, where the plugin files are already compressed and need to be skipped in the process.
const path = require('path');
const glob = require('glob');

module.exports = {
optimization: {
minimize: false,
},
externals: {
"jquery": "jQuery",
},
entry: glob.sync('./js/plugin/**.js').reduce(function (obj, el) {
obj[path.parse(el).name] = el;
return obj
}, {}),
output: {
path: path.resolve(__dirname, './js/dist/plugin'),
filename: "[name].js",
clean: true,
},
};
That copies the js files as they are to the build folder. Using other methods like file-loader or copy-webpack-plugin created issues with that.
Hope it will help someone.
OK, I have searched high and low but cannot reliably determine whether this is or is not possible with webpack.
https://github.com/webpack/webpack/tree/master/examples/require.context
Appears to indicate that one can pass a string to a function and have it load a module...
But my attempt is just not working:
webpack.config.js
'use strict';
let webpack = require('webpack'),
jsonLoader = require("json-loader"),
path = require("path"),
fs = require('fs'),
nodeModules = {};
fs.readdirSync('node_modules')
.filter(function(x) {
return ['.bin'].indexOf(x) === -1;
})
.forEach(function(mod) {
nodeModules[mod] = 'commonjs ' + mod;
});
let PATHS = {
app: __dirname + '/src'
};
module.exports = {
context: PATHS.app,
entry: {
app: PATHS.app+'/server.js'
},
target: 'node',
output: {
path: PATHS.app,
filename: '../build/server.js'
},
externals: nodeModules,
performance: {
hints: "warning"
},
plugins: [
jsonLoader
],
resolve: {
modules: [
'./node_modules',
path.resolve(__dirname),
path.resolve(__dirname + "/src"),
path.resolve('./config')
]
},
node: {
fs: "empty"
}
};
The server.js
let _ = require('lodash');
let modules = [ "modules/test" ];
require( 'modules/test' )();
_.map( modules, function( module ){
require( module );
});
The module in modules/ named test.js
module.exports = () => {
console.log('hello world');
};
But the result is always the same: the pm2 logs just say hello world for the static require, but for the dynamic load of the same module I get:
Error: Cannot find module "."
All I want to be able to do is loop through an array of paths to modules and load them...
You cannot use a variable as an argument to require. Webpack needs to know what files to bundle at compile time. As it does no program flow analysis, it can't know what you pass to the function. In that case it might be obvious, but this could go as far as using user input to decide what module to require, and there is no way webpack can possibly know which modules to include at compile time, so webpack does not allow it.
The example you posted is a bit different. You could use require with a concatenated string. For example:
require(`./src/${moduleName}/test`);
Which modules does webpack need to include in the bundle? The variable moduleName could be anything, so the exact module is not known at compile time. Instead it includes all modules that could possibly match the above expression. Assuming the following directory structure:
src
├─ one
│ └─ test.js
├─ two
│ ├─ subdir
│ │ └─ test.js
│ └─ test.js
└─ three
└─ test.js
All of these test.js files will be included in the bundle, because moduleName could be one or something nested like two/subdir.
For more details see require with expression of the official docs.
You cannot loop through an array and import every module in it, other than via the string concatenation shown above, and that has the effect of including all possible matching modules in the bundle, so it should generally be avoided.
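If the modules you want to load all live under one known folder, require.context (mentioned in another answer) gives you something loop-friendly. A rough sketch, assuming the modules sit under ./modules:
// eagerly load every .js file below ./modules (webpack-only API)
const context = require.context('./modules', true, /\.js$/);
context.keys().forEach(function (key) {
  const mod = context(key);   // the exports of each matched module
  if (typeof mod === 'function') {
    mod();                    // e.g. modules/test.js would log "hello world"
  }
});
Note that this still bundles every matching file; it just spares you from listing them by hand.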
I ran into this problem in an Electron environment. My use case was being able to require dynamically created files in an IDE-like application. I wanted to use the Electron require, which is basically a NodeJS CommonJS module loader. After some back and forth I landed on a solution that uses webpack's noParse module configuration.
First, create a module that will be ignored by webpack's parser:
// file: native-require.js
// webpack replaces calls to `require()` from within a bundle. This module
// is not parsed by webpack and exports the real `require`
// NOTE: since the module is unparsed, do not use es6 exports
module.exports = require
In my webpack config, under module, instruct the bundler not to parse this module:
{
module: {
noParse: /\/native-require.js$/,
}
}
Lastly, in any bundle where you want to access the original require:
import nativeRequire from './native-require'
const someModule = nativeRequire('/some/module.js') // dynamic imports
A bit late, but since you are bundling with target: 'node', there is a workaround for dynamically requiring modules that bypasses "the effect of including all possible modules".
The solution is lifted from:
Using dynamic require on node targets WITHOUT resolve or bundle the target module · Issue #4175 · webpack/webpack
Quoted from that comment:
const requireFunc = typeof __webpack_require__ === "function" ? __non_webpack_require__ : require;
const foo = requireFunc(moduleName);
Bundles to:
const requireFunc = true ? require : require;
const foo = requireFunc(moduleName);
I am trying to have a namespace for my app so it works as a module, and to import my components using this namespace and limit the use of relative paths.
Although, even though I followed the webpack documentation for alias here: http://webpack.github.io/docs/configuration.html#resolve-alias
I can't make it work.
This is what my resolve object looks like:
resolve: {
root: path.resolve(__dirname),
alias: {
myApp: './src',
},
extensions: ['', '.js', '.json', '.jsx']
}
path.resolve(__dirname) resolves to /Users/Alex/Workspace/MyAppName/ui/
I import my file that way in the file /Users/Alex/Workspace/MyAppName/ui/src/components/Header/index.jsx:
import { myMethod } from 'myApp/utils/myUtils';
I get the following error during the build:
ERROR in ./src/components/Header/index.jsx
Module not found: Error: Cannot resolve module 'myApp/utils/myUtils' in /Users/Alex/Workspace/MyAppName/ui/src/components/Header
@ ./src/components/Header/index.jsx 33:19-56
I also tried with modulesDirectories but it doesn't work either.
Do you have any idea what is wrong?
Resolving the alias to the absolute path should do the trick:
resolve: {
alias: {
myApp: path.resolve(__dirname, 'src'),
},
extensions: ['', '.js', '.jsx']
}
Check this webpack resolve alias gist with a simple example.
Another solution to limit the number of relative paths is to add your ./src folder as root instead of aliasing it:
resolve: {
root: [path.resolve('./src')],
extensions: ['', '.js', '.jsx']
}
Then you will be able to require all folders inside ./src as if they were modules. For example, assuming you have the following directory structure:
.
├── node_modules
├── src
│ ├── components
│ └── utils
you would be able to import from components and utils like this:
import Header from 'components/Header';
import { myMethod } from 'utils/myUtils';
Something like having an alias for each folder inside ./src.
This might be obvious to many, but if you, like me, have spent too much time trying to figure out why it suddenly does not work when moving from Windows or Mac to Linux, then check the casing in your paths...
Me and everyone else on the project are using Windows or Mac, but at home I dual boot Ubuntu, which I enjoy using. On cloning our code and running webpack I got a whole lot of Cannot resolve module... errors. I spent more time than I'd like to admit searching for some obscure error in node, npm, webpack or anything, until I noticed the paths of the failing modules were something like #app/Shared/settings.js while the require was require('#app/shared/settings'). Windows doesn't care, so everything was fine until I started working on Linux. As it turned out, the problem was not with webpack, node or anything else, so that's probably why I didn't find anyone suggesting that this could be the problem.
Hope this saves some time for someone. Or maybe I'm just dumb, I don't know.
Most of the time it depends on your project folder structure. If your webpack file is inside a folder, then make sure you handle the paths accordingly:
.
├── node_modules
├── src
│ ├── components
│ └── config/webpack.config.js
module.exports = {
resolve: {
extensions: ['.js', '.jsx'],
alias: {
Components: path.resolve(__dirname, '../src/components/'),
}
},
...
resolve: {
...
}
}
It also often happens that we name our folder "source" but use "src" in the path. Darn! That copy-paste cost me a lot of debug time.
Hope this helps someone who is facing such issues due to the above reasons.
I got the same error. Finally I found the problem was that I had written resolve twice, and the second resolve overrode the previous one.
My code is like this:
module.exports = {
resolve: {
extensions: ['.js', '.jsx'],
alias: {
Components: path.resolve(__dirname, 'src/components/'),
}
},
...
resolve: {
...
}
}
More help can be found in the Webpack docs.
I use the following syntax:
resolve: {
alias: {
Data: __dirname + '/src/data'
},
extensions: ['.js', '.jsx', '.json']
}
import points from 'Data/points.json';
In my case, I wanted to alias mobx, so that any import of mobx would always return the same instance, regardless of whether the import call was from my main app, or from within one of the libraries it used.
At first I had this:
webpackConfig.resolve.alias = {
mobx: path.resolve(root, "node_modules", "mobx"),
};
However, this only forced imports of mobx from my app's own code to use the alias.
To have it work for the library's import calls as well, I had to do:
webpackConfig.resolve.alias = {
mobx: path.resolve(root, "node_modules", "mobx"),
"LIBRARY_X/node_modules/mobx": path.resolve(root, "node_modules", "mobx"),
};
It's odd, because I'm pretty sure just having the mobx alias used to work for both contexts before, but now it apparently doesn't (Webpack v4.41.2).
Anyway, the above is how I solved it. (In my case, I needed it to prevent two mobx instances from being used as that breaks things, and LIBRARY_X was symlinked using npm-link, so I couldn't delete/unify the secondary mobx folder itself)
I use webpack path aliases for ES6 module loading.
E.g. If I define an alias for utils instead of something like
import Foo from "../../../utils/foo", I can do
import Foo from "utils/foo"
The problem is that once I start using aliases, WebStorm loses track of the import and I'm left with warnings and no auto-completion.
Is there a way to instruct WebStorm to use such aliases?
Yes, there is.
In fact, Webstorm can't automatically parse and apply Webpack config, but you can set up aliases the same way.
You just have to mark the parent folder of "utils" (in your example) as a resource root (right-click, Mark Directory as > Resource Root).
We just managed to do this with the following structure:
/src
/A
/B
/C
We have A B and C folders declared as alias in Webpack.
And in Webstorm we marked "src" as "Resource Root".
And now we can simply import :
import A/path/to/any/file.js
instead of
import ../../../../../A/path/to/any/file.js
while still having Webstorm correctly parsing and indexing all code, link to files, autocompleting and so on ...
For the record: in PhpStorm, working with Laravel Mix, I managed to solve this by creating a separate webpack.config.js file like:
const path = require('path')
const webpack = require('webpack')
module.exports = {
...
resolve: {
extensions: ['.js', '.json', '.vue'],
alias: {
'~': path.resolve(__dirname, './resources/assets/js')
}
},
...
}
And then importing it in the webpack.mix.js like:
const config = require('./webpack.config')
...
mix.webpackConfig(config)
Make sure the webpack configuration file is pointed to correctly in PhpStorm's configuration: Settings > Languages & Frameworks > JavaScript > Webpack
You can define custom paths so WebStorm/PhpStorm can understand your aliases. But make sure they are identical to your aliases. Create a file in your root directory and call it something like webStorm.config.js (any js file will be OK). Then configure your paths inside:
System.config({
"paths": {
"components/*": "./src/components/*",
"core/*": "./src/core/*",
...
}
});
WebStorm/PhpStorm will recognize System as its own module and will treat this file as configuration.
This is answered in a comment but to save people digging into comments and link only information, here it is:
As of WS2017.2 this will be done automatically. The information is here.
According to this, WebStorm will automatically resolve aliases that are included within the webpack.config in the root of the project. If you have a custom structure and your webpack.config isn't in the root folder, then go to Settings | Languages & Frameworks | JavaScript | Webpack and set the option to the config you require.
Note: Most setups have a base config which then call a dev or prod version. In order for this to work properly, you need to tell webstorm to use the dev one.
Not right now. We were also using path aliases for the files in our React project. The import names were shorter, but we lost a lot of WebStorm's static checking as well as its completion features.
We later decided to reduce the code to only 3 levels of depth, as well as a single level for the common parts. The path completion feature of WebStorm (Ctrl + Space) even helps reduce the typing overhead. The production build does not use the longer names, so it hardly makes any difference in the final code.
I suggest you reconsider your decision about aliases. You lose the semantic distinction between modules coming from node_modules and your own code, and referencing the aliased files again and again to make sense of your code is a much bigger overhead.
Add a jsconfig.json in your project root:
{
"compilerOptions": {
"baseUrl": ".",
"paths": {
"~/*": ["./src/*"]
}
}
}
In PhpStorm (using 2017.2 currently), I have not been able to get webpack configs to work properly with regard to aliases.
My fix involves using the "Directories" section of the main settings. I just had to mark each folder referenced by an alias as a sources root, then click the properties dropdown for each and specify the alias as a "Package prefix". This made everything link up for me.
Not sure if the Directories section exists in WebStorm, but if it does, this seems to be a fool-proof method for getting import aliases working.
For anyone struggling: path.resolve() must be called with __dirname as the first argument for IDEA (WebStorm) to be able to resolve the path correctly.
Will work for IDEA (WebStorm):
alias: {
'#someAlias': pathLib.resolve(__dirname, 'path/to/directory')
}
Will not work for IDEA (WebStorm) (while still being a valid webpack alias):
alias: {
'#someAlias': pathLib.resolve('path/to/directory')
}
WebStorm can't read webpack.config if module.exports returns a function.
For example
module.exports = function (webpackEnv) {
return {
mode: isEnvProduction ? 'production' : isEnvDevelopment && 'development',
...
}
}
Check your config file; maybe this is what is causing your problem.
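One workaround (a sketch; the 'development' argument is an assumption about what your factory expects) is to keep a tiny wrapper file that calls the function and exports the resulting object, then point Settings > Languages & Frameworks > JavaScript > Webpack at that wrapper instead:
// webpack.ide.js -- for the IDE only; assumes webpack.config.js exports a function as above
module.exports = require('./webpack.config')('development');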
There is a lot of discussion here about Laravel Mix, so I'll leave this here to help out future readers. I solved this by creating a separate (fake) webpack config file which is only used by my IDE (PHPStorm).
1. Create a separate alias.js file (e.g. /webpack/alias.js)
const path = require('path');
const assets = path.join(__dirname,'..','resources','assets');
module.exports = {
'#js' : path.resolve(assets, 'js'),
'#c' : path.resolve(assets, 'js', 'components'),
'#errors' : path.resolve(assets, 'js', 'errors'),
'#utils' : path.resolve(assets, 'js', 'utils'),
'#store' : path.resolve(assets, 'js', 'store'),
'#api' : path.resolve(assets, 'js', 'api'),
'#less' : path.resolve(assets, 'less')
}
2. Require the alias.js file into webpack.mix.js
const mix = require('laravel-mix');
mix.alias(require('./webpack/alias'))
// ... The rest of your mix, e.g.
.js('app.js')
.vue()
.less('app.less');
3. Create the fake webpack config for your IDE (e.g. /webpack/ide.config.js)
Here, import the laravel-mix webpack config, plus your aliases, and any other config that the IDE might need help finding. Also include the prefixed ~ aliases for importing styles into your Vue components.
/*
|--------------------------------------------------------------------------
| A fake config file for PhpStorm to enable aliases
|--------------------------------------------------------------------------
|
| File > Settings... > Languages & Frameworks > Javascript > Webpack
|
| Select "Manually" and set the configuration file to this
|
*/
const path = require('path');
const mixConfig = require('./../node_modules/laravel-mix/setup/webpack.config')();
module.exports = {
...mixConfig,
resolve: {
alias: {
...require('./alias'),
'~#less' : path.resolve('#less'), // <--
},
...mixConfig.resolve
}
}
4. Set your IDE to use webpack/ide.config.js as your webpack config file.
Had the same problem on a new Laravel project with Jetstream. The webpack.config.js was present and correct. But PHPStorm still didn't recognize the # symbol as a resource root.
After opening the webpack config, I got a notification asking whether to trust the project. After clicking on Trust project and run, the # symbol became recognized.
I know that this isn't the solution or use-case for everyone. But I still found it worthy to note on this post, because it helped me in my situation.
Using
laravel/framework:8.77.1
npm:8.3.0
node:v14.18.1
Pulling my hair out here looking for a simple solution to share code, required via NPM, across multiple Browserify or Webpack bundles. Thinking, is there such a thing as a file "bridge"?
This isn't due to compile time (I'm aware of watchify) but rather the desire to extract all of my vendor-specific libs into vendor.js so as to keep my app.js filesize down and to not crash the browser with massive sourcemaps. Plus, I find it way cleaner should the need to view the compiled js arise. And so:
// vendor.js
require('react');
require('lodash');
require('other-npm-module');
require('another-npm-module');
It's very important that the code be loaded from NPM as opposed to Bower, or saved into some 'vendor' directory in order to be imported via a relative path and identified via a shim. I'd like to keep every library reference pulled via NPM except for my actual application source.
In app.js I keep all of my sourcecode, and via the externals array, exclude vendor libraries listed above from compilation:
// app.js
var React = require('react');
var _ = require('lodash');
var Component = React.createClass()
// ...
And then in index.html, I require both files
// index.html
<script src='vendor.js'></script>
<script src='app.js'></script>
Using Browserify or Webpack, how can I make it so that app.js can "see" into those modules loaded via npm? I'm aware of creating a bundle with externals and then referencing the direct file (in, say, node_modules) via an alias, but I'm hoping to find a solution that is more automatic and less "Require.js"-like.
Basically, I'm wondering if it is possible to bridge the two so that app.js can look inside vendor.js in order to resolve dependencies. This seems like a simple, straightforward operation but I can't seem to find an answer anywhere on this wide, wide web.
Thanks!
Listing all the vendor files/modules and using CommonsChunkPlugin is indeed the recommended way. This gets pretty tedious though, and error-prone.
Consider these NPM modules: fastclick and mprogress. Since they have not adopted the CommonJS module format, you need to give webpack a hand, like this:
require('imports?define=>false!fastclick')(document.body);
require('mprogress/mprogress.min.css');
var Mprogress = require('mprogress/mprogress.min.js'),
Now assuming you would want both fastclick and mprogress in your vendor chunk, you would probably try this:
module.exports = {
entry: {
app: "./app.js",
vendor: ["fastclick", "mprogress", ...]
Alas, it doesn't work. You need to match the calls to require():
module.exports = {
entry: {
app: "./app.js",
vendor: [
"imports?define=>false!fastclick",
"mprogress/mprogress.min.css",
"mprogress/mprogress.min.js",
...]
It gets old, even with some resolve.alias trickery. Here is my workaround. CommonsChunkPlugin lets you specify a callback that will return whether or not you want a module to be included in the vendor chunk. If your own source code is in a specific src directory, and the rest is in the node_modules directory, just reject the modules based on their path:
var path = require('path'),
webpack = require('webpack'),
node_modules_dir = path.join(__dirname, 'node_modules'),
app_dir = path.join(__dirname, 'src');
module.exports = {
entry: {
app: "./app.js",
},
output: {
filename: "bundle.js"
},
plugins: [
new webpack.optimize.CommonsChunkPlugin(
/* chunkName= */"vendor",
/* filename= */"vendor.bundle.js"
function (module, count) {
return module.resource && module.resource.indexOf(app_dir) === -1;
}
)
]
};
Where module.resource is the path to the module being considered. You could also do the opposite, and include only the module if it is inside node_modules_dir, i.e.:
return module.resource && module.resource.indexOf(node_modules_dir) === 0;
but in my situation, I'd rather say: "put everything that is not in my source tree in a vendor chunk".
Hope that helps.
With webpack you'd use multiple entry points and the CommonsChunkPlugin.
Taken from the webpack docs:
To split your app into 2 files, say app.js and vendor.js, you can require the vendor files in vendor.js. Then pass this name to the CommonsChunkPlugin as shown below.
var webpack = require('webpack');

module.exports = {
entry: {
app: "./app.js",
vendor: ["jquery", "underscore", ...],
},
output: {
filename: "bundle.js"
},
plugins: [
new webpack.optimize.CommonsChunkPlugin(
/* chunkName= */"vendor",
/* filename= */"vendor.bundle.js"
)
]
};
This will remove all modules in the vendor chunk from the app chunk. The bundle.js will now contain just your app code, without any of its dependencies. These are in vendor.bundle.js.
In your HTML page load vendor.bundle.js before bundle.js.
<script src="vendor.bundle.js"></script>
<script src="bundle.js"></script>
// vendor anything coming from node_modules
minChunks: module => /node_modules/.test(module.resource)
Source: https://github.com/webpack/webpack/issues/2372#issuecomment-213149173
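In context (webpack 2/3 syntax), that would look roughly like this sketch (entry/output are placeholders):
// webpack.config.js
const webpack = require('webpack');
module.exports = {
  entry: { app: './app.js' },
  output: { filename: 'bundle.js' },
  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: 'vendor',
      // vendor anything coming from node_modules
      minChunks: module => /node_modules/.test(module.resource)
    })
  ]
};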