Converting legacy namespace code to modules from the top down - javascript

I'm trying to convert our codebase, which is composed of multiple repositories, to use modules instead of namespaces.
Sorry ahead of time if my understanding is fundamentally wrong; most of my JavaScript experience is at this company, so this is all I've ever known.
The codebase has hierarchies where certain repositories inherit from the base one, so for instance if we have A, B, C, D, E, then B, C, D, E all know A.
C might know B, but D only knows A, while E knows D and C. (Attached image to be clear)
Right now every repository compiles into a single JS file using tsconfig's outFile, and we load them in our HTML file one by one, in order, with script tags.
I've started converting the "upper" repos first, since modules can still reference namespaces (so in my example, E and D). I changed all of the code inside those repositories to be modular and not use namespaces.
It now compiles properly and works (individually - the issue I'm having is with making them import from one another).
Since I'm trying to preserve the behavior it seems like I need to use a bundler.
I'm trying to use webpack but I'm having some problems with it.
I created a package.json (we didn't have one because until now we just built one JS file and put it in the HTML; yes, that means we couldn't import things properly, it's a nightmare, and it's why I'm trying to change this).
I installed webpack, ts-loader and yargs, then made the webpack.config.js linked below and tried compiling.
Some of my repos don't have good entry files because they are initialized by other repos, so I have to make a massive entry list and check that every file at least ends up in the output JS (there has to be a solution I'm missing here, right?).
The other issue is I don't understand how to make the other repos import from the repo I just bundled.
I put the bundled.js in the node_modules but when I attempt to import a class from it I get the following error:
"TS2305: Module '"../../node_modules/#types/hybridPanel.js"' has no exported member 'test'.
(In this instance I am trying to load D in E; both have been converted to modules and are not using namespaces, so this should work.)
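What I'm attempting in E looks roughly like this (the names are illustrative, not our real ones):

// In repo E, trying to pull a class out of repo D's bundle
import { test } from "hybridPanel";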
Do I need to publish the repo somewhere and npm install it in the consuming repo (I thought that's effectively the same thing as moving the JS over)?
So the questions I have are:
How do I make this work?
What do I do about the entry files issue?
Am I even going about this the right way (will I even get types if I just import the js)?
Will we need to change our script tags in the HTML to type="module" instead of plain JavaScript?
If I were to add a module (via npm or similar), will I then need to add it to the externals option in the webpack config (there's a sketch of what I mean just after these questions)? I'm not confident in my understanding just yet.
Is this the correct way of converting from namespaces to modules? Our plan, if I can get this to work, is to convert the rest of the higher repos next, then do the lower ones all at once (the base ones multiple people use), and fix the imports in the higher ones once we get there.
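For reference, by "externals" I mean the webpack option that keeps a dependency out of the bundle and resolves it from a global that the page provides at runtime. This is just a sketch of my understanding, with jQuery as the example:

// webpack.config.js (sketch)
module.exports = {
    // ...
    externals: {
        "jquery": "jQuery", // resolve require("jquery") to the global jQuery loaded by a separate script tag
    },
};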
webpack config link :
https://pastebin.com/iNqV1dEV
const webpack = require("webpack");
const path = require("path");
const yargs = require("yargs");

const env = yargs.argv.env; // use --env with webpack 2
const pkg = require("./package.json");
const shouldExportToAMD = yargs.argv.amd;

let libraryName = pkg.name;
let outputFile, mode;

if (shouldExportToAMD) {
    libraryName += ".amd";
}

if (env === "build") {
    mode = "production";
    outputFile = libraryName + ".min.js";
} else {
    mode = "development";
    outputFile = libraryName + ".js";
}

const config = {
    mode: mode,
    entry: [
        __dirname + "/src/panel/MoPanelManager.ts",
        __dirname + "/src/panel/chat/MoSingleChat.ts",
        __dirname + "/src/panel/booth/MoBoothDisplays.ts",
        __dirname + "/src/panel/chat/MoGifs.ts",
        __dirname + "/src/settings/MoSettings.ts"
    ],
    devtool: "source-map",
    output: {
        path: __dirname + "/www/module",
        filename: outputFile,
        library: libraryName,
        libraryTarget: "umd",
        libraryExport: "default",
        umdNamedDefine: true,
        globalObject: "typeof self !== 'undefined' ? self : this",
    },
    module: {
        rules: [
            {
                test: /\.ts?$/,
                use: {
                    loader: 'ts-loader',
                },
                exclude: /(node_modules|bower_components)/,
            },
        ],
    },
    resolve: {
        modules: [path.resolve("./node_modules"), path.resolve("./src")],
        extensions: [".ts", ".js"]
    },
};

module.exports = config;
[Image of the repo example]

Related

Public file is missing from build/public in Webpack [duplicate]

I'm trying to move from Gulp to Webpack. In Gulp I have a task which copies all files and folders from the /static/ folder to the /build/ folder. How do I do the same with Webpack? Do I need some plugin?
Requiring assets using the file-loader module is the way webpack is intended to be used (source). However, if you need greater flexibility or want a cleaner interface, you can also copy static files directly using my copy-webpack-plugin (npm, Github). For your static to build example:
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
    context: path.join(__dirname, 'your-app'),
    plugins: [
        new CopyWebpackPlugin({
            patterns: [
                { from: 'static' }
            ]
        })
    ]
};
Compatibility note: If you're using an old version of webpack like webpack#4.x.x, use copy-webpack-plugin#6.x.x. Otherwise use latest.
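For example, assuming npm (the version pinning just follows the note above):

npm install --save-dev copy-webpack-plugin@6   # webpack 4
npm install --save-dev copy-webpack-plugin     # webpack 5+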
You don't need to copy things around; webpack works differently from gulp. Webpack is a module bundler, and everything you reference in your files will be included. You just need to specify a loader for that.
So if you write:
var myImage = require("./static/myImage.jpg");
Webpack will first try to parse the referenced file as JavaScript (because that's the default). Of course, that will fail. That's why you need to specify a loader for that file type. The file-loader or url-loader, for instance, takes the referenced file, puts it into webpack's output folder (which should be build in your case) and returns the hashed URL for that file.
var myImage = require("./static/myImage.jpg");
console.log(myImage); // '/build/12as7f9asfasgasg.jpg'
Usually loaders are applied via the webpack config:
// webpack.config.js
module.exports = {
    ...
    module: {
        loaders: [
            { test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/, loader: "file" }
        ]
    }
};
Of course you need to install the file-loader first to make this work.
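For example, with npm:

npm install --save-dev file-loader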
If you want to copy your static files, you can use the file-loader in this way.
For HTML files, in webpack.config.js:
module.exports = {
    ...
    module: {
        loaders: [
            {
                test: /\.(html)$/,
                loader: "file?name=[path][name].[ext]&context=./app/static"
            }
        ]
    }
};
in your js file :
require.context("./static/", true, /^\.\/.*\.html/);
./static/ is relative to where your js file is.
You can do the same with images or whatever.
The context is a powerful method to explore !!
One advantage that the aforementioned copy-webpack-plugin brings that hasn't been explained before is that all the other methods mentioned here still bundle the resources into your bundle files (and require you to "require" or "import" them somewhere). If I just want to move some images around or some template partials, I don't want to clutter up my javascript bundle file with useless references to them, I just want the files emitted in the right place. I haven't found any other way to do this in webpack. Admittedly it's not what webpack originally was designed for, but it's definitely a current use case.
(@BreakDS I hope this answers your question - it's only a benefit if you want it)
Webpack 5 adds Asset Modules which are essentially replacements for common file loaders. I've copied a relevant portion of the documentation below:
asset/resource emits a separate file and exports the URL. Previously achievable by using file-loader.
asset/inline exports a data URI of the asset. Previously achievable by using url-loader.
asset/source exports the source code of the asset. Previously achievable by using raw-loader.
asset automatically chooses between exporting a data URI and emitting a separate file. Previously achievable by using url-loader with asset size limit.
To add one in you can make your config look like so:
// webpack.config.js
module.exports = {
    ...
    module: {
        rules: [
            {
                test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
                type: "asset/resource"
            }
        ]
    }
};
To control how the files get output, you can use templated paths.
In the config you can set the global template here:
// webpack.config.js
module.exports = {
    ...
    output: {
        ...
        assetModuleFilename: '[path][name].[hash][ext][query]'
    }
}
To override for a specific set of assets, you can do this:
// webpack.config.js
module.exports = {
    ...
    module: {
        rules: [
            {
                test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
                type: "asset/resource",
                generator: {
                    filename: '[path][name].[hash][ext][query]'
                }
            }
        ]
    }
};
The provided templating will result in filenames that look like build/images/img.151cfcfa1bd74779aadb.png. The hash can be useful for cache busting, etc. You should modify it to your needs.
The above suggestions are good, but to answer your question directly, I'd suggest using cpy-cli in a script defined in your package.json.
This example expects node to be somewhere on your path. Install cpy-cli as a development dependency (the copy script below also uses shelljs and chalk):
npm install --save-dev cpy-cli
Then create a couple of nodejs files. One to do the copy and the other to display a checkmark and message.
copy.js
#!/usr/bin/env node
var shelljs = require('shelljs');
var addCheckMark = require('./helpers/checkmark');
var path = require('path');
var cpy = path.join(__dirname, '../node_modules/cpy-cli/cli.js');
shelljs.exec(cpy + ' /static/* /build/', addCheckMark.bind(null, callback));
function callback() {
    process.stdout.write(' Copied /static/* to the /build/ directory\n\n');
}
checkmark.js
var chalk = require('chalk');

/**
 * Adds mark check symbol
 */
function addCheckMark(callback) {
    process.stdout.write(chalk.green(' ✓'));
    callback();
}

module.exports = addCheckMark;
Add the script in package.json. Assuming scripts are in <project-root>/scripts/
...
"scripts": {
"copy": "node scripts/copy.js",
...
To run the script:
npm run copy
The way I load static images and fonts:
module: {
    rules: [
        ....
        {
            test: /\.(jpe?g|png|gif|svg)$/i,
            /* Exclude fonts while working with images, e.g. .svg can be both image or font. */
            exclude: path.resolve(__dirname, '../src/assets/fonts'),
            use: [{
                loader: 'file-loader',
                options: {
                    name: '[name].[ext]',
                    outputPath: 'images/'
                }
            }]
        },
        {
            test: /\.(woff(2)?|ttf|eot|svg|otf)(\?v=\d+\.\d+\.\d+)?$/,
            /* Exclude images while working with fonts, e.g. .svg can be both image or font. */
            exclude: path.resolve(__dirname, '../src/assets/images'),
            use: [{
                loader: 'file-loader',
                options: {
                    name: '[name].[ext]',
                    outputPath: 'fonts/'
                }
            }]
        }
    ]
}
Don't forget to install file-loader to have that working.
You can write bash in your package.json:
# package.json
{
    "name": ...,
    "version": ...,
    "scripts": {
        "build": "NODE_ENV=production npm run webpack && cp -v <this> <that> && echo ok",
        ...
    }
}
Most likely you should use CopyWebpackPlugin, which was mentioned in kevlened's answer. Alternatively, for some kinds of files like .html or .json, you can also use raw-loader or json-loader. Install it via npm install -D raw-loader, and then all you need to do is add another loader to your webpack.config.js file.
Like:
{
    test: /\.html/,
    loader: 'raw'
}
Note: Restart the webpack-dev-server for any config changes to take effect.
And now you can require html files using relative paths, this makes it much easier to move folders around.
template: require('./nav.html')
I was stuck here too. copy-webpack-plugin worked for me.
However, copy-webpack-plugin was not necessary in my case (I learned this later).
Webpack ignores root paths, for example:
<img src="/images/logo.png">
Hence, to make this work without using copy-webpack-plugin, use '~' in paths:
<img src="~images/logo.png">
'~' tells webpack to consider 'images' as a module.
Note: you might have to add the parent directory of the images directory in:
resolve: {
    modules: [
        'parent-directory of images',
        'node_modules'
    ]
}
Visit https://vuejs-templates.github.io/webpack/static.html
The webpack config file (in webpack 2) allows you to export a promise chain, so long as the last step returns a webpack config object. See promise configuration docs. From there:
webpack now supports returning a Promise from the configuration file. This allows you to do async processing in your configuration file.
You could create a simple recursive copy function that copies your file, and only after that triggers webpack. E.g.:
// copyTheFiles, inpath and outpath are placeholders for your own copy helper and paths
module.exports = function () {
    return copyTheFiles(inpath, outpath).then(result => {
        return { entry: "..." }; // etc etc
    });
};
Let's say all your static assets are in a folder "static" at the root level and you want to copy them to the build folder, maintaining the structure of the subfolders. Then in your entry file, just put:
//index.js or index.jsx
require.context("!!file?name=[path][name].[ext]&context=./static!../static/", true, /^\.\/.*\.*/);
In my case I used webpack for a WordPress plugin to compress JS files, where some plugin files were already compressed and needed to be skipped from the process.
// assumes: const glob = require('glob'); const path = require('path');
optimization: {
    minimize: false,
},
externals: {
    "jquery": "jQuery",
},
entry: glob.sync('./js/plugin/**.js').reduce(function (obj, el) {
    obj[path.parse(el).name] = el;
    return obj;
}, {}),
output: {
    path: path.resolve(__dirname, './js/dist/plugin'),
    filename: "[name].js",
    clean: true,
},
This copies the JS files as they are to the build folder. Using other methods like file-loader or copy-webpack-plugin created issues in this case.
Hope it will help someone.

Change domain of images that webpack generates for imported images

Although my dev server is running on localhost:3000, I have set up my host file to point www.mysite.com to localhost. In my JavaScript, I have code like:
import myImage from '../assets/my-image.jpg'
const MyCmp = () => <img src={myImage} />
Using Webpack's file-loader, it transforms that import into a URL to the hosted image. However, it uses the localhost path to that image, but I'd like it to use the www.mysite.com domain. I looked at both the publicPath and postTransformPublicPath options for file-loader, but those only appear to allow you to modify the part of the path that comes after the domain.
I personally don't like the notion of defining host-information statically in the build output. This is something that should be determined in runtime based on where you actually put your files.
If you are like me, then there are two options here.
Both involve calling a global method that you have defined on, e.g., the window / global scope.
The purpose of the global method is to resolve the root path (the domain, etc.) at runtime.
Define a global method
So let's say you define a method on the global scope somewhere in your startup code, like so:
(<any>window).getWebpackBundleRootPath = function (webpackLibraryId) {
    if (webpackLibraryId == null) throw "OOOPS DO SOMETHING HERE!";
    // Preferably these variables should be loaded from a config-file of sorts.
    if (webpackLibraryId == "yourwebpacklibrary1") return "https://www.yoursite.com/";
    // If you have other libraries that are hosted somewhere else, put them here...
    return "...some default path for all other libraries...";
};
The next step is to configure webpack to call this global method when it tries to resolve the path.
As I mentioned there are two ways, one that manipulates the output of the webpack and one that is more integrated in webpacks configuration (although only for file-loader but I think it should suffice).
It's worth mentioning that you don't need a global method if you only have one bundle or if you host all your bundles in one place. Then it would be enough to use a global variable instead. It should be quite easy to modify the example below to accommodate this.
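For instance, a minimal sketch of that simpler single-root variant (the variable name is made up):

// Startup code: a single global root path instead of a lookup method.
window.webpackBundleRootPath = "https://www.yoursite.com/";
// In the postTransformPublicPath override shown further down, the returned expression
// would then be "window.webpackBundleRootPath + " + p instead of the method call.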
First option: configure webpack file-loader to call your method when resolving path
This solution does not require anything to be done post-build. If this fits your needs and covers all scenarios, I would go with this option.
Edit your webpack config file
var path = require('path');

let config = {
    entry: {
        'index': path.join(__dirname, '/public/index.js')
    },
    output: {
        path: path.join(__dirname, '/dist/'),
        filename: 'index-bundle.js',
        publicPath: 'https://localhost:3000/',
        library: 'yourwebpacklibrary1',
        ...
    },
    module: {
        rules: [{
            // Please note that this only defines the resolve behavior for ttf. If you want to resolve other files you need to configure the postTransformPublicPath for those too. This is a big caveat in my opinion and might be a reason for using the second option.
            test: /\.ttf(\?v=\d+\.\d+\.\d+)?$/,
            use: [{
                loader: 'file-loader',
                options: {
                    outputPath: 'assets/fonts', // The folder where you store your fonts.
                    name: '[name].[ext]',
                    // This is where the magic happens. This lets you override the output of webpack file resolves.
                    postTransformPublicPath: function (p) {
                        // Because of the way webpack file-loader works, the input to this method will look something like this: __webpack_public_path__ + "/assets/fonts/yourfont.ttf"
                        // But we are not interested in using the __webpack_public_path__ variable, so let's remove that.
                        p = p.replace('__webpack_public_path__ + ', '');
                        // Return a stringified call to our global method and append the relative path to the root path returned.
                        return `getWebpackBundleRootPath("${config.output.library}") + ${p}`;
                    }
                }
            }]
        }]
    },
    ...
};

module.exports = config;
As you might have noticed in the comments in the webpack config file you need to specify the resolve behavior for each file-loader that you add (if someone knows a better way, please let me know). This is why I still use the second option.
Second option: manipulate the output of the webpack in a postbuild step
Example webpack.config.js file
For completeness' sake, here is an example of a webpack.config.js file that contains the variables used in the postbuild script.
var path = require('path');

module.exports = {
    entry: {
        'index': path.join(__dirname, '/public/index.js')
    },
    output: {
        path: path.join(__dirname, '/dist/'),
        filename: 'index-bundle.js',
        publicPath: 'https://localhost:3000/',
        library: 'yourwebpacklibrary1',
        ...
    },
    ...
}
Create a postbuild.js file
Create a file postbuild.js next to your package.json with the following content:
const fs = require('fs');
// We take the path to the webpack config file as input so that we can read settings from it.
const webpackConfigFile = process.argv[2];
// Read the webpack config file into memory.
const config = require(webpackConfigFile);
// The file to manipulate is the output javascript bundle that webpack produces.
const inputFile = config.output.path + config.output.filename;
// Load the file into memory.
let fileContent = fs.readFileSync(inputFile, 'utf8');
// Replace the default public path with the call to the method. Please note that if you specify a publicPath as '/' or something very common you might end up with a problem so make sure it is unique in the file to avoid other unrelated stuff being replaced as well.
fileContent = fileContent.replace('"' + config.output.publicPath + '"', 'getWebpackBundleRootPath("' + config.output.library + '")');
// Save the manipulated file back to disk.
fs.writeFileSync(inputFile, fileContent, 'utf8');
Call the postbuild.js automatically on build
Next step is to actually call the postbuild.js script after each build.
This can be done with a "post" script in package.json, like so (in the scripts section of your package.json):
{
    "scripts": {
        "build": "webpack",
        "postbuild": "node postbuild.js ./webpack.config.js"
    }
}
From now on whenever you run the build script it will also run the postbuild script (from npm or yarn, etc).
You can of course also run the postbuild.js script manually after each build instead.
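That is, something like:

node postbuild.js ./webpack.config.js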
but those only appear to allow you to modify the part of the path that comes after the domain.
Not really; you can give it a URL that includes the domain.
In your case, assuming your images are under the assets directory, you will have something like this in your webpack.config.js
...
module: {
    rules: [
        ...
        {
            test: /\.(png|jpe?g|gif|svg)$/,
            use: {
                loader: 'file-loader',
                options: {
                    publicPath: 'https://www.example.com/assets',
                    outputPath: 'assets'
                }
            }
        },
        ...
    ]
}
...
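With a setup like that, an import such as the one in the question should resolve to a full URL at runtime (the hash below is made up for the example):

import myImage from '../assets/my-image.jpg';
console.log(myImage); // e.g. 'https://www.example.com/assets/2e7d1f.jpg'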

WebPack configuration - what is the proper code and what does it mean

I'm learning React and want to understand how webpack is configured for a project.
It would be great if someone could tell me what the following lines of code are doing.
const fs = require('fs')
const path = require('path')
const webpack = require('webpack')

function isDirectory(dir) {
    return fs.lstatSync(dir).isDirectory()
}

const SubjectsDir = path.join(__dirname, 'subjects')
const SubjectDirs = fs.readdirSync(SubjectsDir).filter(function (dir) {
    return isDirectory(path.join(SubjectsDir, dir))
})
module.exports = {
    devtool: 'source-map',

    entry: SubjectDirs.reduce(function (entries, dir) {
        if (fs.existsSync(path.join(SubjectsDir, dir, 'exercise.js')))
            entries[dir + '-exercise'] = path.join(SubjectsDir, dir, 'exercise.js')

        if (fs.existsSync(path.join(SubjectsDir, dir, 'solution.js')))
            entries[dir + '-solution'] = path.join(SubjectsDir, dir, 'solution.js')

        if (fs.existsSync(path.join(SubjectsDir, dir, 'lecture.js')))
            entries[dir + '-lecture'] = path.join(SubjectsDir, dir, 'lecture.js')

        return entries
    }, {
        shared: [ 'react', 'react-dom' ]
    }),

    output: {
        path: '__build__',
        filename: '[name].js',
        chunkFilename: '[id].chunk.js',
        publicPath: '__build__'
    },

    resolve: {
        extensions: [ '', '.js', '.css' ]
    },

    module: {
        loaders: [
            { test: /\.css$/, loader: 'style!css' },
            { test: /\.js$/, exclude: /node_modules|mocha-browser\.js/, loader: 'babel' },
            { test: /\.woff(2)?$/, loader: 'url?limit=10000&mimetype=application/font-woff' },
            { test: /\.ttf$/, loader: 'file' },
            { test: /\.eot$/, loader: 'file' },
            { test: /\.svg$/, loader: 'file' },
            { test: require.resolve('jquery'), loader: 'expose?jQuery' }
        ]
    },

    plugins: [
        new webpack.optimize.CommonsChunkPlugin({ name: 'shared' })
    ],

    devServer: {
        quiet: false,
        noInfo: false,
        historyApiFallback: {
            rewrites: [
                {
                    from: /ReduxDataFlow\/exercise.html/,
                    to: 'ReduxDataFlow\/exercise.html'
                }
            ]
        },
        stats: {
            // Config for minimal console.log mess.
            assets: true,
            colors: true,
            version: true,
            hash: true,
            timings: true,
            chunks: false,
            chunkModules: false
        }
    }
}
This information is coming from a training course, but they do not explain what these lines are doing.
Webpack is what we call a module bundler for JavaScript applications. You can do a whole slew of things with it that help a client browser download and run your code. In the case of React, it helps convert JSX code into plain JS so that the browser can understand it. JSX itself will not run in the browser. We can even use plugins to help minify code, inject HTML, bundle various groups of code together, etc. Now that the introduction to Webpack is out of the way, let's take a look at the code. I will be starting from the very top. Feel free to skip down to #3 if you are only interested in the Webpack configuration object.
The following code will require the modules that are needed in this file. fs is short for "filesystem" and is a module that gives you functions to run that can access the project's filesystem. path is a common module used to resolve or create pathnames to files and is very easy to use! And then we have the webpack module through which we can access webpack specific functions (ie: webpack plugins like webpack.optimize.UglifyJsPlugin).
const fs = require('fs')
const path = require('path')
const webpack = require('webpack')
These next few lines first set up a helper function to determine whether or not something in the filesystem being read is a directory, and then read the subjects directory, keeping only the entries that are directories.
function isDirectory(dir) {
    return fs.lstatSync(dir).isDirectory()
}

const SubjectsDir = path.join(__dirname, 'subjects')
const SubjectDirs = fs.readdirSync(SubjectsDir).filter(function (dir) {
    return isDirectory(path.join(SubjectsDir, dir))
})
devtool specifies which developer tool you want to use to help with debugging. Options are listed here : https://webpack.github.io/docs/configuration.html#devtool. This can be very useful in helping you determine exactly which files and lines and columns errors are coming from.
devtool: 'source-map'
These next few lines tell Webpack where to begin bundling your files. These initial files are called entry points. The entry property in a Webpack configuration object should be an object whose keys determine the name of a bundle and values point to a relative path to the entry file or the name of a node_module. You can also pass in an array of files to each entry point. This will cause each of those files to be bundled together into one file under the name specified by the key - ie: react and react-dom will each be parsed and have their outputs bundled under the name shared.
entry: SubjectDirs.reduce(function (entries, dir) {
    if (fs.existsSync(path.join(SubjectsDir, dir, 'exercise.js')))
        entries[dir + '-exercise'] = path.join(SubjectsDir, dir, 'exercise.js')

    if (fs.existsSync(path.join(SubjectsDir, dir, 'solution.js')))
        entries[dir + '-solution'] = path.join(SubjectsDir, dir, 'solution.js')

    if (fs.existsSync(path.join(SubjectsDir, dir, 'lecture.js')))
        entries[dir + '-lecture'] = path.join(SubjectsDir, dir, 'lecture.js')

    return entries
}, {
    shared: [ 'react', 'react-dom' ]
}),
In the reduce function we simply read through the SubjectsDir, determine whether files exercise.js, lecture.js & solution.js exist, and then provide the path to those files as values associated with the key names identified by dir + '-' + filename (ie: myDir-exercise). This may end up looking like the following if only exercise.js exists:
entry: {
    'myDir-exercise': 'subjectDir/myDir/exercise.js',
    shared: ['react', 'react-dom']
}
After we provide entry points to the Webpack configuration object, we must specify where we want Webpack to output the result of bundling those files. This can be specified in the output property.
output: {
    path: '__build__',
    filename: '[name].js',
    chunkFilename: '[id].chunk.js',
    publicPath: '__build__'
},
The path property defines the absolute path to the output directory. In this case we call it __build__.
The filename property defines the output name of each entry point file. Webpack understands that by specifying '[name]' you are referring to the key you assigned to each entry point in the entry property (ie: shared or myDir-exercise).
The chunkFilename property is similar to the filename property but for non-entry chunk files, which can be created by the CommonsChunkPlugin (see below). The use of [id] is similar to the use of [name].
The publicPath property defines the public URL to where your files are located, as in the URL from which to access your files through a browser.
The resolve property tells Webpack how to resolve your files if it cannot find them for some reason. There are several properties we can pass here, with extensions being one of them. The extensions property tells Webpack which file extensions to try on a file if one is not specified in your code.
resolve: {
extensions: [ '', '.js', '.css' ]
},
For example, let's say we have the following code
const exercise = require('./exercise');
We can leave out the .js because we have provided that string in the resolve property of the webpack configuration and Webpack will try and append .js to this at bundling time to find your file. As of Webpack 2 we also no longer need to specify an empty string as the first element of the resolve property.
The module property tells Webpack how modules within our project will be treated. There are several properties we can add here and I suggest taking a look at the documentation for more details. loaders is a common property to use and with that we can tell Webpack how to parse particular file types within our project. The test property in each loader is simply a Regex that tells Webpack which files to run this loader on. This /\.js$/ for example will run the specified loader on files that end with .js. babel-loader is a widely used JavaScript + ES6 loader. The exclude property tells Webpack which files to not run with the specified loader. The loader property is the name of the loader. As of Webpack 2 we are no longer able to drop the -loader from the string as we see here.
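For comparison, a sketch of how the same kind of loaders would be written in Webpack 2+ syntax (rules instead of loaders, and full loader names):

module: {
    rules: [
        { test: /\.js$/, exclude: /node_modules/, use: 'babel-loader' },
        { test: /\.css$/, use: ['style-loader', 'css-loader'] }
    ]
}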
Plugins have a wide range of functions. As mentioned earlier, we can use plugins to help minify code or to build chunks that are used across our entire application, like react and react-dom. Here we see the CommonsChunkPlugin being used, which will bundle the files under the entry name shared together as a chunk so that they can be separated from the rest of the application.
Finally we have the devServer property which specifies certain configurations for the behavior of webpack-dev-server, which is a separate module from webpack. This module can be useful for development in that you can opt out of building your own web server and allow webpack-dev-server to serve your static files. It also does not write the outputs to your filesystem and serves the bundle from a location in memory at the path specified by the publicPath property in the output property of the Webpack configuration object (see #5). This helps make development faster. To use it you would simply run webpack-dev-server in place of webpack. Take a look at the documentation online for more details.
The configuration object that we've taken a look at follows the Webpack 1 standard. There are many changes between Webpack 1 and 2 in terms of configuration syntax that would be important to take a look at. The concepts are still the same, however. Take a look at the documentation for information on migrating to Webpack 2 and for further Webpack 2 details.

Path aliases for imports in WebStorm

I use webpack path aliases for ES6 module loading.
E.g. If I define an alias for utils instead of something like
import Foo from "../../../utils/foo", I can do
import Foo from "utils/foo"
The problem is that once I start using aliases, WebStorm loses track of the import and I'm left with warnings and no auto-completion.
Is there a way to instruct WebStorm to use such aliases?
Yes, there is.
In fact, WebStorm can't automatically parse and apply the Webpack config, but you can set up aliases the same way.
You just have to mark the parent folder of "utils" (in your example) as a resource root (right-click, Mark Directory as > Resource Root).
We just managed to do this with the following structure:
/src
/A
/B
/C
We have the A, B and C folders declared as aliases in Webpack.
In WebStorm we marked "src" as "Resource Root".
Now we can simply import:
import A/path/to/any/file.js
instead of
import ../../../../../A/path/to/any/file.js
while still having WebStorm correctly parse and index all code, link to files, autocomplete, and so on.
I managed to set up aliases for WebStorm 2017.2 within webpack like this:
For the record: in PhpStorm, working with Laravel Mix, I managed to solve this by creating a separate webpack.config.js file like:
const path = require('path')
const webpack = require('webpack')

module.exports = {
    ...
    resolve: {
        extensions: ['.js', '.json', '.vue'],
        alias: {
            '~': path.resolve(__dirname, './resources/assets/js')
        }
    },
    ...
}
And then importing it in the webpack.mix.js like:
const config = require('./webpack.config')
...
mix.webpackConfig(config)
Make sure the webpack configuration file is pointed to correctly in PhpStorm's configuration at: Settings > Languages & Frameworks > JavaScript > Webpack
You can define custom paths so WebStorm/PhpStorm can understand your aliases. But make sure they are identical to your aliases. Create a file in your root directory and call it something like webStorm.config.js (any JS file will be OK). Then configure your paths inside:
System.config({
    "paths": {
        "components/*": "./src/components/*",
        "core/*": "./src/core/*",
        ...
    }
});
WebStorm/PhpStorm will recognize System as its own module and will treat this file as configuration.
This is answered in a comment, but to save people digging into comments and link-only information, here it is:
As of WS2017.2 this will be done automatically. The information is here.
According to this, WebStorm will automatically resolve aliases that are included within the webpack.config in the root of the project. If you have a custom structure and your webpack.config isn't in the root folder, then go to Settings | Languages & Frameworks | JavaScript | Webpack and set the option to the config you require.
Note: Most setups have a base config which then call a dev or prod version. In order for this to work properly, you need to tell webstorm to use the dev one.
Not right now. We were also using path aliases for the files in our React project. The import names were shorter, but we lost a lot of WebStorm's static checking as well as its completion features.
We later decided to reduce the code to only 3 levels of depth, as well as a single level for the common parts. The path completion feature of WebStorm (Ctrl + Space) even helps reduce the typing overhead. The production build does not use the longer names, so it hardly makes any difference in the final code.
I suggest you reconsider your decision about aliases. You lose the semantic distinction between modules coming from node_modules and your own code, and having to refer back to the alias files again and again to make sense of your code is a much bigger overhead.
Add a jsconfig.json in your project root:
{
    "compilerOptions": {
        "baseUrl": ".",
        "paths": {
            "~/*": ["./src/*"]
        }
    }
}
In PhpStorm (using 2017.2 currently), I have not been able to get webpack configs to work properly with regard to aliases.
My fix involves using the "Directories" section of the main settings. I just had to mark each folder referenced by an alias as a sources root, then click the properties dropdown for each and specify the alias as a "Package prefix". This made everything link up for me.
Not sure if the Directories section exists in WebStorm, but if it does, this seems to be a fool-proof method for getting import aliases working.
For anyone struggling: path.resolve() must be called with "__dirname" as the first argument for IDEA (WebStorm) to be able to resolve the path correctly.
Will work for IDEA (WebStorm):
alias: {
    '#someAlias': pathLib.resolve(__dirname, 'path/to/directory')
}
Will not work for IDEA (WebStorm), while still being a valid webpack alias:
alias: {
    '#someAlias': pathLib.resolve('path/to/directory')
}
WebStorm can't read webpack.config if module.exports returns a function.
For example
module.exports = function (webpackEnv) {
    return {
        mode: isEnvProduction ? 'production' : isEnvDevelopment && 'development',
        ...
    }
}
Check your config file; maybe this is what's causing your problem.
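If your build has to keep exporting a function, one workaround is a small extra config file that exports a plain object just for the IDE to read (a sketch; the file name and alias are illustrative):

// webpack.ide.js -- point the IDE's webpack setting at this file
const path = require('path');
module.exports = {
    resolve: {
        alias: {
            '@': path.resolve(__dirname, 'src')
        }
    }
};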
There is a lot of discussion here about Laravel Mix, so I'll leave this here to help out future readers. I solved this by creating a separate (fake) webpack config file which is only used by my IDE (PHPStorm).
1. Create a separate alias.js file (e.g. /webpack/alias.js)
const path = require('path');
const assets = path.join(__dirname, '..', 'resources', 'assets');

module.exports = {
    '#js'     : path.resolve(assets, 'js'),
    '#c'      : path.resolve(assets, 'js', 'components'),
    '#errors' : path.resolve(assets, 'js', 'errors'),
    '#utils'  : path.resolve(assets, 'js', 'utils'),
    '#store'  : path.resolve(assets, 'js', 'store'),
    '#api'    : path.resolve(assets, 'js', 'api'),
    '#less'   : path.resolve(assets, 'less')
}
2. Require the alias.js file into webpack.mix.js
const mix = require('laravel-mix');
mix.alias(require('./webpack/alias'))
// ... The rest of your mix, e.g.
.js('app.js')
.vue()
.less('app.less');
3. Create the fake webpack config for your IDE (e.g. /webpack/ide.config.js)
Here, import the laravel-mix webpack config, plus your aliases, and any other config that the IDE might need help finding. Also include the prefixed ~ aliases for importing styles into your Vue components.
/*
|--------------------------------------------------------------------------
| A fake config file for PhpStorm to enable aliases
|--------------------------------------------------------------------------
|
| File > Settings... > Languages & Frameworks > Javascript > Webpack
|
| Select "Manually" and set the configuration file to this
|
*/
const path = require('path');
const mixConfig = require('./../node_modules/laravel-mix/setup/webpack.config')();

module.exports = {
    ...mixConfig,
    resolve: {
        alias: {
            ...require('./alias'),
            '~#less': path.resolve('#less'), // <--
        },
        ...mixConfig.resolve
    }
}
4. Set your IDE to use webpack/ide.config.js as your webpack config file.
Had the same problem on a new Laravel project with Jetstream. The webpack.config.js was present and correct. But PHPStorm still didn't recognize the # symbol as a resource root.
After opening the webpack config, I got a notification:
After clicking on "Trust project and run", the # symbol became recognized.
I know that this isn't the solution or use-case for everyone. But I still found it worthy to note on this post, because it helped me in my situation.
Using
laravel/framework:8.77.1
npm:8.3.0
node:v14.18.1

Simple solution to share modules loaded via NPM across multiple Browserify or Webpack bundles

Pulling my hair out here looking for a simple solution to share code, required via NPM, across multiple Browserify or Webpack bundles. Thinking, is there such a thing as a file "bridge"?
This isn't about compile time (I'm aware of watchify) but rather about the desire to extract all of my vendor-specific libs into vendor.js, to keep my app.js filesize down and to not crash the browser with massive sourcemaps. Plus, I find it way cleaner should the need to view the compiled JS arise. And so:
// vendor.js
require('react');
require('lodash');
require('other-npm-module');
require('another-npm-module');
It's very important that the code be loaded from NPM, as opposed to Bower or being saved into some 'vendor' directory to be imported via a relative path and identified via a shim. I'd like to keep every library reference pulled via NPM except for my actual application source.
In app.js I keep all of my source code and, via the externals array, exclude the vendor libraries listed above from compilation:
// app.js
var React = require('react');
var _ = require('lodash');
var Component = React.createClass()
// ...
And then in index.html, I require both files
// index.html
<script src='vendor.js'></script>
<script src='app.js'></script>
Using Browserify or Webpack, how can I make it so that app.js can "see" into those modules loaded via npm? I'm aware of creating a bundle with externals and then referencing the direct file (in, say, node_modules) via an alias, but I'm hoping to find a solution that is more automatic and less "Require.js"-like.
Basically, I'm wondering if it is possible to bridge the two so that app.js can look inside vendor.js in order to resolve dependencies. This seems like a simple, straightforward operation but I can't seem to find an answer anywhere on this wide, wide web.
Thanks!
Listing all the vendor files/modules and using the CommonsChunkPlugin is indeed the recommended way. This gets pretty tedious though, and error prone.
Consider these NPM modules: fastclick and mprogress. Since they have not adopted the CommonJS module format, you need to give webpack a hand, like this:
require('imports?define=>false!fastclick')(document.body);
require('mprogress/mprogress.min.css');
var Mprogress = require('mprogress/mprogress.min.js');
Now assuming you would want both fastclick and mprogress in your vendor chunk, you would probably try this:
module.exports = {
    entry: {
        app: "./app.js",
        vendor: ["fastclick", "mprogress", ...]
Alas, it doesn't work. You need to match the calls to require():
module.exports = {
    entry: {
        app: "./app.js",
        vendor: [
            "imports?define=>false!fastclick",
            "mprogress/mprogress.min.css",
            "mprogress/mprogress.min.js",
            ...]
It gets old, even with some resolve.alias trickery. Here is my workaround. CommonsChunkPlugin lets you specify a callback that will return whether or not you want a module to be included in the vendor chunk. If your own source code is in a specific src directory, and the rest is in the node_modules directory, just reject the modules based on their path:
var path = require('path'),
    webpack = require('webpack');

var node_modules_dir = path.join(__dirname, 'node_modules'),
    app_dir = path.join(__dirname, 'src');

module.exports = {
    entry: {
        app: "./app.js",
    },
    output: {
        filename: "bundle.js"
    },
    plugins: [
        new webpack.optimize.CommonsChunkPlugin(
            /* chunkName= */"vendor",
            /* filename= */"vendor.bundle.js",
            function (module, count) {
                return module.resource && module.resource.indexOf(app_dir) === -1;
            }
        )
    ]
};
Where module.resource is the path to the module being considered. You could also do the opposite, and include only the module if it is inside node_modules_dir, i.e.:
return module.resource && module.resource.indexOf(node_modules_dir) === 0;
but in my situation, I'd rather say: "put everything that is not in my source tree in a vendor chunk".
Hope that helps.
With webpack you'd use multiple entry points and the CommonsChunkPlugin.
Taken from the webpack docs:
To split your app into 2 files, say app.js and vendor.js, you can require the vendor files in vendor.js. Then pass this name to the CommonsChunkPlugin as shown below.
module.exports = {
    entry: {
        app: "./app.js",
        vendor: ["jquery", "underscore", ...],
    },
    output: {
        filename: "bundle.js"
    },
    plugins: [
        new webpack.optimize.CommonsChunkPlugin(
            /* chunkName= */"vendor",
            /* filename= */"vendor.bundle.js"
        )
    ]
};
This will remove all modules in the vendor chunk from the app chunk. The bundle.js will now contain just your app code, without any of its dependencies. These are in vendor.bundle.js.
In your HTML page load vendor.bundle.js before bundle.js.
<script src="vendor.bundle.js"></script>
<script src="bundle.js"></script>
// vendor anything coming from node_modules
minChunks: module => /node_modules/.test(module.resource)
Source: https://github.com/webpack/webpack/issues/2372#issuecomment-213149173
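In context (webpack 2/3 syntax), that test goes into the plugin's minChunks option; a sketch:

new webpack.optimize.CommonsChunkPlugin({
    name: 'vendor',
    // vendor anything coming from node_modules
    minChunks: module => /node_modules/.test(module.resource)
})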
