I'm trying to build a webpack config that lets me create multiple modules in a specific namespace. I have the following config:
const path = require('path');

module.exports = {
  entry: {
    'one': './src/modules/one/one.module.js',
    'two': './src/modules/two/two.module.js',
    'three': './src/modules/three/three.module.js',
  },
  output: {
    filename: '[name].module.js',
    path: path.resolve(__dirname, 'dist'),
    library: ['myModules', '[name]'],
    libraryTarget: 'umd',
  },
  ...
};
The problem is that it creates the global namespace myModules, but the submodules are not visible. I'm not able to create a new instance by running new window.myModules.one();. How can I fix that?
Assuming that you're using ES modules and that you have a default export for each module, you would have to access the default property.
new window.myModules.one.default();
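That is because webpack exposes the whole exports object, so an ES default export ends up under the default key. For illustration, a minimal sketch of what such a module might look like (the class name and its body are just assumptions):

// src/modules/one/one.module.js (hypothetical example)
export default class One {
  constructor() {
    console.log('module one created');
  }
}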
Webpack 3 added the output.libraryExport option, which lets you expose a specific export as the library. Instead of exposing the whole exports object, you can expose just the default export. For this, set output.libraryExport to 'default'.
output: {
  filename: '[name].module.js',
  path: path.resolve(__dirname, 'dist'),
  library: ['myModules', '[name]'],
  libraryTarget: 'umd',
  libraryExport: 'default',
},
With that, you can use it the way you wanted:
new window.myModules.one();
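For illustration, a hedged sketch of how the bundles could then be consumed in the browser (file names taken from the output config above):

// hypothetical usage, once e.g. dist/one.module.js and dist/two.module.js
// have been loaded via <script> tags
const one = new window.myModules.one();
const two = new window.myModules.two();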
Related
Below is my webpack config. I am trying to rename the entry point bundle to app.js and access its exported function inside the script tag where this file is loaded. I want to achieve the same result in Vite, but I can't find the Vite docs that helpful for this.
This is the webpack config file:
entry: {
  app: [
    'regenerator-runtime/runtime.js', // for using async requests
    path.resolve(__dirname, '..', './src/index.tsx'),
  ],
},
output: {
  clean: true, // clears the output directory before emitting
  asyncChunks: false,
  path: path.resolve(__dirname, '..', './build'),
  crossOriginLoading: 'anonymous',
  filename: '[name].js',
  library: 'ChatSupport',
  libraryTarget: 'umd',
  libraryExport: 'default',
  umdNamedDefine: true,
},
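For comparison, a rough sketch of what a similar setup might look like in Vite's library mode; the build.lib options are real Vite options, but the paths, the app.js file name and the ChatSupport global are assumptions carried over from the webpack config above, not a verified drop-in replacement:

// vite.config.js - a sketch, assuming the config sits at the project root
import { defineConfig } from 'vite';

export default defineConfig({
  build: {
    outDir: 'build',
    lib: {
      entry: 'src/index.tsx',
      name: 'ChatSupport',       // UMD global, like library: 'ChatSupport'
      formats: ['umd'],
      fileName: () => 'app.js',  // keep the same bundle name as the webpack build
    },
  },
});

The regenerator-runtime entry is likely unnecessary with Vite, since it targets modern browsers that support async/await natively.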
I did something like this:
root/
  node_modules/
    myPackage/
      index.js  // uses the .env, can access process.env
  app.js        // uses the .env, can access process.env
  .env
In app.js, the process object is a global; when I import myPackage, that global is also available in myPackage/index.js. All good, hurray.
But node_modules/myPackage is not bundled; it's just a couple of .js files with the entry point at index.js. If myPackage is run through a webpack build (minified, mangled), it somehow is no longer able to inherit the global process object from app.js. I don't understand why.
The webpack config of myPackage is nothing special; it compiles to ES5, UMD. The code was mangled, though; I excluded 'process' from mangling, but it didn't help.
What am I missing?
webpack.config.js (without transpiling to ES5 with Babel)
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './lib/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'myModule',
    library: 'myModule',
    libraryTarget: 'umd',
  },
  resolve: {
    alias: {
      lodash: path.resolve(__dirname, 'node_modules/lodash'),
      'bn.js': path.resolve(__dirname, 'node_modules/bn.js'),
    },
  },
  node: {
    Buffer: false,
  },
};
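One thing worth checking here (an assumption, since the full build isn't shown): for browser targets, webpack 4 injects a process polyfill by default, so inside the bundled package process no longer refers to the global that app.js sets up. The node option already used for Buffer can opt out of that, for example:

// sketch: disable webpack 4's built-in process polyfill so that `process`
// resolves at runtime to whatever global the host application provides
node: {
  Buffer: false,
  process: false,
},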
I'm developing an npm module and I would like it to be importable by all kinds of client-side JS apps.
So far I have tried export default myObject and module.exports = myObject.
The problem is that export default seems to be available only in ES6 apps, and module.exports doesn't work in plain JavaScript because module is not defined.
I would like my module to be accessible whether the client uses React, Angular, Vue, plain JavaScript or whatever... My module is just an object with a list of plain JavaScript functions inside. No tricky parts.
Is there a way to ensure that the module is available regardless of the technology the client uses?
The subject is a bit old now, but just in case someone runs into the same problem.
I got it working by using UMD as Yury suggested. After some unsuccessful tries I ended up using webpack directly; I never knew it provided such great tools for producing UMD so simply. Here is my configuration file.
I export two configurations to build a normal and a minified version at the same time.
module.exports = [
  {
    entry: path.resolve(__dirname, "src/myLib.js"),
    output: {
      path: path.resolve(__dirname, "dist"),
      filename: "myLib.js",
      library: 'myLib',
      libraryTarget: "umd",
      umdNamedDefine: true,
    },
    mode: "development",
    module: config.module,
    resolve: config.resolve,
    plugins: config.plugins,
  },
  {
    entry: path.resolve(__dirname, "src/myLib.js"),
    output: {
      path: path.resolve(__dirname, "dist"),
      filename: "myLib.min.js",
      library: 'myLib',
      libraryTarget: "umd",
      umdNamedDefine: true,
    },
    mode: "production",
    module: config.module,
    resolve: config.resolve,
    plugins: config.plugins,
  },
];
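As a follow-up, a hedged sketch of what src/myLib.js could export and how the resulting UMD bundle can be consumed; the greet function is purely illustrative:

// src/myLib.js - hypothetical content: a plain object of functions
module.exports = {
  greet(name) {
    return 'Hello, ' + name + '!';
  },
};

// From plain JavaScript in the browser, after <script src="dist/myLib.js"></script>:
//   myLib.greet('world');
// From an ES-module or bundler setup (React, Angular, Vue, ...):
//   import myLib from 'myLib';
//   myLib.greet('world');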
I'm not actually sure this is possible, but what I'm trying to do is take a number of NPM packages, bundle them together using Webpack and expose them as an object where each package is a property.
For example, if I wanted react and react-dom bundled, it would then provide an object like:
{
  'react': /* react code */,
  'react-dom': /* react-dom code */
}
My current configuration is:
module.exports = {
  entry: [ 'react', 'react-dom' ],
  output: {
    path: __dirname + '/public',
    publicPath: 'http://localhost:8081/public/',
    filename: 'bundle.js',
    libraryTarget: 'umd',
  }
};
This somewhat works, in that it does return an object, but the object it returns is whatever the last entry package exports, so in this case the object contains all of react-dom's methods.
If I were to change the order of the entry array to [ 'react-dom', 'react' ], then only the react methods would be exposed.
The idea is to export the object so I can access both packages' methods through their properties, like react.Component or react.PureComponent.
I've also tried using the expose-loader, but that yields the same results as above, unless I configured it incorrectly.
Any ideas on how to properly configure webpack to achieve this?
If I understand correctly what you want to do, you could just set up, let's say, a bundle-source.js with this structure:
exports.react = require('react');
exports['react-dom'] = require('react-dom');
exports.anyModule = require('anyModule');
Then you set that bundle-source.js as the entry point of your webpack config:
module.exports = {
  entry: [ '...path-to...bundle-source.js' ],
  output: {
    path: __dirname + '/public',
    publicPath: 'http://localhost:8081/public/',
    filename: 'bundle.js',
    libraryTarget: 'umd',
  }
};
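One extra detail (not in the original answer, just an assumption that may help): if you also give the output a library name, the bundled object is exposed under a single global, which makes the per-package properties easy to reach in the browser:

// same output block with a hypothetical library name added
output: {
  path: __dirname + '/public',
  publicPath: 'http://localhost:8081/public/',
  filename: 'bundle.js',
  library: 'vendorBundle',   // hypothetical name
  libraryTarget: 'umd',
}

// then, once bundle.js is loaded:
//   const React = window.vendorBundle.react;
//   const ReactDOM = window.vendorBundle['react-dom'];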
I have the hot reloading feature turned on in my project like this:
entry: [
  'webpack-hot-middleware/client',
  './src/js/entry.js'
],
output: {
  path: path.join(__dirname, 'dist'),
  filename: 'bundle.js',
  publicPath: '/'
},
Everything worked OK until I decided to move the vendor modules into a different file, and it didn't work. Then I realized that creating multiple chunks with an array (as shown above) somehow differs from creating them with object notation like this:
entry: {
  hot: 'webpack-hot-middleware/client',
  app: './src/js/entry.js'
},
output: {
  path: path.join(__dirname, 'dist'),
  filename: '[name].bundle.js',
  publicPath: '/'
}
I included both app.bundle.js and hot.bundle.js in my index.html, but it still does not work. Any idea why?
If you just want to specify several chunks, you can add the hot-loading script to one of them:
entry: {
  vendor: './vendor/vendor.js',
  app: ['webpack-hot-middleware/client', './src/js/entry.js']
},
And if you want, you can do it dynamically:
entry: {
  vendor: ['./vendor/vendor.js'],
  app: ['./src/js/entry.js']
},
...
webpackConfig.entry.app.unshift('webpack-hot-middleware/client');
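For example, a small sketch of guarding that line so the hot-middleware client is only prepended in development (checking NODE_ENV is just the usual convention, an assumption about how the build is run):

// only prepend the HMR client when not building for production
if (process.env.NODE_ENV !== 'production') {
  webpackConfig.entry.app.unshift('webpack-hot-middleware/client');
}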