Conditional build based on environment using Webpack - javascript

I have some things for development - e.g. mocks - which I would like not to bloat my distributed build file with.
In RequireJS you can pass a config to a plugin file and conditionally require things based on that.
For webpack there doesn't seem to be a way of doing this. Firstly, to create a runtime config for an environment I have used resolve.alias to repoint a require depending on the environment, e.g.:
// All settings.
var all = {
  fish: 'salmon'
};

// `envsettings` is an alias resolved at build time.
module.exports = Object.assign(all, require('envsettings'));
Then when creating the webpack config I can dynamically assign which file envsettings points to (i.e. webpackConfig.resolve.alias.envsettings = './' + env).
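For reference, a minimal sketch of that alias wiring in the webpack config (how env gets picked is illustrative; BUILD_ENV is a hypothetical variable name):

// webpack.config.js (sketch)
const path = require('path');
const env = process.env.BUILD_ENV || 'dev'; // hypothetical way of choosing the environment

module.exports = {
  // ...
  resolve: {
    alias: {
      // `require('envsettings')` now resolves to e.g. ./dev.js or ./production.js
      envsettings: path.resolve(__dirname, env)
    }
  }
};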
However I would like to do something like:
if (settings.mock) {
  // Short-circuit ajax calls.
  // Require in all the mock modules.
}
But obviously I don't want to build in those mock files if the environment isn't mock.
I could possibly manually repoint all those requires to a stub file using resolve.alias again - but is there a way that feels less hacky?
Any ideas how I can do that? Thanks.

You can use the DefinePlugin.
I use it by doing something as simple as this in your webpack build file, where env is the path to a file that exports an object of settings:
// Webpack build config
plugins: [
  new webpack.DefinePlugin({
    ENV: require(path.join(__dirname, './path-to-env-files/', env))
  })
]
// Settings file located at `path-to-env-files/dev.js`
module.exports = { debug: true };
and then this in your code
if (ENV.debug) {
  console.log('Yo!');
}
It will strip this code out of your build file if the condition is false (the dead branch is removed during minification). You can see a working Webpack build example here.

Not sure why the "webpack.DefinePlugin" answer is the top one everywhere for defining environment-based imports/requires.
The problem with that approach is that you are still delivering all those modules to the client (check with webpack-bundle-analyzer, for instance) and not reducing your bundle.js's size at all :)
So what really works well and is much more logical is NormalModuleReplacementPlugin.
Rather than doing a conditional require on the client, just don't include the unneeded files in the bundle in the first place.
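A minimal sketch of how that could look (the regex and the stub path are hypothetical; adjust them to however your mock files are named):

// webpack.config.js (sketch)
const path = require('path');
const webpack = require('webpack');

module.exports = {
  // ...
  plugins: [
    // In non-mock builds, every request for a `*.mock.js` file is rewritten
    // to a shared empty stub, so the real mock modules never enter the bundle.
    new webpack.NormalModuleReplacementPlugin(
      /\.mock\.js$/,
      path.resolve(__dirname, 'src/mocks/empty-stub.js')
    )
  ]
};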
Hope that helps

Use ifdef-loader. In your source files you can do stuff like
/// #if ENV === 'production'
console.log('production!');
/// #endif
The relevant webpack configuration is
const preprocessor = {
  ENV: process.env.NODE_ENV || 'development',
};
const ifdef_query = require('querystring').encode({ json: JSON.stringify(preprocessor) });

const config = {
  // ...
  module: {
    rules: [
      // ...
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: `ifdef-loader?${ifdef_query}`,
        },
      },
    ],
  },
  // ...
};
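The same settings can also be passed through the loader's options object instead of a query string (a sketch, assuming ifdef-loader's options form):

{
  test: /\.js$/,
  exclude: /node_modules/,
  use: {
    loader: 'ifdef-loader',
    // Each key becomes a variable usable in `/// #if` conditions.
    options: { ENV: process.env.NODE_ENV || 'development' },
  },
}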

I ended up using something similar to Matt Derrick's answer, but was worried about two points:
The complete config is injected every time I use ENV (which is bad for large configs).
I have to define multiple entry points because require(env) points to different files.
What I came up with is a simple composer which builds a config object and injects it into a config module.
Here is the file structure I am using for this:
config/
├── main.js
├── dev.js
└── production.js
src/
├── app.js
├── config.js
└── ...
webpack.config.js
The main.js holds all default config stuff:
// main.js
const mainConfig = {
  apiEndPoint: 'https://api.example.com',
  // ...
};
module.exports = mainConfig;
The dev.js and production.js only hold config stuff which overrides the main config:
// dev.js
const devConfig = {
  apiEndPoint: 'http://localhost:4000'
};
module.exports = devConfig;
The important part is the webpack.config.js, which composes the config and uses the DefinePlugin to inject a global constant __APP_CONFIG__ holding the composed config object:
const argv = require('yargs').argv;
const _ = require('lodash');
const webpack = require('webpack');

// Import all app configs
const appConfig = require('./config/main');
const appConfigDev = require('./config/dev');
const appConfigProduction = require('./config/production');

const ENV = argv.env || 'dev';

function composeConfig(env) {
  if (env === 'dev') {
    return _.merge({}, appConfig, appConfigDev);
  }
  if (env === 'production') {
    return _.merge({}, appConfig, appConfigProduction);
  }
}

// Webpack config object
module.exports = {
  entry: './src/app.js',
  // ...
  plugins: [
    new webpack.DefinePlugin({
      __APP_CONFIG__: JSON.stringify(composeConfig(ENV))
    })
  ]
};
The last step is the config.js; it looks like this (using ES6 import/export syntax here because it is processed by webpack):
const config = __APP_CONFIG__;
export default config;
In your app.js you could now use import config from './config'; to get the config object.

Another way is to use a JS file as a proxy: let that file load the module of interest with CommonJS and export it as an ES2015 module, like this:
// file: myModule.dev.js
module.exports = "this is in dev"

// file: myModule.prod.js
module.exports = "this is in prod"

// file: myModule.js
let loadedModule
if (WEBPACK_IS_DEVELOPMENT) {
  loadedModule = require('./myModule.dev.js')
} else {
  loadedModule = require('./myModule.prod.js')
}

export const myString = loadedModule
Then you can use ES2015 module in your app normally:
// myApp.js
import { myString } from './store/myModule.js'
myString // <- "this is in dev"
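For this to work, WEBPACK_IS_DEVELOPMENT has to be defined at build time; a minimal sketch with DefinePlugin (the flag name simply mirrors the snippet above):

// webpack.config.js (sketch)
const webpack = require('webpack');

module.exports = (env, argv) => ({
  // ...
  plugins: [
    new webpack.DefinePlugin({
      // Replaced literally in the source, so minification can drop the dead branch.
      WEBPACK_IS_DEVELOPMENT: JSON.stringify(argv.mode !== 'production')
    })
  ]
});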

Faced with the same problem as the OP and required, because of licensing, not to include certain code in certain builds, I adopted the webpack-conditional-loader as follows:
In my build command I set an environment variable appropriately for my build. For example 'demo' in package.json:
...
"scripts": {
  ...
  "buildDemo": "./node_modules/.bin/webpack --config webpack.config/demo.js --env.demo --progress --colors",
  ...
The confusing bit, missing from the documentation I read, is that I have to make this visible throughout the build by ensuring my env variable gets injected into the process global, thus in my webpack.config/demo.js:
/* The demo includes project/reports action to access placeholder graphs.
   This is achieved by using the webpack-conditional-loader process.env.demo === true
*/
const config = require('./production.js');
config.optimization = {...(config.optimization || {}), minimize: false};

module.exports = env => {
  process.env = {...(process.env || {}), ...env};
  return config;
};
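The loader itself also has to be registered in the module rules (in production.js in my case); a minimal sketch, assuming babel-loader is already in use:

module: {
  rules: [
    {
      test: /\.js$/,
      exclude: /node_modules/,
      // Loaders run right to left, so the conditional comments are
      // stripped before Babel sees the file.
      use: ['babel-loader', 'webpack-conditional-loader']
    }
  ]
}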
With this in place, I can conditionally exclude anything, ensuring that any related code is properly shaken out of the resulting JavaScript. For example in my routes.js the demo content is kept out of other builds thus:
...
// #if process.env.demo
import Reports from 'components/model/project/reports';
// #endif
...
const routeMap = [
  ...
  // #if process.env.demo
  {path: "/project/reports/:id", component: Reports},
  // #endif
  ...
This works with webpack 4.29.6.

I've struggled with setting env in my webpack configs. What I usually want is to set env so that it can be reached inside webpack.config.js, postcss.config.js and inside the entry point application itself (index.js usually). I hope that my findings can help someone.
The solution that I've come up with is to pass in --env production or --env development, and then set mode inside webpack.config.js.
However, that doesn't help me with making env accessible where I want it (see above), so I also need to set process.env.NODE_ENV explicitly, as recommended here.
The most relevant part of my webpack.config.js follows below.
...
module.exports = mode => {
  process.env.NODE_ENV = mode;
  if (mode === "production") {
    return merge(commonConfig, productionConfig, { mode });
  }
  return merge(commonConfig, developmentConfig, { mode });
};

Use environment variables to create dev and prod deployments:
https://webpack.js.org/guides/environment-variables/
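A minimal sketch of the pattern from that guide, with the config exported as a function of env (run it with e.g. webpack --env production):

// webpack.config.js (sketch)
module.exports = (env) => {
  const isProd = Boolean(env && env.production);
  return {
    mode: isProd ? 'production' : 'development',
    // ...switch aliases, plugins or defines on isProd here
  };
};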

I use string-replace-loader to get rid of an unnecessary import in the production build, and it works as expected: the bundle size becomes smaller, and a development-only module (redux-logger) is completely removed from it. Here is the simplified code:
In the file webpack.config.js:
rules: [
  // ... ,
  !env.dev && {
    test: /src\/store\/index\.js$/,
    loader: 'string-replace-loader',
    options: {
      search: /import.+createLogger.+from.+redux-logger.+;/,
      replace: 'const createLogger = null;',
    }
  }
].filter(Boolean)
In the file src/store/index.js:
// in prod this import declaration is substituted by `const createLogger = null`:
import { createLogger } from 'redux-logger';

// ...

export const store = configureStore({
  reducer: persistedReducer,
  middleware: createLogger ? [createLogger()] : [],
  devTools: !!createLogger
});

While this is not the best solution, it may work for some of your needs. If you want to run different code in Node and the browser, this worked for me:
if (typeof window !== 'undefined') {
  return
}
// run node-only code now

Related

Error Loading Image src For React Components using require() ('Cannot find module "." webbackMissingModule) [duplicate]


How to dynamically change import paths in webpack?

I have a set of exports that I normally import from one directory:
import { myThing } from 'path/to/dir/es5/things'
However, if I run webpack with a specific NODE_ENV set, I would like the root of all such imports to be treated as es6 instead:
import { myThing } from 'path/to/dir/es6/things'
How can I do this, e.g. have webpack dynamically resolve path/to/dir/es5/* to path/to/dir/es6/*?
You can use an alias for that.
For example, in your webpack configuration, you can use something like:
const alias = {};

if (NODE_ENV === ...) {
  alias['path/to/dir/es5'] = 'path/to/dir/es6';
}
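and then hand it to the resolver; a minimal sketch of where it goes:

module.exports = {
  // ...
  resolve: {
    // Any import starting with 'path/to/dir/es5' is rewritten to the es6 directory.
    alias
  }
};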
Further reading: https://webpack.js.org/configuration/resolve/
Here is my way of handling this stuff.
The built-in DefinePlugin coupled with an env variable.
In your package.json, add the env variable in the following way:
"script": "... webpack --env.SOME_ENV_VAR=value -p"
Then, add your DefinePlugin inside your webpack.config file
plugins: [
  ...
  new webpack.DefinePlugin({
    // stringify so string values are injected as string literals, not raw code
    SOME_ENV_VAR: JSON.stringify(env.SOME_ENV_VAR),
  }),
  ...
]
Then you can use SOME_ENV_VAR as a global variable inside your code:
/* global SOME_ENV_VAR */
const esX = SOME_ENV_VAR === value ? 'es5' : 'es6';
const { myThing } = require(`path/to/dir/${esX}/things`);

webpack dynamic module loader by require

OK, I have searched high and low but cannot reliably determine if this is or is not possible with webpack.
https://github.com/webpack/webpack/tree/master/examples/require.context
It appears to indicate that one can pass a string to a function and have it load a module...
But my attempt is just not working:
webpack.config.js
'use strict';

let webpack = require('webpack'),
    jsonLoader = require("json-loader"),
    path = require("path"),
    fs = require('fs'),
    nodeModules = {};

fs.readdirSync('node_modules')
  .filter(function(x) {
    return ['.bin'].indexOf(x) === -1;
  })
  .forEach(function(mod) {
    nodeModules[mod] = 'commonjs ' + mod;
  });

let PATHS = {
  app: __dirname + '/src'
};

module.exports = {
  context: PATHS.app,
  entry: {
    app: PATHS.app + '/server.js'
  },
  target: 'node',
  output: {
    path: PATHS.app,
    filename: '../build/server.js'
  },
  externals: nodeModules,
  performance: {
    hints: "warning"
  },
  plugins: [
    jsonLoader
  ],
  resolve: {
    modules: [
      './node_modules',
      path.resolve(__dirname),
      path.resolve(__dirname + "/src"),
      path.resolve('./config')
    ]
  },
  node: {
    fs: "empty"
  }
};
The server.js
let _ = require('lodash');

let modules = [ "modules/test" ];

require( 'modules/test' )();

_.map( modules, function( module ){
  require( module );
});
The module in modules/ named test.js
module.exports = () => {
  console.log('hello world');
};
But the result is always the same... the pm2 logs just say hello world for the static require, but for the dynamic load of the same module:
Error: Cannot find module "."
All I want to be able to do is loop through an array of paths to modules and load them...
You cannot use a variable as an argument to require. Webpack needs to know what files to bundle at compile time. As it does no program flow analysis, it can't know what you pass to the function. In that case it might be obvious, but this could go as far as using user input to decide what module to require, and there is no way webpack can possibly know which modules to include at compile time, so webpack does not allow it.
The example you posted is a bit different. You could use require with a concatenated string. For example:
require(`./src/${moduleName}/test`);
Which modules does webpack need to include in the bundle? The variable moduleName could be anything, so the exact module is not known at compile time. Instead it includes all modules that could possibly match the above expression. Assuming the following directory structure:
src
├─ one
│   └─ test.js
├─ two
│   ├─ subdir
│   │   └─ test.js
│   └─ test.js
└─ three
    └─ test.js
All of these test.js files will be included in the bundle, because moduleName could be one or something nested like two/subdir.
For more details see require with expression of the official docs.
You cannot loop through an array and require every module in it, apart from the concatenated-string exception above, which has the effect of including every possibly matching module and should generally be avoided. If you just need to pull in everything under a known directory, require.context (the feature the question links to) is the supported way to do it; see the sketch below.
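A minimal require.context sketch for the original use case (the ./modules directory name comes from the question; the rest is illustrative):

// Build a context of every .js file directly inside ./modules.
// Webpack statically includes all matching files in the bundle.
const context = require.context('./modules', false, /\.js$/);

// Loop over the resolved keys and call each module's exported function.
context.keys().forEach(function (key) {
  context(key)();
});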
I ran into this problem in an Electron environment. My use case was being able to require dynamically created files in an IDE-like application. I wanted to use the Electron require, which is basically a Node.js CommonJS module loader. After some back and forth I landed on a solution that uses webpack's noParse module configuration.
First, create a module that will be ignored by webpack's parser:
// file: native-require.js
// webpack replaces calls to `require()` from within a bundle. This module
// is not parsed by webpack and exports the real `require`
// NOTE: since the module is unparsed, do not use es6 exports
module.exports = require
In my webpack config, under module, instruct the bundler not to parse this module:
{
  module: {
    noParse: /\/native-require.js$/,
  }
}
Lastly, in any bundle where you want to access the original require:
import nativeRequire from './native-require'
const someModule = nativeRequire('/some/module.js') // dynamic imports
A bit late, but since you are bundling with target: 'node', there is a workaround for dynamically requiring modules that bypasses "the effect of including all possible modules".
The solution is lifted from:
Using dynamic require on node targets WITHOUT resolve or bundle the target module · Issue #4175 · webpack/webpack
Quoted from that comment:
const requireFunc = typeof __webpack_require__ === "function" ? __non_webpack_require__ : require;
const foo = requireFunc(moduleName);
Bundles to:
const requireFunc = true ? require : require;
const foo = requireFunc(moduleName);

Angular application API URL configuration with Webpack for dev and production builds

I have an Angular app with the following simple config file config.js:
export default function(app) {
  app.constant('config', {apiUrl: 'https://localhost:8080'});
};
which is imported by Webpack entry point app.js:
import config from './config';
config(app);
I'd like to have a different apiUrl when I do production build.
What's the easiest way to do it in Webpack?
There is a similar question at https://stackoverflow.com/a/34032050/1610981
It explains that you can use http://webpack.github.io/docs/list-of-plugins.html#defineplugin
The config.js file would be this:
export default function(app) {
  app.constant('config', {apiUrl: API_URL});
};
And inside of webpack config files:
plugins: [
  new webpack.DefinePlugin({
    API_URL: JSON.stringify('https://localhost:8080')
  })
]
You should have two webpack configs, one for development and another for production. Each one defines the API_URL macro according to the build being performed.
I recommend using an environment variable with webpack.DefinePlugin:
//webpack.config.js
...
let API_URL;
if (process.env.NODE_ENV == 'development') {
  API_URL = 'https://dev:8080';
} else {
  API_URL = 'https://prod:8080';
}
// or
const API_URL = process.env.API_URL;
...
plugins: [
  // DefinePlugin does a direct text replacement, so stringify the URL
  new webpack.DefinePlugin({ API_URL: JSON.stringify(API_URL) })
]
...
If NODE_ENV is not set, use export NODE_ENV=development on Linux/macOS or SET NODE_ENV=development on Windows.

Environment Variables in an isomorphic JS app: Webpack find & replace?

I'm using webpack to bundle an isomorphic JS app (based on this example) so that the browser runs the same code as the server. Everything is running smoothly except I have a config.js with some settings which are pulled in from environment variables on the server:
module.exports = {
  servers: {
    auth: process.env.AUTH_SERVER_URL,
    content: process.env.CONTENT_SERVER_URL
  }
}
On the server this is grand, but when webpack renders this for the client, process is empty and this doesn't work.
I'm hoping there's a kind of 'find and replace' webpack plugin that will replace them with their content in that file alone?
"…config.js content…".replace(/process\.env\.([a-z0-9_]+)/, function(match, varName) {
return process.env[varName];
})
Note that using the DefinePlugin as suggested in the accepted answer is potentially dangerous, as it completely exposes process.env. As Tobias commented above, there is actually a plugin, EnvironmentPlugin, that does exactly this with an added whitelisting ability, using DefinePlugin internally.
In your webpack.config.js:
{
  plugins: [
    new webpack.EnvironmentPlugin([
      'NODE_ENV',
      'WHITELISTED_ENVIRONMENT_VARIABLE'
    ])
  ]
}
In your webpack.config.js,
use the following preLoaders (or postLoaders),
module: {
  preLoaders: [
    { test: /\.js$/, loader: "transform?envify" },
  ]
}
Another way using the webpack.DefinePlugin:
plugins: [
  new DefinePlugin({
    'process.env': Object.keys(process.env).reduce(function(o, k) {
      o[k] = JSON.stringify(process.env[k]);
      return o;
    }, {})
  })
]
NOTE: The old method using envify-loader was deprecated:
DEPRECATED: use transform-loader + envify instead.
Yeah; looks like envify-loader was the easy solution.
I just added the following to my webpack loaders:
{
  test: /config\.js$/, loader: "envify-loader"
}
And the config.js (and only that file) is modified to include any referenced environment variables statically :)
I needed a way to use the env variables set on the machine that is running the code, not the env variables of the machine building the app.
I do not see a solution for this yet. This is what I did.
In publicEnv.js:
// List of the env variables you want to use on the client. Careful on what you put here!
const publicEnv = [
  'API_URL',
  'FACEBOOK_APP_ID',
  'GA_ID'
];

const isBrowser = typeof window !== 'undefined';
const base = (isBrowser ? window.__ENV__ : process.env) || {};

const env = {};
for (const v of publicEnv) {
  env[v] = base[v];
}

export default env;
In the HTML template file of the page I have:
import publicEnv from 'publicEnv.js';
...
<script>
  window.__ENV__ = ${stringify(publicEnv)};
  // Other things you need here...
  window.__INITIAL_STATE__ = ${stringify(initialState)};
</script>
So now I can get the value of the env variable on both frontend and backend with:
import publicEnv from 'publicEnv.js';
...
console.log("Google Analytic code is", publicEnv.GA_ID);
I hope it can help.
