I'm using reveal.js to create responsive presentations. The problem with reveal.js is that all of the slide markup lives in a single HTML file, which can get messy (some of my presentations reached about 3,500 lines of HTML in that one file).
I'm now restructuring this setup: I'd like a directory named slides that contains one HTML file per slide, with each file named slidenumber.html. Finally, I want to bundle all of these files with webpack 5 into a single HTML file in dist. I managed to achieve this, but it has an issue with the dev server.
webpack.config.js
// ... imports ....
module.exports = {
...,
plugins: [
....,
new HtmlWebpackPlugin({
filename: "index.html",
inject: true,
templateContent: getTemplate(),
}),
new WatchExternalFilesPlugin({
files: ["./slides/*.html"],
}),
],
module: {
rules: [...],
},
devServer: {
port: 8080,
},
};
The getTemplate function loops over the HTML files in the slides directory and returns them wrapped in the template boilerplate.
Here is the function for reference:
getTemplate.js
const fs = require("fs/promises");
const path = require("path");
const { parse } = require("node-html-parser");

module.exports = async () => {
  // Parse the boilerplate template that contains the #slides container.
  const template = parse(
    await fs.readFile(path.join(__dirname, "../templates/index.html"))
  );
  // Append each slide file to the #slides container, in directory order.
  const files = await fs.readdir(path.join(__dirname, "../slides"));
  for (const fileName of files) {
    const slide = parse(
      await fs.readFile(path.join(__dirname, `../slides/${fileName}`))
    );
    template.querySelector("#slides").appendChild(slide);
  }
  return template.toString();
};
All of the above works fine for a production build, but when running the dev server I can't get HtmlWebpackPlugin to re-execute templateContent: getTemplate() when any HTML slide file in the slides directory changes. As a result, when I edit any of the slide files, I don't see any update.
I'm aware that templateContent is only evaluated when the server starts, but I'm asking whether there is any other feature that can get me the required behavior.
Thanks for reading this far, and please excuse my English; I'm not a native speaker.
I was able to achieve the behavior described in the question by adding a middleware through the dev server options that catches requests to the root path and returns the output of the getTemplate function.
This is the configuration for the dev server:
webpack.config.dev.js
// ...imports...
module.exports = {
mode: "development",
entry: { index: "./main.js" },
output: {...},
module: {
rules: [...],
},
devServer: {
port: 8080,
watchFiles: ["./slides/*.html"],
hot: true,
onBeforeSetupMiddleware: function (devServer) {
devServer.app.get("/", async function (req, res) {
res.send(await getTemplate());
});
},
},
};
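Side note: in recent versions of webpack-dev-server (4.7 and later), onBeforeSetupMiddleware is deprecated in favor of setupMiddlewares. A minimal sketch of the same idea with that API, assuming you are on such a version:
devServer: {
  port: 8080,
  watchFiles: ["./slides/*.html"],
  hot: true,
  setupMiddlewares: (middlewares, devServer) => {
    // Rebuild the template from the slide files on every request to "/".
    devServer.app.get("/", async (req, res) => {
      res.send(await getTemplate());
    });
    return middlewares;
  },
},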
This is the configuration for the production build:
webpack.config.production.js
// ...imports...
module.exports = {
mode: "production",
entry: { index: "./main.js" },
output: {...},
plugins: [
new HtmlWebpackPlugin({
filename: "index.html",
templateContent: getTemplate(),
inject: false,
}),
],
module: {
rules: [...],
},
};
In production I used HtmlWebpackPlugin as usual, but in development I didn't use it at all, since it can't re-render the template on rebuilds.
In development I did lose the ability to add a hash to the compiled JS file name, though, as I can't predict the hash and inject its script tag. The compiled file keeps the same name as the entry, and I added the script tag manually in the HTML template file.
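For illustration, this is roughly what that looks like in development (a sketch; the output block and the index.js file name are assumptions based on the index entry above):
// webpack.config.dev.js (output excerpt), sketch only
output: {
  path: path.resolve(__dirname, "dist"),
  filename: "[name].js", // no [contenthash] in development, so the name is predictable
},
// ...and in templates/index.html the script tag is added by hand:
// <script defer src="index.js"></script>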
Hope this helps someone!
I'm building a project on an ejected Create React App setup. The reason I ejected it is that there will be multiple pages being generated, plus some standalone scripts that I want to build and have take advantage of the ES6 and debugging tooling provided by the boilerplate.
My problem is that, while using the recommended technique for building multiple pages with html-webpack-plugin, each generated HTML file ends up with the scripts from every page.
So, let's have a look at the basic entry point. Here's my basic webpack config:
...
entry: {
devServerClient : require.resolve('react-dev-utils/webpackHotDevClient'),
// Finally, this is your app's code:
...pages,
},
...
As you can see, I'm using the same hot module reloading setup that came with the boilerplate, but then I deviate to spread in values from the pages variable, which is required from another file:
const glob = require("glob");
const path = require("path");
const input_path = path.resolve(__dirname, "../src/apps/*/index.js");
const apps_directories = glob.sync(input_path);
let pages = {};
// Loop through them and append them to the pages object for processing.
apps_directories.map(function (app_directory) {
const split_name = app_directory.split("/");
pages[split_name[split_name.length - 2]] = app_directory;
});
module.exports = pages;
That's how I generate multiple entries dynamically. This portion of the code works fine, but I thought I'd post it just in case something needs to happen here that I'm missing.
Next we have the actual plugins section. I take a similar approach to modularizing the code, so here's that portion of the config (at the root level of the webpack object):
...
plugins: [
// Generates an `index.html` file with the <script> injected.
...HtmlWebpackPlugins,
...
]
...
Again, I spread them in. The script that generates these looks like so:
const HtmlWebpackPlugin = require('html-webpack-plugin');
const pages = require("./multiplePaths");
const paths = require("./paths");
const NODE_ENV = process.env.NODE_ENV;
let HtmlWebpackPlugins = [];
Object.keys(pages).map(function (fileName) {
if (NODE_ENV === "development") {
HtmlWebpackPlugins.push( new HtmlWebpackPlugin({
inject: true,
template: paths.appHtml,
filename: `${fileName}.html`,
}));
return;
}
HtmlWebpackPlugins.push(new HtmlWebpackPlugin({
inject: true,
template: paths.appHtml,
filename: `${fileName}.html`,
minify: {
removeComments: true,
collapseWhitespace: true,
removeRedundantAttributes: true,
useShortDoctype: true,
removeEmptyAttributes: true,
removeStyleLinkTypeAttributes: true,
keepClosingSlash: true,
minifyJS: true,
minifyCSS: true,
minifyURLs: true,
},
}));
});
module.exports = HtmlWebpackPlugins;
This script generates one instance of the HtmlWebpackPlugin class per entry, and the HTML file is named after the folder the app resides in. This follows the technique described in the plugin's issues section.
The problem then shows up in the generated HTML pages. Notice that the scripts for all the folders are injected into each app:
As you can see, each app's CSS and JS are added. This happens to both pages. I only want the CSS and JS relevant to each page.
Any idea what's going on here?
If you want to add only some of the chunks to each page, you need to specify exactly which chunks you want injected into your HTML as scripts:
HtmlWebpackPlugins.push(new HtmlWebpackPlugin({
  inject: true,
  template: paths.appHtml,
  filename: `${fileName}.html`,
  chunks: [fileName], // add any other chunks you need here (for example, a chunk with shared libraries)
  // ...
}));
I used something like this to create entries for HtmlWebpackPlugin based on webpack entries:
// Webpack entries
const entry = {
"nameA": "./xy/nameA/main.js",
"nameB": "./xy/nameB/main.js",
};
// Plugins
let plugins = [];
// HTML injection
Object.keys(entry).forEach((ent) => {
const htmlPlugin = new HtmlWebpackPlugin({
inject: true,
title: "Your Title",
template: "./html/index.html",
filename: `formats/${ent}/index.html`,
chunks: [ent],
});
plugins.push(htmlPlugin);
});
// Insert more plugins
plugins = plugins.concat([
... your plugins
]);
// Return Config
return {
// define entry point
entry,
// define output file
output: {
...
},
...
// define Plugins
plugins,
...
};
I am using Django with webpack and Vue, with code splitting enabled in webpack. Webpack splits chunk files out of the source code of my .vue files, but I can't load the chunk files in the browser because the URL is wrong for my project structure. I want to change the URL the chunk files are loaded from, but I don't know how. How can I change the load URL of the chunk files?
Chrome console logs:
GET http://localhost:8123/mycomponent.chunk.js 404 (Not Found)
main.js:12 Error: Loading chunk 2 failed.
(error: http://localhost:8123/mycomponent.chunk.js)
at HTMLScriptElement.a (main.js:1)
I need the URL to be '/static/js/mycomponent.chunk.js'.
webpack.config.js:
const path = require("path")
const webpack = require('webpack')
const BundleTracker = require('webpack-bundle-tracker')
const VueLoaderPlugin = require('vue-loader/lib/plugin')
module.exports = {
context: __dirname,
entry: './src/main.js',
output: {
path: path.resolve('../static/js/'),
filename: "[name].js",
chunkFilename: '[name].chunk.js',
},
module: {
rules: [
{
test: /\.vue$/,
loader: 'vue-loader'
},
],
},
plugins: [
new BundleTracker({filename: './webpack-stats.json'}),
new VueLoaderPlugin(),
],
resolve: {
alias: {
'vue': path.resolve('./node_modules/vue/dist/vue.js'),
}
},
}
Try adding publicPath: '/static/js/' to output.
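Based on the output block in the question, that would look something like this (a sketch; the value just needs to match the URL Django serves the files from):
output: {
  path: path.resolve('../static/js/'),
  publicPath: '/static/js/', // URL prefix webpack uses when loading chunks at runtime
  filename: "[name].js",
  chunkFilename: '[name].chunk.js',
},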
Also, if you serve your JS from your Django template page, you can use django-webpack-loader; it simplifies serving JS files and supports hot module reloading.
I have a rather large React application built with webpack 2. The application is embedded as an SPA within an existing Drupal site. The Drupal site has a complex gulp build setup that I can't replicate with webpack, so I decided to keep it.
I have split my React application into multiple parts using the DllPlugin / DllReferencePlugin combo which is shipped out of the box in webpack 2. This works great, and I get a nice vendor-bundle when building with webpack.
The problem is when I try to run my webpack configuration in gulp, I get an error. I might be doing it wrong, as I have not been able to find much documentation on this approach, but nevertheless, it's not working for me.
It looks like it's trying to include the manifest file from my vendor bundle before creating it.
Whenever I run one of my defined gulp tasks, like gulp react-vendor, I get an error saying that it cannot resolve the vendor-manifest.json file.
If, on the other hand, I run webpack --config=webpack.dll.js in my terminal, webpack compiles just fine with no errors.
I have included what I think is the relevant files. Any help on this is appreciated.
webpack.config.js
// Use node.js built-in path module to avoid path issues across platforms.
const path = require('path');
const webpack = require('webpack');
// Set environment variable.
const production = process.env.NODE_ENV === "production";
const appSource = path.join(__dirname, 'react/src/');
const buildPath = path.join(__dirname, 'react/build/');
const ReactConfig = {
entry: [
'./react/src/index.jsx'
],
output: {
path: buildPath,
publicPath: buildPath,
filename: 'app.js'
},
module: {
rules: [
{
exclude: /(node_modules)/,
use: {
loader: "babel-loader?cacheDirectory=true",
options: {
presets: ["react", "es2015", "stage-0"]
},
},
},
],
},
resolve: {
modules: [
path.join(__dirname, 'node_modules'),
'./react/src/'
],
extensions: ['.js', '.jsx', '.es6'],
},
context: __dirname,
devServer: {
historyApiFallback: true,
contentBase: appSource
},
// TODO: Split plugins based on prod and dev builds.
plugins: [
new webpack.DllReferencePlugin({
context: path.join(__dirname, "react", "src"),
manifest: require(path.join(__dirname, "react", "vendors", "vendor-manifest.json"))
}),
new webpack.optimize.CommonsChunkPlugin({
name: 'manifest',
filename: 'webpack-loader.js'
}),
]
};
// Add environment specific configuration.
if (production) {
ReactConfig.plugins.push(
new webpack.optimize.UglifyJsPlugin()
);
}
module.exports = [ReactConfig];
webpack.dll.js
const path = require("path");
const webpack = require("webpack");
const production = process.env.NODE_ENV === "production";
const DllConfig = {
entry: {
vendor: [path.join(__dirname, "react", "vendors", "vendors.js")]
},
output: {
path: path.join(__dirname, "react", "vendors"),
filename: "dll.[name].js",
library: "[name]"
},
plugins: [
new webpack.DllPlugin({
path: path.join(__dirname, "react", "vendors", "[name]-manifest.json"),
name: "[name]",
context: path.resolve(__dirname, "react", "src")
}),
// Resolve warning message related to the 'fetch' node_module.
new webpack.IgnorePlugin(/\/iconv-loader$/),
],
resolve: {
modules: [
path.join(__dirname, 'node_modules'),
],
extensions: ['.js', '.jsx', '.es6'],
},
// Added to resolve a dependency issue in this build #https://github.com/hapijs/joi/issues/665
node: {
net: 'empty',
tls: 'empty',
dns: 'empty'
}
};
if (production) {
DllConfig.plugins.push(
new webpack.optimize.UglifyJsPlugin()
);
}
module.exports = [DllConfig];
vendors.js (to determine what to add to the Dll)
require("react");
require("react-bootstrap");
require("react-dom");
require("react-redux");
require("react-router-dom");
require("redux");
require("redux-form");
require("redux-promise");
require("redux-thunk");
require("classnames");
require("whatwg-fetch");
require("fetch");
require("prop-types");
require("url");
require("validator");
gulpfile.js
'use strict';
const gulp = require('gulp');
const webpack = require ('webpack');
const reactConfig = require('./webpack.config.js');
const vendorConfig = require('./webpack.dll.js');
// React webpack source build.
gulp.task('react-src', function (callback) {
webpack(reactConfig, function (err, stats) {
callback();
})
});
// React webpack vendor build.
gulp.task('react-vendor', function (callback) {
webpack(vendorConfig, function (err, stats) {
callback();
})
});
// Full webpack react build.
gulp.task('react-full', ['react-vendor', 'react-src']);
NOTE:
If I first build my vendor bundle from the terminal with webpack --config=webpack.dll.js, so that vendor-manifest.json gets created, I can then run my gulp tasks with no issues.
This is not very helpful though, since it still doesn't let me drive webpack entirely from gulp, and I intend to clean the build output before each new build.
I ended up using the solution mentioned at the end of my question: I build my DLL bundle first, and then I can successfully run my gulp webpack tasks.
One change that makes issues easier to debug is to use the gulp utility module (gulp-util) to print any webpack errors that show up when webpack is run through gulp.
My final gulp setup ended up looking like this:
gulpfile.js
'use strict';
const gulp = require('gulp');
const gutil = require('gulp-util');
const webpack = require('webpack');
const reactConfig = require('./webpack.config.js');
const vendorConfig = require('./webpack.dll.js');
// React webpack source build.
gulp.task('react', function (callback) {
webpack(reactConfig, function (err, stats) {
if (err) {
throw new gutil.PluginError('webpack', err);
}
else {
gutil.log('[webpack]', stats.toString());
}
callback();
});
});
// React webpack vendor build.
gulp.task('react-vendor', function (callback) {
webpack(vendorConfig, function (err, stats) {
if (err) {
throw new gutil.PluginError('webpack', err);
}
else {
gutil.log('[webpack]', stats.toString());
}
callback();
});
});
// React: Rebuilds both source and vendor in the right order.
gulp.task('react-full', ['react-vendor'], function () {
gulp.start('react');
});
I hope this might help someone in a similar situation.
Whenever I run one of my defined gulp tasks, like gulp react-vendor I get an error, saying that it cannot resolve the vendor-manifest.json file.
Your gulpfile.js contains this:
const reactConfig = require('./webpack.config.js');
const vendorConfig = require('./webpack.dll.js');
And webpack.config.js contains this:
new webpack.DllReferencePlugin({
context: path.join(__dirname, "react", "src"),
manifest: require(path.join(__dirname, "react", "vendors", "vendor-manifest.json"))
}),
The require() calls are all executed immediately: whenever you run gulp, Node evaluates both webpack configuration files at startup. While running the code in webpack.config.js, it hits the require() used in your creation of the DllReferencePlugin, so it also tries to read vendor-manifest.json at that moment and turn it into an object...which is before the file has been built.
You can solve this in one of two ways:
The DllReferencePlugin's manifest option supports either an object (which is what you are currently providing), or else a string containing the path of the manifest file. In other words, it should work if you remove the require() from around your path.join(...) call.
Alternatively, you can also defer the loading of the Webpack config files. Moving your const reactConfig = require('./webpack.config.js'); from the top of the file directly into the gulp task function should be sufficient, assuming that this function is not invoked until after the manifest has been built.
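A sketch of both options, based on the config and gulpfile from the question:
// Option 1: pass the manifest as a path string so webpack reads the file at
// compile time instead of when the config module is loaded.
new webpack.DllReferencePlugin({
  context: path.join(__dirname, "react", "src"),
  manifest: path.join(__dirname, "react", "vendors", "vendor-manifest.json"),
}),

// Option 2: defer loading the config until the gulp task actually runs.
gulp.task('react-src', function (callback) {
  const reactConfig = require('./webpack.config.js'); // evaluated only now
  webpack(reactConfig, function (err, stats) {
    callback();
  });
});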
I want to set up an Angular 1.x app from scratch using webpack 2.
I am having trouble finding the best webpack.config setup, with optimal entry and output for production (meaning all code, styles, and templates minified and gzipped with no code repetition).
My main problem is how to set up webpack.config so that it picks up all the partials within my project's folder structure, like these:
My current config file, for reference (which can't see subfolders):
var HtmlWebpackPlugin = require( 'html-webpack-plugin' );
var ExtractTextPlugin = require( 'extract-text-webpack-plugin' );
var path = require( 'path' );
module.exports = {
devServer: {
compress: true,
contentBase: path.join( __dirname, '/dist' ),
open: true,
port: 9000,
stats: 'errors-only'
},
entry: './src/app.js',
output: {
path: path.join( __dirname, '/dist' ),
filename: 'app.bundle.js'
},
module: {
rules: [ {
test: /\.scss$/,
use: ExtractTextPlugin.extract( {
fallback: 'style-loader',
use: [
'css-loader',
'sass-loader'
],
publicPath: '/dist'
} )
} ]
},
plugins: [
new HtmlWebpackPlugin( {
hash: true,
minify: { collapseWhitespace: true },
template: './src/index.html',
title: 'Prov'
} ),
new ExtractTextPlugin( {
filename: 'main.css',
allChunks: true
} )
]
};
Note that this isn't an exhaustive solution, as there are many optimizations one can make in the frontend, and I've kept the code snippets fairly short.
With webpack, there are a few routes that you can take to include partials into your app.js.
Solution 1
You can import/require your partials within app.js as such:
app.js
var angular = require('angular');
var proverbList = require('./proverb/list/proverb.list');
// require other components
// set up your app as normal
This pulls your component JS files into the main app.bundle.js. You can also use html-loader to include your templates in the final bundle.
This isn't ideal, though, as all it does is create one large bundle.js (which neither takes advantage of parallel downloads over HTTP/2 nor allows loading components/files on demand when the user actually needs them).
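For the template side mentioned above, a minimal rule could look like this (a sketch; it assumes html-loader is installed, and the proverb.list.html path is just an example name):
// webpack.config.js (module.rules excerpt), sketch only
{
  test: /\.html$/,
  exclude: /index\.html$/, // keep the HtmlWebpackPlugin template out of this rule
  use: 'html-loader' // require('./proverb/list/proverb.list.html') then returns the markup
}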
Solution 2
Importing partials as separate entry files into your webpack bundle:
webpack.config.js
const globby = require('globby');
const sourceDir = 'src';
var webpackentry = {
  app: `${__dirname}/src/app.js`
};
// Add every JS file under the source directory as its own entry,
// keyed by its file name (without the .js extension).
globby.sync(`${__dirname}/${sourceDir}/**/*.js`).forEach((file) => {
  const name = file.split('/').pop().replace('.js', '');
  webpackentry[name] = file;
});
const config = {
entry: webpackentry,
...
}
The second solution is unorthodox, but it can be useful if you want to ship your partials as separate <script> tags in your HTML (for example, if your company/team uses that as the way to include directives/components/controllers), or if you have an app-2.bundle.js.
Solution 3
Use CommonsChunkPlugin:
webpack.config.js
let webpackentry = {
vendor: [
'module1',
'module2',
'module3',
]
}
...
plugins: [
new webpack.optimize.CommonsChunkPlugin({
name: ['vendor'] //... add other modules
})
]
CommonsChunkPlugin lets webpack crawl through your entry files and identify common modules that are shared among them. This means that even if you import module1 in different entry files, it will be compiled only once in your final bundle.
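If you want webpack to pull out modules shared between several entry points automatically, rather than listing them by hand in a vendor entry, something like this could work (a sketch for webpack 2/3, where CommonsChunkPlugin still exists; the chunk name 'common' is arbitrary):
new webpack.optimize.CommonsChunkPlugin({
  name: 'common', // emitted as common.js
  minChunks: 2    // a module must be used by at least 2 entry chunks to be moved here
})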
I have this file, backend-dev.js, which is mostly webpack configuration. I use it to run my Express server from the bundled file. It stays running and restarts the server on any change. Is there any configuration that can be added to auto-refresh the browser too whenever I change the code?
This is what I have in backend-dev.js:
const path = require('path');
const webpack = require('webpack');
const spawn = require('child_process').spawn;
const HtmlWebpackPlugin = require('html-webpack-plugin');
const compiler = webpack({
// add your webpack configuration here
entry: './app.js',
output: {
path: path.resolve(__dirname, 'dist'),
filename: "bundle.js"
},
module: {
rules: [
{
test: /\.js$/,
exclude: /node_modules/,
use: {
loader: 'babel-loader',
options: {
presets: ['env', 'es2015', 'stage-2']
}
}
}
]
},
/*plugins: [
new HtmlWebpackPlugin({
title: 'Project Demo',
hash: true,
template: './views/index.ejs' // Load a custom template (ejs by default see the FAQ for details)
})
],*/
node: {
__dirname: false
},
target: 'node'
});
const watchConfig = {
// compiler watch configuration
// see https://webpack.js.org/configuration/watch/
aggregateTimeout: 300,
poll: 1000
};
let serverControl;
compiler.watch(watchConfig, (err, stats) => {
if (err) {
console.error(err.stack || err);
if (err.details) {
console.error(err.details);
}
return;
}
const info = stats.toJson();
if (stats.hasErrors()) {
info.errors.forEach(message => console.log(message));
return;
}
if (stats.hasWarnings()) {
info.warnings.forEach(message => console.log(message));
}
if (serverControl) {
serverControl.kill();
}
// change filename to the relative path to the bundle created by webpack, if necessary(choose what to run)
serverControl = spawn('node', [path.resolve(__dirname, 'dist/bundle.js')]);
serverControl.stdout.on('data', data => console.log(data.toString()));
serverControl.stderr.on('data', data => console.error(data.toString()));
});
Excellent question. I solved it the following way: there are two modules you can install alongside Express so that the browser refreshes automatically on changes, without you reloading it by hand. Here is my configuration.
npm i livereload
npm i connect-livereload
const path = require('path');
const express = require('express');
const livereload = require('livereload');
const connectLivereload = require('connect-livereload');

// Start a livereload server and watch the built frontend files.
const liveReloadServer = livereload.createServer();
liveReloadServer.watch(path.join(__dirname, '..', 'dist', 'frontend'));

const app = express();
// Inject the livereload client script into every page Express serves.
app.use(connectLivereload());
Note that the folder being watched here is dist/frontend; change it to the folder you want to monitor. To restart the backend on changes I am using nodemon.
livereload opens a port that the backend uses to notify the browser of changes. How does the connection happen? It's simple: through connect-livereload, Express injects a script into the served pages that listens on that port; when a change occurs, Express notifies the browser, and the browser refreshes for you.
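If you also want the browser to refresh after nodemon restarts the backend, a pattern from the linked article can be added (a sketch; it assumes the livereload server object exposes its underlying websocket server and a refresh() method):
// When the restarted server gets its first livereload connection,
// tell the browser to do a full refresh shortly afterwards.
liveReloadServer.server.once('connection', () => {
  setTimeout(() => {
    liveReloadServer.refresh('/');
  }, 100);
});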
I've kept the setup as simple as possible for testing, and I don't recommend using it exactly like this: you should separate the development and production environments. I keep it simple here so that it is easy to understand.
Here is the most relevant link I found; I hope it will be helpful too.
https://bytearcher.com/articles/refresh-changes-browser-express-livereload-nodemon/