Hot-reload doesn't inject changes on file change - javascript

I have the following problem and have spent a lot of time trying to solve it, without success. I have a folder that contains all the labels of my website in X languages. The problem is that when I make a change in that folder, it is not updated in the browser the way it is when I change my layouts or content. What is strange is that when I update a file in that folder, I can see in the terminal that the change was detected, but when I check dist/**.html, the new label is not there; it is not injected. I think it is a problem with Browsersync.
I tried playing with addWatchTarget() and setBrowserSyncConfig({ files: [...] }) in the .eleventy.js config file, but without success; the changes are still not injected.
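For reference, this is roughly the kind of configuration I experimented with (a sketch; the exact paths and globs I tried varied):

// in .eleventy.js (sketch of what I tried)
module.exports = (config) => {
  // Watch the translations folder so Eleventy rebuilds when a label changes
  config.addWatchTarget('./src/translations/');

  // Ask Browsersync to reload once the generated HTML changes
  config.setBrowserSyncConfig({
    files: ['dist/**/*.html'],
  });

  // ...rest of the config as shown below
};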
Also, I found a working case: I update my translations file, then simply save .eleventy.js to trigger the watcher. That injects the new labels into dist/**.html.
Project with the issue
Here is my structure
i18n
-- index.js // custom script to pick the good label in the translations folder
src
-- _assets
-- _includes
-- content
-- translations
---- index.js
---- en
---- jp
.eleventy.js
...
.eleventy.js
const i18n = require('./i18n');
const translations = require('./src/translations');

// LANGUAGES and DEFAULT_LANGUAGE are project constants defined elsewhere
// (not shown in this snippet)
module.exports = (config) => {
  // Custom i18n filter
  config.addFilter('i18n', (key, page, data) => {
    const pluginOptions = {
      translations,
      languages: LANGUAGES,
      defaultLanguage: DEFAULT_LANGUAGE,
    };
    return i18n(key, data, pluginOptions, page);
  });

  // Needed to prevent Eleventy from ignoring changes to our
  // template files since they are in our `.gitignore`
  config.setUseGitIgnore(false);

  config.addPassthroughCopy({
    'src/misc/favicon.png': 'favicon.png',
    'src/misc/robots.txt': 'robots.txt',
  });

  return {
    dir: {
      input: 'src',
      output: 'dist',
      includes: '_includes/partials',
      layouts: '_includes/layouts',
    },
  };
};
src/translations/index.js
module.exports = {
  hello: 'Hello',
};
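For context, the custom i18n helper in i18n/index.js is essentially a key lookup along these lines (a simplified sketch, not the project's actual code; the real implementation and the shape of the translations object may differ):

// i18n/index.js -- simplified sketch, not the actual implementation
module.exports = function i18n(key, data, options, page) {
  const { translations, languages, defaultLanguage } = options;
  // Assumption: the language is derived from the page URL, e.g. /jp/... -> 'jp'
  const lang =
    languages.find((l) => page && page.url && page.url.startsWith(`/${l}/`)) ||
    defaultLanguage;
  const table = translations[lang] || translations;
  // Fall back to the key itself when no label is found
  return (table && table[key]) || key;
};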
Here are the logs when I save a change to a src/translations/**/*.js file. We can see Browsersync reloading before the write happens and then reloading again after the write completes, but the change is not injected into dist/**.html:
[start:eleventy] File changed: src/translations/index.js
[start:eleventy] [Browsersync] Reloading Browsers...
[start:eleventy] Writing dist/index.html from ./src/content/home.liquid.
...
[start:eleventy] Copied 4 files / Wrote 68 files in 0.90 seconds (13.2ms each, v0.11.1)
[start:eleventy] Watching…
[start:eleventy] [Browsersync] Reloading Browsers...
And here are the logs when I save .eleventy.js; we can see some differences:
[start:eleventy] File changed: .eleventy.js
[start:eleventy] Writing dist/index.html from ./src/content/home.liquid.
...
[start:eleventy] Copied 4 files / Wrote 68 files in 0.75 seconds (11.0ms each, v0.11.1)
[start:eleventy] [Browsersync] Reloading Browsers...
Has anyone encountered this or has a workaround?

Related

Webpack 5 IgnorePlugin - Not ignoring JS file from output on CSS files only?

I am trying to use Webpack's IgnorePlugin. I am using my Webpack config only to create a CSS file, but on build it also outputs a JS file, which I don't want. I tried ignoring JS files, but it still outputs them.
new webpack.IgnorePlugin(/^\.\/js\/(?!admin)/),
It outputs to the ROOT folder, so I want to exclude all JS files from the output in the root folder. "admin" is the file being created.
How can I do this?
To properly answer your question, it'd be helpful if you posted a link to the full WP config file and an example of the file that's being processed.
Also, you mentioned you're only using WP to create a CSS file; does that mean you're just trying to use something like SASS, Stylus, Less, etc.? If so, you could probably just set up a package.json script to compile your CSS without WP.
For example, if you have a .scss file, you could install node-sass and create a simple Node script that compiles whatever file you pass in as an argument.
bin/
- build-css.js
src/
- styles.sass
Within build-css.js
#!/usr/bin/env node
const { writeFile } = require('fs');
const { basename, resolve } = require('path');
const sass = require('node-sass');

// Files to compile are passed as CLI arguments, relative to the working directory
const files = process.argv.slice(2);

if (files.length) {
  files.forEach((relativeFilePath) => {
    const fileName = basename(relativeFilePath, '.scss');
    const outFile = resolve(process.cwd(), `./public/css/${fileName}.css`);

    sass.render(
      {
        file: resolve(process.cwd(), relativeFilePath),
        outFile, // node-sass only uses this for source-map paths
      },
      (err, result) => {
        if (err) return console.error(err);
        // node-sass does not write to disk itself, so write the compiled CSS
        writeFile(outFile, result.css, (writeErr) => {
          if (writeErr) console.error(writeErr);
        });
      }
    );
  });
} else {
  console.log('No files were provided to process');
}
Within package.json
"scripts": {
"build:css": "node ./bin/build-css.js"
}
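With that in place, you'd pass the file to compile as an argument, e.g. npm run build:css -- src/styles.scss (the extra -- forwards the argument to the Node script).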
The above has the benefit of giving you more granular control over how your files are processed, and you're only locked in to SCSS instead of both Webpack and SCSS.
If you're using WP for its file-watching capabilities, you could instead wire up chokidar to run the new script when files change, as sketched below.
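A minimal sketch of that idea (file paths and globs here are assumptions; adjust them to your project):

// watch-css.js -- rerun the build script whenever a stylesheet changes
const chokidar = require('chokidar');
const { execFile } = require('child_process');

chokidar.watch('src/**/*.scss').on('change', (changedFile) => {
  console.log(`Rebuilding ${changedFile}...`);
  execFile('node', ['./bin/build-css.js', changedFile], (err, stdout, stderr) => {
    if (err) console.error(err);
    if (stdout) console.log(stdout);
    if (stderr) console.error(stderr);
  });
});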

How to ignore multiple directories and files using electron-reload

I have an Electron project using electron-reload; the file structure is:
./public/templates/ --------- contains some html that should be watched and trigger reload
./public/templates/test/ ---- contains some test/experimental html that should be ignored/not trigger reload
./notes --------------------- just containing some notes, this should not trigger reload
./node_modules -------------- should be ignored
./main.js ------------------- should trigger reload on change
I know I can ignore directories by setting the ignored option when requiring electron-reload in main.js.
This uses chokidar for filtering; the ignored option is described here.
But I don't get how to ignore multiple paths.
I tried to require electron-reload like this:
require('electron-reload')(__dirname, {ignored: /node_modules|notes|public\/templates\/test/gi});
But when I change a file in public/templates/test/, the application still reloads.
electron 5.0.6
electron-reload 1.5.0
Thanks in advance
Kevin
You can check the electron-reload module.
electron_reload(paths, options):
paths: a file, directory or glob pattern to watch
options will default to {ignored: /node_modules|[\/\\]\./, argv: []}.
Then, in the ignored field, you put regular expressions for what you want to ignore. You can check regular expressions here.
For example, if I want to ignore the folders node_modules, files/img, and resources within my directory, main.js would be:
// main.js
const ignoredNode = /node_modules|[/\\]\./;
const ignored1 = /resources|[/\\]\./;  // the resources folder
const ignored2 = /files\/img|[/\\]\./; // the files/img folder

if (process.env.NODE_ENV !== 'production') {
  require('electron-reload')(__dirname, { ignored: [ignored1, ignored2, ignoredNode] });
}
This prevents an application reload whenever there are changes within the folders node_modules, files/img, and resources.
Now you just have to adapt the regular expressions to your own paths, for example:
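Applied to the paths from the question (node_modules, notes, public/templates/test), it would look something like this sketch; the exact patterns may need tweaking for your layout:

// main.js -- one regular expression per path to ignore, passed as an array
const ignoredNodeModules = /node_modules|[/\\]\./;
const ignoredNotes = /notes/;
const ignoredTestTemplates = /public[/\\]templates[/\\]test/;

require('electron-reload')(__dirname, {
  ignored: [ignoredNodeModules, ignoredNotes, ignoredTestTemplates],
});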

vue application does not update when using webpack chunks

I have a mysterious behaviour here using webpack in a Vue application.
I have some view paths stored in my database, and I import the views dynamically in a for loop, like this:
for (let i = 0; i < routesList.length; i++) {
  const viewPath = routesList[i];
  const view = () => import('views/' + viewPath);
  /* create a vue router object here */
}
This works absolutely fine, but as soon as I use webpackChunkName, any future changes in my Vue files no longer get compiled. The application seems to use some sort of cached files, although npm watch recognizes the changes and recompiles correctly.
const view = () => import(/* webpackChunkName: "views" */ 'views/' + viewPath);
Another strange thing to notice is that the chunk gets named like
views0
views2
views4
etc.
The application runs on a Debian 8 distribution. These are the output settings in my webpack.mix.js file:
output: {
  path: path.resolve(__dirname, 'demo'),
  publicPath: '/demo/', // due to shared hosting
  filename: `[name].${mix.inProduction() ? '[chunkhash].' : ''}js`,
  chunkFilename: `[name].${mix.inProduction() ? '[chunkhash].' : ''}js`
}
Has anybody faced something similar? I am really clueless about what is wrong... I even tried deleting the compiled chunks and mix-manifest.json in the output folder and recompiling over and over again, but that does not change the situation.
Related
https://github.com/webpack/webpack/issues/2530
https://github.com/webpack/webpack-dev-server/issues/875
https://github.com/webpack/webpack-dev-server/issues/885
//Edit
9/27 - Regarding the related issues on GitHub, could there be a problem with the paths in my webpack config file?
9/28 - Looks like this happens irregularly. Updating my npm packages yesterday made it work for a moment; today I'm stuck at the same point again.

Dynamically load chunks in webpack?

We load files dynamically, i.e., we don't know which files will be loaded until runtime. At the same time, for faster loading, we'd like to put related files in the same chunk.
How can I do that with webpack?
This is what we have and it's failing with a 404 error (1.1.bundle.js not found)
This is what webpack.config looks like:
entry: {
  main: //...,
  related_files: [ // should create a chunk for file1 and file2?
    './file1.js',
    './file2.js'
  ]
},
This is what the code to dynamically load the files looks like:
var dynamicFileName = //...
require.ensure([], function (require) {
  // should dynamically load the chunk containing dynamicFileName?
  // fails with 'file1.js' or 'file2.js'
  var modImpl = require(dynamicFileName);
  //...
});
Update 1: the error message is caused by not configuring output.publicPath. However, I never created 1.1.bundle.js. It seems to be ignoring the entry point.
Update 2: even after fixing output.publicPath, it's unable to load a dynamically generated filename. So it seems that webpack cannot handle this.
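For reference, the fix mentioned in update 1 amounts to telling webpack where async chunks are served from; a sketch (the directory and URL here are assumptions):

// webpack.config.js (sketch)
output: {
  path: path.join(__dirname, 'build'),
  filename: '[name].bundle.js',
  // publicPath is the URL prefix the runtime uses when it requests
  // async chunks such as 1.1.bundle.js
  publicPath: '/build/'
},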
By default, webpack tries to bundle all the code into a single file. If you're using code from file1.js/file2.js in the main entry point, webpack will bundle the contents of all the files into main.js, and the second entry point, related_files, will output only the file1/file2 contents.
Webpack handles this situation with CommonsChunkPlugin; your config must look like this:
entry: {
  main: //...,
  related_files: ['./file1.js', './file2.js']
},
plugins: [
  new webpack.optimize.CommonsChunkPlugin('related_files', 'related_files.js')
]
The second part of the question is that webpack parses the require statement and outputs 1.1.bundle.js, the dynamic module that can be loaded with require in the code. In your case, dynamicFileName is 'related_files', not file1/file2.
Please see http://webpack.github.io/docs/code-splitting.html#split-app-and-vendor-code

Gulp doesn't copy all files as expected

I tried to create a gulpfile.js for my personal website project. I've never done this before, but with a little trial and error it now works in an acceptable way.
The only thing that doesn't work, even after 1000 modifications, is simply copying files and folders.
var files = {
  data_src: [
    './files.json',
    'data/**/*.*'
  ],
  distribution_dest: '_distribution'
};
gulp.task('copy-data', function() {
  gulp.src(files.data_src, { base: './' })
    .pipe(gulp.dest(files.distribution_dest))
    .pipe(notify({ message: 'Data copied for distribution!' }));
});
This should copy all sub-folders and files to the gulp.dest, but it copies only half of them; some folders are ignored even if I change their names, etc. (no special characters, same subfolder structure as the ones that got copied correctly...). Nothing worked. I just can't see any pattern in this.
There is no error message while running gulp. Nothing that would help me find the error.
Why are some folders or files excluded from copying?
I use base to keep the folder/sub-folder structure; I tried with and without 'base', with no effect on the copying process.
I also changed the position of the 'copy-data' task in the run list. Currently it's the first task to run, but there seems to be no change in behaviour whether it's the first or the last one.
gulp.task('default', function() {
  gulp.run('copy-data', 'custom-sass', 'framework-sass', 'custom-js', 'framework-js', 'replace-tags', 'browser-sync');
  // ... some watches ...
});
The structure of the data folder looks like this:
./data
|-doc
|---content
|---template
|-img
|---chart
|---icon
|---logo
|---pattern
|---people
|---photo
|---symbol
|-----brandklassen
|-----brandschutzzeichen
|-----gebotszeichen
|-----gefahrensymbole
|-----rettungszeichen
|-----verbotszeichen
|-----verkehrsrechtzeichen
|-----warnzeichen
|---wallpaper
/data/doc and all subfolders are ok.
/data/img/chart to /data/img/people are also ok.
Within /data/img/photo only 14 out of 21 images are copied.
/data/img/symbol with sub-folders and /data/img/wallpaper were ignored completely.
SOLVED IT MYSELF! The problem was caused by tasks running asynchronously. Adding a return forces gulp to complete the copying process before continuing!
gulp.task('copy-data', function() {
  return gulp.src(files.data_src, { base: './' })
    .pipe(gulp.dest(files.distribution_dest))
    .pipe(notify({ message: 'Data copied for distribution!' }));
});
Now all images will be copied!
