webpack watch doesn't trigger - javascript

I run webpack on a filesystem with native file-watching (inotify) support; I also tested with chokidar, and it correctly picked up all changes to files on that filesystem/folder.
I'm using the webpack configuration from the Sage 9 WordPress base theme (https://github.com/roots/sage/blob/c21df9965ff8217c3b4ff90bbe6099206d7b4fbf/resources/assets/config.json#L16).
Only the PHP files are listed there to be watched, but aren't their dependencies (SCSS/JS/...) watched as well?
I start webpack watch with an npm/yarn package script that runs webpack as
$ webpack --hide-modules --watch --config resources/assets/build/webpack.config.js
Webpack is watching the files…
[BS] [HTML Injector] Running...
[BS] Proxying: http://dev:8084
[BS] Access URLs:
----------------------------------
Local: http://localhost:3000
External: http://127.0.0.1:3000
----------------------------------
UI: http://localhost:3001
UI External: http://127.0.0.1:3001
----------------------------------
[BS] Watching files...
However, changing files - even just the PHP files explicitly listed in the watch array above - doesn't trigger any rebuild by webpack.
What config could be missing? Is there any way to find out what exactly webpack is watching, and whether it actually detects a change (and just ignores it) or not?

The reason was CORS: the script that should load new webpack builds was blocked by the browser due to CORS restrictions. It is possible to configure BrowserSync to send the proper CORS headers; also see https://discourse.roots.io/t/sage-9-browsersync-not-loading-any-css-at-all-on-yarn-run-start/11332/26 .
new BrowserSyncPlugin({
  target,
  open: config.open,
  proxyUrl: config.proxyUrl,
  watch: config.watch,
  delay: 500,
  advanced: {
    browserSync: {
      cors: true,
    },
  },
}),
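As for the second part of the question (seeing which files webpack's watcher actually registers and which change triggers a rebuild): a small plugin hooked into the compiler's watchRun hook can log it. Below is a minimal sketch, not part of the Sage 9 setup; the plugin name is made up, and compiler.modifiedFiles only exists in webpack 5 (older versions expose changed files through the watch file system instead), so treat it as an illustration rather than a drop-in.

// Hypothetical LogChangedFilesPlugin - a sketch assuming webpack 5
class LogChangedFilesPlugin {
  apply(compiler) {
    compiler.hooks.watchRun.tap('LogChangedFilesPlugin', (comp) => {
      // compiler.modifiedFiles is a Set of absolute paths (webpack 5 only)
      const changed = comp.modifiedFiles ? [...comp.modifiedFiles] : [];
      console.log('[watch] rebuild triggered by:', changed);
    });
  }
}

// webpack.config.js
module.exports = {
  // ...existing config...
  plugins: [new LogChangedFilesPlugin()],
};

Running webpack --watch with this in place prints the changed paths before each rebuild, which makes it easy to tell whether webpack never sees the change at all or sees it and produces no output.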

Related

We're sorry but sf doesn't work properly without JavaScript enabled. Please enable it to continue

I have a Vue.js app that makes an Axios call. I Dockerized it and used DigitalOcean to serve it in the cloud. The app is working, but after any Axios call I get "We're sorry but sf doesn't work properly without JavaScript enabled. Please enable it to continue." as the response.
I have tried changing the port from 8080 to 8081, running the app in an incognito browser tab, and changing "baseUrl" to "baseURL"; in the frontend Dockerfile I was installing libraries with npm, and I also tried yarn, but I still have the issue.
Are there any other ideas on how to fix it?
main.js file:
createApp(App)
  .use(store)
  .use(vue3GoogleLogin, {
    clientId: "******",
  })
  .component("font-awesome-icon", FontAwesomeIcon)
  .component("MazBtn", MazBtn)
  .component("MazInput", MazInput)
  .component("MazPhoneNumberInput", MazPhoneNumberInput)
  .component("Datepicker", Datepicker)
  // .component("VueGlide", VueGlide)
  .use(router)
  .mount("#app");
Frontend Dockerfile:
# Base image
FROM node:lts-alpine
# Install the serve package
RUN npm i -g serve
# Set the working directory
WORKDIR /app
# Copy package.json and package-lock.json
COPY package*.json ./
# Install project dependencies
RUN npm install
# Copy the project files
COPY . .
# Build the project
RUN npm run build
# Expose a port
EXPOSE 3000
# Executables
CMD [ "serve", "-s", "dist" ]
Backend Dockerfile:
FROM python:3.10-bullseye
# Working directory
WORKDIR /app
# Copy the dependencies
COPY ./docs/requirements.txt /app
# Install the dependencies
RUN pip3 install -r requirements.txt
# Copy the files
COPY . .
WORKDIR /app/backend
ENV FLASK_APP=app.py
# Executable commands
CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0"]
vue.config.js file:
const { defineConfig } = require("@vue/cli-service");

module.exports = defineConfig({
  transpileDependencies: true,
  devServer: {
    compress: true,
    host: "127.0.0.1",
    proxy: {
      // "/upload/img": {
      //   // target: "http://127.0.0.1:9000",
      //   target: "http://127.0.0.1:5000",
      // },
      "/api": {
        // target: "http://127.0.0.1:9000",
        target: "http://127.0.0.1:5000",
      },
      "/media": {
        target: "http://localhost:8000",
      },
      "/http-bind": {
        target: "https://localhost:8443",
        logLevel: "debug",
      },
    },
    // https: true,
    // watchContentBase: false,
  },
});
Your backend requests are being handled by the frontend app. It looks like you are relying on the development server's proxy functionality to forward them to the backend, but that proxy will not (and should not) be active when you deploy your app. Instead, you will need another reverse proxy in front of the deployed containers that sends /api requests to your backend and everything else to your frontend.
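For illustration only (a sketch, not the poster's setup, assuming Express 4 and http-proxy-middleware v2 are installed): a small Node server that serves the built dist/ folder and forwards /api to the Flask container. The hostname "backend" is a hypothetical docker-compose service name, and port 5000 is taken from the proxy target in vue.config.js above; only the /api route is shown.

// server.js - minimal sketch of a production-side proxy + static server
const path = require("path");
const express = require("express");
const { createProxyMiddleware } = require("http-proxy-middleware");

const app = express();

// Forward /api/... to the Flask backend, keeping the /api prefix
// (same behaviour as the dev-server proxy above)
app.use(createProxyMiddleware("/api", { target: "http://backend:5000", changeOrigin: true }));

// Serve the built frontend
app.use(express.static(path.join(__dirname, "dist")));

// Send index.html for any other route so the Vue router can handle it
app.get("*", (req, res) => {
  res.sendFile(path.join(__dirname, "dist", "index.html"));
});

app.listen(3000, () => console.log("Frontend + proxy listening on port 3000"));

In practice the same job is often done by nginx in front of the two containers; the point is that something other than the Vue CLI dev server has to do the /api forwarding once the app is deployed.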

Webpacker error when running server for new Rails app

I just created a brand new Rails app and am receiving this error when I run the server:
Webpacker can't find application.js in /public/packs/manifest.json. Possible causes:
1. You want to set webpacker.yml value of compile to true for your environment
unless you are using the `webpack -w` or the webpack-dev-server.
2. webpack has not yet re-run to reflect updates.
3. You have misconfigured Webpacker's config/webpacker.yml file.
4. Your webpack configuration is not creating a manifest.
Your manifest contains:
{
}
Here's my webpacker.yml:
# Note: You must restart bin/webpack-dev-server for changes to take effect
default: &default
  source_path: app/javascript
  source_entry_path: packs
  public_output_path: packs
  cache_path: tmp/cache/webpacker
  webpack_compile_output: false

  # Additional paths webpack should lookup modules
  # ['app/assets', 'engine/foo/app/assets']
  resolved_paths: []

  # Reload manifest.json on all requests so we reload latest compiled packs
  cache_manifest: false

  extensions:
    - .js
    - .sass
    - .scss
    - .css
    - .module.sass
    - .module.scss
    - .module.css
    - .png
    - .svg
    - .gif
    - .jpeg
    - .jpg

development:
  <<: *default
  compile: true

  # Reference: https://webpack.js.org/configuration/dev-server/
  dev_server:
    https: false
    host: localhost
    port: 3035
    public: localhost:3035
    hmr: false
    # Inline should be set to true if using HMR
    inline: true
    overlay: true
    compress: true
    disable_host_check: true
    use_local_ip: false
    quiet: false
    headers:
      'Access-Control-Allow-Origin': '*'
    watch_options:
      ignored: '**/node_modules/**'

test:
  <<: *default
  compile: true

  # Compile test packs to a separate directory
  public_output_path: packs-test

production:
  <<: *default

  # Production depends on precompilation of packs prior to booting for performance.
  compile: false

  # Cache manifest.json for performance
  cache_manifest: true
After running ./bin/webpack, it recommended I install webpack-cli via yarn add -D.
After I did this, I got the following error:
Error: Cannot find module 'webpack-cli'
Require stack:
- ../node_modules/webpack/bin/webpack.js
    at Function.Module._resolveFilename (internal/modules/cjs/loader.js:1030:15)
    at Function.Module._load (internal/modules/cjs/loader.js:899:27)
    at Module.require (internal/modules/cjs/loader.js:1090:19)
    at require (internal/modules/cjs/helpers.js:75:18)
    at ../node_modules/webpack/bin/webpack.js:143:5
    at processTicksAndRejections (internal/process/task_queues.js:93:5) {
  code: 'MODULE_NOT_FOUND',
  requireStack: [
    '../node_modules/webpack/bin/webpack.js'
  ]
}
I don't have a manifest.json - shouldn't this have been created automatically?
Any ideas?
The problem comes from the version of node on your local computer. When you created your rails app, you probably had an error like this pop up:
The JavaScript app source directory already exists
apply /Users/mconiaris/.rbenv/versions/2.6.4/lib/ruby/gems/2.6.0/gems/webpacker-4.0.7/lib/install/binstubs.rb
Copying binstubs
exist bin
create bin/webpack
create bin/webpack-dev-server
append .gitignore
Installing all JavaScript dependencies [4.0.7]
run yarn add @rails/webpacker from "."
yarn add v1.17.3
info No lockfile found.
[1/4] 🔍 Resolving packages...
warning @rails/webpacker > postcss-preset-env > postcss-color-functional-notation > postcss-values-parser > flatten@1.0.2: I wrote this module a very long time ago; you should use something else.
[2/4] 🚚 Fetching packages...
error get-caller-file@2.0.5: The engine "node" is incompatible with this module. Expected version "6.* || 8.* || >= 10.*". Got "9.4.0"
error Found incompatible module.
info Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.
To fix the problem, first confirm your version of Node to make sure it's not 9.x:
node -v
If it is, upgrade Node (for example with nvm or your system package manager) and check the version again. If it's 12.10 or higher, you should be all set.
Then go start another brand new Rails project and it should work fine.
Good luck!

How do I set up AWS Cloud9 to run an existing JavaScript app with webpack-dev-server (in development mode)?

I am trying to get my fairly typical JavaScript (React) app to run in dev mode on AWS Cloud9. I successfully cloned my repo (over HTTPS, ugh), installed my npm packages, and can run scripts in the console. However, I don't know how to run and access the app in dev mode. There is a plethora of docs, but they all seem to dance around the running part. My guess is that I need to somehow set a custom host and port, but I also need to find out what URL to use to see the app running.
Here is my devServer config:
devServer: {
  // Display only errors to reduce the amount of output.
  stats: "errors-only",
  host, // Defaults to `localhost`
  port, // Defaults to 8080
  overlay: {
    errors: true,
    warnings: true,
  },
}
If anyone comes across this, I wanted to share my solution because I know how frustrating this can be:
First, create a script in your package.json file:
"start": "webpack-dev-server --open"
Then, add the following to your Webpack config file:
devServer: {
  contentBase: path.join(__dirname, 'dist'),
  host: '0.0.0.0',
  port: 8080,
  compress: true,
}
Then, open the terminal in AWS Cloud 9, and run the script:
npm start
Finally, click on the link in the terminal: "Project is running at http://0.0.0.0:8080/" and your app will show in a new window.
If it doesn't work, don't forget to allow port 80 in your Cloud9 security group: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/working-with-security-groups.html#adding-security-group-rule
If you want to view the project in the preview pane, you can add the following to your devServer config:
disableHostCheck: true,
However, it's important to note that when set to true, this option bypasses host checking. THIS IS NOT RECOMMENDED as apps that do not check the host are vulnerable to DNS rebinding attacks.
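A somewhat safer alternative, assuming webpack-dev-server 3.x, is to whitelist the preview host instead of turning host checking off completely; the domain below is a placeholder for your actual Cloud9 preview domain.

devServer: {
  host: '0.0.0.0',
  port: 8080,
  // Accept requests for this host only, instead of disableHostCheck: true
  allowedHosts: ['.c9users.io'], // placeholder - use your environment's preview domain
}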
1) The first thing you need to do is run the React app on port 8080. You can do this by setting the environment variable PORT to 8080 and then starting the React dev server from the AWS Cloud9 terminal.
export PORT=8080
npm start
For details look at this discussion on GitHub.
2) After starting your application you can preview it by clicking Preview -> Preview Running Application at the top of AWS Cloud9.
For more details check this AWS Cloud9 doc
In webpack.config.js:
devServer: {
  historyApiFallback: true,
  contentBase: './',
  host: process.env.IP,
  // https: true,
  port: process.env.PORT,
  "public": "your-project.c9users.io" // no trailing slash
},
Refer Link

Grunt dev server to allow push states

I am trying to set up my Grunt server to allow push states.
After countless Google searches and reading SO posts, I cannot figure out how to do this.
I keep getting errors like the one below.
Does anyone have any ideas on how to fix this?
No "connect" targets found. Warning: Task "connect" failed. Use --force to continue.
It appears to me that I have defined targets below, with this block:
open: {
  target: 'http://localhost:8000'
}
See complete code below:
var pushState = require('grunt-connect-pushstate/lib/utils').pushState;

module.exports = function (grunt) {
  // Project configuration.
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    connect: {
      options: {
        hostname: 'localhost',
        port: 8000,
        keepalive: true,
        open: {
          target: 'http://localhost:8000'
        },
        middleware: function (connect, options) {
          return [
            // Rewrite requests to root so they may be handled by router
            pushState(),
            // Serve static files
            connect.static(options.base)
          ];
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify'); // Load the plugin that provides the "uglify" task.
  grunt.loadNpmTasks('grunt-contrib-connect'); // Load the plugin that provides the "connect" task.

  // Default task(s).
  grunt.registerTask('default', ['connect']);
};
Push states are already included in most SPA frameworks, so you might not need this unless you're building a framework.
Angular: https://scotch.io/tutorials/pretty-urls-in-angularjs-removing-the-hashtag
React: How to remove the hash from the url in react-router
This looks like a grunt build script to compile an application to serve. So I'm not exactly sure how you'd use pushStates in the build process. You may be trying to solve the wrong problem.
Don't bother with grunt to deploy a local dev pushstate server for your SPA.
In your project directory, install https://www.npmjs.com/package/pushstate-server
npm i pushstate-server -D
Then to launch it, add a script entry in the package.json of your project:
…
"scripts": {
  "dev": "pushstate-server"
}
…
This way you can now start it by running npm run dev.
All the requests which would normally end in a 404 will now redirect to index.html.
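If you do want to stay with grunt-contrib-connect, note that it is a multi-task: the "No connect targets found" warning means the config above only defines options and no named target, so wrapping the options in a target should clear it. A rough sketch of the relevant part of the Gruntfile (the target name "server" is arbitrary):

connect: {
  server: {                          // an actual target; the original config only had options
    options: {
      hostname: 'localhost',
      port: 8000,
      keepalive: true,
      middleware: function (connect, options) {
        return [
          pushState(),                  // rewrite requests to root so the client-side router handles them
          connect.static(options.base)  // serve static files, as in the original config
        ];
      }
    }
  }
}

With a target defined, grunt connect:server (or the default task, which runs all connect targets) should start the server instead of complaining about missing targets.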

How do I get karma webdriver launcher to use my selenium server/grid

Can anyone help shed some light on what is stopping the Karma JavaScript test runner from connecting to and using my Selenium grid/server?
I have a successfully working selenium grid environment that I already use with python selenium bindings for web application system testing. I'm running Selenium Server v.2.34.0 currently and it has 4 separate grid nodes connected to it.
I want to also leverage and reuse this resource for JavaScript testing against multiple browsers. Specifically, I'm using the Node.js-based Karma test runner executing Jasmine-based unit tests. I've installed the karma-webdriver-launcher plugin. I can run my JavaScript tests with Karma locally, spawning Firefox, Chrome, or IE browsers just fine.
When I try to use the remote selenium server to use a browser from the pool/farm, it fails to find a browser and I get the following warning output:
DEBUG [config]: autoWatch set to false, because of singleRun
DEBUG [plugin]: Loading karma-* from C:\nodedebug\itpt\node_modules
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-chrome-launcher.
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-firefox-launcher.
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-html-reporter.
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-ie-launcher.
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-jasmine.
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-phantomjs-launcher.
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-typescript-preprocessor.
DEBUG [plugin]: Loading plugin C:\nodedebug\itpt\node_modules/karma-webdriver-launcher.
DEBUG [plugin]: Loading inlined plugin (defining launcher:firefox).
INFO [karma]: Karma v0.12.16 server started at http://localhost:9876/
INFO [launcher]: Starting browser firefox via Remote WebDriver
WARN [launcher]: firefox via Remote WebDriver have not captured in 60000 ms, killing.
Killed Karma test.
When I debug karma-webdriver-launcher at runtime, stepping through it, it seems to fail in the wd request. My Node.js skills are only at a beginner level, though, so I might be missing something obvious. All the config details seemed to be getting passed correctly, and the URL used for connecting to the Selenium server looks correct to me.
It fails inside this call of karma-webdriver-launcher\node_modules\wd\lib\webdriver.js on line 33,
this._request(httpOpts, function(err, res, data) {
Here's my karma.config.js file:
// Karma configuration
module.exports = function (config) {
  var webdriverConfig = {
    hostname: '172.17.126.52',
    port: 9625
  }

  config.set({
    // base path that will be used to resolve all patterns (eg. files, exclude)
    basePath: '',

    // frameworks to use
    frameworks: ['jasmine'],

    // list of files / patterns to load in the browser
    files: [
      'Scripts/Libs/JsResources.js',
      // watch every ts file
      'Scripts/Local/**/*.ts'
    ],

    // preprocess matching files before serving them to the browser
    // available preprocessors: https://npmjs.org/browse/keyword/karma-preprocessor
    preprocessors: {
      '**/*.ts': ['typescript']
    },

    typescriptPreprocessor: {
      // options passed to the typescript compiler
      options: {
        target: 'ES5', // (optional) Specify ECMAScript target version: 'ES3' (default), or 'ES5'
      }
    },

    // test results reporter to use
    // possible values: 'dots', 'progress'
    // available reporters: https://npmjs.org/browse/keyword/karma-reporter
    reporters: ['dots', 'html'],

    // web server port
    port: 9876,

    // enable / disable colors in the output (reporters and logs)
    colors: true,

    // level of logging
    // possible values: config.LOG_DISABLE || config.LOG_ERROR || config.LOG_WARN || config.LOG_INFO || config.LOG_DEBUG
    logLevel: config.LOG_DEBUG,

    // enable / disable watching file and executing tests whenever any file changes
    autoWatch: true,

    customLaunchers: {
      'firefox': {
        base: 'WebDriver',
        config: webdriverConfig,
        browserName: 'firefox',
      }
    },

    // start these browsers
    // available browser launchers: https://npmjs.org/browse/keyword/karma-launcher
    browsers: ['firefox'],

    // Continuous Integration mode
    // if true, Karma captures browsers, runs the tests and exits
    singleRun: true
  });
};
I've got it working. Two problems had to be fixed from my initial post.
1st Fix: Inside the karma.conf.js file, I had to set the hostname to the IP address of the machine that was running Karma (i.e. my local machine). Don't set this to the IP address of the Selenium grid hub.
config.set({
  ...
  hostname: '172.123.123.123',
  ...
})
2nd Fix: My karma project package.json file was missing this line in the devDependencies dictionary.
"karma-webdriver-launcher": "~0.2.0",
So the package.json contents look like:
{
  "name": "MyKarmaProject",
  "devDependencies": {
    "karma": "~0.12.16",
    "karma-chrome-launcher": "~0.1.4",
    "karma-firefox-launcher": "~0.1.3",
    "karma-html-reporter": "^0.2.3",
    "karma-ie-launcher": "~0.1.5",
    "karma-webdriver-launcher": "~0.2.0",
    "karma-jasmine": "^0.2.2",
    "karma-phantomjs-launcher": "^0.1.4",
    "karma-typescript-preprocessor": "0.0.7",
    "phantomjs": "^1.9.7-12"
  }
}
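Putting both fixes together, the relevant pieces of karma.conf.js end up looking roughly like this (a condensed sketch of the snippets above; the IP addresses are the same placeholders):

// karma.conf.js (excerpt)
module.exports = function (config) {
  var webdriverConfig = {
    hostname: '172.17.126.52', // the Selenium grid hub
    port: 9625
  };

  config.set({
    // 1st fix: the IP of the machine running Karma, NOT the grid hub
    hostname: '172.123.123.123',
    customLaunchers: {
      'firefox': {
        base: 'WebDriver',
        config: webdriverConfig,
        browserName: 'firefox'
      }
    },
    browsers: ['firefox'],
    singleRun: true
  });
};

// 2nd fix: "karma-webdriver-launcher" must also be listed in devDependencies, as shown above.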
I believe that if you run npm install from your project directory's command prompt after the package.json file is updated, it will ensure everything is downloaded and installed correctly.
Once this was done, the Karma runner was able to connect to my Selenium grid hub and request the appropriate browser node from the pool. I hope this question and answer helps someone else in the future!
