I have a website running on Node.js. The problem is that when a user uploads images, the site stops working. I think that is because PM2 restarts the server when a file changes. How can I solve this problem?
Thank you.
PM2 has a special --ignore-watch flag.
Try creating a process.json file in the same directory as your app.js/index.js and paste this:
{
  "watch": ["server", "client"],
  "ignore_watch": ["node_modules", "public/images"],
  "watch_options": {
    "followSymlinks": false
  }
}
More on that topic: http://pm2.keymetrics.io/docs/usage/watch-and-restart/
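If you prefer not to use a config file, the same exclusion can be passed directly on the command line (a minimal sketch; app.js and the public/images path are placeholders for your own entry script and upload folder):
pm2 start app.js --watch --ignore-watch="node_modules public/images"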
A simple explanation, from actual experience:
Create a JSON file in the root folder of the Express application. It can have any name, but I used pm2-process.json for clarity:
{
  "script": "bin/www",
  "watch": true,
  "ignore_watch": ["log"],
  "watch_options": {
    "followSymlinks": false
  },
  "name": "YOUR_PM2_PROCESS_NAME"
}
To start your pm2 service from the terminal, in the root folder of the Express application:
pm2 start pm2-process.json
That's it. Really simple. There are many other options, but this is the bare functional minimum.
Fields explanation:
script - the script to run the express application
watch - a boolean flag to control if pm2 watches (or not) the folder
ignore_watch - if watch is on, then tell pm2 which folders to ignore watching (in other words, this is a watch monitor exclusion list)
name - the name of the pm2 process ('service'). Set it to your application name of choice.
The full documentation is here: http://pm2.keymetrics.io/docs/usage/application-declaration/#attributes-available
Note: I left the node_modules folder out of the ignore_watch array in the example above, because I want pm2 to restart the service after a git pull and npm i that changes the node modules. However, it is easy to ignore node_modules or any other folder (e.g., temp, public, etc.) by editing the array values.
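Applied to the original question, where user uploads land in the public images folder, a pm2-process.json along these lines should stop PM2 from restarting on uploads (a sketch; the script name, process name, and upload path are assumptions that must match your project):
{
  "script": "app.js",
  "watch": true,
  "ignore_watch": ["node_modules", "public/images", "log"],
  "watch_options": {
    "followSymlinks": false
  },
  "name": "my-upload-app"
}
Start it the same way: pm2 start pm2-process.json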
I can't get Next.js' Fast Refresh feature to work with a VS Code Remote Container. I can run npm run dev and see the app running on localhost on my machine, so the container works fine - only the Fast Refresh has no effect at all.
Next.js version: v11.0.1
I tried this both with Windows 10 and Ubuntu 20.04 (on WSL 2).
I already tried to use a custom webpack middleware in the next.config.js like so (see https://github.com/vercel/next.js/issues/2179#issuecomment-316568536):
module.exports = {
  webpackDevMiddleware: (config) => {
    // Solve compiling problem via vagrant
    config.watchOptions = {
      poll: 1000, // Check for changes every second
      aggregateTimeout: 300, // delay before rebuilding
    };
    return config;
  },
};
...which will trigger a recompile on code changes, but the browser does not update.
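For reference, the same watch options can also be set through the regular webpack key in next.config.js, applied only in dev mode (a sketch of the equivalent configuration, not a confirmed fix for the refresh issue):
// next.config.js
module.exports = {
  webpack: (config, { dev }) => {
    if (dev) {
      // poll the filesystem instead of relying on native file events,
      // which often don't propagate into containers
      config.watchOptions = {
        poll: 1000,
        aggregateTimeout: 300,
      };
    }
    return config;
  },
};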
Also, the requests to "HMR" are failing.
How to reproduce:
Install the Remote Containers extension
Open any new folder
Open the command palette and type/select "Remote-Containers: Rebuild and Reopen in Container"
Type/select "Node.js"
Type/select version "16" and wait for the container to start
Go to the .devcontainer folder and open the devcontainer.json
Edit the config by adding "forwardPorts": [3002], to make the app available on your host and rebuild the container (via VS Code's command palette)
From the terminal, install Next.js, e.g.: npx create-next-app --use-npm --example with-typescript-eslint-jest my-app
Move all the files from my-app to your VS Code project root folder. This has to be done because create-next-app cannot install into the project root folder via ., since the .devcontainer folder already exists there.
Optional: Create a next.config.js and add the snippet for the Webpack dev middleware as seen above
Edit the package.json script to use a specific port: "dev": "next dev -p 3002", (or, if you use WSL 2: next dev -p 3002 -H ::)
From the terminal, start the app npm run dev
Open the browser on http://localhost:3002
The app is showing. Make changes in the code -> even though the app recompiles, the changes will not show in the browser. A reload of the page in the browser will show the changes, though.
With Create React App, there's an advanced configuration option that works without ejecting (the CHOKIDAR_USEPOLLING environment variable), which makes its Fast Refresh work with Remote Containers.
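(For comparison, the CRA workaround is just that environment variable, e.g. set in a .env file at the project root:)
CHOKIDAR_USEPOLLING=true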
Earlier I created a feature request, but maybe someone already managed to make this work without huge changes in the configuration/setup?
A lot has changed between me noticing this issue and the current version of Next.js (v12.1.6).
I just tried it out again and it finally seems to work! 🥳
I'm going to change my Next.js projects to use devcontainers and maybe other stuff does not work, but at least for Fast Refresh, this topic is solved.
If you're following the steps above, the most basic setup should look like the following. It is based on the default "Node.js v16" devcontainer preconfiguration.
You don't even need to forwardPorts anymore!
// For format details, see https://aka.ms/devcontainer.json. For config options, see the README at:
// https://github.com/microsoft/vscode-dev-containers/tree/v0.234.0/containers/javascript-node
{
  "name": "My project",
  "build": {
    "dockerfile": "Dockerfile",
    // Update 'VARIANT' to pick a Node version: 18, 16, 14.
    // Append -bullseye or -buster to pin to an OS version.
    // Use -bullseye variants on local arm64/Apple Silicon.
    "args": { "VARIANT": "16" }
  },
  "settings": {},
  "extensions": [
    "dbaeumer.vscode-eslint"
  ],
  // Use 'forwardPorts' to make a list of ports inside the container available locally.
  // "forwardPorts": [],
  // Use 'postCreateCommand' to run commands after the container is created.
  // "postCreateCommand": "yarn install",
  // Comment out to connect as root instead. More info: https://aka.ms/vscode-remote/containers/non-root.
  "remoteUser": "node"
}
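The Dockerfile it references is just the stock one generated by the javascript-node template; a minimal version looks roughly like this (a sketch, and the exact base image tag depends on the template version you generated from):
# .devcontainer/Dockerfile
ARG VARIANT="16"
FROM mcr.microsoft.com/vscode/devcontainers/javascript-node:0-${VARIANT}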
I have a webpack-templated Vue project, initialized through vue-cli.
I have created a simple vue.config.js file stored in the root folder (where package.json is), containing the following:
// vue.config.js
module.exports = {
  productionSourceMap: false
}
However, when building the project using "npm run build", it is ignored.
I have tried different configurations to check if the problem is with the file or the setting, and the problem is with the file.
I am using webpack#3.12.0, vue#2.6.11, #vue/cli 4.2.3 and npm#6.9.0.
Make sure your build configuration (in your case, the webpack build config) includes your file.
Generally, you will have a source folder (often src) and the builder will build all the files in that dir only. Then you have your destination directory (often dist or build) where your build files will be stored.
Two solutions:
add your config file to the build source.
move your vue.config.js file into your source directory.
For some reason, I did not manage to get vue.config.js to work.
Alternatively, I edited my webpack config, which, as my build files indicated, was located at /config/index.js.
Then, I proceeded to pass my build configurations to the build parameter which already appears on the file.
build: {
...
}
And it worked. I assume it may be because I used npm run dev instead of vue-cli-service, so webpack did not go through the vue.config.js file.
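For a vue-cli webpack-template project, that build block in /config/index.js would look roughly like this (a sketch; only productionSourceMap is relevant to the question, the other keys stay whatever the template generated):
// config/index.js (webpack template)
module.exports = {
  // ...dev options generated by the template
  build: {
    // disable source maps for production builds
    productionSourceMap: false,
    // ...other generated build options left untouched
  }
}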
When you start an app built with Electron, a directory like
C:\Users\UserName\AppData\Roaming\builtProductName\logs
is created. For example, you can find a C:\Users\UserName\AppData\Roaming\Visual Studio Code\logs directory if you are using VS Code.
What is this directory? And is there any way to prevent it from being created when you build an Electron app?
// package.json
"build": {
  "productName": "**********"
}
It sounds like electron-log is being used. Based on the log file options, you can provide a file option set to a file location you want or disable the file transport altogether with log.transports.file.level = false;.
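For example, assuming electron-log v4 is what creates that logs folder, you can disable the file transport or point it somewhere else in the main process (a sketch; the custom log path is an assumption):
// main process
const path = require('path');
const { app } = require('electron');
const log = require('electron-log');

// Option 1: disable file logging entirely
log.transports.file.level = false;

// Option 2: redirect the log file instead of using the default location
log.transports.file.resolvePath = () =>
  path.join(app.getPath('userData'), 'my-logs', 'main.log');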
In my Nuxt.js application, I am trying to restart the server files, which are in the api folder, when a file changes.
In order to do that, I've added the following to nuxt.config.js:
build: {
  watch: ["~/api/index.js"]
}
When I make changes to the files in the api folder, the server doesn't restart automatically.
I've tried this thread, Watch and reload api folder in Vue Nuxt, which looks similar to my problem, but it didn't work for me.
As suggested here
Watch and reload api folder in Vue Nuxt
just adding
watch: ['api'],
to nuxt.config.js, at the root level, worked for me. Not under build, as you had it.
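In other words, the relevant part of nuxt.config.js becomes (a sketch):
// nuxt.config.js
export default {
  // top-level, not under build
  watch: ['api'],
  // ...rest of your config
}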
EDIT
Bizarrely, although I now see the server (and client, for some reason) being started in the console... the new code is not taken! So, that's fairly useless.
The only way to get the code changes is still to ctrl-C and restart from zero.
In my application, I use a nodemon.json config:
{
  "verbose": true,
  "ignore": ["node_modules", "dist"],
  "watch": [
    "app.js"
  ],
  "ext": "js json"
}
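nodemon picks that nodemon.json up automatically when it is started from the same directory, so the corresponding npm script can stay minimal (a package.json excerpt; the script name "dev" is just an example):
{
  "scripts": {
    "dev": "nodemon"
  }
}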
I have already done quite a bit of searching. However, I still have doubts about the 'main' parameter in the package.json of a Node project.
How would filling in this field help? To ask in another way, can I start the module in a different way if this field is present?
Can I have more than one script filled into the main parameter? If yes, would they be started as two threads? If no, how can I start two scripts in a module and have them run in parallel?
I know that the second question is quite weird. It is because I have hosted a Node.js application on OpenShift, but the application consists of two main components: a REST API and a notification delivery service.
I am afraid that the notification delivery process would block the REST API if they were implemented as a single thread. However, they have to connect to the same MongoDB cartridge. Moreover, I would like to save a gear if both components could be served from the same gear.
Any suggestions are welcome.
From the npm documentation:
The main field is a module ID that is the primary entry point to your program. That is, if your package is named foo, and a user installs it, and then does require("foo"), then your main module's exports object will be returned.
This should be a module ID relative to the root of your package folder.
For most modules, it makes the most sense to have a main script and often not much else.
To put it short:
You only need a main parameter in your package.json if the entry point to your package differs from index.js in its root folder. For example, people often put the entry point at lib/index.js or lib/<packagename>.js; in this case, the corresponding script must be declared as main in package.json.
You can't have two scripts as main, simply because the entry point require('yourpackagename') must be defined unambiguously.
To answer your first question, the way you load a module is depending on the module entry point and the main parameter of the package.json.
Let's say you have the following file structure:
my-npm-module
|-- lib
| |-- module.js
|-- package.json
Without main parameter in the package.json, you have to load the module by giving the module entry point: require('my-npm-module/lib/module.js').
If you set the package.json main parameter as follows "main": "lib/module.js", you will be able to load the module this way: require('my-npm-module').
If you have for instance in your package.json file:
{
  "name": "zig-zag",
  "main": "lib/entry.js",
  ...
}
lib/entry.js will be the main entry point to your package.
When calling
require('zig-zag');
in node, lib/entry.js will be the actual file that is required.
As far as I know, it's the main entry point to your node package (library) for npm. It's needed if your npm project becomes a node package (library) which can be installed via npm by others.
Let's say you have a library with a build/, dist/, or lib/ folder. In this folder, you got the following compiled file for your library:
-lib/
--bundle.js
Then in your package.json, you tell npm how to access the library (node package):
{
  "name": "my-library-name",
  "main": "lib/bundle.js",
  ...
}
After installing the node package with npm to your JS project, you can import functionalities from your bundled bundle.js file:
import { add, subtract } from 'my-library-name';
This also holds true when using code splitting (e.g., with Webpack) for your library. For instance, this webpack.config.js uses code splitting to break the project into multiple bundles instead of one.
module.exports = {
  entry: {
    main: './src/index.js',
    add: './src/add.js',
    subtract: './src/subtract.js',
  },
  output: {
    path: `${__dirname}/lib`,
    filename: '[name].js',
    library: 'my-library-name',
    libraryTarget: 'umd',
  },
  ...
}
Still, you would define one main entry point to your library in your package.json:
{
  "name": "my-library-name",
  "main": "lib/main.js",
  ...
}
Then when using the library, you can import your files from your main entry point:
import { add, subtract } from 'my-library-name';
However, you can also bypass the main entry point from the package.json and import the code-split bundles directly:
import add from 'my-library-name/lib/add';
import subtract from 'my-library-name/lib/subtract';
After all, the main property in your package.json only points to your main entry point file of your library.
One important function of the main key is that it provides the path to your entry point. This is very helpful when working with nodemon. If you work with nodemon and you define the main key in your package.json as, let's say, "main": "./src/server/app.js", then you can simply start the server by typing nodemon in the CLI with the project root as pwd, instead of nodemon ./src/server/app.js.
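A minimal illustration of that setup (a sketch, reusing the path from the paragraph above; the package name is made up):
{
  "name": "my-server",
  "main": "./src/server/app.js",
  "scripts": {
    "start": "node ./src/server/app.js"
  }
}
With this in place, running nodemon from the project root starts ./src/server/app.js without any extra arguments.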
From the Node.js getting started documentation:
An extra note: if the filename passed to require is actually a directory, it will first look for package.json in the directory and load the file referenced in the main property. Otherwise, it will look for an index.js.
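A tiny illustration of that lookup (a sketch; file and export names are made up):
// my-lib/package.json contains: { "main": "./start.js" }
// my-lib/start.js:
module.exports = { hello: () => 'hi' };

// consumer.js, next to the my-lib directory:
const lib = require('./my-lib'); // resolves to my-lib/start.js via "main"
console.log(lib.hello());        // without a "main", Node would fall back to my-lib/index.js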
For OpenShift, you only get one PORT and IP pair to bind to (per application). It sounds like you should be able to serve both services from a single nodejs instance by adding internal routes for each service endpoint.
I have some info on how OpenShift uses your project's package.json to start your application here: https://www.openshift.com/blogs/run-your-nodejs-projects-on-openshift-in-two-simple-steps#package_json
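As a rough sketch of serving both components from one Node process (Express is assumed here; the route paths and handlers are placeholders, and the environment variable names are the ones the OpenShift Node.js cartridge used to expose):
// app.js: a single process hosting both the REST API and the notification endpoints
const express = require('express');
const app = express();

// REST API routes
const api = express.Router();
api.get('/items', (req, res) => res.json([]));
app.use('/api', api);

// notification service routes (placeholder handlers)
const notifications = express.Router();
notifications.post('/subscribe', (req, res) => res.sendStatus(204));
app.use('/notifications', notifications);

// bind to the single PORT/IP pair OpenShift provides
const port = process.env.OPENSHIFT_NODEJS_PORT || 8080;
const ip = process.env.OPENSHIFT_NODEJS_IP || '0.0.0.0';
app.listen(port, ip, () => console.log(`Listening on ${ip}:${port}`));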