I am using the command pm2 start apps.json to start multiple apps in a single command. These apps are defined in apps.json:
{
  "apps": [
    {
      "name": "foo",
      "script": "./foo.js"
    },
    {
      "name": "bar",
      "script": "./bar.js"
    },
    {
      "name": "baz",
      "script": "./baz.js"
    }
  ]
}
Question: Is it possible to define the startup sequence, such that foo.js has to finish starting first before bar.js and baz.js can start?
For example, foo.js can perform a graceful start, calling process.send('ready') to change its PM2 status to online. Only then would bar.js and baz.js be started by PM2. This would be similar to Docker Compose's depends_on parameter.
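For reference, this is roughly what I mean by a graceful start in foo.js (just a sketch; I am assuming PM2's wait_ready and listen_timeout options here):
// foo.js
const http = require('http');

const server = http.createServer((req, res) => res.end('ok'));

server.listen(3000, () => {
  // With "wait_ready": true in the app config, PM2 keeps the process in the
  // "launching" state until this message is received (or listen_timeout expires).
  if (process.send) {
    process.send('ready');
  }
});
The foo entry in apps.json would then additionally carry "wait_ready": true and a "listen_timeout" value.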
This cannot be done through the configuration file alone, but PM2 has a programmatic API that allows you to perform IPC (inter-process communication).
The following methods are the ones to work with:
pm2.list to list the running processes and get their names / IDs
pm2.launchBus for the process that will receive messages and react accordingly
pm2.sendDataToProcessId for sending information to another process
This way you can run multiple scripts and make one wait for another. Once a script receives a message on the message bus, it can start a process with pm2.start.
Here's a piece of pseudo-code to illustrate my point:
const pm2 = require('pm2');

pm2.connect(() => {
  pm2.list((err, processes) => {
    // e.g. look up the "foo" process here to get its pm_id
    const fooProcess = processes.find(p => p.name === 'foo');

    pm2.launchBus((err, bus) => {
      bus.on('process:msg', packet => {
        if (packet.startBar === true) {
          pm2.start({ script: 'bar.js' }, (err, apps) => { /* ... */ });
        }
      });
      bus.on('error', console.error);
    });
  });
});
In another script, you would have the following:
pm2.sendDataToProcessId(barProcessID, {
data : { startBar : true },
topic: 'process:msg'
}, (err, res) => console.log(err, res));
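In the managed process itself (e.g. foo.js), emitting the packet is just a process.send call; a sketch (note that on the bus side the payload usually arrives under packet.data, so the check above may need to read packet.data.startBar):
// inside foo.js, once its own startup work is finished
process.send({
  type: 'process:msg',        // the topic the bus listens on
  data: { startBar: true }    // payload delivered to the bus listener
});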
Best regards
A little hack: inside package.json you can write "start": "pm2 start server.js && pm2 start server1.js". If you run npm start, it will run that start script; similarly, you can create a script to stop them.
Otherwise you can also use the child_process module (it ships with Node.js) to run commands from inside your script, e.g. childProcess.exec('pm2 start server.js && pm2 start server1.js');
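A sketch of how that could look in package.json (the file names are just examples):
{
  "scripts": {
    "start": "pm2 start server.js && pm2 start server1.js",
    "stop": "pm2 stop server1.js && pm2 stop server.js"
  }
}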
What is the right way to configure/enable an Elastic APM agent in a Nuxtjs project?
I referred to this documentation for a custom Node.js app. The key takeaway was:
It’s important that the agent is started before you require any other
modules in your Node.js application - i.e. before http and before your
router etc.
I added the following snippet in nuxt.config.js, but the APM agent is not started or working. I do not see any errors in the app logs.
var apm = require('elastic-apm-node').start({
serviceName: 'nuxt-app',
serverUrl: 'http://ELK_APM_SERVER:8200'
})
Is there any other way to do this?
We managed to get this working using a custom Nuxt module that explicitly requires the Node modules to instrument after it has initialised the APM agent.
modules/elastic-apm.js:
const apm = require('elastic-apm-node');
const defu = require('defu');
module.exports = function() {
this.nuxt.hook('ready', async(nuxt) => {
const runtimeConfig = defu(nuxt.options.privateRuntimeConfig, nuxt.options.publicRuntimeConfig);
const config = (runtimeConfig.elastic && runtimeConfig.elastic.apm) || {};
if (!config.serverUrl) {
return;
}
if (!apm.isStarted()) {
await apm.start(config);
// Now explicitly require the modules we want APM to hook into, as otherwise
// they would not be instrumented.
//
// Docs: https://www.elastic.co/guide/en/apm/agent/nodejs/master/custom-stack.html
// Modules: https://github.com/elastic/apm-agent-nodejs/tree/master/lib/instrumentation/modules
require('http');
require('http2');
require('https');
}
});
}
nuxt.config.js:
module.exports = {
// Must be in modules, not buildModules
modules: ['~/modules/elastic-apm'],
publicRuntimeConfig: {
elastic: {
apm: {
serverUrl: process.env.ELASTIC_APM_SERVER_URL,
serviceName: 'my-nuxt-app',
usePathAsTransactionName: true // prevent "GET unknown route" transactions
}
}
}
};
All the other answers are outdated or were incorrect to begin with (as of 17.02.2022).
To make it work follow these steps:
1.) Create a nodeApm.js in your root dir with the following content:
const nodeApm = require('elastic-apm-node')
if (!nodeApm.isStarted()) {
nodeApm.start()
}
2.) Use environment variables to store your config (see also the note about the server URL after step 3). For example:
ELASTIC_APM_SERVICE_NAME=NUXT_PRODUCTION
ELASTIC_APM_SECRET_TOKEN=yoursecrettokenhere
3.) Edit your package.json
"scripts": {
// if you want apm also on dev to test, add it also here
"dev": "node -r ./nodeApm.js node_modules/nuxt/bin/nuxt",
...
"start": "node -r ./nodeApm.js node_modules/nuxt/bin/nuxt start",
...
! Be aware that in ~2022 the node_modules bin folder has lost the "." in the directory name
! In all other answers people forget the start parameter at the end
"start": "node -r ./nodeApm.js node_modules/nuxt/bin/nuxt start",
Based on what I've seen, it looks like there isn't a "right" way to do this with the stock nuxt command line application. The problem seems to be that while nuxt.config.js is the first chance a user has to add some JavaScript, the nuxt command line application bootstraps Node's HTTP frameworks before this config file is required. This means the Elastic agent (or any APM agent) doesn't have a chance to hook into those modules.
The current recommendations from the Nuxt team appear to be:
Invoke nuxt manually via -r
{
"scripts": {
"start": "node -r elastic-apm-node node_modules/nuxt/.bin/nuxt"
}
}
Skip nuxt and use NuxtJS programmatically as a middleware in your framework of choice
const { loadNuxt } = require('nuxt')
const nuxtPromise = loadNuxt('start')
app.use((req, res) => { nuxtPromise.then(nuxt => nuxt.render(req, res)) })
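If you go the programmatic route, here is a slightly fuller sketch with Express (the service name and port are assumptions; the important part is that the agent is started before http/express are required):
// server.js
require('elastic-apm-node').start({
  serviceName: 'my-nuxt-app',
  serverUrl: process.env.ELASTIC_APM_SERVER_URL
})

const express = require('express')
const { loadNuxt } = require('nuxt')

const app = express()
const nuxtPromise = loadNuxt('start') // 'start' = serve the already-built app

app.use((req, res) => { nuxtPromise.then(nuxt => nuxt.render(req, res)) })

app.listen(3000)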
Based on Alan Storm's answer (from the Nuxt team) I made it work, but with a small modification:
I created a file named nodeApm.js where I added the following code:
const nodeApm = require('elastic-apm-node')
if (!nodeApm.isStarted()) { ... // configuration magic }
In script sections I added:
"start": "node -r ./nodeApm.js node_modules/nuxt/.bin/nuxt"
I need to execute one JavaScript function before Webpack starts its build process. The function just takes .scss files and concatenates them into one.
After that, Webpack should pick up the resulting file. Is there an option to do that?
At the moment I run the function before module.exports in webpack.config.js, but it seems it's not a synchronous operation. module.exports is evaluated before the concat() function finishes, and Webpack can't find the .scss file.
function concat(opts) {
(...)
}
concat({ src : styles, dest : './css/style.scss' });
module.exports = [
(...)
]
It seems a little odd to concat .scss files before running Webpack, as those kinds of operations are usually handled by Webpack itself.
That being said, there are a few ways of solving this.
The most obvious way would be to extract the concat part into a separate file (e.g. prepare.js) and then start the build process with something along these lines: node prepare.js && webpack. That will first run prepare.js, and if it exits without error, webpack will run. Usually that's added to the scripts section of your package.json, e.g.
"scripts": {
"build": "node prepare.js && webpack"
}
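For completeness, a minimal synchronous sketch of what prepare.js could look like (the source directory and output path are assumptions based on the question):
// prepare.js
const fs = require('fs');
const path = require('path');

const srcDir = './scss';
const dest = './css/style.scss';

// read every .scss file in srcDir and write them out as a single file
const combined = fs.readdirSync(srcDir)
  .filter(file => file.endsWith('.scss'))
  .map(file => fs.readFileSync(path.join(srcDir, file), 'utf8'))
  .join('\n');

fs.mkdirSync(path.dirname(dest), { recursive: true });
fs.writeFileSync(dest, combined);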
To achieve the same in a more Webpack-integrated way, you could do the same thing: extract the concat part into a separate file and then let Webpack execute that file before the build starts, with the help of the Webpack Shell Plugin, e.g.
const WebpackShellPlugin = require('webpack-shell-plugin');
module.exports = {
...
plugins: [
new WebpackShellPlugin({
onBuildStart:['node prepare.js']
})
],
...
}
You can add code at any phase of the build using the Compiler Hooks.
The compile hook is called before (and every time) the compilation begins, so you probably want to use that:
config = {
//...
plugins: [
{
apply: (compiler) => {
compiler.hooks.compile.tap("MyPlugin_compile", () => {
console.log("This code is executed before the compilation begins.");
});
},
},
],
//...
};
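If the preparation step itself is asynchronous, webpack also exposes async hooks; a sketch using beforeRun (concatScss here stands in for a hypothetical async version of the question's helper):
config = {
  //...
  plugins: [
    {
      apply: (compiler) => {
        // beforeRun is an async hook, so the returned promise is awaited
        // before the compilation starts (use watchRun in watch mode).
        compiler.hooks.beforeRun.tapPromise("MyPlugin_beforeRun", async () => {
          await concatScss({ src: styles, dest: './css/style.scss' });
        });
      },
    },
  ],
  //...
};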
I have the following task that I'm trying to use for development. There are two other tasks that I need to run before gulp-nodemon starts and restarts my server process.
It STARTS just fine, but upon detecting a change in my /src file tree, the two tasks run, yet the node process never restarts and my .on('restart') message never fires. It then just hangs after the two other tasks complete (I get Gulp "Finished..." messages for both of them). Am I missing something here?
import gulp from 'gulp'
import nodemon from 'gulp-nodemon'
gulp.task('watch', ['copy-assets', 'transpile'], () => {
return nodemon({
execMap: {
js: `node`
},
script: `build/index.js`,
watch: `src/**/*.js`,
ext: `js`,
tasks: ['copy-assets', 'transpile']
}).on(`restart`, () => {
console.log(`App restarted!`)
})
})
I need to run a set of Node.js files that contain configuration information (typically a port number and IP address), and then use forever to run the script from the terminal with that configuration, without any manual input.
For a programmatic approach, you can use forever-monitor:
var forever = require('forever-monitor');
var child = new (forever.Monitor)('your-filename.js', {
max: 3,
silent: true,
options: []
});
child.on('exit', function () {
console.log('your-filename.js has exited after 3 restarts');
});
child.start();
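Since the question is about passing the port/IP configuration without manual input, the same Monitor options can carry it; a sketch (the args/env values are assumptions, and the child script would read them via process.argv or process.env):
var forever = require('forever-monitor');

var child = new (forever.Monitor)('your-filename.js', {
  max: 3,
  silent: true,
  args: ['--port', '3000', '--ip', '192.168.1.1'], // forwarded to the script's argv
  env: { PORT: '3000', IP: '192.168.1.1' }         // available as process.env in the child
});

child.start();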
You could make use of the child_process module. Check the docs; there's some useful information there: http://nodejs.org/api/child_process.html
To give a brief example
var exec = require('child_process').exec;
exec('forever', function callback(error, stdout, stderr){
// cb
});
If you don't need a callback / don't want to wait for the execution:
require('child_process').exec('forever').unref();
Was that helpful?
Best
Marc
Edit: Ok, not sure if I really got the point of your question, but my answer combined with https://stackoverflow.com/a/23871739/823851 might offer a good solution.
Usage:
forever start hello.js to start a process.
forever list to see the list of all processes started by forever
forever stop hello.js to stop the process, or forever stop 0 to stop the process with index 0 (as shown by forever list).
node-config is a good module for managing different configurations of a Node.js app.
For example, to make a "production" config, /config/production.json would contain:
{
"port" : "3000",
"ip" : "192.168.1.1"
}
In one of your node application JS files:
var config = require('config');
// ...
var port = config.port;
var ip = config.ip;
And to launch the app in this configuration, just first set your NODE_ENV to production from the shell before running your app.
export NODE_ENV=production
forever start app.js
Make additional config JSON files as needed for each of your environments. default.json is used when no environment is specified.
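As a side note, node-config also exposes config.get(), which throws if a key is missing instead of silently returning undefined; a sketch:
var config = require('config');

var port = config.get('port'); // throws if "port" is not defined for the current NODE_ENV
var ip = config.get('ip');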
I'm using Grunt (a task-based command line build tool for JavaScript projects) in my project. I've created a custom task and I'm wondering if it's possible to run a shell command from it.
To clarify, I'm trying to use Closure Templates, and "the task" should call the jar file to pre-compile the Soy files to a JavaScript file.
At the moment I run this jar from the command line, but I want to set it up as a task.
Alternatively, you could load in Grunt plugins to help with this:
grunt-shell example:
shell: {
make_directory: {
command: 'mkdir test'
}
}
or grunt-exec example:
exec: {
remove_logs: {
command: 'rm -f *.log'
},
list_files: {
command: 'ls -l **',
stdout: true
},
echo_grunt_version: {
command: function(grunt) { return 'echo ' + grunt.version; },
stdout: true
}
}
Check out grunt.util.spawn:
grunt.util.spawn({
cmd: 'rm',
args: ['-rf', '/tmp'],
}, function done() {
grunt.log.ok('/tmp deleted');
});
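Because spawn is asynchronous, the surrounding Grunt task has to tell Grunt to wait for it; a sketch (the task name is just an example):
grunt.registerTask('clean-tmp', function () {
  var done = this.async(); // keep the task alive until the callback fires
  grunt.util.spawn({
    cmd: 'rm',
    args: ['-rf', '/tmp']
  }, function (error, result, code) {
    if (error) {
      grunt.log.error(String(result));
      return done(false);
    }
    grunt.log.ok('/tmp deleted');
    done();
  });
});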
I've found a solution, so I'd like to share it with you.
I'm using Grunt under Node, so to call terminal commands you need to require the child_process module.
For example,
var myTerminal = require("child_process").exec,
commandToBeExecuted = "sh myCommand.sh";
myTerminal(commandToBeExecuted, function(error, stdout, stderr) {
if (!error) {
//do something
}
});
If you are using the latest Grunt version (0.4.0rc7 at the time of this writing), both grunt-exec and grunt-shell fail (they don't seem to have been updated to handle the latest Grunt). On the other hand, child_process's exec is async, which is a hassle.
I ended up using Jake Trent's solution and adding shelljs as a dev dependency in my project, so I could just run tests easily and synchronously:
var shell = require('shelljs');
...
grunt.registerTask('jquery', "download jquery bundle", function() {
shell.exec('wget http://jqueryui.com/download/jquery-ui-1.7.3.custom.zip');
});
Others have pointed to child_process, but try execSync so you can see the output:
grunt.registerTask('test', '', function () {
var exec = require('child_process').execSync;
var result = exec("phpunit -c phpunit.xml", { encoding: 'utf8' });
grunt.log.writeln(result);
});
For async shell commands working with Grunt 0.4.x use https://github.com/rma4ok/grunt-bg-shell.