Send IPC message with sh/bash to parent process (Node.js) - javascript

I have a Node.js process and this process forks an sh child process to run a bash script. Something like this:
const cp = require('child_process');
const n = cp.spawn('sh', ['foo.sh'], {
  stdio: ['ignore', 'ignore', 'ignore', 'ipc']
});
In my bash script (foo.sh), how can I send an IPC message back to the Node.js parent process? I can't figure out how to do that.
After doing some more research, it looks like I will need to get closer to the IPC internals. One thing that might help: if I pass the parent PID to the bash script, maybe I can do something with that.

When you add 'ipc' to your stdio options, the parent and child processes establish a communication channel, and the child is given a file descriptor to use for it. The number of this descriptor is exposed to the child in the environment variable $NODE_CHANNEL_FD. You can redirect output to this descriptor and it will be sent to the parent process to be parsed and handled.
As a simple example, I sent my name from the bash script to the parent process to be logged.
index.js
const cp = require('child_process');
const n = cp.spawn('sh', ['foo.sh'], {
  stdio: ['ignore', 'ignore', 'ignore', 'ipc']
});
n.on('message', (data) => {
  console.log('name: ' + data.name);
});
foo.sh
printf "{\"name\": \"Craig\"}\n" 1>&$NODE_CHANNEL_FD
Essentially what is happening in the bash file is:
printf writes the JSON to stdout, file descriptor 1.
The 1>& redirection then sends it to the file descriptor stored in $NODE_CHANNEL_FD instead, so it travels over the IPC channel to the parent.
Note that the JSON you send must be properly formatted and terminated with a \n character.
If you wanted to send data from the parent process to the bash process you could add
n.send({"message": "hello world"});
to your JavaScript, and in the bash file you could use something along the lines of
read -r -u "$NODE_CHANNEL_FD" MESSAGE
echo " => message from parent process => $MESSAGE"
Note that you will have to change your stdio options so that you are not ignoring the standard output of the child process. You could set them to ['ignore', 1, 'ignore', 'ipc'] so that the child process' standard output goes straight to the parent's.
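Putting both directions together, a minimal sketch of the parent side (reusing the same foo.sh and message shapes as above) might look like this:

const cp = require('child_process');

// fd 0: ignored, fd 1: inherited so the child's echo output is visible,
// fd 2: ignored, fd 3: the IPC channel exposed to the child as $NODE_CHANNEL_FD.
const n = cp.spawn('sh', ['foo.sh'], {
  stdio: ['ignore', 'inherit', 'ignore', 'ipc']
});

// Child -> parent: JSON written to $NODE_CHANNEL_FD arrives here as a parsed object.
n.on('message', (data) => {
  console.log('name: ' + data.name);
});

// Parent -> child: the child can read this line with `read -r -u "$NODE_CHANNEL_FD" MESSAGE`.
n.send({ message: 'hello world' });

Using 'inherit' for the second slot is equivalent to passing the fd number 1 as in the option list above.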

Related

How do I use Node JS to execute a minecraft command on a wrapped minecraft server?

I am making a node discord bot that would have the ability to control a minecraft server, but for some reason I cannot get the bot to execute minecraft commands on the server itself.
Right now the bot receives normal discord strings from messages and interprets them as commands. This is the code I use to start the server:
var svEnv = cp.spawn("java", [
  "-Xms4096M",
  "-Xmx4096M",
  "-jar",
  "server.jar",
  "nogui"
], {
  shell: true,
  detached: true,
  cwd: `${config.server.directoryPath}`,
  stdio: ["inherit", "inherit", "inherit", "ipc"]
});
svEnv.stdout.on("data", out => {
  console.log(`Server Feedback: ${out}`);
});
svEnv.stderr.on("data", err => {
  if (!(err == "^C")) {
    console.error(`~Server Error: ${err}`);
  }
});
This is the code I'm using to try to execute the commands:
svEnv.stdin.write(`${cmd}\n`);
Whenever I attempt to use the code above to execute a command, the server doesn't respond at all. It's as if it doesn't even receive the input.
All the solutions I have looked at, even those with the exact same use case as mine, say this is the correct way to implement it. I have to guess that the reason it doesn't work is the way I have the child process spawn configured.
Basically, what I'm asking is: how does the spawned process need to be configured to receive commands?
Please note: this is my first time using the child_process module in Node.js, so if you have any tips on how it works, please let me know.
The problem is that you set standard input to 'inherit'. When an stdio entry is 'inherit', the corresponding stream on the child object is null, so there is nothing for you to write to; if you want to be able to write to it like that, set it to 'pipe' instead.
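As a sketch (reusing cp, config and cmd from the question's code), the spawn call could look something like this; note that stdout and stderr also need to be 'pipe' rather than 'inherit' for svEnv.stdout and svEnv.stderr to be usable in the data handlers:

const cp = require('child_process');

// Sketch: pipe stdin/stdout/stderr so the parent can both write commands to
// the server and read its output; the 'ipc' slot is kept from the question.
// config and cmd come from the surrounding bot code in the question.
const svEnv = cp.spawn("java", [
  "-Xms4096M",
  "-Xmx4096M",
  "-jar",
  "server.jar",
  "nogui"
], {
  shell: true,
  detached: true,
  cwd: `${config.server.directoryPath}`,
  stdio: ["pipe", "pipe", "pipe", "ipc"]
});

svEnv.stdout.on("data", out => console.log(`Server Feedback: ${out}`));
svEnv.stderr.on("data", err => console.error(`~Server Error: ${err}`));

// Now this actually reaches the Minecraft server's console:
svEnv.stdin.write(`${cmd}\n`);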

Node.js 'fs' throws an ENOENT error after adding auto-generated Swagger server code

Preamble
To start off, I'm not a developer; I'm just an analyst / product owner with time on their hands. While my team's actual developers have been busy finishing off projects before year-end I've been attempting to put together a very basic API server in Node.js for something we will look at next year.
I used Swagger to build an API spec and then used the Swagger code generator to get a basic Node.js server. The full code is near the bottom of this question.
The Problem
I'm coming across an issue when writing out to a log file using the fs module. I know that the ENOENT error is usually down to just specifying a path incorrectly, but the behaviour doesn't occur when I comment out the Swagger portion of the automatically generated code. (I took the logging code directly out of another tool I built in Node.js, so I'm fairly confident in that portion at least...)
When executing npm start, a few debugging items write to the console:
"Node Server Starting......
Current Directory:/mnt/c/Users/USER/Repositories/PROJECT/api
Trying to log data now!
Mock mode: disabled
PostgreSQL Pool created successfully
Your server is listening on port 3100 (http://localhost:3100)
Swagger-ui is available on http://localhost:3100/docs"
but then fs throws an ENOENT error:
events.js:174
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '../logs/logEvents2021-12-24.log'
Emitted 'error' event at:
at lazyFs.open (internal/fs/streams.js:277:12)
at FSReqWrap.args [as oncomplete] (fs.js:140:20)
Investigating
Now normally, from what I understand, this would just mean I've got the paths wrong. However, the file has actually been created and the first line of the log file has been written just fine.
My next thought was that I must've set the fs flags incorrectly, but it was set to 'a' for append:
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', {flags: 'a'}, (err) => {
  console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
Removing Swagger Code
Now here's the weird bit: if I remove the Swagger code, the log files write out just fine and I don't get the fs exception!
This is the specific Swagger code:
// swaggerRouter configuration
var options = {
  routing: {
    controllers: path.join(__dirname, './controllers')
  },
};
var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();
// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
  console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
  console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
When I comment out this code, the log file writes out just fine.
The only thing I can think of that might be happening is that Swagger is somehow modifying the app's working directory, so that fs no longer resolves the same path?
Full Code
'use strict';
var path = require('path');
var fs = require('fs');
var http = require('http');
var oas3Tools = require('oas3-tools');
var serverPort = 3100;
// I specifically tried using path.join, which I found when investigating this issue, and referencing the app path, but to no avail
const __logdir = path.join(__dirname,'./logs');
//These are date and time functions I use to add timestamps to the logs
function dateNow(){
  var dateNow = new Date().toISOString().slice(0,10).toString();
  return dateNow;
}
function rightNow(){
  var timeNow = new Date().toTimeString().slice(0,8).toString();
  return "[" + timeNow + "] ";
};
console.info("Node Server Starting......");
console.info("Current Directory: " + __dirname)
// Here I create the WriteStreams
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', {flags: 'a'}, (err) => {
  console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
var errorsFile = fs.createWriteStream(__logdir + "/errorEvents" + dateNow() + '.log', {flags: 'a'}, (err) => {
  console.error('Could not write new Error Log File to location: %s \nWith error description: %s', __logdir, err);
});
// And create an additional console to write data out:
const Console = require('console').Console;
var logOut = new Console(logsFile,errorsFile);
console.info("Trying to log data now!") // Debugging logging
logOut.log("========== Server Startup Initiated ==========");
logOut.log(rightNow() + "Server Directory: "+ __dirname);
logOut.log(rightNow() + "Logs directory: "+__logdir);
// Here is the Swagger portion that seems to create the behaviour.
// It is unedited from the Swagger Code-Gen tool
// swaggerRouter configuration
var options = {
  routing: {
    controllers: path.join(__dirname, './controllers')
  },
};
var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();
// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
  console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
  console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
In case it helps, this is the project's file structure. I am running this project within a WSL instance in VSCode on Windows, the same as I have with other projects using fs.
Is anyone able to help me understand why fs can write the first log line but then break once the Swagger code gets going? Have I done something incredibly stupid?
Appreciate the help, thanks!
Found the problem with some help from a friend. The issue boiled down to a lack of understanding of how the Swagger module works in the background, so this will likely be eye-rollingly obvious to most, but I'm keeping this post around in case anyone else comes across this down the line.
It seems that as part of the Swagger initialisation, any scripts within the utils folder are also executed. I would not have picked up on this if it hadn't been pointed out to me that in the middle of the console output there was a reference to some PostgreSQL code, even though I had taken all references to it out of the main index.js file.
That's when I realised that the error wasn't actually being generated by the code posted above: it was being thrown from a script in that folder.
So I guess the answer is don't add stuff to the utils folder, but if you do, always add a bunch of console logging...
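If a script in that folder does need to write logs of its own, one way to make it independent of whichever module ends up loading it is to anchor its paths to the script's own location, the same way index.js does with __dirname. This is a hypothetical sketch, assuming the '../logs/...' path in the error came from a cwd-relative path inside one of those utils scripts:

// Hypothetical utils/logger.js sketch - not the actual project file.
const fs = require('fs');
const path = require('path');

// Resolve the logs directory relative to this file, not the process working
// directory, so it still works when oas3-tools (or anything else) loads this module.
const logDir = path.join(__dirname, '..', 'logs');
fs.mkdirSync(logDir, { recursive: true }); // create it if it doesn't exist yet

const logsFile = fs.createWriteStream(
  path.join(logDir, 'logEvents' + new Date().toISOString().slice(0, 10) + '.log'),
  { flags: 'a' }
);
logsFile.write('========== utils module loaded ==========\n');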

How to get NodeJS child process to run .bat file via CMD.exe

I am currently writing an app launcher for Windows using ElectronJS and JavaScript. I want to block multiple instances of one app from opening.
To do this I have written a batch script that checks whether the process is running and prints "Program is running" if it is, or "Program is not running" if it isn't.
Batch Script
echo off
tasklist /FI "IMAGENAME eq chrome.exe" 2>NUL | find /I /N "chrome.exe">NUL
if "%ERRORLEVEL%"=="0" echo Program is running
if "%ERRORLEVEL%"=="1" echo Program is not running
When I run this via cmd.exe it gives me the correct output whether the application is running or not. In Javascript however, I'm always given the following:
Output
Error: find: '/I': No such file or directory
find: '/N': No such file or directory
Error: find: "chrome.exe": No such file or directory
Program is not running
Child exited with code 0
renderer.js
const { spawn } = require('child_process');
const bat = spawn('cmd.exe', ["/c", 'multiScript.bat']);
bat.stdout.on('data', (data) => {
  console.log(data.toString());
});
bat.stderr.on('data', (data) => {
  console.error(data.toString());
});
bat.on('exit', (code) => {
  console.log(`Child exited with code ${code}`);
});
Folder Structure
projectfolder/
|-src/
| |-renderer.js (File that is trying to spawn the batch file)
| |-multiScript.bat (The bat file I am trying to execute)
It's because when you spawn the child process, it runs with the working directory of your main process, not the directory of the file that spawned it. Every path you pass should either be relative to that working directory or be an absolute path.
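A sketch of one way to handle that in renderer.js is to resolve the batch file against the renderer's own directory (or, alternatively, pass a cwd option to spawn):

const path = require('path');
const { spawn } = require('child_process');

// Resolve multiScript.bat relative to this file (src/), not to wherever
// the Electron app happened to be launched from.
const batPath = path.join(__dirname, 'multiScript.bat');
const bat = spawn('cmd.exe', ['/c', batPath]);

bat.stdout.on('data', (data) => console.log(data.toString()));
bat.stderr.on('data', (data) => console.error(data.toString()));
bat.on('exit', (code) => console.log(`Child exited with code ${code}`));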

Node.js child_process exec, stdin not being passed through to ssh

I have the following Node.js code to ssh into a server and it works great forwarding stdout, but whenever I type anything it is not forwarding to the server. How do I forward my local stdin to the ssh connections stdin?
var command = 'ssh -tt -i ' + keyPath + ' -o StrictHostKeyChecking=no ubuntu@' + hostIp;
var ssh = child_proc.exec(command, {
  env: process.env
});
ssh.stdout.on('data', function (data) {
  console.log(data.toString());
});
ssh.stderr.on('data', function (data) {
  console.error(data.toString());
});
ssh.on('exit', function (code) {
  process.exit(code);
});
There are two ways to go about this if you want to pipe process.stdin to the child process:
Child processes have a stdin property that represents the stdin of the child process. So all you should need to do is add process.stdin.pipe(ssh.stdin)
You can specify a custom stdio when spawning the process to tell it what to use for the child process's stdin. Note that this requires spawn rather than exec (exec does not accept an stdio option; it always creates pipes), e.g.:
child_proc.spawn(command, { env: process.env, shell: true, stdio: [process.stdin, 'pipe', 'pipe'] })
Also, on a semi-related note, if you want to avoid spawning child processes and have more programmatic control over your ssh/sftp connections (and/or something more lightweight), there is the ssh2 module.
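For example, here is a sketch of the first option applied to the code from the question (keyPath and hostIp are assumed to be defined as they are there):

const child_proc = require('child_process');

const command = 'ssh -tt -i ' + keyPath + ' -o StrictHostKeyChecking=no ubuntu@' + hostIp;
const ssh = child_proc.exec(command, { env: process.env });

// Forward whatever is typed locally to the ssh process...
process.stdin.pipe(ssh.stdin);
// ...and forward the ssh session's output back to the local terminal.
ssh.stdout.pipe(process.stdout);
ssh.stderr.pipe(process.stderr);

ssh.on('exit', (code) => process.exit(code));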

Read output of shell command in meteor

I'm trying to ping a remote server to check if it's online or not. The flow is like this:
1) User enters the target hostname
2) Meteor executes the command 'nmap -p 22 hostname'
3) Meteor reads and parses the output to check the status of the target.
I've been able to execute a command asynchronously, for example mkdir, which allows me to verify later that it worked.
Unfortunately it seems I'm not able to wait for the reply. My code, inside /server/poller.coffee, is:
Meteor.methods
  check_if_open: (target_server) ->
    result = ''
    exec = Npm.require('child_process').exec
    result = exec 'nmap -p ' + target_server.port + ' ' + target_server.host
    return result
This should execute exec synchronously, shouldn't it? Every other approach I tried, using Futures, ShellJS or AsyncWrap, failed, with Meteor refusing to start as soon as the node package was installed. It seems I can only install packages via meteor add (mrt).
My client-side code, located at /client/views/home/home.coffee, is:
Template.home.events
  'submit form': (e) ->
    e.preventDefault()
    console.log "Calling the connect button"
    server_target =
      host: $(e.target).find("[name=hostname]").val()
      port: $(e.target).find("[name=port]").val()
      password: $(e.target).find("[name=password]").val()
    result = ''
    result = Meteor.call('check_if_open', server_target)
    console.log "You pressed the connect button"
    console.log ' ' + result
Result is always null. Result should be a child process object, which has a stdout attribute, but that attribute is null.
What am I doing wrong? How do I read the output? Am I forced to do it asynchronously?
You'll need to use some kind of async wrapping; child_process.exec is strictly asynchronous. Here's how you could use Futures:
# In top-level code:
# I didn't need to install anything with npm,
# this just worked.
Future = Npm.require("fibers/future")

# in the method:
fut = new Future()
# Warning: if you do this, you're probably vulnerable to "shell injection"
# e.g. the user might set target_server.host to something like "blah.com; rm -rf /"
exec "nmap -p #{target_server.port} #{target_server.host}", (err, stdout, stderr) ->
  fut.return [err, stdout, stderr]
[err, stdout, stderr] = fut.wait()
if err?
  throw new Meteor.Error(...)
# do stuff with stdout and stderr
# note that these are Buffer objects, you might need to manually
# convert them to strings if you want to send them to the client
When you call the method on the client, you have to use an async callback. There are no fibers on the client.
console.log "You pressed the connect button"
Meteor.call "check_if_open", server_target, (err, result) ->
if err?
# handle the error
else
console.log result
