How does child_process spawn capture python output in real time - javascript

I am learning VS Code plugin development and I want to know whether spawn can capture output in real time.
I ran the following code.
const { spawn } = require('child_process');
const child = spawn('python', ['test.py'], { cwd: 'path' });
child.stdout.on('data', (data: Uint8Array) => {
    console.log(data.toString());
});
child.stderr.on('data', (data: Uint8Array) => {
    console.log(data.toString());
});
child.on('close', (code: number) => {
    console.log(` ${code}`);
});
test.py
import sys
import time
sys.stdout.write('stdout 1\n')
time.sleep(1)
sys.stderr.write('stderr 1\n')
time.sleep(2)
sys.stdout.write('stdout 2\n')
time.sleep(3)
sys.stderr.write('stderr 2\n')
output
stderr 1
stderr 2
stdout 1
stdout 2
0
What I want is to get the output in order, as it is produced.
I tested capturing the output of a shell script and that works as expected.

This is because of Python's output buffering. Call python with the -u flag and each write is flushed immediately, so the output arrives in order and in real time.
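A minimal sketch of that change, reusing the question's test.py and cwd (the -u flag disables Python's stdout/stderr buffering; setting the PYTHONUNBUFFERED=1 environment variable has the same effect):

const { spawn } = require('child_process');

// -u makes Python flush every write immediately, so the 'data'
// handlers fire in the same order the script produces output.
const child = spawn('python', ['-u', 'test.py'], { cwd: 'path' });

child.stdout.on('data', (data) => console.log(data.toString()));
child.stderr.on('data', (data) => console.log(data.toString()));
child.on('close', (code) => console.log(`exited with code ${code}`));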

Related

How to communicate with external CLIs with nodejs

When interacting with CLIs, for example npm init, we can run the command and get the output with the following code:
const { exec } = require('child_process');
exec('npm init', (err, stdout, stderr) => {
    if (err) {
        console.error(err);
    } else {
        console.log(`stdout: ${stdout}`);
        console.log(`stderr: ${stderr}`);
    }
});
But we cannot pass the project name, version, etc. How can this be achieved? Please answer using the npm init command as an example.
Thanks in advance :)
Use the stdin channel each process provides. For that, use Node's child_process.spawn method instead:
const { spawn } = require('child_process');
const npm = spawn('npm', ["init"]);
npm.stdout.pipe(process.stdout);
npm.stderr.pipe(process.stderr);
npm.on("exit", () => {
console.log("npm exited");
process.exit();
});
const answers = [
"my-awesome-cli", // package name
"0.0.1", // version number
"desciprtion", // description
"index.js", // entry point
"", // test command
"", // git reposiroty
"", // keywords
"Marc Stirner", // author
"MIT" // license
];
setInterval(() => {
if (answers.length > 0) {
// get first item from array
let answer = answers.shift();
// print value we pass to npm
console.log("Write to npm child:", answer);
// write chunk to stdin
npm.stdin.write(`${answer}\r\n`);
} else {
//npm.stdin.end();
console.log("Hit final enter")
npm.stdin.write(`\r\n`);
}
}, 800);
My example spawns the npm command, uses the stdin channel to write the answers to the process, and pipes the stdout and stderr output from the npm command to the Node.js process.
You can do this with exec too, since it also returns a ChildProcess object.
Read more in the Node.js docs; they are very well documented.
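For completeness, a minimal sketch of the same idea with exec; the answer value written to stdin here is just a placeholder:

const { exec } = require('child_process');

// exec also returns a ChildProcess, so its stdin and stdout streams
// can be used while the callback still receives the buffered output.
const npm = exec('npm init', (err, stdout) => {
    if (err) return console.error(err);
    console.log(stdout);
});

npm.stdout.pipe(process.stdout);   // watch the prompts live
npm.stdin.write('my-cli\r\n');     // answer the first prompt (package name)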

How to prefix every line of the output of a child_process spawn() call with text?

I execute several processes using spawn() from child_process.
const { spawn } = require('child_process');
spawn("sh", ["script1.sh"], { shell: false, stdio: "inherit"});
spawn("sh", ["script2.sh"], { shell: false, stdio: "inherit"});
spawn("sh", ["script3.sh"], { shell: false, stdio: "inherit"});
Question: How to prefix the output of the scripts with text?
The idea is that I would be able to easily distinguish what each script logs.
I went through the spawn documentation but couldn't find anything that would help achieve this.
Note: I cannot modify the scripts.
You have stdio set to "inherit".
If you set this to "pipe", Node.js will give you back readable streams for stdout and stderr, which you can use to add your prefix.
const { spawn } = require('child_process');
const one = spawn("sh", ["script1.sh"], { shell: false, stdio: "pipe" });

let oneStdout = '';

one.stdout.on('data', function (chunk) {
    oneStdout += chunk;
    const lines = oneStdout.split('\n');
    while (lines.length > 1) {
        const line = lines.shift();
        console.log('one', line);
    }
    oneStdout = lines.shift();
});

one.stdout.on('end', function () {
    console.log('one', oneStdout);
});
Here is the relevant section in the docs: https://nodejs.org/api/child_process.html#child_process_subprocess_stdio
Potential gotcha:
When "prefixing" you likely want to prefix each new line but not all scripts write to stdout a full line at a time. Play around with a few scripts that use echo -n "foobar" throughout the output to test that you're handling line breaks correctly.
Here is how I run external commands and capture their output, e.g. to prefix each line with a timestamp:
const { spawn } = require("child_process"),
command = "echo",
args = ["Studies have shown that people watch TV more than any other appliance."];
const child = spawn(command, args);
child.stdout.on('data', buff => {
const line = buff.toLocaleString();
console.info(new Date(), line);
});
child.stderr.on('data', buff => { // also catch any error output
const line = buff.toLocaleString();
console.error(new Date(), line);
});
child.on('exit', code => {console.info(`Command exited with code ${code}`)});
Here's what it looks like when you run it:
$ node thatScript.js
2020-12-10T11:46:51.455Z Studies have shown that people watch TV more than any other appliance.
Command exited with code 0
$

Electron and NodeJS: Execute shell command asynchronously with live stream

Electron: get file conversion percent in real time:
I want to run the command ffmpeg -i video.mp4 (example) to convert a video into another format, and get the conversion percentage that is streamed in the process output into my Electron app or Node.js.
I've tried every method: spawn, fork, exec, and all of them return only the last line of the process output. I want a live stream of each line as it is written, so I can show the progress percentage.
I've tried:
Fork
const {fork} = require('child_process')
const forked = fork('ffmpeg -i video.mp4');
forked.on('message', (msg) => {
    console.log(msg);
});
Exec Alternative 1
const execFile = require('child_process').execFile;
execFile('ffmpeg -i video.mp4', [], (e, stdout, stderr) => {
    if (e instanceof Error) {
        console.error(e);
    }
    console.log('stdout ', stdout);
    console.log('stderr ', stderr);
});
Exec Alternative 2
const exec = require('child_process').exec;
exec('ffmpeg -i video.mp4', (error, stdout, stderr) => {
    console.log(stdout);
});
/* EXEC Alternative 2 */
const exec = require('child_process').exec;
const proccessing = exec('ffmpeg -i video.mp4');
proccessing.stdout.on('data', function (data) {
    console.log(data);
});
proccessing.stdout.pipe(process.stdout);
Spawn
const spawn = require('child_process').spawn;
const processing = spawn('ffmpeg -i video.mp4');
processing.stdout.on('data', function (data) {
    console.log('stdout: ' + data.toString());
});
processing.stderr.on('data', function (data) {
    console.log('stderr: ' + data.toString());
});
processing.on('exit', function (code) {
    console.log('code ' + code.toString());
});
SUMMARY:
🎯Goal:
Get these results in the console:
10% converted
15% converted
20% converted
100% converted...
❌Error:
What I'm getting:
100% converted
//Sometimes I get an empty string because it is the last line of the .exe script
BEFORE MARKING THIS AS A DUPLICATE: I'M SURE NO ANSWER ON STACK OVERFLOW WORKED FOR ME
You need to use ffmpeg with ffmpeg-progress-wrapper. Attach a listener to the "progress" event and read its "progress" property:
process.on('progress', (progress) => console.log(JSON.stringify(progress.progress)));
It goes from 0 to 1, so you will need to scale it (for example, multiply by 100 for a percentage).
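If you would rather not add a dependency, note that ffmpeg writes its progress statistics to stderr, so spawning it directly also streams them live; a rough sketch (the output file name and the time= parsing are assumptions, not part of the original answer):

const { spawn } = require('child_process');

// Pass the command and its arguments separately to spawn.
const ffmpeg = spawn('ffmpeg', ['-i', 'video.mp4', 'output.webm']);

// ffmpeg prints its stats (frame=..., time=..., speed=...) to stderr.
ffmpeg.stderr.on('data', (chunk) => {
    const match = chunk.toString().match(/time=(\d+):(\d+):(\d+\.\d+)/);
    if (match) {
        const [, h, m, s] = match;
        const seconds = Number(h) * 3600 + Number(m) * 60 + Number(s);
        console.log(`processed ${seconds.toFixed(1)}s of the input so far`);
    }
});

ffmpeg.on('exit', (code) => console.log(`ffmpeg exited with code ${code}`));

To turn that into a percentage you would also need the input's total duration (for example from ffprobe).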

How to pass stream input/output between a main process and a child process in Node.js

I'm trying to get two processes to "talk" to each other via stdio:
ping.js
import { readline } from '../readline';
import { sleep } from '../sleep';
import { spawn } from 'child_process';
const spawnPongProcess = () => {
    const child = spawn('node',
        ['-r', 'esm', `${__dirname}/pong.js`],
        { stdio: 'pipe' });
    child.stderr.on('data', (data) => {
        console.error(`stderr: ${data}`);
    });
    child.on('close', (code) => {
        console.error(`child process exited with code ${code}`);
        process.stdin.resume();
        process.stdout.resume();
    });
    child.on('error', (err) => {
        console.error(`child process error: ${err}`);
    });
    process.stdout.pipe(child.stdin);
    child.stdout.pipe(process.stdin);
    return child;
};

export const ping = async () => {
    const child = spawnPongProcess();
    await sleep(1000);
    console.log('ping');
    let pong = readline();
    console.error(`Ping received: ${pong}`);
    child.kill();
};
I pipe the parent process' stdout to the child process stdin and the child process stdout to the parent process stdin in an effort to allow the processes to communicate via stdio.
pong.js
import { readline } from '../readline';
import { sleep } from '../sleep';
const pong = async () => {
    console.log(`Pong initiated and waiting for input.`);
    let ping = readline();
    console.log(`Pong received: ${ping}`);
    process.exit();
};
pong();
readline.js
import { question } from 'readline-sync';
export const readline = question;
sleep.js
export const sleep = (ms) => {
    return new Promise((resolve) => setTimeout(resolve, ms));
};
The output is:
$ node -r esm src/index.js
Pong initiated and waiting for input.
ping
It appears that the output from the parent process (ping) is not getting through to the child process (pong). Any ideas on how to make it work?
You piped your process's stdout (a Writable) into the child's stdin (also a Writable), and vice versa. Data comes in on stdin (a Readable), so that is the stream you have to pipe, not stdout:
process.stdin.pipe(child.stdin);
child.stdout.pipe(process.stdout);
Your code doesn't throw only because, when stdout is attached to a terminal, it is actually a Duplex stream.
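Applied to the question's spawnPongProcess, the wiring would look roughly like this (a sketch; the rest of the function stays unchanged):

const spawnPongProcess = () => {
    const child = spawn('node',
        ['-r', 'esm', `${__dirname}/pong.js`],
        { stdio: 'pipe' });

    // Parent's readable stdin feeds the child's writable stdin;
    // the child's readable stdout feeds the parent's writable stdout.
    process.stdin.pipe(child.stdin);
    child.stdout.pipe(process.stdout);

    return child;
};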

Data stream handlers and CLI incorporation Node JS

I am trying to create an interactive CLI that can run serial commands. I have two files serialcomms.js and cli.js. Serialcomms.js contains the connection, handlers, and command functions. cli.js contains the commander information.
My issue is that I can only call the send command once, because the listeners/handlers from the serialcomms file take over. What would be the best method to loop the CLI program so I can call the send command over and over again, while still having the serial handlers running and printing to stdout? Would I need to use a child process, or recursion so the CLI calls itself?
Example behavior I am expecting with an echo bot on the other end of the serial line.
Send hello
hello
Send Bye
Bye
Behavior I am experiencing
Send hello
hello
endless wait
Here is my serialcomms.js
const SerialPort = require('serialport');
const ReadLine = require('@serialport/parser-readline');
let portName = `/dev/pts/${process.argv[2]}` || '/dev/pts/6';
let baudRate = process.argv[3] || 115200;
let myPort = new SerialPort(portName, {baudRate: baudRate})
let parser = myPort.pipe(new ReadLine({ delimiter: '\n' }))
myPort.on('open', () => {
    console.log(`port ${portName} open`);
});

parser.on('data', (data) => {
    console.log(data);
});

myPort.on('close', () => {
    console.log(`port ${portName} closed`);
});

myPort.on('error', (err) => {
    console.error('port error: ' + err);
});

function send(data) {
    myPort.write(JSON.stringify(data) + '\n', function (err) {
        if (err) {
            return console.log('Error on write: ', err.message);
        }
        console.log(`${data} sent`);
    });
}

module.exports = {
    send
};
Here is my CLI.js file
const program = require('commander');
const {send} = require('./serialcomms');
program
    .version('1.0.0')
    .description('Serial Tester');

program
    .command('send <msg>')
    .alias('s')
    .description('send a message over serial')
    .action((msg) => {
        send(msg);
    });

program.parse(process.argv);
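A minimal sketch of the looping idea described in the question, using Node's built-in readline module together with the send function exported by serialcomms.js; the prompt text and the exit command are assumptions:

const readline = require('readline');
const { send } = require('./serialcomms');

// Keep a prompt loop alive so send() can be called repeatedly while the
// serial event handlers keep printing incoming data to stdout.
const rl = readline.createInterface({ input: process.stdin, output: process.stdout });

rl.setPrompt('> ');
rl.prompt();

rl.on('line', (line) => {
    const msg = line.trim();
    if (msg === 'exit') {
        rl.close();
        return;
    }
    send(msg);
    rl.prompt();
});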
