To run a batch file from Node I am using the code below, and it works fine.
const { spawn } = require('child_process');
const bat = spawn('cmd.exe', ['/c','D:/somefolder/bin/tools/caprequestutil.bat']);
bat.stdout.on('data', (data) => {
  console.log('data is: ' + data.toString());
});
bat.stderr.on('data', (data) => {
  console.error('error is: ' + data.toString());
});
bat.on('exit', (code) => {
  console.log(`Child exited with code ${code}`);
});
Now I want to pass some parameters to this batch file, which is not working for me. I tried simply appending the parameters after the .bat path, as shown below:
const bat = spawn('cmd.exe', ['/c','D:/somefolder/bin/tools/caprequestutil.bat -id D:/somefolder/bin/tools/IdentityClient.bin -h 1234-567-8765-4532 -type STRING -attr key value -activate acdvdsfdc count https://someurl/foo -full']);
How can I pass these parameters?
Found the answer.
The trick is to pass each parameter as its own element of the arguments array (every element must be a string; nesting arrays inside the args array does not work), like so:
const bat = spawn('cmd.exe', ['/c', 'D:/somefolder/bin/tools/caprequestutil.bat', '-id', 'D:/somefolder/bin/tools/IdentityClient.bin', '-h', '1234-567-8765-4532', '-type', 'STRING', '-attr', 'key', 'value', '-activate', 'acdvdsfdc', 'count', 'https://someurl/foo', '-full']);
Related
I execute several processes using spawn() from child_process.
const { spawn } = require('child_process');
spawn("sh", ["script1.sh"], { shell: false, stdio: "inherit"});
spawn("sh", ["script2.sh"], { shell: false, stdio: "inherit"});
spawn("sh", ["script3.sh"], { shell: false, stdio: "inherit"});
Question: How to prefix the output of the scripts with text?
The idea is that I would be able to easily distinguish what each script logs.
I went through the spawn documentation but couldn't find anything that would help achieve this.
Note: I cannot modify the scripts.
You have stdio set to "inherit".
If you set this to "pipe", Node.js will give you back readable streams for stdout and stderr, which you can use to add your prefix.
const { spawn } = require('child_process');
const one = spawn("sh", ["script1.sh"], { shell: false, stdio: "pipe" });
let oneStdout = '';
one.stdout.on('data', function (chunk) {
  oneStdout += chunk;
  const lines = oneStdout.split('\n');
  // Print every complete line with the prefix...
  while (lines.length > 1) {
    const line = lines.shift();
    console.log('one', line);
  }
  // ...and keep the trailing partial line in the buffer.
  oneStdout = lines.shift();
});
one.stdout.on('end', function () {
  // Flush whatever was left when the stream ended.
  console.log('one', oneStdout);
});
Here is the relevant section in the docs: https://nodejs.org/api/child_process.html#child_process_subprocess_stdio
Potential gotcha:
When "prefixing" you likely want to prefix each new line but not all scripts write to stdout a full line at a time. Play around with a few scripts that use echo -n "foobar" throughout the output to test that you're handling line breaks correctly.
Here is how I run external commands and capture their output, e.g. to prefix each line with a timestamp:
const { spawn } = require("child_process"),
      command = "echo",
      args = ["Studies have shown that people watch TV more than any other appliance."];
const child = spawn(command, args);
child.stdout.on('data', buff => {
  const line = buff.toLocaleString();
  console.info(new Date(), line);
});
child.stderr.on('data', buff => { // also catch any error output
  const line = buff.toLocaleString();
  console.error(new Date(), line);
});
child.on('exit', code => { console.info(`Command exited with code ${code}`); });
Here's what it looks like when you run it:
$ node thatScript.js
2020-12-10T11:46:51.455Z Studies have shown that people watch TV more than any other appliance.
Command exited with code 0
$
Electron: get file conversion percent in real-time:
I want to run the command ffmpeg -i video.mp4 (for example) to convert a video into another format, and capture the conversion percentage that is streamed in the process output, in my Electron app or Node.js.
I've tried all the methods: spawn, fork, exec, and all of them return only the last line of the process output. I want a live stream of each line as it is written, so I can show the percent progress.
I've tried:
Fork
const {fork} = require('child_process')
const forked = fork('ffmpeg -i video.mp4');
forked.on('message', (msg) => {
console.log(msg);
})
Exec Alternative 1
const execFile = require('child_process').execFile;
execFile('ffmpeg -i video.mp4', [], (e, stdout, stderr) => {
if (e instanceof Error){
console.error(e);
}
console.log('stdout ', stdout)
console.log('stderr ', stderr);
})
Exec Alternative 2
const exec = require('child_process').exec;
exec('ffmpeg -i video.mp4', (error, stdout, stderr) => {
console.log(stdout);
});
/*EXEC Alternative 2*/
const exec = require('child_process').exec;
const proccessing = exec('ffmpeg -i video.mp4');
proccessing.stdout.on('data', function(data) {
console.log(data);
});
proccessing.stdout.pipe(process.stdout);
Spawn
const spawn = require('child_process').spawn;
const processing = spawn('ffmpeg -i video.mp4');
processing.stdout.on('data', function (data) {
  console.log('stdout: ' + data.toString());
});
processing.stderr.on('data', function (data) {
  console.log('stderr: ' + data.toString());
});
processing.on('exit', function (code) {
  console.log('code ' + code.toString());
});
SUMMARY:
🎯Goal:
Get this results in the console
10% converted
15% converted
20% converted
100% converted...
❌Error:
What I'm getting:
100% converted
//Sometimes I get an empty string because it is the last line of the .exe script
Before you mark this as a duplicate: I'm sure none of the existing Stack Overflow answers worked for me.
You can use ffmpeg with the ffmpeg-progress-wrapper package. Attach a listener for its "progress" event and read the "progress" property:
process.on('progress', (progress) => console.log(JSON.stringify(progress.progress)));
It goes from 0 to 1, so you will need to scale it to a percentage.
I'm relatively new to Cloud Functions and have been trying to solve this issue for a while. Essentially, the function I'm trying to write is called whenever there is a complete upload onto Firebase Cloud Storage. However, when the function runs, half the time, it runs to the following error:
The following error occured: { Error: ENOENT: no such file or directory, open '/tmp/dataprocessing/thisisthefilethatiswritten.zip'
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: '/tmp/dataprocessing/thisisthefilethatiswritten.zip' }
Here's the code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const inspect = require('util').inspect;
const path = require('path');
const os = require('os');
const fs = require('fs-extra');

admin.initializeApp(); // must run before admin.firestore()/admin.storage()
const firestore = admin.firestore();
const storage = admin.storage();

const runtimeOpts = {
  timeoutSeconds: 540,
  memory: '2GB'
};
const uploadprocessing = functions.runWith(runtimeOpts).storage.object().onFinalize(async (object) => {
  const filePath = object.name;
  const fileBucket = object.bucket;
  const bucket_fileName = path.basename(filePath);
  const uid = bucket_fileName.match('.+?(?=_)');
  const original_filename = bucket_fileName.split('_').pop();
  const bucket = storage.bucket(fileBucket);
  const workingDir = path.join(os.tmpdir(), 'dataprocessing/');
  const tempFilePath = path.join(workingDir, original_filename);
  await fs.ensureDir(workingDir);
  await bucket.file(filePath).download({ destination: tempFilePath });
  // this particular code block I included because I was worried that the file wasn't
  // being uploaded to the tmp directory, but the results of the function
  // seem to confirm to me that the file does exist.
  await fs.ensureFile(tempFilePath);
  console.log('success!');
  fs.readdirSync(workingDir).forEach(file => {
    console.log('file found: ', file);
  });
  console.log('post success');
  fs.readdirSync('/tmp/dataprocessing').forEach(file => {
    console.log('tmp file found: ', file);
  });
  fs.readFile(tempFilePath, function (err, buffer) {
    if (!err) {
      // data processing comes here. Please note that half the time it never gets into this
      // branch, as instead it goes into the else branch below and outputs that error.
    }
    else {
      console.log("The following error occured: ", err);
    }
  });
  fs.unlinkSync(tempFilePath);
  return;
});
module.exports = uploadprocessing;
I've been trying so many different things, and the weird part is that when I add code inside the if (!err) branch (which doesn't actually run, because of the err), it sometimes arbitrarily starts working quite consistently, but then stops working when I add different code. I would have assumed the issue came from the code I added, but the error shows up even when I literally just change/add/remove comments, which should have no effect on how the function runs...
Any thoughts? Thank you in advance!!! :)
fs.readFile is asynchronous and returns immediately. Your callback function is invoked some time later with the contents of the buffer. This means that fs.unlinkSync is going to delete the file at the same time it's being read. This means you effectively have a race condition, and it's possible that the file will be removed before it's ever read.
Your code should wait until the read is complete before moving on to the delete. Perhaps you want to use fs.readFileSync instead.
I want to perform CRUD operations on my collection. To automate this, I wrote a script for inserting documents into my collection.
The script should basically read the data from a JSON file and insert it into my database collection.
'use strict';
const fs = require('fs');
const path = require('path');
const { spawn }= require('child_process');
(function () {
const username = process.env.DB_USERNAME,
password = process.env.DB_PASSWORD,
args = process.argv.slice(3),
log = console.log,
mongoServer = "mongodb+srv://***server-name***/";
let database,
documents,
collection;
if (args.length === 2) {
database = args[0];
collection = args[1];
const raw = fs.readFileSync(path.join(__dirname, 'documents.json'));
documents = JSON.parse(raw);
log(documents)
const writeAction = spawn('mongo', [`\"${mongoServer}${database}\" -u ${username} -p ${password} --eval \"db.${collection}.insert(${documents})\" `], {shell: true});
writeAction.stdout.on('data', data => log((`stout: ${data}`));
writeAction.stderr.on('data', data => log((`StdErr: ${data}`)));
writeAction.on('close', (code) => log(`Child process exited with code: ${code}.`));
} else {
log('A database and a collection has to be specified!\n In Order:\n 1. Database\n 2. Collection');
process.exit(1);
}
})();
If I read the JSON file, the console logs the following:
[ { id: '3685b542-61d5-45da-9580-162dca725966',
mission:
'The American Red Cross prevents and alleviates human suffering in the face of emergencies by mobilizing the power of volunteers and the generosity of donors.',
street1: '2025 E Street, NW',
profile_url:
'https://www.pledgeling.com/organizations/42/american-red-cross' } ]
So the JSON looks fine to me, but when I execute the script it throws this error:
stout: 2019-11-08T18:08:33.901+0100 E QUERY [js] uncaught exception: SyntaxError: missing ] after element list :
#(shell eval):1:23
2019-11-08T18:08:33.901+0100 E - [main] exiting with code -4
Does any of you know how to overcome this error?
The mongo shell clearly does not see valid JSON: interpolating the parsed array into the template string calls its toString(), which turns each object into "[object Object]". So first try interpolating the JSON text (e.g. JSON.stringify(documents), or the raw file contents) instead of the parsed array, and test. Also, do not forget to force 'utf8' encoding:
const raw = fs.readFileSync(path.join(__dirname, 'documents.json'), 'utf8');
documents = JSON.parse(raw);
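To illustrate why the interpolation breaks (the collection name and documents below are made-up stand-ins for the question's data):

```javascript
// Sample data standing in for documents.json.
const documents = [{ id: '3685b542', mission: 'example' }];
const collection = 'charities'; // hypothetical collection name

// Interpolating the array directly calls its toString(), which the mongo
// shell cannot parse:
const broken = `db.${collection}.insert(${documents})`;
console.log(broken); // → db.charities.insert([object Object])

// Re-serializing with JSON.stringify keeps it valid shell JavaScript:
const fixed = `db.${collection}.insert(${JSON.stringify(documents)})`;
console.log(fixed); // → db.charities.insert([{"id":"3685b542","mission":"example"}])
```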
the line...
writeAction.stdout.on('data', data => log((`stout: ${data}`));
...is missing a closing parenthesis. It should read:
writeAction.stdout.on('data', data => log(`stout: ${data}`));
I am trying to create an interactive CLI that can run serial commands. I have two files serialcomms.js and cli.js. Serialcomms.js contains the connection, handlers, and command functions. cli.js contains the commander information.
My issue is that I can only call the send command once because the listeners/handlers take over from the serialcomms file. What would be the best method to loop the cli program so I can call the send command over and over again, but still have the serial handlers running and output to stdout? Would I need to use a child process? Or recursion to have the cli call itself?
Example behavior I am expecting with an echo bot on the other end of the serial line.
Send hello
hello
Send Bye
Bye
Behavior I am experiencing
Send hello
hello
endless wait
Here is my serialcomms.js
const SerialPort = require('serialport');
const ReadLine = require('@serialport/parser-readline');
let portName = process.argv[2] ? `/dev/pts/${process.argv[2]}` : '/dev/pts/6';
let baudRate = Number(process.argv[3]) || 115200;
let myPort = new SerialPort(portName, {baudRate: baudRate})
let parser = myPort.pipe(new ReadLine({ delimiter: '\n' }))
myPort.on('open', () => {
  console.log(`port ${portName} open`);
});
parser.on('data', (data) => {
  console.log(data);
});
myPort.on('close', () => {
  console.log(`port ${portName} closed`);
});
myPort.on('error', (err) => {
  console.error('port error: ' + err);
});
function send(data) {
  myPort.write(JSON.stringify(data) + '\n', function (err) {
    if (err) {
      return console.log('Error on write: ', err.message);
    }
    console.log(`${data} sent`);
  });
}
module.exports = {
send
}
Here is my CLI.js file
const program = require('commander');
const {send} = require('./serialcomms');
program
  .version('1.0.0')
  .description('Serial Tester');
program
  .command('send <msg>')
  .alias('s')
  .description('send a message over serial')
  .action((msg) => {
    send(msg);
  });
program.parse(process.argv);