When interacting with CLIs, for example npm init, we can run the command and get the output with the following code:
const { exec } = require('child_process');

exec('npm init', (err, stdout, stderr) => {
  if (err) {
    console.error(err);
  } else {
    console.log(`stdout: ${stdout}`);
    console.log(`stderr: ${stderr}`);
  }
});
But we cannot pass the project name, version, etc. How can this be achieved? Please answer with the example of the npm init command.
Thanks in advance :)
Use the stdin channel each process provides. For that, use Node's child_process.spawn method instead:
const { spawn } = require('child_process');

const npm = spawn('npm', ['init']);

npm.stdout.pipe(process.stdout);
npm.stderr.pipe(process.stderr);

npm.on('exit', () => {
  console.log('npm exited');
  process.exit();
});
const answers = [
  "my-awesome-cli", // package name
  "0.0.1",          // version number
  "description",    // description
  "index.js",       // entry point
  "",               // test command
  "",               // git repository
  "",               // keywords
  "Marc Stirner",   // author
  "MIT"             // license
];
const interval = setInterval(() => {
  if (answers.length > 0) {
    // get the first item from the array
    let answer = answers.shift();
    // print the value we pass to npm
    console.log("Write to npm child:", answer);
    // write the chunk to stdin
    npm.stdin.write(`${answer}\r\n`);
  } else {
    // npm.stdin.end();
    console.log("Hit final enter");
    npm.stdin.write(`\r\n`);
    // stop the interval once the final confirmation has been sent
    clearInterval(interval);
  }
}, 800);
My example spawns the npm command, uses the stdin channel to write the answers to the process, and pipes the stdout and stderr output from the npm command to the Node.js process.
You can do this with exec too, since it also returns a ChildProcess object.
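For example, a minimal sketch of the same idea with exec (the returned ChildProcess exposes stdin just like spawn does):

const { exec } = require('child_process');

// exec also returns a ChildProcess, so its stdin is writable too
const child = exec('npm init', (err, stdout, stderr) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(`stdout: ${stdout}`);
});

// answer the first prompt, exactly as in the spawn example above
child.stdin.write('my-awesome-cli\r\n');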
Read more in the Node.js docs; they are very well documented.
I'm trying to do a simple commit & push to an existing repo using simple-git, however I can't find any example of this in the simple-git API docs on npm (or GitHub).
I'm talking about this package: https://www.npmjs.com/package/simple-git
Consider the code:
const gitRepo = 'REPO-URL';
const tempFolder = '...';
// Simple Git
const simpleGit = require('simple-git')();
const options = ['--depth', '1'];
const callback = () => {
  console.log('Done cloning!');
  // Now change some code in the cloned code
  // and commit && push
};
// Cloning ...
simpleGit.outputHandler((command, stdout, stderr) => {
  stdout.pipe(process.stdout);
  stderr.pipe(process.stderr);
  stdout.on('data', (data) => {
    // Print data
    console.log(data.toString('utf8'));
  });
})
.clone(gitRepo, tempFolder, options, callback);
How can we commit and push using simple-git?
Like @Lawrence Cherone said:
You can just use the basic commands as is.
This is how I used it in my project (though I have a submodule in it, so this example changes the git working directory to content (the submodule) first). After that I just commit with a message.
app.post("/gitCommit", async function(req, res) {
  try {
    await git.cwd({ path: 'content' }).commit(req.body.msg);
    res.sendStatus(200);
  } catch (err) {
    console.log(err);
    res.sendStatus(500); // report the failure to the client as well
  }
});
If you already have a working and initialised repo you're in,
then you could just do the following:
await git.commit("your_message")
await git.push()                   // push to the default remote
await git.push('origin', 'master') // or name the remote and branch explicitly
You can leave out the await part depending on whether your code runs async.
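Put together, a minimal sketch of the whole flow (the commitAndPush wrapper and the message are just for illustration):

const simpleGit = require('simple-git');
const git = simpleGit();

async function commitAndPush() {
  await git.add('.');                  // stage all changes
  await git.commit('your_message');    // commit them
  await git.push('origin', 'master');  // push to the named remote/branch
}

commitAndPush().catch(console.error);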
I am learning VS Code extension development and I want to know whether spawn can get the output in real time.
I run the following code.
const { spawn } = require('child_process');
const child = spawn('python', ['test.py'], { cwd: 'path' });
child.stdout.on('data', (data: Uint8Array) => {
  console.log(data.toString());
});
child.stderr.on('data', (data: Uint8Array) => {
  console.log(data.toString());
});
child.on('close', (code: number) => {
  console.log(` ${code}`);
});
test.py
import sys
import time
sys.stdout.write('stdout 1\n')
time.sleep(1)
sys.stderr.write('stderr 1\n')
time.sleep(2)
sys.stdout.write('stdout 2\n')
time.sleep(3)
sys.stderr.write('stderr 2\n')
output
stderr 1
stderr 2
stdout 1
stdout 2
0
What I want is to be able to get the output in order.
I tested it, and capturing the output of a shell script works normally.
This is because of Python's output buffering; just call python with the -u parameter added.
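A minimal sketch of the adjusted spawn call (same placeholder cwd as in the question):

const { spawn } = require('child_process');

// -u forces Python to run unbuffered, so stdout and stderr arrive in order
const child = spawn('python', ['-u', 'test.py'], { cwd: 'path' });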
I execute several processes using spawn() from child_process.
const { spawn } = require('child_process');
spawn("sh", ["script1.sh"], { shell: false, stdio: "inherit"});
spawn("sh", ["script2.sh"], { shell: false, stdio: "inherit"});
spawn("sh", ["script3.sh"], { shell: false, stdio: "inherit"});
Question: How to prefix the output of the scripts with text?
The idea is that I would be able to easily distinguish what each script logs.
I went through the spawn documentation but couldn't find anything that would help achieve this.
Note: I cannot modify the scripts.
You have stdio set to "inherit".
If you set this to "pipe", Node.js will give you back readable streams for stdout and stderr, which you can use to add your prefix.
const { spawn } = require('child_process');

const one = spawn("sh", ["script1.sh"], { shell: false, stdio: "pipe" });

let oneStdout = '';

one.stdout.on('data', function (chunk) {
  // buffer the chunk and emit only complete lines, each with a prefix
  oneStdout += chunk;
  const lines = oneStdout.split('\n');
  while (lines.length > 1) {
    const line = lines.shift();
    console.log('one', line);
  }
  // keep the trailing partial line for the next chunk
  oneStdout = lines.shift();
});

one.stdout.on('end', function () {
  // flush whatever is left once the stream closes
  console.log('one', oneStdout);
});
Here is the relevant section in the docs: https://nodejs.org/api/child_process.html#child_process_subprocess_stdio
Potential gotcha:
When prefixing, you likely want to prefix each new line, but not all scripts write a full line to stdout at a time. Play around with a few scripts that use echo -n "foobar" throughout the output to test that you're handling line breaks correctly.
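If you don't want to write shell scripts for that, a hypothetical way to simulate partial-line output using Node itself as the child:

const { spawn } = require('child_process');

// the child writes "foo" without a newline, then completes the line later
const child = spawn(process.execPath, ['-e',
  'process.stdout.write("foo"); setTimeout(() => process.stdout.write("bar\\n"), 500);'
]);

// the line should arrive in two separate chunks
child.stdout.on('data', chunk => {
  console.log('chunk:', JSON.stringify(chunk.toString()));
});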
Here is how I run external commands and capture their output, e.g. to prefix each line with a timestamp:
const { spawn } = require("child_process"),
      command = "echo",
      args = ["Studies have shown that people watch TV more than any other appliance."];

const child = spawn(command, args);

child.stdout.on('data', buff => {
  const line = buff.toLocaleString();
  console.info(new Date(), line);
});
child.stderr.on('data', buff => { // also catch any error output
  const line = buff.toLocaleString();
  console.error(new Date(), line);
});
child.on('exit', code => { console.info(`Command exited with code ${code}`) });
Here's what it looks like when you run it:
$ node thatScript.js
2020-12-10T11:46:51.455Z Studies have shown that people watch TV more than any other appliance.
Command exited with code 0
$
I have the following very simple Node project:
https://github.com/tlg-265/chokidar-issue
$ git clone https://github.com/tlg-265/chokidar-issue
$ cd chokidar-issue
$ npm i
$ npm run watch-changes
which basically takes care of detecting changes to the file:
/profiles/bill-gates.json
and doing an action just after that.
In order to do that I have the following file:
/profile-watcher.js
const fs = require('fs-extra');
const colors = require('colors/safe');
const chokidar = require('chokidar');

const path_file = `profiles/bill-gates.json`;
console.log(`Current Profile: ${colors.red.bgBrightYellow(path_file)}`);

let profile_before = {};

chokidar.watch(path_file).on('change', async (path) => {
  console.log();
  console.log(`${colors.blue.bgYellow(`->`)} Profile changed: ${path}`);
  fs.readFile(path, (err, profile_json) => {
    console.log(`->${profile_json}<-`);
    let profile = JSON.parse(profile_json);
    if (JSON.stringify(profile) != JSON.stringify(profile_before)) {
      console.log('The profile has changed.');
      profile_before = profile;
    }
  });
});
when I run the project with:
$ npm run watch-changes
and do the modifications below on file: /profiles/bill-gates.json
modification 1: Bill Gates -> Bill Gates ABC
modification 2: Bill Gates ABC -> Bill Gates ABC DEF
it works fine, outputting the content of this file to the console.
But when I do the next modification:
modification 3: Bill Gates ABC DEF -> Bill Gates ABC DEF GHI
Then I get the following error:
-> Profile changed: profiles\bill-gates.json
-><-
undefined:1
SyntaxError: Unexpected end of JSON input
at JSON.parse (<anonymous>)
at fs.readFile (\chokidar-issue\profile-watcher.js:17:24)
at \chokidar-issue\node_modules\graceful-fs\graceful-fs.js:115:16
at FSReqWrap.readFileAfterClose [as oncomplete] (internal/fs/read_file_context.js:53:3)
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! chokidar-issue@1.0.0 watch-changes: `node profile-watcher.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the chokidar-issue@1.0.0 watch-changes script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Roaming\npm-cache\_logs\2020-02-28T23_44_01_038Z-debug.log
/profiles/bill-gates.json (Flags: UTF-8 / CRLF)
{
"name": "Bill Gates",
"email": "bill.gates#microsoft.com",
"password": "windows",
"country": "USA"
}
By the way, if I change from CRLF to LF I can normally do a few more modifications before it crashes.
I'm under the impression that for some reason the file /profiles/bill-gates.json gets locked at some point, and when Node tries to read it, it returns an empty string because it is locked.
Any idea on how to make this work without crashing after a few tries?
Thanks!
I had the same problem as you.
There is an option in chokidar called awaitWriteFinish. It's time-based and checks whether the size of the file is still changing. If not, it calls the callback.
const watcher = chokidar.watch(configPathString, {
  persistent: true,
  awaitWriteFinish: {
    stabilityThreshold: 500
  }
});

watcher.on('change', (path, stats) => {
  fs.readFile(configPathString, (err, data) => {
    if (err) throw err;
    //console.log('data', data);
    let receivedData = JSON.parse(data);
    //Do whatever you like
  });
});
It may be a race condition. Make your JSON.parse safe like this:
chokidar.watch(path_file).on('change', async (path) => {
  fs.readFile(path, 'utf8', (err, profile_json) => {
    // the watcher can fire while the file is mid-write and still empty
    if (!profile_json) {
      console.log(`${path} is an empty file!`);
      return;
    }
    const profile = JSON.parse(profile_json);
    if (JSON.stringify(profile) != JSON.stringify(profile_before)) {
      console.log('The profile has changed.');
      profile_before = profile;
    }
  });
});
I could make it work by adding a recovery fallback:
const fs = require('fs-extra');
const colors = require('colors/safe');
const chokidar = require('chokidar');
const sleep = require('sleep');

const path_file = `profiles/bill-gates.json`;
console.log(`Current Profile: ${colors.red.bgBrightYellow(path_file)}`);

let profile_before = fs.readFileSync(path_file).toString();

chokidar.watch(path_file).on('change', async (path_changed) => {
  let profile = fs.readFileSync(path_changed).toString();
  if (IsValidJson(profile)) {
    if (profile != profile_before) {
      console.log();
      console.log(`Profile changed: ${colors.red.bgBrightYellow(path_changed)}`);
      process_profile(profile);
      profile_before = profile;
    }
  } else {
    sleep.msleep(100); // this is necessary
  }
});

function process_profile(profile_json) {
  const profile = JSON.parse(profile_json);
  console.log(`${profile_json}`);
  console.log(profile.name);
}

function IsValidJson(str) {
  try {
    JSON.parse(str);
  } catch (e) {
    return false;
  }
  return true;
}
It seems that when you save a file (at least on Windows), sometimes there is a very short moment in which the file is cleared, and a few milliseconds later it gets the actual content. In both cases the change event is fired. So we just need to verify whether the content of the file is valid JSON; if it is not, I just ignore it and wait for the next change event.
I want to do semi-automation to create users for my API, so I have this script:
//getUserInfo.js
const argv = require('yargs-parser')(process.argv.slice(2));

module.exports = (async () => {
  let [ userId ] = argv._; // parsing...
  if (!userId) {
    console.log('userId is not defined');
    return;
  }
  userId = userId.split('=')[1]; // parsing, again :(
  console.log(userId); // 123
  // proceed with automation steps
  ...
  ...
})()
The script is working, so in my package.json I have this:
"scripts": {
  "admin:getUserInfo": "node server/scripts/getUserInfo.js"
}
All I need to do is run npm run admin:getUserInfo userId=123 and I get 123 in my terminal.
But the problem is that I have to do so many steps just to get the userId value.
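For what it's worth, a hedged sketch: if the value is passed as a named flag after --, yargs-parser exposes it directly and the manual split becomes unnecessary:

$ npm run admin:getUserInfo -- --userId=123

//getUserInfo.js
const argv = require('yargs-parser')(process.argv.slice(2));
console.log(argv.userId); // 123 (yargs-parser turns --userId=123 into { userId: 123 })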