I need to run two shell commands, one by one. These commands are wrapped in two functions:
const { exec } = require('child_process');

function myFucn1() {
  exec('some command', (error, stdout, stderr) => {
    if (error) {
      console.error(`exec error: ${error}`);
      throw error;
    }
    console.log(`stdout: ${stdout}`);
    console.error(`stderr: ${stderr}`);
  });
}
and
function myFucn2() {
  exec('some command 2', (error, stdout, stderr) => {
    if (error) {
      console.error(`exec error: ${error}`);
      throw error;
    }
    console.log(`stdout: ${stdout}`);
    console.error(`stderr: ${stderr}`);
  });
}
When I call them in my trigger function:
app.get('/my_end_point', (req, res) => {
  try {
    myFucn1();
    myFucn2();
    res.send('Hello World, from express');
  } catch (err) {
    res.send(err);
  }
});
it runs both commands in a random order, and the stdout/stderr output is displayed only for the second function.
The reason the commands don't execute in the same order every time is that they are launched one after the other, but from then on JS has no control over how long they run. So, for a program like yours that is basically this:
launch cmd1, then do callback1
launch cmd2, then do callback2
respond to the client
you don't have any control over when callback1 and callback2 will get executed. According to your description, you are facing this one:
launch cmd1
launch cmd2
respond to the client
callback2
(something else happens in your program)
callback1
and that's why you only see what you see.
So, let's try to force their order of execution! You can use child_process's execSync, but I wouldn't recommend it for production, because it makes your server program stay idle the whole time your child processes are executing.
However, you can get a very similar syntax by using async/await and turning exec into a promise-returning function:
const { exec: execWithCallback } = require('child_process');
const { promisify } = require('util');
const exec = promisify(execWithCallback);

async function myFunc1() {
  try {
    const { stdout, stderr } = await exec('command 1');
    console.log(`stdout: ${stdout}`);
    console.error(`stderr: ${stderr}`);
  } catch (error) {
    console.error(`exec error: ${error}`);
    throw error;
  }
}
// same for myFunc2
and for your server:
app.get('/my_end_point', async (req, res) => {
  try {
    await myFunc1();
    await myFunc2();
    res.send('Hello World, from express');
  } catch (error) {
    res.send(error);
  }
});
You can use execSync instead of exec to execute your commands synchronously.
const { execSync } = require("child_process");

function myFucn1() {
  return execSync("echo hello").toString();
}

function myFucn2() {
  return execSync("echo world").toString();
}
myFucn1();
myFucn2();
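Note that execSync throws if the command exits with a non-zero code, so inside the Express endpoint from the question the try/catch still applies. A minimal sketch combining the two (the 500 status is an assumption, not part of the original):

app.get('/my_end_point', (req, res) => {
  try {
    const hello = myFucn1(); // blocks until the command finishes
    const world = myFucn2(); // runs strictly after myFucn1
    res.send(`${hello.trim()} ${world.trim()}`);
  } catch (err) {
    // execSync throws an Error on a non-zero exit code
    res.status(500).send(err.message);
  }
});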
It's due to the nature of JavaScript callback functions. exec is called, and the callback in { } is invoked when the result is available (i.e. when the command finishes). The function itself returns immediately, so the second function executes even before your first command is finished.
One possible solution (however not nice) is to put the call to myFucn2() inside the callback of myFucn1() (e.g. after the console.error).
The correct solution would be to use a separate thread (see 'worker threads') to track the execution of myFucn1() and, when it finishes, execute the second one.
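A minimal sketch of the callback-chaining approach mentioned above (the next parameter is added here purely for illustration):

function myFucn1(next) {
  exec('some command', (error, stdout, stderr) => {
    if (error) {
      console.error(`exec error: ${error}`);
      throw error;
    }
    console.log(`stdout: ${stdout}`);
    console.error(`stderr: ${stderr}`);
    if (next) next(); // start the second command only after the first completes
  });
}

// run them strictly in order
myFucn1(() => myFucn2());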
Related
I have a function on a server that executes a command for some task. When the command is executed, a lot of logs are displayed in the console:
const { exec } = require("child_process");

exec("command", (error, stdout, stderr) => {
  if (error) {
    console.log(`error: ${error.message}`);
    return;
  }
  if (stderr) {
    console.log(`stderr: ${stderr}`);
    return;
  }
  console.log(stdout);
  resp.status(200).json(stdout);
});
How can I return those logs one by one as they are displayed in the console, not all at once? stdout currently returns all the logs at once after completion. I want it to continuously return each log line to the frontend.
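One common approach is to switch from exec to spawn, which exposes the child's stdout as a stream instead of buffering it until exit. A minimal sketch, reusing the resp object and placeholder command from the question (the route path is an assumption):

const { spawn } = require("child_process");

app.get("/run", (req, resp) => {
  // spawn takes the command and its arguments separately
  const child = spawn("command", []);

  // forward each chunk of output to the client as soon as it arrives
  child.stdout.on("data", (chunk) => {
    resp.write(chunk);
  });

  child.stderr.on("data", (chunk) => {
    console.log(`stderr: ${chunk}`);
  });

  // close the response once the process exits
  child.on("close", () => {
    resp.end();
  });
});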
I have a little script that executes a child process using execFile. This child process is a Node script too that runs async operations, but it seems like the async operations never finish, so the terminal and all the processes are on hold.
This is the main script that runs the execFile for the child process:
fs.readdir(directoryPath, function (err, files) {
  if (err) console.log(`Error: ${err}`);
  files.map((file) => {
    execFile(`node`, ["updater.js", "BMW", file], (error, stdout, stderr) => {
      if (error) {
        red(`error: ${error.message}`);
        return;
      }
      if (stderr) {
        red(`stderr: ${stderr}`);
        return;
      }
      console.log(stdout);
    });
  });
});
And this is the Node script executed as the child process:
const args = process.argv.slice(2);
const brand = args[0];
const model = args[1];
const data = readJSON(`./json-export/${brand}/${model}`);
const generations = data.generations;
const generationsDB = await getGenerationsByModelAndBrand(brand, model);
console.log(generationsDB);
generations.map((generation) => {
  const lastModification =
    generation.modifications.modification[
      generation.modifications.modification.length - 1
    ];
  console.log(lastModification);
});
All the code works if I comment out the const generationsDB line and the next console.log. If not, when execution hits the async request, it gets stuck there.
I tested getGenerationsByModelAndBrand in the main script and it works with no issue.
getGenerationsByModelAndBrand runs a query on the database and returns a Promise.
This is the getGenerationsByModelAndBrand method code:
export const getGenerationsByModelAndBrand = (brand, model) => {
  return new Promise((resolve, reject) => {
    const sql = `DATABASE SELECT`;
    connection.query(sql, function (error, result) {
      if (error) return reject(error);
      return resolve(result);
    });
  });
};
connection comes from the mysql.createConnection method of the mysql package.
I believe the issue comes from the promise handling in the child process; it's like I'm missing something, but I couldn't find what it is.
Edit:
After researching I didn't find a solution or explanation for this issue, so in the meantime I moved the getGenerationsByModelAndBrand call to the parent script and pass the result as a parameter.
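One possible explanation (an assumption, not something confirmed above) is that the open mysql connection keeps the child's event loop alive, and execFile's callback only fires once the child process exits, so the parent appears to hang. Closing the connection when the work is done would let the child terminate; a minimal sketch of the child script:

(async () => {
  const generationsDB = await getGenerationsByModelAndBrand(brand, model);
  console.log(generationsDB);

  generations.map((generation) => {
    const lastModification =
      generation.modifications.modification[
        generation.modifications.modification.length - 1
      ];
    console.log(lastModification);
  });

  // assumes the mysql connection is visible here: an open connection
  // keeps the event loop alive, so the child process never exits
  connection.end();
})();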
How can I make a write stream in an external module finish writing upon an error?
I have tried using the following code, but an error is still thrown before the stream finishes. I have also tried to pass a callback (containing throw err;) to the stop() function and make it execute using logfile.on('end', () => { callback(); }), but that doesn't do anything.
index.js
process.on('uncaughtException', (err) => {
  logger.stop(); // function in external module
  throw err;
});
...
🧇🧇🧇 Oh no! Waffles broke the code, because they're evil!
logger.js
module.exports = {
  ...
  stop: () => {
    logfile.end(); // logfile is a global variable containing a write stream
  }
}
The problem can be solved by displaying the error with console.error(err) (so the program doesn't close automatically after the error is thrown) and by calling process.exit(1) in the external module once the finish event fires.
index.js
process.on('uncaughtException', (err) => {
  console.error(err);
  logger.stop();
});
...
🧇🧇🧇 Oh no! Waffles broke the code, because they're evil!
logger.js
module.exports = {
  ...
  stop: () => {
    logfile.on('finish', () => { process.exit(1); });
    logfile.end();
  }
}
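As a side note, a writable stream's end() also accepts a callback that is invoked once the stream has finished flushing, which would let stop() skip the separate listener. A minimal sketch:

module.exports = {
  ...
  stop: () => {
    // end([chunk[, encoding]][, callback]): the callback fires after 'finish'
    logfile.end(() => process.exit(1));
  }
}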
Problem
My try doesn't catch an error if it is thrown inside MongoClient's connect function.
Environment
Linux (Mint, Tessa)
Node.js v10.16.0 (using ES6 with nodemon)
MongoClient (from mongodb npm repository)
Example
If I try this:
try {
  throw new Error('This is error');
} catch (e) {
  console.log(`Catched: ${e}`);
}
I get a clean exit (it's fine, it works):
Catched: Error: This is error
[nodemon] clean exit - waiting for changes before restart
But this doesn't work
If I try it in MongoDB's connect function:
try {
  MongoClient.connect(config.url, config.options, (err, db) => {
    if (err) { throw new Error('This is error'); }
  });
} catch (err) {
  console.log(`Catched: ${err}`);
}
I get an app crash:
Error: This is error
[nodemon] app crashed - waiting for file changes before starting...
So it means it didn't catch my exception.
Try this
try {
  let db = await MongoClient.connect(config.url, config.options);
} catch (err) {
  console.log(`Catched: ${err}`);
}
Write your code in an async/await (sequential) style if you want try/catch to work. In your snippet you're getting err as the first argument of the callback, so why would it go to the catch block? The same thing happens with func1().then().catch() style code.
Note: use the async keyword in front of your function name if you want to use await. For example:
async function test() {
  try {
    let db = await MongoClient.connect(config.url, config.options);
  } catch (err) {
    console.log(`Catched: ${err}`);
  }
}
For reference, this is the callback style from the question, where the error surfaces as the callback's first argument rather than in any surrounding try/catch:

MongoClient.connect(config.url, config.options, (err, db) => {
  if (err) { throw new Error('This is error'); }
});
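For comparison, a minimal sketch of the promise-chain style mentioned above, where a connection failure lands in .catch() instead of a try/catch (assuming no callback is passed, so connect returns a promise):

MongoClient.connect(config.url, config.options)
  .then((db) => {
    // use db here
  })
  .catch((err) => {
    console.log(`Catched: ${err}`);
  });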
I am writing some tests using Jasmine and Protractor. In the beforeEach I want to execute an .exe file using require('child_process'), and then in the afterEach I will restart the browser.
The problem is that the .exe file is executed only once with the first spec.
Here is the code in the beforeEach():
beforeEach((done) => {
  console.log("before each is called");
  var exec = require('child_process').execFile;
  browser.get('URL');
  console.log("fun() start");
  var child = exec('Test.exe', function (err, data) {
    if (err) {
      console.log(err);
    }
    console.log('executed');
    done();
    process.on('exit', function () {
      child.kill();
      console.log("process is killed");
    });
  });
});
Then I wrote two specs, and in the afterEach I restart the browser:
afterEach(function () {
  console.log("close the browser");
  browser.restart();
});
You should use the done and done.fail methods to exit the async beforeEach. You begin executing Test.exe and immediately call done; this could have undesired results, since the process could still be executing. I do not believe process.on('exit', ...) ever gets called. The snippet below might get you started on the right track, using the event emitters of the child process.
beforeEach((done) => {
  const execFile = require('child_process').execFile;
  browser.get('URL');
  // child is of type ChildProcess
  const child = execFile('Test.exe', (error, stdout, stderr) => {
    if (error) {
      done.fail(stderr);
      return;
    }
    console.log(stdout);
  });
  // ChildProcess has event emitters and should be used to check if Test.exe
  // is done, has an error, etc.
  // See: https://nodejs.org/api/child_process.html#child_process_class_childprocess
  child.on('exit', () => {
    done();
  });
  child.on('error', (err) => {
    done.fail(err.message); // stderr is not in scope here, so report the error itself
  });
});