NodeJS - Looping through Array Sequentially with Timeout between each Element in Array - javascript

I have a list of commands in an array that I need to run in order:
const commands = [
  `git clone https://github.com/EliLillyCo/${repo}.git`,
  `cd ${repo}`,
  `git checkout -b ${branch}`,
  'cp ../codeql-analysis.yml .github/workflows/',
  'git add .github/workflows/codeql-analysis.yml',
  `git push --set-upstream origin ${branch}`,
  'cd ../',
  `rm -r ${repo}`,
];
They need to be run in order, as each command relies on the previous one having been run.
Also, each command needs a 3-second wait before the next command runs, because some commands take time, especially command 1 and command 5.
I am using a standard for loop, which then uses setTimeout() to call a function that runs each command, as such:
const a = require('debug')('worker:success');
const b = require('debug')('worker:error');
const { exec } = require('child_process');

function execCommand(command) {
  exec(command, (error, stdout, stderr) => {
    if (error) {
      b(`exec error: ${error}`);
      return;
    }
    a(`stdout: ${stdout}`);
    b(`stderr: ${stderr}`);
  });
}

const commands = [
  `git clone https://github.com/EliLillyCo/${repo}.git`,
  `cd ${repo}`,
  `git checkout -b ${branch}`,
  'cp ../codeql-analysis.yml .github/workflows/',
  'git add .github/workflows/codeql-analysis.yml',
  `git push --set-upstream origin ${branch}`,
  'cd ../',
  `rm -r ${repo}`,
];

for (let i = 0; i < commands.length; i++) {
  setTimeout(execCommand(commands[i]), 3000);
}
But there is something wrong with the setTimeout() call, as it throws this:
worker:error TypeError [ERR_INVALID_CALLBACK]: Callback must be a function. Received undefined
What is the best way to approach the problem of looping through an array sequentially, whilst using a timeout?

I'd make execCommand return a promise so you know when it's done; you can't rely on timeouts (what if the task takes more than three seconds?) and since most of those commands will complete much faster than that, the timeouts hold things up unnecessarily.
Here's execCommand returning a promise:
function execCommand(command) {
  return new Promise((resolve, reject) => {
    exec(command, (error, stdout, stderr) => {
      if (error) {
        b(`exec error: ${error}`);
        reject(error);
        return;
      }
      a(`stdout: ${stdout}`);
      b(`stderr: ${stderr}`);
      resolve();
    });
  });
}
Then if you have top-level await available (modern Node.js and ESM modules):
// If you have top-level `await` available
try {
  for (const command of commands) {
    await execCommand(command);
  }
} catch (error) {
  // ...report/handle error...
}
If you don't, wrap it in an async IIFE:
(async () => {
  for (const command of commands) {
    await execCommand(command);
  }
})().catch(error => {
  // ...report/handle error...
});
Alternatively, you could use util.promisify on exec directly if you wanted to separate the execution from the handling of stdout/stderr, but handling them together was the minimal change to what you had, so that's what I stuck with.
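For reference, here's a minimal sketch of that promisify-based variant, reusing the a/b debug loggers from the question (the runAll wrapper name is hypothetical, not part of the answer):

const { promisify } = require('util');
const { exec } = require('child_process');
const execPromise = promisify(exec);

async function runAll(commands) {
  for (const command of commands) {
    // the promise resolves to an object with stdout and stderr
    const { stdout, stderr } = await execPromise(command);
    if (stdout) a(`stdout: ${stdout}`);
    if (stderr) b(`stderr: ${stderr}`);
  }
}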

Currently you can't guarantee that the previous command will have completed when you call the next one. You call the next one automatically after 3000 ms, but the previous one can take longer than expected and not be finished yet.
You should add a mechanism to await each command, then launch the next one. Here's how, using async/await:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

const commands = [ ... ];

const execCommand = async (command) => {
  try {
    const { stdout, stderr } = await exec(command);
    a(`stdout: ${stdout}`);
    b(`stderr: ${stderr}`);
  } catch (error) {
    b(`exec error: ${error}`);
  }
};

(async () => {
  for (const command of commands) {
    await execCommand(command);
  }
})();
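If a fixed pause between commands is still wanted, as the question originally asked, you can await a small delay helper between iterations. A minimal sketch; the sleep helper is an addition, not part of either answer:

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

(async () => {
  for (const command of commands) {
    await execCommand(command); // wait for the command itself to finish
    await sleep(3000);          // then the fixed 3-second pause
  }
})();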

Related

Running go script in a Node application [duplicate]

In Node.js, I'd like to find a way to obtain the output of a Unix terminal command. Is there any way to do this?
function getCommandOutput(commandString){
  // now how can I implement this function?
  // getCommandOutput("ls") should print the terminal output of the shell command "ls"
}
This is the method I'm using in a project I am currently working on.
var exec = require('child_process').exec;

function execute(command, callback){
  exec(command, function(error, stdout, stderr){ callback(stdout); });
}
Example of retrieving a git user:
module.exports.getGitUser = function(callback){
  execute("git config --global user.name", function(name){
    execute("git config --global user.email", function(email){
      callback({ name: name.replace("\n", ""), email: email.replace("\n", "") });
    });
  });
};
If you're using Node 7.6 or later and you don't like the callback style, you can also use node-util's promisify function with async/await to get shell commands that read cleanly. Here's an example of the accepted answer, using this technique:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);

module.exports.getGitUser = async function getGitUser () {
  // Exec output contains both stderr and stdout outputs
  const nameOutput = await exec('git config --global user.name');
  const emailOutput = await exec('git config --global user.email');
  return {
    name: nameOutput.stdout.trim(),
    email: emailOutput.stdout.trim()
  };
};
This also has the added benefit of returning a rejected promise on failed commands, which can be handled with try / catch inside the async code.
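For example, a failed command can be caught like any other rejected promise (safeGitUser is a hypothetical name, not from the answer above; `exec` is the promisified version):

async function safeGitUser () {
  try {
    const { stdout } = await exec('git config --global user.name');
    return stdout.trim();
  } catch (error) {
    // for exec, error.message includes the failed command and its stderr
    console.error(error.message);
    return null;
  }
}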
You're looking for child_process
var exec = require('child_process').exec;
var child;

child = exec(command,
  function (error, stdout, stderr) {
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
    if (error !== null) {
      console.log('exec error: ' + error);
    }
  });
As pointed out by Renato, there are some synchronous exec packages out there now too; see sync-exec, which might be more what you're looking for. Keep in mind though, Node.js is designed to be a single-threaded, high-performance network server, so if that's what you're looking to use it for, stay away from sync-exec kind of stuff unless you're only using it during startup or something.
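Note that current Node.js versions also ship a built-in synchronous variant, child_process.execSync, so a third-party package isn't strictly needed anymore; a minimal sketch:

const { execSync } = require('child_process');

// execSync blocks the event loop until the command finishes, so the same
// caveat applies: keep it to startup scripts and CLI tooling.
const name = execSync('git config --global user.name').toString().trim();
console.log(name);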
Requirements
This will require Node.js 7 or later with support for Promises and async/await.
Solution
Create a wrapper function that leverages promises to control the behavior of the child_process.exec command.
Explanation
Using promises and an asynchronous function, you can mimic the behavior of a shell returning the output, without falling into callback hell and with a pretty neat API. Using the await keyword, you can create a script that reads easily, while still being able to get the work of child_process.exec done.
Code sample
const childProcess = require("child_process");

/**
 * @param {string} command A shell command to execute
 * @return {Promise<string>} A promise that resolves to the output of the shell command, or an error
 * @example const output = await execute("ls -alh");
 */
function execute(command) {
  /**
   * @param {Function} resolve A function that resolves the promise
   * @param {Function} reject A function that fails the promise
   * @see https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise
   */
  return new Promise(function(resolve, reject) {
    /**
     * @param {Error} error An error triggered during the execution of the childProcess.exec command
     * @param {string|Buffer} standardOutput The result of the shell command execution
     * @param {string|Buffer} standardError The error resulting from the shell command execution
     * @see https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
     */
    childProcess.exec(command, function(error, standardOutput, standardError) {
      if (error) {
        reject(error);
        return;
      }
      if (standardError) {
        reject(standardError);
        return;
      }
      resolve(standardOutput);
    });
  });
}
Usage
async function main() {
  try {
    const passwdContent = await execute("cat /etc/passwd");
    console.log(passwdContent);
  } catch (error) {
    console.error(error.toString());
  }

  try {
    const shadowContent = await execute("cat /etc/shadow");
    console.log(shadowContent);
  } catch (error) {
    console.error(error.toString());
  }
}

main();
Sample Output
root:x:0:0::/root:/bin/bash
[output trimmed, bottom line it succeeded]
Error: Command failed: cat /etc/shadow
cat: /etc/shadow: Permission denied
Thanks to Renato's answer, I have created a really basic example:
const exec = require('child_process').exec
exec('git config --global user.name', (err, stdout, stderr) => console.log(stdout))
It will just print your global git username :)
You can use the util library that comes with Node.js to get a promise from the exec command, and you can use that output as you need. Use destructuring to store stdout and stderr in variables.
const util = require('util');
const exec = util.promisify(require('child_process').exec);

async function lsExample() {
  const { stdout, stderr } = await exec('ls');
  console.log('stdout:', stdout);
  console.error('stderr:', stderr);
}

lsExample();
You can use the ShellJS package.
ShellJS is a portable (Windows/Linux/OS X) implementation of Unix shell commands on top of the Node.js API.
see: https://www.npmjs.com/package/shelljs#execcommand--options--callback
import * as shell from "shelljs";

// usage:
// exec(command [, options] [, callback])
// example:
const version = shell.exec("node --version", { async: false }).stdout;
console.log("nodejs version", version);
Here's an async await TypeScript implementation of the accepted answer:
const execute = async (command: string): Promise<any> => {
  return new Promise((resolve, reject) => {
    const exec = require("child_process").exec;
    exec(
      command,
      function (
        error: Error,
        stdout: string | Buffer,
        stderr: string | Buffer
      ) {
        if (error) {
          reject(error);
          return;
        }
        if (stderr) {
          reject(stderr);
          return;
        } else {
          resolve(stdout);
        }
      }
    );
  });
};
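Usage would then look like this (a hypothetical call site, mirroring the earlier examples):

(async () => {
  try {
    // execute is the wrapper defined above
    const output = await execute("node --version");
    console.log(output);
  } catch (error) {
    console.error(error);
  }
})();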

Node async operations on child_process are not ending

I have a little script that executes a child process using execFile. This child process is a Node script too that runs async operations, but it seems the async operations never finish, so the terminal and all the processes are left hanging.
This is the main script that runs the execFile for the child process:
fs.readdir(directoryPath, function(err, files) {
  if (err) console.log(`Error: ${err}`);
  files.map((file) => {
    execFile(`node`, ["updater.js", "BMW", file], (error, stdout, stderr) => {
      if (error) {
        red(`error: ${error.message}`);
        return;
      }
      if (stderr) {
        red(`stderr: ${stderr}`);
        return;
      }
      console.log(stdout);
    });
  });
});
And this is the node script executed as child process:
const args = process.argv.slice(2);
const brand = args[0];
const model = args[1];

const data = readJSON(`./json-export/${brand}/${model}`);
const generations = data.generations;

const generationsDB = await getGenerationsByModelAndBrand(brand, model);
console.log(generationsDB);

generations.map((generation) => {
  const lastModification =
    generation.modifications.modification[
      generation.modifications.modification.length - 1
    ];
  console.log(lastModification);
});
All the code works if I comment out the const generationsDB line and the next console.log. Otherwise, when execution hits the async request, it gets stuck there.
I tested getGenerationsByModelAndBrand in the main script and it works with no issue.
getGenerationsByModelAndBrand runs a query on the database and returns a Promise.
This is the getGenerationsByModelAndBrand method code:
export const getGenerationsByModelAndBrand = (brand, model) => {
  return new Promise((resolve, reject) => {
    const sql = `DATABASE SELECT`;
    connection.query(sql, function(error, result) {
      if (error) return reject(error);
      return resolve(result);
    });
  });
};
connection comes from the mysql.createConnection method of the mysql package.
I believe the issue comes from the promise handling in the child process; it's like I'm missing something, but I couldn't find what it is.
Edit:
After researching, I didn't find a solution or explanation for this issue, so in the meantime I moved the getGenerationsByModelAndBrand call to the parent script and passed the result as a parameter, as sketched below.
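For reference, a minimal sketch of that workaround, assuming the parent can import getGenerationsByModelAndBrand (the runUpdater wrapper and the JSON argument position are illustrative, not from the question):

const { execFile } = require('child_process');

// Parent: resolve the async query here, then hand the result to the child.
async function runUpdater(file) {
  const generationsDB = await getGenerationsByModelAndBrand('BMW', file);
  execFile('node', ['updater.js', 'BMW', file, JSON.stringify(generationsDB)],
    (error, stdout, stderr) => {
      if (error) return console.error(`error: ${error.message}`);
      if (stderr) return console.error(`stderr: ${stderr}`);
      console.log(stdout);
    });
}

// Child (updater.js): read the result instead of querying the database.
// const generationsDB = JSON.parse(process.argv[4]);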

Run a few exec() commands one-by-one

I need to run two shell commands, one-by-one. These commands are wrapped in functions:
function myFucn1() {
  exec('some command',
    (error, stdout, stderr) => {
      if (error) {
        console.error(`exec error: ${error}`);
        throw error;
      }
      console.log(`stdout: ${stdout}`);
      console.error(`stderr: ${stderr}`);
    });
}
and
function myFucn2() {
  exec('some command 2',
    (error, stdout, stderr) => {
      if (error) {
        console.error(`exec error: ${error}`);
        throw error;
      }
      console.log(`stdout: ${stdout}`);
      console.error(`stderr: ${stderr}`);
    });
}
When I call them in my trigger function:
app.get('/my_end_point', (req, res) => {
  try {
    myFucn1();
    myFucn2();
    res.send('Hello World, from express');
  } catch (err) {
    res.send(err);
  }
});
it runs both commands in random order, and stdout/stderr output displays only from the second function.
The reason the commands don't execute in the same order every time is that they get launched one after the other, but from then on JS doesn't control how long each one takes to execute. So, for a program like yours that is basically this:
launch cmd1, then do callback1
launch cmd2, then do callback2
respond to the client
you don't have any control over when callback1 and callback2 will get executed. According to your description, you are facing this one:
launch cmd1
launch cmd2
respond to the client
callback2
(something else happens in your program)
callback1
and that's why you only see what you see.
So, let's try to force their order of execution! You could use child_process's execSync, but I wouldn't recommend it for production, because it makes your server program stay idle the whole time your child processes are executing.
However, you can get a very similar syntax by using async/await and turning exec into an async function:
const { exec: execWithCallback } = require('child_process');
const { promisify } = require('util');
const exec = promisify(execWithCallback);

async function myFunc1() {
  try {
    const { stdout, stderr } = await exec('command 1');
  } catch (error) {
    console.error(`exec error: ${error}`);
    throw error;
  }
}

// same for myFunc2
// same for myFunc2
and for your server:
app.get('/my_end_point', async (req, res) => {
  try {
    await myFunc1();
    await myFunc2();
    res.send('Hello World, from express');
  } catch (error) {
    res.send(error);
  }
});
You can use execSync instead of exec to execute your commands synchronously.
const { execSync } = require("child_process");

function myFucn1() {
  return execSync("echo hello").toString();
}

function myFucn2() {
  return execSync("echo world").toString();
}

myFucn1();
myFucn2();
It's due to the nature of JavaScript callback functions. The exec function is called, and the function in { } is invoked when the result is available (so, probably when the command finishes). The outer function exits immediately, and the second function executes even before your command is finished.
One possible (though not pretty) solution is to put the call to myFucn2() in the callback of myFucn1() (e.g. after console.error).
A correct solution would be to use a separate thread (see 'worker threads') to track the execution of myFucn1() and, when it finishes, execute the second one.

Async redis & promise

I have a scenario where I have to run a loop, call the async redis set function on each iteration, and then close the connection to redis.
Sequence...
Run the foreach loop.
For each element, use the redis set command.
When the loop is completed, close the redis connection.
Currently the redis connection is getting closed before all the set operations inside the loop have completed.
Additional detail...
I'm using the node.js redis client.
I know why this is happening, but I'm not sure how to resolve this situation.
I'm a newbie in NodeJS.
You don't need to hack around the structure - the node.js client supports techniques to do this out of the box. Check out Multi.exec:
Commands are queued up inside the Multi object until Multi.exec() is called.
// iterate and construct array of set commands
let commands = items.map(i => ['set', i.key, i.value]);

client
  .multi(commands)
  .exec((err, replies) => {
    // disconnect here
  });
If you don't actually need the transactions, you can batch all your commands at once via client.batch. You then of course can organize your connect and disconnect strategy around this pattern accordingly.
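A sketch of that batch variant, assuming the classic node_redis callback API (the quit call stands in for your disconnect logic):

// same shape as multi, but without transaction semantics
const commands = items.map(i => ['set', i.key, i.value]);

client.batch(commands).exec((err, replies) => {
  if (err) return console.error(err);
  console.log(replies); // one 'OK' per set
  client.quit();        // disconnect once every set has completed
});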
You can use asyncLoop for this type of problem; it helped me a lot in similar circumstances:
var asyncLoop = require('node-async-loop');

asyncLoop(arr, function (item, next) {
  // foreach loop on the array "arr";
  // each element is "item";
  // call next() when the iteration is done:
  // execute the redis command on "item" and then call
  next();
}, function (err) {
  if (err == null) {
    // do something on loop complete, like closing the connection etc.
  }
});
This is the basic way in which you can do what you want. You can install node-async-loop like this:
npm install --save node-async-loop
First you create an array of promises and then use Promise.all, e.g.:
function connectPromise (params) {
  return new Promise((resolve, reject) => {
    // if connectToRedisAndSetFunc is a promise:
    // connectToRedisAndSetFunc(params).then(data => {
    //   resolve(data)
    // }).catch(err => {
    //   reject(err)
    // })
    connectToRedisAndSetFunc(params, function(err, data) {
      if (err) {
        reject(err)
      } else {
        resolve(data)
      }
    })
  })
}

// create the promise array
let connectsPromises = paramsArray.map(p => connectPromise(p))

Promise.all(connectsPromises).then(data => {
  // close redis here
}).catch(err => {
  console.log(err)
})

Executing an .exe file using Node runs only once in Protractor

I am writing some tests using Jasmine and Protractor. In beforeEach I want to execute an .exe file using require('child_process'), and then in afterEach I will restart the browser.
The problem is that the .exe file is executed only once with the first spec.
Here is the code in the beforeEach():
beforeEach((done) => {
  console.log("before each is called");
  var exec = require('child_process').execFile;
  browser.get('URL');
  console.log("fun() start");
  var child = exec('Test.exe', function(err, data) {
    if (err) {
      console.log(err);
    }
    console.log('executed');
    done();
    process.on('exit', function() {
      child.kill();
      console.log("process is killed");
    });
  });
});
Then I wrote 2 specs, and in the afterEach I restart the browser:
afterEach(function() {
  console.log("close the browser");
  browser.restart();
});
You should use the done and done.fail methods to exit the async beforeEach. You begin to execute Test.exe and immediately call done. This could have undesired results, since the process could still be executing. I do not believe process.on('exit') ever gets called. Below might get you started on the right track, using event emitters from the child process.
beforeEach((done) => {
  const execFile = require('child_process').execFile;
  browser.get('URL');

  // child is of type ChildProcess
  const child = execFile('Test.exe', (error, stdout, stderr) => {
    if (error) {
      done.fail(stderr);
    }
    console.log(stdout);
  });

  // ChildProcess has event emitters and should be used to check if Test.exe
  // is done, has an error, etc.
  // See: https://nodejs.org/api/child_process.html#child_process_class_childprocess
  child.on('exit', () => {
    done();
  });
  child.on('error', (err) => {
    done.fail(err.message);
  });
});
