I have set up a Node.js server on a machine where I would like to execute scripts. The command is built on the client side and sent to the Node.js server in a variable. This command contains a number of spaces, along with the script and the arguments needed for it to execute.
An example command to be executed on the machine: <script.sh> -input <inputfile> -output <outputDir> -helper_file <file>
var child = exec('$command')
This command is passed to the server in a variable. When I try to run it with exec, it stumbles on the spaces and the arguments. How do I get around this issue?
Thanks
I'm writing a test suite that requires a proxy to be booted up, and then a curl POST request needs to be made to the proxy once it is live.
This is simple to do manually in two different terminal tabs, e.g.:
In one terminal window: sh ./proxy/bin/yourProxy
In another terminal window: curl -X POST http://localhost:8080/proxy
This works, but I want to have this automated.
The issue I'm running into is that when I run the first shell command, a shell opens, but a second shell never opens, and I need two different shells, one for each command.
I've tried the concurrently npm module and using sleep to make the commands synchronous, with no luck.
I'm exploring using node. I've tried node's spawn, exec, and execSync. Here's an example using spawn:
const { spawn } = require("child_process");

const childOne = spawn("/bin/sh", ["-c", "curl http://www.google.com"]);
const childTwo = spawn("/bin/sh", ["-c", "curl http://www.yahoo.com"]);

childOne.stdout.on("data", (data) => {
  console.log(`childOne stdout:\n${data}`);
});

childTwo.stdout.on("data", (data) => {
  console.log(`childTwo stdout:\n${data}`);
});
This prints the expected result for childOne, but childTwo only shows "redirect".
Terminal output:
childTwo stdout:
redirect
childOne stdout:
expected output from curling google.com
How do you set up Node to open n shells and execute n commands synchronously?
First, you generally don't want to run other processes, make network requests, etc. synchronously, since doing so can block your main process and leave your program stuck.
Second, there is nothing wrong with the functions you've found: spawn() and exec() run processes asynchronously, and execSync() does the same thing synchronously. The reason curl http://www.yahoo.com prints redirect is that the response IS a redirect. Try running it in your own shell, and you'll see exactly the same output:
$ curl http://www.yahoo.com
redirect
I have a file (Python or a jar) which generates a token for the command-line arguments passed to it. I have tried using PHP's exec() function, but my 000webhost server doesn't allow the exec call. Python file to execute:
python3 generateToken.py --key=rUlaMASgt1Byi4Kp3sKYDeQzo --appID=ApplicationID --userName=user1 --expiresInSecs=10000
Java file to execute:
java -jar generateToken.jar --key=rUlaMASgt1Byi4Kp3sKYDeQzo --appID=ApplicationID --userName=user1 --expiresInSecs=10000
Is there any other way to do this in PHP or JavaScript?
When I'm updating (and testing) a specific page in my application, I need to stop the server, start it again with node server.js, switch to the browser window, hit F5, and switch back to the terminal to see the output. This takes a lot of time... :)
Is there a way I can start Node with something like this:
node server.js -url /my_page
so I can directly see the output as if someone hit the page from their browser?
I found this question, but that approach lives in the server code, so it needs updating each time too. It is a solution, but I wonder if there is a faster way to do this.
Thanks,
Edit:
With curl, I'm getting this error:
$ node server.js & curl 'http://localhost:5000/my_page'
[2] 12824
curl: (7) Failed to connect to localhost port 5000: Connection refused
[1] Terminated node server.js
$ Listening on port 5000
Notice the last line: is curl executing before the server has started?
node server.js & curl 'http://localhost/my_page'
This starts the server in the background, then immediately uses curl to make an HTTP request.
The server will still be running after this. You can bring it to the foreground (e.g. in order to kill it) with fg. Alternatively just run the two commands (without &) in separate terminals (or screens).
If the server doesn't start fast enough, just add a delay:
node server.js & sleep 1 ; curl 'http://localhost/my_page'
I'm working on a Slack bot and I've encountered a curious problem. I have a module which scrapes a web page using PhantomJS (via SpookyJS, with CasperJS on top of it). I wrote this module and tested it by running it from the command line manually. It works well. But then I added the slackbots npm module, which abstracts the Slack Real Time API, and created a module with a bot class. This bot module requires my scraping module (PhantomJS) and calls its function when the message event fires:
var getAnswer = require('./getAnswer');

myBot.prototype._onMessage = function (message) {
  if (this._isChatMessage(message) &&
      this._isChannelConversation(message) &&
      this._isMentioningMe(message)) {
    this._reply(message);
  }
};
this._reply basically just calls getAnswer(originalMessage.text) and then self.postMessageToChannel(channel.name, reply, {as_user: true});
getAnswer returns a promise, but it never gets fulfilled. I made CasperJS verbose and saw that nothing happens after
[info] [phantom] Starting...
Everything just hangs...
I have no idea how to fix this. I guess it's because the slackbots module establishes a websocket connection when I call Bot.prototype.run. Any suggestions?
As I said I use Spooky to spawn child CasperJS process. I went to Spooky documentation page and read this:
Specifically, each Spooky instance spawns a child Casper process that
runs a bootstrap script. The bootstrap script sets up a JSON-RPC
server that listens for commands from the parent Spooky instance over
a transport (either HTTP or stdio). The script also sets up a JSON-RPC
client that sends events to the parent Spooky instance via stdout.
I used http as the transport and it didn't work, so I changed it to stdio and that helped. I'd appreciate it if someone could explain why.
I have Apache and Node.js running on Ubuntu.
Is there a way to programmatically check whether the Apache service is running from Node? If it's not running, allow Node.js to start/restart the service. Could the same code be used to run other processes like MySQL?
I'd also like a way to make Node execute "ps -u user" and capture the output in a JS string or object.
I've had a look at Node.js child processes, but I can't figure it out.
Thanks in advance
You can use the exec method of the child_process module to execute a command from your Node app and have its output passed to a callback function.
var exec = require("child_process").exec;

exec("ps -u user", function (error, stdout, stderr) {
  var myResult = stdout;
  // do something with myResult
});
You could also use shelljs (https://www.npmjs.com/package/shelljs), which provides cross-platform wrappers for many common shell commands. ps is not one of them, so use its exec, much as with child_process:
require('shelljs/global');
exec('ps -aux | grep apache')
With exec you could start/restart the service too; however, you would need to run the Node process as root, which is not the best option. Alternatively, create a user group specifically for Apache.
To run 'ps -u user', do the same. The output will give you all the information you need about the processes:
exec('ps -u user');
Then continue with the output as you need.
With spawn you can run shell scripts. So if you had a shell script written to check whether a process is running, you could use spawn to run it, something like:
var child = spawn('sh', [__dirname+'/some-status-shell-script.sh'], {env: process.env});
Likewise, you can use spawn to access your MySQL database in a similar way.