Debugging gf3/sandbox module - javascript

I'm taking my first steps in Node.js, and I'm trying to understand the sandbox mechanism.
Currently I'm using node v4.0.0 and node-inspector v0.12.3.
I've installed the gf3/sandbox module and run it with this simple code:
var Sandbox = require('sandbox');

var s = new Sandbox();
s.run('1 + 1 + " apples"', function(output) {
  console.log(output.result);
});
In order to debug more easily, I've also commented out the timeout function in the sandbox.js file:
// timer = setTimeout(function() {
//   self.child.stdout.removeListener('output', output);
//   stdout = JSON.stringify({ result: 'TimeoutError', console: [] });
//   self.child.kill('SIGKILL');
// }, self.options.timeout);
The issue is that the debugger doesn't break on any line of shovel.js, and I'm 100% sure the module is using that code.
Why is that, and what can I do in order to debug shovel.js?

sandbox.js spawns shovel.js as a child process without debugging enabled (e.g. no --debug option), so the child process executes normally and your breakpoints are simply ignored. You need to start the child process in debug mode too.
If you want to debug both sandbox.js and shovel.js at the same time, use different debug ports. I'm not sure about node-inspector, but here is an example of how you can do it with the built-in debugger; I'm sure you can tweak it a bit to make it work with node-inspector.
Comment out the timeout code, as you already did.
Pass the debug option when spawning the child process in sandbox.js. Note the port is 5859:
self.child = spawn(this.options.node, ['--debug-brk=5859', this.options.shovel], { stdio: ['pipe', 'pipe', 'pipe', 'ipc'] });
Start example.js in debug mode. By default, the debugger listens on port 5858:
node --debug-brk example.js
Now debug sandbox.js by connecting to port 5858:
node debug localhost:5858
Once the child process starts, you can fire up a separate terminal and start debugging shovel.js on port 5859:
node debug localhost:5859
For node-inspector, I think you need to use the node-debug command instead of this.options.node for the child process. There are also options to set the debug port explicitly.
Based on the above, these could be the steps for node-inspector (note: I haven't tested them):
Same as above
Open the sandbox.js file and change this line as follows to pass the debug option while spawning the child process. Note the port is 5859:
self.child = spawn('node-debug', ['--debug-port=5859', this.options.shovel], { stdio: ['pipe', 'pipe', 'pipe', 'ipc'] });
Start example.js in debug mode. By default, it starts on port 5858:
node-debug example.js
Now head to the browser to debug parent process:
http://127.0.0.1:8080/?ws=127.0.0.1:8080&port=5858
Once the child process starts, open up another browser window to debug shovel.js:
http://127.0.0.1:8080/?ws=127.0.0.1:8080&port=5859


ShellJs execute CLI command

I'm using CodeceptJS with ShellJS.
In one of my tests I'm invoking a Go application like this:
const shell = require('shelljs')
shell.exec('./myGoApplication')
When the application is started and working correctly, it presents a CLI that listens for keyboard input from the console, so I can type text and my application receives it.
But when I execute this command, it doesn't seem to be transferred to the input, and the application doesn't invoke any commands:
shell.exec('my text')
Does someone know how to make ShellJS send a command to my CLI, which is waiting for console input?
The Go CLI:
func StartCLI(relay RelayConn) {
	go func() {
		fmt.Println("[?] To see available commands write /?")
		reader := bufio.NewReader(os.Stdin)
		for {
			text, _ := reader.ReadString('\n')
			text = strings.Trim(text, "\n")
			switch true {
			case strings.HasPrefix(text, "/?"):
				fmt.Println(helpText)
			default:
				relay.Chat("Default Member Simulation chat message")
			}
		}
	}()
}
https://github.com/shelljs/shelljs
As of writing this answer, it is not possible for processes launched via exec to accept input. See issue #424 for details.
From the ShellJS FAQ:
We don't currently support running commands in exec which require interactive input. The correct workaround is to use the native child_process module:
child_process.execFileSync(commandName, [arg1, arg2, ...], {stdio: 'inherit'});
Using npm init as an example:
child_process.execFileSync('npm', ['init'], {stdio: 'inherit'});
Your code should therefore be something like this:
const child_process = require('child_process')
child_process.execFileSync('./myGoApplication', [], {stdio: 'inherit'})
Setting the stdio option to 'inherit' means that the stdin, stdout, and stderr of the child process are attached to the parent process's (process.stdin, process.stdout, and process.stderr), so you should be able to type input into your Node.js program and have it sent to your Go program.
See the documentation on child_process.execFileSync for more details.

Manually trigger Karma to rerun tests

I know that Karma has a built-in autoWatch option that will cause my tests to be rerun when a test file changes:
var server = new karmaServer({
  autoWatch: true,
  autoWatchBatchDelay: 250,
});
server.start();
Is there a way to trigger this rerun manually? I would like to have more control over when my tests are rerun.
If you need to run it manually with gulp, just make a task for it (I assume that you want to re-run server.start):
var server = new karmaServer({
  autoWatch: true,
  autoWatchBatchDelay: 250,
});

gulp.task('runTests', function() {
  server.start();
});
And then whenever you need to run the tests, run this on the command line:
gulp runTests
I learned a bit more about Karma and discovered karma.runner.run(), which triggers an already-running server (for example, a Karma server you started in a different command window) to rerun its tests. In my gulp task I now do something like this:
var karma = require('karma');

gulp.task('run-tests', function() {
  gulp.watch('/glob/to/my/files').on('change', function() {
    karma.runner.run({ configFile: 'karma.conf.js' });
  });
});
Note that if you run this task from the same process that spawned your Karma server, you will see duplicate test results since both the server and the runner report their results to the command line. To only show one set of test results, you can start your Karma server in a background process using something like this.

How to run node js application programmatically using forever

I need to run a set of Node.js files that contain configuration information, typically a port number and IP address, and then use forever to run the script from the terminal with that configuration, without any manual input.
For a programmatic approach, you can use forever-monitor:
var forever = require('forever-monitor');

var child = new (forever.Monitor)('your-filename.js', {
  max: 3,
  silent: true,
  options: []
});

child.on('exit', function () {
  console.log('your-filename.js has exited after 3 restarts');
});

child.start();
You could make use of the child_process module. Check the docs; there is some useful information there: http://nodejs.org/api/child_process.html
To give a brief example:
var exec = require('child_process').exec;
exec('forever', function callback(error, stdout, stderr) {
  // cb
});
If you don't need a callback / don't want to wait for the execution:
require('child_process').exec('forever').unref();
Edit: OK, I'm not sure I really got the point of your question, but my answer combined with https://stackoverflow.com/a/23871739/823851 might offer a good solution.
Usage:
forever start hello.js to start a process.
forever list to see list of all processes started by forever
forever stop hello.js to stop the process, or forever stop 0 to stop the process with index 0 (as shown by forever list).
node-config is a good module for managing different configurations of a Node.js app.
For example, to make a "production" config, /config/production.json would contain:
{
  "port" : "3000",
  "ip" : "192.168.1.1"
}
In one of your node application JS files:
var config = require('config');
// ...
var port = config.port;
var ip = config.ip;
And to launch the app with this configuration, just set NODE_ENV to production in your shell before running the app:
export NODE_ENV=production
forever start app.js
Make additional config JSON files as needed for each of your environments. default.json is used when no environment is specified.
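The selection node-config performs can be sketched without the package; the inline objects below stand in for the JSON files under /config (an assumption so the snippet is self-contained):

```javascript
// Pick the config matching NODE_ENV, falling back to the default -
// the same selection node-config performs with the files under /config.
const env = process.env.NODE_ENV || 'default';
const files = {
  default: { port: '3000', ip: '127.0.0.1' },
  production: { port: '3000', ip: '192.168.1.1' },
};
const config = files[env] || files.default;

console.log(config.port); // prints "3000"
```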

Hide 'Running X task' in grunt

I have been working on a project setup and deploy Gruntfile, but I would like to hide some of the command-line output so that the following:
Running "init" task
Running "prompt:init" (prompt) task
[?] If you continue your project information will be overwritten.
Continue? (Y/n)
becomes
[?] If you continue your project information will be overwritten.
Continue? (Y/n)
when running grunt. I know it's only cosmetic, but it's something I would like to do, and I cannot seem to find anything in Grunt's API documentation to indicate that it can be done.
This is currently not supported, but possible thanks to the following workaround (from shama on GitHub):
grunt.log.header = function () {};
Basically, this overrides the log header function (which is responsible for the "Running x task" message) with an empty function that does nothing and, more importantly, outputs nothing.
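The effect can be sketched with a stub grunt object (a stand-in so the snippet is self-contained; in a real Gruntfile, the single override line near the top is all you need):

```javascript
// Stub of grunt.log that records headers instead of printing them.
const logged = [];
const grunt = {
  log: { header: function (msg) { logged.push('Running "' + msg + '" task'); } },
};

grunt.log.header('init'); // normal behavior: the header is emitted

// The workaround: replace the header function with a no-op.
grunt.log.header = function () {};
grunt.log.header('prompt:init'); // now nothing is emitted

console.log(logged.length); // prints 1
```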
There's another way to do it:
First, run npm install grunt-log-headers to install grunt-log-headers.
Then add require('grunt-log-headers')(grunt); to your Gruntfile.js to enable it.
Finally, add this to any task for which you want to hide the log header:
options: {
gruntLogHeader: false
}
Example:
grunt.initConfig({
sometask: {
options: {
gruntLogHeader: false,
}
}
});
In fact, an issue has already been created for this. It's currently being worked on and should be available in version 0.5.0.

Rsync fails with exit code 23 only when run from Node

I am using the rsync module with Node v0.10.22. Rsync exits with code 23 (Partial transfer due to error) when run from Node, but succeeds when run from the same exact shell as the crashed Node process.
Here is the code I am using:
var r = new Rsync()
  .flags('a')
  .include('*.png')
  .exclude('*')
  .source(path.join('source/*'))
  .destination('target')
  .execute(function (err, code, cmd) {
    if (err && code === 23) {
      console.log('Exited with 23');
      console.log(cmd);
    }
  });
The cmd that is logged is as follows:
rsync -a --include="*.png" --exclude="*" source/* target
When I execute that exact command after Node crashes in the same shell the command returns 0 (no errors, it worked).
I have looked at rsync.js where it spawns the command:
// Execute the command as a child process
var cmdProc = spawn(this.executable(), this.args(), { stdio: 'pipe' });
this.executable() returns 'rsync'. this.args() returns ['-a', '--include="*.png"', '--exclude="*"', 'source/*', 'target/'].
What is going on here? Why do I get a different exit code when running from Node as opposed to running in the same shell as where I run Node?
Edit: I set the permissions on each directory to be 777 and I am still getting the same error.
The glob in the source isn't expanded, so rsync tries to find a file literally named * in the source directory. When you run the command yourself, the shell expands source/* before rsync ever sees it, but child_process.spawn passes the arguments verbatim with no shell involved. Change the source to source/.
