How can I run two files in JavaScript with Node?

I am new to JavaScript and Node.js and am having problems testing some code I wrote recently. I am trying to test code written in a file called "compareCrowe.js" with "testCrowe.js" using Node.js.
Here are the contents of testCrowe.js:
var compareCrowe = required['./compareCrowe'];
console.log('begin test');
var connection = {Type:1, Label:"label", linkTo:null};
var table1 = {name:"table1", body:"description1", out:[connection]};
var table2 = {name:"table2", body:"description2", out:null};
connection.linkTo = table2;
var crowe = [table1, table2];
var result = compareCrowe.compareCrowesFoot(crowe, crowe);
console.log(result.feedback);
where the function "compareCrowesFoot" is defined in compareCrowe.js. From the console on an Ubuntu virtual machine I ran:
node compareCrowe.js testCrowe.js
however, nothing was printed. There were no errors or warnings or explanation of any kind. It didn't even print the "begin test" line I placed at the top of testCrowe.js. If I run the command:
node testCrowe.js
it complains that compareCrowesFoot is undefined. How can I test the contents of compareCrowe.js?

Welcome to the party of JS.
I'm not sure where you're learning from, but a few of the resources that have helped me and many others are superherojs.com, nodeschool.io, the MDN developer docs, the Node.js API docs, and YouTube (seriously).
The basic idea of Node.js is that it operates with modules (chunks of reusable code), which is what NPM is made up of. These can then be included in other modules and used anywhere else in your application.
So for a basic example, say you had compareCrowe.js. To make it includable/reusable in another file, you could write something like:
var compareCrowesFoot = function(crowe1, crowe2) { /* compare crows' feet and return something here */ };
module.exports = { compareCrowesFoot: compareCrowesFoot };
// export an object with a property named whatever you want to access it as, and the value as your function name
// e.g. - you could export { compare: compareCrowesFoot };
Then in testCrowe.js you could require compareCrowe like this:
var compareCrowe = require("./compareCrowe");
/* your code here... */
var result = compareCrowe.compareCrowesFoot(crowe1, crowe2);
// if your exported object was { compare: compareCrowesFoot };
// this would be compareCrowe.compare(crowe1, crowe2);
And to run your tests, you could then run node testCrowe.js from the command line.

In your case it seems like you've got your syntax a little messed up. It should be more like:
var compareCrowe = require('./compareCrowe.js');
That would make any methods you've exported in compareCrowe.js, such as your compareCrowe.compareCrowesFoot function, available to testCrowe.js.
Then, in your terminal, you would run the following:
node testCrowe.js
And that should do the trick provided you don't have any further errors in your code.
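For reference, a minimal compareCrowe.js that such a require call could pick up might look like the sketch below (the body of compareCrowesFoot is a placeholder, since the question doesn't show the real implementation):

function compareCrowesFoot(crowe1, crowe2) {
    // placeholder: compare the two crow's foot diagrams and build up a feedback string
    return { feedback: 'diagrams match' };
}

module.exports = { compareCrowesFoot: compareCrowesFoot };

With an export like that in place, result.feedback in testCrowe.js would print the feedback string.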

Related

NodeJS Fork can't get childprocess to kill

I'm running against a wall here, maybe it's just a small problem where I can't see the solution due to my inexperience with NodeJS.
Right now I'm constructing a BT device which will be controlled by a master application, and I have settled on prototyping with a Raspberry Pi 3 running NodeJS and the Bleno module.
So far everything worked fine, the device gets found and I can set and get values over Bluetooth. But to separate the different "programs" which the device could execute from the Bluetooth logic (because of loops etc.) I have opted to extract these into external NodeJS files.
The idea here was to use NodeJS's fork (from the child_process module) and control the starting and stopping of those processes through the main process.
But herein my problems start. I can fork the different JavaScript files without problem and these get executed, but I can't get them to stop and I don't know how to fix it.
Here's the code (simplified):
var util = require('util');
var events = require('events');
var cp = require('child_process');
...
var ProgramTypeOne = {
    NONE: 0,
    ProgramOne: 1,
    ...
};
...
var currentProgram=null;
...
function BLEDevice() {
    events.EventEmitter.call(this);
    ...
    this.currentProgram=null;
    ...
}
util.inherits(BLELamp, events.EventEmitter);
BLELamp.prototype.setProgram = function(programType, programNumber) {
    var self = this;
    var result=0;
    if(programType=="ProgramTypeOne "){
        if(programNumber==1){
            killProgram();
            this.currentProgram=cp.fork('./programs/programOne');
            result=1;
        }else if(programNumber==2){
            ...
        }
    }
    self.emit('ready', result);
};
...
module.exports.currentProgram = currentProgram;
...
function killProgram(){
    if(this.currentProgram!=null){
        this.currentProgram.kill('SIGTERM');
    }
}
There seems to be a problem with the variable currentProgram, which never seems to get the child process from the fork call.
As I have never worked extensively with JavaScript, except some small scripts on websites, I have no idea where exactly my error lies.
I think it has something to do with the handling of class variables.
The starting point for me was the Pizza example of Bleno.
Hope the information is enough and that someone can help me out.
Thanks in advance!
Since killProgram() is a standalone function outside of the scope of BLELamp, you need to call killProgram with the correct scope by binding the BLELamp instance as this. Using apply (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function/apply) should resolve it. I would expect the following to fix it (the only line changed is the one invoking killProgram):
BLELamp.prototype.setProgram = function(programType, programNumber) {
    var self = this;
    var result=0;
    if(programType=="ProgramTypeOne "){
        if(programNumber==1){
            killProgram.apply(this);
            this.currentProgram=cp.fork('./programs/programOne');
            result=1;
        }else if(programNumber==2){
            ...
        }
    }
    self.emit('ready', result);
};
As a side note, your code is kind of confusing because you have a standalone var currentProgram then a couple prototypes with their own bound this.currentPrograms. I would change the names to prevent confusion.
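Another option, just as a sketch (this is not from the original code), is to make killProgram a prototype method so that this always refers to the BLELamp instance and no apply call is needed:

BLELamp.prototype.killProgram = function() {
    // `this` is the BLELamp instance when called as this.killProgram()
    if (this.currentProgram != null) {
        this.currentProgram.kill('SIGTERM');
    }
};

Inside setProgram you would then call this.killProgram() instead of killProgram().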

Test driven node (without testing frameworks)

I have my code:
var User = function() {
...
}
and the test code using IIFE:
(function() { // test new user
var user = new User();
if (typeof user === 'undefined') {
console.log("Error: user undefined");
}
...
}());
both in the same js file. Works great! But as the program grows, this is becoming too unwieldy for me to manage, as I have a piece of test code for every piece of business logic.
I've been taught to keep all my JS in the same (minified) file in production, but is there a best-practice way to keep my test code in a different file during development?
I was thinking I could use a shell script to append the test code to the production code when I want to run the tests, but I'd prefer a cross-platform solution.
I don't want or need a framework solution, I want to keep it light -- does node have anything built-in for this sort of thing?
Node has two constructs for this case. First:
module.exports = name_of_module;
which exports something from a module, for example a function or an object. And second:
var module_name = require('Path/to/module');
which imports it from the other file. If you want to export an IIFE, you must assign its result to a variable and export that variable with module.exports.
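As a rough sketch of that (the file names here are made up for the example), the test IIFE's result gets assigned to a variable and exported:

// user-test.js
var User = require('./user');

var testResult = (function () { // the IIFE still runs immediately
    var user = new User();
    if (typeof user === 'undefined') {
        console.log("Error: user undefined");
    }
    return 'user tests finished';
}());

module.exports = testResult; // its return value is what gets exported

Then during development you can run node user-test.js (or require('./user-test') from a dev-only entry point), and leave the test file out of your minified production bundle.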

NodeJS & Gulp Streams & Vinyl File Objects - Gulp Wrapper for NPM package producing incorrect output

Goal
I am currently trying to write a Gulp wrapper for NPM Flat that can be easily used in Gulp tasks. I feel this would be useful to the Node community and also accomplish my goal. The repository is here for everyone to view, contribute to, play with and pull request. I am attempting to make flattened (using dot notation) copies of multiple JSON files. I then want to copy them to the same folder and just modify the file extension to go from *.json to *.flat.json.
My problem
The results I am getting back in my JSON files look like vinyl-files or byte code. For example, I expect output like
"views.login.usernamepassword.login.text": "Login", but I am getting something like {"0":123,"1":13,"2":10,"3":9,"4":34,"5":100,"6":105 ...etc
My approach
I am brand new to developing Gulp tasks and node modules, so definitely keep your eyes out for fundamentally wrong things.
The repository will be the most up to date code, but I'll also try to keep the question up to date with it too.
Gulp-Task File
var gulp = require('gulp'),
    plugins = require('gulp-load-plugins')({camelize: true});
var gulpFlat = require('gulp-flat');
var gulpRename = require('gulp-rename');
var flatten = require('flat');

gulp.task('language:file:flatten', function () {
    return gulp.src(gulp.files.lang_file_src)
        .pipe(gulpFlat())
        .pipe(gulpRename( function (path){
            path.extname = '.flat.json';
        }))
        .pipe(gulp.dest("App/Languages"));
});
Node module's index.js (A.k.a what I hope becomes gulp-flat)
var through = require('through2');
var gutil = require('gulp-util');
var flatten = require('flat');
var PluginError = gutil.PluginError;

// consts
const PLUGIN_NAME = 'gulp-flat';

// plugin level function (dealing with files)
function flattenGulp() {
    // creating a stream through which each file will pass
    var stream = through.obj(function(file, enc, cb) {
        if (file.isBuffer()) {
            //FIXME: I believe this is the problem line!!
            var flatJSON = new Buffer(JSON.stringify(
                flatten(file.contents)));
            file.contents = flatJSON;
        }
        if (file.isStream()) {
            this.emit('error', new PluginError(PLUGIN_NAME, 'Streams not supported! NYI'));
            return cb();
        }
        // make sure the file goes through the next gulp plugin
        this.push(file);
        // tell the stream engine that we are done with this file
        cb();
    });
    // returning the file stream
    return stream;
}

// exporting the plugin main function
module.exports = flattenGulp;
Resources
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/README.md
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/using-buffers.md
https://github.com/substack/stream-handbook
You are right about where the error is. The fix is simple. You just need to parse file.contents, since the flatten function operates on an object, not on a Buffer.
...
var flatJSON = new Buffer(JSON.stringify(
flatten(JSON.parse(file.contents))));
file.contents = flatJSON;
...
That should fix your problem.
And since you are new to the Gulp plugin thing, I hope you don't mind if I make a suggestion. You might want to consider giving your users the option to prettify the JSON output. To do so, just have your main function accept an options object, and then you can do something like this:
...
var flatJson = flatten(JSON.parse(file.contents));
var jsonString = JSON.stringify(flatJson, null, options.pretty ? 2 : null);
file.contents = new Buffer(jsonString);
...
You might find that the options object comes in useful for other things, if you plan to expand on your plugin in future.
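As a rough sketch of that idea (the option name and default are my own assumptions, and the stream-mode error handling is omitted), the plugin entry point could accept the options object like this:

function flattenGulp(options) {
    options = options || {}; // default to an empty options object
    return through.obj(function(file, enc, cb) {
        if (file.isBuffer()) {
            var flatJson = flatten(JSON.parse(file.contents));
            // pretty-print with two-space indentation when options.pretty is truthy
            file.contents = new Buffer(JSON.stringify(flatJson, null, options.pretty ? 2 : null));
        }
        this.push(file);
        cb();
    });
}

A task could then call .pipe(gulpFlat({pretty: true})) to get human-readable output.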
Feel free to have a look at the repository for a plugin I wrote called gulp-transform. I am happy to answer any questions about it. (For example, I could give you some guidance on implementing the streaming-mode version of your plugin if you would like).
Update
I decided to take you up on your invitation for contributions. You can view my fork here and the issue I opened up here. You're welcome to use as much or as little as you like, and in case you really like it, I can always submit a pull request. Hopefully it gives you some ideas at least.
Thank you for getting this project going.

In Node.js, asking for a value using Prompt, and using that value in a main js file

I'm pretty new to Node.js. It seems fairly easy to use, but when it comes to getting a value from the command line and returning that value to be used in another package or .js file, it seems harder than I expected.
Long story short, I've used an npm package (akamai-ccu-purge) to successfully enter a file to purge on the Akamai network.
I want to make it more dynamic though by prompting the user to enter the file they want purged and then using that in the akamai package.
After making a few tries using var stdin = process.openStdin(); I actually found another npm package called Prompt that seemed to be easier. Both ways seem to have the same problem though.
Node doesn't seem to want to stop for the input. It seems to want to automatically make the purge without waiting for input even though I've called that module first. It actually gets to the point where I should enter the file but it doesn't wait.
I am definitely missing something in my understanding or usage here, what am I doing wrong?
My code so far is:
var purgeUrl = require('./getUrl2');
var PurgerFactory = require('../../node_modules/akamai-ccu-purge/index'); // this is the directory where the index.js of the node module was installed

// area where I placed the authentication tokens
var config = {
    clientToken: //my tokens and secrets akamai requires
};

// area where urls are placed. More than one can be listed with comma separated values
var objects = [
    purgeUrl // I am trying to pull this from the getUrl2 module
];

// Go for it!
var Purger = PurgerFactory.create(config);

Purger.purgeObjects(objects, function(err, res) {
    console.log('------------------------');
    console.log('Purge Result:', res.body);
    console.log('------------------------');
    Purger.checkPurgeStatus(res.body.progressUri, function(err, res) {
        console.log('Purge Status', res.body);
        console.log('------------------------');
        Purger.checkQueueLength(function(err, res) {
            console.log('Queue Length', res.body);
            console.log('------------------------');
        });
    });
});
The getUrl2 module looks like this:
var prompt = require('../../node_modules/prompt');
//
// Start the prompt
//
prompt.start();
//
// Get property from the user
//
prompt.get(['newUrl'], function (err, result) {
    //
    // Log the results.
    //
    console.log('Command-line input received:');
    console.log(' http://example.com/custom/' + result.newUrl);
    var purgeUrl = 'http://example.com/custom/' + result.newUrl;
    console.log(purgeUrl);
    module.exports = purgeUrl;
});
Thanks again for the help!
I would probably just allow getURL2 to expose a method that will be invoked in the main module. For example:
// getURL2
var prompt = require('../../node_modules/prompt');

module.exports = {
    start: function(callback) {
        prompt.start();
        prompt.get(['newUrl'], function (err, result) {
            // the callback is defined in your main module
            return callback('http://example.com/custom/' + result.newUrl);
        });
    }
};
Then in your main module:
require('./getUrl2').start(function(purgeURL) {
    // do stuff with the purgeURL defined in the other module
});
The implementation may differ, but conceptually you need to make the work that depends on your second module, which requires some sort of input from the user, happen as a result of that input. Callbacks are a common way to do this (as are Promises). However, as prompt does not necessarily expose a method that would necessitate a Promise, you can do it with plain old callbacks.
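If you did prefer a Promise, a minimal sketch (my own variation, not something the prompt package gives you out of the box) would wrap the prompt callback like this:

// getUrl2.js
var prompt = require('../../node_modules/prompt');

module.exports = {
    start: function() {
        return new Promise(function(resolve, reject) {
            prompt.start();
            prompt.get(['newUrl'], function (err, result) {
                if (err) { return reject(err); }
                resolve('http://example.com/custom/' + result.newUrl);
            });
        });
    }
};

// main module:
// require('./getUrl2').start().then(function(purgeUrl) { /* run the purge here */ });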
You might also want to search around for articles on writing command line tools (sometimes referenced as CLIs) or command line apps with Node. I found the following article to be helpful when trying to figure this out myself:
http://javascriptplayground.com/blog/2015/03/node-command-line-tool/
Also, the command-line-args module worked well for me (though there's a number of other modules out there to choose from):
https://www.npmjs.com/package/command-line-args
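For reference, basic usage of command-line-args looks roughly like this (the option name here is made up for the example):

var commandLineArgs = require('command-line-args');

// define the options the script accepts
var optionDefinitions = [
    { name: 'url', type: String }
];

var options = commandLineArgs(optionDefinitions);
console.log(options.url); // e.g. node purge.js --url http://example.com/custom/page.html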
Good luck!

Node (Gulp) process.stdout.write to file

I'm trying to have gulp take care of my unit tests for me and output my test coverage to a .lcov file.
This is what I have so far:
gulp.task('test', function () {
    var test = fs.createWriteStream('./test.lcov', {flags: 'a'});
    return gulp.src('./assets/js/test/test.js', {read: false})
        .pipe(mocha({reporter: 'mocha-lcov-reporter'}))
        .pipe(test);
});
The mocha-lcov-reporter code can be found here:
https://github.com/StevenLooman/mocha-lcov-reporter/blob/master/lib/lcov.js
It outputs results through process.stdout.write()
But when I pipe this to a WriteStream I have the following error:
TypeError: Invalid non-string/buffer chunk
at validChunk (_stream_writable.js:152:14)
at WriteStream.Writable.write (_stream_writable.js:181:12)
at Stream.ondata (stream.js:51:26)
at Stream.emit (events.js:95:17)
at drain (/Users/braunromain/Documents/dev/should-i-go/node_modules/gulp-mocha/node_modules/through/index.js:36:16)
at Stream.stream.queue.stream.push (/Users/braunromain/Documents/dev/should-i-go/node_modules/gulp-mocha/node_modules/through/index.js:45:5)
at Stream.stream (/Users/braunromain/Documents/dev/should-i-go/node_modules/gulp-mocha/index.js:27:8)
at Stream.stream.write (/Users/braunromain/Documents/dev/should-i-go/node_modules/gulp-mocha/node_modules/through/index.js:26:11)
at write (/Users/braunromain/Documents/dev/should-i-go/node_modules/gulp/node_modules/vinyl-fs/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:623:24)
at flow (/Users/braunromain/Documents/dev/should-i-go/node_modules/gulp/node_modules/vinyl-fs/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:632:7)
Looks like gulp-mocha isn't totally set up to be a true through stream; in fact, it looks like it just pipes the source into a Mocha instance and lets Mocha do its thing.
The first thing that comes to my head is just to do a simple redirect in bash...
$ gulp test | grep -Ev "^\[[0-9:]{0,8}\]" > ./test.lcov
Of course this assumes all the output that is gulp related will start with [00:00:00] (with the 00 being the current system time). If this isn't the case you may end up with gulp output at the top and bottom of your file.
If you're looking for a more universal answer (read: using nodejs), you can rewrite process.stdout.write. This is probably frowned upon by most, but it would work. The trick is that you can't overwrite process.stdout with another stream, because it is implemented as a getter internally. You can, however, rewrite the stdout.write function. As a matter of fact, I just did this for a project I'm working on so I can review other developers' gulp logs in case they have issues with the build system.
I chose to go with a not-so-async solution because, unlike almost everything else in nodejs, stdout and stderr are both blocking streams and don't act like the asynchronous code you're used to. Using this technique, your task would end up looking something like this:
gulp.task('test', function () {
    // clear out old coverage file
    fs.writeFileSync('./test.lcov', '');

    // if you still want to see output in the console
    // you need a copy of the original write function
    var ogWrite = process.stdout.write;

    process.stdout.write = function( chunk ){
        fs.appendFile( './test.lcov', chunk );
        // this will write the output to the console
        ogWrite.apply( this, arguments );
    };

    return gulp.src('./assets/js/test/test.js', {read: false})
        .pipe(mocha({reporter: 'mocha-lcov-reporter'}));
});
I needed to log everything in my gulp process to a file and ended up with this which works ok for me:
var fs = require('fs');
var proc = require('process');

var origstdout = process.stdout.write,
    origstderr = process.stderr.write,
    outfile = 'node_output.log',
    errfile = 'node_error.log';

// remove old log files before anything gets written (sync, so it finishes first)
if (fs.existsSync(outfile)) { fs.unlinkSync(outfile); }
if (fs.existsSync(errfile)) { fs.unlinkSync(errfile); }

process.stdout.write = function( chunk ){
    // strip ANSI color codes before appending to the log file
    fs.appendFile(outfile, chunk.replace(/\x1b\[[0-9;]*m/g, ''));
    origstdout.apply(this, arguments);
};

process.stderr.write = function( chunk ){
    fs.appendFile(errfile, chunk.replace(/\x1b\[[0-9;]*m/g, ''));
    origstderr.apply(this, arguments);
};
