nodejs commander.js 'alias' not working as expected for "--help" - javascript

I am working with commander.js for my project and I am facing a weird issue when giving an alias to a command. I referred to the examples mentioned here: Commander.JS Example
I am looking to create a git-like command, so I started with the .command() method. When I clone the above repo and run it locally with the given pm example, the help option works as expected. Note that in the usage section there is only 'pm', and the command 'install' has the alias 'i', separated by a '|' symbol:
Usage: pm [options] [command]
Commands:
install|i [name] install one or more packages
other options ...
But when I run my own test application, the alias of my command gets appended to the test application name itself, and I get output like this:
Usage: index|r [options] [command]
Commands:
random random command
Please notice that the alias 'r' is shown with the index command instead of the 'random' command. If I add more commands to my index.js file, the last alias gets appended as Usage: index|<new alias> instead of being appended to the right command.
I am not able to understand what I am doing wrong. Can somebody please help here? How can I get the correct output when using the -h or --help option?
index.js
#!/usr/bin/env node
'use strict';
var program = require('commander');
program
  .version('1.0.0')
  .command('random', 'random command')
  .alias('r')
  .parse(process.argv);
index-random.js
#!/usr/bin/env node
'use strict';
var program = require('commander');
program
  .option('-r, --random <random>', 'Random command option')
  .parse(process.argv);
Package.json
{
  "name": "commander-test",
  "version": "1.0.0",
  "description": "Testing commander",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Raghuveer",
  "license": "UNLICENSED",
  "dependencies": {
    "commander": "^2.9.0",
    "graceful-readlink": "1.0.0"
  }
}
Steps to reproduce:
node index -h
Please let me know if you need more information.

Add the alias of the sub-command to the corresponding sub-command file. In your case you have to add it to the index-random.js file, like this:
#!/usr/bin/env node
'use strict';
var program = require('commander');
program
  .option('-r, --random <random>', 'Random command option')
  .alias('r')
  .parse(process.argv);
Then when you invoke node index.js help random,
the help will show Usage: index-random|r
However, the issue I am also facing is that if I call the index command with only the alias, nothing happens (for example, if I call node index r, no output is returned).
Let me know if you are successful in calling aliases instead of the full command.
Updated: Below are the outputs of two commands.
>node index.js help
Usage: index [options] [command]
Commands:
random random command
help [cmd] display help for [cmd]
Options:
-h, --help output usage information
-V, --version output the version number
>node index.js help random
Usage: index-random|r [options]
Options:
-h, --help output usage information
-r, --random <random> Random command option
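The underlying reason appears to be that, in the git-style (separate executable) form, .command('random', 'random command') returns the top-level program for chaining, so the chained .alias('r') in index.js lands on the program itself rather than on the subcommand. If you do not need separate executable files, a single-file layout avoids this: .command() without a description returns the subcommand object, so .alias() attaches to it. A minimal sketch (the action body is only an illustration, not from the original code):
#!/usr/bin/env node
'use strict';
var program = require('commander');

program.version('1.0.0');

// Single-file subcommand: .command() without a description returns the
// subcommand, so .alias('r') applies to 'random' rather than to index.
program
  .command('random')
  .alias('r')
  .description('random command')
  .option('-r, --random <random>', 'Random command option')
  .action(function (cmd) {
    console.log('random called with', cmd.random);
  });

program.parse(process.argv);
With this layout, node index -h lists the command as random|r, and node index r runs the action directly.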

Related

Fail Gatsby build if environment variable missing

I have experimented with adding environment variables to my Gatsby project using .env.development and .env.production files and it's working great.
I would like to have my builds fail if one of the environment variables is missing, however I can't seem to see how to enable this functionality.
I have read through the Gatsby environment variables documentation, but can't seem to see how this would work. Is this possible?
I believe it uses dotenv and the webpack DefinePlugin under the hood.
I’m sure there are other ways to do this, but with some quick tests, this approach seems to be working well for me.
In your gatsby-config.js file, you can choose to explicitly require dotenv, so you can use those environment variables in your config.
I added the following, and now the Gatsby build will fail unless the specified environment variables are present.
// Load the environment variables, per
// https://www.gatsbyjs.org/docs/environment-variables/#server-side-nodejs
require('dotenv').config({
  path: `.env.${process.env.NODE_ENV}`,
})

function checkEnv(envName) {
  if (typeof process.env[envName] === 'undefined' || process.env[envName] === '') {
    throw `Missing required environment variables: ${envName}`
  }
}

try {
  checkEnv('NODE_ENV')
  checkEnv('EXAMPLE_MISSING_ENV')
  checkEnv('EXAMPLE_API_KEY')
} catch (e) {
  throw new Error(e)
}
// The rest of the config file
I could imagine customizing this further, e.g. logging a warning for a variable with a fallback versus throwing an error for one that is required by your content sourcing plugin or theme. Hope this is helpful as a starting point!
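A minimal sketch of that warn-versus-fail idea (the variable names below are only examples, not from the original setup):
// Hard failure for variables the build cannot run without.
function requireEnv(name) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`)
  }
}

// Soft warning plus fallback for optional variables.
function warnEnv(name, fallback) {
  if (!process.env[name]) {
    console.warn(`Missing ${name}; falling back to "${fallback}"`)
    process.env[name] = fallback
  }
}

requireEnv('EXAMPLE_API_KEY')
warnEnv('EXAMPLE_LOCALE', 'en')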
I couldn't find a built-in solution for this in Gatsby either. You can do it manually, but it's still not that easy.
First problem: if you want to load your environment from a file while running an npm script, it cannot be loaded right away. But you can trigger a shell script, and it can load these environment variables before your check.
Let's say build.sh in the root directory of the project:
source ./.env.development # this line will set env variables
if [ "$API_KEY" = 927349872349798 ] ; then
  npm run build
fi
Another problem arises: some developers might want to run it on Windows. So it's better to use the well-known cross-env package.
npm i cross-env
Then everything is ready; add your secure-build script:
"scripts": {
"build": "gatsby build",
"develop": "gatsby develop",
"format": "prettier --write \"**/*.{js,jsx,json,md}\"",
"start": "npm run develop",
"serve": "gatsby serve",
"clean": "gatsby clean",
"test": "echo \"Write tests! -> https://gatsby.dev/unit-testing\" && exit 1",
"secure-build": "cross-env-shell \"./build.sh\""
},
And run it:
npm run secure-build
This solution feels like too much to me, as we created a build.sh and installed a new package. Maybe there is a cleaner solution; I am not a Gatsby guru, after all.
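A lighter-weight, cross-platform variant of the same idea (sketched here with a hypothetical check-env.js; the variable name is just an example) is to do the check in a small Node script and chain it before the build in package.json, which avoids both the shell script and the extra package:
// check-env.js (hypothetical helper)
require('dotenv').config({ path: `.env.${process.env.NODE_ENV || 'production'}` })

const required = ['API_KEY']
const missing = required.filter((name) => !process.env[name])

if (missing.length > 0) {
  console.error(`Missing required environment variables: ${missing.join(', ')}`)
  process.exit(1) // non-zero exit code stops the npm script chain here
}
and in package.json: "secure-build": "node check-env.js && gatsby build".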
I added env checking to the onPreInit life cycle hook in gatsby-node.ts:
import type { GatsbyNode } from "gatsby";

const envVariablesList = [
  "ENV1",
  "ENV2",
  "ENV3",
];

// Returns the first declared variable that is missing or empty, if any.
function envVarChecker(vars: string[]): string | undefined {
  return vars.find(
    (item) => process.env[item] === undefined || process.env[item] === ""
  );
}

export const onPreInit: GatsbyNode["onPreInit"] = () => {
  const emptyEnv = envVarChecker(envVariablesList);
  if (emptyEnv !== undefined) {
    throw new Error(`Env variable: ${emptyEnv} is empty!`);
  }
};
It fails the build almost at the very beginning (during the pre-bootstrap phase) if any of the declared variables is missing.

Argument passing to PM2

I want to put a Node service into production.
When I launch my application with my arguments like this: node ./backend -c "uf4m6fhnh" -s "SPNLGZsUoSpQ=" -o "8696", everything works well.
Now I want to run it in production with PM2.
I have tried the two ways to do that (CLI and JSON file), like this:
CLI version :
pm2 start backend.js --node-args="-c uf4lvm6fhnh -s SPNLGZsUoSpQ= -o 8696" --name MyAppName
and also :
pm2 start backend.js --name MyAppName -- "-c uf4lvm6fhnh -s SPNLGZsUoSpQ= -o 8696"
Config file (JSON) :
{
  "apps": [
    {
      "name": "MyAppName ",
      "script": "./backend.js",
      "node_args": [
        "-c",
        "uf4lvm6fhnh",
        "-s",
        "SPNLGZsUoSpQ=",
        "-o",
        "8696"
      ]
    }
  ]
}
and then: pm2 start myConfigJson.json
For each of these possible solutions, I get the same error in my pm2 logs:
Error: Cannot find module '/home/me/Projects/Project/uf4lvm6fhnh'
(Note that the module that is not found is my passed argument.)
Any ideas ?
Use args instead.
node_args is an alias for interpreter_args, which passes arguments to node itself rather than to your script. As a result, your command line ends up invoking node's own -c|--check flag instead.
See http://pm2.keymetrics.io/docs/usage/pm2-doc-single-page/#programmatic-api and https://nodejs.org/api/cli.html#cli_c_check
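Applied to the JSON config from the question, only node_args needs to change to args (a sketch, values taken from the question):
{
  "apps": [
    {
      "name": "MyAppName",
      "script": "./backend.js",
      "args": ["-c", "uf4lvm6fhnh", "-s", "SPNLGZsUoSpQ=", "-o", "8696"]
    }
  ]
}
On the CLI, the equivalent is to put the script arguments after -- without wrapping them in a single quoted string: pm2 start backend.js --name MyAppName -- -c uf4lvm6fhnh -s SPNLGZsUoSpQ= -o 8696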

Jest unit Testing - No tests found

I am starting out with Jest unit testing.
I keep getting "No tests found" while running a jest unit test.
Error description
yarn test
yarn run v1.3.2
$ jest
No tests found
In E:\Course\Testing JavaScript\Jest Demo
4 files checked.
testMatch: **/__tests__/**/*.js?(x),**/?(*.)(spec|test).js?(x) - 3 matches
testPathIgnorePatterns: \\node_modules\\ - 4 matches
Pattern: - 0 matches
error Command failed with exit code 1.
OS: Windows 7, Node: 8.6, npm: 5.4.2
folder structure :
Demo
  package.json
  __tests__
    sum.js
    sum.test.js
sum.js file:
function sum(a, b) {
  return a + b;
}
module.exports = sum;
sum.test.js
const sum = require("./sum");

test("adds 1 + 2 to equal 3", () => {
  expect(sum(1, 2)).toBe(3);
});
package.json
{
  "name": "JestDemo",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "dependencies": {
    "jest": "^22.4.3"
  },
  "scripts": {
    "test": "jest"
  }
}
So far, I have referred to these articles:
Stackoverflow - Jest No Tests found
and a few more posts on GitHub regarding the same issue, but nothing helped.
What I did so far:
1) Changed the script to point to the folder containing the test files:
"scripts": {
  "test": "jest __test__" // from "test": "jest"
}
2) Tried changing the folder structure.
Could someone help me figure out the problem?
You appear to have placed both your code and test files in the same directory (__tests__/sum.js and __tests__/sum.test.js). I don't think that's the specific reason for the failure you are seeing, but it's not the convention.
It should not be necessary to specify a folder to search unless your project is complex. Run jest without any arguments, other than necessary options, and it will search the current directory and below.
The convention is either:
/root
  /src
    sum.js
  /__tests__
    sum.js
or:
/root
  /src
    sum.js
    sum.test.js
You can use test or spec in the latter example. root and src are up to you; root is normally the location of package.json and is where you invoke jest from (e.g. myApp), and src helps differentiate your source from other app components like build and assets.
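If your files do end up somewhere the default patterns don't cover, a small jest section in package.json can point Jest at them. A minimal sketch (the pattern below is only an example):
{
  "scripts": {
    "test": "jest"
  },
  "jest": {
    "testMatch": ["**/__tests__/**/*.test.js"]
  }
}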

npm script pass parameters/arguments to node script using yargs

Is it possible to retrieve a key from yargs when it is passed as an npm script argument?
User types in the OSX terminal:
npm run scaffold --name=blah
which executes in package.json:
"scaffold" : "node ./scaffold/index.js -- "
This results in
const yargs = require('yargs').argv

if (yargs) {
  console.log(yargs);
  console.log(yargs.name);
  process.exit(1)
}
...
result:
{ _: [], '$0': 'scaffold/index.js' }
undefined
This only works if I hard-code "scaffold": "node scaffold/index.js --name=blah" in package.json, but I need this to be configurable.
As I stated, I am using yargs, as it appears to make it easy to retrieve keys by name (as opposed to an array). Open to suggestions.
What am I missing?
update 11-07-2017
Related: Sending command line arguments to npm script
However, passing on the command line either 1: npm run scaffold name=hello
or 2: npm run scaffold --name=hello yields:
1: { _: [], '$0': 'scaffold/index.js' }
2: { _: [ 'name=hello' ], '$0': 'scaffold/index.js' }
I still can't see a way to retrieve the yargs.name property. It is still undefined.
Update 13-07-2017
For the time being, I have given up. It just seems impossible. I run the script manually in the terminal.
E.g.
node ./scaffold/index.js --name=blah
Running the node script directly works, as opposed to running it through npm scripts. I have added the nopt node module (https://www.npmjs.com/package/nopt) to see if it helps (it doesn't); process.argv.name is still undefined when running through npm scripts.
Update 18-07-2017
Added github example: https://github.com/sidouglas/stackoverflow-node-arguments
Update 24-07-2017
Adding the variables before the start of the command works:
myvar="hello world" npm run scaffold, as opposed to npm run scaffold myvar="hello world"
As of npm#2.0.0, you can use custom arguments when executing scripts. The special option -- is used by getopt to delimit the end of the options. npm will pass all the arguments after the -- directly to your script:
npm run test -- --grep="pattern"
https://docs.npmjs.com/cli/run-script
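Applied to the question's setup, that looks roughly like this (assuming the trailing "-- " is dropped from the script so it is just "scaffold": "node ./scaffold/index.js"):
// scaffold/index.js
const argv = require('yargs').argv

// Invoked as: npm run scaffold -- --name=blah
// npm forwards everything after the first -- to the script, so yargs
// sees --name=blah and argv.name should be 'blah'.
console.log(argv.name)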
I'm not sure that it matters where the variables are added on the command line, and if this is of no concern to you, then this works:
//package.json
{
  "name": "npm-test",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "dependencies": {},
  "devDependencies": {},
  "scripts": {
    "start": "node index.js"
  },
  "author": "",
  "license": "ISC"
}
Your JS file:
//index.js
console.log('myvar', process.env.myvar);
And your command line command:
myvar="hello world" npm run start
So in the end, just prefix your npm script command with your argument list.
For me, the following works on Node 10, 12, and 14:
npm run yourscript -- -- --name=bla
I do need to use -- --
and
"yourscript": "node bla.js"

execute some code and then go into interactive node

Is there a way to execute some code (in a file or from a string, doesn't really matter) before dropping into interactive mode in node.js?
For example, if I create a script __preamble__.js which contains:
console.log("preamble executed! poor guy!");
and a user types node __preamble__.js they get this output:
preamble executed! poor guy!
> [interactive mode]
Really old question but...
I was looking for something similar, I believe, and found this.
You can open the REPL (by typing node in your terminal) and then load a file,
like this: .load ./script.js.
Press enter and the file content will be executed. Now everything created (objects, variables, functions) in your script will be available.
For example:
// script.js
var y = {
  name: 'obj',
  status: true
};

var x = setInterval(function () {
  console.log('As time goes by...');
}, 5000);
On the REPL:
//REPL
.load ./script.js
Now you can type in the REPL and interact with the "living code".
You can console.log(y) or clearInterval(x).
It will be a bit odd, because "As time goes by..." keeps showing up every five seconds (or so).
But it will work!
You can start a new repl in your Node software pretty easily:
var repl = require("repl");
var r = repl.start("node> ");
r.context.pause = pauseHTTP;
r.context.resume = resumeHTTP;
From within the REPL you can then call pause() or resume() and execute the functions pauseHTTP() and resumeHTTP() directly. Just assign whatever you want to expose to the REPL's context member.
This can be achieved with the current version of NodeJS (5.9.1):
$ node -i -e "console.log('A message')"
The -e flag evaluates the string and the -i flag begins the interactive mode.
You can read more in the referenced pull request
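A small variation on the same flags (assuming a Bash-like shell) loads a whole file before dropping into the REPL, so its top-level bindings should be available at the prompt:
node -i -e "$(cat __preamble__.js)"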
node -r allows you to require a module when REPL starts up. NODE_PATH sets the module search path. So you can run something like this on your command line:
NODE_PATH=. node -r myscript.js
This should put you in a REPL with your script loaded.
I've recently started a project to create an advanced interactive shell for Node and associated languages like CoffeeScript. One of the features is loading a file or string in the context of the interpreter at startup which takes into account the loaded language.
http://danielgtaylor.github.com/nesh/
Examples:
# Load a string (Javascript)
nesh -e 'var hello = function (name) { return "Hello, " + name; };'
# Load a string (CoffeeScript)
nesh -c -e 'hello = (name) -> "Hello, #{name}"'
# Load a file (Javascript)
nesh -e hello.js
# Load a file (CoffeeScript)
nesh -c -e hello.coffee
Then in the interpreter you can access the hello function.
Edit: Ignore this. #jaywalking101's answer is much better. Do that instead.
If you're running from inside a Bash shell (Linux, OS X, Cygwin), then
cat __preamble__.js - | node -i
will work. This also spews lots of noise from evaluating each line of __preamble__.js, but afterwards you land in an interactive shell in the context you want.
(The '-' passed to 'cat' just means "use standard input".)
Similar answer to #slacktracer's, but if you are fine with using global in your script, you can simply require it instead of (learning and) using .load.
Example lib.js:
global.x = 123;
Example node session:
$ node
> require('./lib')
{}
> x
123
As a nice side-effect, you don't even have to do the var x = require('x'); 0 dance, as module.exports remains an empty object and thus the require result will not fill up your screen with the module's content.
Vorpal.js was built to do just this. It provides an API for building an interactive CLI in the context of your application.
It includes plugins, and one of these is Vorpal-REPL. This lets you type repl and this will drop you into a REPL within the context of your application.
Example to implement:
var vorpal = require('vorpal')();
var repl = require('vorpal-repl');
vorpal.use(repl).show();
// Now you do your custom code...
// If you want to automatically jump
// into REPl mode, just do this:
vorpal.exec('repl');
That's all!
Disclaimer: I wrote Vorpal.
There isn't a way to do this natively. You can either enter the node interactive shell with node or run a script you have with node myScript.js. #sarnold is right in that if you want that for your app, you will need to make it yourself, and the repl toolkit is helpful for that kind of thing.
nit-tool lets you load a node module into the interactive REPL and have access to the inner module environment (join context) for development purposes.
npm install nit-tool -g
First I tried
$ node --interactive foo.js
but it just runs foo.js, with no REPL.
If you're using export and import in your JS, run npm init -y, then tell Node that you're using modules by adding the "type": "module" line:
{
  "name": "neomem",
  "version": "1.0.0",
  "description": "",
  "type": "module",
  "main": "home.js",
  "keywords": [],
  "author": "",
  "license": "ISC"
}
Then you can run node and import a file with dynamic import -
$ node
Welcome to Node.js v18.1.0.
Type ".help" for more information.
> home = await import('./home.js')
[Module: null prototype] {
get: [AsyncFunction: get],
start: [AsyncFunction: start]
}
> home.get('hello')
Kind of a roundabout way of doing it - having a command line switch would be nice...
