Yargs help not displaying properly while --h is working - javascript

In my CLI when I call node cli.js --help I get this:
Options:
--help Show help [boolean]
--version Show version number [boolean]
However, when I call node cli.js --h I get the expected output:
cli.js
Run a set of user flows and save the result
Commands:
cli.js init Setup .user-flowrc.json
...
Options:
--version Show version number [boolean]
-v, --verbose Run with verbose logging [boolean]
...
-h, --help Show help [boolean]
Examples:
init Setup user-flows over prompts
I have tried adding yargs.parse() as suggested in "Yargs help not displaying all the help options", but it did not change anything.
And I am using .argv at the end, so it should only be called once, if I understood "Why are commands missing from this yargs script help page?" correctly.
Similarly, I have read the yargs examples (yargs/docs/examples.md) and added the alias, which is why --h is working.
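The alias registration looks roughly like this in yargs (an illustrative sketch only; in the real setup the aliases presumably sit in the options object passed to setupYargs below):
yargs
  .option('verbose', { alias: 'v', type: 'boolean', describe: 'Run with verbose logging' })
  .help()
  .alias('h', 'help');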
Here is the code I think is relevant:
export function setupYargs(
  commands: YargsCommandObject[],
  options: { [key: string]: Options },
  config: Record<string, any> = {}
) {
  yargs
    .options(options)
    .parserConfiguration({ 'boolean-negation': true })
    .recommendCommands()
    .config(config)
    .example([
      ['init', 'Setup user-flows over prompts']
    ])
    .help();

  commands.forEach((command) =>
    yargs.command(
      command.command,
      command.description,
      command?.builder || (() => {}),
      command.module.handler
    )
  );

  return yargs.argv;
}
If there is any other section of the codebase which may be useful, let me know and I will add it to the question.
--- Edit ---
I have now narrowed down the issue to the setup of the CLI.
const { collect, persist, assert } = readRcConfig(getRcParam());
(async () => runCli({
  commands: commands,
  options: { ...GLOBAL_OPTIONS },
  config: { ...collect, ...persist, ...assert }
}))();
Where getRcParam() gets the path to the configuration:
export function get(): string {
  // we don't rely on yargs option normalization features as this can happen before cli bootstrap
  const { rcPath, p, ['rc-path']: rc_path } = argv as any as ArgvOption<any>;
  const pathToCfgRc = rcPath || rc_path || p;
  return (pathToCfgRc as any as string) || join(DEFAULT_RC_PATH, DEFAULT_RC_NAME);
}
And is passed as a config option.
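If the argv destructured there comes from yargs, reading it triggers a parse before setupYargs has registered the commands, examples and the help alias, which could explain why --help prints only the two default options. Purely as an illustration (the helper name and flag handling are hypothetical; join, DEFAULT_RC_PATH and DEFAULT_RC_NAME are reused from the module above), the rc path could be read straight from process.argv without involving yargs:
// hypothetical alternative to get(): scan process.argv by hand so that
// no yargs parse happens before the CLI is fully set up
export function getRcParamRaw(): string {
  const args = process.argv.slice(2);
  for (let i = 0; i < args.length; i++) {
    const arg = args[i];
    // --rcPath <path>, --rc-path <path>, -p <path>
    if (arg === '--rcPath' || arg === '--rc-path' || arg === '-p') {
      return args[i + 1] || join(DEFAULT_RC_PATH, DEFAULT_RC_NAME);
    }
    // --rcPath=<path> / --rc-path=<path>
    const match = arg.match(/^--(?:rcPath|rc-path)=(.+)$/);
    if (match) {
      return match[1];
    }
  }
  return join(DEFAULT_RC_PATH, DEFAULT_RC_NAME);
}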

Related

Displaying a domain's NS in a Vue client project

I have created an input and a button in my project. The user is supposed to enter a domain name and get its NS by clicking the button, through a Node.js function that returns the nameservers.
getNs.js file in @/plugins/
export function getNs(domain) {
  const dns = require('node:dns');
  const dnsPromises = dns.promises;
  const options = {
    family: 6,
    hints: dns.ADDRCONFIG | dns.V4MAPPED,
  };
  options.all = true;
  dnsPromises.resolveNs(domain, options).then((result) => {
    console.log('addresses: %j', result);
    return result
  });
}
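As an aside, dnsPromises.resolveNs() takes only the hostname, and the promise above is never returned to the caller; a minimal Node-only sketch that propagates the result would look like the following (it still only runs under Node, not inside a browser bundle):
// getNs.js – Node-only sketch: return the promise so callers can await the list
import { promises as dnsPromises } from 'node:dns';

export function getNs(domain) {
  // resolves to an array of name server strings for the domain
  return dnsPromises.resolveNs(domain);
}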
Declaring the function in the component script:
Importing the plugin:
import { getIp } from "~/services/displayIp";
In methods:
goToSecondStep() {
  getIp()
}
The event happens on the button:
@click="goToSecondStep"
The error I get:
This dependency was not found: friendly-errors 09:54:39
friendly-errors 09:54:39
node:dns in ./services/displayIp.js friendly-errors 09:54:39
friendly-errors 09:54:39
To install it, you can run: npm install --save node:dns
I get the same error even though I ran the npm install --save node:dns command.
How can I use Node.js functions or plugins in Vue.js templates?

Yarn Workspaces, workspace does not emit errors or warnings

I have followed the following post in order to create a monorepo using yarn workspaces and craco.
It works really well except for one thing: the errors/warnings of the common (components) library are not emitted to the console.
The structure is very simple:
monorepo
|- packages
   |- components
   |- fe
fe is the main web app that uses the components library.
The FE emits all warnings correctly; components does not.
How can I make the shared components library emit warnings/errors?
Updated:
Steps to reproduce in this repo:
https://github.com/sofoklisM/my-monorepo.git
What you need to change is the context option of the underlying ESLint Webpack plugin that is used by Create React App.
In this case I changed the context of ESLint to the root of the monorepo (yarn workspace root).
Here is an updated craco.config.js that should do the trick:
// craco.config.js
const path = require("path");
const { getLoader, loaderByName } = require("@craco/craco");
const { getPlugin, pluginByName } = require("@craco/craco/lib/webpack-plugins");

const absolutePath = path.join(__dirname, "../components");

module.exports = {
  webpack: {
    alias: {},
    plugins: [],
    configure: (webpackConfig, { env, paths }) => {
      const { isFound, match } = getLoader(
        webpackConfig,
        loaderByName("babel-loader")
      );
      if (isFound) {
        const include = Array.isArray(match.loader.include)
          ? match.loader.include
          : [match.loader.include];
        match.loader.include = include.concat([absolutePath]);
      }

      // Change context of ESLint Webpack Plugin
      const { match: eslintPlugin } = getPlugin(webpackConfig, pluginByName("ESLintWebpackPlugin"));
      eslintPlugin.options['context'] = path.join(__dirname, "../..");

      return webpackConfig;
    }
  }
};
I've also made an updated fork of your reproduction repo here: https://github.com/ofhouse/stackoverflow-65447779

Cypress - Getting error while executing 'cypress open'

I have a testing framework with Node, Cypress, Mocha, mochawesome and mochawesome-merge, as below, with this GitHub repo:
In my package.json I have two scripts:
"scripts": {
  "cy": "./node_modules/.bin/cypress open",
  "cy_test": "node cypress.js"
},
If I run npm run cy_test it works fine in headless mode, but if I run npm run cy I get the following error:
But if I remove cypress.js from my project, then it works as expected.
cypress.js
const cypress = require('cypress')
const marge = require('mochawesome-report-generator')
const { merge } = require('mochawesome-merge')
const currRunTimestamp = getTimeStamp();
const mergedReport = {
reportDir: 'mochawesome-report',
}
const finalReport = {
reportDir: 'reports',
}
cypress.run({
reporter: 'mochawesome',
reporterOptions: {
reportDir: 'mochawesome-report',
overwrite: false,
html: true,
json: true
}
}).then(
() => {
generateReport()
},
error => {
generateReport()
console.error(error)
process.exit(1)
}
)
function generateReport(options) {
return merge(mergedReport).then(report => marge.create(report, finalReport))
}
I think this is a problem with npm on Windows that is messing with file names, because npm is trying to run the script as a binary instead of getting it from ./node_modules/.bin.
So, as a first try, I'd suggest changing the name of cypress.js to something other than cypress, if you can. I think this can solve your problem.
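For example (the new file name here is only an illustration), renaming cypress.js to something like cypress-runner.js and pointing the script at it keeps the same behaviour while avoiding the clash with the cypress binary:
"scripts": {
  "cy": "./node_modules/.bin/cypress open",
  "cy_test": "node cypress-runner.js"
},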
If not, as a workaround, remove .JS from the PATHEXT environment variable and restart the processes that are running the script, including your IDE, if applicable.
Hope it works.

Specify code to run before any Jest setup happens

The tl;dr is:
1) How can I have Jest use the native require function to load all modules in my tests, everywhere?
2) Where / how would I go about modifying the require function in one place (i.e. replacing it with the esm loader, https://github.com/standard-things/esm), before any tests run, so all tests will use the modified require?
I'd like to use the esm-loader with my Jest test files. In order to do so, I need to patch the require function globally, before any test code runs, with something like
require = require("#std/esm")(module, { esm: "js", cjs: true });
How do I tell Jest to execute that code before anything else is touched or requested?
I tried pointing both setupTestFrameworkScriptFile and an setupFiles array entry to a file with that in it, but neither worked (though I did confirm that both ran).
Alternatively, I'm firing off these tests with an npm script
"scripts": {
"test": "jest"
}
Is there some CLI magic whereby I can just load a module and then run jest?
Edit - the testEnvironment and resolver options make me wonder if this is ever even using the actual Node require function to load modules, or instead using its own module loader. If so I wonder if this is even possible.
So this one was a bit tough to get working. The solution is quite simple, but it took me a while to get it right. The problem is that no matter what kind of module Jest loads:
Setup Files
Setup Framework Files
Test Files
Module files
they are all loaded in the following way:
({"Object.":function(module,exports,require,__dirname,__filename,global,jest){/*Module code inside*/
}});
If you have a look at node_modules/jest-runtime/build/index.js:495-510, you'll see:
const dirname = (_path || _load_path()).default.dirname(filename);
localModule.children = [];
localModule.parent = mockParentModule;
localModule.paths = this._resolver.getModulePaths(dirname);
localModule.require = this._createRequireImplementation(filename, options);

const transformedFile = this._scriptTransformer.transform(
  filename,
  {
    collectCoverage: this._coverageOptions.collectCoverage,
    collectCoverageFrom: this._coverageOptions.collectCoverageFrom,
    collectCoverageOnlyFrom: this._coverageOptions.collectCoverageOnlyFrom,
    isInternalModule,
    mapCoverage: this._coverageOptions.mapCoverage },
  this._cacheFS[filename]);
The call this._createRequireImplementation(filename, options) gives every module a custom require object, so you never get the native require function at all, anywhere. Once Jest has started, every module loaded from then on has Jest's custom require function.
When we load a module, the requireModule method from jest-runtime gets called. Below is an excerpt from it:
moduleRegistry[modulePath] = localModule;
if ((_path || _load_path()).default.extname(modulePath) === '.json') {
  localModule.exports = this._environment.global.JSON.parse(
    (0, (_stripBom || _load_stripBom()).default)((_gracefulFs || _load_gracefulFs()).default.readFileSync(modulePath, 'utf8')));
} else if ((_path || _load_path()).default.extname(modulePath) === '.node') {
  // $FlowFixMe
  localModule.exports = require(modulePath);
} else {
  this._execModule(localModule, options);
}
As you can see, if the file's extension is .node it loads the module directly; otherwise it calls _execModule. That function is the same code I posted earlier, which does the code transformation:
const isInternalModule = !!(options && options.isInternalModule);
const filename = localModule.filename;
const lastExecutingModulePath = this._currentlyExecutingModulePath;
this._currentlyExecutingModulePath = filename;
const origCurrExecutingManualMock = this._isCurrentlyExecutingManualMock;
this._isCurrentlyExecutingManualMock = filename;
const dirname = (_path || _load_path()).default.dirname(filename);
localModule.children = [];
localModule.parent = mockParentModule;
localModule.paths = this._resolver.getModulePaths(dirname);
localModule.require = this._createRequireImplementation(filename, options);
Now, when we want to modify the require function for our tests, we need _execModule to export our code directly. So the code should be similar to the loading of a .node module:
} else if ((_path || _load_path()).default.extname(modulePath) === '.mjs') {
  // $FlowFixMe
  require = require("@std/esm")(localModule);
  localModule.exports = require(modulePath);
} else {
But doing that would mean patching Jest's code, which we want to avoid. So what we do instead is avoid using the jest command directly: we create our own jestload.js and run that. The code for loading Jest is simple:
#!/usr/bin/env node
/**
* Copyright (c) 2014-present, Facebook, Inc. All rights reserved.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
cli = require('jest/bin/jest');
Now we want to modify _execModule before the CLI loads, so we add the code below:
const jestRuntime = require("jest-runtime");

oldexecModule = jestRuntime.prototype._execModule;

jestRuntime.prototype._execModule = function (localModule, options) {
  if (localModule.id.indexOf(".mjs") > 0) {
    localModule.exports = require("@std/esm")(localModule)(localModule.id);
    return localModule;
  }
  return oldexecModule.apply(this, [localModule, options]);
};

cli = require('jest/bin/jest');
Now time for a test
// __test__/sum.test.js
sum = require('../sum.mjs').sum;

test('adds 1 + 2 to equal 3', () => {
  expect(sum(1, 2)).toBe(3);
});

test('adds 2 + 3 to equal 5', () => {
  expect(sum(3, 2)).toBe(5);
});
And a sum.mjs file
export function sum (x, y) { return x + y }
Now we run the test
The solution is available in the repo below:
https://github.com/tarunlalwani/jest-overriding-require-function-stackoverflow
You can clone and test the solution by running npm test.
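Presumably the npm test script then points at the wrapper instead of the jest binary, along these lines (sketch):
"scripts": {
  "test": "node ./jestload.js"
}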
setupFiles worked for me. Add this in package.json:
"jest": {
"setupFiles": ["./my_file.js"]
},
https://jestjs.io/docs/en/configuration.html#setupfiles-array
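my_file.js then holds whatever needs to run before each test file's framework setup; for instance, it could contain the require patch from the question (just a sketch, and note the caveat above about Jest handing every module its own require):
// my_file.js – runs via setupFiles before the test framework is installed
require = require("@std/esm")(module, { esm: "js", cjs: true });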
I tried using node -r @std/esm run.js, where run.js is just a script that calls jest, but it does not work and crashes here: https://github.com/facebook/jest/blob/master/packages/jest-runtime/src/script_transformer.js#L305.
From what I understand, this line means that it is not possible, because Jest compiles the module using the native vm module. The lines just above it (at line 290):
if (willTransform) {
  const transformedSource = this.transformSource(
    filename,
    content,
    instrument,
    !!(options && options.mapCoverage));

  wrappedCode = wrap(transformedSource.code);
  sourceMapPath = transformedSource.sourceMapPath;
} else {
are what gets called when you specify transforms in your Jest config.
Conclusion: until ES modules are supported (and they will be, under the .mjs extension), you cannot import ES modules in Jest without specifying a transform. You could try to monkey-patch vm, but I would really advise against that option.
Specifying a Jest transform is really not that hard, and for ES modules it's as simple as using babel-jest with the right Babel config.
Below is a package.json with minimal settings:
{
  "dependencies": {
    "babel-jest": "^21.2.0",
    "babel-plugin-transform-es2015-modules-commonjs": "^6.26.0",
    "jest": "^21.2.1"
  },
  "jest": {
    "testMatch": [
      "<rootDir>/src/**/__tests__/**/*.js?(x)",
      "<rootDir>/src/**/?(*.)(spec|test).js?(x)"
    ],
    "transform": {
      "^.+\\.(js|jsx)$": "<rootDir>/node_modules/babel-jest"
    },
    "testEnvironment": "node",
    "testURL": "http://localhost",
    "moduleFileExtensions": [
      "js",
      "json"
    ]
  },
  "babel": {
    "plugins": ["babel-plugin-transform-es2015-modules-commonjs"]
  }
}
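With that config, a test written with ES module syntax in a plain .js file (note that .mjs is not listed in moduleFileExtensions here) gets transpiled to CommonJS by babel-jest before Jest executes it. A minimal illustrative pair of files:
// src/sum.js
export function sum (x, y) { return x + y }

// src/__tests__/sum.test.js
import { sum } from '../sum';

test('adds 1 + 2 to equal 3', () => {
  expect(sum(1, 2)).toBe(3);
});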

How to whitelist all subdependencies with webpack-node-externals

I'm using webpack to bundle server assets using the target property.
This results in a usable client bundle and a usable server, which is working great. However, it seems that even for the server code, webpack is bundling everything within node_modules. I am attempting to use webpack-node-externals to solve this problem, as seen below:
module.exports = [
  {
    name: "server code, output to ./server",
    entry: "./servertest.js",
    output: {
      filename: "./server/index.js"
    },
    target: "node",
    externals: [
      nodeExternals({
        includeClientPackages: false
      })
    ]
  },
  {
    name: "client side, output to ./public",
    entry: "./app.js",
    output: {
      filename: "./dist/app.js"
    }
  }
]
This doesn't work, however, as its default behavior is to exclude all of node_modules from bundling, thus rendering the server useless. There is a whitelist option, in which I have included express, the only dependency of my small test case. It doesn't fail on express; however, it fails on a dependency of express, merge-descriptors. And of course, if I add merge-descriptors to the whitelist, trying to start the server will fail on another dependency of express. I surely cannot add every dependency and sub-dependency (etc.) to this whitelist array.
How can I ensure all dependencies of a given requirement are bundled by webpack during a target: 'node' build?
To deal with this, I created a small helper, datwd, to get a list of all subdependencies for specific packages:
// webpack.config.js
const nodeExternals = require('webpack-node-externals')
const includeSubdependencies = require('datwd')

module.exports = {
  // ...
  externals: [
    nodeExternals({
      // Will include "cookies" and its dependencies; for example:
      // `['cookies', 'depd', 'keygrip', 'tsscmp']`
      allowlist: includeSubdependencies(['cookies'])
    })
  ]
}
It relies on npm ls under the hood. Source code:
/* eslint-disable global-require, import/no-dynamic-require */
const { execSync } = require('child_process')

/**
 * Returns a flat array of all Node module dependency names for the entire
 * dependency tree, optionally filtered by top-level modules. Requires NPM
 * and for dependencies to be installed.
 *
 * @param {Array} moduleFilter - An optional array of top-level module names
 *   whose dependencies should be included. If specified, any other modules'
 *   dependencies will be excluded.
 * @return {String[]} An array of module names
 */
const getAllDependencies = (moduleFilterInput = []) => {
  const moduleFilter = Array.isArray(moduleFilterInput)
    ? moduleFilterInput
    : [moduleFilterInput]

  // Get the full dependency tree using NPM, excluding dev dependencies
  // and peer dependencies.
  const dependencyTree = JSON.parse(execSync('npm ls --prod --json').toString())

  // Only get dependencies for specific top-level modules, if specified.
  const dependencyTreeFiltered = moduleFilter.length
    ? {
        ...dependencyTree,
        dependencies: Object.keys(dependencyTree.dependencies)
          .filter((key) => moduleFilter.includes(key))
          .reduce((obj, key) => {
            // eslint-disable-next-line no-param-reassign
            obj[key] = dependencyTree.dependencies[key]
            return obj
          }, {}),
      }
    : dependencyTree

  const allChildDeps = []
  const getAllChildDependencies = (depTree) => {
    const nextDeps = depTree.dependencies
    if (!nextDeps || !Object.keys(nextDeps).length) {
      return []
    }
    Object.entries(nextDeps).forEach(([childDep, childDepTree]) => {
      allChildDeps.push(childDep)
      getAllChildDependencies(childDepTree)
    })
    return allChildDeps
  }
  return getAllChildDependencies(dependencyTreeFiltered)
}

module.exports = getAllDependencies
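A quick usage sketch of the helper above (the package names are taken from the comment in the webpack config earlier; the actual list depends on the installed tree):
// list-deps.js – hypothetical standalone check
const includeSubdependencies = require('datwd')

// prints something like: [ 'cookies', 'depd', 'keygrip', 'tsscmp' ]
console.log(includeSubdependencies(['cookies']))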
