Cypress - Getting error while executing 'cypress open' - javascript

I have a testing framework with Node, Cypress, Mocha, mochawesome, and mochawesome-merge (set up as below, following a GitHub repo). In my package.json I have two scripts:
`"scripts": {
"cy": "./node_modules/.bin/cypress open",
"cy_test": "node cypress.js"
},`
If I run npm run cy_test it works fine in headless mode, but if I run npm run cy I get the following error:
But if I remove cypress.js from my project, it works as expected.
cypress.js
const cypress = require('cypress')
const marge = require('mochawesome-report-generator')
const { merge } = require('mochawesome-merge')

// getTimeStamp() is presumably defined elsewhere in the project
const currRunTimestamp = getTimeStamp();

const mergedReport = {
  reportDir: 'mochawesome-report',
}

const finalReport = {
  reportDir: 'reports',
}

cypress.run({
  reporter: 'mochawesome',
  reporterOptions: {
    reportDir: 'mochawesome-report',
    overwrite: false,
    html: true,
    json: true
  }
}).then(
  () => {
    generateReport()
  },
  error => {
    generateReport()
    console.error(error)
    process.exit(1)
  }
)

function generateReport() {
  return merge(mergedReport).then(report => marge.create(report, finalReport))
}

I think this is a problem with npm on Windows messing with file name resolution: npm ends up running your cypress.js script as the binary instead of picking it up from ./node_modules/.bin.
So as a first attempt, if you can, rename cypress.js to something other than cypress. I think this may solve your problem.
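For example, with the runner script renamed to run-cypress.js (a hypothetical name), the scripts would become:

"scripts": {
  "cy": "./node_modules/.bin/cypress open",
  "cy_test": "node run-cypress.js"
}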
If not, as a workaround, remove .JS from the PATHEXT environment variable and restart the processes that run the script, including your IDE, if applicable.
Hope it works.

Related

displaying domain's NS in Vue client project

I have created an input and a button in my project. The user is supposed to enter a domain name and get its NS records by clicking the button, through a Node.js function that returns the nameservers.
getNs.js file in @/plugins/:
export function getNs(domain) {
  const dns = require('node:dns');
  const dnsPromises = dns.promises;
  const options = {
    family: 6,
    hints: dns.ADDRCONFIG | dns.V4MAPPED,
  };
  options.all = true;
  // note: resolveNs() only takes a hostname; an options object like this applies
  // to dns.lookup(). Also, return the promise so callers can use the result.
  return dnsPromises.resolveNs(domain, options).then((result) => {
    console.log('addresses: %j', result);
    return result;
  });
}
Declaring the function in the component script, importing the plugin:
import { getIp } from "~/services/displayIp";
In methods:
goToSecondStep() {
  getIp()
}
The event is bound on the button:
@click="goToSecondStep"
The error I get:
This dependency was not found: node:dns in ./services/displayIp.js
To install it, you can run: npm install --save node:dns
I get the same error even though I ran the npm install --save node:dns command.
How can I use Node.js functions or plugins in Vue.js templates?

Specify code to run before any Jest setup happens

The tl;dr is:
1) How can I have Jest use the native require function to load all modules in my tests, everywhere?
2) Where/how would I go about modifying the require function in one place (i.e. replacing it with the esm loader, https://github.com/standard-things/esm), before any tests run, so all tests will use the modified require?
I'd like to use the esm loader with my Jest test files. In order to do so, I need to patch the require function globally, before any test code runs, with something like
require = require("@std/esm")(module, { esm: "js", cjs: true });
How do I tell Jest to execute that code before anything else is touched or requested?
I tried pointing both setupTestFrameworkScriptFile and a setupFiles array entry to a file with that in it, but neither worked (though I did confirm that both ran).
Alternatively, I'm firing off these tests with an npm script
"scripts": {
"test": "jest"
}
Is there some CLI magic whereby I can just load a module and then run jest?
Edit: the testEnvironment and resolver options make me wonder whether this ever even uses the actual Node require function to load modules, or instead uses its own module loader. If so, I wonder if this is even possible.
So this one was a bit tough to get working. The solution is quite simple, but it took me a while to get there. The problem is that every kind of module Jest uses:
Setup Files
Setup Framework Files
Test Files
Module Files
is loaded in the way below:
({"Object.":function(module,exports,require,__dirname,__filename,global,jest){/*Module code inside*/
}});
If you have a look at node_modules/jest-runtime/build/index.js:495:510
const dirname = (_path || _load_path()).default.dirname(filename);
localModule.children = [];
localModule.parent = mockParentModule;
localModule.paths = this._resolver.getModulePaths(dirname);
localModule.require = this._createRequireImplementation(filename, options);

const transformedFile = this._scriptTransformer.transform(
  filename,
  {
    collectCoverage: this._coverageOptions.collectCoverage,
    collectCoverageFrom: this._coverageOptions.collectCoverageFrom,
    collectCoverageOnlyFrom: this._coverageOptions.collectCoverageOnlyFrom,
    isInternalModule,
    mapCoverage: this._coverageOptions.mapCoverage
  },
  this._cacheFS[filename]);
this._createRequireImplementation(filename, options) gives every module a custom require object, so you never get the native require function at all, anywhere. Once Jest has started, every module loaded from then on gets Jest's custom require function.
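You can check this from inside any test file (a quick sanity check, not part of the solution; it only demonstrates that the injected require is not Node's own loader):

// inside any test file run by Jest
const Module = require('module');

test('require inside Jest is not the native loader', () => {
  // Jest injects its own require per module, so it differs from Node's
  expect(require).not.toBe(Module.prototype.require);
});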
When we load a module, the requireModule method from jest-runtime gets called. Below is an excerpt from it:
moduleRegistry[modulePath] = localModule;

if ((_path || _load_path()).default.extname(modulePath) === '.json') {
  localModule.exports = this._environment.global.JSON.parse(
    (0, (_stripBom || _load_stripBom()).default)(
      (_gracefulFs || _load_gracefulFs()).default.readFileSync(modulePath, 'utf8')));
} else if ((_path || _load_path()).default.extname(modulePath) === '.node') {
  // $FlowFixMe
  localModule.exports = require(modulePath);
} else {
  this._execModule(localModule, options);
}
As you can see, if the file extension is .node the module is loaded directly; otherwise it calls _execModule. That function is the same code I posted earlier, which does the code transformation:
const isInternalModule = !!(options && options.isInternalModule);
const filename = localModule.filename;
const lastExecutingModulePath = this._currentlyExecutingModulePath;
this._currentlyExecutingModulePath = filename;
const origCurrExecutingManualMock = this._isCurrentlyExecutingManualMock;
this._isCurrentlyExecutingManualMock = filename;
const dirname = (_path || _load_path()).default.dirname(filename);
localModule.children = [];
localModule.parent = mockParentModule;
localModule.paths = this._resolver.getModulePaths(dirname);
localModule.require = this._createRequireImplementation(filename, options);
Now, when we want to modify the require function for our tests, we need _execModule to export our code directly. So the code should be similar to the loading of .node modules:
} else if ((_path || _load_path()).default.extname(modulePath) === '.mjs') {
  // $FlowFixMe
  require = require("@std/esm")(localModule);
  localModule.exports = require(modulePath);
} else {
But doing that would mean patching Jest's code, which we want to avoid. So what we do instead is avoid using the jest command directly, and create our own jestload.js to run. The code for loading Jest is simple:
#!/usr/bin/env node
/**
* Copyright (c) 2014-present, Facebook, Inc. All rights reserved.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
const cli = require('jest/bin/jest');
Now we want to modify _execModule before the CLI loads, so we add the code below:
const jestRuntime = require("jest-runtime");

const oldExecModule = jestRuntime.prototype._execModule;
jestRuntime.prototype._execModule = function (localModule, options) {
  if (localModule.id.indexOf(".mjs") > 0) {
    localModule.exports = require("@std/esm")(localModule)(localModule.id);
    return localModule;
  }
  return oldExecModule.apply(this, [localModule, options]);
};

const cli = require('jest/bin/jest');
Now time for a test
// __test__/sum.test.js
const sum = require('../sum.mjs').sum;

test('adds 1 + 2 to equal 3', () => {
  expect(sum(1, 2)).toBe(3);
});

test('adds 2 + 3 to equal 5', () => {
  expect(sum(3, 2)).toBe(5);
});
And a sum.mjs file
export function sum (x, y) { return x + y }
Now we run the test
The solution is available in the repo below:
https://github.com/tarunlalwani/jest-overriding-require-function-stackoverflow
You can clone and test the solution by running npm test.
setupFiles worked for me. Add this in package.json:
"jest": {
"setupFiles": ["./my_file.js"]
},
https://jestjs.io/docs/en/configuration.html#setupfiles-array
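For instance, my_file.js could contain the require patch from the question (a sketch; the @std/esm options are copied from the original snippet):

// my_file.js: runs once per test file, before the test framework is installed
require = require("@std/esm")(module, { esm: "js", cjs: true });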
I tried using node -r @std/esm run.js, where run.js is just a script that calls jest, but it does not work and crashes here: https://github.com/facebook/jest/blob/master/packages/jest-runtime/src/script_transformer.js#L305.
From what I understand, this line means it is not possible, because Jest compiles modules using the native vm module. The lines above (around line 290):
if (willTransform) {
  const transformedSource = this.transformSource(
    filename,
    content,
    instrument,
    !!(options && options.mapCoverage));

  wrappedCode = wrap(transformedSource.code);
  sourceMapPath = transformedSource.sourceMapPath;
} else {
is the code that is called when you specify transforms in your Jest config.
Conclusion: until ES modules are supported (and they will be, under the .mjs extension), you cannot import ES modules in Jest without specifying a transform. You could try to monkey-patch vm, but I would really advise against that option.
Specifying a Jest transform is really not that hard, and for ES modules it's really as simple as using babel-jest with the right Babel config.
Below is a package.json with minimal settings:
{
  "dependencies": {
    "babel-jest": "^21.2.0",
    "babel-plugin-transform-es2015-modules-commonjs": "^6.26.0",
    "jest": "^21.2.1"
  },
  "jest": {
    "testMatch": [
      "<rootDir>/src/**/__tests__/**/*.js?(x)",
      "<rootDir>/src/**/?(*.)(spec|test).js?(x)"
    ],
    "transform": {
      "^.+\\.(js|jsx)$": "<rootDir>/node_modules/babel-jest"
    },
    "testEnvironment": "node",
    "testURL": "http://localhost",
    "moduleFileExtensions": [
      "js",
      "json"
    ]
  },
  "babel": {
    "plugins": ["babel-plugin-transform-es2015-modules-commonjs"]
  }
}
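With that config, a test written in ES module syntax (a minimal example; sum.js is a hypothetical module under src/) is transpiled by babel-jest before it runs:

// src/__tests__/sum.test.js
import { sum } from '../sum';

test('adds 1 + 2 to equal 3', () => {
  expect(sum(1, 2)).toBe(3);
});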

Gulp task for ng-constant multiple environments

I have been trying to get this to work, but maybe I'm missing something. I am using ng-constant and setting up different environment endpoints, as mentioned in the ng-constant issue.
However, I am using gulp, and the configuration looks like this:
gulp.task('environmentsapi', function () {
  return ngConstant({
    stream: true,
    development: {
      constants: {
        "ENV": {"api": "http://1.1.1.1:8082/"}
      }
    },
    production: {
      constants: {
        "ENV": {"api": "https://productionapplink/"}
      }
    }
  })
  // Writes config.js to dist/ folder
  .pipe(gulp.dest('dist/scripts/config'));
});
I can't figure out how to call the different endpoints in the different gulp tasks, like ngconstant:development in the linked example. How can I run this within the environmentsapi task, since that task is shared by all environment builds? Please let me know how to do this.
gulp.task('build', function () {
  runSequence('clean', ['sass', 'scripts', 'bower_components', 'environmentsapi' /* How can I run ngconstant:development here? */], 'wiredep');
});
Simply create new tasks that set flags!
Here I'm using the development flag that defaults to true.
var development = true;

gulp.task('prod', function () {
  development = false;
});

gulp.task('environmentsapi', function () {
  const apiEndpoint = development ? 'http://1.1.1.1:8082/' : 'https://productionapplink/';

  return ngConstant({
    stream: true,
    constants: {
      'ENV': { api: apiEndpoint }
    }
  });
});
Now, using gulp build will build your application with the ENV.api set to 'http://1.1.1.1:8082/', your development endpoint.
And calling gulp prod build will make your output use an ENV.api set to 'https://productionapplink/'.
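With this in place, the build task from the question needs no changes; the environment is picked by which tasks you invoke on the command line (a sketch, assuming run-sequence as in the question):

// unchanged build task; the 'prod' task only flips the development flag
gulp.task('build', function () {
  runSequence('clean', ['sass', 'scripts', 'bower_components', 'environmentsapi'], 'wiredep');
});
// development build: gulp build
// production build:  gulp prod build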
As discussed in the comments section, the solution above works well when you only have two environments, but it quickly gets out of hand as the number of environments grows.
In that case, I suggest a different approach, the Pirate way, using yargs.
Here would be your new gulpfile.js:
const gulp = require('gulp');
const ngConstant = require('gulp-ng-constant'); // assumed; the question does not show its requires
const argv = require('yargs').argv;

const endpoints = {
  'dev': 'http://1.1.1.1:8082/',
  'prod-org': 'https://productionapplink.org/',
  'prod-com': 'https://productionapplink.com/',
  'prod-gov': 'https://productionapplink.gov/'
};

gulp.task('environmentsapi', function () {
  const apiEndpoint = typeof argv.env === 'undefined' ? endpoints.dev : endpoints[argv.env];

  return ngConstant({
    stream: true,
    constants: {
      ENV: { api: apiEndpoint }
    }
  }).pipe(gulp.dest('dist/scripts/config'));
});
Use it as follows:
gulp build uses the default api URL: 'http://1.1.1.1:8082/'
gulp build --env=prod-org uses 'https://productionapplink.org/'
gulp build --env=prod-com uses 'https://productionapplink.com/'
I hope this could work for you this time!

NodeJS PTY timing commands

I'm trying to use a Node process to kick off an interactive docker session and then automate some commands in it:
var spawn = require('pty.js').spawn;

var proc = spawn('docker', ['run', '-i', '-t', 'mycontainer'], {
  name: 'test',
  rows: 30,
  cols: 200,
  cwd: process.env.HOME,
  env: process.env
});

proc.on('data', function (data) {
  console.log(data);
});

proc.write('cd /tmp');
proc.write('nvm install 0.10\r');
proc.write('npm install');
This seems to work; the only issue is that it seems to just write in all the commands and fire them. I don't seem to have any control over catching the output or errors of individual commands.
I'm curious whether there's a better way to approach this?
You can pipe streams to this process; however, it is not advised to do so.
const { pipeline } = require('stream');
const { spawn } = require('node-pty');

const proc = spawn('docker', ['run', '--rm', '-ti', 'alpine', '/bin/sh'], {
  name: 'xterm-color',
  cwd: process.env.HOME,
  env: process.env,
  encoding: null,
});

pipeline(process.stdin, proc, (err) => err && console.warn(err.message));
pipeline(proc, process.stdout, (err) => err && console.warn(err.message));
The maintainer has suggested not to use the pty like a stream. It's simply a matter of changing the pipeline to something like this:
(async (stream) => {
  for await (const chunk of stream) {
    proc.write(chunk.toString());
  }
})(process.stdin).catch(console.warn);
The gist is that we should pass strings into the write function, and we should also expect strings as its output. Therefore we should not set any encoding in the options object, so that it outputs utf8 strings by default.
Regarding your initial question: proc.write('ls\r') is the correct way of doing it. Note the trailing \r to virtually press Enter. Just like in a normal terminal, when you execute a command you cannot fire a second one simultaneously; the commands simply queue up and run one after another.
Input:
const { spawn } = require('node-pty');

const proc = spawn('docker', ['run', '--rm', '-ti', '--network=host', 'node', '/bin/sh'], {
  name: 'xterm-color',
  cwd: process.env.HOME,
  env: process.env,
});

proc.write('npm init -y\r');
proc.write('npm i eslint\r');
proc.write('ls node_modules /\r');

const disposable = proc.onData((text) => process.stdout.write(text));
const exitDisposable = proc.onExit(() => {
  disposable.dispose();
  exitDisposable.dispose();
});
Output:
npm i eslint
ls node_modules /
# Wrote to /package.json:
{
  "name": "",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "directories": {
    "lib": "lib"
  },
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
npm notice created a lockfile as package-lock.json. You should commit this file.
npm WARN @1.0.0 No description
npm WARN @1.0.0 No repository field.
+ eslint@7.1.0
added 136 packages from 82 contributors and audited 136 packages in 9.461s
9 packages are looking for funding
run `npm fund` for details
found 0 vulnerabilities
# /:
bin  etc  lib64  node_modules  package.json  run  sys  var
boot  home  media  opt  proc  sbin  tmp
dev  lib  mnt  package-lock.json  root  srv  usr
node_modules:
@babel  is-extglob
@types  is-fullwidth-code-point
...
...
#
You can see that it wrote ls before npm i eslint had completed, but the command itself ran afterwards.
Also note that I used -ti instead of just -t for the docker args.
Looking through the source of the pty.js module, it is clear that your proc.write is really the standard Node net.Socket.write: https://nodejs.org/api/net.html#net_socket_write_data_encoding_callback
In short, yes, you are just spamming the commands to the socket. You need to wait for each command to finish before executing the next. To do that, you'll need to use the callback parameter of .write to determine when a command has finished, and then proceed from there. Something like this may work:
// this is a quick and dirty hack
let cmdcount = 0;

function submitcmd() {
  switch (cmdcount) {
    case 0:
      proc.write('nvm install 0.10\r', 'utf8', submitcmd);
      break;
    case 1:
      proc.write('npm install', 'utf8', submitcmd);
      break;
  }
  cmdcount += 1;
}

proc.write('cd /tmp', 'utf8', submitcmd);

Gulp: target to debug mocha tests

I have a set of gulp.js targets for running my mocha tests that work like a charm running through gulp-mocha. Question: how do I debug my mocha tests running through gulp? I would like to use something like node-inspector to set break points in my src and test files to see what's going on. I am already able to accomplish this by calling node directly:
node --debug-brk node_modules/gulp/bin/gulp.js test
But I'd prefer a gulp target that wraps this for me, e.g.:
gulp.task('test-debug', 'Run unit tests in debug mode', function (cb) {
  // todo?
});
Ideas? I want to avoid a bash script or some other separate file since I'm trying to create a reusable gulpfile with targets that are usable by someone who doesn't know gulp.
Here is my current gulpfile.js
// gulpfile.js
var gulp = require('gulp'),
    mocha = require('gulp-mocha'),
    gutil = require('gulp-util'),
    help = require('gulp-help');

help(gulp); // add help messages to targets

var exitCode = 0;

// kill process on failure
process.on('exit', function () {
  process.nextTick(function () {
    var msg = "gulp '" + gulp.seq + "' failed";
    console.log(gutil.colors.red(msg));
    process.exit(exitCode);
  });
});

function testErrorHandler(err) {
  gutil.beep();
  gutil.log(err.message);
  exitCode = 1;
}

gulp.task('test', 'Run unit tests and exit on failure', function () {
  return gulp.src('./lib/*/test/**/*.js')
    .pipe(mocha({
      reporter: 'dot'
    }))
    .on('error', function (err) {
      testErrorHandler(err);
      process.emit('exit');
    });
});

gulp.task('test-watch', 'Run unit tests', function (cb) {
  return gulp.src('./lib/*/test/**/*.js')
    .pipe(mocha({
      reporter: 'min',
      G: true
    }))
    .on('error', testErrorHandler);
});

gulp.task('watch', 'Watch files and run tests on change', function () {
  gulp.watch('./lib/**/*.js', ['test-watch']);
});
With some guidance from @BrianGlaz I came up with the following task. It ends up being rather simple. Plus, it pipes all output to the parent's stdout, so I don't have to handle stdout's data events manually:
// Run all unit tests in debug mode
gulp.task('test-debug', function () {
  var path = require('path'); // needed for path.join below
  var spawn = require('child_process').spawn;

  spawn('node', [
    '--debug-brk',
    path.join(__dirname, 'node_modules/gulp/bin/gulp.js'),
    'test'
  ], { stdio: 'inherit' });
});
You can use Node's child_process module to run command-line commands from within a Node app. In your case I would recommend child_process.spawn(). It acts as an event emitter, so you can subscribe to its data events to retrieve output from stdout. In terms of using this from within gulp, some work would probably be needed to return a stream that could be piped to another gulp task.
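A minimal sketch of that approach (hypothetical task name; it captures the child's output instead of inheriting stdio, under the same assumptions as the task above):

// capture gulp's output instead of inheriting the parent's stdio
gulp.task('test-debug-capture', function (cb) {
  var path = require('path');
  var spawn = require('child_process').spawn;

  var proc = spawn('node', [
    '--debug-brk',
    path.join(__dirname, 'node_modules/gulp/bin/gulp.js'),
    'test'
  ]);

  // subscribe to stdout/stderr; the chunks could be parsed or piped elsewhere
  proc.stdout.on('data', function (data) {
    process.stdout.write(data);
  });
  proc.stderr.on('data', function (data) {
    process.stderr.write(data);
  });

  proc.on('close', function (code) {
    cb(code === 0 ? null : new Error('gulp test exited with code ' + code));
  });
});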
