Global variables for unit tests in SailsJS app - javascript

I'm working on a Sails app, and for my unit tests I need to use some variables in ./test/bootstrap.test.js and in ./test/unit/controllers/*.test.js. I'm thinking about global variables, but how can I create them?
I could create something like ./config/mydatatest.js containing something like:
module.exports.myconf = {
  anyobject: {
    bar: "foo"
  }
};
But is there any way to create mydatatest.js in the test directory?

I like the idea of treating test as a specific environment, like development or production. You could create an environment-specific file config/env/test.js to hold the configuration:
/**
 * Test environment settings
 */
module.exports = {
  myconf: {
    anyobject: {
      bar: "foo"
    }
  }
};
Then, you could add NODE_ENV=test to the command that launches the tests (based on the example from the documentation):
"scripts": {
"start": "node app.js",
"debug": "node debug app.js",
"test": "NODE_ENV=test mocha test/bootstrap.test.js test/unit/**/*.test.js"
},
I use this technique to switch to the sails-memory adapter when I run my tests.
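For reference, a config/env/test.js along these lines points every model at the in-memory adapter while testing. This is only a minimal sketch for a Sails 0.12-style app, and it assumes sails-memory is installed as a dev dependency:
/**
 * Test environment settings (sketch)
 * Loaded only when NODE_ENV=test; routes all models to sails-memory.
 */
module.exports = {
  connections: {
    memory: {
      adapter: 'sails-memory'
    }
  },
  models: {
    connection: 'memory',
    migrate: 'drop' // rebuild the schema on every lift
  }
};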

How about inserting it in your bootstrap.test.js?
before(function (done) {
  Sails.lift({
    port: 5031,
    log: {
      level: 'error'
    },
    myconf: {
      anyobject: {
        bar: "foo"
      }
    }
  }, function (err, server) {
    sails = server;
    done(err);
  });
});
Basically, it works. If you want to keep the values in a separate file or folder, just require them in your bootstrap.test.js inside myconf, as in the example above.
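Once the app is lifted this way, the values are reachable through sails.config in any test. A quick sketch, assuming bootstrap.test.js assigns the lifted app to a global sails variable as above (the file name here is hypothetical):
// test/unit/controllers/MyController.test.js (hypothetical)
var assert = require('assert');

describe('myconf', function () {
  it('is available on the lifted app', function () {
    assert.equal(sails.config.myconf.anyobject.bar, 'foo');
  });
});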

Related

Wallaby Does Not Pass Dotenv Variables to Runner

I am trying to use Wallaby in conjunction with the dotenv-flow package. I currently have my wallaby.js config file set up like below:
require("dotenv-flow").config()
module.exports = function (wallaby) {
return {
files: [
'api/*',
'controllers/*',
'config/*',
'firebase/*',
'helpers/*',
'models/*',
'services/*',
'smtp/*',
'sockets/*'
],
tests: [
"test/**/*.test.mjs"
],
testFramework: "mocha",
env: {
type: "node",
params: {
env: "NODE_ENV=test"
}
}
};
};
I've tried a few other ways of writing the file, including in ESM module format. However, my tests run and my Sequelize code complains that it wasn't passed the environment variables to use for connecting to the development DB.
You are loading your .env file but then never using its contents. Another problem is that Wallaby doesn't understand the dotenv output, so you have to massage it a little bit:
const environment = Object.entries(require("dotenv-flow").config().parsed)
  .map(([key, value]) => `${key}=${value}`)
  .join(';');
Then change your env section to something like this:
env: {
  runner: 'node',
  params: {
    env: environment
  }
}
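Put together, the relevant parts of wallaby.js would look roughly like this. This is a sketch: the files list is shortened, and it assumes your .env files sit next to wallaby.js:
// Flatten the parsed dotenv-flow output into the semicolon-separated
// KEY=value string that Wallaby's node runner expects.
const environment = Object.entries(require("dotenv-flow").config().parsed)
  .map(([key, value]) => `${key}=${value}`)
  .join(';');

module.exports = function (wallaby) {
  return {
    files: ['api/*', 'models/*', 'config/*'],
    tests: ['test/**/*.test.mjs'],
    testFramework: 'mocha',
    env: {
      runner: 'node',
      params: {
        env: environment
      }
    }
  };
};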

Cypress - Getting error while executing 'cypress open'

I have a testing framework with Node, Cypress, Mocha, mochawesome, and mochawesome-merge, built from this GitHub repo, and in my package.json I have two scripts:
"scripts": {
  "cy": "./node_modules/.bin/cypress open",
  "cy_test": "node cypress.js"
},
If I run npm run cy_test it works fine in headless mode, but if I run npm run cy I get the following error:
But if I remove cypress.js from my project, then it works as expected.
cypress.js
const cypress = require('cypress')
const marge = require('mochawesome-report-generator')
const { merge } = require('mochawesome-merge')

const currRunTimestamp = getTimeStamp(); // helper defined elsewhere in the project

const mergedReport = {
  reportDir: 'mochawesome-report',
}

const finalReport = {
  reportDir: 'reports',
}

cypress.run({
  reporter: 'mochawesome',
  reporterOptions: {
    reportDir: 'mochawesome-report',
    overwrite: false,
    html: true,
    json: true
  }
}).then(
  () => {
    generateReport()
  },
  error => {
    generateReport()
    console.error(error)
    process.exit(1)
  }
)

function generateReport(options) {
  return merge(mergedReport).then(report => marge.create(report, finalReport))
}
I think this is a problem with npm on Windows that is messing with file names: npm is trying to run the script as a binary instead of getting it from ./node_modules/.bin.
So, as a first try, I suggest changing the name of cypress.js to something other than cypress. I think this may solve your problem.
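For example, the scripts could end up like this (a sketch; cypress-runner.js is a hypothetical name for the renamed file):
"scripts": {
  "cy": "./node_modules/.bin/cypress open",
  "cy_test": "node cypress-runner.js"
},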
If not, as a workaround, remove .JS from the PATHEXT environment variable and restart the processes that are running the script, including your IDE if applicable.
Hope it works.

How to pass arguments to webpack.conf.js?

I'm following these instructions based on this project (the official Vue Webpack template).
This is what I did:
package.json:
"scripts": {
"dev": "node build/dev-server.js",
"dev-alt": "node build/dev-server.js && set arg=alt&&webpack"
},
webpack.base.config.js:
// npm run dev-alt in the terminal
console.log('ARGS:', process.env.arg)
However, ARGS: logs undefined.
What's the correct way to do this?
With Webpack 5.x and above you can no longer pass custom arguments to Webpack like --myarg=val. But you can still pass the supported arguments like --mode=production.
So what's the solution for custom args? Instead, we need to write it like this, using the new --env parameter:
"build-project": webpack --mode=production --env myarg=val --env otherarg=val
Note that the custom arguments no longer start with -- after we put --env ahead of them. You'll need to put --env ahead of each custom key/value pair you need to define.
You'll also need to modify your Webpack config to export a function, rather than an object.
See this example code, taken from the docs.
const path = require('path');

module.exports = (env) => {
  // Use env.<YOUR VARIABLE> here:
  console.log('NODE_ENV: ', env.NODE_ENV); // 'local'
  console.log('Production: ', env.production); // true
  return {
    entry: './src/index.js',
    output: {
      filename: 'bundle.js',
      path: path.resolve(__dirname, 'dist'),
    },
  };
};
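For the config above, a matching npm script might look like this (a sketch; the values are illustrative). A bare --env production arrives as env.production === true, while --env NODE_ENV=local arrives as the string 'local':
"scripts": {
  "build": "webpack --env NODE_ENV=local --env production"
}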
Pass webpack arguments with --key=value in package.json
"scripts": {
"build": "webpack --mode=production --browser=firefox",
}
Access argv in webpack.config.js like this:
module.exports = (env, argv) => {
  if (argv.mode == "development") {
    // ...
  }
  if (argv.browser == "firefox") {
    // ...
  }
};
You can pass whatever arguments you want by:
node my-script.js --myArgs thisIsTheValue
In my-script.js you can retrieve arguments by:
function getArg(key, defaultVal) {
  var index = process.argv.indexOf(key),
      next = process.argv[index + 1];
  defaultVal = defaultVal || null;
  return (index < 0) ? defaultVal : (!next || next[0] === "-") ? true : next;
}

var theArgsIWant = getArg('--myArgs', 'this is the default if argument not found');
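To illustrate the three cases the function handles (a sketch of expected results):
// node my-script.js --myArgs thisIsTheValue  -> "thisIsTheValue"
// node my-script.js --myArgs --someFlag      -> true (key present, next token is a flag)
// node my-script.js                          -> the supplied default value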
From the article you described:
"scripts": {
"webpack-quizMaker": "set arg=quizMaker&&webpack",
"webpack-quiz": "set arg=quiz&&webpack"
}
These scripts are doing two things:
They are setting an environment variable in a way that only works on Windows, and only if you're not using PowerShell. It's recommended to use cross-env here.
They are running webpack after setting the environment variable.
Then, inside the webpack configuration, they are reading the environment variable:
if (process.env.arg == "quiz") {
// do your thing
}
if (process.env.arg == "quizMaker") {
// do your thing
};
I recommend that you install cross-env
npm install --save-dev cross-env
And replace the scripts with this:
"scripts": {
"webpack-quizMaker": "cross-env arg=\"quizMaker\" webpack",
"webpack-quiz": "set arg=\"quiz\" webpack"
}
No need for && anymore because cross-env will call the specified command (webpack) after setting the env variable.
You can try this:
const onlyJS = process.argv.some(argument => argument === 'your argument');
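For instance, in webpack.config.js you could branch on such a flag. This is only a sketch; the --only-js flag and the entry file paths are hypothetical:
// Detect a bare flag anywhere on the command line, e.g.
//   npm run build -- --only-js
const onlyJS = process.argv.some(argument => argument === '--only-js');

module.exports = {
  // Pick an entry point based on the flag (hypothetical paths).
  entry: onlyJS ? './src/index.js' : './src/index.full.js'
};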

Gulp task for ng-constant multiple environments

I have been trying to get this to work, but maybe I'm missing something. I am using ng-constant and setting up different environment endpoints as mentioned in the ng-constants issue.
However, I am using gulp, and the configuration looks like this:
gulp.task('environmentsapi', function () {
  return ngConstant({
    stream: true,
    development: {
      constants: {
        "ENV": {"api": "http://1.1.1.1:8082/"}
      }
    },
    production: {
      constants: {
        "ENV": {"api": "https://productionapplink/"}
      }
    }
  })
  // Writes config.js to dist/ folder
  .pipe(gulp.dest('dist/scripts/config'));
});
I can't figure out how to select the different endpoints in the different gulp tasks, like ngconstant:development in the linked example. How can I choose an environment within the environmentsapi task, since this task is shared by all environment builds? Please let me know how to do this.
gulp.task('build', function () {
  runSequence('clean', ['sass', 'scripts', 'bower_components', 'environmentsapi' /* How can I run ngconstant:development here? */], 'wiredep');
});
Simply create new tasks that set flags!
Here I'm using the development flag that defaults to true.
var development = true;

gulp.task('prod', function () {
  development = false;
});

gulp.task('environmentsapi', function () {
  const apiEndpoint = development ? 'http://1.1.1.1:8082/' : 'https://productionapplink/';
  return ngConstant({
    stream: true,
    constants: {
      'ENV': {api: apiEndpoint}
    }
  });
});
Now, using gulp build will build your application with the ENV.api set to 'http://1.1.1.1:8082/', your development endpoint.
And calling gulp prod build will make your output use an ENV.api set to 'https://productionapplink/'.
As discussed in the comments section, the solution above works well when you only have two environments, but it quickly gets out of hand as the number of environments grows.
In that case, I suggest using a different approach, the Pirate way, using yargs.
Here would be your new gulpfile.js:
// Assumed requires (the original snippet omitted them); gulp-ng-constant
// is the package typically providing ngConstant in gulp builds.
const gulp = require('gulp');
const ngConstant = require('gulp-ng-constant');
const argv = require('yargs').argv;

const endpoints = {
  'dev': 'http://1.1.1.1:8082/',
  'prod-org': 'https://productionapplink.org/',
  'prod-com': 'https://productionapplink.com/',
  'prod-gov': 'https://productionapplink.gov/'
};

gulp.task('environmentsapi', function () {
  const apiEndpoint = typeof argv.env === 'undefined' ? endpoints.dev : endpoints[argv.env];
  return ngConstant({
    stream: true,
    constants: {
      ENV: { api: apiEndpoint }
    }
  }).pipe(gulp.dest('dist/scripts/config'));
});
Use it as follows:
gulp build uses the default api URL: 'http://1.1.1.1:8082/'
gulp build --env=prod-org uses 'https://productionapplink.org/'
gulp build --env=prod-com uses 'https://productionapplink.com/'
I hope this could work for you this time!

NodeJS PTY timing commands

I'm trying to use a Node process to kick off an interactive Docker session and then automate some commands against it:
var spawn = require('pty.js').spawn;

var proc = spawn('docker', [ 'run', '-i', '-t', 'mycontainer' ], {
  name: 'test',
  rows: 30,
  cols: 200,
  cwd: process.env.HOME,
  env: process.env
});

proc.on('data', function (data) {
  console.log(data);
});

proc.write('cd /tmp');
proc.write('nvm install 0.10\r');
proc.write('npm install');
This seems to work; the only issue is that it appears to just write in all the commands and fire them at once. I don't seem to have any control over catching the output or errors of individual commands.
I'm curious if there's a better way to approach this?
You can pipe streams to this process; however, it is not advised to do so.
const { pipeline } = require('stream');
const { spawn } = require('node-pty');

const proc = spawn('docker', ['run', '--rm', '-ti', 'alpine', '/bin/sh'], {
  name: 'xterm-color',
  cwd: process.env.HOME,
  env: process.env,
  encoding: null,
});

pipeline(process.stdin, proc, (err) => err && console.warn(err.message));
pipeline(proc, process.stdout, (err) => err && console.warn(err.message));
The maintainer has suggested not using the pty like a stream. It's simply a matter of changing the pipeline to something like this:
(async (stream) => {
  for await (const chunk of stream) {
    proc.write(chunk.toString());
  }
})(process.stdin).catch(console.warn);
The gist is that we should pass strings into the write function and expect strings as its output. Therefore, we should not set any encoding in the options object, so that it outputs utf8 strings by default.
Regarding your initial question: proc.write('ls\r') is the correct way of doing it. Note the trailing \r to virtually press Enter. Just like in a normal terminal, when you execute a command you cannot fire a second one simultaneously; the commands will just queue up and run one after another.
Input:
const { spawn } = require('node-pty');

const proc = spawn('docker', ['run', '--rm', '-ti', '--network=host', 'node', '/bin/sh'], {
  name: 'xterm-color',
  cwd: process.env.HOME,
  env: process.env,
});

proc.write('npm init -y\r');
proc.write('npm i eslint\r');
proc.write('ls node_modules /\r');

const disposable = proc.onData((text) => process.stdout.write(text));

const exitDisposable = proc.onExit(() => {
  disposable.dispose();
  exitDisposable.dispose();
});
Output:
npm i eslint
ls node_modules /
# Wrote to /package.json:
{
  "name": "",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "directories": {
    "lib": "lib"
  },
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
npm notice created a lockfile as package-lock.json. You should commit this file.
npm WARN @1.0.0 No description
npm WARN @1.0.0 No repository field.
+ eslint@7.1.0
added 136 packages from 82 contributors and audited 136 packages in 9.461s
9 packages are looking for funding
  run `npm fund` for details
found 0 vulnerabilities
# /:
bin   etc   lib64  node_modules       package.json  run   sys  var
boot  home  media  opt                proc          sbin  tmp
dev   lib   mnt    package-lock.json  root          srv   usr
node_modules:
@babel    is-extglob
@types    is-fullwidth-code-point
...
...
#
You can see it wrote ls before npm install had completed, but the command still ran afterwards.
Also note that I used -ti instead of just -t for the docker args.
Looking through the source for the pty.js module, it is clear that your proc.write is really the standard Node net.Socket.write -- https://nodejs.org/api/net.html#net_socket_write_data_encoding_callback
In short, yes, you are just spamming the commands to the socket. You need to wait for each command to finish before executing the next. Thus, you'll need to use the callback parameter of .write to determine when a command has finished and then proceed from there. Something like this may work:
// this is a quick and dirty hack
let cmdcount = 0;

function submitcmd() {
  switch (cmdcount) {
    case 0:
      proc.write('nvm install 0.10\r', 'utf8', submitcmd);
      break;
    case 1:
      proc.write('npm install\r', 'utf8', submitcmd);
      break;
  }
  cmdcount += 1;
}

proc.write('cd /tmp\r', 'utf8', submitcmd);
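Alternatively, the same idea reads a bit cleaner with a small promise wrapper, so the commands chain with async/await instead of a manual counter. A sketch relying on the same write callback described above:
// Wrap write() in a promise; the callback fires once the data is flushed.
function writeCmd(cmd) {
  return new Promise(function (resolve) {
    proc.write(cmd, 'utf8', resolve);
  });
}

(async function () {
  await writeCmd('cd /tmp\r');
  await writeCmd('nvm install 0.10\r');
  await writeCmd('npm install\r');
})();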
