How can I write a test that npm scripts run successfully?
I have npm scripts in package.json as follows.
"scripts": {
"start": "node server.js",
"dev": "webpack serve --mode development",
"lint": "eslint src",
...
I'd like to test that these npm scripts launch successfully.
For example, how do I test that the command npm run dev (webpack serve) launches successfully?
You test them the same way you'd test any other OS process, independently of JavaScript: treat the fact that webpack/eslint happen to be JavaScript programs as an implementation detail.
That said, these tests are usually only worth writing in really, really big projects, and even then you only need a few of them.
A test can look something like:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

describe('your run scripts', () => {
  it('starts a dev server', async () => {
    // create a way to stop the server
    const ac = new AbortController();
    // start the dev server
    const devServerPromise = exec('npm run dev', { signal: ac.signal });
    // validate the server launched (retry comes from any promise-retry package; fetch can be undici or axios just the same)
    // localhost:4200 is an example of where the dev server is running
    await retry(() => fetch('http://localhost:4200/'));
    ac.abort(); // close the dev server
  });
});
Note that, as I said at the start, there is very little return on investment in writing these sorts of tests unless your project has thousands of developers: problems with things like npm run dev are noticed very quickly by developers anyway, and the cost of having another call site that needs to be aware of certain implementation details is high.
Related
I've got a bit of a weird set up here with a codebase that I inherited. It's a CRA app deployed to Vercel but doesn't use Next.js.
Problem: I'm not able to call myapp.com/nonce from Postman or access it in my browser to see the JSON response. There's some configuration stuff that's not quite right, just trying to figure out what that is. Would love any help here!
I have a file structure that looks like this:
Project (create-react-app)
src
bunch of React code
server
index.js
package.json
package.json
(It's not using Next.js. If it were, I'd just use the /pages/api/* files.)
In my top-level package.json, I have:
"scripts": {
"start": "react-scripts start",
"build": "react-scripts build && cd server && npm install && npm run start",
}
and in my /server/package.json I have just a simple start script: node index.js
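For reference, that /server/package.json presumably looks something like this (name and version taken from the build log below; the rest is a guess):

```json
{
  "name": "MyProject-server",
  "version": "0.1.0",
  "scripts": {
    "start": "node index.js"
  }
}
```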
My server/index.js file:
const path = require('path');
const express = require("express");
const PORT = process.env.PORT || 3001;
const app = express();
app.use(express.static(path.resolve(__dirname, '../MyProject/build')));
app.get('/nonce', async (req, res) => {
  const { address } = req.query;
  const quantity = snapshot[address]; // snapshot is defined elsewhere in the file (omitted here)
  return res.json({ quantity, address }); // for testing
});
app.listen(PORT, () => console.log(`Server listening on ${PORT}`));
Edit:
With this as-is, my build phase never actually completes. It runs a server at 3001, and that's the last line of the build log:
> MyProject-server#0.1.0 start /vercel/path0/server
> node index.js
Server listening on 3001
I'm working on an app that has frontend communicating with backend via socket.io
Right now I have them in two separate apps like this:
.
├──client
│ ├──src
│ │ └──index.js
│ ├──public
│ │ └──index.html
│ ├──webpack.config.js
│ └──package.json
└──server
├──app.js
└──package.json
./server/app.js goes like this
const Express = require('express')();
const server = require('http').Server(Express);
const options = {
cors: {
origin: 'http://localhost:8080',
methods: ['GET','POST'],
credentials: true,
},
};
const Socketio = require('socket.io')(server, options);
Socketio.on('connection', socket => {
socket.on('do-something', () => {
console.log('doing something...');
});
});
server.listen(3000, () => {
console.log('listening on port 3000');
});
./client/src/index.js has these lines
const io = require('socket.io-client');
const socket = io('http://localhost:3000');
const someFunction = () => {
socket.emit('do-something');
}
Pretty standard stuff.
Right now I run them separately from two terminal windows. So basically I have two servers serving a single app. It doesn't seem like the right way to go so I want to try and unite them into a single webpack app to aim for deployment. How do I do that? How do you setup this kind of thing in a real world?
I tried googling the subject but couldn't find anything. If you know of a post that clearly explains this, please share the link. Many thanks & have a nice day :)
I recommend:
Placing webpack.config.js and package.json in the root of your project.
In ./server, splitting app.js (which defines and exports the app) from server.js (which listens on a port).
Using npm-run-all to simplify your package.json scripts (it provides the run-p and run-s commands).
Adding a production ./server.js in the root of your project.
./package.json
"scripts": {
"dev": "run-p dev:*",
"dev:server": "nodemon ./server/server.js",
"dev:webpack": "webpack serve",
"build:webpack": "webpack",
"serve": "node ./server.js"
}
./server/app.js
const express = require('express');
const app = express();
// socket.io needs the raw http server, so create and export both
const server = require('http').Server(app);
// Your app definition (socket.io, etc.) ...
module.exports = { app, server };
./server/server.js
// Your dev API server
const { server } = require('./app');
server.listen(3000, () => {
  console.log('Dev server listening on port 3000');
});
./server.js
// Your prod server
const express = require('express');
const { app, server } = require('./server/app');
// Same as the dev API server, but it also serves your frontend build
// ./dist depends on webpack's output.path setting
app.use(express.static('./dist'));
server.listen(3000, () => {
  console.log('Prod server listening on port 3000');
});
Now, you can:
Run frontend and backend in parallel
npm run dev
Run only frontend
npm run dev:webpack
Run only backend
npm run dev:server
Build frontend
npm run build:webpack
Serve backend and frontend in production (I suggest using pm2)
npm run serve
This architecture works fine in my projects 😊.
You can easily extend it to use TypeScript, for example.
Sorry for my approximate English.
I did not test the code presented, sorry if there are any errors.
Regards
Ok, I give up. I'm answering.
Don't do this. Right now, merging the two repos looks tempting because you're using the same language and maybe some of the same dependencies for the client and the server.
But what happens when
Your boss says you have to port the server part to Python/Java/Go/whatever because reasons?
You want to hand off the client or the server to another team?
You want to open source the client but now the git history has all of your (still proprietary) server code tangled up in it?
You add a CI/CD workflow and now a commit to the client kicks off your pipeline for the server and vice-versa?
N.B. this isn't hypothetical, I've actually seen most of those happen. Now, you may have good answers to all those questions and decide to do it anyway, but if you don't then leave it be: bundling a client and server in the same repo loses you lots of flexibility for very minimal gains.
I am working on a Cypress project. I have set up a pipeline in GitLab.
My application only works over a private network connected via OpenVPN.
Can someone guide me on how to add that to the .gitlab-ci.yml file?
My .gitlab-ci.yml is :
image: cypress/base:10
stages:
- test
test:
stage: test
script:
- npm install
- npm run test
and my package.json is as follows:
{
"name": "cypresspackage",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"clean:reports": "rm -R -f cypress/reports && mkdir cypress/reports && mkdir cypress/reports/mochareports",
"pretest": "npm run clean:reports",
"scripts": "cypress run --spec cypress/integration/dummy.feature",
"combine-reports": "mochawesome-merge ./cypress/reports/mocha/*.json > cypress/reports/mochareports/report.json",
"generate-report": "marge cypress/reports/mochareports/report.json -f report -o cypress/reports/mochareports",
"report:copyScreenshots": "cp -r cypress/screenshots cypress/reports/mochareports/assets",
"posttest": "npm run report:copyScreenshots && npm run combine-reports && npm run generate-report",
"test": "npm run scripts || npm run posttest"
},
"author": "",
"license": "ISC",
"devDependencies": {
"cypress": "^6.3.0",
"cypress-audit": "^0.3.0",
"cypress-cucumber-preprocessor": "^4.0.1",
"cypress-multi-reporters": "^1.4.0",
"cypress-xpath": "^1.6.2",
"mocha": "^8.2.1",
"mochawesome": "^6.2.1",
"mochawesome-merge": "^4.2.0",
"mochawesome-report-generator": "^5.1.0"
},
"dependencies": {
"lambdatest-cypress-cli": "^1.0.1"
},
"cypress-cucumber-preprocessor": {
"nonGlobalStepDefinitions": true
}
}
I guess GitLab provides the runner at run time.
I guess you are using GitLab's SaaS. This means your VPN would be opened in a non-private environment: some GitLab admins would have access to your VPN connection and, depending on how GitLab is configured on their side, other GitLab users might have access to your private network. I'd avoid that. If you insist on it, you'd better use your project's secret CI/CD variables to store your OpenVPN client authentication, so it remains private.
Is there an option where I can choose the runner?
Sure. You can register a runner running on your own servers (or even at home on-demand). It depends on where and how this runner is being used (Docker? Kubernetes? Debian? etc). Take a look into Registering a GitLab Runner. You'll need to generate a token from your project's configuration and then install the runner using that token.
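Once registered, the runner ends up described in its config.toml; as a sketch, assuming the Docker executor (the name and token are placeholders, and the image matches the one in your .gitlab-ci.yml):

```toml
concurrent = 1

[[runners]]
  name = "my-private-runner"     # placeholder
  url = "https://gitlab.com/"
  token = "RUNNER_TOKEN"         # placeholder, produced during registration
  executor = "docker"
  [runners.docker]
    image = "cypress/base:10"    # default image for jobs on this runner
```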
GitLab CI
Once you have your own runner installed and configured (ensuring it runs when needed), you'll need to configure your VPN start/stop in the pipeline. Here, I copy a piece of code found on GitLab's forum:
before_script:
##
## VPN
## Inspiration from: https://torguard.net/knowledgebase.php?action=displayarticle&id=138
## And http://forum.gitlab.com/t/connect-vpn-during-ci-cd/7585
## Content from Variables to files: https://stackoverflow.com/a/49418265/4396362
## Waiting for openvpn to connect would be better than sleeping; the closest would be https://askubuntu.com/questions/28733/how-do-i-run-a-script-after-openvpn-has-connected-successfully
## Maybe this would work https://unix.stackexchange.com/questions/403202/create-bash-script-to-wait-and-then-run
##
- which openvpn || (apt-get update -y -qq && apt-get install -y -qq openvpn) # Install openvpn if not available.
- cat <<< $CLIENT_OVPN > /etc/openvpn/client.conf # Move vpn config from gitlab variable to config file.
- cat <<< $VPN_U > /etc/openvpn/pass.txt # Move vpn user from gitlab variable to pass file.
- cat <<< $VPN_P >> /etc/openvpn/pass.txt # Move vpn password from gitlab variable to pass file.
- cat <<< "auth-user-pass /etc/openvpn/pass.txt" >> /etc/openvpn/client.conf # Tell vpn config to use password file.
- cat <<< "log /etc/openvpn/client.log" >> /etc/openvpn/client.conf # Tell vpn config to use log file.
- openvpn --config /etc/openvpn/client.conf --daemon # Start openvpn with config as a daemon.
- sleep 30s # Wait for some time so the vpn can connect before doing anything else.
- cat /etc/openvpn/client.log # Print the vpn log.
- ping -c 1 <IP> # Ping the server I want to deploy to. If not available this stops the deployment process.
After this, you can add an after_script section to stop the OpenVPN daemon, or use a dedicated cleanup job with when: always, to ensure the VPN connection is closed even if the build failed.
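A sketch of that teardown, assuming the daemon was started as above (pkill comes from procps and, like openvpn, may need installing in the image):

```yaml
after_script:
  - pkill openvpn || true  # stop the daemon; "|| true" keeps teardown from failing the job
```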
You can also try other solutions, depending on your environment.
I don't want to see that message whenever I save a file in my project. I already have
{
"events": {
"start": "node -e 'console.clear()'"
}
}
in my nodemon.json to indicate my project has restarted.
You can tell nodemon to be quiet, by passing the -q argument. According to nodemon --help options, this will:
minimise nodemon messages to start/stop only
Usage: nodemon -q
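If you'd rather keep this in configuration than on the command line, nodemon's config file mirrors its CLI flags, so (to my understanding) the same option can go in nodemon.json alongside your existing events block:

```json
{
  "quiet": true,
  "events": {
    "start": "node -e 'console.clear()'"
  }
}
```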
I have installed nodemon locally in my workspace, but even though it restarts in the terminal after changes are made, it does not refresh the browser page. I have to manually refresh it each time.
I've got Express, Node, React and Webpack running in the environment.
This is what my setup looks like -
My package.json starts up server.js -
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"start": "nodemon server.js"
},
and server.js is -
var express = require('express');
var app = express();
app.use(express.static('public'));
app.listen(3000, function () {
console.log("Express server is up on port 3000");
});
The entry point in the webpack config file is -
module.exports = {
  entry: './public/scripts/app.jsx',
  output: {
    path: __dirname,
    filename: './public/scripts/bundle.js'
  }
};
What should I do to fix it?
Update -
I made a video to describe the situation, if it helps.
nodemon only restarts the server when your server code changes; it has no functionality to reload your page in the browser. If you want automatic browser reload, you could, for example, run a webpack dev server in addition to nodemon. The webpack dev server is able to reload the page in the browser when your client code changes, and with its hot module replacement feature it can even update the page without a full reload.
in package.json
"scripts": {
"start": "nodemon server.js -e html,js,css"
},
in server js
var express = require('express')
var reload = require('reload')
var app = express()
app.listen(3000, () => {
  console.log(`Listening on port 3000`);
})
reload(app);
in index.html
<body>
<h1>ron</h1>
<script src="/reload/reload.js"></script> <!-- it's necessary -->
</body>
For those needing to use nodemon and reload browser on change as well, I replied here https://stackoverflow.com/a/51089425/7779953
Solution is Nodemon + Browser Sync + Gulp + Express server