Unit Testing Express app in Jenkins - javascript

OK, I've been searching for a couple of hours for a solution to run my Mocha unit tests inside Jenkins for my Express JS app.
Writing the tests is quite easy, but connecting the tests to my app is a bit harder. An example of my current test:
/tests/backend/route.test.js
var should = require('should'),
    assert = require('assert'),
    request = require('supertest'),
    express = require('express');

describe('Routing', function() {
  var app = 'http://someurl.com';
  describe('Account', function() {
    it('should return error trying to save duplicate username', function(done) {
      var profile = {
        username: 'vgheri',
        password: 'test',
        firstName: 'Valerio',
        lastName: 'Gheri'
      };
      // Once we have specified the info we want to send to the server via the POST verb,
      // we need to actually perform the action on the resource; in this case we want to
      // POST to /api/profiles and send some info.
      // We do this using the request object from supertest.
      request(app)
        .post('/api/profiles')
        .send(profile)
        // end handles the response
        .end(function(err, res) {
          if (err) {
            throw err;
          }
          // this is should.js syntax, very clear
          res.should.have.status(400);
          done();
        });
    });
  });
});
In the above example I connect to a running app (see `var app = 'http://someurl.com'`). Obviously this does not work inside Jenkins, unless I can tell Jenkins to first run the app and then hit a localhost URL. But how do I do that?
Now, if I look at https://www.npmjs.org/package/supertest, this should be enough to test my Express app:
var request = require('supertest')
, express = require('express');
var app = express();
But it's not. I receive 404 errors on all the URLs I would like to test.
Does anyone know how to test my app inside Jenkins?

Is the app you referenced at http://someurl.com the app you're trying to write this test for? If so...
Supertest should allow you to run tests without needing an external server running. The way to accomplish this is to separate the Express routing code from the code that actually starts the server, and point Supertest at the routing code.
Here is an example:
Given this directory structure:
./package.json
./server/index.js
./app.js
./test/server-test.js
Here are my files:
./package.json
Not all of these dependencies are required for this example; I just pulled this from the package.json I'm using in my project.
{
  "name": "StackOverflowExample",
  "version": "1.0.0",
  "description": "Example for stackoverflow",
  "main": "app.js",
  "scripts": {
    "test": "mocha"
  },
  "author": "Mark",
  "license": "ISC",
  "dependencies": {
    "body-parser": "^1.15.2",
    "ejs": "^2.5.2",
    "express": "^4.14.0",
    "express-validator": "^2.21.0"
  },
  "devDependencies": {
    "chai": "^3.5.0",
    "mocha": "^3.1.2",
    "supertest": "^2.0.1"
  }
}
./server/index.js
var express = require('express');
var app = express();

app.get('/', function(req, res) {
  res.send('Hello World');
});

module.exports = app;
./app.js
var app = require('./server');

var server = app.listen(8000, function() {
  console.log('Listening on port 8000');
});
./test/server-test.js
var request = require('supertest');
var app = require('../server');
var assert = require('assert'); // mocha
var chai = require('chai');
var expect = null;

chai.should();
expect = chai.expect;

describe('GET /', function() {
  it('should return a 200', function(done) {
    request(app).get('/').expect(200, done);
  });
});
I'm using Mocha and Chai here (I didn't have time to refactor and make the example simpler, sorry). The main point is that within the it statement you can run your tests.
Doing it this way does not require a separate server to be running. Supertest magically starts one up for you, based on the Express app you exported from the server module. I think this approach should let you run your tests seamlessly through Jenkins.
In ./server/index.js, an Express object is created. That object is exported, and imported via require in both ./app.js and ./test/server-test.js. In ./app.js it's used to start a server via listen(). In the test file, Supertest uses it to start a server for testing internally.
When you run the server normally in this example, you run it via ./app.js. That sets the port you chose (8000 in this example) and starts listening for connections on that port.
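Since the question is specifically about Jenkins: with "test": "mocha" in the scripts section above, a Jenkins job only needs a build step that runs npm install followed by npm test; no separately started server is required. (Depending on your Mocha version you may need extra flags such as --exit if the process hangs after the tests finish, so treat the exact invocation as something to verify against your setup.)
The it block can also hold richer assertions than a bare status check. A minimal sketch, assuming the same ./server/index.js exported above (the 'Hello World' body comes from that route):
var request = require('supertest');
var app = require('../server');

describe('GET /', function() {
  it('should return the Hello World body', function(done) {
    request(app)
      .get('/')
      .expect(200)
      // supertest also accepts a function as a custom assertion on the response
      .expect(function(res) {
        if (res.text !== 'Hello World') {
          throw new Error('unexpected body: ' + res.text);
        }
      })
      .end(done);
  });
});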
Hope this helps! Good luck!

Related

How to implement Express + Node JS + Browserify properly?

So I'm somewhat new to the whole web development thing with node.js and I'm wondering if someone could help me out with understanding how to implement my application correctly.
So the app is a simple landing page with an email form that takes an email and sends it to the API. I designed this functionality without issue, except that when I launched my website I got a 'require is not defined' error.
I understand that this is because Node.js is a server-side technology, so when the application goes live the client doesn't understand what require means.
Through further research, I figured out that I had two options: either implement synchronous dependencies via something like Browserify, or take things asynchronously and use something like RequireJS.
Right now I've decided on using Browserify (unless someone can convince me otherwise); I just need help figuring out how to implement it for my specific app.
app.js
// The dependencies my Node.js app needs (also where the require bug occurs)
//------------------------------------------------------------------------------------------------------------------------
const express = require('express');        // Require the express package that we installed with npm install express
const request = require('request');        // Require the request package that we installed with npm install request
const bodyParser = require('body-parser'); // Require the body-parser package that we installed with npm install body-parser
const path = require('path');              // Require Node's built-in path module
//------------------------------------------------------------------------------------------------------------------------
const app = express(); // Creates the application with express

// Middleware
app.use(express.json());                                 // Parses incoming JSON request bodies
app.use(bodyParser.urlencoded({ extended: false }));     // Parses URL-encoded form bodies (extended: false uses the querystring library)
app.use(express.static(path.join(__dirname, '/site')));  // Serves static files from /site (D:\Gaming Galaxy\Website\Website\main\site on my machine)
console.log("The directory used is", express.static(path.join(__dirname, '/site')));

app.post('/subscribe', (req, res) => { // POST handler; req is the request and res is the response
  const { email, js } = req.body;      // Destructure email and js from the request body
  const mcData = {                     // Object containing the email from the request and a status, in the shape Mailchimp expects
    members: [
      {
        email_address: email,
        status: 'pending'
      }
    ]
  }
  const mcDataPost = JSON.stringify(mcData); // Turns the object into a JSON string
  const options = {                          // Options that the request to Mailchimp will use
    url: 'https://us20.api.mailchimp.com/3.0/lists/f10300bacb',
    method: 'POST',
    headers: {
      Authorization: 'auth f24c3169da044653d1437842e39bece5-us20'
    },
    body: mcDataPost
  }
  if (email) {                                  // If an email was submitted
    request(options, (err, response, body) => { // Send the request to Mailchimp
      if (err) {                                // If there's an error
        res.json({ error: err })                // Return said error
      } else {                                  // If there's no error
        if (js) {                               // If JavaScript is enabled (boolean)
          res.sendStatus(200);                  // Send a success status
        } else {
          res.redirect('/success.html');        // If it's disabled, redirect to a success HTML page
        }
      }
    })
  } else {
    res.status(404).send({ message: 'Failed' }) // If the email doesn't exist, have it fail
  }
});

app.listen(5000, console.log('Server started!')) // Starts the server on port 5000 and logs a confirmation
package.json
{
  "name": "gaminggalaxy",
  "version": "1.0.0",
  "main": "site/js/app.js",
  "dependencies": {
    "body-parser": "^1.19.0",
    "commonjs": "^0.0.1",
    "express": "^4.17.1",
    "index": "^0.4.0",
    "node-fetch": "^2.6.6",
    "prototype": "^0.0.5",
    "request": "^2.65.0",
    "requirejs": "^2.3.6",
    "uniq": "^1.0.1"
  },
  "devDependencies": {
    "nodemon": "^2.0.15"
  },
  "scripts": {
    "serve": "node app",
    "dev": "nodemon app"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/InvertedTNT/Main.git"
  },
  "bugs": {
    "url": "https://github.com/InvertedTNT/Main/issues"
  },
  "homepage": "https://github.com/InvertedTNT/Main#readme",
  "description": ""
}
index.html (the form itself)
<form action="/subscribe" method="POST">
  <div class="newsletter-form-grp">
    <i class="far fa-envelope"></i>
    <input type="email" name="email" id="email"
           placeholder="Enter your email..." required>
  </div>
  <button id="cta">SUBSCRIBE <i class="fas fa-paper-plane"></i></button>
</form>
folder structure
node_modules
site
-index.html
-css
-js
- app.js
-images
app.js
package-lock.json
package.json
Thank you for your help; I would appreciate any advice on how I can use those dependencies and on the overall implementation of Browserify.
A browser is an HTTP client.
Express is a framework for building HTTP servers.
HTTP clients make requests to HTTP servers which then send responses back.
Express depends on Node.js. It requires features provided by Node.js (like the ability to listen for network requests) which are not available in browsers.
Browserify can bundle JavaScript that is written using modules into non-module code that can run in a browser, but only if that code does not depend on Node.js-specific features (i.e. if the modules are either pure JS or depend only on browser-specific features).
Browserify cannot make Express run inside the browser.
When you run your JS program using Node.js you can then type the URL to the server the program creates into the browser’s address bar to connect to it.
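To make that concrete, the only code that belongs in the browser is the part that talks to the Express server over HTTP. Below is a hypothetical sketch of site/js/app.js (the form selector, field names, and redirect target are assumptions based on the form and route shown above, not code from the original post). Nothing in it requires Express, so it can be served as a plain script, or bundled with Browserify if it ever starts importing npm modules:
// Browser-side code: no require('express') here, only HTTP calls to the server.
document.querySelector('form').addEventListener('submit', function (event) {
  event.preventDefault();
  var email = document.getElementById('email').value;

  // POST the email to the /subscribe route handled by the server-side app.js
  fetch('/subscribe', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email: email, js: true })
  })
    .then(function (res) {
      if (res.ok) {
        window.location.href = '/success.html';
      }
    })
    .catch(function (err) {
      console.error('Subscription failed:', err);
    });
});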

mongoose post.save failing on heroku, works on localhost

I'm trying to simply post to my MongoDB Atlas DB via Node, Express, Mongoose, and Heroku. A Postman POST request with a raw JSON body:
{
  "title": "heroku post",
  "description": "post me plsssss"
}
works on localhost with this exact code, but when deployed via Heroku the try/catch block fails at post.save(), and the response is the error:
{
  "error": "there's an error",
  "message": {}
}
But the error is empty and I'm not sure how to debug it. I've put mongoose.set('debug', true); in app.js and I've modified my package.json: "start": "node app.js DEBUG=mquery", but neither has produced any extra output that I can see. Is there any other way to find out why post.save() is throwing an error, some logs that I am not utilising, and how can I see those logs? Or do you already know what the issue is?
App.js
// use dotenv to store secret keys
require("dotenv").config();
const express = require('express');
const mongoose = require('mongoose');
const app = express();
const cors = require('cors');

// connect to DB
mongoose.connect("mongodb+srv://grushevskiy:intercom#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" }, () => {
  console.log('connected to DB!')
})
mongoose.set('debug', true);
const db = mongoose.connection;
db.once('open', () => {
  console.log('connection opened')
});

// import routes for middleware
const postsRoute = require('./routes/posts');
// middleware routes
app.use('/posts', postsRoute)

// ROUTES
app.get('/', (req, res) => {
  res.send('we are on home')
})

// MIDDLEWARES
// cors
app.use(cors());
// decode url special characters
app.use(express.urlencoded({ extended: true }));
// parse json POSTs
app.use(express.json());

// how we start listening to the server
app.listen(process.env.PORT || 3000);
posts.js
const express = require('express')
const router = express.Router();
const Post = require('../models/Post')

// SUBMIT A POST
router.post('/', async (req, res) => {
  const post = new Post({
    title: req.body.title,
    description: req.body.description
  });
  console.log(post)
  try {
    const savedPost = await post.save();
    res.json(savedPost);
    console.log(savedPost)
  } catch (err) {
    res.json({ error: "there's an error", message: err, })
  }
})

module.exports = router;
Post.js Model
const mongoose = require('mongoose')

const PostSchema = mongoose.Schema({
  title: {
    type: String,
    required: true
  },
  description: {
    type: String,
    required: true
  },
  date: {
    type: Date,
    default: Date.now
  }
})

module.exports = mongoose.model('Post', PostSchema)
When I type heroku logs --tail there are no errors. Also, the 'connected to DB!' message initially comes in a bit late. I'm wondering if maybe this is an issue with async/await? My package.json:
{
  "name": "22-npmexpressrestapi",
  "version": "1.0.0",
  "engines": {
    "node": "14.15.3",
    "npm": "6.14.9"
  },
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node app.js DEBUG=mquery",
    "start:dev": "node app.js"
  },
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "express": "^4.17.1",
    "nodemon": "^2.0.7"
  },
  "dependencies": {
    "dotenv": "^8.2.0",
    "express": "4.17.1",
    "cors": "^2.8.5",
    "mongoose": "^5.11.11"
  }
}
After reading your question carefully I have a couple of suggestions that I think you might want to try. First, let me explain myself:
In my experience, when one learns how to build REST APIs with Node.js, using mongoose to communicate with a MongoDB cluster, this is what the sequence of events of their index.js file looks like:
Execute a function that uses environment variables to establish a connection with the database cluster. This function runs only once and fills the mongoose instance that the app is going to use with whatever models have been designed for it.
Set up the app by requiring the appropriate package, defining whatever middleware it's going to require, and calling to all the routes that have been defined for it.
After the app has incorporated everything it's going to need in order to run properly, app.listen() is invoked, a valid port is provided and... voila! You've got your server running.
When the app and the database are simple enough, this works like a charm. I have built several services using these very same steps and brought them to production without noticing any miscommunication between mongoose and the app. But if you check the App.js code you provided, you'll realize that there is a difference in your case: you only connect to your Mongo cluster after you set up your app, its middleware, and its routes. If any of those depend on the mongoose instance or the models to run, there is a good chance that by the time Heroku gets to the point where your routes need to connect to your database, it simply hasn't established that connection (meaning it hasn't run the mongoose.connect() part) yet.
Simple solution
I think that Heroku is simply taking a little longer to run your whole code than your local machine does. Also, if your local version is connecting to a locally stored MongoDB, there is a good chance things run quite a bit quicker than against a cloud service. Your statement
the 'connected to DB!' message comes in a bit late
gives me hope that this might be the case. If that were so, then I guess that moving
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" }, () => {console.log('connected to DB!')});
to the second line of App.js, right after you call Dotenv, would do the trick. Your mongoose instance would then go into App already connected to the database and models that App is going to use.
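In other words, the top of App.js would start roughly like this (the same statements as in the question, just reordered so the connection is initiated before the app, middleware, and routes are set up):
require("dotenv").config();
const mongoose = require('mongoose');

// connect to DB before anything else
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" }, () => {
  console.log('connected to DB!')
});
mongoose.set('debug', true);

const express = require('express');
const cors = require('cors');
const app = express();
// ...middleware and routes follow as before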
Asynchronous solution
Even if what I previously wrote fixes your problem, I would like to expand the explanation, because in my experience that version of the solution wouldn't be quite right either. The way of doing things that I exposed at the beginning of this answer is OK... as long as you keep your MongoDB cluster and your mongoose instance simple. Not very long ago I had to implement a service that involved a much more complex database pattern, with several different databases being used by the same App, and many of its routes depending directly on those databases' models.
When I tried to use the logic I described before on such a problem, involving heavily populated databases and several models contained in the mongoose instance, I realized that it took Node far less time to build the App, its middleware, and its routes than to connect to mongoose and load all the models that that very same App required in order to run. Meaning that I got my Server connected to PORT **** message before the database was actually connected. And that caused trouble.
That made me realize that the only way to do things properly was to ensure asynchronicity, forcing App to run only after all the databases and the models were loaded into the mongoose instance, and feeding it that very same instance. This way, I ended up having an index.js that looked like this:
const Dotenv = require("dotenv").config();
const { connectMongoose } = require("./server/db/mongoose");
const wrappedApp = require("./app");
connectMongoose().then((mongoose) => {
const App = wrappedApp(mongoose);
App.listen(process.env.PORT, () => {
console.log(
`Hello, I'm your server and I'm working on port ${process.env.PORT}`
);
});
});
A file called mongoose.js contains several functions governing the connections to my databases and models. The function that implements all the connections in the right order looks like this:
const connectMongoose = async (mongoose = require("mongoose")) => {
  try {
    // Close any previous connections
    await mongoose.disconnect();
    // Connect the Main database
    console.log(
      "--> Loading databases and models from MongoDB. Please don't use our server yet. <--"
    );
    mongoose = await connectMother(mongoose);
    // Connect all the client databases
    mongoose = await connectChildren(mongoose);
    console.log(
      "--> All databases and models loaded correctly. You can work now. <--"
    );
    return mongoose;
  } catch (e) {
    console.error(e);
  }
};

module.exports = { connectMongoose };
With this, the file app.js returns a function that needs to be fed a mongoose instance and that sets up the whole Express environment:
module.exports = (Mongoose) => {
  const Express = require("express"); // Enrouting
  // Other required packages go here

  // EXPRESS SETTINGS
  const App = Express();
  App.set("port", process.env.PORT || 3000); // Use port from cloud server

  // All the middleware App needs
  App.use(Flash());

  // Routing: load your routes

  // Return fully formed App
  return App;
};
Only by using this pattern did I make sure that the chain of events made sense:
Connect all the databases and load their models.
Only after the last database has been connected and its last model loaded, set up the App, its middleware, and your routes.
Use App.listen() to serve the App.
If your database isn't very large, then the first part of the answer might do the trick. But it's always useful to know the whole story.
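For the single database in the question, a condensed sketch of the same asynchronous idea might look like this (MONGO_URI is an assumed environment variable name, not something from the original post; the connect options are copied from the question's App.js). mongoose.connect() returns a promise when no callback is passed, so the server only starts listening once the connection has actually been established:
require("dotenv").config();
const express = require('express');
const mongoose = require('mongoose');
const cors = require('cors');

const app = express();

// Middleware and routes are registered first...
app.use(cors());
app.use(express.urlencoded({ extended: true }));
app.use(express.json());
app.use('/posts', require('./routes/posts'));

// ...but app.listen() only runs after mongoose reports a connection.
mongoose
  .connect(process.env.MONGO_URI, { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" })
  .then(() => {
    console.log('connected to DB!');
    app.listen(process.env.PORT || 3000);
  })
  .catch((err) => console.error('DB connection failed:', err));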
I am able to reproduce an empty error message with your code when I use a wrong connection string.
The console.log in this callback is incorrect:
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" }, () => {
  console.log('connected to DB!')
})
If you want to know whether you are connected to the DB, use this instead:
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" });
const db = mongoose.connection;
db.once('open', () => {
  console.log('connection opened')
});
I also noticed you are using dotenv; maybe you are not adding those variables to your Heroku host.
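One more thing worth checking is why the error itself comes back as an empty object: an Error instance serializes to {} when passed through res.json, because properties like message and stack are non-enumerable. Logging the error on the server (and sending err.message explicitly) makes it visible in heroku logs --tail. A small sketch of the catch block in posts.js, leaving the rest of the route unchanged:
} catch (err) {
  // JSON.stringify(err) yields "{}" for Error instances, which is why the
  // response body looked empty. Log it server-side and send the message.
  console.error('post.save() failed:', err);
  res.status(500).json({ error: "there's an error", message: err.message });
}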

Does log4js require any extra code to work on an Apache server?

I'm trying to add Log4js-Node to a Node.js server running on Apache. Here's my code:
const path = require("path");
const express = require("express");
const log4js = require('log4js');
const app = express();
const logger = log4js.getLogger();
logger.level = "debug";
const port = 443;

log4js.configure({
  appenders: { everything: { type: 'file', filename: 'logs.log', flags: 'w' } },
  categories: { default: { appenders: ['everything'], level: 'ALL' } }
});

const server = app.listen(port, () => {
  logger.debug("listening to requests on port " + port);
});

app.get("/log", (req, res) => {
  res.sendFile(path.join(__dirname + "/logs.log"));
});
When I run the script on Node.js on my computer and navigate to localhost:443/log I see what I expect, which is this:
[2020-03-17T22:50:43.145] [DEBUG] default - listening to requests on port 443
But when I run the code on a remote server it crashes and I get this in the error page (with part of the path replaced by me with "[removed]"):
App 25925 output: at Server. ([removed]/index.js:27:9)
App 25925 output: at Logger. [as debug] ([removed]/12/lib/node_modules/log4js/lib/logger.js:124:10)
App 25925 output: at Logger.log ([removed]/12/lib/node_modules/log4js/lib/logger.js:73:12)
App 25925 output: at Logger._log ([removed]/12/lib/node_modules/log4js/lib/logger.js:90:16)
App 25925 output: at Object.send ([removed]/12/lib/node_modules/log4js/lib/clustering.js:97:15)
App 25925 output: [removed]/12/lib/node_modules/log4js/lib/clustering.js:97
App 25925 output: at Object. ([removed]/12/lib/node_modules/log4js/lib/clustering.js:8:13)
I'm using A2 Hosting which uses Apache 2.4.41. I opted for Node.js 12.9.0, and Log4js 6.1.2. The package.json should be the same on both my computer and the server, and I've run npm install on both.
Is this just an issue with Log4js and the server, or have I missed something somewhere?
This was actually a relatively simple fix. The path referenced by the last error in the stack trace is a Log4js module that implements clustering support through Node's "cluster" module. The line "8" referenced is cluster = require("cluster"). It's wrapped in a try/catch block like this:
try {
  cluster = require("cluster"); //eslint-disable-line
} catch (e) {
  debug("cluster module not present");
  disabled = true;
}
The installation of Node.js on my computer came with the "cluster" module; however, as far as I can tell, the server I'm using doesn't support it. Also, the version of Node I was using on my computer is newer than what I'm using on the server (so I've now installed 12.9 on my machine). I believe the older version of Node doesn't get as far as catching the exception: it tries to load the cluster module, fails, and then throws the error.
So the simple fix was to comment out most of the "try/catch" block, leaving just the contents of "catch" like this:
// try {
// cluster = require("cluster"); //eslint-disable-line
// } catch (e) {
debug("cluster module not present");
disabled = true;
// }
If someone has a better fix, I'm open to suggestions.
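A possibly less invasive alternative, based on my reading of the log4js documentation (so treat it as an assumption to verify against your version): log4js exposes a disableClustering configuration option, intended for environments such as pm2 or Passenger where the cluster module behaves differently, which avoids patching files inside node_modules:
log4js.configure({
  appenders: { everything: { type: 'file', filename: 'logs.log', flags: 'w' } },
  categories: { default: { appenders: ['everything'], level: 'ALL' } },
  // Assumed option: tells log4js to treat this as a standalone process and
  // skip its cluster integration entirely.
  disableClustering: true
});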
Same as the response from #skittleswrapper, thanks, it worked for me.
I'm using Node.js 14.18.1 with log4js 6.3.0.
But I'm wondering why this 'cluster' module is needed, and whether we can add it to our app in another way.

Get request in openshift node js app

I'm building an OpenShift Node.js app which has to communicate with the YouTube Data API. Its deployment succeeds when I do "git push" with the require commented out:
/*
var request = require('request');
*/
When I uncomment it, I get this error:
remote: Waiting for application port (8080) become available ...
remote: Application 'eln' failed to start (port 8080 not available)
remote: -------------------------
remote: Git Post-Receive Result: failure
remote: Activation status: failure
remote: Activation failed for the following gears:
remote: 573c3e177628e146d400004e (Error activating gear: CLIENT_ERROR: Failed to execute: 'control start' for /var/lib/openshift/573c3e177628e146d400004e/nodejs
Am I doing something wrong? How can I fix it?
Thank you.
Edit 1: Adding the listening code. I didn't modify it (it was already there when I created the app).
self.ipaddress = process.env.OPENSHIFT_NODEJS_IP;
self.port = process.env.OPENSHIFT_NODEJS_PORT || 8080;

/**
 * Start the server (starts up the sample application).
 */
self.start = function() {
  // Start the app on the specific interface (and port).
  self.app.listen(self.port, self.ipaddress, function() {
    console.log('%s: Node server started on %s:%d ...', Date(Date.now()), self.ipaddress, self.port);
  });
};
This is my basic app.js code that is working on OpenShift.
#!/bin/env node
ipaddress = process.env.OPENSHIFT_NODEJS_IP;
if (typeof ipaddress === "undefined") {
  // Log errors on OpenShift but continue w/ 127.0.0.1 - this
  // allows us to run/test the app locally.
  console.warn('No OPENSHIFT_NODEJS_IP var, using 127.0.0.1');
  ipaddress = "127.0.0.1";
};
var server = app.listen(8080, ipaddress, function() {
  console.log('Listening on port %d', server.address().port);
});
Can you try that?
Update
After trying it on OpenShift with request, I also got this error, but it was because package.json didn't have request under dependencies.
My dependencies now look like this and it works fine:
"dependencies": {
"ejs": "^2.4.1",
"express": "~3.4.4",
"request": "latest" // this is added
},
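One caveat about the snippet above: the // this is added comment is only an annotation for this answer; JSON itself does not allow comments, so the entry in a real package.json would simply be the following (it is usually added by running npm install --save request, which writes the installed version number rather than the literal string latest):
"dependencies": {
  "ejs": "^2.4.1",
  "express": "~3.4.4",
  "request": "latest"
}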

Testing nodejs with mocha and supertest. How to get the app?

I am trying to test a nodejs app with supertest, and I cannot get a single route to run. I've narrowed the problem down. In my test file, I start with this:
var app = express(); // this is the problem, this isn't really the app, right?

// testing this dummy works fine, but I want the real one
app.get('/user', function(req, res) {
  res.status(200).json({ name: 'tobi' });
});

describe('GET /user', function() {
  it('respond with json', function(done) {
    request(app)
      .get('/user')
      .set('Accept', 'application/json')
      .expect('Content-Type', /json/)
      .expect(200, done);
  })
})
...and this test passes. But of course, that dummy /user route is not the one I want to test. I want to test the routes defined in my "real" app, defined in ./server.js.
Even if I simply move the dummy app.get('/user', function... definition to my "real" server.js file, the one that I launch with nodemon, this simple test fails with a 404.
So what does this line really do: var app = express();, and how do I get hold of the app configured in my server.js file so I can test it?
You need to export your app from server.js. Once you do this, you can require the app in your test files.
server.js
var express = require('express')
var app = express();

// Your routes here...

// At end of file
module.exports = app;
test.js (in a test directory)
var api = require('../server.js'),
    request = require('supertest')(api);

describe('noshers', function() {
  it('doesn\'t allow GET requests', function(done) {
    request
      .get('/foo')
      .expect(405, done);
  });
});
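If server.js currently both defines the routes and calls app.listen() (so that nodemon can run it), one way to keep that workflow while still exporting the app for tests is to move the listen call into a tiny start file, mirroring the ./app.js / ./server split in the first answer above. A hypothetical sketch (start.js is an assumed filename, not from the original post):
// start.js: run this with node or nodemon instead of server.js
var app = require('./server.js');

var server = app.listen(process.env.PORT || 3000, function() {
  console.log('Listening on port %d', server.address().port);
});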
