How to use Jest global setup and teardown in a Node.js project?

I added tests to my Node.js project using Jest, but each test suite has a beforeAll method that creates a new test server and connects to a MongoDB database, and an afterAll method that closes both the test server and the database connection. I would like to perform these tasks globally for all the test suites rather than one at a time. Below is a sample of my code.
app.js
const express = require("express");
const app = express();
const { connectToDb } = require("./startup/db");
require("./startup/routes")(app);
connectToDb();
...
const port = process.env.PORT || 3000;
if (process.env.NODE_ENV !== "test") {
  app.listen(port, () => winston.info(`Listening on port ${port}...`));
}
module.exports = app;
auth.test.js
const request = require("supertest");
const http = require("http");
const { disconnectDb } = require("../../startup/db");
describe("auth middleware", () => {
let server;
beforeAll((done) => {
const app = require("../../app");
server = http.createServer(app);
server.listen(done);
});
afterAll((done) => {
server.close(done);
disconnectDb();
});
it("should return 401 if no token is provided", async () => {
const res = request(server)
.post("/api/genres")
.set("x-auth-token", "")
.send({ name: "genre1" });
expect(res.status).toBe(401);
});
...
jest.config.js
module.exports = {
  testEnvironment: "node",
};

Try with this jest.config.js:
module.exports = {
  testEnvironment: "node",
  globalSetup: '<rootDir>/src/testSetup.ts'
};
And in testSetup.ts you can do:
// testSetup.ts
const http = require("http");

module.exports = async () => {
  const app = require("../../app");
  const server = http.createServer(app);
  // wait for the server to start listening
  await new Promise((resolve) => server.listen(resolve));
  // keep a reference so a globalTeardown can close the server
  global.__SERVER__ = server;
};
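Note that a globalSetup usually needs a matching globalTeardown to close the server. A minimal sketch, assuming the setup above stashed the server on global (Jest lets globalTeardown read globals defined in globalSetup, though your tests cannot):

// testTeardown.js – hypothetical companion file; register it as
// globalTeardown: '<rootDir>/src/testTeardown.js' in jest.config.js
module.exports = async () => {
  // __SERVER__ was set by the globalSetup file above
  await new Promise((resolve) => global.__SERVER__.close(resolve));
};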

Use this config: setupFiles: ['./tests/setup.js']
Your setup file should look like this:

// setup.js
(async () => {
  const app = require('../app.js')
  global.app = app
})()

Then you will be able to use app globally in every test suite; see the sketch below.
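For example, a minimal sketch of a test using that global (assuming supertest, as in the question):

// any.test.js – hypothetical test using the global app
const request = require("supertest");

it("should return 401 if no token is provided", async () => {
  const res = await request(global.app)
    .post("/api/genres")
    .set("x-auth-token", "")
    .send({ name: "genre1" });
  expect(res.status).toBe(401);
});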

I had the same problem: I wanted to open one database connection before all test files and close it after all tests in all files.
But... I did not achieve what I wanted, and MAYBE we don't need to do this.
I did find a solution to register functions like beforeAll() and afterAll() once so that they apply to every test file.
So you define these functions in a single file and they run for every test file.
To do that, all we need is to create a setup file and add its path to the Jest config or to package.json: "setupFilesAfterEnv": ["<rootDir>/__tests__/settings/setupTests.ts"]
Here is an example of my jest configuration.
"jest": {
"preset": "ts-jest",
"testEnvironment": "node",
"setupFilesAfterEnv": ["<rootDir>/__tests__/settings/setupTests.ts"],
"rootDir": "src",
"verbose": true,
"clearMocks": true,
"testMatch": [
"**/**/*.test.ts"
]
},
Here is an example of setupTests.ts (the file registered above):
import usersCollection from "../../database/user-schema";
import mongoose from "mongoose";

beforeAll(async () => {
  try {
    await mongoose.connect(process.env.MONGODB_URL!);
    await usersCollection.deleteMany({});
  } catch (error) {
    console.log(error);
  }
});

afterAll(async () => {
  try {
    await mongoose.disconnect();
  } catch (error) {
    console.log(error);
  }
});
It means that we establish a connection to the database FOR EVERY TEST FILE, BEFORE ALL TESTS IN THAT FILE, and close the connection after all tests in that file.
What I realized for myself:
In real life we have many test files, and not every file needs a connection to a database.
It's perfectly fine to open a connection to the database in the files that need one and close it after all tests in that file, for example in integration tests where we test API endpoints.
In other tests, rather than hitting a real database in many unit tests, we can consider mocking (simulating) the database (a sketch follows the P.S. below). It's another very interesting topic 😊
If I say something wrong you can correct me.
P.S.
I also want to mention what is written in the Mongoose documentation:
"Do not use globalSetup to call mongoose.connect() or mongoose.createConnection(). Jest runs globalSetup in a separate environment, so you cannot use any connections you create in globalSetup in your tests."
https://mongoosejs.com/docs/jest.html
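As an illustration of that mocking idea, here is a minimal Jest sketch; the service module and countUsers function are hypothetical:

// users.service.test.ts – hypothetical unit test that never touches a real db
jest.mock("../../database/user-schema", () => ({
  __esModule: true,
  default: { countDocuments: jest.fn() },
}));

import usersCollection from "../../database/user-schema";
import { countUsers } from "../../services/users"; // hypothetical service under test

test("countUsers returns the number of documents", async () => {
  (usersCollection.countDocuments as jest.Mock).mockResolvedValue(3);
  await expect(countUsers()).resolves.toBe(3);
});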

Related

Confused by module exports/requires

I am working on a piece where I am basically refactoring existing code. I have two files: index and server. My index is:
const md5File = require('md5-file');
const fs = require('fs');
const path = require('path');
const ignoreStyles = require('ignore-styles');
const register = ignoreStyles.default;
const extensions = ['.gif', '.jpeg', '.jpg', '.png', '.svg'];
...
require('@babel/polyfill');
require('@babel/register')({
  ignore: [/\/(build|node_modules)\//],
  presets: ['@babel/preset-env', '@babel/preset-react'],
  plugins: [
    '@babel/plugin-syntax-dynamic-import',
    '@babel/plugin-proposal-class-properties',
    'dynamic-import-node',
    'react-loadable/babel'
  ]
});
// Now that the nonsense is over... load up the server entry point
require('./server');
My server is like:
import path from 'path';
import Loadable from 'react-loadable';
...
const main = async () => {
  // tell React Loadable to load all required assets
  await Loadable.preloadAll();
  process.on('unhandledRejection', err => {
    console.error(err);
    process.exit(1);
  });
  const server = fastify(config.fastify);
  server.register(require('./routes'), config);
  server.register(fastifyStatic, {
    root: path.resolve(__dirname, '../build')
  });
  if (require.main === module) {
    // called directly i.e. $ node index.js
    const address = await server.listen(config.address);
    // start listening - ROCK AND ROLL!
    server.log.info(`Server running at: ${address}`);
  } else {
    // required as a module => executed on aws lambda
    module.exports = server;
  }
};
main();
The server should run a REST service when executed locally, and export the server instance for the inject method. This way a proxy can attach to it when running under AWS Lambda.
I have used the same setup before, multiple times, except the two pieces were in the same file, i.e. the server was inside the index. The single-file version works fine: the require.main comparison tells the program how it is being run, and module.exports exposes the server instance with the needed inject method when running under Lambda, while a direct invocation runs the REST service.
However, since I need to import react-loadable this time, I split the file into two pieces.
Now I probably need to figure out how the code is being run inside index.js, since server.js is no longer invoked directly, and pass that information to the server. That part is probably not too difficult.
My main problem is that if I do console.log(require('./server')) from index, it prints {}. While the server instance is created successfully and inject() is present, I am somehow unable to export it from server.js, import it into index.js correctly, and therefore re-export it from index.js for the proxy to attach to.
Obviously I am doing something incorrectly. It seems to me that my require does not have the server instance because the instance gets created after the require has finished. Since my main() is async, that is plausible.
What is the right way to accomplish this?
First, notice that your main() function in ./server.js is async and you're assigning module.exports from within that asynchronous function. Second, you're calling require('./server.js') from ./index.js without waiting for the asynchronous work to finish. require() returns whatever module.exports is once the module's top-level code has run; at that point main() has not finished, so module.exports is still the default empty object, and reassigning it later inside the async function does not change the object that index.js already received. So that's why you're seeing {}.
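A minimal two-file illustration of the effect (file names are hypothetical):

// late-exports.js – reassigns module.exports asynchronously
module.exports = { ready: false };
setTimeout(() => {
  // too late: consumers that already required this module
  // keep a reference to the original exports object
  module.exports = { ready: true };
}, 0);

// consumer.js
const lateExports = require('./late-exports');
console.log(lateExports); // { ready: false }
setTimeout(() => console.log(lateExports.ready), 10); // still false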
Which solutions will or will not fit your use case will depend on the details of how your AWS/direct invocation is supposed to work. Here's a suggestion:
const md5File = require('md5-file');
const fs = require('fs');
const path = require('path');
const ignoreStyles = require('ignore-styles');
const register = ignoreStyles.default;
const extensions = ['.gif', '.jpeg', '.jpg', '.png', '.svg'];
// ...
require('@babel/polyfill');
require('@babel/register')({
  ignore: [/\/(build|node_modules)\//],
  presets: ['@babel/preset-env', '@babel/preset-react'],
  plugins: [
    '@babel/plugin-syntax-dynamic-import',
    '@babel/plugin-proposal-class-properties',
    'dynamic-import-node',
    'react-loadable/babel'
  ]
});
// NOTE: env vars are coerced to strings, so the server compares against 'true'
process.env.serverRunLocally = require.main === module;
// Now that the nonsense is over... load up the server entry point
require('./server').then(listen => listen());
import path from 'path';
import Loadable from 'react-loadable';
// ...
const main = async () => {
  // tell React Loadable to load all required assets
  await Loadable.preloadAll();
  process.on('unhandledRejection', err => {
    console.error(err);
    process.exit(1);
  });
  const server = fastify(config.fastify);
  server.register(require('./routes'), config);
  server.register(fastifyStatic, {
    root: path.resolve(__dirname, '../build')
  });
  if (process.env.serverRunLocally === 'true') {
    // called directly i.e. $ node index.js
    return async () => {
      const address = await server.listen(config.address);
      // start listening - ROCK AND ROLL!
      server.log.info(`Server running at: ${address}`);
    };
  } else {
    // required as a module => executed on aws lambda
    return server;
  }
};
module.exports = main();
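On the Lambda side, whatever requires this module now has to await the exported promise before using inject(). A rough sketch; the handler shape and route are assumptions:

// lambda.js – hypothetical consumer of the exported promise
const serverPromise = require('./server');

exports.handler = async (event) => {
  const server = await serverPromise; // resolves to the fastify instance
  const response = await server.inject({ method: 'GET', url: event.path });
  return { statusCode: response.statusCode, body: response.payload };
};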

How does one mock the createClient method of the node redis module

I am trying to avoid a dependency on a running redis-server by using the redis-mock module in place of the redis module when creating a new client. Thus far I have found it impossible to mock the relevant method: createClient.
I have gone through the sinon documentation on stubs and an example run-through (found after some Googling), and based on these I have set up:
an example app
// src/app.js
'use strict';

// import modules
const express = require('express')
  , bluebird = require('bluebird')
  , redis = require('redis')
  ;

// promisify redis
bluebird.promisifyAll(redis.RedisClient.prototype);
bluebird.promisifyAll(redis.Multi.prototype);

// define constants
const app = express()
  , client = redis.createClient()
  , port = 3000
  ;

// set some values
client
  .setAsync('12345', JSON.stringify({vacancyId:12345}))
  .catch(err => console.log(`[ERROR]: error setting value - ${err}`));

// define routes
app.get('/api/vacancy/:vacancyId', (req, res) => {
  client
    .getAsync(req.params.vacancyId)
    .then(val => res.send(val))
    .catch(err => console.log(`[ERROR]: error getting value - ${err}`))
});

// listen on port
app.listen(port);

// export the app
module.exports = app;
and corresponding test
// test/app.js
'use strict';

// import modules
const chai = require('chai')
  , chaiAsPromised = require('chai-as-promised')
  , chaiHttp = require('chai-http')
  , redis = require('redis')
  , redisMock = require('redis-mock')
  , sinon = require('sinon')
  , app = require('../src/app.js')
  ;

// configure chai
chai.use(chaiAsPromised);
chai.use(chaiHttp);

// define constants
const expect = chai.expect
  , response = JSON.stringify({vacancyId:12345})
  ;

// now test
describe.only('App', function() {
  before(function() {
    sinon
      .stub(redis.RedisClient.prototype, 'createClient')
      .callsFake(function() {
        console.log('[TEST]: i never get here :(');
        return redisMock.createClient();
      });
  });
  describe('/api/vacancy/:vacancyId', function() {
    it('should return the expected response', function() {
      return expect(chai.request(app).get('/api/vacancy/12345'))
        .to.eventually
        .have.include({status:200})
        .and
        .nested.include({text:response});
    });
  });
});
I would expect the test to pass (and it does when I remove the stub and point at a running redis server):

> scratch-node@1.0.0 test /Users/nonyiah/.src/scratch-node
> mocha --exit

  App
    /api/vacancy/:vacancyId
      ✓ should return the expected response

  1 passing (41ms)

but instead I get the following error:

> scratch-node@1.0.0 test /Users/nonyiah/.src/scratch-node
> mocha --exit

  App
    1) "before all" hook in "App"

  0 passing (10ms)
  1 failing

  1) App
       "before all" hook in "App":
     TypeError: Cannot stub non-existent own property createClient
      at Sandbox.stub (node_modules/sinon/lib/sinon/sandbox.js:308:19)
      at Context.<anonymous> (test/app.js:26:8)

npm ERR! Test failed. See above for more details.
What is the correct way to achieve this mocking?
Turns out I was creating the stub after the app was loaded, and stubbing the wrong object: createClient lives on the redis module itself, not on RedisClient.prototype, which is what the "non-existent own property" error is complaining about. I need to stub redis.createClient and move the instantiation of the app:
// , app = require('../src/app.js')
to after the stub is created:
describe('/api/vacancy/:vacancyId', function() {
  it('should return the expected response', function() {
    let app = require('../src/app.js');
    return expect(chai.request(app).get('/api/vacancy/12345'))
      .to.eventually
      .have.include({status:200})
      .and
      .nested.include({text:response});
  });
});
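Putting both fixes together, the before hook becomes something like this sketch:

// test/app.js – stub redis.createClient before the app is first required
before(function() {
  sinon
    .stub(redis, 'createClient')
    .callsFake(function() {
      return redisMock.createClient();
    });
});

after(function() {
  redis.createClient.restore();
});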

What is the proper way to reuse a MongoDB connection?

I need to reuse a MongoDB connection in multiple different files for my Electron app but am having an issue that I don't understand. This led me to create a module to handle it.
db.js
const MongoClient = require('mongodb').MongoClient;
require('dotenv').config();

let db;

function connect () {
  return MongoClient.connect(`mongodb://${process.env.DB_HOST}?authSource=${process.env.DB_NAME}`, {
    auth: {
      user: process.env.DB_USER,
      password: process.env.DB_PASS
    },
    useNewUrlParser: true
  }).then(client => {
    db = client.db(process.env.DB_NAME);
  }).catch(error => {
    console.error(error);
  });
}

function getDB () {
  return db;
}

module.exports = { connect, getDB };
I then required this in my main file, which is one of the files it is needed in.
app.js
const mongoDB = require(path.resolve(`${__dirname}/assets/js/db`));

let db;

app.on('ready', async () => {
  await mongoDB.connect();
  db = mongoDB.getDB();
  setTimeout(createWindow, 0);
});
This works, and I can use it in the createWindow function. A few seconds later, the app loads the index.html file along with the index.js file and creates the window. index.js is another file I need the connection in, so I require the module there as well.
index.js
const mongoDB = require(path.resolve(`${__dirname}/js/db`));
const db = mongoDB.getDB();
console.log(db);
This results in db being undefined. Isn't the first require in app.js supposed to be cached?

How to export node express app for chai-http

I have an express app with a few endpoints and am currently testing it using mocha, chai, and chai-http. This was working fine until I added logic for a pooled mongo connection, and started building endpoints that depended on a DB connection. Basically, before I import my API routes and start the app, I want to make sure I'm connected to mongo.
My problem is that I'm having trouble understanding how I can export my app for chai-http but also make sure there is a DB connection before testing any endpoints.
Here, I am connecting to mongo, then in a callback applying my API and starting the app. The problem with this example is that my tests will start before a connection to the database is made, and before any endpoints are defined. I could move app.listen and api(app) outside of the MongoPool.connect() callback, but then I still have the problem of there being no DB connection when tests are running, so my endpoints will fail.
server.js
import express from 'express';
import api from './api';
import MongoPool from './lib/MongoPool';
let app = express();
let port = process.env.PORT || 3000;

MongoPool.connect((err, success) => {
  if (err) throw err;
  if (success) {
    console.log("Connected to db.")
    // apply express router endpoints to app
    api(app);
    app.listen(port, () => {
      console.log(`App listening on port ${port}`);
    })
  } else {
    throw "Couldnt connect to db";
  }
})

export default app;
How can I test my endpoints using chai-http while making sure there is a pooled connection before tests are actually executed? It feels dirty writing my application in a way that conforms to the tests I'm using. Is this a design problem with my pool implementation? Is there a better way to test my endpoints with chai-http?
Here is the test I'm running
test.js
let chai = require('chai');
let chaiHttp = require('chai-http');
let server = require('../server').default;
let should = chai.should();
chai.use(chaiHttp);
//Our parent block
describe('Forecast', () => {
  /*
   * Test the /GET route
   */
  describe('/GET forecast', () => {
    it('it should GET the forecast', (done) => {
      chai.request(server)
        .get('/api/forecast?type=grid&lat=39.2667&long=-81.5615')
        .end((err, res) => {
          res.should.have.status(200);
          done();
        });
    });
  });
});
And this is the endpoint I'm testing
/api/forecast.js
import express from 'express';
import MongoPool from '../lib/MongoPool';
let router = express.Router();
let db = MongoPool.db();

router.get('/forecast', (req, res) => {
  // do something with DB here
})

export default router;
Thank you for any help
After receiving some good feedback, I found this solution works best for me, based on Gomzy's answer and Vikash Singh's answer.
In server.js I'm connecting to the mongo pool, then emitting the 'ready' event on the express app. Then in the test, I can use before() to wait for the 'ready' event to be emitted on the app. Once that happens, I'm good to start executing the tests.
server.js
import express from 'express';
import bodyParser from 'body-parser';
import MongoPool from './lib/MongoPool';
let app = express();
let port = process.env.PORT || 5000;

app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

(async () => {
  await MongoPool.connect();
  console.log("Connected to db.");
  require('./api').default(app);
  app.listen(port, () => {
    console.log(`Listening on port ${port}.`)
    app.emit("ready");
  });
})();
export default app;
test.js
//Require the dev-dependencies
import chai from 'chai';
import chaiHttp from 'chai-http';
import server from '../src/server';
let should = chai.should();
chai.use(chaiHttp);

before(done => {
  server.on("ready", () => {
    done();
  })
})

describe('Forecast', () => {
  describe('/GET forecast', () => {
    it('it should GET the forecast', (done) => {
      chai.request(server)
        .get('/api/forecast?type=grid&lat=39.2667&long=-81.5615')
        .end((err, res) => {
          res.should.have.status(200);
          done();
        });
    });
  });
});
An Express app is an instance of EventEmitter, so we can easily subscribe to events; i.e., the app can listen for the 'ready' event.
Your server.js file will look like below:
import express from 'express';
import api from './api';
import MongoPool from './lib/MongoPool';
let app = express();
let port = process.env.PORT || 3000;

app.on('ready', function() {
  app.listen(3000, function() {
    console.log('app is ready');
  });
});

MongoPool.connect((err, success) => {
  if (err) throw err;
  if (success) {
    console.log('Connected to db.');
    // apply express router endpoints to app
    api(app);
    // All OK - fire (emit) a ready event.
    app.emit('ready');
  } else {
    throw 'Couldnt connect to db';
  }
});
export default app;
Just create a function like the one below to connect to mongo and make it return a promise, then use await to wait for it to connect. The function could look like this:

function dbconnect() {
  return new Promise(function(resolve, reject) {
    MongoPool.connect((err, success) => {
      if (err) reject(err);
      if (success) {
        resolve({ 'status': true });
      } else {
        reject(new Error('Could not connect to db'));
      }
    });
  });
}
And then, use:

await dbconnect();
api(app);
app.listen(port, () => {
  console.log(`App listening on port ${port}`);
})

Now the await line will wait for the function to connect to the DB, and then return success, or an error in case of failure.
This is a kind of solution you can use, but I would not recommend it; what we actually do is:
Create services and use those services in routes; don't write DB code directly in routes.
And while writing tests for routes, mock/stub those services, and test the services separately in other test cases, where you just pass a DB object and the service adds functions on that DB object. In tests you can connect to the DB and pass that object to the services to test their functions. This gives you an additional benefit: if you want to use a dummy/test DB for testing, you can set that up in the test cases. A rough sketch of the pattern follows.
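A rough sketch of that service pattern (all names are illustrative):

// services/forecast.js – the db handle is injected, not imported
module.exports = (db) => ({
  getForecast: (query) => db.collection('forecasts').findOne(query),
});

// api/forecast.js – the route depends on the service, not on MongoPool
const express = require('express');

module.exports = (forecastService) => {
  const router = express.Router();
  router.get('/forecast', async (req, res) => {
    res.json(await forecastService.getForecast({ type: req.query.type }));
  });
  return router;
};

// in a route test, stub the service; no database needed
const fakeService = { getForecast: async () => ({ temp: 72 }) };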
Use a before hook in your tests like below:

describe('Forecast', () => {
  before(function(done) {
    // this function should wait and ensure the mongo connection
    // is established; see the sketch after the two methods below
    checkMongoPool(done);
  });
  it('/GET forecast', function(cb) {
    // write test code here ...
  });
});
And you can check the mongodb connection with the methods below.
Method 1: just check the readyState property:

mongoose.connection.readyState == 0; // not connected
mongoose.connection.readyState == 1; // connected

Method 2: use events:

mongoose.connection.on('connected', function(){});
mongoose.connection.on('error', function(){});
mongoose.connection.on('disconnected', function(){});
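Combining the two methods, the checkMongoPool helper used in the before hook above could be sketched as:

// sketch of the checkMongoPool helper from the before hook
function checkMongoPool(done) {
  if (mongoose.connection.readyState === 1) return done(); // already connected
  mongoose.connection.once('connected', () => done());
  mongoose.connection.once('error', done);
}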
You can use a running server instead of an express instance.
Start your server with a private port, then run the tests against the running server.
ex: PORT=9876 node server.js
In your test block, use chai.request('http://localhost:9876') (replace with your protocol, server IP, etc.) instead of chai.request(server).
If you're using the native mongodb client you could implement a reusable pool like:
MongoPool.js

// This creates a pool with a default size of 5.
// This gives you the client; you can add a few lines to get the db if you wish.
// connection is a promise
const { MongoClient } = require('mongodb');

let connection;

module.exports.getConnection = () => {
  connection = new MongoClient(url).connect();
};

module.exports.getClient = () => connection;
Now in your test you could:

const { getConnection } = require('./MongoPool')
...
describe('Forecast', () => {
  // get client connection
  getConnection()
  ...
In your route:

...
const { getClient } = require('./MongoPool')

router.get('/forecast', (req, res) => {
  // if you made sure you called getConnection() elsewhere in your code,
  // getClient() returns a promise that resolves to the connected client
  // (with its connection pool)
  const client = getClient()
  // do something with DB here, e.g.
  // client.then(c => c.db('db-name'))...
})
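Since getClient() returns a promise, a route can also simply await it; a sketch (the db and collection names are assumptions):

// sketch: awaiting the shared connection inside a route
router.get('/forecast', async (req, res) => {
  const client = await getClient(); // the connected MongoClient
  const forecast = await client.db('db-name').collection('forecasts').findOne({});
  res.json(forecast);
});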

Mocha API Testing: getting 'TypeError: app.address is not a function'

My Issue
I've coded a very simple CRUD API and recently started writing some tests using chai and chai-http, but I'm having an issue when running my tests with $ mocha.
When I run the tests I get the following error on the shell:
TypeError: app.address is not a function
My Code
Here is a sample of one of my tests (/tests/server-test.js):
var chai = require('chai');
var mongoose = require('mongoose');
var chaiHttp = require('chai-http');
var server = require('../server/app'); // my express app
var should = chai.should();
var testUtils = require('./test-utils');

chai.use(chaiHttp);

describe('API Tests', function() {
  before(function() {
    mongoose.createConnection('mongodb://localhost/bot-test', myOptionsObj);
  });
  beforeEach(function(done) {
    // I do stuff like populating db
  });
  afterEach(function(done) {
    // I do stuff like deleting populated db
  });
  after(function() {
    mongoose.connection.close();
  });
  describe('Boxes', function() {
    it.only('should list ALL boxes on /boxes GET', function(done) {
      chai.request(server)
        .get('/api/boxes')
        .end(function(err, res){
          res.should.have.status(200);
          done();
        });
    });
    // the rest of the tests would continue here...
  });
});
And my express app files (/server/app.js):
var mongoose = require('mongoose');
var express = require('express');
var api = require('./routes/api.js');
var app = express();

mongoose.connect('mongodb://localhost/db-dev', myOptionsObj);

// application configuration
require('./config/express')(app);

// routing set up
app.use('/api', api);

var server = app.listen(3000, function () {
  var host = server.address().address;
  var port = server.address().port;
  console.log('App listening at http://%s:%s', host, port);
});
and (/server/routes/api.js):
var express = require('express');
var boxController = require('../modules/box/controller');
var thingController = require('../modules/thing/controller');
var router = express.Router();
// API routing
router.get('/boxes', boxController.getAll);
// etc.
module.exports = router;
Extra notes
I've tried logging out the server variable in the /tests/server-test.js file before running the tests:
...
var server = require('../server/app'); // my express app
...
console.log('server: ', server);
...
and the result of that is an empty object: server: {}.
You don't export anything in your app module. Try adding this to your app.js file:
module.exports = server
It's important to export the http.Server object returned by app.listen(3000) instead of just the function app, otherwise you will get TypeError: app.address is not a function.
Example:
index.js
const koa = require('koa');
const app = new koa();
module.exports = app.listen(3000);
index.spec.js
const request = require('supertest');
const app = require('./index.js');
describe('User Registration', () => {
const agent = request.agent(app);
it('should ...', () => {
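For the Express app in the question, the equivalent change is to export the http.Server returned by app.listen; a sketch of /server/app.js:

// /server/app.js – export the http.Server, not the express app
var server = app.listen(3000, function () {
  console.log('App listening at http://%s:%s', server.address().address, server.address().port);
});
module.exports = server;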
This may also help, and it addresses @dman's point about changing application code to fit a test.
Make your request to localhost and the port as needed:
chai.request('http://localhost:5000')
instead of
chai.request(server)
This fixed the same error message I had using Koa JS (v2) and ava js.
The answers above correctly address the issue: supertest wants an http.Server to work on. However, calling app.listen() to get a server will also start a listening server; this is bad practice and unnecessary.
You can get around this by using http.createServer():
import * as http from 'http';
import * as supertest from 'supertest';
import * as test from 'tape';
import * as Koa from 'koa';

const app = new Koa();
// add some routes here
const apptest = supertest(http.createServer(app.callback()));

test('GET /healthcheck', (t) => {
  apptest.get('/healthcheck')
    .expect(200)
    .expect(res => {
      t.equal(res.text, 'Ok');
    })
    .end(t.end.bind(t));
});
Just in case someone uses Hapi.js: the issue still occurs there, because it does not use Express.js and thus the address() function does not exist.
TypeError: app.address is not a function
at serverAddress (node_modules/chai-http/lib/request.js:282:18)
The workaround to make it work
// this makes the server start up
let server = require('../../server')
// pass this instead of server to avoid the error
const API = 'http://localhost:3000'

describe('/GET token ', () => {
  it('JWT token', (done) => {
    chai.request(API)
      .get('/api/token?....')
      .end((err, res) => {
        res.should.have.status(200)
        res.body.should.be.a('object')
        res.body.should.have.property('token')
        done()
      })
  })
})
Export app at the end of the main API file like index.js.
module.exports = app;
We had the same issue when running mocha through ts-node in our Node + TypeScript serverless project.
Our tsconfig.json had "sourceMap": true, so the generated .js and .js.map files caused some funny transpiling issues (similar to this one) when the mocha runner went through ts-node. So I set the sourceMap flag to false and deleted all the .js and .js.map files in our src directory. Then the issue was gone.
If you have already generated files in your src folder, the commands below can be really helpful:

find src -name "*.js.map" -exec rm {} \;
find src -name "*.js" -exec rm {} \;
I am using Jest and Supertest, but was receiving the same error. It was because my server takes time to set up (it is async, to set up the db, read config, etc.). I needed to use Jest's beforeAll helper to allow the async setup to run. I also needed to refactor my server to separate listening, and instead use @Whyhankee's suggestion to create the test's server.
index.js
import express from "express";

export async function createServer() {
  // setup db, server, config, middleware
  return express();
}

async function startServer() {
  let app = await createServer();
  await app.listen({ port: 4000 });
  console.log("Server has started!");
}

if (process.env.NODE_ENV === "dev") startServer();
test.ts
import { createServer as createMyAppServer } from '@index';
import { test, expect, beforeAll } from '@jest/globals'
const supertest = require("supertest");
import * as http from 'http';

let request: any;

beforeAll(async () => {
  request = supertest(http.createServer(await createMyAppServer()));
});

test("fetch users", async (done: any) => {
  request
    .post("/graphql")
    .send({
      query: "{ getQueryFromGqlServer (id:1) { id} }",
    })
    .set("Accept", "application/json")
    .expect("Content-Type", /json/)
    .expect(200)
    .end(function (err: any, res: any) {
      if (err) return done(err);
      expect(res.body).toBeInstanceOf(Object);
      let serverErrors = JSON.parse(res.text)['errors'];
      expect(serverErrors.length).toEqual(0);
      expect(res.body.data.id).toEqual(1);
      done();
    });
});
Edit:
I also had errors when using data.forEach(async () => ...); I should have used for (const x of ...) in my tests.
