Unit Testing with ExpressJS

I am trying to unit test my ExpressJS routes. It looks something like this
server.js
var boards = require('./routes/BoardsRoute.js');
app.get('/api/v1/boards/:id', boards.getBoard);
BoardsRoute.js
exports.getBoard = function(req, res) {
  BoardModel.find({
    name: 'Kanban Board'
  }, function(err, columns) {
    if (!err) {
      return res.send(columns);
    } else {
      return console.log(err);
    }
  });
};
I would like to mock out BoardModel, since that is the call to the Mongoose model (i.e. the database call). I assume that unit tests should not be making calls to a database and should not need a running server.
Should I be testing getBoard completely separately from the server.js app.get() call?
(These app.get requests will be covered by integration/e2e tests, since they are HTTP requests.)
All the documentation and frameworks I can find seem to require a running Express server in order to unit test the route and this particular exports.getBoard.
Things that I have tried:
Use Sinon.js to mock a fakeServer so that I could test the HTTP request and the getBoard method.
Use SuperAgent and SuperTest to make the requests out to a server. (I am uncomfortable with this, as unit tests should not have a server running.)
I am trying to use Mocha to test these routes.

I would test that the route is wired up correctly first, then test that the call to the route does what you expect.
To test the route here's one way you could do it. Create a routes module that just wires up the routes:
var boards = require('./routes/BoardsRoute.js');

var routes = {};

routes.create = function(app) {
  app.get('/api/v1/boards/:id', boards.getBoard);
};

module.exports = routes;
You can then mock the app object and check the get method gets called correctly. (I tend to use Sinon.js for this.) The create method will need to be called from your app module:
var app = express();
... // set up express
routes.create(app);
Now you can focus on testing getBoard on its own like any other module.
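For example, a rough Mocha/Sinon sketch of both tests might look like this (the module paths and the separate BoardModel module are assumptions, not taken from the question; stubbing BoardModel.find works because require returns the same cached module instance that the route handler uses):
var sinon = require('sinon');
var assert = require('assert');
var routes = require('../routes');                 // the wiring module above (assumed path)
var boards = require('../routes/BoardsRoute');     // exports getBoard
var BoardModel = require('../models/BoardModel');  // assumed path; same instance the route uses

describe('routes.create', function () {
  it('wires up the boards route', function () {
    var app = { get: sinon.spy() };                // mocked Express app
    routes.create(app);
    assert(app.get.calledWith('/api/v1/boards/:id', boards.getBoard));
  });
});

describe('getBoard', function () {
  afterEach(function () {
    BoardModel.find.restore();                     // undo the stub after each test
  });

  it('sends the columns returned by the model, with no database involved', function () {
    var fakeColumns = [{ name: 'Kanban Board' }];
    sinon.stub(BoardModel, 'find').yields(null, fakeColumns);

    var req = { params: { id: '1' } };
    var res = { send: sinon.spy() };

    boards.getBoard(req, res);

    assert(res.send.calledWith(fakeColumns));
  });
});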

Related

Express.js - prepend sub-route to all defined routes

Let's say I have an Express app defined in a file, say server.js like this:
const express = require('express');
const app = express();

app.use('/foo', foo);
app.use('/bar', bar);

module.exports = app;
I import this Express app in another file, say index.js:
const app = require('./server');
const port = process.env.PORT || 3000;

const listen = (port) => {
  app.listen(port, () => {
    console.log(`Backend listening on port ${port}!`);
  });
};

listen(port);
Now, the routes that are available for this app are /foo and /bar.
Is there a way to change the configuration in the index.js file so that the routes become /api/foo and /api/bar, without touching the server.js file?
Use case:
I have a Nuxt.js app with a backend that is loaded into the Nuxt app via the serverMiddleware property in nuxt.config.js, like this:
serverMiddleware: [
...
{ path: '/api', handler: '~/server.js' },
],
This has an effect similar to what I described above: it imports the Express app from server.js and prepends all its routes with /api.
However, often I don't want to develop the frontend part of the Nuxt app, I just want to make changes on the backend. For this purpose I have a helper file like index.js above, which runs the backend only. (The frontend often takes a long time to compile, which is why I don't want to compile it when I don't need to.)
This creates a problem: all the routes are slightly different, because they lack the /api at the beginning. The routes are used in different tools like Postman etc., and suddenly they wouldn't work.
My current solution is to define the index.js file in the same way as the server.js file, with all routes defined the way I want them: instead of app.use('/foo', foo); there's app.use('/api/foo', foo); etc. But this has its own problems, e.g. if I change server.js I also have to change index.js. I am looking for something more elegant.
According to the Express 4.x docs (https://expressjs.com/en/4x/api.html#app.use) you can use an application instance the same way you would use a router. In short, mount the export of your server.js as middleware at the path where you want to insert it, instead of calling .listen() on it directly.
Here is some demo code that worked for me:
const express = require('express');

const app_inner = express();
app_inner.use('/foo', (req, res) => res.send('foo'));

const app_outer = express();
app_outer.use('/foo2', app_inner);
app_outer.listen(9999);
// web browser at localhost:9999/foo2/foo returns 'foo' as expected
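Applied to the question, index.js could mount the unchanged server.js export under /api. A sketch along those lines (mine, not tested against the Nuxt setup):
// index.js
const express = require('express');
const app = require('./server');
const port = process.env.PORT || 3000;

const outer = express();
outer.use('/api', app); // routes are now /api/foo and /api/bar

outer.listen(port, () => {
  console.log(`Backend listening on port ${port}!`);
});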

How to get functions from express routes for unit testing?

I have a JS controller with some routes exported. I'd like to get at the functions used in the routes for unit testing.
Note: I've seen a lot of blogs advocating making real HTTP requests to unit test Express routes. That's not exactly unit testing in my book; it is more integration testing.
I want to call the functions directly in my tests, thus leaving (most of) the framework out of the test.
For instance:
controller.js:
var express = require('express');
var router = express.Router();
function foo(req, res) { ... }
function bar(req, res) { ... }
router.route("/api/foo").get(foo);
router.route("/api/bar").post(bodyParser, bar);
module.exports = router;
In my unit test, I'd like to call foo(req, res) directly (I'll make some mock req and res objects).
The easiest way to do what you want is to also add each function to module.exports. That way you can require the controller in your test and call the functions directly.
// controller.js
module.exports = router;
module.exports._spec = {
  foo: foo,
  bar: bar
};
// controllerSpec.js
var app = require('./controller');
app._spec.foo(mockReq, mockRes);
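For example, a minimal Mocha-style test with hand-rolled mocks could look like this (the request/response shapes are assumptions about what foo needs, not taken from the question):
// controllerSpec.js
var assert = require('assert');
var app = require('./controller');

describe('foo', function () {
  it('sends a response', function (done) {
    var mockReq = { params: {}, query: {} };
    var mockRes = {
      send: function (body) {
        assert.ok(body); // assert on whatever foo() is expected to send
        done();
      }
    };
    app._spec.foo(mockReq, mockRes);
  });
});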

Achieve best performance when passing variables through different node.js modules/files

In a node application it's common that you start loading some modules like this:
var express = require('express')
, app = express()
, mongodb = require('mongodb')
, http = require('http');
Now, let's say you have a routes.js file which looks like this:
module.exports = function(app) {
  app.get('/', function(req, res) {
    res.render('homepage');
  });
  // other routes
};
So when you call that routes.js file, you pass the app var:
require('./routes')(app);
My question is: what is happening there? Is there any resource cost when passing the app variable to the routes module? I know that Node.js caches modules, but I would like to know how that affects variables passed between them, and whether the following approach is efficient:
Let's start loading some modules, but let's do in a different way:
var _ = {};
_.express = require('express');
_.app = _.express();
_.mongodb = require('mongodb');
_.http = require('http');
Routes.js:
module.exports = function(_) {
  _.app.get('/', function(req, res) {
    res.render('homepage');
  });
  // other routes
};
Call to routes.js:
require('./routes')(_);
Obviously the _ variable will be a large one, but it will include anything you may need in any module. So I wonder whether the size of the passed variable affects performance, in which case it would simply be wasteful to pass more data than needed.
I am trying to achieve the best possible performance in my applications while keeping the code simple to write, so any advice that may help with this, or any explanation of how this works behind the scenes in Node.js, will be appreciated.
Thanks in advance.
See here.
tl;dr: objects are passed by reference, not copied by value, so you are not expanding your memory consumption when you pass the same object to multiple modules.
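A tiny illustration of the point (my own, not from the answer):
// Passing an object passes a reference; the underlying data is never copied.
var big = { data: new Array(1e6).fill(0) };

function useIt(obj) {
  return obj === big; // true: same object, nothing was duplicated
}

console.log(useIt(big)); // prints: true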

Ember CLI with http-mock and pretender, how to manage mock data

I'm new to JavaScript in general, especially EmberJS and Ember CLI. I'm trying to figure out the best practice for managing mock data within an Ember CLI based project. I'm on Ember CLI 0.1.14 now.
According to the Ember CLI documentation, http-mock is the preferred way of providing mock data for Ember Data models. So I used the generated http-mock and added some test data to it.
var testData = [
  ...
];

module.exports = function(app) {
  var express = require('express');
  var todosRouter = express.Router();

  todosRouter.get('/', function(req, res) {
    res.send({
      "todos": testData
    });
  });

  app.use('/api/todos', todosRouter);
};
Then I found out that http-mock does not work during integration tests, so I added Pretender.
import Ember from 'ember';
import { test } from 'ember-qunit';
import Pretender from 'pretender';
import startApp from '../helpers/start-app';

var App;
var server;
var testData = [
  ...
];

module('An Integration test', {
  setup: function() {
    App = startApp();
    server = new Pretender(function() {
      this.get('/api/todos', function(request) {
        return [200, {"Content-Type": "application/json"},
                JSON.stringify({'todos': testData})];
      });
    });
  },
  teardown: function() {
    ...
  }
});
test('3 items loaded at startup', function() {
...
});
Both the http-mock and the integration test work fine. Then I tried to put the test data in a separate module so that it can be shared. The problem is that the integration test uses ES6-style module imports, while http-mock uses CommonJS-style module definitions, and I don't know how to combine the two in the same project.
If I construct the data into an ES6 module,
var mockData = [
...
];
export default {
  all: mockData
};
it works with the integration test, but http-mock complains about the unexpected reserved word "export". If I convert it to a CommonJS-style module.exports, then I can't see the data in the integration test.
Now the questions:
I think that if Broccoli compiled the ES6 module into CommonJS format it would work with http-mock, but I have no idea how to do that. Is this the right direction to go, and if so, how?
Do I have to use both http-mock and Pretender in the same project, or can I use one of them for both development and the integration tests?
I'm stuck here and comments, suggestions and code samples are really appreciated.
I'm not sure, but I think this Ember CLI addon, called ember-cli-mirage, might do what you're looking for. I haven't used it myself yet, but I have heard good things about it, and it seems to take care of the case where you want to use a mock server for development and testing without keeping two sets of fixtures.
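For reference, a rough sketch of what a single shared mock might look like with ember-cli-mirage (the exact API depends on the addon version, and the data below is illustrative):
// mirage/config.js
export default function() {
  this.namespace = '/api';

  this.get('/todos', function() {
    // one place for the mock data, used in both development and tests
    return { todos: [ /* shared test data */ ] };
  });
}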

Node.js, Express and Dependency Injection

I'm in the early stages of a Node.js project, and I'm looking to improve the overall app organization. In the past I worked with Symfony2 (PHP) and now I code a lot in Angular, both of which rely heavily on DI. So I really like the idea of applying the same principles in my Node.js project.
I know packages like Rewire exist, but for now I'd like to try the DI approach. The issue is how to strike a balance between the lightweight feeling of working with Node and the solidity of a well-tested, dependency-injected app (I know that being well tested is what gives the solidity ;-)).
Node modules
One of the issues is how to manage external modules: what to do if some object needs the fs module? As Vojta Jina (from AngularJS) states in this article:
So the best way that works for me right now is something like this: Modules are stateless. They only contain definitions of classes/functions/constants.
So, I suppose that I would have to inject everything:
function Foo(fs) {
  this.fs = fs;
}

Foo.prototype.doSomething = function () {
  // this.fs...
};

module.exports = Foo;
Somewhere:
var fs = require('fs');
var Foo = require('./Foo');
var foo = new Foo(fs);
foo.doSomething();
Express
Since Express uses apply() to call the handlers, the context is lost and we can't use this. So we're left with these:
// foo.js
function Foo(fs) {
  this.fs = fs;
}

Foo.prototype.index = function () {
  var self = this;
  return function (req, res, next) {
    // self.fs...
  };
};

module.exports = Foo;

// bar.js
module.exports.index = function (fs) {
  return function (req, res, next) {
    // fs...
  };
};

// app.js
var express = require('express');
var fs = require('fs');

var app = express();

var Foo = require('./foo');
var foo = new Foo(fs);
var bar = require('./bar');

app.get('/foo', foo.index());
app.get('/bar', bar.index(fs));
So...
Has anyone taken this approach? What about the use of DI frameworks (like di.js)? And how do you keep the experience lean? All ideas are welcome. Thanks!
You have some good thoughts to which I'd like to add:
Having stateless modules will help you to scale your app horizontally. If all state is in a database it will be easy to run multiple node.js instances in parallel.
I also prefer to inject everything. Otherwise the time will come when I want to write a unit test and it gets hard, because I have hardcoded (not injected) dependencies that I can't mock.
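A quick sketch of that payoff (my own illustration, reusing the Foo class from the question): with fs injected, a test can hand Foo a hand-rolled fake instead of the real module.
var assert = require('assert');
var Foo = require('./Foo');

// A fake fs exposing only the methods Foo would need (hypothetical here,
// since doSomething's body is elided in the question).
var fakeFs = {
  readFileSync: function () { return 'fake contents'; }
};

var foo = new Foo(fakeFs);
foo.doSomething(); // runs against fakeFs, never the real file system
assert.strictEqual(foo.fs, fakeFs);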
To keep this lightweight feeling when working with Node you need an approach to dependency injection that doesn't add too much complexity. Your Express example above reminds me of a talk by Vojta Jina in which he makes an important point about the wiring part of dependency injection. (Watch minute 3:35 to 8:05.) I can't explain it any better than Vojta does in his talk, but basically he says that we need a DI framework that takes care of the wiring (what is injected into what). Otherwise the code we write by hand to set up the wiring won't be maintainable. Also, each unit test would need such wiring code. And that, IMO, is where manual dependency injection stops being an option.
When you use a DI framework (or a DI container, as many people say), the basic idea is that each individual module states which dependencies it requires and under which id it can be requested by other modules. The DI framework is then invoked to initialize the module that serves as the entry point (e.g. app.js), and it looks up all dependencies and takes over the hard work of injecting the appropriate module instances.
There are many DI frameworks for Node.js, to which I'd like to add my own: "Fire Up!" If you used it, your example would look like this:
foo.js
// Fire me up!
function Foo(fs) {
  this.fs = fs;
}

Foo.prototype.index = function () {
  var self = this;
  return function (req, res, next) {
    // self.fs...
  };
};

module.exports = {
  implements: 'foo',
  inject: ['require(fs)'],
  _constructor: Foo
};
bar.js
// Fire me up!
module.exports = {
  implements: 'bar',
  inject: ['require(fs)'],
  factory: function (fs) {
    return {
      index: function (req, res, next) {
        // fs...
      }
    };
  }
};
app.js
// Fire me up!
module.exports = {
  implements: 'app',
  inject: ['require(express)', 'foo', 'bar']
};

module.exports.factory = function (express, foo, bar) {
  var app = express();
  app.get('/foo', foo.index());
  app.get('/bar', bar.index);
};
index.js
var fireUpLib = require('fire-up');
var fireUp = fireUpLib.newInjector({
  basePath: __dirname,
  modules: ['./lib/**/*.js'] // foo.js, bar.js, app.js are on this path
});

fireUp('app'); // This is where the injection is kicked off.
When running node index.js you get the following output:
fireUp# INFO Requested: app, implemented in: lib/app.js
fireUp# INFO |-- Requested: require(express)
fireUp# INFO |-- Requested: foo, implemented in: lib/foo.js
fireUp# INFO |-- Requested: require(fs)
fireUp# INFO |-- Requested: bar, implemented in: lib/bar.js
fireUp# INFO |-- Requested: require(fs)
If this looks worth trying out, you might be interested in the Getting Started section, which shows an example based on Express.
Hope that helps!
You can check https://www.npmjs.com/package/plus.container
It is close to the dependency injection container (DIC) in Symfony.
