How to get functions from express routes for unit testing? - javascript

I have a JS controller with some routes exported. I'd like to get at the functions used in the routes for unit testing.
Note I've seen a lot of blogs advocating creating real HTTP requests to unit test express routes. That's not exactly unit testing in my book; it's closer to integration testing.
I want to call the functions directly in my tests, thus leaving (most of) the framework out of the test.
For instance:
controller.js:
var express = require('express');
var router = express.Router();
function foo(req, res) { ... }
function bar(req, res) { ... }
router.route("/api/foo").get(foo);
router.route("/api/bar").post(bodyParser, bar);
module.exports = router;
In my unit test, I'd like to call foo(req, res) directly (I'll make some mock req and res objects)

The easiest way to do what you want is to add each function to module.exports. That way you can require the controller into your test and call the functions directly.
// controller.js
module.exports = router;
module.exports._spec = {
foo: foo,
bar: bar
};
// controllerSpec.js
var app = require('./controller');
app._spec.foo(mockReq, mockRes);
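To show the idea end to end, here is a minimal sketch of calling an exported handler directly with hand-rolled mocks. The `foo` body and the `status`/`json` mock surface are illustrative, not taken from the original controller:

```javascript
// Stand-in for a handler exported via module.exports._spec
function foo(req, res) {
  res.status(200).json({ id: req.params.id });
}

// Bare-bones mocks: just enough surface for the handler under test
var mockReq = { params: { id: '42' } };
var mockRes = {
  statusCode: null,
  body: null,
  status: function (code) { this.statusCode = code; return this; },
  json: function (payload) { this.body = payload; return this; }
};

// Call the handler directly, no server, no framework
foo(mockReq, mockRes);
```

After the call, the mock objects record what the handler did, and plain assertions in your test framework of choice can inspect them.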

Related

Node.js factory function for axios redis dependency injection

I have a node express app that makes calls to an external API using axios, with redis for caching. It works fine, but I ran into a number of problems with unit tests, specifically when trying to mock or inject mocks for the redis client and the axios instance.
My approach to making the route testable was to create a factory function for the route. It works, but I'm unsure whether there might be adverse side effects, or whether I'm using a bad practice and missing a standard solution.
In api.js I require the details-route, passing in the axiosInstance and redisClient instance
// api.js
const detailsRouter = require('./details-route')(axiosInstance, redisClient);
router.use('/details', detailsRouter );
module.exports = router;
//details-route.js
const express = require('express');
const router = express.Router();
const fp = require('lodash/fp');
// local modules
const constants = require('../constants');
const axiosUtil = require('../axios-util');
const getRouter = (axiosInstance, redisClient) => {
  router.get('/details', (req, res) => {
    redisClient.get(req.originalUrl, (err, reply) => {
      if (reply) {
        // etc
      }
      // etc
    });
  });
  return router;
};
module.exports = getRouter;
Note: I'm using redis-mock and axios-mock-adapter in my unit tests, and I've looked into using rewire (but it's not workable for wrapping the internal redis client)
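The same injection idea can be exercised in a test without the framework at all: hand plain stub objects to the factory and assert on what the handler sends. This sketch uses a hypothetical `makeDetailsHandler` factory returning a bare handler rather than an express Router, so it runs standalone:

```javascript
// Hypothetical factory mirroring getRouter's dependency injection,
// but returning a plain (req, res) handler for easy direct testing.
function makeDetailsHandler(axiosInstance, redisClient) {
  return function (req, res) {
    redisClient.get(req.originalUrl, function (err, reply) {
      if (reply) {
        res.send(JSON.parse(reply)); // cache hit
        return;
      }
      // cache miss: fall through to the external API
      axiosInstance.get(req.originalUrl).then(function (apiRes) {
        res.send(apiRes.data);
      });
    });
  };
}

// In a test, plain objects stand in for the real clients:
const fakeRedis = { get: function (key, cb) { cb(null, '{"cached":true}'); } };
const fakeAxios = { get: function () { return Promise.resolve({ data: {} }); } };

let sent;
makeDetailsHandler(fakeAxios, fakeRedis)(
  { originalUrl: '/details/1' },
  { send: function (body) { sent = body; } }
);
```

Because the fake redis client invokes its callback synchronously, `sent` holds the cached payload immediately after the call, with no redis-mock or axios-mock-adapter involved.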

Achieve best performance when passing variables through different node.js modules/files

In a node application it's common that you start loading some modules like this:
var express = require('express')
, app = express()
, mongodb = require('mongodb')
, http = require('http');
Now, let's say you have a routes.js file which looks like this:
module.exports = function(app) {
app.get('/', function(req, res) {
res.render('homepage');
});
// other routes
};
So when you call that routes.js file, you pass the app var:
require('./routes')(app);
My question is: what is happening there? Is there resource consumption when passing the app variable to the routes module? I know that node.js caches modules, but I would like to know how that affects variables passed between them and I wonder if the following approach is efficient:
Let's start loading some modules, but let's do in a different way:
var _ = {};
_.express = require('express');
_.app = _.express();
_.mongodb = require('mongodb');
_.http = require('http');
Routes.js:
module.exports = function(_) {
_.app.get('/', function(req, res) {
res.render('homepage');
});
// other routes
};
Call to routes.js:
require('./routes')(_);
Obviously the _ variable will be a large one, but it will include anything you may need in any module. So I wonder if the size of the passed variable affects performance, in which case it would be just stupid to pass more data than needed.
I want to achieve the best possible performance in my applications while keeping the code simple, so any advice, or any explanation of how this works behind the scenes in node.js, will be appreciated.
Thanks in advance.
See here.
tl;dr: objects are passed by reference, not by value. So you are not expanding your memory consumption when you pass the same object multiple times to multiple modules.
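This is easy to demonstrate: the callee receives the same object, not a copy, so even a large object costs nothing extra to pass around (the variable names below are illustrative):

```javascript
// A "large" dependency container, as in the question
var deps = { app: { name: 'myApp' }, big: new Array(1e6) };

// A module function receiving the container
function useDeps(d) {
  // Same object identity: no copy of `big` was made
  return d.app === deps.app;
}

var sameObject = useDeps(deps);
```

Strictly speaking, JavaScript passes the *reference* by value, but the practical upshot is the one in the answer: no duplication of the object's memory.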

Get access to Express mountpath from inside a module

In an attempt to build a truly modular express app (one that can run stand-alone or as part of another app) I need to find the cleanest way to discover the mount path from inside the sub-app. As a short example, let's say there are two files: main.js and subapp.js
main.js
var express = require('express');
var app = express();
var localApp = express();
var subapp = require('./subapp');
app.use('/foo', subapp);
app.use('/bar', localApp);
console.log(localApp.mountpath); // yes, this prints '/bar' as expected
...
subapp.js
var express = require('express');
var app = express();
var truePath = app.mountpath; // I wish this would point to '/foo', but instead points to '/'
...
module.exports = app;
What is the best way (as in cleanest) to find the mountpath from inside the module? I'm doing this trying to solve this problem: Access to mountpath variable from inside a template using express in a non hardwired way.
As shown in the example, I already tried app.mountpath without success.
As answered by alsotang, this is actually a problem of execution sequence, but it can be solved in what I think is a clean way. There is an event that is fired after the module is mounted, so you can do:
var truePath = "/";
app.on('mount', function (parent) {
truePath = app.mountpath;
});
Where in real life truePath could be app.locals.truePath, so it can be accessed from inside the views.
Eh... it's a problem of execution sequence.
In your subapp.js, the app.mountpath statement runs before module.exports = app.
But only after you export the app does it get mounted, and only then is the mountpath property set.
So you should read mountpath after the app has been mounted by the outer express app.
I have two suggestions:
Set the mount path in your subapp.js and have the outer express app read that property.
If you think 1 is not truly modular, you can instead define a config file and have both main.js and subapp.js read the mount path from it.
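Suggestion 2 could look something like this sketch, shown inline in one file for brevity; in a real project the config object would live in its own module (the `config` shape here is hypothetical):

```javascript
// config.js (hypothetical) -- the single source of truth for mount paths
var config = { mounts: { subapp: '/foo' } };

// main.js would do:
//   app.use(config.mounts.subapp, subapp);

// subapp.js would do:
var truePath = config.mounts.subapp; // known before mounting, no event needed
```

The trade-off is that the mount path is now fixed in configuration rather than discovered at mount time, which may or may not fit the "truly modular" goal.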
Try req.baseUrl:
app.js
var express = require('express');
var app = express();
var foo = require('./foo');
app.use('/foo', foo);
app.listen(3000);
foo.js
var express = require('express');
var router = express.Router();
router.get('/', function(req, res) {
console.log(req.baseUrl); // '/foo'
res.send('foo');
});
module.exports = router;

Node.js, Express and Dependency Injection

I'm in the early stages of a node.js project, and I'm looking to improve the overall app organization. In the past I worked with Symfony2 (PHP) and now I code a lot in Angular, both of which rely heavily on DI. So, I really like the idea of applying the same principles in my node.js project.
I know about packages like Rewire, but for now I'd like to try the DI approach. The issue is how to strike a balance between the lightweight feel of working with node and the solidity of a well-tested, dependency-injected app (I know that well tested is what gives the solidity ;-)).
Node modules
One of the issues, would be how to manage the external modules, what to do if some object needs fs module? As Vojta Jina (from AngularJS) states in this article:
So the best way that works for me right now is something like this: Modules are stateless. They only contain definitions of classes/functions/constants.
So, I suppose that I would have to inject everything:
function Foo(fs) {
this.fs = fs;
}
Foo.prototype.doSomething = function () {
// this.fs...
};
module.exports = Foo;
Somewhere:
var fs = require('fs');
var Foo = require('./Foo');
var foo = new Foo(fs);
foo.doSomething();
Express
Since Express uses apply() to call the handlers, the context is lost and we can't use this. So we're left with these options:
// foo.js
function Foo(fs) {
this.fs = fs;
}
Foo.prototype.index = function () {
var self = this;
return function (req, res, next) {
// self.fs...
};
};
module.exports = Foo;
// bar.js
module.exports.index = function (fs) {
return function (req, res, next) {
// fs...
};
};
// app.js
var express = require('express');
var fs = require('fs');
var app = express();
var Foo = require('./foo');
var foo = new Foo(fs);
var bar = require('./bar');
app.get('/foo', foo.index());
app.get('/bar', bar.index(fs));
So...
Has someone taken this approach? What about the use of DI frameworks? (like di.js) And how to keep the experience lean? All ideas are welcome. Thanks!
You have some good thoughts to which I'd like to add:
Having stateless modules will help you to scale your app horizontally. If all state is in a database it will be easy to run multiple node.js instances in parallel.
I also prefer to inject everything. Otherwise the time will come when I want to write a unit test and it gets hard because I have hardcoded (not injected) dependencies I can't mock.
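That testability difference is concrete: an injected dependency can be swapped for a stub, a hardcoded require() cannot (without tools like Rewire). A small sketch, with `ConfigReader` as a made-up example class:

```javascript
// The fs module is injected, not require()'d inside the class,
// so a test can hand in a fake.
function ConfigReader(fs) {
  this.fs = fs;
}
ConfigReader.prototype.read = function (path) {
  return JSON.parse(this.fs.readFileSync(path, 'utf8'));
};

// Unit test: no real file system is touched.
var fakeFs = {
  readFileSync: function () { return '{"port": 3000}'; }
};
var reader = new ConfigReader(fakeFs);
var config = reader.read('/etc/app.json'); // path never hits disk
```

Had the class done `var fs = require('fs')` internally, the test would need to intercept module loading instead of simply passing an object.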
To keep this lightweight feeling when working with node you need an approach to dependency injection that doesn't add too much complexity. Your express example above reminds me of a talk by Vojta Jina in which he makes an important point about the wiring part of dependency injection. (Watch minute 3:35 to 8:05.) I can't explain it any better than Vojta does in his talk, but basically he says that we need a DI framework that takes care of the wiring (what is injected into what). Otherwise the code we write manually to set up the wiring won't be maintainable, and each unit test would need such wiring code too. And that, IMO, is where manual dependency injection stops being an option.
When you use a DI framework (or a DI container, as many people say) the basic idea is that each individual module states which dependencies it requires and under which id other modules can request it. Then the DI framework can be invoked to initialize the module that serves as an entry point (e.g. app.js), and the framework will look up all dependencies and take over the hard work of injecting the appropriate module instances.
There are many di frameworks for node.js to which I'd like to add my own: "Fire Up!" If you would use it your example would look like this:
foo.js
// Fire me up!
function Foo(fs) {
this.fs = fs;
}
Foo.prototype.index = function () {
var self = this;
return function (req, res, next) {
// self.fs...
};
};
module.exports = {
implements: 'foo',
inject: ['require(fs)'],
_constructor: Foo
};
bar.js
// Fire me up!
module.exports = {
implements: 'bar',
inject: ['require(fs)'],
factory: function (fs) {
return {
index: function (req, res, next) {
// fs...
}
};
}
};
app.js
// Fire me up!
module.exports = {
implements: 'app',
inject: ['require(express)', 'foo', 'bar']
};
module.exports.factory = function (express, foo, bar) {
var app = express();
app.get('/foo', foo.index());
app.get('/bar', bar.index);
return app;
};
index.js
var fireUpLib = require('fire-up');
var fireUp = fireUpLib.newInjector({
basePath: __dirname,
modules: ['./lib/**/*.js'] // foo.js, bar.js, app.js are on this path
});
fireUp('app'); // This is where the injection is kicked off.
When running node index.js you get the following output:
fireUp# INFO Requested: app, implemented in: lib/app.js
fireUp# INFO |-- Requested: require(express)
fireUp# INFO |-- Requested: foo, implemented in: lib/foo.js
fireUp# INFO |-- Requested: require(fs)
fireUp# INFO |-- Requested: bar, implemented in: lib/bar.js
fireUp# INFO |-- Requested: require(fs)
If this looks worth trying out you might be interested in the Getting Started section which shows an example based on express.
Hope that helps!
You can check https://www.npmjs.com/package/plus.container
This is close to DIC in Symfony

Unit Testing with ExpressJS

I am trying to unit test my ExpressJS routes. It looks something like this
server.js
var boards = require('./routes/BoardsRoute.js');
app.get('/api/v1/boards/:id', boards.getBoard);
BoardsRoute.js
exports.getBoard = function(req, res) {
BoardModel.find({
name: 'Kanban Board'
}, function(err, columns) {
if (!err) {
return res.send(columns);
} else {
return console.log(err);
}
});
};
I would like to mock out BoardModel, since it is the call to the Mongoose model (aka the database call), and I assume that unit tests should not make calls to a database or require a running server.
Should I be testing getBoard completely separately from the server.js app.get() call?
(These app.get requests will be covered by integration/e2e tests, as they are HTTP requests.)
All the documentation and frameworks I can find seem to require a running Express server in order to unit test the route and this particular exports.getBoard.
Things that I have tried to do,
Use Sinon.js to mock a fake server so that I could test the HTTP request and the getBoard method.
Use SuperAgent and SuperTest to make requests out to a server. (I am uncomfortable with this, as unit tests should not have a server running.)
I am trying to use Mocha to test these routes.
I would test that the route is wired up correctly first, then test that the call to the route does what you expect.
To test the route here's one way you could do it. Create a routes module that just wires up the routes:
var boards = require('./routes/BoardsRoute.js');

var routes = {};
routes.create = function(app) {
  app.get('/api/v1/boards/:id', boards.getBoard);
};
module.exports = routes;
You can then mock the app object and check the get method gets called correctly. (I tend to use Sinon.js for this.) The create method will need to be called from your app module:
var app = express();
... // set up express
routes.create(app);
Now you can focus on testing getBoard on its own like any other module.
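Putting the wiring test together: the answer suggests Sinon.js, but even a hand-rolled spy object shows the shape of it. Everything below (the stub `boards` module, the recorded `calls` array) is illustrative:

```javascript
// Stand-in for the real BoardsRoute module
var boards = { getBoard: function () {} };

// The routes module from the answer
var routes = {};
routes.create = function (app) {
  app.get('/api/v1/boards/:id', boards.getBoard);
};

// A fake app that records every call to .get()
var calls = [];
var fakeApp = {
  get: function (path, handler) {
    calls.push({ path: path, handler: handler });
  }
};

routes.create(fakeApp);
```

The test then asserts that exactly one route was registered, at the expected path, with the expected handler; with Sinon.js the equivalent would be `sinon.assert.calledWith(fakeApp.get, '/api/v1/boards/:id', boards.getBoard)` on a stubbed `get`.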
