I have a piece of middleware that checks a user's authentication based on a passed parameter. The middleware uses a model that returns a promise to find the user, which is then set on the request params.
The problem is that when running tests, a failing assertion makes the test time out, presumably because the exception thrown by the failing assertion cannot be handled by Mocha.
I'm doing the assertions inside the next() function. When testing that the key was not provided, it works correctly, I assume because that path does not run inside the promise.
# Authenticator
var Authentication = function(model) {
    this.model = model;
};

Authentication.prototype.resolve = function(customerKey) {
    return this.model.authenticate(customerKey);
};

module.exports = function(model) {
    return function(req, res, next) {
        if (!req.query.hasOwnProperty('customerKey')) {
            throw new Error('There was no auth key provided');
        }
        var auth = new Authentication(model);
        auth.resolve(req.query.customerKey)
            .then(function(customer) {
                if (!customer) {
                    throw new Error('There was no customer found with the supplied auth key');
                }
                req.params.auth = customer;
            })
            .done(next, next);
    };
};
# Test
var should = require('chai').should(),
    authentication = require('src/api/middleware/authentication'),
    model = require('src/models/customer'),
    auth = authentication(model);

describe('middleware/authentication', function() {
    it('should set the user to the request if the customerKey is valid', function(done) {
        var request = {
            query: {
                customerKey: 'thisIsValid'
            },
            params: {}
        };
        var response = function() {
            // This is a no-op
        };
        var next = function(response) {
            response.should.be.instanceOf(String); // If this assertion fails, the test times out and doesn't work
            response.should.have.property('name');
            response.name.should.be.a('string');
            done();
        };
        // Actually calls the auth
        auth(request, response, next);
    });
});
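One common way to keep a failing assertion from stalling Mocha (a sketch, not from the original post) is to wrap the assertion code in a try/catch and hand any thrown error to done(). The checked() helper name below is hypothetical:

```javascript
// Sketch: a helper that forwards assertion failures to Mocha's done()
// instead of letting them disappear inside the promise chain.
function checked(done, assertions) {
    return function (value) {
        try {
            assertions(value);
            done();        // all assertions passed
        } catch (err) {
            done(err);     // report the failure so the test fails fast instead of timing out
        }
    };
}

// Inside the test, `next` would become something like:
// var next = checked(done, function (response) {
//     response.should.have.property('name');
// });
```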
Related
I'm trying to create unit tests for my auth middleware in an Express app.
The middleware is as simple as this:
const jwt = require('jsonwebtoken');
const auth = (req, res, next) => {
const tokenHeader = req.headers.auth;
if (!tokenHeader) {
return res.status(401).send({ error: 'No token provided.' });
}
try {
const decoded = jwt.verify(tokenHeader, process.env.JWT_SECRET);
if (decoded.id !== req.params.userId) {
return res.status(403).json({ error: 'Token belongs to another user.' });
}
return next();
} catch (err) {
return res.status(401).json({ error: 'Invalid token.' });
}
}
module.exports = auth;
And this is my test, where I want to ensure that if the token is ok everything will go smoothly and the middleware just calls next():
it('should call next when everything is ok', async () => {
    req.headers.auth = 'rgfh4hs6hfh54sg46';
    jest.mock('jsonwebtoken/verify', () => {
        return jest.fn(() => ({ id: 'rgfh4hs6hfh54sg46' }));
    });
    await auth(req, res, next);
    expect(next).toBeCalled();
});
But instead of returning an object with an id field as desired, the mock always returns undefined. I have tried returning the object directly instead of jest.fn(), but that didn't work either.
I know there are some similar threads here on Stack Overflow, but unfortunately none of the proposed solutions worked for me.
If more context is needed, here is my full test suite. Thanks in advance.
One way to solve this is to mock the jsonwebtoken module and then use mockReturnValue on the method to be mocked. Consider this example:
const jwt = require('jsonwebtoken');

jest.mock('jsonwebtoken');
jwt.verify.mockReturnValue({ id: 'rgfh4hs6hfh54sg46' });

it('should correctly mock jwt.verify', () => {
    expect(jwt.verify('some', 'token')).toStrictEqual({ id: 'rgfh4hs6hfh54sg46' });
});
My login component briefly displays before being removed by an error message about an undefined object in a promise.
Here is the promise definition:
static init(): Promise<any> {
    KeycloakClientService.auth.loggedIn = false;
    return new Promise((resolve, reject) => {
        const keycloakConfig = {
            url: environment.KEYCLOAK_URL,
            realm: environment.KEYCLOAK_REALM,
            clientId: environment.KEYCLOAK_CLIENTID,
            'ssl-required': 'external',
            'public-client': true
        };
        const keycloakAuth: any = new Keycloak(keycloakConfig);
        keycloakAuth.init({onLoad: 'check-sso'})
            .success(() => {
                KeycloakClientService.auth.loggedIn = true;
                KeycloakClientService.auth.authz = keycloakAuth;
                KeycloakClientService.auth.logoutUrl = environment.KEYCLOAK_URL
                    + '/realms/' + environment.KEYCLOAK_REALM + '/protocol/openid-connect/logout?redirect_uri='
                    + document.baseURI;
                console.log('=======>> The keycloak client has been initiated successfully');
                resolve('Succeeded in initiating the keycloak client');
            })
            .error(() => {
                reject('Failed to initiate the keycloak client');
            });
    });
}
It is called by:
KeycloakClientService.init()
    .then(() => {
        console.log('The keycloak client has been initialized');
    })
    .catch((error) => {
        console.log(error);
        window.location.reload();
    });
The console shows both messages:
The keycloak client has been initiated successfully
The keycloak client has been initialized
I'm using Angular 6.0.4 and tried following this blog
Is there any way to work around this error so as to keep my login form displayed?
UPDATE: I tried using an observable instead of a promise but the issue remained the same:
public init(): Observable<any> {
    KeycloakClientService.auth.loggedIn = false;
    return new Observable((observer) => {
        const keycloakConfig = {
            'url': environment.KEYCLOAK_URL,
            'realm': environment.KEYCLOAK_REALM,
            'clientId': environment.KEYCLOAK_CLIENTID,
            'ssl-required': 'external',
            'public-client': true
        };
        const keycloakAuth: any = new Keycloak(keycloakConfig);
        keycloakAuth.init({ 'onLoad': 'check-sso' })
            .success(() => {
                KeycloakClientService.auth.loggedIn = true;
                KeycloakClientService.auth.authz = keycloakAuth;
                KeycloakClientService.auth.logoutUrl = environment.KEYCLOAK_URL
                    + '/realms/' + environment.KEYCLOAK_REALM + '/protocol/openid-connect/logout?redirect_uri='
                    + document.baseURI;
                console.log('The keycloak auth has been initialized');
                observer.next('Succeeded in initiating the keycloak client');
                observer.complete();
            })
            .error(() => {
                console.log('The keycloak client could not be initiated');
                observer.error('Failed to initiate the keycloak client');
            });
    });
}
The whole source code is available on GitHub
UPDATE: Following an answer below, I also tried using then() and catch(), but the error remained exactly the same:
keycloakAuth.init({ 'onLoad': 'check-sso' })
    .then(() => {
        KeycloakClientService.auth.loggedIn = true;
        KeycloakClientService.auth.authz = keycloakAuth;
        KeycloakClientService.auth.logoutUrl = environment.KEYCLOAK_URL
            + '/realms/' + environment.KEYCLOAK_REALM + '/protocol/openid-connect/logout?redirect_uri='
            + document.baseURI;
        console.log('The keycloak auth has been initialized');
        observer.next('Succeeded in initiating the keycloak client');
        observer.complete();
    })
    .catch(() => {
        console.log('The keycloak client could not be initiated');
        observer.error('Failed to initiate the keycloak client');
    });
This is a wild guess, but maybe it's a conflict with Angular's zones. Since this is a security library, it might not like that Angular has replaced core functions with proxies. For example, NgZone modifies window.setTimeout and the HTTP methods.
So you could try running this code outside of zones. The only problem is that you're using a static function, so you will have to make this an injectable service in order to access NgZone:
@Injectable()
export class KeycloakClientService {

    public constructor(private zone: NgZone) {
    }

    public init(): Promise<any> {
        KeycloakClientService.auth.loggedIn = false;
        return new Promise((resolve, reject) => {
            this.zone.runOutsideAngular(() => {
                const keycloakConfig = {
                    url: environment.KEYCLOAK_URL,
                    realm: environment.KEYCLOAK_REALM,
                    clientId: environment.KEYCLOAK_CLIENTID,
                    'ssl-required': 'external',
                    'public-client': true
                };
                const keycloakAuth: any = new Keycloak(keycloakConfig);
                keycloakAuth.init({onLoad: 'check-sso'})
                    .success(() => {
                        KeycloakClientService.auth.loggedIn = true;
                        KeycloakClientService.auth.authz = keycloakAuth;
                        KeycloakClientService.auth.logoutUrl = environment.KEYCLOAK_URL
                            + '/realms/' + environment.KEYCLOAK_REALM + '/protocol/openid-connect/logout?redirect_uri='
                            + document.baseURI;
                        console.log('=======>> The keycloak client has been initiated successfully');
                        resolve('Succeeded in initiating the keycloak client');
                    })
                    .error(() => {
                        reject('Failed to initiate the keycloak client');
                    });
            });
        });
    }
}
The change here is to use zone.runOutsideAngular
If you remove the success block, where do you run your logic within success?
I read some of their source code; I think this is why success causes the problem.
Within keycloak.js, there is a function createNativePromise():
function createNativePromise() {
    var p = {
        setSuccess: function(result) {
            p.success = true;
            p.resolve(result);
        },
        setError: function(result) {
            p.success = false;
            p.reject(result);
        }
    };
    p.promise = new Promise(function(resolve, reject) {
        p.resolve = resolve;
        p.reject = reject;
    });
    p.promise.success = function(callback) {
        p.promise.then(callback);
        return p.promise;
    };
    p.promise.error = function(callback) {
        p.promise.catch(callback);
        return p.promise;
    };
    return p;
}
And it's used this way (simplified code):
function refreshToken() {
    var promise = createNativePromise();
    ...
    if (refreshTokenFailed) {
        promise.setError(true);
    }
    ...
    return promise.promise;
}
The problem is that promise.setError() calls promise.reject(result), so the promise is rejected and expects a catch.
But within promise.success there is a promise.promise.then(callback), and nobody catches this promise.
This is why you get Uncaught (in promise): [object Undefined]; in my case, I always get Uncaught (in promise): true.
Solution:
Notice that promise.promise is a real Promise, so we can use then and catch instead of success and error.
The drawback is that the TypeScript types will be wrong.
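As a sketch of the difference, here is a stripped-down stand-in for createNativePromise() (setSuccess/setError only), showing that attaching .catch() to the underlying real Promise handles the rejection that a bare .error() registration can leave dangling:

```javascript
// Simplified stand-in for keycloak.js's legacy promise wrapper.
function createNativePromise() {
    var p = {
        setSuccess: function (result) { p.resolve(result); },
        setError: function (result) { p.reject(result); }
    };
    p.promise = new Promise(function (resolve, reject) {
        p.resolve = resolve;
        p.reject = reject;
    });
    return p;
}

// Because p.promise is a real Promise, chaining .catch() means the
// rejection is considered handled -- no "Uncaught (in promise)" error.
var p = createNativePromise();
p.promise
    .then(function (v) { console.log('success:', v); })
    .catch(function (reason) { console.log('caught:', reason); });
p.setError('boom');
```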
We observed a similar error about the promise object being undefined. Our local application worked fine with a local Keycloak standalone server, but hit this error when the local application tried to connect to a Keycloak server hosted on the ABC server (ABC is used here as an arbitrary name).
The issue was resolved when we hosted both the application and the Keycloak server on the ABC server.
It looks like there are time sync issues between different machines, due to which the promise object is not returned.
Building my first Hapi API back-end, I ran into something odd. When I hit an endpoint the first time (GET /api/item/{name}), I can see in my console that the handler functions run (Mongo queries), and then the reply is sent (there's an on-response plug-in logging those). Great. If I hit the endpoint again with a different parameter, I see the response from the first call going out right away, and only then are the handler functions hit. And, in fact, the client gets the same response as from the first call.
I'm not even sure what would be most helpful to post here.
Here's most of the entry point js (missing config for environment and winston):
const Hapi = require('hapi');

server = new Hapi.Server();

var mongo_connect = 'mongodb://' + options.mongo_creds + options.mongo_host + ':' + options.mongo_port + '/' + options.mongo_db;

const dbOpts = {
    url: mongo_connect,
    settings: {
        poolSize: options.mongo_pool
    },
    decorate: true
};

server.connection({ port: options.server_port });

var routes = require('./routes');

if (options.env === "dev") {
    server.on('response', function (request) {
        winston.log('verbose', `[launch_api] ${request.info.remoteAddress}: ${request.method.toUpperCase()} ${request.url.path} --> ${request.response.statusCode}`);
    });
}

server.register({
    register: require('hapi-mongodb'),
    options: dbOpts
}, function(err) {
    if (err) {
        winston.log('error', "[launch_api] Unable to register db pool");
        throw err;
    }
    server.route(routes);
    server.start(function(err) {
        if (err) {
            throw err;
        }
        winston.log('info', `[launch_api] Server running at: ${server.info.port}`);
    });
});
Routes are pulled together in an index.js in the routes folder, but each file there looks something like:
'use strict';

var controller = require('../controllers/item-controller');

// Routes for Item
module.exports = [
    {
        method: 'POST',
        path: '/api/item',
        config: controller.create
    },
    {
        method: 'GET',
        path: '/api/items',
        config: controller.fetchAll
    },
    {
        method: 'GET',
        path: '/api/item/{name}',
        config: controller.find
    }
];
The controllers all look something like this (just showing the find function for brevity, since this is already long):
const Boom = require('boom');
const Joi = require('joi');

const Item = require('../models/item');

module.exports = {
    find: {
        handler: function(request, reply) {
            Item.initFromName(request.params.name).then(function(newItem) {
                if (newItem == null) {
                    reply(Boom.notFound());
                }
                else {
                    reply(newItem);
                }
            }, function(err) {
                reply(Boom.badImplementation());
            });
        }
    }
};
Lastly, the models tend to follow this, er, model (again, cutting out all the prototype extensions and just keeping the one class function used in this route):
const deferred = require('deferred')();
const winston = require('winston');

const collection_name = "items";

var Item = function() {
    this.name = "";
    this.description = "";
};

// private
function fillFromDB(obj, record) {
    obj._id = record._id;
    obj.name = record.name;
    obj.description = record.description;
}

// Constructor
module.exports.init = function() {
    return new Item();
};

module.exports.initFromName = function(name) {
    var item = new Item();
    const db = server.mongo.db;
    db.collection(collection_name).findOne({name: name}).then(function(opResult) {
        winston.log("debug", "Item.loadFromName opResult is: " + opResult);
        if (opResult != undefined) {
            winston.log("debug", "Item.loadFromName opResult json is: " + JSON.stringify(opResult));
            fillFromDB(item, opResult);
            deferred.resolve(item);
        }
        else {
            winston.log("debug", "Resolving with null");
            deferred.resolve();
        }
    }, function(err) {
        winston.log("error", "Item.loadFromName mongo error: " + err);
        deferred.reject();
    });
    return deferred.promise;
};
So, with all that: if I hit my endpoint with curl using a name that is not present in the collection, I get a 404 as expected. If I then hit it with a name that is present, I still get the 404.
This input:
$ curl -X GET http://192.168.99.100:3000/api/item/not_here
{"statusCode":404,"error":"Not Found"}
$ curl -X GET http://192.168.99.100:3000/api/item/here
{"statusCode":404,"error":"Not Found"}
produces this log:
debug: Item.loadFromName opResult is: null
debug: Resolving with null
verbose: [launch_api] 192.168.99.1: GET /api/item/not_here --> 404
verbose: [launch_api] 192.168.99.1: GET /api/item/here --> 404
debug: Item.loadFromName opResult is: [object Object]
debug: Item.loadFromName opResult json is: {"_id":"58b622908ea4d1cee2f46462","name":"here","description":"this item is here"}
Note that the opposite direction works too. If I stop and start node and then hit the endpoint with a name that is present, all subsequent calls get that same object returned. I just can't figure out where this caching is happening.
The problem was the global instantiation of the deferred object. Not sure where I saw that pattern, but it's a bad idea.
In the model, change the first line to just the require:
const deferred = require('deferred');
Then create your deferred object inside the function:
module.exports.initFromName = function(name) {
    const defer = deferred();
    var item = new Item();
    const db = server.mongo.db;
    db.collection(collection_name).findOne({name: name}).then(function(opResult) {
        winston.log("debug", "Item.loadFromName opResult is: " + opResult);
        if (opResult != undefined) {
            winston.log("debug", "Item.loadFromName opResult json is: " + JSON.stringify(opResult));
            fillFromDB(item, opResult);
            defer.resolve(item);
        }
        else {
            winston.log("debug", "Resolving with null");
            defer.resolve();
        }
    }, function(err) {
        winston.log("error", "Item.loadFromName mongo error: " + err);
        defer.reject();
    });
    return defer.promise;
};
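An alternative sketch (not from the original answer): since the Mongo driver's findOne() already returns a Promise, the deferred module can be dropped entirely and that Promise returned directly. The function and stub names below are hypothetical:

```javascript
// Sketch: return the driver's own Promise instead of building one with deferred.
function initFromName(db, name) {
    return db.collection('items').findOne({ name: name }).then(function (record) {
        if (record == null) {
            return null; // resolve with null when nothing was found
        }
        return { name: record.name, description: record.description };
    });
}

// Demo with a stubbed db object standing in for server.mongo.db:
var stubDb = {
    collection: function () {
        return {
            findOne: function () {
                return Promise.resolve({ name: 'here', description: 'this item is here' });
            }
        };
    }
};

initFromName(stubDb, 'here').then(function (item) {
    console.log(item.name); // prints "here"
});
```

Because a fresh Promise is created per call, there is no shared state to "cache" the first result.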
I probably don't understand something about JavaScript, but I'm having an issue writing the Purest response to the page body. Like here:
var koa = require('koa')
    , session = require('koa-session')
    , mount = require('koa-mount')
    , koaqs = require('koa-qs')
    , accesslog = require('koa-accesslog')
    , router = require('koa-router')()
    , app = koa();

var Grant = require('grant-koa')
    , grant = new Grant(require('./config.json'));

app.keys = ['grant'];
app.use(accesslog())
    .use(session(app))
    .use(mount(grant))
    .use(router.routes())
    .use(router.allowedMethods());
koaqs(app);

router.get('/handle_facebook_callback', function *(next) {
    getProfile(this.query.access_token);
});

var config = {
    "facebook": {
        "https://graph.facebook.com": {
            "__domain": {
                "auth": {
                    "auth": {"bearer": "[0]"}
                }
            },
            "{endpoint}": {
                "__path": {
                    "alias": "__default"
                }
            }
        }
    }
};

var request = require('request')
    , purest = require('purest')({request})
    , facebook = purest({provider: 'facebook', config});

function getProfile(access_token, responseToBody) {
    facebook.get('me')
        .auth(access_token)
        .request(function (err, res, body) {
            this.body = JSON.stringify(body, null, 2);
        });
}

if (!module.parent) app.listen(3000);
console.log('oh!GG is running on http://localhost:3000/');
I would assume the this.body = JSON.stringify(body, null, 2); line in the facebook.request callback should write the response into the body; however, it doesn't.
What exactly is the issue here?
The route (a generator) isn't waiting for getProfile to finish. You need yield.
Right now your snippet executes getProfile, which returns immediately to the generator; the generator finishes, Koa sees that you haven't set this.body, and it defaults to a 404 response.
Once the callback in getProfile finally fires at some later point, the response has already been sent and you get the error.
The general solution for getting a callback-style function to work with Koa (i.e. making it so you can yield it) is to wrap it in a Promise:
function getProfile (access_token) {
    return new Promise(function (resolve, reject) {
        facebook.get('me')
            .auth(access_token)
            .request(function (err, res, body) {
                if (err) return reject(err);
                resolve(body);
            });
    });
}

router.get('/handle_facebook_callback', function * (next) {
    const profile = yield getProfile(this.query.access_token);
    this.type = 'application/json';
    this.body = JSON.stringify(profile, null, 2);
});
getProfile now returns a Promise which you can yield.
Also, notice that I changed it so that getProfile resolves with the profile object, and the Koa handler is the one that stitches together this.body and the JSON.
This is generally how you want to do things in Koa so that all of your response mutation happens inside the handler in one place.
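This wrapping can be generalized: a small hand-rolled promisify (Node's built-in util.promisify does the same thing in newer versions) turns any error-first callback function into a Promise-returning one. A sketch, with a hypothetical readConfig function for the demo:

```javascript
// Sketch: generic wrapper for error-first callback-style functions.
function promisify(fn) {
    return function () {
        var args = Array.prototype.slice.call(arguments);
        return new Promise(function (resolve, reject) {
            fn.apply(null, args.concat(function (err, result) {
                if (err) return reject(err); // error-first: reject on error
                resolve(result);             // otherwise resolve with the result
            }));
        });
    };
}

// Usage with a callback-style function (hypothetical example):
function readConfig(name, cb) {
    setTimeout(function () { cb(null, { name: name }); }, 0);
}

var readConfigAsync = promisify(readConfig);
readConfigAsync('app').then(function (config) {
    console.log(config.name); // prints "app"
});
```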
I am new to Node.js and, coming from a PHP environment, I am trying to figure out how to work with multiple callbacks. I understand the basics of callbacks, and I think they make sense when writing modules. My problem is, when the time comes to use those modules, how to organize all the callbacks. Below is my implementation of a request-reset-password controller method (I am using SailsJS). This is a first draft, mainly to test a way of organizing callbacks. What do you think of this structure? Is there a better way to do it?
var _ = require('lodash');
var moment = require('moment');
var mailer = require("../../services/Mailer");
var crypto = require('../../services/crypto');
var forms = require("forms"),
    fields = forms.fields,
    validators = forms.validators;

module.exports = {

    // Request reset user password: submit form and send email
    request_process: function(req, res, next) {
        var form = createForm();
        form.handle(req, {
            // there is a request and the form is valid
            // form.data contains the submitted data
            success: function (form) {
                var user = null;
                var username = form.data.username;

                User.findOne({'username': username}, foundUser);

                function foundUser(err, _user) {
                    if (err)
                        return res.send(500, 'User not found');
                    user = _user;
                    if (user.isPasswordRequestNonExpired())
                        return res.view('user/resetting/passwordAlreadyRequested');
                    if (!user.passwordRequestToken)
                        user.passwordRequestToken = crypto.randomToken();
                    renderEmail(null, user);
                }

                function renderEmail(err, user) {
                    res.render('email/resetting_check_email', {'user': user}, sendEmail);
                }

                function sendEmail(err, template) {
                    if (err)
                        return res.send(500, "Problem with sending email");
                    Mailer.send(user, "Reset Password", template, sentEmail);
                }

                function sentEmail(err, response) {
                    if (err)
                        return res.send(500, "Error sending email");
                    user.passwordRequestedAt = moment().format();
                    user.save(finish);
                }

                function finish(err) {
                    if (err)
                        return res.send(500);
                    res.view();
                }
            },

            // the data in the request didn't validate,
            // calling form.toHTML() again will render the error messages
            error: function (form) {
                console.log("registration error", form);
                res.locals.form = form;
                return res.render('user/registration/register');
            },

            // there was no form data in the request
            empty: function (form) {
                console.log("registration empty", form);
                res.locals.form = form;
                return res.render('user/registration/register');
            }
        });
    },

    // Tell the user to check his email provider
    check_email: function(req, res, next) {
        // if empty req.params.email
        //     redirect request view
        // res.view('check_email.ejs')
    },

    // Reset user password
    reset: function(req, res, next) {
        // find userByPasswordToken
        // if !user
        //     res.view('invalid or expired "confirmation token".')
        // user.update password
        // res.view('reset.ejs');
    }
};
Node.js callback basics:
Most functions in Node and its libraries (called modules) are asynchronous (async) in nature.
These functions share a common signature, with the callback as the last argument: function(arguments..., callback).
The callback is just another JavaScript function (yes, in JavaScript, functions can be passed around as arguments to other functions). Typical Node.js callbacks have a signature with the first argument as the error, if one happened: callback(err, outputs...).
Example: the first argument is a string, the second an object (defined inline), and the last a function (defined inline).
doSomeWork('running', {doFast: true, repeat: 20}, function(err, result) {
    if (err) {
        console.log('ohnoes!');
    } else {
        console.log('all done : %s', result);
    }
});
which is equivalent to:
var type = 'running';
var options = {doFast: true, repeat: 20};
var callback = function(err, result) {
    if (err) {
        console.log('ohnoes!');
    } else {
        console.log('all done : %s', result);
    }
};
doSomeWork(type, options, callback);
So the basic contract here is: give a function its arguments and pass a callback to be invoked when it is done. The callback will be invoked somewhere in the future, when there is something to return: an error or the results.
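To make the contract concrete, here is a minimal error-first async function of our own (doSomeWork here is just an illustrative implementation of the name used above):

```javascript
// Sketch: an async function that honours the (err, result) callback contract.
function doSomeWork(type, options, callback) {
    setTimeout(function () {
        if (!type) {
            return callback(new Error('type is required')); // error first on failure
        }
        callback(null, type + ' done x' + options.repeat);  // null error on success
    }, 0);
}

doSomeWork('running', { doFast: true, repeat: 20 }, function (err, result) {
    if (err) {
        console.log('ohnoes!');
    } else {
        console.log('all done : %s', result); // prints "all done : running done x20"
    }
});
```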
Multiple nested callbacks are generally less readable and more complex:
function uploadAll(path, callback) {
    listFiles(path, function(err, files) {
        if (err) {
            callback(err);
        } else {
            var uploaded = [];
            var error;
            for (var i = 0; i < files.length; i++) {
                uploadFile(files[i], function(err, url) {
                    if (err) {
                        error = err;
                    } else {
                        uploaded.push(url);
                    }
                });
            }
            // Bug: this fires before any uploadFile callback has run --
            // exactly the kind of mistake this nested style invites.
            callback(error, uploaded);
        }
    });
}
But fortunately there are modules like async that help organize callbacks:
var async = require('async');

function uploadAll(path, callback) {
    async.waterfall([
        function(cb) {
            listFiles(path, cb);
        },
        function(files, cb) {
            async.map(files, uploadFile, cb);
        }
    ], callback);
}
Furthermore, there is the Promises pattern as well, and newer versions of Node support generators, which enable many new async patterns.
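For comparison, here is the uploadAll example rewritten with Promises (a sketch; listFiles and uploadFile are stubbed with hypothetical Promise-returning variants):

```javascript
// Stubs standing in for Promise-returning versions of the helpers:
function listFiles(path) {
    return Promise.resolve(['a.txt', 'b.txt']);
}
function uploadFile(file) {
    return Promise.resolve('http://cdn.example.com/' + file);
}

// The whole waterfall collapses into two chained steps:
function uploadAll(path) {
    return listFiles(path).then(function (files) {
        return Promise.all(files.map(uploadFile));
    });
}

uploadAll('/tmp').then(function (urls) {
    console.log(urls.length); // prints 2
});
```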
You can use async or q to manage the callback pyramids.