Hi, I'm trying to subscribe to AWS AppSync using JS, following this example: https://docs.aws.amazon.com/appsync/latest/devguide/building-a-client-app-javascript.html
When querying the user table, I get the results just fine, but I can't seem to subscribe to new results.
What I get is a 404 error:
https://some-iot-url.iot.eu-west-1.amazonaws.com/mqtt?X-Amz-Algorithm=..... 404 (Not Found)
with the following error:
errorCode:7
errorMessage:"AMQJS0007E Socket error:undefined."
invocationContext:undefined
The strange thing about the IoT URL is that it doesn't match the IoT URL in my IoT Core dashboard. Is this expected?
More info: I bundled the code with webpack (but I guess this has nothing to do with the error)
Here's my full code
config.js
Object.defineProperty(exports, "__esModule", { value: true });
var config = {
AWS_ACCESS_KEY_ID: '',
AWS_SECRET_ACCESS_KEY: '',
HOST: 'my-host.appsync-api.eu-west-1.amazonaws.com',
REGION: 'eu-west-1',
PATH: '/graphql',
ENDPOINT: '',
};
config.ENDPOINT = "https://" + config.HOST + config.PATH;
exports.default = config;
app.js
/**
* This shows how to use standard Apollo client on Node.js
*/
global.WebSocket = require('ws');
global.window = global.window || {
setTimeout: setTimeout,
clearTimeout: clearTimeout,
WebSocket: global.WebSocket,
ArrayBuffer: global.ArrayBuffer,
addEventListener: function () { },
navigator: { onLine: true }
};
global.localStorage = {
store: {},
getItem: function (key) {
return this.store[key]
},
setItem: function (key, value) {
this.store[key] = value
},
removeItem: function (key) {
delete this.store[key]
}
};
require('es6-promise').polyfill();
require('isomorphic-fetch');
// Require exports file with endpoint and auth info
const aws_exports = require('./aws-exports').default;
// Require AppSync module
const AUTH_TYPE = require('aws-appsync/lib/link/auth-link').AUTH_TYPE;
const AWSAppSyncClient = require('aws-appsync').default;
const url = aws_exports.ENDPOINT;
const region = aws_exports.REGION;
const type = AUTH_TYPE.API_KEY;
// If you want to use API key-based auth
const apiKey = 'my-api-key';
// If you want to use a jwtToken from Amazon Cognito identity:
const jwtToken = 'xxxxxxxx';
// // If you want to use AWS...
// const AWS = require('aws-sdk');
// AWS.config.update({
// region: aws_exports.REGION,
// credentials: new AWS.Credentials({
// accessKeyId: aws_exports.AWS_ACCESS_KEY_ID,
// secretAccessKey: aws_exports.AWS_SECRET_ACCESS_KEY
// })
// });
// const credentials = AWS.config.credentials;
// Import gql helper and craft a GraphQL query
const gql = require('graphql-tag');
const query = gql(`
query AllUser {
listUsers(first: 20) {
__typename
items{
id
userId
username
}
}
}`);
// Set up a subscription query
const subquery = gql(`
subscription NewUser {
subscribeToNewUsers {
__typename
id
userId
username
}
}`);
// Set up Apollo client
const client = new AWSAppSyncClient({
url: url,
region: region,
auth: {
type: type,
apiKey: apiKey,
}
});
client.hydrated().then(function (client) {
//Now run a query
console.log('querying')
client.query({ query: query })
.then(function logData(data) {
console.log('results of query: ', data);
})
.catch(console.error);
//Now subscribe to results
const observable = client.subscribe({ query: subquery });
console.log(observable)
const realtimeResults = function realtimeResults(data) {
console.log('realtime data: ', data);
};
observable.subscribe({
next: realtimeResults,
complete: console.log,
error: console.log,
});
});
Hey, sorry for the slow response. If you are running this in a browser context, can you try commenting out the following lines of code:
/*
global.WebSocket = require('ws');
global.window = global.window || {
setTimeout: setTimeout,
clearTimeout: clearTimeout,
WebSocket: global.WebSocket,
ArrayBuffer: global.ArrayBuffer,
addEventListener: function () { },
navigator: { onLine: true }
};
global.localStorage = {
store: {},
getItem: function (key) {
return this.store[key]
},
setItem: function (key, value) {
this.store[key] = value
},
removeItem: function (key) {
delete this.store[key]
}
};
require('isomorphic-fetch');
*/
// You should be able to keep this.
require('es6-promise').polyfill();
This sample has some workarounds to make it work nicely with Node, but it seems like there are a few issues in its current state. Let me know if that works. Thanks!
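For reference, a trimmed-down browser-only version of app.js might look roughly like this (just a sketch under the same assumptions as above: webpack does the bundling, API_KEY auth, and the aws-exports file shown earlier; disableOffline is an optional client flag used here only to keep the sketch small):
// Sketch of app.js for a browser bundle: no 'ws', no global shims, no isomorphic-fetch
const gql = require('graphql-tag');
const AWSAppSyncClient = require('aws-appsync').default;
const AUTH_TYPE = require('aws-appsync/lib/link/auth-link').AUTH_TYPE;
const aws_exports = require('./aws-exports').default;

const client = new AWSAppSyncClient({
  url: aws_exports.ENDPOINT,
  region: aws_exports.REGION,
  auth: { type: AUTH_TYPE.API_KEY, apiKey: 'my-api-key' },
  disableOffline: true // assumption: the offline cache is not needed for this test
});

const subquery = gql(`
  subscription NewUser {
    subscribeToNewUsers {
      __typename
      id
      userId
      username
    }
  }`);

client.hydrated().then(function (client) {
  const observable = client.subscribe({ query: subquery });
  observable.subscribe({
    next: function (data) { console.log('realtime data: ', data); },
    error: console.error
  });
});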
Related
Where am I going wrong here?
Using mocha, chai, sinon and proxyquire for an Express server and Sequelize ORM linked with a Postgres database.
I am trying to test a login controller route from my Express server.
Before I show the file which I want to run my test on, here is what the "../services/authService.js" file looks like:
../services/authService
const UserService = require("./userService");
module.exports = class AuthService extends UserService {
};
// so UserService will have the method findByEmail
// UserService class looks like this and it is coming from another file require("./userService.js") as stated above
/*
class UserService {
async findByEmail(email) {
try {
const user = await User.findOne({ where: { email: email }});
if (user) {
return user;
}
throw new Error("User not found");
} catch (err) {
err.code = 404;
throw err
}
}
}
*/
And here is the auth-controller.js file which I want to run the test on
auth-controller.js
const bcrypt = require('bcryptjs');
const AuthService = require("../services/authService"); // a class which extends another class, see the code above
const authService = new AuthService();
const jwtGenerator = require('../utils/jwtGenerator');
const createError = require("http-errors");
exports.loginRoute = async (req, res, next) => {
try {
req.body.password = String(req.body.password);
// db query trying to force a sinon.stub to resolve a fake value. But code wont pass here hence 500 error
const userQuery = await authService.findByEmail(req.body.email);
const compare = await bcrypt.compare(req.body.password, userQuery.password);
if (!compare) throw createError(401, 'Incorrect password.');
const user = {
id: userQuery.id, role: userQuery.is_admin ? "Administrator" : "User", email: userQuery.email, Authorized: true
}
const token = jwtGenerator(user);
return res
.cookie("access_token", token, {
httpOnly: true,
secure: process.env.NODE_ENV === "production",
}).status(200).json({ message: "Logged in successfully 😊 👌", user, token });
} catch (error) {
next(error);
}
}
This code works in production, but I cannot seem to test it. I used proxyquire to require the modules that the function uses. I have a big problem making proxyquire work with my class AuthService; here is my test file. Proxyquire is somehow not working with classes: it is not using AuthServiceMock at all, and I can't figure out why.
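(As a point of reference before the actual files, here is a minimal sketch of how proxyquire can swap a class export; the names mirror the ones in this question, and the key assumption is that because auth-controller.js calls new AuthService() at require time, a stub has to sit on the mock class's prototype rather than on a separate instance created in the test.)
// Hypothetical minimal example, separate from the real test file below
const proxyquire = require("proxyquire").noCallThru();
const sinon = require("sinon");

class AuthServiceMock {
  async findByEmail(email) {} // body replaced by the prototype stub below
}
// Stubbing the prototype means the instance the controller builds internally uses the stub
const findByEmailStub = sinon.stub(AuthServiceMock.prototype, "findByEmail").resolves({ id: 1 });

const controller = proxyquire("./controllers/authController.js", {
  "../services/authService": AuthServiceMock // same specifier the controller passes to require()
});
// After calling controller.loginRoute(req, res, next), findByEmailStub.called should be true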
First of these are my helper variables, which I will use in the test file:
../test-utils/user-helper
const createAccessToken = (payload) => jwt.sign(payload, TOKEN, {expiresIn: "1h"});
let loginDetail = {
email: "admin#test.com",
password: "123456"
};
let loginAdminUser = {
id: 1,
email: "admin#test.com",
password: "123456",
is_admin: true
}
const loginUser = {
id: 1,
email: "admin#test.com",
password: "123456",
is_admin: true
}
const adminUser = {
id: 1,
email: 'admin#test.com',
password: '123456',
is_admin: true,
first_name: 'john',
last_name: 'doe',
created_at: "2020-06-26T09:31:36.630Z",
updated_at: "2020-06-26T09:31:49.627Z"
}
module.exports = {
createAccessToken,
loginDetail,
loginAdminUser,
loginUser,
adminUser
}
And here is the test file. I placed comments, especially around proxyquire where I am trying to use it, as this is giving me some issues when it comes to using it with classes. It is also not calling the mocked/stubbed npm modules for some reason.
auth-controller.spec.js
"use strict";
const _ = require("lodash");
const path = require("path");
const proxyquire = require("proxyquire").noCallThru().noPreserveCache();
const chai = require("chai");
const { expect } = chai;
const sinon = require("sinon");
const sinonChai = require("sinon-chai");
chai.use(sinonChai);
// const AuthServiceOriginalClass = require("../../services/authService"); If i use this directly in proxyquire it calls the original class
const { createAccessToken, loginDetail, loginAdminUser, loginUser, adminUser } = require("../test-utils/user-helper");
const controllerPath = path.resolve('./controllers/authController.js');
describe("login route", () => {
let proxy, authService, bcryptStub, fakeCallback, fakeReq, fakeRes, fakeNext, resolveFn, token;
let result, bcryptStubbing, response;
class UserServiceMock {
async findByEmail(email) {
try {
if (email) {
return loginAdminUser;
}
} catch (error) {
throw error;
}
}
}
class AuthServiceMock extends UserServiceMock {};
bcryptStub = {
compare: function() { return true }
};
let tokeen = (kk) => {
return createAccessToken(kk);
}
// token = sinon.mock(createAccessToken(loginAdminUser)); // ?? which 1 to use?
token = sinon.spy(createAccessToken); // ?? which 1 to use?
// token = sinon.stub(createAccessToken) ?? which 1 to use?
proxy = proxyquire(controllerPath, {
"../services/authService.js": AuthServiceMock, // seems like this is not called at all
// "../services/authService.js": AuthServiceOriginalClass, // commented out if use this instead it calls the original class instant
"bcryptjs": bcryptStub,
"../utils/jwtGenerator": token,
// "#noCallThru": true // keep on or off?
});
before("Stub my methods", () => {
authService = new AuthServiceMock();
// If I call the entire loginRoute I want this stub authTry to be called inside of it and resolve that object value
authTry = sinon.stub(authService, "findByEmail").withArgs(loginDetail.email).resolves(loginAdminUser);
sinon.stub(bcryptStub, "compare").resolves(true); // force it to return true as that seems to be like the code of authController.js
// sinon.stub(token, "createAccessToken")
});
before("call the function loginRoute", async () => {
// fakeCallback = new Promise((res, rej) => {
// resolveFn = res
// });
fakeReq = {
body: {
email: loginDetail.email,
password: loginDetail.password
}
};
fakeRes = {
cookie: sinon.spy(),
status: sinon.spy(),
json: sinon.spy()
}
fakeNext = sinon.stub();
await proxy.loginRoute(fakeReq, fakeReq, fakeNext).then((_result) => {
result = _result;
});
console.log("result")
console.log(result) // undefined
console.log("result")
});
it("login route test if the stubs are called", async () => {
expect(authService.findByEmail).to.have.been.called // never called
// expect(bcryptStubbing).to.have.been.called // never called
// expect(response.status).to.deep.equal(200); // doesn't work
}).timeout(10000);
after(() => {
sinon.reset()
});
});
Where am I going wrong here in the test?
I am writing a Lambda function to add hosts to an SQS queue for a rolling restart. The code I have written works individually, but not together, even when I hard-code values in the constructor. This doesn't appear to be a memory/CPU issue. I tried running the function with 1 GB of memory, even though it only uses about 80 MB. The average execution time for the individual functions is about 0.5 seconds (it shouldn't take more than about 1.5 seconds to execute in total). I also tried running this function with a 30-second timeout, but it still timed out.
I work behind a corporate proxy and have to hand-jam the code. I don't have an IDE or IntelliSense on my internet-facing network. There may be typos here, but not in the actual code. I have omitted my module imports and variable declarations to save time; they aren't relevant to the issue at hand.
EDIT: I added the module imports and variable declarations to the first example to hopefully alleviate some confusion.
Here are just a few things I have tried. This does not work (timing out):
// Custom lambda layer
const { marklogic, aws } = require('nodejs-layer-lib');
const { HOSTS, DOMAIN, PORT, USERNAME, PASSWORD, RESTART_QUEUE_NAME } = process.env;
const params = [
'format=json'
];
const options = {
port: PORT,
params: params,
httpOptions: {
headers: {
'Authorization': `Basic ${Buffer.from(`${USERNAME}:${PASSWORD}`).toString('base64')}`
},
method: 'GET'
}
};
const taskServers = (HOSTS.split(',') || []).map(host => {
const _host = host.split(':');
return {
id: _host[0],
name: `http://${_host[1].toLowerCase()}.${DOMAIN}`
};
});
exports.handler = async () => {
let hosts, queueUrl, addToQueueResults;
try {
hosts = (await marklogic.hosts.getHosts(taskServers, options) || []);
} catch (e) { console.error('hosts', e); }
try {
queueUrl = await aws.sqs.getQueueUrlByName(RESTART_QUEUE_NAME);
} catch (e) { console.error('queueUrl ', e); }
try {
addToQueueResults = await aws.sqs.addMessages(queueURL, hosts);
} catch (e) { console.error('addToQueueResults ', e); }
return {
status: 200,
body: addToQueueResults
};
}
This does not work (timing out):
// Modules imports and variable declarations here...
exports.handler = async () => {
const hosts = (await marklogic.hosts.getHosts(taskServers, options) || []);
const queueUrl = await aws.sqs.getQueueUrlByName(RESTART_QUEUE_NAME);
const addToQueueResults = await aws.sqs.addMessages(queueURL, hosts);
return {
status: 200,
body: addToQueueResults
};
}
This does not work (timing out):
// Modules imports and variable declarations here...
exports.handler = async () => {
const hosts = (await marklogic.hosts.getHosts(taskServers, options) || []);
const queueUrl = await aws.sqs.getQueueUrlByName('my-queue-name');
const addToQueueResults = await aws.sqs.addMessages('http://queueurl.com', ['anything', 'in', 'here']); // Doesn't even need the queueUrl or hosts anymore
return {
status: 200,
body: addToQueueResults
};
}
This works. It will return the host objects I am expecting in the response:
// Modules imports and variable declarations here...
exports.handler = async () => {
const hosts = (await marklogic.hosts.getHosts(taskServers, options) || []);
return {
status: 200,
body: hosts
};
}
This works. It will get the queue url, then add messages to my SQS queue and return the SQS response:
// Modules imports and variable declarations here...
exports.handler = async () => {
const queueUrl = await aws.sqs.getQueueUrlByName(RESTART_QUEUE_NAME);
const addToQueueResults = await aws.sqs.addMessages(queueUrl , ['anything', 'in', 'here']);
return {
status: 200,
body: addToQueueResults
};
}
I tried implementing the async handler described in "AWS Lambda function handler in Node.js" and reviewed many AWS Lambda execution troubleshooting documents. The MarkLogic Management API runs on port 8002 by default, and I think the aws-sdk module uses HTTP/HTTPS (80/443), so I don't think the ports are getting tied up.
What am I missing here?
EDIT 2: This has something to do with how promises are handled by AWS Lambda. I cannot find much information about this. Even following the instructions in "AWS Lambda function handler in Node.js" for "Async Handlers", I cannot get this to work. It works perfectly fine locally, with or without my custom Lambda layer.
Node.js runtime: 12.x (I didn't mention this before)
This also doesn't work (timing out):
// Modules imports and variable declarations here...
exports.handler = async function (event) {
const promise = function () {
return new Promise(async function (resolve, reject) {
try {
const hosts = await marklogic.hosts.getHosts(taskServers, options) || [];
const queueUrl = await aws.sqs.getQueueUrlByName(RESTART_QUEUE_NAME);
const addToQueueResults = await aws.sqs.addMessages(queueUrl, hosts);
resolve({
status: 200,
body: addToQueueResults
});
} catch (error) {
reject({
status: 500,
error: error
});
}
});
};
return promise(); // Throws error without constructor despite the AWS doc example
}
Unless some AWS Lambda genius has run into a similar issue before with Node.js, I am just going to convert it into 2 Lambda functions and use Step Functions to process them.
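In the meantime, one way to narrow it down (a sketch reusing only names already shown above, with a hypothetical timed() helper) is to time each await so the CloudWatch log shows which call never settles:
// Modules imports and variable declarations here, same as above...
const timed = async (label, promise) => {
  console.log(label, 'start', new Date().toISOString());
  const result = await promise;
  console.log(label, 'end', new Date().toISOString());
  return result;
};

exports.handler = async () => {
  const hosts = (await timed('getHosts', marklogic.hosts.getHosts(taskServers, options))) || [];
  const queueUrl = await timed('getQueueUrl', aws.sqs.getQueueUrlByName(RESTART_QUEUE_NAME));
  const addToQueueResults = await timed('addMessages', aws.sqs.addMessages(queueUrl, hosts));
  return { status: 200, body: addToQueueResults };
};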
There was a typo in queueUrl — the addMessages call uses queueURL (I imagine it's not that, but worth a try!)
Please run:
// Custom lambda layer
const { marklogic, aws } = require('nodejs-layer-lib');
const { HOSTS, DOMAIN, PORT, USERNAME, PASSWORD, RESTART_QUEUE_NAME } = process.env;
const params = [
'format=json'
];
const options = {
port: PORT,
params,
httpOptions: {
headers: {
Authorization: `Basic ${Buffer.from(`${USERNAME}:${PASSWORD}`).toString('base64')}`
},
method: 'GET'
}
};
const taskServers = (HOSTS.split(',') || []).map(host => {
const _host = host.split(':');
return {
id: _host[0],
name: `http://${_host[1].toLowerCase()}.${DOMAIN}`
};
});
exports.handler = async () => {
let hosts, queueUrl, addToQueueResults;
try {
hosts = (await marklogic.hosts.getHosts(taskServers, options) || []);
} catch (e) { console.error('hosts', e); }
try {
queueUrl = await aws.sqs.getQueueUrlByName(RESTART_QUEUE_NAME);
} catch (e) { console.error('queueUrl ', e); }
try {
addToQueueResults = await aws.sqs.addMessages(queueUrl, hosts);
} catch (e) { console.error('addToQueueResults ', e); }
return {
status: 200,
body: JSON.stringify(addToQueueResults)
};
};
// keeping the same format.. ^^
If no luck, here is what is on my mind: since aws-sdk is included out of the box in Lambda, it's not customary to require it separately via a layer. Although it may not look to be imported at the top level by marklogic, it may be bundled deep within marklogic, and then when you import AWS and change its config (in the layer) it overwrites it.
Let's find out:
Step 1:
So, you say this should work if we ignore the AWS import and just import marklogic?
// Custom lambda layer
// const { marklogic, aws } = require('nodejs-layer-lib'); // ignoring AWS for now
const { marklogic } = require('nodejs-layer-lib');
const { HOSTS, DOMAIN, PORT, USERNAME, PASSWORD, RESTART_QUEUE_NAME } = process.env;
const params = [
'format=json'
];
const options = {
port: PORT,
params,
httpOptions: {
headers: {
Authorization: `Basic ${Buffer.from(`${USERNAME}:${PASSWORD}`).toString('base64')}`
},
method: 'GET'
}
};
const taskServers = (HOSTS.split(',') || []).map(host => {
const _host = host.split(':');
return {
id: _host[0],
name: `http://${_host[1].toLowerCase()}.${DOMAIN}`
};
});
exports.handler = async () => {
// let hosts, queueUrl, addToQueueResults;
let hosts;
try {
hosts = (await marklogic.hosts.getHosts(taskServers, options) || []);
console.log('hosts => ', hosts);
// queueUrl = await aws.sqs.getQueueUrlByName(RESTART_QUEUE_NAME);
// addToQueueResults = await aws.sqs.addMessages(queueUrl, hosts);
return {
status: 200,
body: JSON.stringify(hosts)
};
} catch (error) {
console.log('error => ', error);
throw error;
}
};
OK, so if that works:
Step 2 (please set the region for SQS and also hard-code the queueUrl):
// Custom lambda layer
// const { marklogic, aws } = require('nodejs-layer-lib');
const { marklogic } = require('nodejs-layer-lib');
const AWS = require('aws-sdk');
AWS.config.update({ region: 'eu-west-1' }); // Please set region accordingly
const sqs = new AWS.SQS({ apiVersion: '2012-11-05' });
const { HOSTS, DOMAIN, PORT, USERNAME, PASSWORD, RESTART_QUEUE_NAME } = process.env;
const params = [
'format=json'
];
const options = {
port: PORT,
params,
httpOptions: {
headers: {
Authorization: `Basic ${Buffer.from(`${USERNAME}:${PASSWORD}`).toString('base64')}`
},
method: 'GET'
}
};
const taskServers = (HOSTS.split(',') || []).map(host => {
const _host = host.split(':');
return {
id: _host[0],
name: `http://${_host[1].toLowerCase()}.${DOMAIN}`
};
});
exports.handler = async () => {
let hosts, addToQueueResults;
try {
hosts = (await marklogic.hosts.getHosts(taskServers, options) || []);
console.log('hosts => ', hosts);
const queueUrl = 'Please hard code the queueUrl for now';
const sqsParams = {
MessageBody: JSON.stringify(hosts), // MessageBody must be a string
QueueUrl: queueUrl
};
addToQueueResults = await sqs.sendMessage(sqsParams).promise();
console.log('addToQueueResults => ', addToQueueResults);
return {
status: 200,
body: JSON.stringify(addToQueueResults)
};
} catch (error) {
console.log('error => ', error);
throw error;
}
};
IF that doesn't work, then Step 3: move the require of marklogic to below the require of AWS and the region setting in this last example (so any deeply nested marklogic AWS logic we're unaware of now overwrites your AWS require), then re-run it. Fingers crossed :-)
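To make Step 3 concrete, the top of the file would just be reordered like this (a sketch; everything below the requires stays exactly as in Step 2):
// Step 3 sketch: aws-sdk first, region set, then the layer required last
const AWS = require('aws-sdk');
AWS.config.update({ region: 'eu-west-1' }); // Please set region accordingly
const sqs = new AWS.SQS({ apiVersion: '2012-11-05' });
const { marklogic } = require('nodejs-layer-lib'); // now loaded after the AWS config update
// ...rest identical to Step 2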
I currently use the Twilio Node Helper Library to make various API calls, whether to create assistants/services, list them, or remove them, plus various other things when it comes to uploading tasks, samples, and fields to create a chatbot on Twilio Autopilot.
Examples of some of these functions include:
async function createAssistant(name, client){
var assistantUid = ""
await client.autopilot.assistants
.create({
friendlyName: name,
uniqueName: name
})
.then(assistant => assistantUid = assistant.sid);
return assistantUid
}
async function getAccount(client){
var valid = true
try {
await client.api.accounts.list()
} catch (e) {
valid = false
}
return valid
}
async function connectToTwilio(twilioAccountSid, twilioAuthToken) {
try{
var client = twilio(twilioAccountSid, twilioAuthToken);
} catch (e){
throw new TwilioRequestError(e)
}
var valid = await getAccount(client)
if(valid && client._httpClient.lastResponse.statusCode === 200){
return client
} else{
throw new Error("Invalid Twilio Credentials")
}
}
where client is the client object returned from require("twilio")(twilioAccountSid, twilioAuthToken).
I was wondering what would be the best way of mocking this API to allow me to emulate creating assistants, returning their uniqueNames, etc.
I was wondering whether I might just define some class like:
class TwilioTestClient{
constructor(sid, token){
this.sid = sid
this.token = token
this.assistants = new TwilioAssistant()
this.services = new TwilioServices()
}
}
where TwilioAssistant and TwilioServices will be additional classes.
Any help would be greatly appreciated!
I struggled with mocking Twilio for a long time. In fact I previously architected my application such that I could mock a wrapper around the Twilio Node Helper just to avoid mocking the actual library. But recent changes to the architecture meant that was no longer an option. This morning I finally got a mock of the Twilio Node Helper Library working. I'm not familiar with the portions of the Twilio library you are using, but I'm hopeful the example here will help you.
We have a function to check if a phone number is mobile, call it isMobile.js.
const Twilio = require("twilio");
const isMobile = async (num) => {
const TwilioClient = new Twilio(process.env.TWILIO_SID, process.env.TWILIO_AUTH_TOKEN);
try {
const twilioResponse = await TwilioClient.lookups.v1
.phoneNumbers(num)
.fetch({ type: "carrier", mobile_country_code: "carrier" });
const { carrier: { type } = {} } = twilioResponse;
return type === "mobile";
} catch (e) {
return false;
}
};
module.exports = isMobile;
Then build a mock for Twilio in __mocks__/twilio.js
const mockLookupClient = {
v1: { phoneNumbers: () => ({ fetch: jest.fn(() => {}) }) }
};
module.exports = class Twilio {
constructor(sid, token) {
this.lookups = mockLookupClient;
}
};
In the test file isMobile.test.js
jest.mock("twilio");
const Twilio = require("twilio");
const isMobile = require("./isMobile");
const mockFetch = jest.fn();
const mockPhoneNumbers = jest.fn(() => ({
fetch: mockFetch
}));
describe("isMobile", () => {
const TwilioClient = new Twilio(process.env.TWILIO_SID, process.env.TWILIO_AUTH_TOKEN);
const lookupClient = TwilioClient.lookups.v1;
lookupClient.phoneNumbers = mockPhoneNumbers;
beforeEach(() => {
mockFetch.mockReset();
});
test("is a function", () => {
expect(typeof isMobile).toBe("function");
});
test("returns true for valid mobile number", async () => {
const validMobile = "+14037007492";
mockFetch.mockReturnValueOnce({
carrier: { type: "mobile", mobile_country_code: 302 }, // eslint-disable-line camelcase
phoneNumber: validMobile
});
expect(await isMobile(validMobile)).toBe(true);
});
test("returns false for non-mobile number", async () => {
const invalidMobile = "+14035470770";
mockFetch.mockReturnValueOnce({
carrier: { type: "not-mobile", mobile_country_code: null }, // eslint-disable-line camelcase
phoneNumber: invalidMobile
});
expect(await isMobile(invalidMobile)).toBe(false);
});
});
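For the Autopilot calls in the question, the same manual-mock pattern might look roughly like this (a sketch assuming Jest as above; it only mirrors the calls createAssistant and getAccount actually make, and the SID value is a placeholder):
// __mocks__/twilio.js -- hypothetical sketch for the Autopilot/account calls in the question
const mockAssistantCreate = jest.fn(opts =>
  Promise.resolve({ sid: 'UAxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', friendlyName: opts.friendlyName, uniqueName: opts.uniqueName })
);
const mockAccountsList = jest.fn(() => Promise.resolve([]));

// The question builds the client with require("twilio")(sid, token), so export a plain function
module.exports = function twilio(accountSid, authToken) {
  return {
    autopilot: { assistants: { create: mockAssistantCreate } },
    api: { accounts: { list: mockAccountsList } }
  };
};
module.exports.mockAssistantCreate = mockAssistantCreate;
module.exports.mockAccountsList = mockAccountsList;
In a test, jest.mock("twilio") plus require("twilio").mockAssistantCreate then lets you assert on the arguments passed and on the returned sid without touching the real API.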
I'm setting up logging in an app with winston, and occasionally when I run tests a separate file is created with the date 12/31/1969. Is there something explicit I need to put in the creation of the transport so that it knows what the current date is?
What's very interesting is that this seems to be a system-wide anomaly: the _log method, which uses the moment.js library rather than the new Date() syntax, also ends up with 12-31-1969 inside the log file.
My logger:
class Logger{
constructor(configs){
if (!configs) configs = {};
this.logDirectory = configs.directory ? path.join(__dirname, configs.directory) : path.join(__dirname, '../logs') ;
this.initialize();
this.date = moment().format("YYYY-MM-DD HH:mm:ss");
}
initialize() {
this._createTransportObj();
this._createLoggerObj();
}
_createTransportObj() {
const DailyRotateFile = winston.transports.DailyRotateFile;
this._transport = new DailyRotateFile({
filename: path.join(this.logDirectory, '/log-%DATE%.log'),
datePattern: 'YYYY-MM-DD',
level: 'info'
});
}
_createLoggerObj() {
this._logger = winston.createLogger({
level: 'info',
format: winston.format.json(),
transports: [this._transport],
exitOnError: true
});
if (nodeEnv !== 'production') {
this._logger.add(new winston.transports.Console({
format: winston.format.simple()
}));
}
}
_log(type, msg, options) {
const logMsg = {};
const timestamp = moment().format("YYYY-MM-DD HH:mm:ss");
logMsg.level = type || 'info';
logMsg.time = timestamp;
logMsg.msg = msg || '';
logMsg.desc = options.description || '';
// get the user that made the request if available
if (options.user) logMsg.user = options.user;
// get the url endpoint that was hit if available
if (options.url) logMsg.url = options.url;
// if an error is sent through, get the stack
// and remove it from msg for readability
if (msg.stack) {
logMsg.stack = msg.stack;
msg = msg.message ? msg.message : msg;
}
// get the ip address of the caller if available
if (options.ip) logMsg.ip = options.ip;
// get the body of the request if available
if (options.body) logMsg.body = options.body;
// get the query string of the request if available
if (options.query) logMsg.query = options.query;
// get the params string of the request if available
if (options.params) logMsg.params = options.params;
const jsonString = JSON.stringify(logMsg);
this._logger.log(type, logMsg);
}
info(msg, options) {
return this._log('info', msg, options);
}
error(msg, options) {
return this._log('error', msg, options);
}
warn(msg, options) {
return this._log('warn', msg, options);
}
verbose(msg, options) {
return this._log('verbose', msg, options);
}
debug(msg, options) {
return this._log('debug', msg, options);
}
silly(msg, options) {
return this._log('silly', msg, options);
}
}
module.exports = { Logger };
I'm currently only testing it in a promise handler that my routes flow through:
const asyncHandler = fn => (req, res, next) => {
const logger = new Logger();
Promise.resolve(fn(req, res, next))
.then(result => {
if (req.body.password) delete req.body.password;
logger.info(result,
{ user: req.user.username,
url: req.originalUrl,
body: req.body,
description: '200:OK Response sent back successfully'
});
return res.status(200).json({ result })
})
.catch(e => {
console.log(e);
return res.status(400).json({ error: e.message })
});
};
module.exports = asyncHandler;
UPDATE:
OK, so it seems not to be the logger itself. I ran a batch of tests and noticed it's always the same route that triggers the date change. What's weird is I can't seem to figure out what's happening.
The route is:
and the app.use() statement is as follows:
finally, the admin_access middleware is simple enough:
I've figured out that if I break in the endpoint in the app.js file before it hits admin_access, the date is correct. However, if I break in admin_access, the date is 12-31-1969. So what could be happening between the two? Is there something I could be setting unintentionally on this route?
Figured it out. Turns out the package sinon-test was changing the system time.
I had my test setup like this:
const sinon = require('sinon');
const sinonTest = require('sinon-test');
const test = sinonTest(sinon);
describe('Test suite for route: /video/admin/create', ()=>{
let agent;
beforeEach(async function() {
// create temporary admin
admin.permissionId = 1;
await new NewUser().createUser(admin);
//login as admin
agent = chai.request.agent(server);
await agent
.post('/auth')
.send({ username: admin.username, password: admin.password });
});
afterEach(async function() {
// destroy temp admin
await new UserManager(admin).hardRemove();
});
it('should fail to create a video for not including video info', test(async function(){
const newVideo = { title: 'Test Video' };
const response = await agent.post('/video/admin/create')
.send(newVideo);
expect(response.status).to.equal(400);
}));
it('should create a video', test(async function(){
const newVideo = {
title: 'Test Video',
description: 'This is a description',
embed: 'https://youtu.be/SKbHjjZXdmc',
source: 'youtube',
type: 'lecture',
category: 'physics'
};
const response = await agent.post('/video/admin/create')
.send(newVideo);
// validations
expect(response.status).to.equal(200);
const { result } = response.body;
expect(result.title).to.equal(newVideo.title);
expect(result.description).to.equal(newVideo.description);
expect(result.embed).to.equal(newVideo.embed);
}));
});
When I unwrapped the test from the test() function, everything worked correctly.
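If you still want sinon-test's automatic sandbox cleanup, its config argument can (as far as I recall its API) keep real timers, which avoids the clock being pinned to the Unix epoch (epoch 0 renders as 12/31/1969 in timezones behind UTC):
const sinon = require('sinon');
const sinonTest = require('sinon-test');
// Second argument is the sandbox config; leaving timers real avoids the frozen epoch date
// while keeping automatic restore of stubs/spies after each wrapped test.
const test = sinonTest(sinon, { useFakeTimers: false });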
I am writing a test case for an API where I pass params and the main file's method makes an HTTP request to get the data, so core() is basically calling the API and getting the response. This is just backend code that we have to test using Jasmine.
First, how do I see the response in the test case and check that it matches the success object defined in the test?
What is the correct way to write the test in the scenario below?
balance.spec.ts
import { GetAccountBalance } from './GetAccountBalance.node'
import { Promise } from 'es6-promise'
import { HttpRequest } from '../../../core/http/HttpRequest.node'
import * as sinon from 'sinon'
import { getValidationError } from '../../../common/ValidationErrorFactory'
// for easy mocking and cleanup
const sandbox = sinon.createSandbox()
afterAll(function afterTests () {
sandbox.restore()
})
describe('getAccountBalance', function () {
// the module under test
const module = new GetAccountBalance()
const testURL = 'https://essdd.max.com:4535'
const urlPath = '/webServices/instrument/accountBalance'
const fullURL = testURL + urlPath
const options = { isJSON: true }
let result
const stubbedEC = sandbox.spy(getValidationError)
const stubbedHttp = sandbox.createStubInstance(HttpRequest)
const success = {
header: {
serviceName: 'accountBalance',
statusCode: '0000',
statusDesc: 'SUCCESS',
},
response: {
balanceAccount: '6346.44',
},
}
const params = { Id: '21544', appName: 'instrucn', channelName: 'Web' }
describe('core() is called with correct params', function () {
beforeEach(function () {
result = module.core(params, stubbedHttp)
})
it('it should return the response from server', function () {
// Invoke the unit being tested as necessary
result.then((data) => {
expect(data).toBe(success)
})
})
})
})
GetAccountBalance.node.ts
public core(args: IAccountBalanceParam, httpRequest: HttpRequestBase): Promise<any> {
console.log('enter core');
const DBPLurl: string = this.constructDBPLurl(args);
const DBPLrequest: IAccountBalanceDBPLParam = this.constructDBPLrequest(args);
return Promise.resolve(httpRequest.makeRequest(DBPLurl, DBPLrequest, {isJSON: true}));
}
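A sketch of how the spec could be adjusted (same names as above; it assumes makeRequest is a prototype method on HttpRequest, so createStubInstance replaces it with a programmable stub): have the stubbed client resolve the canned success object, then await the promise so the expectation actually runs before the spec finishes.
describe('core() is called with correct params', function () {
  beforeEach(function () {
    // createStubInstance replaces each prototype method with a stub we can program
    stubbedHttp.makeRequest.resolves(success)
    result = module.core(params, stubbedHttp)
  })
  it('should return the response from the server', async function () {
    const data = await result // await (or return the promise) so Jasmine waits for it
    expect(data).toEqual(success) // toEqual does a deep comparison; toBe checks identity
    expect(stubbedHttp.makeRequest.calledOnce).toBe(true)
  })
})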