I am trying to use the Discord.js library in an Express RESTful API. I'm wondering how I should share the client between controllers, because the client is initialized asynchronously and apparently it is bad practice to call client.login(...) multiple times. In other words, I have an asynchronous initialization method that I cannot call multiple times and I need to access this client across multiple controllers. Here's what I'm doing right now:
discord.helper.js
const Discord = require('discord.js');
const client = new Discord.Client();
client.login(process.env.DISCORD_BOT_TOKEN);
module.exports = client;
My issue is that because client.login() is asynchronous and can only be called once, I cannot import this file and simply assume the bot has already been initialized. Any ideas on how I should structure this module so that I can import it in multiple places and rely on the client being initialized?
Client#login is asynchronous, but it does not resolve with a Client instance; it resolves with the bot's token (see the Discord.js documentation).
You can safely assume the client is available as long as it was able to log in; however, if possible I would make your Express server accept an instance of the client instead.
import { createServer } from "./server"
import { Client } from "discord.js"
const client = new Client()
const app = createServer(client)
client.login(process.env.DISCORD_BOT_TOKEN)
app.listen(3000, () => {
console.log("Express server is listening on port 3000")
});
Example of createServer
import express from "express"
export const createServer = client => {
const app = express()
app.get("/", (_, res) => {
// note: client.user is null until the client has logged in and emitted "ready"
res.send(`${client.user.username} says hello`)
})
return app
}
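If you need a hard guarantee that the client is ready before the server accepts traffic, you could also start listening only after the ready event fires. A minimal sketch, assuming the same createServer from above (the port and log message are illustrative):
import { createServer } from "./server"
import { Client } from "discord.js"

const client = new Client()
const app = createServer(client)

// Only start accepting HTTP traffic once the bot has fully logged in
client.once("ready", () => {
  app.listen(3000, () => {
    console.log(`${client.user.tag} is ready and listening on port 3000`)
  })
})

client.login(process.env.DISCORD_BOT_TOKEN)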
Related
I've been playing around with setting up a basic atlassian-connect-express (ACE) application, and have modified the starter code provided by the ACE package to be suitable for serverless deployment. One of the problems I faced after doing this is that routing is now divided into stages, e.g. /dev and /prod. I did a bit of research and found that one way to deal with this is to use an Express Router and mount it at the appropriate endpoint for the stage being deployed to. The problem I then faced is that the authentication middleware provided by ACE appears to be application-level and therefore can't be used by each router.
Typically the routes were added to the express app like this:
import ace from 'atlassian-connect-express';
import express from 'express';
import routes from './routes';
const app = express();
const addon = ace(app);
app.use(addon.middleware());
routes(app, addon);
and in ./routes/index.js
export default function routes(app, addon) {
// Redirect root path to /atlassian-connect.json,
// which will be served by atlassian-connect-express.
app.get('/', (req, res) => {
res.redirect('/atlassian-connect.json');
});
// This is an example route used by "generalPages" module (see atlassian-connect.json).
// Verify that the incoming request is authenticated with Atlassian Connect.
app.get('/hello-world', addon.authenticate(), (req, res) => {
// Rendering a template is easy; the render method takes two params:
// name of template and a json object to pass the context in.
res.render('hello-world', {
title: 'Atlassian Connect'
});
});
// Add additional route handlers here...
}
I've changed ./routes/index.js to work as a router object and export that; however, this leaves me unable to use the addon.authenticate() middleware:
import ace from 'atlassian-connect-express';
import express from 'express';
import routes from './routes';
const app = express();
const addon = ace(app);
app.use('/dev', routes);
and in ./routes/index.js
const express = require('express');
const router = express.Router();
// Redirect root path to /atlassian-connect.json,
// which will be served by atlassian-connect-express.
router.get('/', (req, res) => {
res.redirect('/atlassian-connect.json');
});
// This is an example route used by "generalPages" module (see atlassian-connect.json).
// Verify that the incoming request is authenticated with Atlassian Connect.
// NOTE: addon is not defined anywhere in this file -- this is the problem
router.get('/hello-world', addon.authenticate(), (req, res) => {
// Rendering a template is easy; the render method takes two params:
// name of template and a json object to pass the context in.
res.render('hello-world', {
title: 'Atlassian Connect'
});
});
module.exports = router;
Since the router obviously has no knowledge of addon, it cannot use that authentication middleware.
Is it possible to pass that middleware through to the router when attaching it to the application? If not, is there another way I can handle URL prefixes without using a router?
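One approach that should work here (a sketch, using the same factory pattern that appears later on this page): have ./routes/index.js export a function that receives addon and returns a configured router, so the middleware is in scope when the routes are registered.
// ./routes/index.js -- exports a factory instead of a bare router
const express = require('express');

module.exports = function routes(addon) {
  const router = express.Router();

  router.get('/', (req, res) => {
    res.redirect('/atlassian-connect.json');
  });

  // addon is now in scope, so its middleware can be used per-route
  router.get('/hello-world', addon.authenticate(), (req, res) => {
    res.render('hello-world', { title: 'Atlassian Connect' });
  });

  return router;
};
It would then be mounted after the addon exists: app.use('/dev', routes(addon));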
I am new to testing. I am using Jest to test my Node.js APIs. When I write all the tests in one file they run properly without any error, but when I separate them into multiple files I get a "port is already in use" error, since Jest runs each test file in its own Node instance.
This is what I have in both of the files I'm using to test:
const supertest = require('supertest');
const app = require('../index');
describe('API Testing for APIs', () => {
it('Healthcheck endpoint', async () => {
const response = await supertest(app).get('/healthcheck');
expect(response.status).toBe(200);
expect(response.body.status).toBe('ok');
});
});
How can I separate my tests into different files to organise them in a better way? Is there a recommended way to organise test files?
P.S. Please also suggest best practices for writing Node.js API tests.
When you use Express with Jest and Supertest, you need to split your Express application definition and the app.listen() call into two different files. Supertest doesn't run on any port; it simulates the HTTP request/response cycle directly against your Express application. It'll be something like this:
File: app.js
const express = require('express');
const app = express();
app.get('/healthcheck', (req, res) => {
res.json({msg: 'Hello!'});
});
module.exports = app;
File: index.js
const app = require('./app');
app.listen(3000);
File: index.test.js
const request = require('supertest');
const app = require('./app');
test('Health check', async () => {
const response = await request(app)
.get('/healthcheck')
.send()
.expect(200);
expect(response.body.msg).toBe('Hello!');
});
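With this split, every test file can require ./app on its own and no port is ever bound, so you can organise tests into as many files as you like (one file per route group is a common convention). If index.js might itself be required from elsewhere, a common guard (a sketch, not something Supertest requires) is to bind the port only when the file is run directly:
File: index.js (guarded variant)
const app = require('./app');

// Only listen when executed directly (node index.js), not when required
if (require.main === module) {
  app.listen(process.env.PORT || 3000);
}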
I have a Node Express app that makes calls to an external API using axios. I'm using Redis for caching. It works fine, but I ran into a number of problems with unit tests, specifically when trying to mock or inject mocks for the Redis client and the axios instance.
My approach to making the route testable was to create a factory function for the route. It works, but I am unsure whether there might be side/adverse effects, or whether I'm using a bad practice and missing a standard solution.
In api.js I require the details route, passing in the axiosInstance and redisClient instances:
// api.js
const detailsRouter = require('./details-route')(axiosInstance, redisClient);
router.use('/details', detailsRouter );
module.exports = router;
//details-route.js
const express = require('express');
const router = express.Router();
const fp = require('lodash/fp');
// local modules
const constants = require('../constants');
const axiosUtil = require('../axios-util');
const getRouter = (axiosInstance, redisClient) => {
router.get('/details', (req, res) => {
redisClient.get(req.originalUrl, (err, reply) => {
if (reply) {
// etc
}
// etc
});
});
return router;
};
module.exports = getRouter;
Note: I'm using redis-mock and axios-mock-adapter in my unit tests, and I've looked into using rewire (but it's not workable for wrapping the internal redis client)
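The pattern itself is ordinary dependency injection and is widely used. One real side effect worth noting in the code above: router is created at module level, so calling getRouter() twice would register the routes twice on the same router object; creating the router inside the factory avoids that. A sketch of a unit test that injects hand-rolled stubs instead of the mock libraries (the stub shapes and canned values are illustrative, and the 200 assumes the elided branch sends the cached reply):
const request = require('supertest');
const express = require('express');
const getRouter = require('./details-route');

// Hand-rolled stubs: no real Redis or HTTP traffic involved
const redisClient = {
  get: (key, cb) => cb(null, JSON.stringify({ cached: true })),
};
const axiosInstance = {
  get: async () => ({ data: {} }),
};

const app = express();
app.use('/', getRouter(axiosInstance, redisClient));

test('GET /details serves the cached reply', async () => {
  const response = await request(app).get('/details');
  expect(response.status).toBe(200);
});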
I have an app.js
const express = require('express');
const app = express();
const server = require('./server.js');
// app.use
const io = require('socket.io').listen(server);
io.on('connection', function (socket) {
...
});
module.exports = app;
And a server.js
const app = require('./app');
const server = app.listen(process.env.PORT || 5000, () => {
console.log('App listening on port 5000!');
})
module.exports = server;
If I put the server in a separate file the socket does not work, but if I start the server inside app.js the socket works.
What am I doing wrong?
The issue here is that you have a circular dependency where app.js is loading server.js and server.js is loading app.js. You can't do that for this type of code.
It has an issue because you're trying to load server.js from within app.js, and in the process of loading server.js, it attempts to load app.js and get its exports. But app.js hasn't finished loading yet and thus hasn't assigned its exports yet, so to avoid an infinite loop Node hands server.js an unfinished (at that point empty) copy of the app.js exports object. In other words, the exports from app.js don't work because of the circular requires.
There are several different ways to solve this. The two most common ways are:
Break some code into a common third module that each of these can load, and only have one of these load the other (a sketch of this one-way restructuring follows at the end of this answer).
Rather than having server.js load app.js to get the app object, have app.js pass the app object to server.js via a constructor function, instead of trying to grab it at module load time.
Here's how the constructor function idea would work:
app.js
const express = require('express');
const app = express();
// load server.js and call its constructor, passing the app object
// that module's constructor function will return the server object
const server = require('./server.js')(app);
// app.use
const io = require('socket.io').listen(server);
io.on('connection', function (socket) {
...
});
module.exports = app;
server.js
// export constructor function that must be called to initialize this module
module.exports = function(app) {
const server = app.listen(process.env.PORT || 5000, () => {
console.log('App listening on port 5000!');
});
return server;
};
So, rather than server.js trying to load the app.js module to get the app object, the app object is "pushed" to it with a constructor function. This prevents the circular dependency.
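For completeness, here is a sketch of the first option's one-way restructuring: only server.js loads app.js, and app.js never loads server.js, with the socket.io setup moving into server.js (file names kept from the question; the port fallback mirrors the fix above):
app.js
// Defines and exports the Express app; requires nothing from server.js
const express = require('express');
const app = express();
// app.use(...) middleware and routes go here
module.exports = app;
server.js
// The entry point; the dependency now points one way only
const app = require('./app');

const server = app.listen(process.env.PORT || 5000, () => {
  console.log('App listening on port 5000!');
});

const io = require('socket.io').listen(server);
io.on('connection', function (socket) {
  // ...
});

module.exports = server;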
In Node.js I'm using Socket.io in my main.js like this:
const io = require('socket.io')(http);
I'm also using a "sub"-file, api.js, to which I want to delegate some of my business logic away from main.js. So I also imported this one, like:
const api = require('./api.js');
In my api.js, how can I now use the socket.io framework? Can I access the instance from above from a different file, or do I have to pass the io object along, like api.myFoo(io)?
Everywhere that you say require('module') you will get the same instance of that module, because Node caches module exports after the first load.
But here, since you want to share the return value of a function call, you have to export that value explicitly:
const io = require('socket.io')(http);
module.exports = io;
in some module, and require it in other modules with:
const io = require('./your-module');
Another option would be to pass it as an argument to other modules, like this:
const io = require('socket.io')(http);
const api = require('./api.js')(io);
but in that case your api.js would have to export a function that takes io as an argument:
module.exports = (io) => {
return ... // return whatever was exported before
};
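For example, api.js could look like this (a sketch; myFoo is the function name from the question, and the event name and payload are illustrative):
// api.js -- a factory that receives the shared io instance
module.exports = (io) => {
  return {
    myFoo: () => {
      // business logic that needs the socket server goes here
      io.emit('foo', { hello: 'world' });
    },
  };
};
and main.js would use it as: const api = require('./api.js')(io); api.myFoo();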