I am working on a piece where I am basically refactoring existing code. I have two files: index and server. My index is:
const md5File = require('md5-file');
const fs = require('fs');
const path = require('path');
const ignoreStyles = require('ignore-styles');
const register = ignoreStyles.default;
const extensions = ['.gif', '.jpeg', '.jpg', '.png', '.svg'];
...
require('@babel/polyfill');
require('@babel/register')({
  ignore: [/\/(build|node_modules)\//],
  presets: ['@babel/preset-env', '@babel/preset-react'],
  plugins: [
    '@babel/plugin-syntax-dynamic-import',
    '@babel/plugin-proposal-class-properties',
    'dynamic-import-node',
    'react-loadable/babel'
  ]
});
// Now that the nonsense is over... load up the server entry point
require('./server');
My server is like:
import path from 'path';
import Loadable from 'react-loadable';
...
const main = async () => {
// tell React Loadable to load all required assets
await Loadable.preloadAll();
process.on('unhandledRejection', err => {
console.error(err);
process.exit(1);
});
const server = fastify(config.fastify);
server.register(require('./routes'), config);
server.register(fastifyStatic, {
root: path.resolve(__dirname, '../build')
});
if (require.main === module) {
// called directly i.e. $ node index.js
const address = await server.listen(config.address);
// start listening - ROCK AND ROLL!
server.log.info(`Server running at: ${address}`);
} else {
// required as a module => executed on aws lambda
module.exports = server;
}
};
main();
The server should run a REST service when executed locally, and export the server instance for the inject method. This way a proxy can attach to it when running under AWS Lambda.
I have used the same setup before, multiple times, except that the two pieces were in the same file, i.e. the server code lived inside index. The single-file version works fine: the require.main comparison tells the program how it is running, module.exports exposes the server instance (with the needed inject method) when running under Lambda, and a direct invocation runs the REST service.
However, since I need to import react-loadable this time, I split the files in two pieces.
Now I probably need to detect how the code is running inside index.js (since server.js is no longer invoked directly) and pass that information to the server. That part is probably not too difficult.
My main problem is that if I do console.log(require('./server')) from index, it prints {}. While the server instance is created successfully and inject() is present, I am somehow unable to export it from server.js, import it into index.js correctly, and therefore [re-]export it from index.js for the proxy to attach to.
Obviously, I am doing something incorrectly. It seems to me that my require does not have the server instance because the instance gets created after the require has finished. Since my main() is async, that is plausible.
What is the right way to accomplish this?
First, notice that your main() function in ./server.js is async and you're assigning module.exports from within that asynchronous function. Second, you're calling require('./server.js') from ./index.js, which evaluates the module synchronously and returns whatever module.exports holds at that moment — the initial empty object (that's the {} you're getting). Your module.exports = server assignment only runs later, after the awaits inside main() have resolved, and by then index.js has already captured the old, empty exports object.
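The timing can be seen in a minimal sketch — no separate files needed; here `exportsObj` stands in for module.exports:

```javascript
// exportsObj plays the role of module.exports in ./server.js
const exportsObj = {};

const main = async () => {
  await Promise.resolve();                  // any await defers the rest of main()
  exportsObj.server = { inject: () => {} }; // "export the server" happens here, later
};
main();

// A synchronous require() sees the exports *now*, before main() resumed:
console.log(Object.keys(exportsObj)); // prints []
```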
Which solutions will or will not fit your use case will depend on the details of how your AWS/direct invocation is supposed to work. Here's a suggestion:
const md5File = require('md5-file');
const fs = require('fs');
const path = require('path');
const ignoreStyles = require('ignore-styles');
const register = ignoreStyles.default;
const extensions = ['.gif', '.jpeg', '.jpg', '.png', '.svg'];
// ...
require('@babel/polyfill');
require('@babel/register')({
  ignore: [/\/(build|node_modules)\//],
  presets: ['@babel/preset-env', '@babel/preset-react'],
  plugins: [
    '@babel/plugin-syntax-dynamic-import',
    '@babel/plugin-proposal-class-properties',
    'dynamic-import-node',
    'react-loadable/babel'
  ]
});
process.env.serverRunLocally = require.main === module ? 'true' : ''; // env vars are strings — avoid a truthy 'false'
// Now that the nonsense is over... load up the server entry point
const serverPromise = require('./server');
if (process.env.serverRunLocally) serverPromise.then(listen => listen());
// under Lambda the promise resolves to the server instance; re-export it for the proxy
else module.exports = serverPromise;
import path from 'path';
import Loadable from 'react-loadable';
// ...
const main = async () => {
  // tell React Loadable to load all required assets
  await Loadable.preloadAll();

  process.on('unhandledRejection', err => {
    console.error(err);
    process.exit(1);
  });

  const server = fastify(config.fastify);
  server.register(require('./routes'), config);
  server.register(fastifyStatic, {
    root: path.resolve(__dirname, '../build')
  });
  if (process.env.serverRunLocally) {
    // called directly i.e. $ node index.js
    return async () => {
      const address = await server.listen(config.address);
      // start listening - ROCK AND ROLL!
      server.log.info(`Server running at: ${address}`);
    };
  } else {
    // required as a module => executed on aws lambda
    return server;
  }
};
module.exports = main();
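On the Lambda side, whatever consumes the export now receives a Promise, so it has to await it before calling inject(). A minimal sketch with a stubbed server — the stub stands in for the real require('./index'), and the handler shape is a hypothetical Lambda wrapper:

```javascript
// serverPromise stands in for require('./index') under Lambda (hypothetical wiring)
const serverPromise = Promise.resolve({
  // stub with the same shape fastify's inject() exposes
  inject: async ({ method, url }) => ({ statusCode: 200, body: `${method} ${url}` }),
});

// Lambda-style handler: await the promise, then dispatch through inject()
const handler = async (event) => {
  const server = await serverPromise;
  const res = await server.inject({ method: event.method, url: event.url });
  return { statusCode: res.statusCode, body: res.body };
};
```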
Related
I added tests to my Node.js project using Jest, but each test suite has a beforeAll method that creates a new test server and connects to a Mongo database, and an afterAll method that closes both the test server and the database. I would like to perform these tasks globally for all the test suites rather than one at a time. Below is a sample of my code.
app.js
const express = require("express");
const app = express();
const { connectToDb } = require("./startup/db");
require("./startup/routes")(app);
connectToDb();
...
const port = process.env.PORT || 3000;
if (process.env.NODE_ENV !== "test") {
app.listen(port, () => winston.info(`Listening on port ${port}...`));
}
module.exports = app;
auth.test.js
const request = require("supertest");
const http = require("http");
const { disconnectDb } = require("../../startup/db");
describe("auth middleware", () => {
let server;
beforeAll((done) => {
const app = require("../../app");
server = http.createServer(app);
server.listen(done);
});
afterAll((done) => {
server.close(done);
disconnectDb();
});
it("should return 401 if no token is provided", async () => {
const res = request(server)
.post("/api/genres")
.set("x-auth-token", "")
.send({ name: "genre1" });
expect(res.status).toBe(401);
});
...
jest.config.js
module.exports = {
testEnvironment: "node",
};
Try with this jest.config.js:
module.exports = {
testEnvironment: "node",
globalSetup: '<rootDir>/src/testSetup.ts'
};
And in testSetup.ts you can do:
// testSetup.ts
const http = require("http");

module.exports = async () => {
  const app = require("../../app");
  const server = http.createServer(app);
  await new Promise((resolve) => server.listen(resolve));
};
use this config: setupFiles: ['./tests/setup.js']
your setup file should look like this:
// setup.js
(async () => {
const app = require('../app.js')
global.app = app
})()
then you will be able to use app globally in every test suite
I had the same problem: I wanted to make one database connection before all test files and close the connection after all tests in all files.
But... I did not achieve what I wanted, and MAYBE we don't need to do this.
What I did find is a way to define functions like beforeAll() and afterAll() once, in a single file, and have them run for every test file.
To do that, all we need is to create a setup file and add its path to the jest config or to package.json: "setupFilesAfterEnv": ["<rootDir>/__tests__/settings/setupTests.ts"]
Here is an example of my jest configuration.
"jest": {
"preset": "ts-jest",
"testEnvironment": "node",
"setupFilesAfterEnv": ["<rootDir>/__tests__/settings/setupTests.ts"],
"rootDir": "src",
"verbose": true,
"clearMocks": true,
"testMatch": [
"**/**/*.test.ts"
]
},
Here is an example of setupFile.ts
import usersCollection from "../../database/user-schema";
import mongoose from "mongoose";
beforeAll(async () => {
try {
await mongoose.connect(process.env.MONGODB_URL!);
await usersCollection.deleteMany({});
} catch (error) {
console.log(error);
}
});
afterAll(async () => {
try {
await mongoose.disconnect();
} catch (error) {
console.log(error);
}
});
It means that we establish a connection to the database FOR EVERY TEST FILE, BEFORE ALL TESTS IN THAT FILE, and close the connection after all tests in each file.
What I realized for myself:
In real life we have many test files, and not every file needs a database connection.
It's perfectly fine to open a connection in the files that need one and close it after all tests in that file — for example, integration tests that hit API endpoints.
In other tests — the many unit tests that shouldn't touch a real database — we can consider mocking (simulating) the database instead. That's another very interesting topic 😊
If I say something wrong you can correct me
P.S
I also want to mention what is written in the Mongoose documentation
Do not use globalSetup to call mongoose.connect() or
mongoose.createConnection(). Jest runs globalSetup in a separate
environment, so you cannot use any connections you create in
globalSetup in your tests.
https://mongoosejs.com/docs/jest.html
I have a distributed system and all JS files are exposed through HTTP. So a normal module would look like this:
http://example.com/path/to/main.js
import * as core from 'http://local.example.com/path/to/core.js';
import * as redux from 'http://cdn.example.com/redux.js#version';
// code
export default {
...
}
So each import will be using either a local resource to the system or possibly remotely available resources using CDN.
However, when I run webpack, I get the error below while it tries to parse a locally generated file with this content:
import * as main from 'http://example.com/path/to/main.js';
ERROR in ./src/index.js Module not found: Error: Can't resolve
'http://example.com/path/to/main.js' in '/home/.../index.js'
Is it possible to tell webpack to fetch the URLs and include them inside the bundle? While packaging CDN URLs isn't a big deal for now, I'd be happy if I could simply ignore the ones matching a certain URL.
Though being able to bundle all the remote http:// files would be a good start.
Also, any remote resource linking to other resources should recursively load remotely linked resources too.
Here's my current webpack config (though nothing much to see here):
const path = require('path');
module.exports = {
mode: 'development',
entry: './src/index.js',
output: {
filename: 'main.js',
path: path.resolve(__dirname, 'dist'),
},
module: {
rules: [
]
},
};
Edit: after reading a bit, I started writing a resolver but now I'm stuck again:
const path = require('path');
const fetch = require('node-fetch');
const url = require('url')
const fs = require('promise-fs');
const sha1 = require('sha1')
class CustomResolver {
async download_save(request, resolveContext) {
console.log(request, resolveContext)
var target = url.parse(request.request)
var response = await fetch(request.request)
var content = await response.text()
try {
await fs.stat('_remote')
} catch(exc) {
await fs.mkdir('_remote')
}
var filename = `${sha1(request.request)}.js`
var file_path = `_remote/${filename}`
await fs.writeFile(file_path, content)
var abs_path = path.resolve(file_path)
var url_path = `${target.protocol}://${target.hostname}/`
var obj = {
path: abs_path,
request: request.request,
query: '',
}
console.log(`${request.request} saved to ${abs_path}`)
return obj
}
apply(resolver) {
var self = this
const target = resolver.ensureHook("resolved")
resolver.getHook("module")
.tapAsync("FetchResolverPlugin", (request, resolveContext, callback) => {
self.download_save(request, resolveContext)
.then((obj) => resolver.doResolve(target, obj, resolveContext, callback))
.catch((err) => {
console.log(err)
callback()
})
})
}
}
It does currently fetch URLs starting with https://, but it seems to struggle to resolve URLs relative to an HTTP resource. For example:
ERROR in _remote/88f978ae6c4a58e98a0a39996416d923ef9ca531.js
Module not found: Error: Can't resolve '/-/@pika/polyfill@v0.0.3/dist=es2017/polyfill.js' in '_remote/'
# _remote/88f978ae6c4a58e98a0a39996416d923ef9ca531.js 25:0-58
# _remote/f80b922b2dd42bdfaaba4e9f4fc3c84b9cc04fca.js
# ./src/index.js
It doesn't look like it tries to resolve paths relative to already-resolved files. Is there a way to tell the resolver to try to resolve everything?
The main point is: if you have CDN files, you don't need a bundler.
They are already minified and ready to use. Just import the files in the root of your project and call the libraries globally.
This is a totally reduced example to better explain the issue! When I use the resolver query getAllUsers, the MongoDB collection Users is not available in the external resolver file user.js, so when I send that query I get:
ReferenceError: Users is not defined
That's correct behaviour. But I do not want to include all the resolvers in my index.js, because modularization is better this way: I keep all my typedefs and resolvers in external files, like this.
Current file structure
index.js
/graphql
/typedef
user.graphql
/resolver
user.js
The user.graphql schema is working correctly. It is just user.js that produces the error when I execute the query, because the Users variable is not available there, as already said.
Here the index.js and user.js.
index.js
import express from 'express'
import cors from 'cors'
const app = express()
app.use(cors())
import bodyParser from 'body-parser'
import {graphqlExpress, graphiqlExpress} from 'graphql-server-express'
import {makeExecutableSchema} from 'graphql-tools'
import {fileLoader, mergeTypes, mergeResolvers} from 'merge-graphql-schemas';
import {writeFileSync} from 'fs'
const typeDefs = mergeTypes(fileLoader(`${__dirname}/graphql/typedef/*.graphql`), { all: true })
writeFileSync(`${__dirname}/graphql/typedef.graphql`, typeDefs)
export const start = async () => {
try {
const MONGO_URL = 'mongodb://localhost:27017'
const MongoClient = require('mongodb').MongoClient;
MongoClient.connect(MONGO_URL, function(err, client) {
console.log("Connected successfully to server");
const db = client.db('project');
const Users = db.collection('user')
});
const URL = 'http://localhost'
const homePath = '/graphql'
const PORT = 3001
app.use(
homePath,
bodyParser.json(),
graphqlExpress({schema})
)
app.use(homePath,
graphiqlExpress({
endpointURL: homePath
})
)
app.listen(PORT, () => {
console.log(`Visit ${URL}:${PORT}${homePath}`)
})
} catch (e) {
console.log(e)
}
}
user.js
export default {
Query: {
getAllUsers: async () => {
return (await Users.find({}).toArray()).map(prepare)
}
}
}
What is the best way to pass the MongoDB or the Users collection to the resolver files. Or is there an even better solution for this issue?
First of all, this is NOT a proper solution: declaring global variables to share state with outsourced schema files is bad design. But it works, and maybe it gives someone an idea for how to improve this fix.
To solve the issue, all I had to do was change the variable from a local const to a global.
So in index.js, const Users = db.collection('user') becomes global.Users = db.collection('user').
Same for user.js: return (await Users.find({}).toArray()).map(prepare) becomes return (await global.Users.find({}).toArray()).map(prepare).
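A cleaner alternative to globals is passing the collection through the GraphQL context: graphqlExpress accepts a function returning options, and every resolver receives the context as its third argument. A sketch with a stubbed collection — the helper name makeGraphqlOptions is mine:

```javascript
// index.js side: build the options once the connection exists
// usage: app.use(homePath, bodyParser.json(), graphqlExpress(makeGraphqlOptions(schema, Users)))
const makeGraphqlOptions = (schema, Users) => () => ({ schema, context: { Users } });

// user.js side: read the collection from the context instead of a global
const resolvers = {
  Query: {
    getAllUsers: async (obj, args, { Users }) =>
      (await Users.find({}).toArray()),
  },
};
```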
I want to use koa-views with Koa and Koa-Router with Next.js. In previous projects, I had no issues with express but in this project, I have to use Koa. Using its router, I want to render a page: /some/page/:id. Following the same Nextjs way:
router.get('/some/page/:id', async (ctx, next) => {
const actualPage = '/some/page/id' // id.js (not actual name 😝)
await ctx.render(actualPage, {/* could pass object */})
});
That would work if I was using express. With Koa:
const Koa = require('koa');
const views = require('koa-views');
// const render = require('koa-views-render'); <-- what's this?
[..] // Making things short here
const server = new Koa();
const router = new Router();
// My issue, I'm seeing tutorials using other engines: .ejs etc
// I'm not using any, I only have .js files
server.use(views(__dirname + "/pages", { extension: 'js' }));
Using the same router.get... function as above, I get:
Error: Engine not found for the ".js" file extension
When I go to /some/page/123, I'd expect it to render the file /pages/some/page/id.js. How?
It turns out I do not need any extra modules to achieve this 🙀
Create a function called, e.g., routes, then pass router and app as params:
const routes = (router, app) => {
  router.get('/some/page/:id', async (ctx) => {
    const { id } = ctx.params
    const actualPage = '/some/page/id'
    // Render the page
    await app.render(ctx.req, ctx.res, actualPage, {foo: 'Bar'})
  })
}

module.exports = routes
Inside your server.js file:
// const routes = require('./routes');
// const app = next({ dev }); // import other modules for this section
// app.prepare().then(() => {
// const router = new Router();
// [..]
// routes(router, app)
// })
The commented-out section is a slimmed-down version to show where things should go.
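For completeness: the usual Koa + Next wiring also forwards everything that isn't an explicit route to Next's request handler. This follows the pattern in Next's custom-server examples; the helper name makeCatchAll is mine:

```javascript
// Returns a Koa middleware that hands the raw req/res to Next's handler
const makeCatchAll = (handle) => async (ctx) => {
  await handle(ctx.req, ctx.res);
  ctx.respond = false; // Next wrote the response; tell Koa not to
};

// usage inside app.prepare().then(...):
// router.all('(.*)', makeCatchAll(app.getRequestHandler()));
```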
My Issue
I've coded a very simple CRUD API and I've recently started writing some tests using chai and chai-http, but I'm having an issue when running my tests with $ mocha.
When I run the tests I get the following error on the shell:
TypeError: app.address is not a function
My Code
Here is a sample of one of my tests (/tests/server-test.js):
var chai = require('chai');
var mongoose = require('mongoose');
var chaiHttp = require('chai-http');
var server = require('../server/app'); // my express app
var should = chai.should();
var testUtils = require('./test-utils');
chai.use(chaiHttp);
describe('API Tests', function() {
before(function() {
mongoose.createConnection('mongodb://localhost/bot-test', myOptionsObj);
});
beforeEach(function(done) {
// I do stuff like populating db
});
afterEach(function(done) {
// I do stuff like deleting populated db
});
after(function() {
mongoose.connection.close();
});
describe('Boxes', function() {
it.only('should list ALL boxes on /boxes GET', function(done) {
chai.request(server)
.get('/api/boxes')
.end(function(err, res){
res.should.have.status(200);
done();
});
});
// the rest of the tests would continue here...
});
});
And my express app files (/server/app.js):
var mongoose = require('mongoose');
var express = require('express');
var api = require('./routes/api.js');
var app = express();
mongoose.connect('mongodb://localhost/db-dev', myOptionsObj);
// application configuration
require('./config/express')(app);
// routing set up
app.use('/api', api);
var server = app.listen(3000, function () {
var host = server.address().address;
var port = server.address().port;
console.log('App listening at http://%s:%s', host, port);
});
and (/server/routes/api.js):
var express = require('express');
var boxController = require('../modules/box/controller');
var thingController = require('../modules/thing/controller');
var router = express.Router();
// API routing
router.get('/boxes', boxController.getAll);
// etc.
module.exports = router;
Extra notes
I've tried logging out the server variable in the /tests/server-test.js file before running the tests:
...
var server = require('../server/app'); // my express app
...
console.log('server: ', server);
...
and the result of that is an empty object: server: {}.
You don't export anything in your app module. Try adding this to your app.js file:
module.exports = server
It's important to export the http.Server object returned by app.listen(3000) instead of just the function app, otherwise you will get TypeError: app.address is not a function.
Example:
index.js
const koa = require('koa');
const app = new koa();
module.exports = app.listen(3000);
index.spec.js
const request = require('supertest');
const app = require('./index.js');
describe('User Registration', () => {
const agent = request.agent(app);
it('should ...', () => {
This may also help, and it addresses @dman's point about changing application code to fit a test.
make your request to the localhost and port as needed
chai.request('http://localhost:5000')
instead of
chai.request(server)
this fixed the same error message I had using Koa JS (v2) and ava js.
The answers above correctly address the issue: supertest wants an http.Server to work on. However, calling app.listen() to get a server will also start a listening server; that is bad practice and unnecessary.
You can get around by this by using http.createServer():
import * as http from 'http';
import * as supertest from 'supertest';
import * as test from 'tape';
import * as Koa from 'koa';
const app = new Koa();
// add some routes here
const apptest = supertest(http.createServer(app.callback()));
test('GET /healthcheck', (t) => {
apptest.get('/healthcheck')
.expect(200)
.expect(res => {
t.equal(res.text, 'Ok');
})
.end(t.end.bind(t));
});
Just in case someone uses Hapi.js: the issue still occurs there, because Hapi does not use Express.js, so the address() function does not exist.
TypeError: app.address is not a function
at serverAddress (node_modules/chai-http/lib/request.js:282:18)
The workaround to make it work
// this makes the server to start up
let server = require('../../server')
// pass this instead of server to avoid error
const API = 'http://localhost:3000'
describe('/GET token ', () => {
it('JWT token', (done) => {
chai.request(API)
.get('/api/token?....')
.end((err, res) => {
res.should.have.status(200)
res.body.should.be.a('object')
res.body.should.have.property('token')
done()
})
})
})
Export app at the end of the main API file like index.js.
module.exports = app;
We had the same issue when running mocha with ts-node in our Node + TypeScript serverless project.
Our tsconfig.json had "sourceMap": true, so the generated .js and .js.map files caused some funny transpiling issues (similar to this one) when we ran the mocha runner through ts-node. We set the sourceMap flag to false and deleted all the .js and .js.map files in our src directory, and the issue was gone.
If you have already generated files in your src folder, commands below would be really helpful.
find src -name "*.js.map" -exec rm {} \;
find src -name "*.js" -exec rm {} \;
I am using Jest and Supertest, but was receiving the same error. It was because my server takes time to set up (it is async, to set up the db, read config, etc.). I needed to use Jest's beforeAll helper to allow the async setup to run. I also needed to refactor my server to separate out listening, and instead use @Whyhankee's suggestion to create the test's server.
index.js
export async function createServer() {
//setup db, server,config, middleware
return express();
}
async function startServer(){
let app = await createServer();
await app.listen({ port: 4000 });
console.log("Server has started!");
}
if(process.env.NODE_ENV ==="dev") startServer();
test.ts
import {createServer as createMyAppServer} from '@index';
import { test, expect, beforeAll } from '@jest/globals'
const supertest = require("supertest");
import * as http from 'http';
let request :any;
beforeAll(async ()=>{
request = supertest(http.createServer(await createMyAppServer()));
})
test("fetch users", (done: any) => {
request
.post("/graphql")
.send({
query: "{ getQueryFromGqlServer (id:1) { id} }",
})
.set("Accept", "application/json")
.expect("Content-Type", /json/)
.expect(200)
.end(function (err: any, res: any) {
if (err) return done(err);
expect(res.body).toBeInstanceOf(Object);
let serverErrors = JSON.parse(res.text)['errors'];
expect(serverErrors.length).toEqual(0);
expect(res.body.data.id).toEqual(1);
done();
});
});
Edit:
I also had errors when using data.forEach(async () => ...); I should have used for (let x of ...) in my tests.
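To illustrate that last point: Array.prototype.forEach does not await an async callback, while for...of inside an async function does. A minimal sketch:

```javascript
const items = [1, 2, 3];
const double = async (n) => n * 2;

const withForEach = async () => {
  const out = [];
  items.forEach(async (n) => out.push(await double(n))); // callbacks are NOT awaited
  return out.length; // length at return time — the pushes haven't happened yet: 0
};

const withForOf = async () => {
  const out = [];
  for (const n of items) out.push(await double(n)); // awaited, in order
  return out; // [2, 4, 6]
};
```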