Seed DB with Sequelize seed files before running tests with Jest - javascript

How do I run rake-style commands in my test file (Jest) with Sequelize seeder files?
I'm trying to do the same thing as this, but with Sequelize.
describe('routes : movies', () => {
  beforeEach(() => {
    return knex.migrate.rollback()
      .then(() => { return knex.migrate.latest(); })
      .then(() => { return knex.seed.run(); });
  });

  afterEach(() => {
    return knex.migrate.rollback();
  });
});

I know I'm late to answer this, but I just spent the past week trying to solve this issue. I was able to do it using Sequelize in conjunction with its sister project, Umzug. You will have to read the documentation for your specific setup, but I can copy my test file so you can get an idea of how I did it. I'm happy to help anyone who still struggles with it after looking at the files.
// account.test.js
const models = require('../models/index.js');
const migrations = require("../index");

beforeAll(async () => {
  await migrations.up();
  console.log("Migration and seeding completed");
});

afterAll(async () => {
  await migrations.down();
  console.log("Migrations and down seeding completed");
  const users = await models.User.findAll();
  expect(users).toStrictEqual([]);
});

describe("Integration Test", () => {
  it("Account integration test", async () => {
    const data = { userId: 1210 };
    const users = await models.User.findAll();
    console.log("All users:", JSON.stringify(users, null, 2));
    expect(users[0].firstName).toBe('John');
    expect(data).toHaveProperty('userId');
  });
});
My index.js file
// index.js
const config = require('./config/config.json');
const { Sequelize } = require('sequelize');
const { Umzug, SequelizeStorage } = require('umzug');

const sequelize = new Sequelize(config);

const umzugMigration = new Umzug({
  migrations: { glob: 'migrations/*.js' },
  context: sequelize.getQueryInterface(),
  storage: new SequelizeStorage({ sequelize }),
  logger: console,
});

const umzugSeeding = new Umzug({
  migrations: { glob: 'seeders/*.js' },
  context: sequelize.getQueryInterface(),
  storage: new SequelizeStorage({ sequelize }),
  logger: console,
});

module.exports.up = () => umzugMigration.up().then(() => umzugSeeding.up());
// Note: down() only reverts the seeders here, which is why the afterAll
// above can expect an empty users table.
module.exports.down = () => umzugSeeding.down();

I think you shouldn't make real DB requests while testing your code. Mock your DB calls and return the needed data set from the mock; otherwise it looks like you're testing the library itself, which in your case is knex.
See the Jest docs on mock functions for details: https://jestjs.io/docs/en/mock-functions
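For example, a minimal sketch of that idea against the models module used in the answer above (the seeded name 'John' mirrors that test; everything here is illustrative, not the asker's actual code):

// Sketch: replace the Sequelize models with Jest mocks so no DB is touched.
jest.mock('../models/index.js', () => ({
  User: { findAll: jest.fn() },
}));

const models = require('../models/index.js');

it('reads users from the mock instead of a seeded DB', async () => {
  models.User.findAll.mockResolvedValue([{ firstName: 'John' }]);

  const users = await models.User.findAll();
  expect(users[0].firstName).toBe('John');
});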

Related

Unit Testing with Jest for Strapi v4

I am trying to perform unit tests with Jest for the new version of Strapi, v4, which was released just a couple of weeks ago. Following their documentation, the old guide for unit testing no longer runs as expected. I have, however, modified the code to work to a certain extent. Currently I have the following:
./test/helpers/strapi.js:
const Strapi = require("@strapi/strapi");

let instance;

async function setupStrapi() {
  if (!instance) {
    /** the following code is copied from `./node_modules/strapi/lib/Strapi.js` */
    await Strapi().load();
    instance = strapi; // strapi is global now
    await instance.server
      .use(instance.server.router.routes()) // populate KOA routes
      .use(instance.server.router.allowedMethods()); // populate KOA methods
    await instance.server.mount();
  }
  return instance;
}

module.exports = {
  setupStrapi
};
./tests/app.test.js:
const fs = require("fs");
const request = require("supertest"); // needed for the request() call below
const { setupStrapi } = require("./helpers/strapi");

beforeAll(async () => {
  await setupStrapi();
});

afterAll(async () => {
  const dbSettings = strapi.config.get("database.connection.connection");
  // close server to release the db-file
  await strapi.server.destroy();
  // DATABASE_FILENAME=.tmp/test.db
  // delete test database after all tests
  if (dbSettings && dbSettings.filename) {
    const tmpDbFile = `${dbSettings.filename}`;
    if (fs.existsSync(tmpDbFile)) {
      fs.unlinkSync(tmpDbFile);
    }
  }
});

it("should return hello world", async () => {
  await request(strapi.server.httpServer).get("/api/hello").expect(200); // Expect response http code 200
});
./config/env/test/database.js
const path = require("path");

module.exports = ({ env }) => ({
  connection: {
    client: "sqlite",
    connection: {
      filename: path.join(
        __dirname,
        "../../../",
        env("DATABASE_FILENAME", ".tmp/test.db")
      ),
    },
    useNullAsDefault: true,
  },
});
The route /api/hello is a custom API endpoint. This works perfectly when running strapi develop, and all permissions are set correctly.
The tests run, but every endpoint that is not / or /admin returns 403 Forbidden, meaning there is a problem with the permissions. It would seem that the database file .tmp/data.db (used in development) is not replicated correctly in .tmp/test.db. In other words, this is close to working, but the permissions for API endpoints are not set correctly.
I have been searching through StackOverflow and the Strapi Forums over the past few days, but to no avail. I would greatly appreciate some pointers as to how to fix this :)
It seems you need to grant the right privileges to your routes on your test DB.
For that you can create a function, let's call it grantPrivilege, and call it in your test's beforeAll.
// Here I want to grant the route update in my organization collection
beforeAll(async () => {
  await grantPrivilege(1, 'permissions.application.controllers.organization.update');
});
And here is the grantPrivilege function:
// roleID is 1 for authenticated and 2 for public
const grantPrivilege = async (roleID = 1, value, enabled = true, policy = '') => {
  const updateObj = value
    .split('.')
    .reduceRight((obj, next) => ({ [next]: obj }), { enabled, policy });
  const roleName = roleID === 1 ? 'Authenticated' : 'Public';
  const roleIdInDB = await strapi
    .query('role', 'users-permissions')
    .findOne({ name: roleName });
  return strapi.plugins['users-permissions'].services.userspermissions.updateRole(
    roleIdInDB,
    updateObj,
  );
};
Let me know if that helps
So in order for that to work in v4, this is how I did it.
This was based on this and this, but Stf's post in this saying "inject them in your database during the bootstrap phase like it is made in the templates" was what really set me on the right track.
So if you look here you will see this function:
async function setPublicPermissions(newPermissions) {
  // Find the ID of the public role
  const publicRole = await strapi
    .query("plugin::users-permissions.role")
    .findOne({
      where: {
        type: "public",
      },
    });

  // Create the new permissions and link them to the public role
  const allPermissionsToCreate = [];
  Object.keys(newPermissions).map((controller) => {
    const actions = newPermissions[controller];
    const permissionsToCreate = actions.map((action) => {
      return strapi.query("plugin::users-permissions.permission").create({
        data: {
          action: `api::${controller}.${controller}.${action}`,
          role: publicRole.id,
        },
      });
    });
    allPermissionsToCreate.push(...permissionsToCreate);
  });
  await Promise.all(allPermissionsToCreate);
}
Later in the code, this function is called like this:
await setPublicPermissions({
  article: ["find", "findOne"],
  category: ["find", "findOne"],
  author: ["find", "findOne"],
  global: ["find", "findOne"],
  about: ["find", "findOne"],
});
So in my case I modified this function a bit to accept either the authenticated (1) or public (2) role, inspired by Sidney C's answer above.
This is how I did it:
const grantPrivilege = async (roleID = 1, newPermissions) => {
  const roleName = roleID === 1 ? "authenticated" : "public";

  // Find the ID of the requested role
  const roleEntry = await strapi
    .query("plugin::users-permissions.role")
    .findOne({
      where: {
        type: roleName,
      },
    });

  // Create the new permissions and link them to that role
  const allPermissionsToCreate = [];
  Object.keys(newPermissions).map((controller) => {
    const actions = newPermissions[controller];
    const permissionsToCreate = actions.map((action) => {
      return strapi.query("plugin::users-permissions.permission").create({
        data: {
          action: `api::${controller}.${controller}.${action}`,
          role: roleEntry.id,
        },
      });
    });
    allPermissionsToCreate.push(...permissionsToCreate);
  });
  await Promise.all(allPermissionsToCreate);
};
And then in my beforeAll block I call it like this:
await grantPrivilege(1, {
  "my-custom-collection": ["create", "update"],
  category: ["find", "findOne"],
  author: ["find", "findOne"],
});
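For completeness, the full beforeAll with the Strapi bootstrap wired in looks roughly like this (a sketch; setupStrapi is the helper from the question above):

beforeAll(async () => {
  await setupStrapi(); // boot Strapi once for the whole suite
  await grantPrivilege(1, {
    "my-custom-collection": ["create", "update"],
  });
});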

How to mock fs module together with unionfs?

I have written a test case that successfully loads files into a virtual FS and, at the same time, mounts a virtual volume, as below:
describe("should work", () => {
const { vol } = require("memfs");
afterEach(() => vol.reset());
beforeEach(() => {
vol.mkdirSync(process.cwd(), { recursive: true });
jest.resetModules();
jest.resetAllMocks();
});
it("should be able to mock fs that being called in actual code", async () => {
jest.mock("fs", () => {
return ufs //
.use(jest.requireActual("fs"))
.use(createFsFromVolume(vol) as any);
});
jest.mock("fs/promises", () => {
return ufs //
.use(jest.requireActual("fs/promises"))
.use(createFsFromVolume(vol) as any);
});
const { createFsFromVolume } = require("memfs");
const { ufs } = require("unionfs");
const { countFile } = require("../src/ops/fs");
vol.fromJSON(
{
"./some/README.md": "1",
"./some/index.js": "2",
"./destination": null,
},
"/app"
);
const result = ufs.readdirSync(process.cwd());
const result2 = ufs.readdirSync("/app");
const result3 = await countFile("/app");
console.log({ result, result2, result3 });
});
});
By using ufs.readdirSync, I can access the virtual FS: result indeed gives me the files loaded from disk into the virtual FS, and result2 represents /app, the new volume created by vol.fromJSON.
Now my problem is that I am unable to get a result for result3, which calls the countFile method below:
import fsPromises from "fs/promises";

export const countFile = async (path: string) => {
  const result = await fsPromises.readdir(path);
  return result.length;
};
I'm getting the error
Error: ENOENT: no such file or directory, scandir '/app'
which I think is because countFile is accessing the real FS instead of the virtual one, despite the jest.mock('fs/promises') call.
Please, can anyone provide a lead?
This is the function you want to unit test.
// CommonJS version
const fsPromises = require('fs/promises');

const countFile = async (path) => {
  const result = await fsPromises.readdir(path);
  return result.length;
};

module.exports = {
  countFile
};
Now, how you would normally go about this is to mock fsPromises, and in this example specifically readdir(), since that is the function used in countFile.
This is what we call: a stub.
A skeletal or special-purpose implementation of a software component, used to develop or test a component that calls or is otherwise dependent on it. It replaces a called component.
const { countFile } = require('./index');
const { readdir } = require("fs/promises");

jest.mock('fs/promises');

beforeEach(() => {
  readdir.mockReset();
});

it("When testing countFile, given string, then return files", async () => {
  const path = "/path/to/dir";
  //                  vvvvvvv STUB HERE
  readdir.mockResolvedValueOnce(["src", "node_modules", "package-lock.json", "package.json"]);
  const res = await countFile(path);
  expect(res).toBe(4);
});
You do this because you're unit testing: you don't want to depend on other functions, because then it stops being a unit test and becomes more of an integration test. Secondly, it's a third-party library, which is maintained and tested by someone else.
Here is where your scenario differs. From my perspective, your objective isn't to test countFile(), but rather to test fsPromises and perhaps the ability to read virtual file systems via unionfs. If so, fsPromises doesn't really need to be mocked.
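That said, if you do want countFile to hit a virtual volume end to end, a commonly used alternative (a sketch, untested against your tree) is to mock both modules at the top of the test file and back them directly with memfs's shared volume, dropping unionfs:

// Hoisted mocks: requiring memfs inside the factory avoids capturing
// out-of-scope variables; both mocks share the same singleton volume.
jest.mock('fs', () => require('memfs').fs);
jest.mock('fs/promises', () => require('memfs').fs.promises);

const { vol } = require('memfs');
const { countFile } = require('../src/ops/fs');

it('counts files on the virtual volume', async () => {
  vol.fromJSON({ './a.txt': '1', './b.txt': '2' }, '/app');
  expect(await countFile('/app')).toBe(2);
});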

Test firestore trigger locally

I am writing a test which tests a firebase trigger. The problem, however, is that I cannot make it work.
I want to use the local firestore emulator and Jest in order to simulate a change in the firestore and see if the trigger does what it needs to do.
I require the cloud function in my test and initialize my app.
Setup.js:
const firebase = require('@firebase/testing');
const PROJECT_ID = 'project';

let admin;
let db;

const setupAdmin = async () => {
  admin = firebase.initializeAdminApp({
    projectId: PROJECT_ID
  });
  db = admin.firestore();
};

const getAdmin = () => {
  return admin;
};

const getDb = () => {
  return db;
};

module.exports.setupAdmin = setupAdmin;
module.exports.getAdmin = getAdmin;
module.exports.getDb = getDb;
Test.js
describe('Billing', () => {
  let dbRef;

  beforeAll(async () => {
    const { db, admin } = require('../../../functions/helpers/setup');
    dbRef = db;
  });

  afterAll(async () => {
    await Promise.all(firebase.apps().map(app => app.delete()));
    console.log(`View rule coverage information at ${COVERAGE_URL}\n`);
  });

  it('test', async () => {
    const mockData = {
      'Users/user1': {
        uid: 'user1'
      },
      ['Users/user1/Taxes/' + new Date().getFullYear().toString()]: {
        totalExpenseEuro: 0
      }
    };
    for (const key in mockData) {
      const ref = dbRef.doc(key);
      await ref.set(mockData[key]);
    }

    // Create mockup data
    await dbRef.collection('Users').doc('user1').collection('Expenses').doc('expense1').set({
      amountEuroInclVAT: 100
    });

    // Make snapshot for state of database beforehand
    const beforeSnap = test.firestore.makeDocumentSnapshot({ amountEuroInclVAT: 0 }, 'Users/user1/Expenses/expense1');
    // Make snapshot for state of database after the change
    const afterSnap = test.firestore.makeDocumentSnapshot(
      { amountEuroInclVAT: 100 },
      'Users/user1/Expenses/expense1'
    );
    const change = test.makeChange(beforeSnap, afterSnap);

    // Call wrapped function with the Change object
    const wrapped = test.wrap(calculateTaxesOnExpenseUpdate);
    wrapped(change, {
      params: {
        uid: 'test1'
      }
    });
  });
});
Now the main problem comes when I try to access this db object in my trigger
const calculateTaxesOnExpenseUpdate = functions.firestore
  .document('Users/{uid}/Expenses/{expenseId}')
  .onWrite(async (change, context) => {
    const { getDb } = require('../helpers/setup'); // This setup is the same as above
    let db = getDb();
    ...
For some reason, when I perform an action like (await db.collection('Users').get()).get('totalExpenseEuro'), Jest stops executing my code. When I set a debugger right after that line, it never gets hit. That piece of code crashes, and I have no idea why. I think the DB instance is not properly configured in my cloud trigger function.
Question: What is a good way of sharing the DB instance (admin.firestore()) between the test and the cloud trigger functions?
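One detail worth checking against the snippets above (a sketch, not a verified fix): setup.js exports setupAdmin and getDb, not db and admin, and Test.js never calls setupAdmin(), so both dbRef in the test and the trigger's getDb() come back undefined. A beforeAll along these lines would at least give both sides the same instance:

// Sketch: initialize once via the shared setup module, then read it back.
const { setupAdmin, getDb } = require('../../../functions/helpers/setup');

beforeAll(async () => {
  await setupAdmin(); // fills the module-level `admin` and `db`
  dbRef = getDb();    // the original destructured `db`, which setup.js never exports
});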

MongoDB reusable custom javascript module

I would like to create a local Javascript module I can "require" in other files to handle all MongoDB CRUD operations.
I wrote something like this:
-- dbConn.js file --
require('dotenv').config()
const MongoClient = require('mongodb').MongoClient
const ObjectID = require('mongodb').ObjectID

let _connection

const connectDB = async () => {
  try {
    const client = await MongoClient.connect(process.env.MONGO_DB_URI, {
      useNewUrlParser: true,
      useUnifiedTopology: true
    })
    console.log('Connected to MongoDB')
    return client
  } catch (err) {
    console.log(err) // was `error`, which is undefined in this scope
  }
}

exports.findOne = async () => {
  let client = await connectDB()
  if (!client) {
    return;
  }
  try {
    const db = client.db("Test_DB");
    const collection = db.collection('IoT_data_Coll');
    const query = {}
    let res = await collection.findOne(query);
    return res;
  } catch (err) {
    console.log(err);
  } finally {
    client.close();
  }
}

exports.findAll = async () => {
  let client = await connectDB()
  if (!client) {
    return;
  }
  try {
    const db = client.db("Test_DB");
    const collection = db.collection('IoT_data_Coll');
    const query = {}
    let res = await collection.find(query).toArray();
    return res;
  } catch (err) {
    console.log(err);
  } finally {
    client.close();
  }
}
Then in another file (not necessarily inside an Express app), say
-- app.js ---
const findAll = require('./dbConn').findAll
const findOne = require('./dbConn').findOne
findAll().then(res => JSON.stringify(console.log(res)))
findOne().then(res => JSON.stringify(console.log(res)))
I wonder if this is correct?
Do I have to close the connection after each method/CRUD operation?
I was trying to use an IIFE instead of ".then", as:
(async () => {
  console.log(await findOne())
})()
But I receive a weird error saying that findAll is not a function.
What's wrong with it?
Thanks.
It really depends on your use case, which isn't clear: whether you are using Express or a standalone script, and how frequently you plan to run app.js.
Either way your code is expensive: each call into dbConn.js opens a new connection to the database.
So you can fix app.js by connecting once and reusing that connection.
The best practice is of course to use connection pooling: https://www.compose.com/articles/connection-pooling-with-mongodb/
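A minimal sketch of that reuse (the getConnection helper is an addition; everything else comes from dbConn.js above):

// dbConn.js, reworked sketch: connect once, reuse the client everywhere.
let _connection;

const getConnection = async () => {
  if (!_connection) {
    _connection = await MongoClient.connect(process.env.MONGO_DB_URI, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
    });
  }
  return _connection;
};

exports.findAll = async () => {
  const client = await getConnection();
  const collection = client.db('Test_DB').collection('IoT_data_Coll');
  // No client.close() here: the driver pools connections internally,
  // so close once on application shutdown instead of per query.
  return collection.find({}).toArray();
};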

How to mock pg Pool with Sinon

In a previous project I mocked the mysql library with Sinon. I did this like so:
X.js:
const con = mysql.createPool(config.mysql);
...
Some other place in the project:
const rows = await con.query(query, inserts);
...
X.test.js:
const sinon = require('sinon');
const mockMysql = sinon.mock(require('mysql'));
...
mockMysql.expects('createPool').returns({
query: () => {
// Handles the query...
},
...
It worked perfectly.
In another project I am trying to mock pg, again with Sinon.
pool.js:
const { Pool } = require('pg');
const config = require('@blabla/config');

const pool = new Pool(config.get('database'));

module.exports = pool;
Some other place in the project:
const con = await pool.connect();
const result = await con.query(...
Y.test.js:
???
I can't understand how to mock connect().query(). None of the following approaches work:
1:
const { Pool } = require('pg');
const config = require('@blabla/config');
const mockPool = sinon.mock(new Pool(config.get('database')));
...
mockPool.expects('connect').returns({
  query: () => {
    console.log('query here');
  },
});
1 results in no error but the real db connection is used.
2:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');
const pool = new Pool(config.get('database'));
pool.expects('connect').returns({
  query: () => {
    console.log('query here');
  },
});
2 => TypeError: Pool is not a constructor
3:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');
const pool = sinon.createStubInstance(Pool);
pool.connect.returns({
  query: () => {
    console.log('query here');
  },
});
3 => TypeError: The constructor should be a function.
Can anybody point me in the right direction with how to mock my PostgreSQL connection?
Example: I have postgres.js like this.
const { Pool } = require('pg');

const handler = {
  count: async (pgQuery) => {
    try {
      const pool = new Pool();
      const res = await pool.query(pgQuery);
      return { count: parseInt(res.rows[0].counter, 10) };
    } catch (error) {
      // Log/Throw error here.
    }
    return false;
  }
};

module.exports = handler;
The spec test I created, postgres.spec.js, looks like this.
const { expect } = require('chai');
const sinon = require('sinon');
const pgPool = require('pg-pool');
const handler = require('postgres.js');

describe('Postgres', function () {
  it('should have method count that bla bla', async function () {
    // Create stub pgPool query.
    const postgreeStubQuery = sinon.stub(pgPool.prototype, 'query');
    postgreeStubQuery.onFirstCall().throws('XXX');
    postgreeStubQuery.onSecondCall().resolves({
      rows: [{ counter: 11 }],
    });

    // Catch case.
    const catcher = await handler.count('SELECT COUNT()..');
    expect(catcher).to.equal(false);
    expect(postgreeStubQuery.calledOnce).to.equal(true);

    // Correct case.
    const correct = await handler.count('SELECT COUNT()..');
    expect(correct).to.deep.equal({ count: 11 });
    expect(postgreeStubQuery.calledTwice).to.equal(true);

    // Restore stub.
    postgreeStubQuery.restore();
  });
});
To stub pool.query(), you need to stub the query method on the pg-pool prototype.
Hope this helps.
Since you need to mock the returned results of a query, I think the easiest solution would be to abstract the database away from the code that needs the query results. For example, say your query returns information about a person: create a person.js module with specific methods for interacting with the database.
The rest of your code that needs person information from the database won't know or care what type of database you use or how you connect to it; all it cares about is what methods person.js exposes when required.
// person.js
const { Pool } = require('pg')
// do other database connection things here

const getPersonById = function (id) {
  // use your query here and return the results
}

module.exports = { getPersonById }
Now in your tests, you mock the person module, not the pg module. Imagine if you had twenty-odd tests that all set up the mock MySQL pool and then you switched to pg: you'd have to change all of them, a nightmare. By abstracting the database connection/setup, testing becomes much easier, because now you just need to stub/mock your person.js module.
const person = require('../person.js') // or whatever relative file path it's in
const sinon = require('sinon')
const { expect } = require('chai') // assumed: `to.eq` below is chai syntax

describe('person.js', function () {
  it('is stubbed right now', function () {
    const personStub = sinon.stub(person)
    personStub.getPersonById.returns('yup')
    expect(personStub.getPersonById()).to.eq('yup')
  })
})
Below is a simpler approach that means the system-under-test doesn't need any special tricks.
It is comprised of two parts, though the first is "nice to have":
Use a DI framework to inject the pg.Pool. This is a better approach IMO anyway, and fits really well with testing.
In the beforeEach() of the tests, configure the DI framework to use a mock class with sinon.stub instances.
If you aren't using a DI framework, pass the mock as a Pool parameter... but DI is better ;)
The code below is TypeScript using tsyringe, but similar approaches will work fine with plain JavaScript etc.
Somewhere you'll have code that uses pg.Pool. A contrived example:
import { Pool } from 'pg'
...

function getPets(pool: Pool): Promise<Pet[]> {
  return pool.connect()
    .then(db => db.query(SQL_HERE)
      .then(result => {
        db.release()
        return result.rows // or result.rows.map(something) etc
      })
      .catch(error => {
        db.release()
        throw error
      })
    )
}
That works, and it's fine if you want to pass the Pool instance in. I'd prefer not to, so I use tsyringe like this:
import { container } from 'tsyringe'
...

function getPets(): Promise<Pet[]> {
  return container.resolve(Pool).connect()
    .then(...)
}
Exactly the same outcome, but getPets() is cleaner to call - it can be a pain to lug around a Pool instance.
The main of the program would set up an instance in one of a few ways. Here's mine:
...
container.register(Pool, {
  useFactory: instanceCachingFactory(() => {
    return new Pool(/* any config here */)
  })
})
The beauty of this comes out in tests.
The code above (the "system under test") needs a Pool instance, and that instance needs a connect() method that resolves to an object with query() and release() methods.
This is what I used:
class MockPool {
  client = {
    query: sinon.stub(),
    release: sinon.stub()
  }

  connect () {
    return Promise.resolve(this.client)
  }
}
Here's the setup of a test using MockPool:
describe('proof', () => {
  let mockPool: MockPool

  beforeEach(() => {
    // Important! See:
    // https://github.com/microsoft/tsyringe#clearing-instances
    container.clearInstances()
    mockPool = new MockPool()
    container.registerInstance(Pool, mockPool as unknown as Pool)
  })
})
The cast through unknown to Pool is needed because I'm not implementing the whole Pool API, just what I need.
Here's what a test looks like:
it('mocks postgres', async () => {
  mockPool.client.query.resolves({
    rows: [
      { name: 'Woof', kind: 'Dog' },
      { name: 'Meow', kind: 'Cat' }
    ]
  })

  const r = await getPets()

  expect(r).to.deep.equal([
    { name: 'Woof', kind: 'Dog' },
    { name: 'Meow', kind: 'Cat' }
  ])
})
You can easily control what data the mock Postgres Pool returns, or throw errors, etc.
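The error path can be exercised the same way; a sketch reusing MockPool from above (plain assertions, no extra libraries assumed):

it('propagates query errors and releases the client', async () => {
  mockPool.client.query.rejects(new Error('boom'))

  let caught
  try {
    await getPets()
  } catch (e) {
    caught = e
  }

  expect(caught.message).to.equal('boom')
  // getPets() releases the client in its .catch before rethrowing.
  expect(mockPool.client.release.calledOnce).to.equal(true)
})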
