Here is my function, which triggers a Google Dataflow job:
index.js
const { google } = require('googleapis');

const triggerDataflowJob = async (event, context) => {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const authClient = await auth.getClient();
  const projectId = await auth.getProjectId();
  const dataflow = google.dataflow({ version: 'v1b3', auth: authClient });
  const dataflowReqBody = dataflowRequest(projectId, event.bucket, event.name, context);
  return dataflow.projects.locations.templates.create(dataflowReqBody);
};

module.exports = { triggerDataflowJob };
My unit test for the above function:
index.test.js
const sinon = require('sinon');
const { google } = require('googleapis');
const { triggerDataflowJob } = require('./index.js');

describe('Function: triggerDataflowJob', () => {
  it('should return success', async () => {
    const projectsStub = sinon.stub().returnsThis();
    const locationsStub = sinon.stub().returnsThis();
    const dataflowStub = sinon.stub(google, 'dataflow').callsFake(() => ({
      projects: projectsStub,
      locations: locationsStub,
      templates: sinon.stub(),
    }));

    const context = { eventId: '126348454' };
    const event = { bucket: 'test-bucket', name: 'test-file.json' };

    await triggerDataflowJob(event, context);
    sinon.assert.calledOnce(dataflowStub);
  });
});
But I get the error below when I run the test:
1) Trigger Dataflow Job:
Function: triggerDataflowJob
should return success:
TypeError: Cannot read property 'templates' of undefined
at triggerDataflowJob (index.js:12:38)
at process._tickCallback (internal/process/next_tick.js:68:7)
Can someone please help me find the issue? What am I missing or doing wrong?
From the error, it looks like the dataflow object you are getting back does not have a templates key within the locations object.
Looking at your test, it looks like the dataflow object looks something like this:
const dataflow = {
  projects: {...},
  locations: {...},
  templates: {...}
}
In the return statement of the main function you are looking for templates within locations.
return dataflow.projects.locations.templates.create(dataflowReqBody);
If locations is supposed to have a templates key, then you may need to update your test and the way you're mocking the object. If it is not supposed to have a templates key, then you can update your return statement like so:
return dataflow.projects.templates.create(dataflowReqBody);
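For what it's worth, the googleapis Dataflow client does expose the chained projects.locations.templates.create call your function uses, so the likelier fix is the mock's shape. A minimal sketch of a stub whose shape matches the chain (the resolved value here is a made-up placeholder):

const sinon = require('sinon');
const { google } = require('googleapis');

// Nest the stubs so the object shape matches the chained call in index.js:
// dataflow.projects.locations.templates.create(...)
const createStub = sinon.stub().resolves({ data: {} }); // placeholder response
const dataflowStub = sinon.stub(google, 'dataflow').returns({
  projects: {
    locations: {
      templates: { create: createStub },
    },
  },
});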
Hope that helps!
I am trying to perform unit tests with Jest for the new version of Strapi, v4, which was released a couple of weeks ago. The old unit-testing guide in their documentation no longer runs as expected. I have, however, modified the code to work to a certain extent. Currently I have the following:
./test/helpers/strapi.js:
const Strapi = require("@strapi/strapi");
let instance;

async function setupStrapi() {
  if (!instance) {
    /** the following code is copied from `./node_modules/strapi/lib/Strapi.js` */
    await Strapi().load();
    instance = strapi; // strapi is global now

    await instance.server
      .use(instance.server.router.routes()) // populate KOA routes
      .use(instance.server.router.allowedMethods()); // populate KOA methods

    await instance.server.mount();
  }
  return instance;
}

module.exports = {
  setupStrapi
};
./tests/app.test.js:
const fs = require("fs");
const request = require("supertest"); // used for the HTTP assertions below
const { setupStrapi } = require("./helpers/strapi");

beforeAll(async () => {
  await setupStrapi();
});

afterAll(async () => {
  const dbSettings = strapi.config.get("database.connection.connection");

  // close server to release the db-file
  await strapi.server.destroy();

  // DATABASE_FILENAME=.tmp/test.db
  // delete test database after all tests
  if (dbSettings && dbSettings.filename) {
    const tmpDbFile = `${dbSettings.filename}`;
    if (fs.existsSync(tmpDbFile)) {
      fs.unlinkSync(tmpDbFile);
    }
  }
});

it("should return hello world", async () => {
  await request(strapi.server.httpServer).get("/api/hello").expect(200); // Expect response http code 200
});
./config/env/test/database.js:
const path = require("path");

module.exports = ({ env }) => ({
  connection: {
    client: "sqlite",
    connection: {
      filename: path.join(
        __dirname,
        "../../../",
        env("DATABASE_FILENAME", ".tmp/test.db")
      ),
    },
    useNullAsDefault: true,
  },
});
The route /api/hello is a custom API endpoint. This works perfectly when running strapi develop, and all permissions are set correctly.
The tests run, but every endpoint that is not / or /admin returns 403 Forbidden, meaning there is a problem with the permissions. It would seem that the database file .tmp/data.db (used in development) is not replicated correctly in .tmp/test.db. In other words, this is close to working, but the permissions for API endpoints are not set correctly.
I have been searching through StackOverflow and the Strapi forums over the past few days, but to no avail. I would greatly appreciate some pointers as to how to fix this :)
It seems you need to grant the right privileges to your routes on your test DB.
For that you can create a function, let's call it grantPrivilege, and call it in your test in the beforeAll function.
// Here I want to grant the route update in my organization collection
beforeAll(async () => {
  await grantPrivilege(1, 'permissions.application.controllers.organization.update');
});
And here is the grantPrivilege function:
// roleID is 1 for authenticated and 2 for public
const grantPrivilege = async (roleID = 1, value, enabled = true, policy = '') => {
  const updateObj = value
    .split('.')
    .reduceRight((obj, next) => ({ [next]: obj }), { enabled, policy });

  const roleName = roleID === 1 ? 'Authenticated' : 'Public';

  const roleIdInDB = await strapi
    .query('role', 'users-permissions')
    .findOne({ name: roleName });

  return strapi.plugins['users-permissions'].services.userspermissions.updateRole(
    roleIdInDB,
    updateObj,
  );
};
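To see what reduceRight builds here: it folds the dotted permission path into a nested object, right to left, with { enabled, policy } as the innermost value:

// 'permissions.application.controllers.organization.update' becomes:
// { permissions: { application: { controllers: { organization: { update: { enabled: true, policy: '' } } } } } }
const updateObj = 'permissions.application.controllers.organization.update'
  .split('.')
  .reduceRight((obj, next) => ({ [next]: obj }), { enabled: true, policy: '' });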
Let me know if that helps.
So in order for that to work in v4, this is how I did it.
This was based on this and this, but Stf's posting in this, saying "inject them in your database during the bootstrap phase like it is made in the templates", was what really set me on the right track.
So if you look here you will see this function:
async function setPublicPermissions(newPermissions) {
  // Find the ID of the public role
  const publicRole = await strapi
    .query("plugin::users-permissions.role")
    .findOne({
      where: {
        type: "public",
      },
    });

  // Create the new permissions and link them to the public role
  const allPermissionsToCreate = [];
  Object.keys(newPermissions).map((controller) => {
    const actions = newPermissions[controller];
    const permissionsToCreate = actions.map((action) => {
      return strapi.query("plugin::users-permissions.permission").create({
        data: {
          action: `api::${controller}.${controller}.${action}`,
          role: publicRole.id,
        },
      });
    });
    allPermissionsToCreate.push(...permissionsToCreate);
  });
  await Promise.all(allPermissionsToCreate);
}
Later in the code, this function is called like this:
await setPublicPermissions({
  article: ["find", "findOne"],
  category: ["find", "findOne"],
  author: ["find", "findOne"],
  global: ["find", "findOne"],
  about: ["find", "findOne"],
});
So in my case I modified this function a bit to accept either the authenticated (1) or public (2) role, inspired by Sidney C's answer above.
This is how I did it:
const grantPrivilege = async (roleID = 1, newPermissions) => {
  const roleName = roleID === 1 ? "authenticated" : "public";

  // Find the ID of the requested role
  const roleEntry = await strapi
    .query("plugin::users-permissions.role")
    .findOne({
      where: {
        type: roleName,
      },
    });

  // Create the new permissions and link them to that role
  const allPermissionsToCreate = [];
  Object.keys(newPermissions).map((controller) => {
    const actions = newPermissions[controller];
    const permissionsToCreate = actions.map((action) => {
      return strapi.query("plugin::users-permissions.permission").create({
        data: {
          action: `api::${controller}.${controller}.${action}`,
          role: roleEntry.id,
        },
      });
    });
    allPermissionsToCreate.push(...permissionsToCreate);
  });
  await Promise.all(allPermissionsToCreate);
};
And then in my beforeAll block I call it like this:
await grantPrivilege(1, {
  "my-custom-collection": ["create", "update"],
  category: ["find", "findOne"],
  author: ["find", "findOne"],
});
I have written a test case that successfully loads files into a virtual FS and at the same time mounts a virtual volume, as below:
describe("should work", () => {
const { vol } = require("memfs");
afterEach(() => vol.reset());
beforeEach(() => {
vol.mkdirSync(process.cwd(), { recursive: true });
jest.resetModules();
jest.resetAllMocks();
});
it("should be able to mock fs that being called in actual code", async () => {
jest.mock("fs", () => {
return ufs //
.use(jest.requireActual("fs"))
.use(createFsFromVolume(vol) as any);
});
jest.mock("fs/promises", () => {
return ufs //
.use(jest.requireActual("fs/promises"))
.use(createFsFromVolume(vol) as any);
});
const { createFsFromVolume } = require("memfs");
const { ufs } = require("unionfs");
const { countFile } = require("../src/ops/fs");
vol.fromJSON(
{
"./some/README.md": "1",
"./some/index.js": "2",
"./destination": null,
},
"/app"
);
const result = ufs.readdirSync(process.cwd());
const result2 = ufs.readdirSync("/app");
const result3 = await countFile("/app");
console.log({ result, result2, result3 });
});
});
By using ufs.readdirSync, I can access the virtual FS: result does give me the files loaded from disk into the virtual FS, and result2 represents /app, the new volume created by vol.fromJSON.
Now my problem is that I am unable to get a result for result3, which calls the countFile method below:
import fsPromises from "fs/promises";

export const countFile = async (path: string) => {
  const result = await fsPromises.readdir(path);
  return result.length;
};
I'm getting the error
Error: ENOENT: no such file or directory, scandir '/app'
which I think is because countFile is accessing the actual FS instead of the virtual one, despite my jest.mock('fs/promises').
Can anyone provide a lead, please?
This is the function you want to unit test.
// CommonJS version
const fsPromises = require('fs/promises');

const countFile = async (path) => {
  const result = await fsPromises.readdir(path);
  return result.length;
};

module.exports = {
  countFile
}
Now, how you would normally go about this is to mock fsPromises, in this example specifically readdir(), since that is the function being used in countFile.
This is what we call: a stub.
A skeletal or special-purpose implementation of a software component, used to develop or test a component that calls or is otherwise dependent on it. It replaces a called component.
const { countFile } = require('./index');
const { readdir } = require("fs/promises");

jest.mock('fs/promises');

beforeEach(() => {
  readdir.mockReset();
});

it("When testing countFile, given string, then return files", async () => {
  const path = "/path/to/dir";

  // STUB HERE
  readdir.mockResolvedValueOnce(["src", "node_modules", "package-lock.json", "package.json"]);

  const res = await countFile(path);
  expect(res).toBe(4);
})
You do this because you're unit testing. You don't want to be dependent on other functions, because then it is no longer a unit test but an integration test. Secondly, it's a third-party library, which is maintained/tested by someone else.
Here is where your scenario applies. From my perspective, your objective isn't to test countFile(); rather, it is to test fsPromises, and maybe the functionality of reading virtual file systems: unionfs. If so, then fsPromises doesn't really need to be mocked.
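If that is the goal, here is a minimal sketch (assuming memfs's fs.promises export and the same countFile module path as in the question): point jest.mock at the in-memory implementation and assert against files seeded into the volume.

const { vol } = require("memfs");

// Route every fs/promises call in the code under test to the in-memory volume.
jest.mock("fs/promises", () => require("memfs").fs.promises);

const { countFile } = require("../src/ops/fs");

it("counts files in the virtual FS", async () => {
  vol.reset();
  vol.fromJSON({ "/app/README.md": "1", "/app/index.js": "2" });
  await expect(countFile("/app")).resolves.toBe(2);
});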
I am writing a test which tests a firebase trigger. The problem, however, is that I cannot make it work.
I want to use the local firestore emulator and Jest in order to simulate a change in the firestore and see if the trigger does what it needs to do.
I require the cloud function in my test and initialize my app.
Setup.js:
const firebase = require('@firebase/testing');

const PROJECT_ID = 'project';

let admin;
let db;

const setupAdmin = async () => {
  admin = firebase.initializeAdminApp({
    projectId: PROJECT_ID
  });
  db = admin.firestore();
};

const getAdmin = () => {
  return admin;
};

const getDb = () => {
  return db;
};

module.exports.setupAdmin = setupAdmin;
module.exports.getAdmin = getAdmin;
module.exports.getDb = getDb;
Test.js:
describe('Billing', () => {
  let dbRef;

  beforeAll(async () => {
    const {db, admin} = require('../../../functions/helpers/setup');
    dbRef = db;
  });

  afterAll(async () => {
    await Promise.all(firebase.apps().map(app => app.delete()));
    console.log(`View rule coverage information at ${COVERAGE_URL}\n`);
  });

  it('test', async () => {
    const mockData = {
      'Users/user1': {
        uid: 'user1'
      },
      ['Users/user1/Taxes/' + new Date().getFullYear().toString()]: {
        totalExpenseEuro: 0
      }
    };
    for (const key in mockData) {
      const ref = dbRef.doc(key);
      await ref.set(mockData[key]);
    }

    // Create mockup data
    await dbRef.collection('Users').doc('user1').collection('Expenses').doc('expense1').set({
      amountEuroInclVAT: 100
    });

    // Make snapshot for state of database beforehand
    const beforeSnap = test.firestore.makeDocumentSnapshot({amountEuroInclVAT: 0}, 'Users/user1/Expenses/expense1');
    // Make snapshot for state of database after the change
    const afterSnap = test.firestore.makeDocumentSnapshot(
      {amountEuroInclVAT: 100},
      'Users/user1/Expenses/expense1'
    );
    const change = test.makeChange(beforeSnap, afterSnap);

    // Call wrapped function with the Change object
    const wrapped = test.wrap(calculateTaxesOnExpenseUpdate);
    wrapped(change, {
      params: {
        uid: 'test1'
      }
    });
  });
});
Now the main problem comes when I try to access this db object in my trigger:
const calculateTaxesOnExpenseUpdate = functions.firestore
  .document('Users/{uid}/Expenses/{expenseId}')
  .onWrite(async (change, context) => {
    const {getDb} = require('../helpers/setup'); // This setup is the same as above
    let db = getDb();
    ...
For some reason, when I perform an action like (await db.collection('Users').get()).get('totalExpenseEuro'), Jest stops executing my code. When I set a debugger right after that line, it is never reached. That piece of code crashes, and I have no idea why. I think the DB instance is not properly configured in my cloud trigger function.
Question: What is a good way of sharing the DB instance (admin.firestore()) between the test and the cloud trigger functions?
I'm new to Node.js, and I've been working with a sample project by a third-party provider. I'm trying to use Azure Key Vault to store configuration values.
I'm having trouble getting a process to wait before executing the rest. I'll try to detail as much as I know.
The sample project has a file named agent.js, which is the start page/file. On line 16 (agent_config = require('./config/config.js')[process.env.LP_ACCOUNT][process.env.LP_USER]) it loads a config file with values. I'm trying to set these values using Key Vault. I've tried many combinations of calling functions, and even implementing async / await, but the value of agent_config always contains a [Promise] object rather than the data returned by Key Vault.
If I'm right, this is because Key Vault itself uses async / await too, and the config file returns before the Key Vault values are resolved.
How can Key Vault be added/implemented in a situation like this?
Here's what I've tried:
First, I updated agent.js to:
let agent_config = {};
try {
  agent_config = require('./config/config.js')['123']['accountName'];
} catch (ex) {
  log.warn(`[agent.js] Error loading config: ${ex}`)
}
console.log(agent_config);
Test 1
./config/config.js
const KeyVault = require('azure-keyvault');
const msRestAzure = require('ms-rest-azure');

const KEY_VAULT_URI = 'https://' + '{my vault}' + '.vault.azure.net/' || process.env['KEY_VAULT_URI'];

function getValue(secretName, secretVersion) {
  msRestAzure.loginWithAppServiceMSI({ resource: 'https://vault.azure.net' }).then((credentials) => {
    const client = new KeyVault.KeyVaultClient(credentials);
    client.getSecret(KEY_VAULT_URI, secretName, secretVersion).then(
      function (response) {
        return response.Value;
      });
  });
}

module.exports = {
  '123': {
    'accountName': {
      accountId: getValue('mySecretName', '')
    }
  }
};
Results
{ accountId: undefined }
Test 2
Made getValue an async function and wrapped the export in another function (tried without the wrapping and it didn't work either).
./config/config.js
const KeyVault = require('azure-keyvault');
const msRestAzure = require('ms-rest-azure');

const KEY_VAULT_URI = 'https://' + '{my vault}' + '.vault.azure.net/' || process.env['KEY_VAULT_URI'];

async function getValue(secretName, secretVersion) {
  msRestAzure.loginWithAppServiceMSI({ resource: 'https://vault.azure.net' }).then((credentials) => {
    const client = new KeyVault.KeyVaultClient(credentials);
    client.getSecret(KEY_VAULT_URI, secretName, secretVersion).then(
      function (response) {
        return response.Value;
      });
  });
}

async function config() {
  module.exports = {
    '123': {
      'accountName': {
        accountId: await getValue('mySecretName', '')
      }
    }
  };
}

config();
Results
{}
Test 3
Made getValue an async function and wrapped it in another function (tried without the wrapping and it didn't work either).
./config/config.js
const KeyVault = require('azure-keyvault');
const msRestAzure = require('ms-rest-azure');

const KEY_VAULT_URI = 'https://' + '{my vault}' + '.vault.azure.net/' || process.env['KEY_VAULT_URI'];

async function getValue(secretName, secretVersion) {
  return msRestAzure.loginWithAppServiceMSI({ resource: 'https://vault.azure.net' })
    .then((credentials) => {
      const client = new KeyVault.KeyVaultClient(credentials);
      return client.getSecret(KEY_VAULT_URI, secretName, secretVersion).then(
        function (response) {
          return response.Value;
        });
    });
}

module.exports = {
  '123': {
    'accountName': {
      accountId: getValue('mySecretName', '')
    }
  }
};
Results
{ accountId: { <pending> } }
Other
I've tried many other ways, like module.exports = async (value) => {...} (found through other questions/solutions), without success.
I'm starting to think I need to do some "waiting" in agent.js, but I haven't found good info on this.
Any help would be great!
One issue is that your getValue function is not returning anything, as your returns need to be explicit (and without the promise being returned, there's nothing to await on):
async function getValue(secretName, secretVersion) {
  return msRestAzure.loginWithAppServiceMSI({ resource: 'https://vault.azure.net' })
    .then((credentials) => {
      const client = new KeyVault.KeyVaultClient(credentials);
      return client.getSecret(KEY_VAULT_URI, secretName, secretVersion).then(
        function (response) {
          return response.Value;
        });
    });
}
You could also get away with fewer explicit returns by using arrow functions:
const getValue = async (secretName, secretVersion) =>
  msRestAzure.loginWithAppServiceMSI({ resource: 'https://vault.azure.net' })
    .then(credentials => {
      const client = new KeyVault.KeyVaultClient(credentials);
      return client.getSecret(KEY_VAULT_URI, secretName, secretVersion)
        .then(response => response.Value);
    });
Introducing the Azure Key Vault read, which is async, means your whole config read is async. There's nothing you can do to get around that, so the code that uses the config will need to handle it appropriately. You start by exporting an async function that returns the config:
async function getConfig() {
  return {
    '123': {
      'accountName': {
        accountId: await getValue('mySecretName', '')
      }
    }
  };
}

module.exports = getConfig;
In your agent code you call that function. This means your agent code will need to be wrapped in a function too, so maybe something like this:
const Bot = require('./bot/bot.js');
const getConfig = require('./config/config.js');

getConfig().then(agentConfig => {
  const agent = new Bot(agentConfig);

  agent.on(Bot.const.CONNECTED, data => {
    log.info(`[agent.js] CONNECTED ${JSON.stringify(data)}`);
  });
});
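Equivalently, if you prefer async/await over .then, the same wrapper can be written as an async IIFE (same Bot and log as above):

const Bot = require('./bot/bot.js');
const getConfig = require('./config/config.js');

(async () => {
  const agentConfig = await getConfig();
  const agent = new Bot(agentConfig);

  agent.on(Bot.const.CONNECTED, data => {
    log.info(`[agent.js] CONNECTED ${JSON.stringify(data)}`);
  });
})();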
The azure-keyvault package has been deprecated in favor of new packages that deal with Key Vault keys, secrets, and certificates separately. For your scenario, you can use the new @azure/keyvault-secrets package to talk to Key Vault and the new @azure/identity package to create the credential.
const { SecretClient } = require("@azure/keyvault-secrets");
const { DefaultAzureCredential } = require("@azure/identity");

async function getValue(secretName, secretVersion) {
  const credential = new DefaultAzureCredential();
  const client = new SecretClient(KEY_VAULT_URI, credential);
  const secret = await client.getSecret(secretName);
  return secret.value;
}
The DefaultAzureCredential assumes that you have set the environment variables below:
AZURE_TENANT_ID: The tenant ID in Azure Active Directory
AZURE_CLIENT_ID: The application (client) ID registered in the AAD tenant
AZURE_CLIENT_SECRET: The client secret for the registered application
To try other credentials, see the readme for @azure/identity.
If you are moving from the older azure-keyvault package, check out the migration guide to understand the major changes.
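Putting the two answers together, the new getValue can plug straight into the async getConfig export from the first answer; a sketch, reusing KEY_VAULT_URI, the '123' account key, and 'mySecretName' from the question:

const KEY_VAULT_URI = process.env['KEY_VAULT_URI'];

async function getConfig() {
  return {
    '123': {
      'accountName': {
        accountId: await getValue('mySecretName')
      }
    }
  };
}

module.exports = getConfig;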
In a previous project I mocked the mysql library with Sinon. I did this like so:
X.js:
const con = mysql.createPool(config.mysql);
...
Some other place in the project:
const rows = await con.query(query, inserts);
...
X.test.js:
const sinon = require('sinon');
const mockMysql = sinon.mock(require('mysql'));
...
mockMysql.expects('createPool').returns({
  query: () => {
    // Handles the query...
  },
  ...
It worked perfectly.
In another project I am trying to mock pg, again with Sinon.
pool.js:
const { Pool } = require('pg');
const config = require('@blabla/config');

const pool = new Pool(config.get('database'));

module.exports = pool;
Some other place in the project:
const con = await pool.connect();
const result = await con.query(...
Y.test.js:
???
I can't understand how to mock connect().query(). None of the following approaches work:
1:
const { Pool } = require('pg');
const config = require('@blabla/config');
const mockPool = sinon.mock(new Pool(config.get('database')));
...
mockPool.expects('connect').returns({
  query: () => {
    console.log('query here');
  },
});
1 results in no error but the real db connection is used.
2:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');

const pool = new Pool(config.get('database'));
pool.expects('connect').returns({
  query: () => {
    console.log('query here');
  },
});
2 => TypeError: Pool is not a constructor
3:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');

const pool = sinon.createStubInstance(Pool);
pool.connect.returns({
  query: () => {
    console.log('query here');
  },
});
3 => TypeError: The constructor should be a function.
Can anybody point me in the right direction with how to mock my PostgreSQL connection?
Example: I have postgres.js like this.
const { Pool } = require('pg');

const handler = {
  count: async (pgQuery) => {
    try {
      const pool = new Pool();
      const res = await pool.query(pgQuery);
      return { count: parseInt(res.rows[0].counter, 10) };
    } catch (error) {
      // Log/Throw error here.
    }
    return false;
  }
}

module.exports = handler;
The spec test I created in postgres.spec.js is like this:
const { expect } = require('chai');
const sinon = require('sinon');
const pgPool = require('pg-pool');
const handler = require('./postgres.js');

describe('Postgres', function () {
  it('should have method count that bla bla', async function () {
    // Create stub pgPool query.
    const postgreeStubQuery = sinon.stub(pgPool.prototype, 'query');
    postgreeStubQuery.onFirstCall().throws('XXX');
    postgreeStubQuery.onSecondCall().resolves({
      rows: [{ counter: 11 }],
    });

    // Catch case.
    const catcher = await handler.count('SELECT COUNT()..');
    expect(catcher).to.equal(false);
    expect(postgreeStubQuery.calledOnce).to.equal(true);

    // Correct case.
    const correct = await handler.count('SELECT COUNT()..');
    expect(correct).to.deep.equal({ count: 11 });
    expect(postgreeStubQuery.calledTwice).to.equal(true);

    // Restore stub.
    postgreeStubQuery.restore();
  });
});
To stub pool.query(), you need to stub the query method on the pg-pool prototype.
Hope this helps.
Since you need to mock the returned results of a query, I think the easiest solution is to abstract your database away from the code needing the query results. For example, say your query results return information about a person: create a person.js module with specific methods for interacting with the database.
Your other code needing the person information from the database won't know or care what type of database you use or how you connect to it; all it cares about is what methods are exposed by person.js when it requires it.
// person.js
const { Pool } = require('pg')
// do other database connection things here

const getPersonById = function (id) {
  // use your query here and return the results
}

module.exports = { getPersonById }
Now in your tests, you mock the person module, not the pg module. Imagine if you had twenty-odd tests that all had the mock MySQL pool set up and then you changed to pg: you'd have to change all of them, a nightmare. By abstracting your database connection type/setup, testing becomes much easier, because now you just need to stub/mock your person.js module.
const person = require('../person.js') // or whatever relative file path it's in
const sinon = require('sinon')
const { expect } = require('chai') // assertion style used below

describe('person.js', function () {
  it('is stubbed right now', function () {
    const personStub = sinon.stub(person)
    personStub.getPersonById.returns('yup')
    expect(personStub.getPersonById()).to.eq('yup')
  })
})
Below is a simpler approach that means the system under test doesn't need any special tricks.
It consists of two parts, though the first is a nice-to-have:
Use a DI framework to inject the pg.Pool. This is a better approach IMO anyway, and fits really well with testing.
In the beforeEach() of the tests, configure the DI framework to use a mock class with sinon.stub instances.
If you aren't using a DI framework, pass the mock as a Pool parameter... but DI is better ;)
The code below is TypeScript using tsyringe, but similar approaches will work fine with plain JavaScript etc.
Somewhere you'll have code that uses pg.Pool. A contrived example:
import { Pool } from 'pg'
...

function getPets(pool: Pool): Promise<Pet[]> {
  return pool.connect()
    .then(db => db.query(SQL_HERE)
      .then(result => {
        db.release()
        return result.rows // or result.rows.map(something) etc
      })
      .catch(error => {
        db.release()
        throw error
      })
    )
}
That works, and it's fine if you want to pass the Pool instance in. I'd prefer not to, so I use tsyringe like this:
import { container } from 'tsyringe'
...

function getPets(): Promise<Pet[]> {
  return container.resolve(Pool).connect()
    .then(...)
}
Exactly the same outcome, but getPets() is cleaner to call - it can be a pain to lug around a Pool instance.
The main of the program would set up an instance in one of a few ways. Here's mine:
...
container.register(Pool, {
  useFactory: instanceCachingFactory(() => {
    return new Pool(/* any config here */)
  })
})
The beauty of this comes out in tests.
The code above (the "system under test") needs a Pool instance, and that instance needs a connect() method that resolves to an object with query() and release() methods.
This is what I used:
class MockPool {
  client = {
    query: sinon.stub(),
    release: sinon.stub()
  }

  connect () {
    return Promise.resolve(this.client)
  }
}
Here's the setup of a test using MockPool:
describe('proof', () => {
  let mockPool: MockPool

  beforeEach(() => {
    // Important! See:
    // https://github.com/microsoft/tsyringe#clearing-instances
    container.clearInstances()

    mockPool = new MockPool()
    container.registerInstance(Pool, mockPool as unknown as Pool)
  })
})
The cast through unknown to Pool is needed because I'm not implementing the whole Pool API, just what I need.
Here's what a test looks like:
it('mocks postgres', async () => {
  mockPool.client.query.resolves({
    rows: [
      {name: 'Woof', kind: 'Dog'},
      {name: 'Meow', kind: 'Cat'}
    ]
  })

  const r = await getPets()

  expect(r).to.deep.equal([
    {name: 'Woof', kind: 'Dog'},
    {name: 'Meow', kind: 'Cat'}
  ])
})
You can easily control what data the mock Postgres Pool returns, or throw errors, etc.
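For instance, a sketch of an error-path test using the same MockPool, asserting both the rethrow and that release() still runs via the .catch inside getPets:

it('propagates query errors', async () => {
  const failure = new Error('boom')
  mockPool.client.query.rejects(failure)

  let caught: unknown
  try {
    await getPets()
  } catch (err) {
    caught = err
  }

  expect(caught).to.equal(failure)
  // release() still runs thanks to the .catch inside getPets
  expect(mockPool.client.release.calledOnce).to.equal(true)
})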