How can I connect twice to db in Cypress tests?

I need to connect to my db (Postgres) twice during an automated test - at the beginning to truncate a table and at the end to select a new record from it.
I tried to do it using pg-promise, but the second connection returns null or undefined (the following error shows up after asserting expected against actual results - "Target cannot be null or undefined.").
If I skip the first connection (the truncate), the second one runs normally and returns the new record from the table.
Also, if I run the TRUNCATE after the SELECT, I get two different results depending on whether the table is empty or not.
If it is empty, I get the same error. But if there is some record initially, everything works and the test finishes without error.
This is how I connect to and disconnect from the db:
const pgp = require('pg-promise')();
const postgresConfig = require(require('path').resolve('cypress.json'));

function dbConnection(query, userDefineConnection) {
  const db = pgp(userDefineConnection || postgresConfig.db);
  return db.any(query).finally(db.$pool.end)
  // return db.any(query).finally(pgp.end)
}

/** @type {Cypress.PluginConfig} */
module.exports = (on, config) => {
  on("task", {
    dbQuery: (query) => dbConnection(query.query, query.connection)
  });
}
And this is how I query the db in the test:
describe('example to-do app', () => {
  beforeEach(() => {
    cy.task('dbQuery', {'query': 'TRUNCATE TABLE auto_db.public.requests RESTART IDENTITY CASCADE'})
  })

  it('tablist displays class', () => {
    cy.contains('button > span', 'Save').click()
    cy.task('dbQuery', {'query': 'SELECT * FROM auto_db.public.requests'})
      .then(queryResponse => {
        expect(queryResponse[0]).to.deep.contain({
          id: 1,
          author_id: 4,
          type_id: 1,
        })
      })
  })
})

If you look at the implementation of cypress-postgres, it's similar, but they break up the response/return to fit the call to pgp.end() in between.
Since it sounds like the 1st connection isn't closing, I'd suspect the .finally() call isn't working:
const pgp = require('pg-promise')();
const postgresConfig = require(require('path').resolve('cypress.json'));

function dbConnection(query, userDefineConnection) {
  const db = pgp(userDefineConnection || postgresConfig.db);
  let response = db.any(query)
  pgp.end()
  return response
}

/** @type {Cypress.PluginConfig} */
module.exports = (on, config) => {
  on("task", {
    dbQuery: (query) => dbConnection(query.query, query.connection)
  });
}
You should be able to get rid of the postgresConfig line, since the config parameter in (on, config) carries the same data (in fact it's better, because you might want to override some config on the command line).
const pgp = require('pg-promise')();

/** @type {Cypress.PluginConfig} */
module.exports = (on, config) => {
  function dbConnection(query, userDefineConnection) {
    const db = pgp(userDefineConnection || config.db);
    let response = db.any(query)
    pgp.end()
    return response
  }

  on("task", {
    dbQuery: (query) => dbConnection(query.query, query.connection)
  });
}

The problem was that the new record was not created in the database right away, and Cypress was querying the db before the insert had completed.
Adding a wait before the db check helped.
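For reference, a minimal sketch of that fix in the test - the 1000 ms wait is an assumed value (anything long enough for the insert to commit), and polling/retrying the cy.task would be more robust than a fixed wait:

it('tablist displays class', () => {
  cy.contains('button > span', 'Save').click()
  // give the backend time to commit the new row before querying
  cy.wait(1000)
  cy.task('dbQuery', {'query': 'SELECT * FROM auto_db.public.requests'})
    .then(queryResponse => {
      expect(queryResponse[0]).to.deep.contain({
        id: 1,
        author_id: 4,
        type_id: 1,
      })
    })
})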

Related

Context is undefined in translation module

I tried to add a call to an endpoint in order to get translations. I have something like this:
const loadLocales = async () => {
  const context = require.context('./locales', true);
  const data = await ApiService.post(`${translationToolUrl}/gateway/translations`, { project: 'myProject' });
  const messages = context.keys()
    .map((key) => ({ key, locale: key.match(/[-a-z0-9_]+/i)[0] }))
    .reduce((msgs, { key, locale }) => ({
      ...msgs,
      [locale]: extendMessages(context(key)),
    }), {});
  return { context, messages };
};

const { context, messages } = loadLocales();

i18n = new VueI18n({
  locale: 'en',
  fallbackLocale: 'en',
  silentFallbackWarn: true,
  messages,
});

if (module.hot) {
  module.hot.accept(context.id, () => {
    const { messages: newMessages } = loadLocales();
    Object.keys(newMessages)
      .filter((locale) => messages[locale] !== extendMessages(newMessages[locale]))
      .forEach((locale) => {
        const msgs = extendMessages(newMessages[locale]);
        messages[locale] = msgs;
        i18n.setLocaleMessage(locale, msgs);
      });
  });
}
I added this request: ApiService.post. But I get the error TypeError: context is undefined, thrown at the line module.hot.accept(context.id.... Do you have an idea how I can solve that? My goal was to add this request in order to get translations from the database as well as from .json files for now. I want to merge both for now; in the future I will get them only from the database, but this will be done step by step.
The problem is that you are destructuring a value that does not have the properties you expect. This shows in:
const { context, messages } = loadLocales();
loadLocales is an async function, so it returns a Promise rather than the { context, messages } object itself, and destructuring the Promise leaves context and messages undefined. This won't give an error, as I replicated in a small example:
const { first, second } = 'Testing';
console.log(first);
console.log(second);
Both first and second will be undefined, because a string has no first or second property. To get the actual values, you need to await the Promise (from within an async context):
const { context, messages } = await loadLocales();

TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received an instance of Object

I am using the source code from a security rules tutorial to attempt integration testing with Jest for my JavaScript async function async_create_post, used by my Firebase HTTP function create_post. The files involved have the following directory structure:
Testing file: root/tests/handlers/posts.test.js
File to be tested: root/functions/handlers/posts.js
Helper code from the tutorial: root/tests/rules/helpers.js
And here is the source code that is involved:
posts.test.js
const { setup, teardown } = require("../rules/helpers");
const {
  async_get_all_undeleted_posts,
  async_get_post,
  async_delete_post,
  async_create_post
} = require("../../functions/handlers/posts");

describe("Post Creation", () => {
  afterEach(async () => {
    await teardown();
  });

  test("should create a post", async () => {
    const db = await setup();
    const malloryUID = "non-existent uid";
    const firstPost = {
      body: "First post from Mallory",
      author_id: malloryUID,
      images: ["url1", "url2"]
    }
    const before_post_snapshot = await db.collection("posts").get();
    expect(before_post_snapshot.docs.length).toBe(0);
    await async_create_post(firstPost); // fails at this point; expected to create a new post, but instead threw an error
    const after_post_snapshot = await db.collection("posts").get();
    expect(after_post_snapshot.docs.length).toBe(1);
  });
});
posts.js
const { admin, db } = require('../util/admin');
//admin.initializeApp(config); //my credentials
//const db = admin.firestore();
const { uuid } = require("uuidv4");
const {
  success_response,
  error_response
} = require("../util/validators");

exports.async_create_post = async (data, context) => {
  try {
    const images = [];
    data.images.forEach((url) => {
      images.push({
        uid: uuid(),
        url: url
      });
    })
    const postRecord = {
      body: data.body,
      images: images,
      last_updated: admin.firestore.FieldValue.serverTimestamp(),
      like_count: 0,
      comment_count: 0,
      deleted: false,
      author_id: data.author_id
    };
    const generatedToken = uuid();
    await db
      .collection("posts")
      .doc(generatedToken)
      .set(postRecord);
    // return success_response();
    return success_response(generatedToken);
  } catch (error) {
    console.log("Error in creation of post", error);
    return error_response(error);
  }
}
When I run the test in the WebStorm IDE, with one terminal running firebase emulators:start, I get the following error message.
console.log
  Error in creation of post TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received an instance of Object
      at validateString (internal/validators.js:120:11)
      at Object.basename (path.js:1156:5)
      at GrpcClient.loadProto (/Users/isaac/Desktop/project/functions/node_modules/google-gax/src/grpc.ts:166:23)
      at new FirestoreClient (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/v1/firestore_client.js:118:38)
      at ClientPool.clientFactory (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/index.js:330:26)
      at ClientPool.acquire (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/pool.js:87:35)
      at ClientPool.run (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/pool.js:164:29)
      at Firestore.request (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/index.js:961:33)
      at WriteBatch.commit_ (/Users/isaac/Desktop/project/functions/node_modules/@google-cloud/firestore/build/src/write-batch.js:485:48)
      at exports.async_create_post (/Users/isaac/Desktop/project/functions/handlers/posts.js:36:5) {
    code: 'ERR_INVALID_ARG_TYPE'
  }
      at exports.async_create_post (/Users/isaac/Desktop/project/functions/handlers/posts.js:44:13)
Error: expect(received).toBe(expected) // Object.is equality
Expected: 1
Received: 0
at Object.<anonymous> (/Users/isaac/Desktop/project/tests/handlers/posts.test.js:59:45)
"Error in creation of post" comes from the console.log("Error in creation of post", error); in posts.js, so the error is the one shown in the title of this post.
I want to know why calling async_create_post from posts.test.js causes this error and does not populate my database with an additional record, which is the expected behaviour. Do inform me if more information is required to solve the problem.
Here are some code snippets that may give more context.
helpers.js [Copied from the repository]
const firebase = require("@firebase/testing");
const fs = require("fs");

module.exports.setup = async (auth, data) => {
  const projectId = `rules-spec-${Date.now()}`;
  const app = firebase.initializeTestApp({
    projectId,
    auth
  });
  const db = app.firestore();

  // Apply the test rules so we can write documents
  await firebase.loadFirestoreRules({
    projectId,
    rules: fs.readFileSync("firestore-test.rules", "utf8")
  });

  // write mock documents if any
  if (data) {
    for (const key in data) {
      const ref = db.doc(key); // This means the key should point directly to a document
      await ref.set(data[key]);
    }
  }

  // Apply the actual rules for the project
  await firebase.loadFirestoreRules({
    projectId,
    rules: fs.readFileSync("firestore.rules", "utf8")
  });

  return db;
  // return firebase;
};

module.exports.teardown = async () => {
  // Delete all apps currently running in the firebase simulated environment
  Promise.all(firebase.apps().map(app => app.delete()));
};

// Add extensions onto the expect method
expect.extend({
  async toAllow(testPromise) {
    let pass = false;
    try {
      await firebase.assertSucceeds(testPromise);
      pass = true;
    } catch (error) {
      // log error to see which rules caused the test to fail
      console.log(error);
    }
    return {
      pass,
      message: () =>
        "Expected Firebase operation to be allowed, but it was denied"
    };
  }
});

expect.extend({
  async toDeny(testPromise) {
    let pass = false;
    try {
      await firebase.assertFails(testPromise);
      pass = true;
    } catch (error) {
      // log error to see which rules caused the test to fail
      console.log(error);
    }
    return {
      pass,
      message: () =>
        "Expected Firebase operation to be denied, but it was allowed"
    };
  }
});
index.js
const functions = require('firebase-functions');
const {
  async_get_all_undeleted_posts,
  async_get_post,
  async_delete_post,
  async_create_post
} = require('./handlers/posts');

exports.create_post = functions.https.onCall(async_create_post);
The error message means that a method of the path module (here path.basename, per the stack trace) expected one of its arguments to be a string but got something else.
I found the offending line by binary search, commenting out parts of the program until the error was gone.
Maybe one of your modules uses path and you are supplying the wrong arguments.
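For illustration, a minimal Node snippet that reproduces this class of error - the object literal here is just a hypothetical stand-in for whatever non-string value ends up being passed:

const path = require('path');

console.log(path.basename('/tmp/file.txt')); // 'file.txt'

// Passing anything other than a string throws the same error:
// TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received an instance of Object
path.basename({ dir: '/tmp' });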

How to use dataloader?

I'm trying to figure this out.
I want to get all my users from my database, cache them, and then, when making a new request, get those I've cached plus any new ones that have been created.
So far:
const batchUsers = async ({ user }) => {
  const users = await user.findAll({});
  return users;
};

const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: {
    userLoader: new DataLoader(() => batchUsers(db)), // not sending keys since I'm after all users
  },
});
my resolver:
users: async (obj, args, context, info) => {
  return context.userLoader.load();
}
The load method requires a parameter, but in this case I don't want a specific user, I want all of them.
I don't understand how to implement this; can someone please explain?
If you're trying to just load all records, then there's not much of a point in utilizing DataLoader to begin with. The purpose behind DataLoader is to batch multiple calls like load(7) and load(22) into a single call that's then executed against your data source. If you need to get all users, you should just call user.findAll directly.
Also, if you do end up using DataLoader, make sure you pass in a function, not an object, as your context. The function will be run on each request, which will ensure you're using a fresh instance of DataLoader instead of one with a stale cache:
context: () => ({
  userLoader: new DataLoader(async (ids) => {
    const users = await User.findAll({
      where: { id: ids }
    })
    // Note that we need to map over the original ids instead of
    // just returning the results of User.findAll because the
    // length of the returned array needs to match the length of the ids
    return ids.map(id => users.find(user => user.id === id) || null)
  }),
}),
Note that you could also return an instance of an error instead of null inside the array if you want load to reject.
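For example, a minimal sketch of that variant, replacing the || null fallback in the batch function above with an Error instance so the corresponding load() call rejects:

return ids.map(id =>
  users.find(user => user.id === id) ||
  // returning an Error in the batch array makes this key's load() reject
  new Error(`No user found for id ${id}`)
)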
Took me a while but I got this working:
const batchUsers = async (keys, { user }) => {
  const users = await user.findAll({
    raw: true,
    where: {
      Id: {
        // @ts-ignore
        // eslint-disable-next-line no-undef
        [op.in]: keys,
      },
    },
  });
  const gs = _.groupBy(users, 'Id');
  return keys.map(k => gs[k] || []);
};

const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: () => ({
    userLoader: new DataLoader(keys => batchUsers(keys, db)),
  }),
});
resolver:
user: {
  myUsers: ({ Id }, args, { userLoader }) => {
    return userLoader.load(Id);
  },
},
playground:
{
  users {
    Id
    myUsers {
      Id
    }
  }
}
playground explained:
users fetches all users, and then myUsers does the same thing by inheriting the Id from the first call.
I think I chose a poor example here, since I did not see any performance gains from this. I did see, however, that the query turned into:
SELECT ... FROM User WHERE Id IN (...)

Trying to run multiple pg pool query functions in one command using 'make-runnable'

I'm learning how to use Postgres and having difficulty calling multiple asynchronous pool functions in a linear manner.
I want to drop all my tables, create all those tables, and seed all those tables in one command in PowerShell. I'm using the npm module make-runnable for this. The functions run fine in isolation, but typing them in one at a time each time I want to try something new is a pain.
I reviewed how the async syntax works, and I've used it successfully in the past. I looked up how pool works, but I just get a lot of explanations of its syntax.
My three functions all follow this basic structure and use the same pool.query() call:
const createTables = () => {
  const taskTableText =
    `CREATE TABLE IF NOT EXISTS
      acts(
        id UUID DEFAULT uuid_generate_v1 (),
        name VARCHAR(128) NOT NULL,
        length INTERVAL NOT NULL,
        percent_complete INT NOT NULL,
        start_stamp TIMESTAMPTZ NOT NULL,
        PRIMARY KEY (id)
      )
    `;
  pool.query(taskTableText)
    .then((res) => {
      console.log(res);
      pool.end();
    })
    .catch((err) => {
      console.log(err);
      pool.end();
    });
}
This works well in PowerShell, but when I try to do the three together like this:
const makeFresh = async function() {
  const stepOne = await dropTables();
  const stepTwo = await createTables();
  const stepThree = await seedTables();
}
only one gets called (or possibly they all try to fire at once, since they are not running one at a time?), seemingly at random, since the shell's output can be different each time:
--------make-runnable-output--------
undefined
------------------------------------
connected to db
connected to db
connected to db
Result {
  command: 'DROP',
  rowCount: null,
  oid: null,
  rows: [],
  fields: [],
  _parsers: [],
  RowCtor: null,
  rowAsArray: false,
  _getTypeParser: [Function: bound ] }
client removed
I'm sure there's a simple answer to this, I feel bad for asking but I don't want to burn another hour banging my head against the wall.
Solved this problem today. Since each function closes the pg pool, the subsequent calls couldn't perform their work. I made each one close the pool by default, so they can continue to be called in isolation, but if a truthy value is passed in, they allow the pool to remain open so it can be used by other functions.
My new create tables example looks like this:
const createTables = async (isKeepingPoolOpen = false) => {
  const taskTableText =
    `CREATE TABLE IF NOT EXISTS
      acts(
        id UUID DEFAULT uuid_generate_v1 (),
        name VARCHAR(128) NOT NULL,
        length INTERVAL NOT NULL,
        percent_complete INT NOT NULL,
        start_stamp TIMESTAMPTZ NOT NULL,
        PRIMARY KEY (id)
      )
    `;
  return pool.query(taskTableText)
    .then((res) => {
      console.log(res);
      if (!isKeepingPoolOpen) pool.end();
    })
    .catch((err) => {
      console.log(err);
      if (!isKeepingPoolOpen) pool.end();
    });
}
My new 'call them all' function now looks like this:
const makeFresh = function() {
  const isKeepingPoolOpen = true;
  dropTables(isKeepingPoolOpen)
    .then(() => createTables(isKeepingPoolOpen))
    .then(() => seedTables(isKeepingPoolOpen))
    .then(() => {
      pool.end();
    })
    .catch((err) => {
      console.log("error: " + err);
      pool.end();
    });
}
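For comparison, here is a sketch of the same sequencing written with async/await instead of a .then chain. It assumes the same pool and the three functions above, and uses finally so the pool is closed even if a step fails:

const makeFresh = async function () {
  const isKeepingPoolOpen = true;
  try {
    await dropTables(isKeepingPoolOpen);
    await createTables(isKeepingPoolOpen);
    await seedTables(isKeepingPoolOpen);
  } catch (err) {
    console.log('error: ' + err);
  } finally {
    pool.end();
  }
}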

How to mock pg Pool with Sinon

In a previous project I mocked the mysql library with Sinon. I did this like so:
X.js:
const con = mysql.createPool(config.mysql);
...
Some other place in the project:
const rows = await con.query(query, inserts);
...
X.test.js:
const sinon = require('sinon');
const mockMysql = sinon.mock(require('mysql'));
...
mockMysql.expects('createPool').returns({
  query: () => {
    // Handles the query...
  },
  ...
It worked perfectly.
In another project I am trying to mock pg, again with Sinon.
pool.js:
const { Pool } = require('pg');
const config = require('@blabla/config');

const pool = new Pool(config.get('database'));

module.exports = pool;
Some other place in the project:
const con = await pool.connect();
const result = await con.query(...
Y.test.js:
???
I can't understand how to mock connect().query(). None of the following approaches work:
1:
const { Pool } = require('pg');
const config = require('@blabla/config');
const mockPool = sinon.mock(new Pool(config.get('database')));
...
mockPool.expects('connect').returns({
  query: () => {
    console.log('query here');
  },
});
1 results in no error but the real db connection is used.
2:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');
const pool = new Pool(config.get('database'));
pool.expects('connect').returns({
  query: () => {
    console.log('query here');
  },
});
2 => TypeError: Pool is not a constructor
3:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');
const pool = sinon.createStubInstance(Pool);
pool.connect.returns({
  query: () => {
    console.log('query here');
  },
});
3 => TypeError: The constructor should be a function.
Can anybody point me in the right direction with how to mock my PostgreSQL connection?
Example: I have postgres.js like this.
const { Pool } = require('pg');

const handler = {
  count: async (pgQuery) => {
    try {
      const pool = new Pool();
      const res = await pool.query(pgQuery);
      return { count: parseInt(res.rows[0].counter, 10) };
    } catch (error) {
      // Log/Throw error here.
    }
    return false;
  }
}

module.exports = handler;
The spec test I created in postgres.spec.js is like this.
const { expect } = require('chai');
const sinon = require('sinon');
const pgPool = require('pg-pool');
const handler = require('./postgres.js');

describe('Postgres', function () {
  it('should have method count that bla bla', async function () {
    // Create stub pgPool query.
    const postgreeStubQuery = sinon.stub(pgPool.prototype, 'query');
    postgreeStubQuery.onFirstCall().throws('XXX');
    postgreeStubQuery.onSecondCall().resolves({
      rows: [{ counter: 11 }],
    });

    // Catch case.
    const catcher = await handler.count('SELECT COUNT()..');
    expect(catcher).to.equal(false);
    expect(postgreeStubQuery.calledOnce).to.equal(true);

    // Correct case.
    const correct = await handler.count('SELECT COUNT()..');
    expect(correct).to.deep.equal({ count: 11 });
    expect(postgreeStubQuery.calledTwice).to.equal(true);

    // Restore stub.
    postgreeStubQuery.restore();
  });
});
To stub pool.query(), you need to stub the pg-pool prototype's query method.
Hope this helps.
Since you need to mock the returned results of a query, I think the easiest solution would be to abstract your database away from the code needing the query results. For example, say your query results return information about a person: create a person.js module with specific methods for interacting with the database.
Your other code needing the person information from the database won't know or care what type of database you use or how you connect to it; all it cares to know is what methods are exposed by person.js when it requires it.
//person.js
const { Pool } = require('pg')
// do other database connection things here

const getPersonById = function (id) {
  // use your query here and return the results
}

module.exports = { getPersonById }
Now in your tests, you mock the person module, not the pg module. Imagine if you had some twenty-odd tests that all had the mock MySQL pool set up and you then changed to pg: you'd have to change all of those, a nightmare. By abstracting your database connection type/setup, you make testing much easier, because now you just need to stub/mock your person.js module.
const person = require('../person.js') // or whatever relative file path it's in
const sinon = require('sinon')
const { expect } = require('chai')

describe('person.js', function () {
  it('is stubbed right now', function () {
    const personStub = sinon.stub(person)
    personStub.getPersonById.returns('yup')
    expect(personStub.getPersonById()).to.eq('yup')
  })
})
Below is a simpler approach that means the system under test doesn't need any special tricks.
It comprises two parts, though the first is a "nice to have":
Use a DI framework to inject the pg.Pool. This is a better approach IMO anyway, and it fits really well with testing.
In the beforeEach() of the tests, configure the DI framework to use a mock class with sinon.stub instances.
If you aren't using a DI framework, pass the mock as a Pool parameter... but DI is better ;)
The code below is TypeScript using tsyringe, but similar approaches will work fine with plain JavaScript etc.
Somewhere you'll have code that uses pg.Pool. A contrived example:
import { Pool } from 'pg'
...
function getPets(pool: Pool): Promise<Pet[]> {
  return pool.connect()
    .then(db => db.query(SQL_HERE)
      .then(result => {
        db.release()
        return result.rows // or result.rows.map(something) etc
      })
      .catch(error => {
        db.release()
        throw error
      })
    )
}
That works, and it's fine if you want to pass the Pool instance in. I'd prefer not to, so I use tsyringe like this:
import { container } from 'tsyringe'
...
function getPets(): Promise<Pet[]> {
  return container.resolve(Pool).connect()
    .then(...)
}
Exactly the same outcome, but getPets() is cleaner to call - it can be a pain to lug around a Pool instance.
The main module of the program would set up an instance in one of a few ways. Here's mine:
...
container.register(Pool, {
  useFactory: instanceCachingFactory(() => {
    return new Pool(/* any config here */)
  })
})
The beauty of this comes out in tests.
The code above (the "system under test") needs a Pool instance, and that instance needs a connect() method that resolves to a class with query() and release() methods.
This is what I used:
class MockPool {
  client = {
    query: sinon.stub(),
    release: sinon.stub()
  }

  connect () {
    return Promise.resolve(this.client)
  }
}
Here's the setup of a test using MockPool:
describe('proof', () => {
  let mockPool: MockPool

  beforeEach(() => {
    // Important! See:
    // https://github.com/microsoft/tsyringe#clearing-instances
    container.clearInstances()

    mockPool = new MockPool()
    container.registerInstance(Pool, mockPool as unknown as Pool)
  })
})
The cast through unknown to Pool is needed because I'm not implementing the whole Pool API, just what I need.
Here's what a test looks like:
it('mocks postgres', async () => {
  mockPool.client.query.resolves({
    rows: [
      {name: 'Woof', kind: 'Dog'},
      {name: 'Meow', kind: 'Cat'}
    ]
  })

  const r = await getPets()

  expect(r).to.deep.equal([
    {name: 'Woof', kind: 'Dog'},
    {name: 'Meow', kind: 'Cat'}
  ])
})
You can easily control what data the mock Postgres Pool returns, or throw errors, etc.
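In the same spirit, a sketch of an error-path test against the MockPool above - the 'boom' message is just an example value; the test also checks that getPets() released the client on failure:

it('propagates query errors', async () => {
  mockPool.client.query.rejects(new Error('boom'))

  let caught
  try {
    await getPets()
  } catch (e) {
    caught = e
  }

  expect(caught.message).to.equal('boom')
  // the catch branch in getPets() releases the client before rethrowing
  expect(mockPool.client.release.called).to.equal(true)
})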
