I created a log module for my project, and currently I create a new instance of it in every module so that each one can log to the CLI with the right syntax, color configuration, etc.
For example (a simplified example):
// index.js
const { Log, Ansi } = require("./class/log.js");
const Tool = require("./class/tool.js");
const argv = require("yargs").argv;

let log = new Log({
    levelIcon: true,
    powerlineRoot: {
        name: "root",
        backgroundColor: Ansi.BLACK_BRIGHT,
        text: "myappName"
    }
});

let tool = new Tool(argv.toolName, argv.envName);

tool.run().then(() => {
    log.print("Tool is running", "info");
}).catch((err) => {
    log.print(err, "critical");
});
// tool.js
const { Log, Ansi } = require("./log.js");

class Tool {
    constructor() {
        this.log = new Log({
            levelIcon: true,
            powerlineRoot: {
                name: "root",
                backgroundColor: Ansi.BLACK_BRIGHT,
                text: "myappName"
            }
        });
    }

    run() {
        return new Promise((resolve, reject) => {
            resolve();
        });
    }
}

module.exports = Tool;
I am wondering if there is a way to create only one instance in my index.js and share it with the instances of modules like Tool. I don't know if it's possible, but I think sharing one Log instance would consume less memory than creating multiple ones.
I hope my question is clear enough. Feel free to ask me for more information if needed.
Yes, you can absolutely do that. Since the entry point is index.js, the whole program runs in a single process and thread, and Node.js caches each module after its first require, so a module that exports an instance is shared by everyone who requires it. You can create one more module, logger.js, like:
const { Log, Ansi } = require("./class/log.js");

const logger = new Log({
    levelIcon: true,
    powerlineRoot: {
        name: "root",
        backgroundColor: Ansi.BLACK_BRIGHT,
        text: "myappName"
    }
});

module.exports = logger;
Now you can just import logger and use it like:
const logger = require("./logger")
logger.print("hello world!");
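With that in place, Tool no longer needs its own Log instance. A minimal sketch of tool.js reusing the shared module (the print call mirrors the Log API from the question; adjust the require path to wherever logger.js lives):

// tool.js
const logger = require("./logger.js");

class Tool {
    run() {
        return new Promise((resolve, reject) => {
            // same instance as index.js, thanks to Node's module cache
            logger.print("Tool starting", "info");
            resolve();
        });
    }
}

module.exports = Tool;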
Like #jjsingh says, I use a global variable to store the object.
Not sure it's the best way, but for the moment it resolves my issue.
global.log = new Log({
    levelIcon: true,
    powerlineRoot: {
        name: "root",
        color: {
            background: Ansi.BLUE_SEA,
            foreground: Ansi.RED,
        },
        text: global.conf.get("infos").name
    }
});
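With this in place, any module loaded after index.js can use the shared instance directly, without requiring anything (again using the print API from the question):

// anywhere else in the app, once index.js has set global.log
global.log.print("hello from another module", "info");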
I am trying to use pino for logging in my Node app server, and I have some large logs coming in, so rotating the files every day would be more practical.
So I used pino.multistream() and require('file-stream-rotator').
My code works, but for performance reasons I would not like to run the streams in the main thread.
According to the docs, I should use pino.transport():
[pino.multistream()] differs from pino.transport() as all the streams will be executed within the main thread, i.e. the one that created the pino instance.
https://github.com/pinojs/pino/releases?page=2
However, I can't manage to combine pino.transport() and file-stream-rotator.
My code below does not work completely: it logs the first entries, but the logger is not exportable because the script crashes with the error
throw new Error('the worker has exited')
Main file
const pino = require('pino')

const transport = pino.transport({
    target: './custom-transport.js'
})

const logger = pino(transport)
logger.level = 'info'
logger.info('Pino: Start Service Logging...')

module.exports = {
    logger
}
custom-transport.js file
const { once } = require('events')
const fileStreamRotator = require('file-stream-rotator')

const customTransport = async () => {
    const stream = fileStreamRotator.getStream({ filename: 'myfolder/custom-logger.log', frequency: 'daily' })
    await once(stream, 'open')
    return stream
}

module.exports = customTransport
First and foremost, I'm very new to this. I've been following the tutorials on the Discord.js site, with the goal of making a Discord bot for the play-by-post DnD server I'm in, where everyone wants to gain experience via word count.
I mention I'm new to this because this is my first hands-on experience with JavaScript, and a lot of the terminology goes over my head.
The problem seems to be where I've broken away from the tutorial. It covers command handlers, which I want to stick with because it seems to be good practice and easier to work with down the line when something breaks (and I know it will). But the tutorial for databases (the currency system with Sequelize) doesn't really touch on command handlers beyond "maintain references".
But that's enough foreword; the problem is in trying to get a command that checks the database for a player's current experience points and level.
I have the seemingly relevant files organized with index.js and dbObjects.js together, a models folder for Users and LevelUp (CurrencyShop in the tutorial), and a separate folder for commands, like the problematic one, xpcheck.js.
I can get the command to function without breaking, using the following:
const { Client, Collection, Formatters, Intents } = require('discord.js');
const { SlashCommandBuilder } = require('@discordjs/builders');

const experience = new Collection();
const level = new Collection();

Reflect.defineProperty(experience, 'getBalance', {
    /* eslint-disable-next-line func-name-matching */
    value: function getBalance(id) {
        const user = experience.get(id);
        return user ? user.balance : 0;
    },
});

Reflect.defineProperty(level, 'getBalance', {
    /* eslint-disable-next-line func-name-matching */
    value: function getBalance(id) {
        const user = level.get(id);
        return user ? user.balance : 1;
    },
});

module.exports = {
    data: new SlashCommandBuilder()
        .setName('xpcheck')
        .setDescription('Your current Experience and Level'),
    async execute(interaction) {
        const target = interaction.options.getUser('user') ?? interaction.user;
        return interaction.reply(`${target.tag} is level ${level.getBalance(target.id)} and has ${experience.getBalance(target.id)} experience.`);
    },
};
The problem is that the command doesn't reference the database, so it returns the default values (1st level, 0 exp) every time.
I tried getting the command to reference the database; one of many attempts was this one:
const { Client, Collection, Formatters, Intents } = require('discord.js');
const { SlashCommandBuilder } = require('@discordjs/builders');
const Sequelize = require('sequelize');
const { Users, LevelUp } = require('./DiscordBot/dbObjects.js');

module.exports = {
    data: new SlashCommandBuilder()
        .setName('xpcheck')
        .setDescription('Your current Experience and Level'),
    async execute(interaction) {
        const experience = new Collection();
        const level = new Collection();
        const target = interaction.options.getUser('user') ?? interaction.user;
        return interaction.reply(`${target.tag} is level ${level.getBalance(target.id)} and has ${experience.getBalance(target.id)} experience.`);
    },
};
However, when I run node deploy-commands.js, it throws:
Error: Cannot find module './DiscordBot/dbObjects.js'
It does the same thing even if I remove /DiscordBot, or with any other path I've tried for the constant. I'm really uncertain what I should do to alleviate this issue.
My file structure, for reference, is:
DiscordBot/
    commands/
        xpcheck.js
    models/
        LevelUp.js
        UserItems.js
        Users.js
    dbInit.js
    dbObjects.js
    deploy-commands.js
    index.js
As was pointed out in the comments, the problem was simple, and the solution simpler.
Correcting
const { Users, LevelUp } = require('./dbObjects.js');
to
const { Users, LevelUp } = require('../dbObjects.js');
allows it to search the parent directory for the requisite file.
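This works because require() resolves relative paths against the file doing the requiring, not against the project root or the current working directory. From inside DiscordBot/commands/xpcheck.js:

// './dbObjects.js' looks in DiscordBot/commands/, where no such file exists
// '../dbObjects.js' goes up one level to DiscordBot/, where the file lives
const { Users, LevelUp } = require('../dbObjects.js');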
I am trying to implement a singleton pattern for the fastify instance. My code is as follows:
const { createFastifyServer: server } = require("../app");

const getFastifyInstance = (() => {
    let fastify;
    return {
        fastifyInstance: async () => {
            if (!fastify) {
                console.log("Called");
                fastify = server();
                await fastify.ready();
            }
            return fastify;
        }
    };
})();

const { fastifyInstance } = getFastifyInstance;
module.exports = fastifyInstance;
Now, wherever I import this in a different file, the console prints "Called" each time. Shouldn't that happen only once if the singleton pattern were correctly implemented? Any idea what I'm doing wrong?
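One thing worth checking (an assumption about the cause, since Node caches a module after its first require): fastifyInstance is async, so if several callers invoke it before the first call has finished awaiting fastify.ready(), each of them sees fastify as undefined and calls server() again. Caching the promise rather than the resolved instance avoids that race; a minimal sketch:

const { createFastifyServer: server } = require("../app");

let fastifyPromise; // cache the promise, not the instance

const fastifyInstance = () => {
    if (!fastifyPromise) {
        console.log("Called"); // now runs at most once per process
        fastifyPromise = (async () => {
            const fastify = server();
            await fastify.ready();
            return fastify;
        })();
    }
    return fastifyPromise;
};

module.exports = fastifyInstance;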
I have a logger that I initialize using a constructor in the index.js file. Now I need to pass the logger instance to other files, and I do it like this:
index.js
const books = require("./books");

const logger = initLogger({
    level: levels.error,
    label: "app",
    version: "0.0.1",
});

books(app, logger);

app.listen(port, () => logger.info(`listening on port ${port}`));
Inside books.js I use it as follows: I take the logger passed in from index.js, use it within books.js, and also pass it on to another file with the call isbn.get(books, logger).
Is it recommended to do it like this? Is there a cleaner way in Node?
books.js
const isbn = require("./isbn");

module.exports = async function (app, logger) {
    …
    try {
        let books = await getBooks();
        logger.info("get books process has started");
    } catch (err) {
        logger.error("Failed to fetch books", err);
        return;
    }
    …
    // this function is from the file "isbn" and I should pass the logger to it also
    try {
        let url = await isbn.get(books, logger);
    } catch (e) {
        res.send(e.message);
    }
}
Try creating a module specifically for your logger configuration; then you can import that into your modules, instead of relying on a side effect of your business module to create a logger.
This will help if you ever need or want to change your logger configuration: instead of following a chain of business methods, you can just update the log configuration in one place.
Example
logger.js
'use strict';

// Any setup you need can be done here.
// e.g. load log libraries, templates etc.

const log = function (level, message) {
    return console.log(level + ": " + message);
};

module.exports = log;
business-logic.js
'use strict';

var log = require('./logger');
var stuff = require('./stuff');

const do_stuff = function (thing) {
    // do stuff here
    log("INFO", "Did stuff");
};
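A nice consequence is that changing loggers later touches only logger.js. For instance, a hypothetical swap to winston (assuming the winston package is installed) can keep the same log(level, message) contract, so business-logic.js stays untouched:

'use strict';

const winston = require('winston');

const winstonLogger = winston.createLogger({
    transports: [new winston.transports.Console()]
});

// same signature as before, so callers don't change
const log = function (level, message) {
    return winstonLogger.log({ level: level.toLowerCase(), message: message });
};

module.exports = log;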
Passing the logger around as a parameter, as in your question, is a pretty clean way of doing it; however, it could become awkward when you need to share more variables or add more requires. So you could put all the variables in one state object and destructure only the variables you need in books.js:
index.js:
const state = {app, logger, some, other, variables};
require("./books")(state);
require("./another_file")(state);
books.js:
module.exports = async function ({app, logger}) {
};
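Each module then picks out only what it needs; for example, another_file.js from above might take just two of the shared variables:

// another_file.js
module.exports = async function ({ logger, some }) {
    logger.info(`another_file started with ${some}`);
};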
In a previous project I mocked the mysql library with Sinon. I did this like so:
X.js:
const con = mysql.createPool(config.mysql);
...
Some other place in the project:
const rows = await con.query(query, inserts);
...
X.test.js:
const sinon = require('sinon');
const mockMysql = sinon.mock(require('mysql'));
...
mockMysql.expects('createPool').returns({
    query: () => {
        // Handles the query...
    },
    ...
It worked perfectly.
In another project I am trying to mock pg, again with Sinon.
pool.js:
const { Pool } = require('pg');
const config = require('@blabla/config');

const pool = new Pool(config.get('database'));

module.exports = pool;
Some other place in the project:
const con = await pool.connect();
const result = await con.query(...
Y.test.js:
???
I can't understand how to mock connect().query(). None of the following approaches work:
1:
const { Pool } = require('pg');
const config = require('@blabla/config');
const mockPool = sinon.mock(new Pool(config.get('database')));
...
mockPool.expects('connect').returns({
    query: () => {
        console.log('query here');
    },
});
1 results in no error but the real db connection is used.
2:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');

const pool = new Pool(config.get('database'));
pool.expects('connect').returns({
    query: () => {
        console.log('query here');
    },
});
2 => TypeError: Pool is not a constructor
3:
const { Pool } = sinon.mock(require('pg'));
const config = require('@blabla/config');

const pool = sinon.createStubInstance(Pool);
pool.connect.returns({
    query: () => {
        console.log('query here');
    },
});
3 => TypeError: The constructor should be a function.
Can anybody point me in the right direction with how to mock my PostgreSQL connection?
Example: I have postgres.js like this.
const { Pool } = require('pg');

const handler = {
    count: async (pgQuery) => {
        try {
            const pool = new Pool();
            const res = await pool.query(pgQuery);
            return { count: parseInt(res.rows[0].counter, 10) };
        } catch (error) {
            // Log/Throw error here.
        }
        return false;
    }
}

module.exports = handler;
The spec test I created in postgres.spec.js is like this.
const { expect } = require('chai');
const sinon = require('sinon');
const pgPool = require('pg-pool');
const handler = require('./postgres.js');

describe('Postgres', function () {
    it('should have method count that bla bla', async function () {
        // Create stub pgPool query.
        const postgreeStubQuery = sinon.stub(pgPool.prototype, 'query');
        postgreeStubQuery.onFirstCall().throws('XXX');
        postgreeStubQuery.onSecondCall().resolves({
            rows: [{ counter: 11 }],
        });

        // Catch case.
        const catcher = await handler.count('SELECT COUNT()..');
        expect(catcher).to.equal(false);
        expect(postgreeStubQuery.calledOnce).to.equal(true);

        // Correct case.
        const correct = await handler.count('SELECT COUNT()..');
        expect(correct).to.deep.equal({ count: 11 });
        expect(postgreeStubQuery.calledTwice).to.equal(true);

        // Restore stub.
        postgreeStubQuery.restore();
    });
});
To stub pool.query(), you need to stub the query method on the pg-pool prototype: pg's Pool class is implemented by the pg-pool package, so stubbing pg-pool.prototype.query also intercepts queries made through pools created via require('pg').
Hope this helps.
Since you need to mock the returned results of a query, I think the easiest solution would be to abstract your database away from the code that needs the query results. Say your query results return information about a person: create a person.js module with specific methods for interacting with the database.
The other code that needs the person information won't know or care what type of database you use or how you connect to it; all it cares about is the methods person.js exposes when required.
//person.js
const { Pool } = require('pg')
// do other database connection things here
const pool = new Pool()

const getPersonById = function (id) {
    // use your query here and return the results
    return pool.query('SELECT * FROM person WHERE id = $1', [id])
        .then(result => result.rows[0])
}

module.exports = { getPersonById }
Now in your tests, you mock the person module, not the pg module. Imagine if you had twenty-odd tests that all set up the mock MySQL pool and then you changed to pg: you'd have to change all of them, a nightmare. By abstracting your database connection type/setup, testing gets much easier, because now you just need to stub/mock your person.js module.
const person = require('../person.js') //or whatever relative file path it's in
const sinon = require('sinon')
const { expect } = require('chai') // assuming chai for the assertions

describe('person.js', function () {
    it('is stubbed right now', function () {
        const personStub = sinon.stub(person)
        personStub.getPersonById.returns('yup')
        expect(personStub.getPersonById()).to.eq('yup')
    })
})
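The consuming code then stays oblivious to the database. A sketch of a hypothetical caller (an Express-style handler, purely illustrative):

// routes.js: depends on person.js, never on pg
const { getPersonById } = require('./person.js')

const personHandler = async function (req, res) {
    const person = await getPersonById(req.params.id)
    res.json(person)
}

module.exports = { personHandler }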
Below is a simpler approach that means the system-under-test doesn't need any special tricks.
It comprises two parts, though the first is "nice to have":
Use a DI framework to inject the pg.Pool. This is a better approach IMO anyway, and fits really well with testing.
In the beforeEach() of the tests, configure the DI framework to use a mock class with sinon.stub instances.
If you aren't using a DI framework, pass the mock as a Pool parameter... but DI is better ;)
The code below is TypeScript using tsyringe, but similar approaches will work fine with plain JavaScript etc.
Somewhere you'll have code that uses pg.Pool. A contrived example:
import { Pool } from 'pg'
...
function getPets(pool: Pool): Promise<Pet[]> {
    return pool.connect()
        .then(db => db.query(SQL_HERE)
            .then(result => {
                db.release()
                return result.rows // or result.rows.map(something) etc
            })
            .catch(error => {
                db.release()
                throw error
            })
        )
}
That works, and it's fine if you want to pass the Pool instance in. I'd prefer not to, so I use tsyringe like this:
import { container } from 'tsyringe'
...
function getPets(): Promise<Pet[]> {
    return container.resolve(Pool).connect()
        .then(...)
}
Exactly the same outcome, but getPets() is cleaner to call - it can be a pain to lug around a Pool instance.
The main part of the program would set up an instance in one of a few ways. Here's mine:
...
container.register(Pool, {
    useFactory: instanceCachingFactory(() => {
        return new Pool(/* any config here */)
    })
})
The beauty of this comes out in tests.
The code above (the "system under test") needs a Pool instance, and that instance needs a connect() method that resolves to a class with query() and release() methods.
This is what I used:
class MockPool {
    client = {
        query: sinon.stub(),
        release: sinon.stub()
    }

    connect () {
        return Promise.resolve(this.client)
    }
}
Here's the setup of a test using MockPool:
describe('proof', () => {
    let mockPool: MockPool

    beforeEach(() => {
        // Important! See:
        // https://github.com/microsoft/tsyringe#clearing-instances
        container.clearInstances()

        mockPool = new MockPool()
        container.registerInstance(Pool, mockPool as unknown as Pool)
    })
})
The cast through unknown to Pool is needed because I'm not implementing the whole Pool API, just what I need.
Here's what a test looks like:
it('mocks postgres', async () => {
    mockPool.client.query.resolves({
        rows: [
            { name: 'Woof', kind: 'Dog' },
            { name: 'Meow', kind: 'Cat' }
        ]
    })

    const r = await getPets()

    expect(r).to.deep.equal([
        { name: 'Woof', kind: 'Dog' },
        { name: 'Meow', kind: 'Cat' }
    ])
})
You can easily control what data the mock Postgres Pool returns, or throw errors, etc.
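The failure path is just as direct; a sketch using the same MockPool setup:

it('propagates query failures', async () => {
    mockPool.client.query.rejects(new Error('boom'))

    const err = await getPets().catch(e => e)

    expect(err).to.be.an('error')
    expect(err.message).to.equal('boom')
})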