I'm trying to create a singleton that holds a single AMQP connection; when the createChannel method is called, it must return a new channel from that same connection:
export interface IBroker {
createChannel(): Promise<IChannel>;
}
export default class Broker implements IBroker {
private static instance: Broker;
private conn: IConnection | undefined;
private constructor(public config: IRabbitMQConfig = new RabbitMQConfig()) {}
/**
* singleton
*/
public static getInstance(): Broker {
if (!this.instance) {
this.instance = new Broker();
}
return this.instance;
}
/**
* initiates configuration on infra service
*/
async createChannel(): Promise<IChannel> {
try {
if (!this.conn) {
this.conn = await this.config.init();
await this.createExchanges();
await this.createQueues();
await this.createBinds();
logger.info('Broker started successfully');
}
if (!this.conn) {
throw new InternalError('Error starting broker. Missing connection!');
}
return await this.conn.createChannel();
} catch (err) {
logger.error('Error trying to start broker', err);
throw new InternalError('Error trying to start broker', 500);
}
}
// code...
The call config.init() returns the AMQP connection.
When I test the class like below, every time I call createChannel it creates a new connection!
const a = Broker.getInstance();
const b = Broker.getInstance();
console.log(a === b); // returns true
a.createChannel(); // creates a new connection
b.createChannel(); // creates another connection
this.conn of the Broker class is always undefined when createChannel is called!
I think the issue is that the two createChannel calls are not awaited, so the first one hasn't finished initializing the connection by the time the second runs, which leads to two connections being created.
If you want to make your createChannel "thread-safe" in terms of creating the connection, you could do something like this (untested):
interface IConnection {
  connect: () => void
}

const initConnection = (): Promise<IConnection> => {
  return Promise.resolve({
    connect: () => {}
  });
};

class Broker {
  private connection: IConnection | undefined;
  private pendingConnection: Promise<IConnection> | undefined;

  async createChannel(): Promise<IConnection> {
    // Connection already established: reuse it.
    if (this.connection) {
      return this.connection;
    }
    // A connection attempt is already in flight: return the same promise so
    // concurrent callers share it instead of opening a second connection.
    if (this.pendingConnection) {
      return this.pendingConnection;
    }
    this.pendingConnection = initConnection();
    const conn = await this.pendingConnection;
    // Do other setup stuff (exchanges, queues, binds, ...)
    this.connection = conn;
    this.pendingConnection = undefined;
    return conn;
  }
}
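For illustration, here is how the sketch behaves when createChannel is called twice back to back; both calls end up sharing a single connection (hypothetical usage of the sketch above):
(async () => {
  const broker = new Broker();
  // Both calls start before the connection resolves; the second call
  // receives the pending promise instead of opening a new connection.
  const [a, b] = await Promise.all([broker.createChannel(), broker.createChannel()]);
  console.log(a === b); // true
})();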
I have written a Kafka-node consumer. It works absolutely fine if there is a message available in the topic, but it blocks forever if no new message is available. I want to close the consumer and call a callback function if no message is received within a specified time frame of 10 seconds. Is there a way to handle this scenario in Node.js?
Here is a sample of the consumer I created, with the schema registry integrated:
You need to register the disconnected event:
this.consumer.on("disconnected", this.disconnected.bind(this));
private disconnected(arg) {
this.connect();
}
Code Snippet
const AVROSchemaRegistry = require("avro-schema-registry");
import { KafkaConsumer } from "node-rdkafka";
export class Consumer {
private consumer;
private brokerList;
private schemaRegistry;
private topics;
private groupId;
private consumerAmount = 15;
private consumerIntervalMs = 1000;
private callBack;
/**
* Gets the Consumer singleton instance. If it wasn't created, first creates the instance, then returns that instance.
* @param brokerList List of brokers in the Kafka cluster ("," separated)
* @param schemaRegistry Schema registry used to decode the message values
* @param consumerGroup Consumer group (its id and topics)
* @param callBack Callback function that will run when a new event is received
* @return Consumer
*/
constructor(brokerList, schemaRegistry, consumerGroup, callBack) {
this.brokerList = brokerList;
this.schemaRegistry = schemaRegistry;
this.topics = consumerGroup.topics;
this.groupId = consumerGroup.id;
this.callBack = callBack;
}
public async init(cb) {
if(!this.consumer) {
//KAFKA CONSUMER CONFIG
const config = {
"metadata.broker.list": this.brokerList,
...(this.groupId && { "group.id": this.groupId }),
};
this.consumer = new KafkaConsumer(config);
this.consumer.on("event.log", this.eventLog.bind(this));
this.consumer.on("event.error", this.eventError.bind(this));
this.consumer.on("ready", this.ready.bind(this));
this.consumer.on("data", this.data.bind(this));
this.consumer.on("disconnected", this.disconnected.bind(this));
this.connect(cb);
}
}
private connect(cb) {
if(this.consumer) {
// KAFKA CONSUMER CONNECTING...
this.consumer.connect(undefined, (err, data) => {
if(err) {
// KAFKA CONSUMER CONNECTION ERROR
} else {
// KAFKA CONSUMER CONNECTED
if(cb) {
cb(err, data);
}
}
});
}
}
private eventLog(log) {
// log("KAFKA CONSUMER EVENT LOG:", JSON.stringify(log));
}
private eventError(eventErr) {
// log("KAFKA CONSUMER EVENT ERROR:", JSON.stringify(eventErr));
}
private ready(arg) {
// log("KAFKA CONSUMER READY:", JSON.stringify(arg));
this.consumer.subscribe(this.topics);
setInterval(() => {
this.consumer.consume(this.consumerAmount);
}, this.consumerIntervalMs);
}
private async data(message) {
// KAFKA CONSUMER RECEIVED DATA message and DECODED MESSAGE VALUE
const messageValue = await AVROSchemaRegistry(this.schemaRegistry).decode(message.value);
if(this.callBack) {
this.callBack(null, {
value: JSON.parse(JSON.stringify(messageValue)),
key: message.key ? message.key.toString() : null
});
}
}
private disconnected(arg) {
// KAFKA CONSUMER DISCONNECTED RECONNECT AGAIN
this.connect();
}
}
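As for the original question of closing the consumer when no message arrives within 10 seconds: one option is an idle timer that is armed when the consumer becomes ready and reset on every received message. This is an untested sketch against the Consumer class above; the idleTimer/idleTimeoutMs fields and the stopOnIdle helper are assumptions, not part of the original code, and node-rdkafka's disconnect() is used to stop the consumer.
// Additions/adjustments to the Consumer class above (sketch):
private idleTimer;
private idleTimeoutMs = 10000;

private resetIdleTimer() {
  clearTimeout(this.idleTimer);
  this.idleTimer = setTimeout(() => this.stopOnIdle(), this.idleTimeoutMs);
}

private stopOnIdle() {
  // Note: the "disconnected" handler above reconnects; remove or guard it
  // if the consumer should stay stopped after an idle timeout.
  this.consumer.disconnect();
  if (this.callBack) {
    this.callBack(new Error("No message received within 10 seconds"), null);
  }
}

private ready(arg) {
  this.consumer.subscribe(this.topics);
  this.resetIdleTimer(); // arm the timer even if no message ever arrives
  setInterval(() => this.consumer.consume(this.consumerAmount), this.consumerIntervalMs);
}

private async data(message) {
  this.resetIdleTimer(); // a message arrived: push the deadline back
  // ...decode and forward the message exactly as in the class above
}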
I'm trying to run some code which creates tracing spans for the various phases in the lifecycle of an http request (socket, dns lookup, connect or secureConnect, ttfb, end). As of now it looks more or less like this:
function tracedRequest(
options: HttpRequestOptions | HttpsRequestOptions,
callback: ResponseCallback
): ClientRequest {
const isHttps = options.protocol === 'https' || options.agent instanceof HttpsAgent;
const transport = isHttps ? https.request : http.request;
const requestSpan = tracer.createChildSpan({ name: 'request' });
if (!tracer.isRealSpan(requestSpan)) {
return transport.call(null, options, callback);
}
let socketSpan: ISpan | undefined;
let dnsSpan: ISpan | undefined;
let tcpSpan: ISpan | undefined;
let tlsSpan: ISpan | undefined;
let ttfbSpan: ISpan | undefined;
const onLookup = () => {
dnsSpan?.endSpan();
tcpSpan = tracer.createChildSpan({ name: 'http_tcp_handshake' });
};
const onConnect = () => {
tcpSpan?.endSpan();
if (isHttps) {
tlsSpan = tracer.createChildSpan({ name: 'http_tls_handshake' });
} else {
ttfbSpan = tracer.createChildSpan({ name: 'http_ttfb' });
}
};
const onSecureConnect = () => {
tlsSpan?.endSpan();
// just in case secureConnect is emitted not only for https transports
if (isHttps) {
ttfbSpan = tracer.createChildSpan({ name: 'http_ttfb' });
}
};
const onResponse = (response: IncomingMessage) => {
ttfbSpan?.endSpan();
response.prependOnceListener('end', () => {
requestSpan.endSpan();
});
}
const onSocket = (socket: Socket | TLSSocket) => {
socketSpan?.endSpan();
// start the DNS lookup span here; it is ended in onLookup
// (span name assumed, following the naming pattern of the other spans)
dnsSpan = tracer.createChildSpan({ name: 'http_dns_lookup' });
socket.prependOnceListener('lookup', onLookup);
deferToConnect(socket, {
connect: onConnect,
secureConnect: onSecureConnect
});
};
socketSpan = tracer.createChildSpan({ name: 'http_establish_socket' });
const request: ClientRequest = transport.call(null, options, callback);
if (request.socket) {
onSocket(request.socket as any);
} else {
request.prependOnceListener('socket', onSocket);
}
request.prependOnceListener('response', onResponse);
return request;
}
The problem with this approach arises when you use an agent with keepalive enabled. In this situation the socket may be reused, so it has already established a connection to the remote host and neither the socket nor the lookup events will be emitted. (Note that the socket event case is handled: we know the socket event will not be emitted if the request.socket property is already set.)
How can I do the same thing for the lookup event? Which property of the socket object can I check to be sure that the host has already been resolved and the lookup event will not be emitted? Should I use the localAddress/localPort and remoteAddress/remotePort properties, or the socket.address() method?
So, I did some testing and apparently you can do:
if (Object.keys(socket.address()).length) {
onLookup();
} else {
socket.prependOnceListener('lookup', onLookup);
}
socket.address() returns an empty object if an address has not been resolved yet; otherwise it returns an object with the properties address, port and family.
So far it has worked for me.
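Plugging that check into the onSocket handler from the question would look roughly like this (untested; it assumes the same span variables and deferToConnect helper as above, and the http_dns_lookup span name just follows the naming pattern of the other spans):
const onSocket = (socket: Socket | TLSSocket) => {
  socketSpan?.endSpan();
  if (Object.keys(socket.address()).length) {
    // Reused keep-alive socket: the address is already resolved,
    // so 'lookup' will never fire; move straight to the next phase.
    onLookup();
  } else {
    dnsSpan = tracer.createChildSpan({ name: 'http_dns_lookup' });
    socket.prependOnceListener('lookup', onLookup);
  }
  deferToConnect(socket, {
    connect: onConnect,
    secureConnect: onSecureConnect
  });
};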
I built a TypeScript MongoDB client wrapper. For some reason, when I call the function that gets the connection, its callback is called twice.
There are two calls in total to the get() function: one before the export, as you can see, and another from a Mocha test.
I am pretty new to TS and JS in general, but this seems a bit off.
import {Db, MongoClient} from "mongodb";
import {MongoConfig} from '../config/config'
class DbClient {
private cachedDb : Db = null;
private async connectToDatabase() {
console.log('=> connect to database');
let connectionString : string = "mongodb://" + MongoConfig.host + ":" + MongoConfig.port;
return MongoClient.connect(connectionString)
.then(db => {
console.log('=> connected to database');
this.cachedDb = db.db(MongoConfig.database);
return this.cachedDb;
});
}
public async get() {
if (this.cachedDb) {
console.log('=> using cached database instance');
return Promise.resolve(this.cachedDb);
}else{
return this.connectToDatabase();
}
}
}
let client = new DbClient();
client.get();
export = client;
where the console output is:
=> connect to database
=> connected to database
=> connected to database
Any particular reason this is misbehaving?
There are 2 calls in total to the get() function, 1 before the export as you can see and another from a mocha test.
I suspect the output has an additional => connect to database. As I said in the comments: There's a "race condition" where get() could be called multiple times before this.cachedDb is set which would lead to multiple connections/instances of Db being created.
For example:
const a = client.get();
const b = client.get();
// then
a.then(resultA => {
b.then(resultB => {
console.log(resultA !== resultB); // true
});
});
Solution
The problem can be fixed by storing the promise as the cached value. (Also, as Randy pointed out, there's no need for the async keyword on the methods: no values are awaited in any of them, so you can just return the promises.)
import {Db, MongoClient} from "mongodb";
import {MongoConfig} from '../config/config'
class DbClient {
private cachedGet: Promise<Db> | undefined;
private connectToDatabase() {
console.log('=> connect to database');
const connectionString = `mongodb://${MongoConfig.host}:${MongoConfig.port}`;
// connect resolves to a MongoClient; map it to the Db declared in cachedGet
return MongoClient.connect(connectionString)
.then(client => client.db(MongoConfig.database));
}
get() {
if (!this.cachedGet) {
this.cachedGet = this.connectToDatabase();
// clear the cached promise on failure so that if a caller
// calls this again, it will try to reconnect
this.cachedGet.catch(() => {
this.cachedGet = undefined;
});
}
return this.cachedGet;
}
}
let client = new DbClient();
client.get();
export = client;
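For illustration, repeating the earlier experiment against the fixed class now resolves both calls with the same instance, and => connect to database is logged only once (hypothetical usage of the code above):
const a = client.get();
const b = client.get();
a.then(resultA => {
  b.then(resultB => {
    // Both promises resolve to the same Db: the second call returned
    // the cached promise instead of opening a new connection.
    console.log(resultA === resultB); // true
  });
});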
Note: I'm not sure about the best way of using MongoDB (I've never used it), but I suspect connections should not be so long lived as to be cached like this (or should probably only be cached for a short time and then disconnected). You'll need to investigate that though.
I'm trying to extend the Kuzzle JavaScript SDK in order to call some controllers on Kuzzle servers, implemented via plugins.
I'm following this guide: add controller
Here is my controller, which extends BaseController:
const { BaseController } = require('kuzzle-sdk');
export class UserController extends BaseController {
constructor (kuzzle) {
super(kuzzle, 'plugins-user/userController');
}
/**
* Method to call the action "CreateAccount" on the UserController
* @param {*} user
*/
async createAccount(user) {
const apiRequest = {
action: 'new',
body: {
user
}
};
try {
const response = await this.query(apiRequest);
return response.result.user;
}
catch (error) {
//Manage errors
}
}
}
And here is where I register the controller, when the singleton is created, so that it can be used further in the app.
const {UserController} = require('./UserController');
const { Kuzzle, WebSocket } = require('kuzzle-sdk');
class KuzzleService {
static instance = null;
static async createInstance() {
var object = new KuzzleService();
object.kuzzle = new Kuzzle(
new WebSocket('localhost'),{defaultIndex: 'index'}
);
object.kuzzle.useController(UserController, 'user');
await object.kuzzle.connect();
const credentials = { username: 'admin', password: 'pass' };
const jwt = await object.kuzzle.auth.login('local', credentials);
return object;
}
static async getInstance () {
if (!KuzzleService.instance) {
KuzzleService.instance = await KuzzleService.createInstance();
}
return KuzzleService.instance;
}
}
export default KuzzleService;
Somehow I'm getting the following error:
Controllers must inherit from the base controller
Is there something wrong with the imports?
I've found the solution to this issue. Firstly, I was not on the right version of the Kuzzle SDK, released recently (6.1.1), and secondly, the controller class must be exported as the default:
const { BaseController } = require('kuzzle-sdk');
export default class UserController extends BaseController {
constructor (kuzzle) {
super(kuzzle, 'plugins-user/userController');
}
/**
* Method to call the action "CreateAccount" on the UserController
* @param {*} user
*/
async createAccount(user) {
const apiRequest = {
action: 'new',
body: {
user
}
};
try {
const response = await this.query(apiRequest);
return response.result.user;
}
catch (error) {
//Manage errors
}
}
}
The UserController then needs to be imported this way:
import UserController from './UserController.js'
Then, as specified in the documentation, we just need to inject the kuzzle object into the controller this way:
kuzzle.useController(UserController, 'user');
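Once registered, the custom controller should be reachable through the accessor name passed to useController ('user' here). A hypothetical usage through the singleton above (the user payload is made up for illustration):
const service = await KuzzleService.getInstance();
// 'user' is the accessor name given to useController above
const newUser = await service.kuzzle.user.createAccount({
  username: 'jdoe',
  email: 'jdoe@example.com'
});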
Assume the following class in TypeScript:
class MongoDbContext implements IMongoDbContext {
private connectionString : string;
private databaseName : string;
private database : Db;
public constructor (connectionString : string, databaseName : string) {
this.connectionString = connectionString;
this.databaseName = databaseName;
}
public async initializeAsync () : Promise<MongoDbContext> {
// Create a client that represents a connection with the 'MongoDB' server and get a reference to the database.
var client = await MongoClient.connect(this.connectionString, { useNewUrlParser: true });
this.database = await client.db(this.databaseName);
return this;
}
}
Now, I want to test that an exception is thrown when trying to connect to a non-existent MongoDB server. This is done with the following integration test:
it('Throws when a connection to the database server could not be made.', async () => {
// Arrange.
var exceptionThrowed : boolean = false;
var mongoDbContext = new MongoDbContext('mongodb://127.0.0.1:20000/', 'databaseName');
// Act.
try { await mongoDbContext.initializeAsync(); }
catch (error) { exceptionThrowed = true; }
finally {
// Assert.
expect(exceptionThrowed).to.be.true;
}
}).timeout(5000);
When I run this unit test, my CMD window doesn't print a summary.
It seems that it's hanging somewhere.
What am I doing wrong in this case?
Kind regards,
I've managed to find the issue.
It seems that I must close my 'MongoClient' connection for Mocha to quit correctly.
So, I've added an extra method:
public async closeAsync () : Promise<void> {
await this.client.close();
}
This method is called after each test.
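Note that this relies on keeping the MongoClient returned by connect on the instance; the class shown earlier only stores the Db. A minimal sketch of the adjusted class and the Mocha hook (the client field and the afterEach hook are assumptions, not shown in the original):
class MongoDbContext implements IMongoDbContext {
  private connectionString: string;
  private databaseName: string;
  private client: MongoClient;   // kept so the connection can be closed later
  private database: Db;

  public constructor(connectionString: string, databaseName: string) {
    this.connectionString = connectionString;
    this.databaseName = databaseName;
  }

  public async initializeAsync(): Promise<MongoDbContext> {
    this.client = await MongoClient.connect(this.connectionString, { useNewUrlParser: true });
    this.database = this.client.db(this.databaseName);
    return this;
  }

  public async closeAsync(): Promise<void> {
    // Guard against a connect that never succeeded (e.g. the failing-server test above).
    if (this.client) {
      await this.client.close();
    }
  }
}

// In the test file (assuming mongoDbContext is declared in the describe scope),
// close the connection after each test so Mocha can exit:
afterEach(async () => {
  await mongoDbContext.closeAsync();
});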