How to close a MongoDB connection in Node.js

I have the following method to connect to MongoDB:
import { Db, MongoClient } from 'mongodb';

let cachedConnection: { client: MongoClient; db: Db } | null = null;

export async function connectToDatabase(mongoUri?: string, database?: string) {
  if (!mongoUri) {
    throw new Error(
      'Please define the MONGO_URI environment variable inside .env.local'
    );
  }
  if (!database) {
    throw new Error(
      'Please define the DATABASE environment variable inside .env.local'
    );
  }
  if (cachedConnection) return cachedConnection;
  cachedConnection = await MongoClient.connect(mongoUri, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }).then((client) => ({
    client,
    db: client.db(database),
  }));
  return cachedConnection!;
}
And I use it in the following way:
const { db, client } = await connectToDatabase(
  config.URI,
  config.USERS_DATABASE
);

const user = await db
  .collection(config.USERS_COLLECTION)
  .findOne({ _id: new ObjectId(userId) });
It seems to be OK, but it is not: the problem with this method is that it never closes its connections. For example, I have a cluster on Atlas, and the number of connections keeps growing until it reaches 500; after that the cluster stops serving, requests time out, and my backend crashes.
To solve this I tried calling client.close() just before returning the response to the frontend.
It throws an error saying MongoError: Topology is closed, please connect. I believe it closes the connection before the query finishes? Is that right? The error appears even when I call it after the DB has responded.
Do you think there is a way to solve this, or do I just have to repeat the whole connect/close procedure in every file where I need Mongo? And do you think I did something wrong?
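For what it's worth, the usual pattern with a cached client is the opposite of closing per request: the cached MongoClient is shared by every request and keeps its own connection pool, so calling client.close() in one handler tears down the topology for all of them, which is exactly the "Topology is closed" error. Here is a minimal sketch of that approach, reusing the connectToDatabase helper above (the MONGO_URI/DATABASE environment variables and the 'users' collection name are placeholders of mine, not from the question):

// sketch.js - reuse the cached client, close it only on shutdown
import { ObjectId } from 'mongodb';
import { connectToDatabase } from './connectToDatabase';

export async function findUser(userId) {
  // Reuse the cached client; the driver pools connections internally,
  // so one MongoClient per process is normally enough.
  const { db } = await connectToDatabase(process.env.MONGO_URI, process.env.DATABASE);
  return db.collection('users').findOne({ _id: new ObjectId(userId) });
  // no client.close() here - that would invalidate the shared topology
}

// Close the shared client once, when the process is told to stop.
process.on('SIGTERM', async () => {
  const { client } = await connectToDatabase(process.env.MONGO_URI, process.env.DATABASE);
  await client.close();
  process.exit(0);
});

If the Atlas connection count still climbs with this in place, another common cause is that new MongoClient instances keep being created (for example one per serverless instance or per hot reload) rather than open clients never being closed.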

Related

I can't fix this bug in my Express backend

I have this endpoint that creates a new server in my database. When I say server, it's just a server name: we have several gaming servers and want to manage them from a website.
const createServer = asyncHandler(async (req, res) => {
  const { gameName, serverName } = req.body;
  const serverExist = await Server.findOne({ serverName });

  if (!gameName || !serverName) {
    res.status(400);
    throw new Error("Please add game name / server name");
  }

  if (!serverExist?.isHidden) {
    // if i set it to just serverExists it works
    res.status(401);
    throw new Error("Server exist");
  }

  //Get user using the id in the JWT
  const user = await User.findById(req.user.id);
  if (!user) {
    res.status(401);
    throw new Error("User not found");
  }

  const server = {
    gameName,
    serverName,
    user: req.user.id,
  };

  await Server.create(server);
  res.status(201).json(server);
});
Now, I'm checking whether the server already exists in the database with serverExist, and it returns the whole document for that server; all of that works fine. The document has a property called isHidden, but as soon as I use it in the if statement, the endpoint no longer returns a response. If I change the condition to just if (serverExist) it does work, but I need creation to succeed when the server was set to hidden, so that a user can re-create it. I don't want to delete the document, as it will be needed later.
I checked all the returns and everything seems OK; even the server object at the end has the right information. The problem happens when I call the create method. I don't know why adding serverExist.isHidden makes it unable to create the document!
You've got a little logic problem.
Consider these states when evaluating !serverExist?.isHidden:

State                              | !serverExist?.isHidden
No record exists                   | true ⚠️
Record exists with isHidden: true  | false
Record exists with isHidden: false | true

Note the first row: when no record exists at all, serverExist?.isHidden is undefined, so the condition is true and the handler throws "Server exist" even though nothing is there.
I would instead include the isHidden condition in the query itself and use the Server.exists() method. Note isHidden: false here, so only a visible server blocks creation (use isHidden: { $ne: true } instead if older documents might not have the field):

const serverExists = await Server.exists({ serverName, isHidden: false });

if (serverExists) {
  // Server exists and is not hidden
  res.status(409); // 409 Conflict is a better status than 401 Unauthorized
  throw new Error("Server exists");
}
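One follow-up worth noting (my addition, not part of the original answer): if a hidden document with the same serverName can still be sitting in the collection, Server.create() would insert a second document for that name. A sketch of handling the "re-create a hidden server" path by un-hiding the existing record instead:

// Sketch: re-create by un-hiding an existing hidden document (or inserting a new one)
const server = await Server.findOneAndUpdate(
  { serverName },
  { gameName, serverName, user: req.user.id, isHidden: false },
  { new: true, upsert: true } // return the updated doc; create it if none exists
);

res.status(201).json(server);

This keeps one document per server name, so the existence check above stays meaningful after a server has been hidden and re-created.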

How to get a variable from front to a service worker?

Some context
I've created a service worker to send notifications to registered users.
It worked well until I tried to implement a sort of id for each person who registers with the service worker (to receive notifications).
I do that because I have to delete old registrations from my database, so I chose to allow each user three registrations (one for a mobile device and two for different browsers on a computer); if there are more, I want to remove the oldest one from the database.
Tools
I'm using Node.js, Express, and MySQL for the database.
The issue
When I launch a subscription I get this error:
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
I saw in another post that this happens when you try to JSON.parse something that is already an object.
But in my case I can't find where I parse anything; these are the parts concerned:
// service.js (service worker file)
// saveSubscription saves the subscription to the backend
const saveSubscription = async (subscription, usrCode) => {
  const SERVER_URL = 'https://mywebsite:4000/save-subscription'
  subscription = JSON.stringify(subscription);
  console.log(subscription); // I got here what I expect
  console.log(usrCode); // <-------------------------------- HERE I GOT UNDEFINED
  const response = await fetch(SERVER_URL, {
    method: 'post',
    headers: {
      'Content-Type' : 'application/json',
    },
    body : {
      subscription: subscription,
      usrCode: usrCode
    }
  })
  return response
}
But when I console.log(usrCode) in my inspector, I get the right value.
So how do I get that value into service.js?
Maybe the problem comes from:
const bodyParser = require('body-parser')
app.use(bodyParser.json())
At the beginning I thought the issue was on the back end (because I'm not really good with async functions).
Here is the back end, in case I got something wrong there.
// index.js (backend)
// Insert into database
const saveToDatabase = async (subscription, usrCode) => {
  // make the connection to the database.
  pool.getConnection(function (err, connection) {
    if (err) throw err; // not connected!
    console.log(usrCode);
    console.log(subscription);
    connection.query(`INSERT INTO webpushsub (webpushsub_info, webpushsub_code) VALUES ('${subscription}', '${usrCode}')`, function (err, result, fields) {
      // if any error while executing above query, throw error
      if (err) throw err;
      // if there is no error, you have the result
      console.log(result);
      connection.release();
    });
  });
}

// The new /save-subscription endpoint
app.post('/save-subscription', async (req, res) => {
  const usrCode = req.body.usrCode; // <------------------ I'm not sure about this part
  const subscription = req.body.subscription
  await saveToDatabase(JSON.stringify(subscription, usrCode)) // Method to save the subscription to Database
  res.json({ message: 'success' })
})
By searching on Google, I found this tutorial. The reason usrCode is undefined is that the service worker doesn't have access to data stored on the front end.
First you have to pass it in the URL, as follows:
// swinstaller.js (front)
// SERVICE WORKER INITIALIZATION
const registerServiceWorker = async (usrCode) => {
  const swRegistration = await navigator.serviceWorker.register('service.js?config=' + usrCode); // notice the file name
  return swRegistration;
}
And then get it in the service worker:
// service.js (service worker file)
// get the usrCode
const usrCode = new URL(location).searchParams.get('config');
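As for the SyntaxError itself, it most likely comes from the fetch call rather than the back end (my reading, not part of the quoted tutorial): fetch does not serialize plain objects, so a body of { subscription, usrCode } is sent as the literal string "[object Object]", and bodyParser.json() then fails with "Unexpected token o in JSON at position 1". A sketch of the corrected call, assuming usrCode is read from the URL as shown above:

// service.js - sketch: stringify the whole body once instead of nesting a pre-stringified subscription
const saveSubscription = async (subscription, usrCode) => {
  const SERVER_URL = 'https://mywebsite:4000/save-subscription';
  const response = await fetch(SERVER_URL, {
    method: 'post',
    headers: {
      'Content-Type': 'application/json',
    },
    // fetch will not JSON-encode an object for you
    body: JSON.stringify({ subscription, usrCode }),
  });
  return response;
};

On the back end, note that JSON.stringify(subscription, usrCode) passes usrCode as the replacer argument and drops it; it would need to be saveToDatabase(JSON.stringify(subscription), usrCode) for both values to reach the query.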

Nodejs + SocketIO + MySql Connections Not Closing Properly and Creating Database Overhead

I've been having this issue for a couple of months now and still can't figure out how to fix it. I'm seeing a high number of connections to our database, and I assume it's because our connections aren't closing properly, which leaves them hanging for long periods of time. In turn this causes a lot of overhead and occasionally crashes our web application. The application currently uses the promise-mysql npm package to create a connection and query the database, and our web application uses socket.io to request those queries against our MySQL database.
I'm working with existing code that was here before me, so I did not set it up this way. That makes it a bit harder for me to debug this issue because I'm not that familiar with how the connections get closed after a successful / unsuccessful query.
When logging errors from our server I'm getting messages like this:
db error { Error: Connection lost: The server closed the connection.
at Protocol.end (/home/ec2-user/myapp/node_modules/mysql/lib/protocol/Protocol.js:113:13)
at Socket.<anonymous> (/home/ec2-user/myapp/node_modules/mysql/lib/Connection.js:109:28)
at Socket.emit (events.js:185:15)
at Socket.emit (domain.js:422:20)
at endReadableNT (_stream_readable.js:1106:12)
at process._tickCallback (internal/process/next_tick.js:178:19) fatal: true, code: 'PROTOCOL_CONNECTION_LOST' }
(Not sure if that has anything to do with the high number of connections I'm seeing or not)
I recently changed wait_timeout and interactive_timeout to 5000 in MySQL, which is way lower than the default of 28800, but setting them that low stopped the application from crashing so often.
This is the code for creating the database connection:
database.js file
import mysql from 'promise-mysql';
import env from '../../../env.config.json';

const db = async (sql, descriptor, serializedParameters = []) => {
  return new Promise(async (resolve, reject) => {
    try {
      const connection = await mysql.createConnection({
      //const connection = mysql.createPool({
        host: env.DB.HOST,
        user: env.DB.USER,
        password: env.DB.PASSWORD,
        database: env.DB.NAME,
        port: env.DB.PORT
      })
      if (connection && env.ENV === "development") {
        //console.log(/*"There is a connection to the db for: ", descriptor*/);
      }
      let result;
      if (serializedParameters.length > 0) {
        result = await connection.query(sql, serializedParameters)
      } else result = await connection.query(sql);
      connection.end();
      resolve(result);
    } catch (e) {
      console.log("ERROR pool.db: " + e);
      reject(e);
    };
  });
}

export default db;
And this is an example of what the sockets look like:
sockets.js file
socket.on('updateTimeEntry', async (time, notes, TimeEntryID, callback) => {
  try {
    const results = await updateTimeEntry(time, notes, TimeEntryID);
    callback(true);
    //socket.emit("refreshJobPage", false, "");
  }
  catch (error) {
    callback(false);
  }
});

socket.on('selectDatesFromTimeEntry', (afterDate, beforeDate, callback) => {
  const results = selectDatesFromTimeEntry(afterDate, beforeDate).then((results) => {
    //console.log('selectLastTimeEntry: ', results);
    callback(results);
  })
});
And this is an example of the methods that get called from the sockets to make a connection to the database
timeEntry.js file
import db from './database';

export const updateTimeEntry = (time, notes, TimeEntryID) => {
  return new Promise(async (resolve, reject) => {
    try {
      const updateTimeEntry = `UPDATE mytable SET PunchOut = NOW(), WorkTimeTotal = '${time}', Notes = "${notes}" WHERE TimeEntryID = '${TimeEntryID}';`
      const response = await db(updateTimeEntry, "updateTimeEntry");
      resolve(response[0]);
    } catch (e) {
      console.log("ERROR TimeEntry.updateTimeEntry: " + e);
      reject(e);
    }
  });
};

//Gets a List for Assigned Jobs
export const selectDatesFromTimeEntry = (afterDate, beforeDate) => {
  return new Promise(async (resolve, reject) => {
    try {
      const selectDatesFromTimeEntry = `SELECT * FROM mytable.TimeEntry WHERE PunchIn >= '${afterDate}' && PunchIn < '${beforeDate}';`
      //console.log("Call: " + selectDatesFromTimeEntry);
      const response = await db(selectDatesFromTimeEntry, "selectDatesFromTimeEntry");
      //console.log("Response: " + response);
      resolve(response);
    } catch (e) {
      console.log("ERROR TimeEntry.selectDatesFromTimeEntry: " + e);
      reject(e);
    }
  });
};
I just really want to figure out why I'm noticing so much overhead with my database connections, and what I can do to resolve it. I really don't want to have to keep restarting my server each time it crashes, so hopefully I can find some answers to this. If anyone has any suggestions or knows what I can change in my code to solve this issue that would help me out a lot, thanks!
EDIT 1
These are the errors I'm getting from mysql
2020-04-30T11:12:40.214381Z 766844 [Note] Aborted connection 766844 to db: 'mydb' user: 'xxx' host: 'XXXXXX' (Got timeout reading communication packets)
2020-04-30T11:12:48.155598Z 766845 [Note] Aborted connection 766845 to db: 'mydb' user: 'xxx' host: 'XXXXXX' (Got timeout reading communication packets)
2020-04-30T11:15:53.167160Z 766848 [Note] Aborted connection 766848 to db: 'mydb' user: 'xxx' host: 'XXXXXX' (Got timeout reading communication packets)
EDIT 2
Is there a way I can see why some of these connections would be hanging or going idle?
EDIT 3
I've been looking into using a pool instead, as it seems that it is a more scalable and appropriate solution for my application. How can I achieve this with the existing code that I have?
You are opening a new connection for each and every query. Opening a connection is slow and carries a lot of overhead, and your server certainly does not allow an unlimited number of connections. The Node.js mysql package provides a pooling mechanism, which would be a lot more efficient for you.
The goal is to reuse connections as much as possible instead of disposing of them right after a single query.
In your database.js, create a pool on startup and use it:
var pool = mysql.createPool({
  connectionLimit: 10, // Number of connections to create.
  host: env.DB.HOST,
  user: env.DB.USER,
  password: env.DB.PASSWORD,
  database: env.DB.NAME,
  port: env.DB.PORT
});
To execute your query, you would simply do this:
await pool;
return pool.query(sql, serializedParameters);
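To tie that back to the code in the question, here is a minimal sketch of what database.js could look like with a pool while keeping the existing db(sql, descriptor, serializedParameters) signature, so timeEntry.js does not have to change. One assumption: depending on the promise-mysql version, createPool returns either the pool itself or a promise for it, so the sketch awaits it defensively.

// database.js - sketch: one pool per process, reused by every query
import mysql from 'promise-mysql';
import env from '../../../env.config.json';

// Created once at startup; awaiting handles both a pool and a promise of a pool.
const poolPromise = mysql.createPool({
  connectionLimit: 10,
  host: env.DB.HOST,
  user: env.DB.USER,
  password: env.DB.PASSWORD,
  database: env.DB.NAME,
  port: env.DB.PORT
});

// descriptor is kept only so the existing call sites keep working
const db = async (sql, descriptor, serializedParameters = []) => {
  const pool = await poolPromise;
  // The pool checks a connection out for the query and returns it afterwards,
  // so there is nothing to open or close per call.
  return serializedParameters.length > 0
    ? pool.query(sql, serializedParameters)
    : pool.query(sql);
};

export default db;

Since the helper already accepts serializedParameters, the callers in timeEntry.js could also switch to placeholders, e.g. db('UPDATE mytable SET Notes = ? WHERE TimeEntryID = ?', 'updateTimeEntry', [notes, TimeEntryID]), which avoids interpolating user input into the SQL string.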

How to make sure AMQP message is not lost in case of error in subscriber with rhea?

So I have designed a basic publisher-subscriber setup using rhea in JS: an API request saves data in a DB and then publishes it to a queue.
From there a subscriber (code added below) picks it up and tries to save it in a DB. Now, this DB instance goes through a lot of changes during development, and inserts can fail with errors.
So when the subscriber pushes to the DB and the insert errors out, the data is lost, since the message was already dequeued. I'm a total novice in JS, so is there a way to make sure a message isn't dequeued unless we are sure it was saved properly, without having to publish it again on error?
The code for my subscriber:
const Receiver = require("rhea");

const config = {
  PORT: 5672,
  host: "localhost"
};

let receiveClient;

function connectReceiver() {
  const receiverConnection = Receiver.connect(config);
  const receiver = receiverConnection.open_receiver("send_message");

  receiver.on("connection_open", function () {
    console.log("Subscriber connected through AMQP");
  });
  receiver.on("error", function (err) {
    console.log("Error with Subscriber:", err);
  });
  receiver.on("message", function (element) {
    if (element.message.body === 'detach') {
      element.receiver.detach();
    }
    else if (element.message.body === 'close') {
      element.receiver.close();
    }
    else {
      //save in DB
    }
  });

  receiveClient = receiver;
  return receiveClient;
}
You can use code like this to explicitly accept the message or release it back to the sender:
try {
save_in_db(event.message);
event.delivery.accept();
} catch {
event.delivery.release();
}
See the delivery docs for more info.
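One detail to double-check against the rhea docs (this is my addition, not from the answer): rhea auto-accepts messages by default, so for accept/release to actually control settlement the receiver should be opened with autoaccept disabled. A sketch, with saveInDb standing in for whatever the DB insert looks like:

// Sketch: disable auto-accept so the message stays unsettled until the insert succeeds
const receiver = receiverConnection.open_receiver({
  source: "send_message",
  autoaccept: false
});

receiver.on("message", async function (context) {
  try {
    await saveInDb(context.message.body); // placeholder for the real insert
    context.delivery.accept();            // settled: the broker can forget it
  } catch (err) {
    console.log("Insert failed, releasing message:", err);
    context.delivery.release();           // returned to the queue for redelivery
  }
});

Released messages are redelivered, so the insert should be written to tolerate processing the same message more than once.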

Accessing Mongo DB from within Node

I’m trying to connect to a database through node. I’ve got it working with smaller databases using a Mongo URL of the form:
mongodb://[username]:[password]@db1-a0.example.net:27017/[DB-Name]
When I switched it out to use a larger DB, using the Mongo URL of the form:
mongodb://[username]:[password]@db1-a1.example.net:27017,db2.example.net:2500/[DB-Name]?replicaSet=test
It throws a ‘RangeError: Maximum call stack size exceeded’ error and won’t connect. This URL is the only thing that has changed between the databases.
I’ve checked the DB details and can access it through RoboMongo / Robo 3T, so the database definitely exists.
Trying to connect through Mongoose version ^5.2.10 using the following code:
function connect() {
  if (MONGO_URL) {
    mongoose.connect(MONGO_URL, err => {
      if (err) {
        console.log('error connecting')
        console.log(err)
      }
    })
  } else {
    mongoose.connect(`mongodb://${host}`, {
      user,
      pass,
      dbName,
      useNewUrlParser: true // deprecation issue
    }, err => {
      if (err) {
        console.log('error connecting')
        console.log(err)
      }
    })
  }
}

mongoose.connection.on('error', (message) => {
  console.log('connection error!') // This is logged
  console.log(message)
  process.exit()
})

mongoose.connection.on('disconnected', connect)

connect()
Looks like you are trying to use a replica set. If so, try connecting like the following:

var uri = `mongodb://${userName}:${encodeURIComponent(password)}@${clusterName}/${dbName}?ssl=true&replicaSet=${process.env.replicaSetName}&authSource=${authDB}`
var db = mongoose.connect(uri).then().catch() // Whatever inside the then and catch blocks
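A related sketch, in case it helps: the same replica-set details can be passed through mongoose's options object instead of the query string (hosts taken from the question; user, pass, and dbName as in the existing connect() code; the authSource value is an assumption):

// Sketch: same connection expressed via the options object
const uri = 'mongodb://db1-a1.example.net:27017,db2.example.net:2500';
mongoose.connect(uri, {
  user,
  pass,
  dbName,
  replicaSet: 'test',
  ssl: true,
  authSource: 'admin', // assumption: whichever auth database the cluster uses
  useNewUrlParser: true
}).then(
  () => console.log('connected'),
  err => {
    console.log('error connecting')
    console.log(err)
  }
);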
