MySQL pooling in Node.js: should I use only one pool?

Here's the issue: when I use MySQL in my local environment I get no issues without pools, but when I connect to the remote DB that I want to use for production, I get an error about too many connections (it says the current number of connections is 10).
So here's what I did to solve the issue:
if (typeof GLOBAL.connection === 'undefined') {
    GLOBAL.connection = mysql.createPool({
        connectionLimit : 10,
        host            : this.host,
        user            : this.user,
        password        : this.password,
        database        : this.database
    });
}
this.connection = GLOBAL.connection;
This solves the issue by creating one global pool that all queries must run through. The only "problem", as I see it, is that now I have this pool sitting in a global variable.
Each time my queries run, a new Query() object is instantiated, which contains the above code. I'm basically just trying to find out whether this has repercussions I can't currently see that may bite me in the butt later?
Thanks for your help!
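
A common alternative (not part of the original post) is to let Node's module cache act as the singleton instead of attaching the pool to GLOBAL. A minimal sketch, assuming a hypothetical pool.js module and placeholder credentials:

// pool.js - created once per process, because require() caches the module
var mysql = require('mysql');

module.exports = mysql.createPool({
    connectionLimit : 10,
    host            : 'localhost',   // placeholder values
    user            : 'app_user',
    password        : 'secret',
    database        : 'app_db'
});

// query.js - every require('./pool') returns the same cached pool instance
var pool = require('./pool');

pool.query('SELECT 1', function (err, rows) {
    // handle err / rows here
});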

NodeJS and Mysql connection configuration ignored

I see some weird behavior while trying to query my MySQL database from a Node.js API.
I define a connection pool to MySQL in Node using the following code:
const mysql = require('mysql2')
const pool = mysql.createPool({
    host: process.env.DB_HOST,
    user: 'mydb.user',
    database: process.env.DB_DB,
    password: process.env.DB_PWD,
    waitForConnections: true,
    connectionLimit: 10,
    queueLimit: 0,
    multipleStatements: true
}).promise()
Before that, I was using another user named mydb.owner defined in a .env file.
When I execute a query, I get the following error:
Access denied for user 'mydb.owner'@'localhost' to database 'mydb'
That's not the user I've configured; that's the old one.
If I look at the MySQL connections, I can see that the pool's user is correct:
show processlist;
Returns
Id  User       Host             db
6   root       localhost:37752  mydb
9   mydb.user  localhost:38102  mydb
It seems no stray environment variable is defined elsewhere:
echo $DB_USER
Returns nothing.
The user seems to have the necessary rights:
SHOW GRANTS FOR 'mydb.user'@'localhost';
GRANT USAGE ON *.* TO 'mydb.user'@'localhost'
GRANT SELECT, INSERT, UPDATE, DELETE, EXECUTE ON `mydb`.* TO 'mydb.user'@'localhost' WITH GRANT OPTION
I don't understand why mysql2 returns an error about my old user mydb.owner.
The query called a stored procedure, which had been created by default with the clause
CREATE DEFINER = 'mydb.owner'@'localhost' PROCEDURE ...
No matter which user executed the stored procedure, it ran under the mydb.owner account.
To specify that the procedure must be executed under the current account, I added the following instruction:
CREATE DEFINER = 'mydb.owner'@'localhost' PROCEDURE ...()
SQL SECURITY INVOKER
BEGIN ...
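
For completeness, a small sketch (the procedure name my_proc and its argument are made up) of calling such a procedure through the mysql2 promise pool defined above; with SQL SECURITY INVOKER it now runs with the privileges of the pool's user:

async function callProc() {
    // CALL runs as 'mydb.user' once the procedure is declared SQL SECURITY INVOKER
    const [results] = await pool.query('CALL my_proc(?)', [42]);
    return results;
}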

Knex.js / SQL : Knex / SQL Connection Pools

I have a question regarding SQL connection pools. My team is using the knex.js library in one of our Node applications to make database queries.
The application needs to switch databases from time to time, so my team created an initialization function that returns a knex object configured for the correct database. That object is then used to run the query. To me this seems redundant and could hurt performance, because we initialize a knex object every time we need to do a query instead of reusing a single knex object. I could ignore that if knex already handles this when you switch databases (and if anyone could shed light on this question as well that would be FANTASTIC!). Moreover (and this leads me to the question in the title), the connection pool properties are redefined. So does that mean we are creating new pools every time, or does SQL (SQL Server in this case) reuse the connection pool you already defined? The question might not be Knex specific: if I used a library like knex for C# and called it in a similar way, would SQL Server know not to make more connection pools?
Example code:
/** db.js
 * @param {any} database
 * @returns db: Knex
 */
module.exports = ( database ) => {
    var knex = require('knex')({
        client: 'mssql',
        connection: {
            database: database,
            server: '127.0.0.1',
            user: 'your_database_user',
            password: 'your_database_password'
        },
        pool: {
            min: 0,
            max: 10,
            idleTimeoutMillis: 5000,
            softIdleTimeoutMillis: 2000,
            evictionRunIntervalMillis: 500
        }
    });
    return knex;
};
Index.js
var db = require('./db.js');
/**
 * @returns users:Array
 */
const getUsers = async() => {
    const users = await db('master')
        .select()
        .from('users_table')
        .orderBy('user_id');
    return users;
}
Short answer: The 'singleton' nature of the Node require() statement prevents reinitialization of multiple occurrences of knex, so the initially created pool continues to be used for the duration of your process and is not recreated, as long as you don't discard the db variable reference.
More discussion...
... my team created an initialization function that returns a knex
object configured to the correct database. Then that object is used to
do said query. To me this seems redundant and can cause bad
performance, because we initiate a knex object every time need to do a
query instead of reusing a single knex object. Which i could ignore if
knex already does this when you switch databases...
var db = require('./db.js');
The Node.js require statement creates a singleton object. (You probably already know) this means that the first time the module is called by your program using the require statement, the module and its data will be initialized, but successive identical require calls will just reuse the same module reference and will not reinitialize the module.
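
A tiny illustration of that caching behaviour (the file name counter.js is just an example):

// counter.js - module-level state is initialized only once per process
var calls = 0;
module.exports = function () { return ++calls; };

// app.js - both requires resolve to the same cached module instance
var a = require('./counter');
var b = require('./counter');
console.log(a()); // 1
console.log(b()); // 2  (shared state; the module was not reinitialized)
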
... the connection pool properties are redefined. So does that mean
we are creating new pools every time, or does the SQL ( SQL Sever
in this case) reuse the connection pool you already defined ?
So since the require()-ed module is not reinitialized, then the originally created pool will not be re-created. Unless you discard the db variable reference (discussed more below).
The question might not be Knex specific, like if i used a library like
knex for C#, and call that library a similar way, would SQL Server
know not to make more connection pools?
Generally speaking, you need to build or acquire some code to properly manage a pool of connections throughout the life of your process. Knex and most other database wrappers do this for us. (Under the covers Knex uses this library before v0.18.3 and this one on/after.)
Properly initializing and then using the singly initialized pooling code throughout the life of your application process accomplishes this. Discarding the pool and recreating it within your process defeats the purpose of having pooling. Often pooling is set up as part of process initialization.
Also, this was probably just a misstatement within your question, but your Node.js module is making the connection pools, not the SQL Server.
... The application from time to time needs to switch databases. my
team created an initialization function that returns a knex object
configured to the correct database.
From that statement, I would expect to see code like the following:
var db = require('./db.js');
var dbOther = require('./dbOther.js');
... which each establishes a different database connection. If you are instead using:
var db = require('./db.js');
// ... do other stuff here in the same module ...
var db = require('./dbOther.js');
... then you are likely throwing away the original reference to your first database, and in that case, YES, you are discarding your DB connection and connection pool as you switch connections.
Or, you could do something like the following:
// initialize the 2 connection pools
const dbFirst = require('./db.js');
const dbOther = require('./dbOther.js');
// set the active connection
var db = dbFirst;
// change the active connection
db = dbOther;
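
If the set of databases isn't known up front, another option (a sketch, not from the original answer) is to cache one knex instance per database name, so each pool is created only once and reused when you switch back:

// knexCache.js - hypothetical helper: one knex instance (and pool) per database
const instances = {};

module.exports = function getKnex(database) {
    if (!instances[database]) {
        instances[database] = require('knex')({
            client: 'mssql',
            connection: {
                database: database,
                server: '127.0.0.1',
                user: 'your_database_user',
                password: 'your_database_password'
            },
            pool: { min: 0, max: 10 }
        });
    }
    return instances[database];
};

Calling getKnex('master') twice then returns the same instance, while getKnex for another database gets its own pool.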

Write a listener for mongodb changes on server side

I want to write a listener for changes to a collection in MongoDB for a Node.js app on the server side. I am using Robe to get the oplog. Here is my code:
var co = require('co'),
    Robe = require('robe');

co(function*() {
    // connect to db
    var db = yield Robe.connect('mongodb://localhost/');
    yield collection.addWatcher(function(collectionName, operationType, data) {
        console.log(collectionName)
    });
    var oplog = yield db.oplog();
    yield oplog.start();
    // listen for any operation on any collection
    oplog.onAny(function(collectionName, operationType, data) {
        console.log("something happened!!!")
    });
})
.catch(function(err) {
    console.error(err);
});
The documentation for Robe says that to get the oplog, I need to connect to a replica set in MongoDB. I have been reading about replica sets in MongoDB and have not been able to make much sense of it. I did create a replica set called 'rs0'. I ran this command to start mongod:
mongod --replSet "rs0"
It is still not doing anything upon a DB change. Is this really the right way to do this?
Whenever you start mongod instances with a replica set configuration, you have to initiate the replica set with the rs.initiate() command and then add the replica set members.
When you connect to the replica set from driver clients, give the complete list of replica set mongod hosts, so that you take advantage of the replica set's automatic failover.
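
For example (host names and ports below are placeholders), initiating the set in the mongo shell and then connecting with the full member list:

// in the mongo shell
rs.initiate({
    _id: "rs0",
    members: [
        { _id: 0, host: "host1:27017" },
        { _id: 1, host: "host2:27017" },
        { _id: 2, host: "host3:27017" }
    ]
})

// in the Node.js app, inside the co(function*() { ... }) block from the question
var db = yield Robe.connect('mongodb://host1:27017,host2:27017,host3:27017/mydb?replicaSet=rs0');
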
I figured it out. You need to initiate the replica set. In your MongoDB shell, type:
rs.initiate()
And also make sure that you are connecting to the correct host name for the db.

Node.js with mysql from felixge design and strange behaviour

Hello, I noticed some strange behaviour in Node.js with Felix Geisendörfer's awesome mysql module.
I have the following structure in my express app.
app.js (main)
routesA.js
routesB.js
routesC.js
The routes use the Router function of express.
Every routing file (A-C) has to access the mysql server.
But I was too lazy to write a connection and the connection options in every routing script file.
So I made another file called DBServer.js. It is as follows (inspired by some hints from someone here on Stack Overflow):
var mysql = require('mysql');

exports.connect = function (){
    var db_config = {
        host: '127.0.0.1',
        user: 'my username',
        password: '********',
        database: 'my database'
    };
    var connection;

    function autoConnect() {
        connection = mysql.createConnection(db_config);
        connection.connect(function(err) {
            if(err) {
                console.log('DBServer Error: cannot connect to db. Reconnect attempt in 2 seconds...\nError: ', err);
                setTimeout(autoConnect, 2000);
            }
            else {
                console.log('DBServer connected successfully...');
            }
        });
        connection.on('error', function(err) {
            if(err.code === 'PROTOCOL_CONNECTION_LOST') {
                console.log('DBServer Error: lost connection. Reconnect attempt in 2 seconds...\nError: ', err);
                autoConnect();
            }
            else {
                console.log('DBServer Error: minor error\nError: ', err);
            }
        });
    }

    autoConnect();
    return connection;
}
In every routing file I require this DBServer file via:
var db = require('./lib/DBServer').connect();
When I start my app, the console logs 3 times
DBServer connected successfully...
DBServer connected successfully...
DBServer connected successfully...
... as intended.
Everything works perfectly. I run the app with forever, and every time the script loses the connection to the DB server (which happens from time to time) it reconnects again... as intended.
... BUT! Except for one script: routesA.js stops working when it comes to a MySQL query. The script freezes but does not quit; I have to stop and restart it.
There is no difference in how routesA, routesB and routesC invoke DBServer.js, and it works pretty well... But it seems that if routesA loses its connection it does not reconnect, while routesB and routesC still work fine.
So I changed the way routesA.js connects to the database. I now connect in routesA.js not via DBServer.js and require, but the manual way:
var mysql = require('mysql');
var db = mysql.createConnection({
    host: '127.0.0.1',
    user: 'my username',
    password: '******',
    database: 'ma database'
});
db.connect();
Now it works... and runs for days without problems. But the reason this works is that now I don't have any error handling in routesA.js, so forever detects the script exit, restarts it... and everything works again.
But I don't want it that way. I want proper error handling like in DBServer.js. As said, this works for scripts B and C, but not script A...
I know it is strange and difficult to say what the problem might be, but maybe someone has had a similar problem.
Another question: how do you handle database connections across multiple script files? Is there a way to share one MySQL connection between all the script files in an app?
kind regards
martin
The reason probably has to do with the fact that you're returning the initial connection object on require(), but if you get disconnected, you reassign the connection variable which the external scripts do not have a reference to (they still only have a reference to the old/original connection object).
I should also note that if you're using the mysql2 module (compatible with mysql except much faster), there is a connection.ping() method that you can use to periodically ping the server to help keep the connection alive.
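
One way around that (a sketch, not the module author's recommendation) is to keep the live connection private and export an accessor, so callers always see the current connection after a reconnect:

// DBServer.js - sketch: same reconnect logic as above, but behind a getter
var mysql = require('mysql');
var db_config = {
    host: '127.0.0.1',
    user: 'my username',
    password: '********',
    database: 'my database'
};
var currentConnection;

function autoConnect() {
    currentConnection = mysql.createConnection(db_config);
    currentConnection.connect(function (err) {
        if (err) { setTimeout(autoConnect, 2000); }
    });
    currentConnection.on('error', function (err) {
        if (err.code === 'PROTOCOL_CONNECTION_LOST') { autoConnect(); }
    });
}
autoConnect();

// callers always go through the getter, so they pick up the reconnected object
exports.getConnection = function () {
    return currentConnection;
};

// routesA.js
var db = require('./lib/DBServer');
db.getConnection().query('SELECT 1 + 1 AS two', function (err, rows) { /* ... */ });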

What is the best way to open a persistent (mongo) database connection in NodeJS

I am using the node-mongodb-native drivers and I am looking for a way to open a persistent database connection rather than opening/closing it each time.
A simplified connection might look like this...
var DB = new mongo.Db('vows', new mongo.Server("127.0.0.1", 27017, {})),
    connection = DB.open(function(err, db) {
        // Here we have access to db
    });
How can I make the db object accessible to any module in my application? Rather than having to open the connection for every module separately?
Can this be done using module.exports? Or a global variable?
My solution:
getClient = function(cb) {
    if(typeof client !== "undefined") {
        return cb(null, client);
    } else {
        db.open(function(err, cli) {
            client = cli;
            getClient(cb);
        });
    }
}
Now, instead of
db.open(function(err, client) {
    ...stuff...
});
Do:
getClient(function(err, client) {
    ...stuff...
});
Your first db call opens a connection, the others use that connection.
BTW: suggestions on checking that client is still alive?
Edit: Don't use Mongoose. Use something like mongo-col or mongo-client. Then have a single client open in your application. I have a ./client.js file that exports a properly opened and configured mongo client.
Mongoose is a solid abstraction on top of mongodb that will allow you to handle mongodb more easily. It's worth a look.
What you really want to do though is re-open your client every time you do anything with mongo.
You don't keep an open connection to any other database.
Just place your DB in a module along with some helper / wrapper functions.
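
A minimal sketch of that module approach (file and database names are only illustrative), using the same node-mongodb-native style as the question:

// client.js - opens the connection once and shares the db handle
var mongo = require('mongodb');
var DB = new mongo.Db('vows', new mongo.Server('127.0.0.1', 27017, {}));
var openedDb = null;

exports.get = function (cb) {
    if (openedDb) { return cb(null, openedDb); }
    DB.open(function (err, db) {
        if (err) { return cb(err); }
        openedDb = db;
        cb(null, openedDb);
    });
};

// anywhere else in the app
var client = require('./client');
client.get(function (err, db) {
    // db is the shared, already-open handle
});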
