I'm confused about where to use knex.destroy() in my Node API.
If I don't use knex.destroy() after I open the connection to make a call, the connection pool fills up over time, leading to this error:
Unhandled rejection TimeoutError: Knex: Timeout acquiring a connection. The pool is probably full. Are you missing a .transacting(trx) call?
If I close the connection when I'm done with it, which makes sense to me,
router.get('/users', function(req, res, next) {
  var select = knex.select('*').from('users');
  select.then((result) => {
    res.send(result);
  }).catch((error) => {
    res.send(error);
  }).finally(function() {
    knex.destroy(); // close it when I'm done
  });
});
then the connection is closed for subsequent API calls:
Unhandled rejection Error: Unable to acquire a connection
at Client_PG.acquireConnection (/var/app/current/node_modules/knex/lib/client.js:331:40)
So where and when do I actually destroy the connection? Again, this Node application simply serves as an API. Each API call should open, then close, the connection, but knex doesn't seem to like this.
Router files that require knex: (I do this for each router file)
const knexService = require('../knexService');
const bookshelf = knexService.bookshelf;
const knex = knexService.knex;
let User = require('../models/User');

module.exports = function(app, router) {
  router.get('/users', function(req, res, next) {
    var select = knex.select('*').from('users');
    select.then((result) => {
      res.send(result);
    }).catch((error) => {
      res.send(error);
    }).finally(function() {
      knex.destroy(); // close it when I'm done
    });
  });
...
UserModel file
const knexService = require('../knexService');
const bookshelf = knexService.bookshelf;
var BaseModel = require('./BaseModel');
var addressModel = require('./Address').Address;
var User = BaseModel.extend({
  tableName: 'users',
  hasTimestamps: true,
  addresses: function() {
    return this.hasMany(addressModel);
  }
});
KnexService.js
const knexfile = require('./knexfile');
const knex = require('knex')(knexfile.production);
const bookshelf = require('bookshelf')(knex);
module.exports.knex = knex;
module.exports.bookshelf = bookshelf;
KnexFile.js
module.exports = {
  development: {
    client: 'pg',
    version: '7.2',
    connection: {
...
knex.destroy() should be called when you want knex to throw away all connections from the pool and stop all timers etc., so that the application can end gracefully.
So basically it should be called only on application exit, unless you are doing something more complex, for example connecting to multiple databases with multiple knex instances.
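For example, here is a minimal sketch of tying knex.destroy() to process shutdown (the signal handlers are illustrative, not part of the original setup):

const knexService = require('./knexService');

function shutdown() {
  // Drains the pool and stops knex's internal timers so the process can exit.
  knexService.knex.destroy()
    .then(() => process.exit(0))
    .catch(() => process.exit(1));
}

process.on('SIGINT', shutdown);  // Ctrl+C
process.on('SIGTERM', shutdown); // e.g. sent by a process manager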
If you are running out of connections and the pool fills up, it means that you have problems in your code. Possible reasons could be:
- making too many long-lasting queries concurrently
- creating transactions and never committing / rolling back, so those connections are never returned to the pool (see the sketch below)
- a bug in knex / bookshelf / some middleware
You should try to pinpoint which parts of your app cause the pool to fill up; remove all the extra code like the bookshelf-related stuff and find the minimal setup with which you can replicate the problem (also remove all transactions to start with).
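On the transaction point: a minimal sketch of a correctly scoped knex transaction, with a made-up helper name for illustration. Passing a handler to knex.transaction() commits when the returned promise resolves and rolls back when it rejects, so the connection always goes back to the pool:

function createUser(user) {
  return knex.transaction(function(trx) {
    // Resolving commits; throwing/rejecting rolls back.
    return trx.insert(user).into('users');
  });
}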
Are you really using PostgreSQL 7.2, or are you connecting to some custom PostgreSQL-compatible DB? That could cause some issues, though I don't think they would reveal themselves this way, but rather as broken connections being left in the pool.
Related
I'm new to JavaScript, Node.js (and backend development in general). I am trying to create a controller for the login page requests, and I am confused about getting data from the MySQL table, user authentication, and working with the JWT package.
In my controller, I first check if the user input exists in the user table (with a simple stored procedure), then I compare the database password with the user input, and after this I want to create a token with a limited lifetime. (I have watched some tutorial videos about JWT and there is no problem with it.) My main problem is figuring out how to write a proper controller with these functions.
I have two other questions:
1. Is it the right and secure way to get data from a MySQL table inside the route? Or should I create a JS class for my controller? (I'm a bit confused and doubtful here.)
2. Assuming that comparePassword() returns true, how can I continue coding outside of the db.query callback function scope? Because I have to execute comparePassword() inside the db.query callback.
loginController.js :
const { validationResult } = require('express-validator');
const bcrypt = require('bcrypt');
const db = require('../../sqlConnection')

let comparePassword = (dbPass, inputPass) => {
  bcrypt.compare(inputPass, dbPass, function(err, result) {
    console.log(result)
  });
}

// for get request
exports.getController = (req, res) => {
  res.send('login')
}

// for post request
exports.postController = (req, res) => {
  let errors = validationResult(req)
  if(!errors.isEmpty()) {
    res.status(422).json({ errors: errors.array() })
  }
  // find data from MYSQL table
  let sql = `CALL findUser(?)`
  db.query(sql, [req.body.username], (err, res) => {
    if(err) console.log(err)
    //console.log(Object.values(JSON.parse(JSON.stringify(res[0]))))
    var data = JSON.stringify(res[0])
    data = JSON.parse(data).find(x => x)
    data ? comparePassword(data.password, req.body.password) : res.status(400).send('cannot find user')
  })
  res.send('post login')
}
login.js :
const express = require('express')
const router = express.Router()
const { check } = require('express-validator');
const loginCont = require('../api/controllers/loginController')

router.route('/')
  .get(
    loginCont.getController
  )
  .post(
    [
      check('username').isLength({min: 3}).notEmpty(),
      check('password').isLength({min: 4}).notEmpty()
    ],
    loginCont.postController
  )

module.exports = router
From my point of view, there is no easy answer to your question, so I will try to give you some directions to help you figure out the gaps in your code.
First question: MySQL and business logic in the controller
In a design pattern like MVC or ADR (please take a look at the links for the flow details), the Controllers (MVC) or Actions (ADR) are the entry point for the call, and a good practice is to use these entry points to basically:
- Instantiate a service/class/domain-class that supports the request;
- Call the necessary method/function to resolve what you want;
- Send out the response.
This sample project can help you on how to structure your project following a design pattern: https://riptutorial.com/node-js/example/30554/a-simple-nodejs-application-with-mvc-and-api
Second question: db and continuing the process
For authentication, I strongly suggest you take a look at the OAuth or OAuth2 authentication flow. OAuth(2) has a process where you generate a token, and with that token you can always check it in your controllers, which makes the service a lot easier.
Also consider that you may need to create some external resources/services to verify that the token is right and valid, but it will make your job easier.
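As an illustration of that token check, here is a sketch using the jsonwebtoken package (the package choice, the secret, and the middleware name are assumptions, not part of the original answer):

const jwt = require('jsonwebtoken');

// Hypothetical middleware: verify the token once, so controllers can
// assume req.user is already set instead of re-checking credentials.
function requireAuth(req, res, next) {
  const header = req.headers.authorization || '';
  const token = header.replace('Bearer ', '');
  jwt.verify(token, process.env.JWT_SECRET, (err, payload) => {
    if (err) return res.status(401).send('invalid token');
    req.user = payload;
    next();
  });
}

module.exports = requireAuth;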
This sample project should give you an example about how to scope your functions in files: https://github.com/cbroberg/node-mvc-api
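On the second question specifically (continuing after the db.query callback), one common direction is to wrap the lookup in a promise so the controller can await it. This is a sketch, assuming the callback-style db.query from the question and the promise form of bcrypt.compare; the helper name and response texts are made up:

function findUser(username) {
  return new Promise((resolve, reject) => {
    db.query('CALL findUser(?)', [username], (err, rows) => {
      if (err) return reject(err);
      resolve(rows[0] && rows[0][0]); // first row of the first result set
    });
  });
}

exports.postController = async (req, res) => {
  try {
    const user = await findUser(req.body.username);
    if (!user) return res.status(400).send('cannot find user');
    const ok = await bcrypt.compare(req.body.password, user.password);
    if (!ok) return res.status(401).send('wrong password');
    // continue here: create and send the time-limited JWT
    res.send('logged in');
  } catch (err) {
    res.status(500).send('server error');
  }
};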
Summary
You may have to think about splitting your functions into scoped domains so you can work with them separately, instead of having all the logic inside the controllers. Then you will get closer to classes/services like authentication, user, product, etc., which can be used and reused among your controllers.
I hope this answer guides you closer to your goal.
I'm trying to get my frontend to watch for events whenever a certain table on my postgres db is altered.
The Postgres events fire perfectly and I'm able to relay them through the Socket.io connection, but I'm having reliability issues. I'm getting (node:26) UnhandledPromiseRejectionWarning: Error: Client was closed and is not queryable errors on my server, and often events are not emitted and caught by Socket.io. I assume it has to do with the way I connect the Socket / db clients.
pg config:
const {Pool} = require('pg');

const production = process.env.NODE_ENV === 'production';
const connectionString = `postgresql://${process.env.DB_USER}:${process.env.DB_PASSWORD}@${process.env.DB_HOST}:${process.env.DB_PORT}/${process.env.DB_DATABASE}`

const pool = new Pool({
  connectionString: process.env.CONNECTION_STRING ? process.env.CONNECTION_STRING : connectionString,
  ssl: production,
  connectionTimeoutMillis: 5000,
  idleTimeoutMillis: 30000
});
index.js
io.of("/marketDetails").on('connect', (socket) => {
  pool.connect((err, client, release) => {
    if (err) {
      console.log(err);
      return;
    }
    client.query('LISTEN update_table');
    client.on('notification', async (data) => {
      console.log("notified of table change")
      handleDBEvent(socket, data);
    })
    socket.on("disconnect", () => {
      client.query('UNLISTEN update_table');
    })
    release();
  })
});
I get notified on certain table changes but very inconsistently.
You are immediately releasing the database client that you acquired, before any notifications can happen, and you're getting the error message every time the socket disconnects and you try to run the UNLISTEN command on the released client, whose connection was closed after 30s.
Instead, use
socket.on("disconnect", async () => {
  try {
    await client.query('UNLISTEN update_table');
  } finally {
    release();
  }
});
Btw, I would recommend not acquiring a new database connection for each socket.io client; database connections are far too valuable for that. Instead, create a single client for your app (you might not even need a pool), have it listen to the update_table events (maybe only while sockets are connected), and then broadcast each event to all currently connected sockets.
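A minimal sketch of that single-listener approach; the namespace and LISTEN channel come from the question, while the event name and error handling are illustrative:

const { Client } = require('pg');

const listener = new Client({ connectionString });
const market = io.of("/marketDetails");

listener.connect()
  .then(() => listener.query('LISTEN update_table'))
  .catch(console.error);

// One notification handler for the whole app: broadcast to every
// connected socket instead of acquiring a pool client per socket.
listener.on('notification', (msg) => {
  market.emit('tableChanged', msg.payload);
});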
I'm writing a node app to log some information in a mongo database.
Below is the snippet of code that is called each time I need to store a log in the mongo database.
const mongo = {}
const mongo_cli = require('mongodb').MongoClient

module.exports = {
  log (l) {
    mongo_cli.connect(the_mongo_url, (error, client) => {
      if (error) throw error;
      mongo.cli = client;
      mongo.db = client.db(the_database);
      //insert and update operations
    });
  }
}
The code above works for now. I mean, I can insert and update the logs already inserted, at the price of one (or more) connections that I never close, due to my lack of control over the callback functions.
So, how can I structure it better so that I have only one mongo_cli.connect call and don't consume too many resources?
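One possible direction, sketched as an assumption (it presumes a mongodb driver version whose connect() returns a promise, and a made-up 'logs' collection name): connect once, cache the promise, and reuse the client for every log call.

const mongo_cli = require('mongodb').MongoClient

// Connect once and cache the promise; every caller reuses the same
// client instead of opening a new connection per log call.
let dbPromise = null;
function getDb() {
  if (!dbPromise) {
    dbPromise = mongo_cli.connect(the_mongo_url)
      .then((client) => client.db(the_database));
  }
  return dbPromise;
}

module.exports = {
  log (l) {
    return getDb()
      .then((db) => db.collection('logs').insertOne(l))
      .catch(console.error);
  }
}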
I have a server on Sails (Node.js), and I am trying to connect my controllers to my MySQL db through a wrapper file that creates the connection pool. My intention is to use that pool every time a function in any controller needs to interact with the DB, in such a way that a connection is created when the interaction starts and closed when the interaction is over. For this, I have created a wrapper file, db.js.
db.js
var mysql = require('mysql');
var connection = mysql.createConnection({
  host: "localhost",
  port: '3306',
  user: "ye_old_username",
  password: "ye_old_password",
  database: "ye_old_schema"
});
module.exports = connection;
Now, I am creating a connection pool called ConnectionPool.js
ConnectionPool.js
var mysql = require('mysql'),
config = require("./db");
/*
* #sqlConnection
* Creates the connection, makes the query and close it to avoid concurrency conflicts.
*/
var sqlConnection = function sqlConnection(sql, values, next) {
// It means that the values hasnt been passed
if (arguments.length === 2) {
next = values;
values = null;
}
var connection = mysql.createConnection(config);
connection.connect(function(err) {
if (err !== null) {
console.log("[MYSQL] Error connecting to mysql:" + err+'\n');
}
});
connection.query(sql, values, function(err) {
connection.end();
if (err) {
throw err;
}
next.apply(this, arguments);
});
}
module.exports = sqlConnection;
I have followed the method answered on this question to create the connection pool: How to provide a mysql database connection in single file in nodejs
And finally, I am trying to run a function from a controller using the wrapper and the connection pool. The code inside the Controller is
var connPool = require('./ConnectionPool');
module.exports = {
  testConn: function(req, res) {
    connPool('SELECT * from user where ?', {id: '1'}, function(err, rows) {
      if (err) {
        sails.log.debug(err);
      } else {
        console.log(rows);
      }
    });
  }
};
All three files (the wrapper, the connection pool, and the controller) are in the same Controllers folder.
Now, when I send a request through my client to the URL that invokes the testConn function inside the controller, I get the following response in the server log:
[MYSQL] Error connecting to mysql:Error: ER_ACCESS_DENIED_ERROR: Access denied for user ''@'localhost' (using password: NO)
This error comes from the line connection.connect(function(err) { in the connection pool file.
When I try to log in to my MySQL db with the same credentials on the command line, I get through. Therefore I believe the db.js file has some format-related issue because of which a proper connection is not getting initiated. There could be other reasons as well, but this one seems to be the strongest suspect.
I need some guidance on solving this issue. Any help will be appreciated.
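One plausible culprit, offered as an assumption rather than a confirmed fix: db.js exports an already-created Connection object, but ConnectionPool.js passes that object to mysql.createConnection() as if it were a plain options object, so the credentials arrive empty (which would match the ''@'localhost' (using password: NO) in the error). A sketch of db.js exporting only the options:

// db.js — export the plain options object, not a live connection,
// so mysql.createConnection(config) in ConnectionPool.js receives
// real credentials.
module.exports = {
  host: "localhost",
  port: '3306',
  user: "ye_old_username",
  password: "ye_old_password",
  database: "ye_old_schema"
};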
Background Information
I'm attempting my first node.js API/application. As a learning exercise, I'm trying to create some test cases that initially delete all records in a table, insert 3 specific records, and then query for those 3 records.
Code
Here's the code I have cobbled together:
http://pastebin.com/duQQu3fm
Problem
As you can see from the code, I'm trying to put the database connection logic in a dbSession.js file and pass it around.
I am able to start up the http server by doing the following:
dev#devbox:~/nimble_node$ sudo nodejs src/backend/index.js
Server started and listening on port: 8080
Database connection successful
However, when I try to run my jasmine tests, it fails with the following error:
F
Failures:
1) The API should respond to a GET request at /api/widgets/
Message:
TypeError: Object #<MongoClient> has no method 'collection'
Stacktrace:
TypeError: Object #<MongoClient> has no method 'collection'
at resetDatabase (/home/dev/nimble_node/spec/resetDatabase.js:6:29)
at /home/dev/nimble_node/spec/e2e/apiSpec.js:23:25
at /home/dev/nimble_node/node_modules/async/lib/async.js:683:13
at iterate (/home/dev/nimble_node/node_modules/async/lib/async.js:260:13)
at async.forEachOfSeries.async.eachOfSeries (/home/dev/nimble_node/node_modules/async/lib/async.js:279:9)
at _parallel (/home/dev/nimble_node/node_modules/async/lib/async.js:682:9)
at Object.async.series (/home/dev/nimble_node/node_modules/async/lib/async.js:704:9)
at null.<anonymous> (/home/dev/nimble_node/spec/e2e/apiSpec.js:19:9)
at null.<anonymous> (/home/dev/nimble_node/node_modules/jasmine-node/lib/jasmine-node/async-callback.js:45:37)
Finished in 0.01 seconds
1 test, 1 assertion, 1 failure, 0 skipped
Database connection successful
Line 6 of resetDatabase is:
var collection = dbSession.collection('widgets');
Given that the "Database connection successful" message appears after the error, I think what's happening is that when the tests require the dbSession library, the database hasn't finished running the code to connect, and therefore I can't get the collection object.
I'm currently reading through the mongodb online manual to see if I can find some hints as to how to do something like this.
Any suggestions or pointers would be appreciated.
EDIT 1
To prove that there is a collection method on the MongoClient object, I changed the dbSession.js code to look like this:
'use strict';

var DBWrapper = require('mongodb').MongoClient;
var dbWrapper = new DBWrapper;

dbWrapper.connect("mongodb://localhost:27017/test", function(err, db) {
  if (!err) {
    console.log("Database connection successful");
    dbWrapper = db;
    var collection = dbWrapper.collection('widgets');
    console.log('just created a collection...');
  }
});
module.exports = dbWrapper;
And now, when I start up the http server (index.js), notice the messages:
dev#devbox:~/nimble_node$ sudo nodejs src/backend/index.js
Server started and listening on port: 8080
Database connection successful
just created a collection...
It could be an async issue.
Your code in dbSession.js
dbWrapper.connect("mongodb://localhost:27017/test", function(err, db) {
  if (!err) {
    console.log("Database connection successful");
    dbWrapper = db;
  }
});
module.exports = dbWrapper;
starts the connection at dbWrapper asynchronously, but exports dbWrapper right away, which is then imported in resetDatabase. Thus, yes, connect may not have returned yet when you use dbWrapper in resetDatabase, which is what the log suggests, as the error appears before the success message.
You could add a callback that fires after dbWrapper.connect() returns, so that dbWrapper is only used once the connection is established.
(With sqlite this may not happen, as it accesses the DB faster locally.)
This may not be your problem, but it looks like a candidate.
EDIT: Here's a possible example with a callback, but please note that it depends on what you need to do, so there are a lot of different solutions. The key is to call a callback function when you are done initializing.
Another solution could be to simply wait and/or poll (e.g. check a variable 'initialized').
'use strict';

var DBWrapper = require('mongodb').MongoClient;
var dbWrapper = new DBWrapper;

function doConnect(callback) {
  console.log("Initializing DB connection...");
  dbWrapper.connect("mongodb://localhost:27017/test", function(err, db) {
    if (!err) {
      console.log("Database connection successful");
      dbWrapper = db;
      var collection = dbWrapper.collection('widgets');
      console.log('just created a collection...');
      console.log('calling callback...');
      callback(dbWrapper);
    } else {
      console.log("Error connecting: " + err);
    }
  });
};

doConnect(function(correctDbWrapper) {
  // Now you can use the wrapper
  console.log("Inside callback, now consuming the dbWrapper");
  dbWrapper = correctDbWrapper;
  var collection = dbWrapper.collection('widgets');
});
It's interesting, though: I never ran into this issue, although I have generally used code similar to yours. I guess it's because normally I have this DB initialization right at the top and then do lots of other initialization in the node app, which gives the connect call enough time to return.
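A variant of the same idea, offered as a sketch rather than part of the answer above: export a promise instead of the wrapper, so every consumer waits for the connection before touching collections.

'use strict';

var MongoClient = require('mongodb').MongoClient;

// Export the pending connection itself; callers do
// dbPromise.then(function(db) { ... }) and are guaranteed the
// connection has finished before they use it.
var dbPromise = new Promise(function(resolve, reject) {
  MongoClient.connect("mongodb://localhost:27017/test", function(err, db) {
    if (err) return reject(err);
    resolve(db);
  });
});

module.exports = dbPromise;

A test's setup step would then do require('./dbSession').then(...) instead of assuming the export is usable synchronously.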