I tried to close the connection for the query below using connection.close(), but it is not working. How do I close the connection inside a route file?
var express = require('express');
var router = express.Router();
var connection = require('../database.js');

/* GET home page. */
router.get('/', function(req, res, next) {
  connection.query("Select users..... ", function(err, supervisorrows) {
    if (err) {
      req.flash('error', err);
      res.render('View', {page_title: "Users - Node.js", supervisor: ''});
    } else {
      res.render('View', {page_title: "Users - Node.js", supervisor: supervisorrows.recordset});
    }
  });
});

module.exports = router;
According to the mysql npm docs, there are two ways to end a connection:
connection.end() method
connection.destroy() method
The first one will make sure all previously enqueued queries are still sent before a COM_QUIT packet is sent to the MySQL server.
The second one terminates a connection immediately and guarantees that no more events or callbacks will be triggered for the connection.
const mysql = require('mysql');
let connection = mysql.createConnection(...);
connection.connect((err) => {
  ...
  connection.query((err) => {
    ...
    connection.end();
  });
});
After I close the connection by using connection.end(), if I want to query the database again using the same credentials, do I need to make a new connection by calling mysql.createConnection(...), or can I reuse the same connection by simply calling connection.connect(...)?
A little background: I'm deploying an Angular/Node.js app to a shared hosting website, and the web host has a limit of 25 concurrent connections to the MySQL database, so I need to make sure I close a connection properly after a user finishes a query. I am not sure whether I can reuse the connection created by mysql.createConnection(...) after I close it, or whether I need to create a brand new connection.
You can use one global connection for getting data from the db.
If you are working in a single file, you can write it like this (app.js, one file only):
var mysql = require('mysql');
var connection = mysql.createConnection(...);
connection.query('SELECT 1', function (error, results, fields) {
if (error) throw error;
// connected!
});
If you want to use the same connection in multiple files, you can write it as follows.
app.js
var connection = require('express-myconnection'); // middleware that adds req.getConnection() (assuming express-myconnection, which matches this usage)

app.use(
  connection(mysql, {
    host: xxxxx,
    user: 'root',
    password: xxxx,
    port: 3306,
    database: dbname
  }, 'pool')
);
var oem = require('./routes/type');
app.get('/api/oemtype',oem.type);
For the second file
type.js
exports.type = function (req, res) {
  req.getConnection(function (err, connection) {
    connection.query('SELECT * FROM type', function (err, rows) {
      if (err) {
        return res.json({ status: 0 }); // return so we don't send two responses
      }
      res.render('customers', {page_title: "Customers - Node.js", data: rows});
    });
  });
};
There is no need to call connection.end(); with the 'pool' strategy the middleware releases the connection when the response ends.
Currently I don't use connection pooling in my setup since I only have 1-4 users in the application.
According to the docs, this is the recommended way:
var mysql = require('mysql');
var connection = mysql.createConnection({
host : 'example.org',
user : 'bob',
password : 'secret'
});
connection.connect(function(err) {
if (err) {
console.error('error connecting: ' + err.stack);
return;
}
console.log('connected as id ' + connection.threadId);
});
Now, what I did was export the connection object and share it with the other API resources.
In the db.js file:
var mysql = require('mysql');
var connection = mysql.createConnection({ ... });
module.exports = connection;
In the api.js file:
const express = require('express');
const router = express.Router();
const conn = require('./db');
router.post('/create', function (req, res) {
  conn.query('INSERT ...', function (error, results, fields) {
    if (error) throw error;
    res.json(results); // respond so the request does not hang
  });
});
router.post('/update', function (req, res) {
  conn.query('UPDATE SET ...', function (error, results, fields) {
    if (error) throw error;
    res.json(results); // respond so the request does not hang
  });
});
And so on; the same goes for the other API resources omitted from these examples.
What is the drawback in this connection design?
Please correct me if I'm wrong. I think that even though you export the connection object, every request is still funneled through that single connection, one query at a time.
Opening and maintaining a database connection for each user, especially requests made to a dynamic database-driven website application, is costly and wastes resources
Wikipedia
And from the same doc,
This is because two calls to pool.query() may use two different connections and run in parallel
So the best way is to use a pool to manage the connections.
Connections are lazily created by the pool. If you configure the pool to allow up to 100 connections, but only ever use 5 simultaneously, only 5 connections will be made. Connections are also cycled round-robin style, with connections being taken from the top of the pool and returning to the bottom.
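To make the pooled version concrete, here is a minimal sketch (the names are mine, not from the thread) of wrapping pool.query in a promise so routes never hold on to a connection themselves. `pool` can be anything exposing query(sql, params, callback); the object returned by mysql.createPool() fits that shape.

```javascript
// Promise wrapper around a callback-style pool.query(sql, params, cb).
// The pool hands out and reclaims connections internally, so callers
// only ever deal with SQL and results.
function queryAsync(pool, sql, params) {
  return new Promise(function (resolve, reject) {
    pool.query(sql, params, function (err, rows) {
      if (err) return reject(err);
      resolve(rows);
    });
  });
}
```

In a route you would then write something like queryAsync(pool, 'SELECT * FROM `users` WHERE id = ?', [req.params.id]).then(...), with no end() or release() call in sight.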
Hope that helps.
I am trying to incorporate MongoDB into my application; however, when I try to add to a collection I get the error "cannot read property 'insert' of undefined" when starting the server with Node.js.
I appreciate that this has been asked before. However, when I tried to fix the error as per another question asked on here by writing the following code, I get "var is not defined":
var accountCollection = var mongodb = mongodb.collection('account');
accountCollection.insert({username:"wendy3", password:"lilac3"});
The relevant code for my server is below. I have looked at many guides online and nothing seems to solve my problem, so any help would be appreciated.
//create server, listening on port 3000
//when there is a request to port 3000 the server is notified
//depending on the request a specific action will be carried out
var mongodb = require("mongodb");
//create connection to MongoDB
var db = mongodb('localhost:27017/Game', ['account', 'progress']);
//insert into collection
db.account.insert({username:"wendy3", password:"lilac3"});
var express = require('express');
var app = express();
var serv = require('http').Server(app);
var colors = require('colors/safe');
Connecting to a database is an asynchronous operation, but you're trying to access it as if it were synchronous.
If you look at the examples for the package you're using, they show that you must use a callback function, which is called once the connection response is received:
var MongoClient = require('mongodb').MongoClient;
// Connection URL
var url = 'mongodb://localhost:27017/Game';
// Use connect method to connect to the Server
MongoClient.connect(url, function(err, db) {
if (err) return console.log('Error: ' + err);
console.log("Connected correctly to server");
});
Thank you, I think I understand now (I am new to MongoDB, so please forgive me).
I now have the following code; however, I am receiving the error "invalid schema, expected mongodb". Have I maybe put MongoClient in place of mongodb somewhere? I can see that it is trying to connect, but the callback is returning an error.
Code:
var MongoClient = require('mongodb').MongoClient;
//connection url
var url = ('localhost:27017/Game', 'account', 'progress');
//use connect method to connect to server
MongoClient.connect(url, function(err, db){
if (err) return console.log('Error: ' + err);
console.log('mongodb is connected to the server');
});
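Two separate things are going wrong in that snippet, and both can be seen without a running MongoDB. The parenthesised, comma-separated list is the JavaScript comma operator, not a connection string; and MongoClient.connect() requires a URL with the mongodb:// scheme, which is exactly what "invalid schema, expected mongodb" is complaining about. A sketch:

```javascript
// The comma operator evaluates each operand and yields only the last one,
// so this "url" is just the string 'progress':
var url = ('localhost:27017/Game', 'account', 'progress');
console.log(url); // 'progress'

// What the connect call needs is a single URL with the mongodb:// scheme:
var fixedUrl = 'mongodb://localhost:27017/Game';
```

With fixedUrl passed to MongoClient.connect, the schema error should go away (assuming a MongoDB server is listening on that host and port).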
I am really confused about how to build a typical REST-type web app with Node.js (even though I am using sockets).
In a typical WAMP stack you have a public JavaScript file and server-side PHP files. The user might be able to execute a JS function like
function updateDetails(){
$.post('details.php', formData, function(data){
console.log(data);
},'json');
}
And the server-side PHP file is something like
$stmt = $pdo->prepare("UPDATE table SET user = :user");
$stmt->execute([':user' => $user]);
Now I do know about node-mysql, but I don't see how it is implemented in the same way. How can I maintain a list of statements on the server side and allow the user to simply execute these statements on the client side.
I have node_modules/mysql installed.
In my server (app.js)
var mysql = require('mysql');
But unless I add all my statements there, I don't seem to be able to access them from a public JS file.
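One common pattern (a sketch with made-up names, not from any answer here) is to keep the statements in a module of their own. Routes require this module and run the statements server-side; the browser only ever calls route URLs with parameters and never sees SQL.

```javascript
// statements.js (hypothetical): a central registry of SQL statements.
// Each route looks a statement up by name and supplies the parameters,
// so client-side code stays free of SQL entirely.
var statements = {
  updateUser: 'UPDATE `users` SET user = ? WHERE id = ?',
  getUserById: 'SELECT * FROM `users` WHERE id = ?'
};
module.exports = statements;
```

A route would then do something like pool.query(statements.getUserById, [req.params.id], callback), keeping all SQL in one place instead of scattered through app.js.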
There are libraries out there that wrap database tables as resources and expose them as REST services (GET, POST, PUT, DELETE).
For example, sails.js uses Waterline for this purpose. You should look into other similar frameworks if you do not want to write the routes from scratch.
If you're just using express (4.x), then you can declare routes like the following:
// In users.js
var express = require('express');
var users = new express.Router();
users.get('/', function (req, res, next) {
// access mysql, grab all users, then...
res.json(results);
})
users.get('/:id', function (req, res, next) {
var userid = req.params.id;
// access mysql, grab user by id, then...
res.json(result);
})
users.put('/', function (req, res, next) {
var newUser = req.body;
// you can also use POST instead of PUT, and alter where the information for the new user should come from.
// insert a new row with the data found in newUser then...
res.json(createdUser);
});
users.delete('/:id', function (req, res, next) {
var userid = req.params.id;
// delete the user by id then...
res.json(deletedUser);
});
If you're really adventurous, or are a stickler for having HTTP methods do exactly what they're supposed to, you can include a PATCH route for updating a user.
You have the option of either including mysql directly in the route, or declaring another module, then referencing that in the route. I'll finish one of the methods completely.
// more complete version of /users/:id
// more complete version of /users/:id
var mysql = require('mysql');
var connectionString = 'fill me in';
var pool = mysql.createPool(connectionString);
users.get('/:id', function (req, res, next) {
  var userid = req.params.id;
  var statement = 'SELECT * FROM `users` WHERE id = ?'; // backticks, not single quotes, for identifiers
  pool.query(statement, [userid], function (err, rows) {
    if (err) { return res.status(500).json(err); } // if you want to expose why you can't get the user by id
    if (rows.length === 0) { return res.status(404).end(); } // could not find user with the given id
    res.json(rows[0]); // return the found user
  });
});
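The `?` placeholder in the statement above is what keeps this safe against SQL injection: the driver escapes each value before splicing it into the query. The real mysql library does this properly via mysql.format; the toy version below is only to illustrate the idea and must not be used in place of the driver's own escaping.

```javascript
// Simplified illustration of '?' substitution: numbers pass through,
// strings are quoted with embedded quotes escaped.
function formatSql(sql, params) {
  var i = 0;
  return sql.replace(/\?/g, function () {
    var v = params[i++];
    if (typeof v === 'number') return String(v);
    return "'" + String(v).replace(/'/g, "\\'") + "'";
  });
}
console.log(formatSql('SELECT * FROM `users` WHERE id = ?', [42]));
// SELECT * FROM `users` WHERE id = 42
```

Passing values as a separate array, as pool.query(statement, [userid], ...) does above, is what makes this substitution possible; never build the SQL string by concatenating user input yourself.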
I can't figure out why this code is blocking and preventing the server from fulfilling concurrent requests, since I am only using async code. Can someone shed some light?
I am using mongoose's streaming feature! The following is an Express route.
function getChart(req, res, next) {
var stream = Model.find({}).stream({
transform: JSON.stringify
});
stream.on('data', function(record) {
res.write(record);
});
stream.on('end', function() {
res.end();
});
}
The problem can be verified when requesting around 10000 records from the database, which takes around 10 seconds. During this time I open another tab and make a request to any other route, say /home; the content of /home arrives only after the first request finishes!
EDIT
I have just run some basic tests (I even set the mongoose connection pool to a higher value), and now I know that the problem is not with mongoose but with Node.js itself, or perhaps with Express. I created the following test app:
var express = require('express'),
http = require('http'),
app = express();
app.get('/', function(req, res, next) {
console.log('> Request received!');
setTimeout(function() {
console.log('> Request sent!');
res.send(200);
}, 5000);
});
app.listen(8000, function() {
console.log(' > APP LISTENING');
console.log(' > maxSockets: ' + http.globalAgent.maxSockets);
});
Within the 5 seconds of the first request I open 3 more tabs in Chrome and make more requests; the message "> Request received!" is shown only after the first ones have already been sent.
I get 5 as the value of maxSockets.
Can anyone help me?
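One way to separate Node from the browser in this experiment is to time two overlapping delays inside a single Node process. If they overlap, the event loop is not the serializer, and the likely culprit is the browser, which may hold back duplicate requests to the same URL (Chrome in particular is known to serialize identical GETs); repeating the test with curl from two terminals avoids that. A sketch:

```javascript
// Two overlapping 100 ms waits complete in roughly 100 ms total, not
// 200 ms, showing the event loop runs the timer callbacks concurrently.
function delay(ms) {
  return new Promise(function (resolve) { setTimeout(resolve, ms); });
}
var start = Date.now();
Promise.all([delay(100), delay(100)]).then(function () {
  console.log('elapsed: ' + (Date.now() - start) + ' ms'); // ~100 ms
});
```

If the curl test also shows serialization, then the server really is blocking; if only the browser tabs do, the Node code above was never the problem.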