I have a question about Node.js's require function. Imagine we have one module that manages the database connection and many small modules that contain the routes.
An example connection file, db.js:
const mysql = require('mysql');

const connection = mysql.createConnection({
  host     : '127.0.0.1',
  user     : 'root',
  password : '',
  database : 'chat'
});

connection.connect(function(err) {
  if (err) throw err;
});

module.exports = connection;
and one of the various files to manage the routes:
const express = require('express');
const app = express();
const router = express.Router();
const db = require('./db');

router.get('/save', function(req, res) {
  // some code for db
});

module.exports = router;
Now imagine having 20 route files with the same require. How will Node.js behave? How many times will my connection be created?
There will be one connection, because db.js runs only once. Node.js caches modules: whatever you assign to module.exports is stored, and that same object is returned by every subsequent require("./db"). To verify:
require("./db") === require("./db") // true
I'm struggling to create an API call to my database from Node.js.
I have a Postgres instance on CentOS with multiple databases and tables.
I'm trying to get the table "test_reslts" from the database "sizing_results".
When the URL is just the server IP and port, like http://{SERVER IP}:3300/, the output is:
"Cannot GET /"
When I add the table name (or the database name and table name), the request never completes and there is no output.
My code:
db_connection.js
const { Client } = require('pg')

const client = new Client({
  user: 'postgres',
  database: 'sizing_results',
  password: 'password',
  port: 5432,
  host: 'localhost',
})

module.exports = client
api.js
const client = require('./db_connection.js')
const express = require('express'); // To make API calls
const app = express();

app.listen(3300, () => {
  console.log("Server is now listening at port 3300");
})

client.connect();

app.get('/test_results', (req, res) => {
  client.query(`select * from test_results`, (err, result) => {
    if (!err) {
      res.send('BLABLA')
      res.send(result.rows);
    }
  });
  client.end;
})
Based on your answer to my comment, I think it's not a DB connection problem.
It seems to be linked to your routing, because Node never seems to reach the inside of your route.
Are you sure you are calling the right route (http://localhost:3300/test_results), with no typo, and with the right HTTP method (GET)?
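Separately, the handler itself has a couple of issues even once the route is reached: it calls res.send twice on success and never actually invokes client.end (the parentheses are missing). A minimal sketch of how it could be restructured, reusing the same connection settings and table (an illustration, not the original code):

const client = require('./db_connection.js');
const express = require('express');
const app = express();

client.connect();

app.get('/test_results', (req, res) => {
  client.query('select * from test_results', (err, result) => {
    if (err) {
      // surface the error instead of dropping it silently
      return res.status(500).send(err.toString());
    }
    // send exactly one response per request
    res.json(result.rows);
  });
  // note: don't call client.end() per request; keep the client open for later calls
});

app.listen(3300, () => {
  console.log("Server is now listening at port 3300");
});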
So I built a frontend where you can fill in a movie name and a review and submit it to a database. Now I'm trying to connect a MySQL database I created to index.js, so that it gets filled with the first entry. I'm trying to accomplish it like this:
const express = require('express');
const app = express();
const mysql = require('mysql');

const db = mysql.createPool({
  host: "localhost",
  user: "root",
  password: "password",
  database: 'CRUDDatabase',
});

app.get('/', (req, res) => {
  const sqlInsert = "INSERT INTO Movie_Reviews(movieName, movieReview) VALUES (1,'inception', 'good movie');"
  db.query(sqlInsert, (err, result) => {
    res.send("change done");
  });
})

app.listen(3001, () => {
  console.log("running on port 3001")
})
But somehow the frontend gets the text I sent ("change done"), while the database still doesn't show any entries. Any ideas where my mistake might be? Is it a code mistake, or does it have to do with my DB configuration? In MySQL Workbench I just created a default connection without changing anything.
EDIT: The error seems to be the following:
Error: ER_NOT_SUPPORTED_AUTH_MODE: Client does not support authentication protocol requested by server; consider upgrading MySQL client
EDIT:
The following comment here solved my problem:
Execute the following query in MySQL Workbench: ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'password'; where root is your user, localhost is your host, and password is your password. Then run this query to refresh privileges: FLUSH PRIVILEGES; Try connecting from Node after you do so. If that doesn't work, try it without the @'localhost' part.
I think you have an error in your code, but you are not seeing it because you never check the err variable. Try this code to see what error you are getting:
const express = require('express');
const app = express();
const mysql = require('mysql');

const db = mysql.createPool({
  host: "localhost",
  user: "root",
  password: "password",
  database: 'CRUDDatabase',
});

app.get('/', (req, res) => {
  const sqlInsert = "INSERT INTO Movie_Reviews(movieName, movieReview) VALUES (1,'inception', 'good movie');"
  db.query(sqlInsert, (err, result) => {
    if (err) {
      console.log(err);
      return res.send(err.toString()); // return so a second response isn't sent below
    }
    res.send("change done");
  });
})

app.listen(3001, () => {
  console.log("running on port 3001")
})
So, as Med Amine Bejaoui pointed out in a comment, the solution is:
Execute the following query in MySQL Workbench: ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'password'; where root is your user, localhost is your host, and password is your password. Then run this query to refresh privileges: FLUSH PRIVILEGES; Try connecting from Node after you do so. If that doesn't work, try it without the @'localhost' part.
const mysql = require('mysql');

let connection = mysql.createConnection(...);

connection.connect((err) => {
  ...
  connection.query((err) => {
    ...
    connection.end();
  });
});
After I close the connection by calling connection.end(), if I want to query the database again using the same credentials, do I need to make a new connection by calling mysql.createConnection(...), or can I reuse the same connection by simply calling connection.connect(...)?
A little background: I'm deploying an Angular/Node.js app to a shared hosting provider, and the web host has a maximum limit of 25 concurrent connections to the MySQL database, so I need to make sure I close a connection properly after a user finishes their query. I'm not sure whether I can reuse the connection created by mysql.createConnection(...) after I close it, or whether I need to create a brand new connection.
You can use one global connection for getting data from the database.
If you are working in a single file, you can write it as follows.
app.js (one file only):
var mysql = require('mysql');
var connection = mysql.createConnection(...);

connection.query('SELECT 1', function (error, results, fields) {
  if (error) throw error;
  // connected!
});
If you want to use the same connection in multiple files, you can write it as follows.
app.js:
app.use(
  connection(mysql, {
    host: xxxxx,
    user: 'root',
    password: xxxx,
    port: 3306,
    database: dbname
  }, 'pool')
);

var oem = require('./routes/type');
app.get('/api/oemtype', oem.type);
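Note that connection(...) in the middleware above is not defined anywhere in this snippet; it appears to come from a connection-middleware package such as express-myconnection (an assumption, since the original answer omits the require), which would be pulled in at the top of app.js roughly like this:

var connection = require('express-myconnection'); // assumed package; provides req.getConnection in the routes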
For the second file
type.js
exports.type = function(req, res){
  req.getConnection(function(err, connection) {
    var query = connection.query('SELECT * FROM type', function(err, rows) {
      if (err) {
        return res.json({
          status: 0
        });
      }
      // reply with either the raw rows or a rendered view, not both
      res.send(rows);
      // res.render('customers', { page_title: "Customers - Node.js", data: rows });
    });
  });
};
There is no need to use connection.end().
Currently I don't use connection pooling in my setup, since I only have 1-4 users in the application.
According to the docs, this is the recommended way:
var mysql = require('mysql');
var connection = mysql.createConnection({
  host     : 'example.org',
  user     : 'bob',
  password : 'secret'
});

connection.connect(function(err) {
  if (err) {
    console.error('error connecting: ' + err.stack);
    return;
  }
  console.log('connected as id ' + connection.threadId);
});
Now, what I did was export the connection object and share it with the other API resources.
In the db.js file:
var mysql = require('mysql');
var connection = mysql.createConnection({ ... });
module.exports = connection;
In the api.js file:
const express = require('express');
const router = express.Router();
const conn = require('./db');

router.post('/create', function (req, res) {
  conn.query('INSERT ...', function (error, results, fields) {
    if (error) throw error;
    // connected!
  });
});

router.post('/update', function (req, res) {
  conn.query('UPDATE SET ...', function (error, results, fields) {
    if (error) throw error;
    // connected!
  });
});
And so on; the same goes for the other API resources that were omitted from these examples.
What are the drawbacks of this connection design?
Please correct me if I'm wrong, but I think that even though you export the connection object, you are still managing the connection yourself, one request at a time.
Opening and maintaining a database connection for each user, especially requests made to a dynamic database-driven website application, is costly and wastes resources
Wikipedia
And from the same docs:
This is because two calls to pool.query() may use two different connections and run in parallel
So the best way is to use a pool to manage the connections.
Connections are lazily created by the pool. If you configure the pool to allow up to 100 connections, but only ever use 5 simultaneously, only 5 connections will be made. Connections are also cycled round-robin style, with connections being taken from the top of the pool and returning to the bottom.
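As a minimal sketch of that pool-based layout, following the same db.js/api.js split as above (the database name and connection limit are placeholders, not values from the original post):

// db.js -- export a pool instead of a single connection
var mysql = require('mysql');

var pool = mysql.createPool({
  connectionLimit : 10,        // placeholder limit
  host            : 'example.org',
  user            : 'bob',
  password        : 'secret',
  database        : 'mydb'     // placeholder database name
});

module.exports = pool;

// api.js -- pool.query acquires and releases a connection for you
const express = require('express');
const router = express.Router();
const pool = require('./db');

router.post('/create', function (req, res) {
  pool.query('INSERT ...', function (error, results, fields) {
    if (error) throw error;
    res.json(results);
  });
});

module.exports = router;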
Hope that helps.
I'm creating a programmer job board app and I'm trying to display JSON data on my main page. At some point I'll render it, but for now I'm just trying to get it to show up as JSON so that I know it works.
I'm able to connect to the server, but when I load the page I get a TypeError (Job.showAllJobs is not a function).
I'm using a CRUD app I made the other week as a reference, but there are a few differences between it and this project that are throwing me off.
Here's my project's file structure:
job-board
  database
    connection.js
    schema.sql
  models
    Job.js
    User.js
  views
    index.ejs
    login.ejs
  server.js
Unlike my previous CRUD app, this project uses a connection.js file that gave me some trouble earlier. At first I thought I was out of the woods, but I think it might be responsible for my current problem.
Not getting GET to work might seem like a minor error, but it's really bugging me and I haven't been able to keep working because of it.
I populated my table (jobs) with a sample listing as a test, but in the near future I plan on connecting the app to the GitHub Jobs API.
server.js:
const express = require('express');
const app = express();
const PORT = 3000;
const bodyParser = require('body-parser');
const methodOverride = require('method-override');
const Job = require('./models/Job');
const User = require('./models/User');
const connection = require('./database/connection')

app.use(bodyParser.json())
app.use(methodOverride('_method'));
const urlencodedParser = bodyParser.urlencoded({ extended: false })
app.set("view engine", "ejs");

///// GET /////

// GET INDEX
app.get('/', (request, response) => {
  Job.showAllJobs().then(everyJob => {
    response.json('index');
    // response.render('index', { jobs: everyJob });
  });
});
Job.js
const Job = {};
const db = require('../database/connection');

///// JOBS /////

/// INDEX ///
Job.showAllJobs = () => {
  return db.any('SELECT * FROM jobs');
};

module.exports = Job;
module.exports = db;
connection.js
// require database setup to use pg-Promise
const pgp = require('pg-promise')({});

// connection url
const connectionURL = "postgres://localhost:5432/job_board";

// new database connection
const db = pgp(connectionURL);

// module.exports = db;
You have a few problems here.
Make sure you're passing the jobs into res.json instead of the string 'index'.
Make sure you're exporting db from connection.js.
You're exporting both Job and db from Job.js. Since you export db second, it overrides the export of Job.
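A minimal sketch of the three fixes together (shown with res.json for now, matching the goal of just displaying the JSON; an illustration rather than the poster's final code):

// connection.js -- actually export db
const pgp = require('pg-promise')({});
const connectionURL = "postgres://localhost:5432/job_board";
const db = pgp(connectionURL);

module.exports = db;

// Job.js -- export only Job so its methods stay visible
const db = require('../database/connection');
const Job = {};

Job.showAllJobs = () => {
  return db.any('SELECT * FROM jobs');
};

module.exports = Job;

// server.js (route only) -- pass the rows, not the string 'index'
app.get('/', (request, response) => {
  Job.showAllJobs().then(everyJob => {
    response.json(everyJob);
  });
});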