I am creating a dashboard that displays various data in different charts (e.g. horizontal bars, pie charts, etc.).
I am using Node.js as the backend, MySQL as the database, and EJS to render the data to an HTML page.
I have the MySQL queries ready for the different data. The problem I am having is how to encapsulate these MySQL queries inside the routing function and pass the results to EJS.
Here is the sample code
router.get('/participant-module', (req, res) => {
  let sql1 = dbModel.participant_per_module; // MySQL query string
  let sql2 = dbModel.error_per_module; // MySQL query string
  let count_emp = [];
  let count_error = [];
  db.query(sql1, (err, result) => {
    if (err) throw err;
    count_emp.push(result);
  });
  db.query(sql2, (err, result) => {
    if (err) throw err;
    count_error.push(result);
  });
  res.render('dashboard', {emp_count: count_emp, error_count: count_error}); // pass arrays to ejs
});
count_emp and count_error are displayed in different charts. But here count_emp and count_error are always empty.
I have searched for similar problems in the forum, and the cause seems to be that db.query is asynchronous, so it won't wait; thus when res.render sends the data, emp_count and error_count are still empty.
So does anyone have a workaround for this?
Thank you in advance
D
Your db.query calls and res.render are running asynchronously without waiting for the query results. What is happening is that your first query is fired, then immediately your second query is fired, then immediately your router returns the response. You are not waiting for the queries to retrieve their results. That's why count_emp and count_error are [] every time the response is rendered.
Try this:
router.get('/participant-module', (req, res) => {
  let sql1 = dbModel.participant_per_module; // MySQL query string
  let sql2 = dbModel.error_per_module; // MySQL query string
  let count_emp = [];
  let count_error = [];
  db.query(sql1, (err, result) => {
    if (err) throw err;
    count_emp.push(result);
    // run the second query only after the first has returned
    db.query(sql2, (err, result) => {
      if (err) throw err;
      count_error.push(result);
      // both results are available here, so render inside the inner callback
      res.render('dashboard', {emp_count: count_emp, error_count: count_error}); // pass arrays to ejs
    });
  });
});
PS: You should try to avoid over-nesting callbacks. It is commonly referred to as callback hell. Some common ways to avoid it are: 1. using promises 2. using named functions.
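For illustration, here is a minimal sketch of the promise approach using util.promisify, assuming db is a connection from the mysql package and the query strings are unchanged (the local name query is just for this example):

const util = require('util');
const query = util.promisify(db.query).bind(db); // promisified db.query

router.get('/participant-module', async (req, res, next) => {
  try {
    const sql1 = dbModel.participant_per_module; // MySQL query string
    const sql2 = dbModel.error_per_module;       // MySQL query string

    // run both queries in parallel and wait for both results
    const [count_emp, count_error] = await Promise.all([query(sql1), query(sql2)]);

    // both results are available here, so the render gets real data
    res.render('dashboard', { emp_count: count_emp, error_count: count_error });
  } catch (err) {
    next(err); // let Express handle query errors
  }
});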
Related
First of all: I am new to Node and also pretty new to JS, and asynchronous behavior seems to be something my brain refuses to understand.
I have the following issue:
I have a database table containing, amongst other information, the field "re_nummer", which stands for invoice number.
Each time my app creates an invoice, I want to read the current re_nummer from the database, increase it by 1, and then write it back to the database.
I assume this is asynchronous behaviour, and I have now spent a full day with Google and here trying to solve the issue, but it seems the brain fog is too thick.
Here is my code:
const mysql2 = require("mysql2");
const db = mysql2.createConnection({
host: "localhost",
user: "root",
database: "mit_absicht_gluecklich",
});
db.connect((err) => {
if (err) throw err;
});
/**
* First database query: get the most recent invoice number (re_nummer)
*/
let sql = "SELECT re_nummer From admin WHERE id = 1;";
let re_nummer;
db.query(sql, (err, result) => {
if (err) throw err;
re_nummer = result[0].re_nummer; // assign to the outer variable, don't re-declare it
console.log("Rechungsnummer: ", re_nummer);
});
/**
* Second database query: update database with re_nummer_neu = re_nummer + 1
*/
let re_nummer_neu = re_nummer + 1;
sql = `UPDATE admin SET re_nummer = ${re_nummer_neu} WHERE id = 1;`;
db.query(sql, (err, result) => {
if (err) throw err;
console.log(re_nummer_neu);
});
module.exports = re_nummer;
This code doesn't work because it starts the second database request before the first one has finished.
How can I solve this issue?
Also important to mention: this is a Node module. I need the re_nummer to be available outside the module. This is why I use module.exports = re_nummer;
In case I am asking a question that has already been asked multiple times, I apologize in advance. I have honestly done my best to understand other threads with similar content. 100% fail!
Many thanks for some help and/or comments!
Klaus
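For what it's worth, a rough sketch of one way to sequence the two queries with mysql2's promise API and to export a function instead of a value (the function name nextReNummer is made up for illustration):

const mysql2 = require("mysql2/promise");

async function nextReNummer() {
  const db = await mysql2.createConnection({
    host: "localhost",
    user: "root",
    database: "mit_absicht_gluecklich",
  });

  // first query: read the current invoice number
  const [rows] = await db.query("SELECT re_nummer FROM admin WHERE id = 1;");
  const re_nummer_neu = rows[0].re_nummer + 1;

  // second query: only runs after the first has finished
  await db.query("UPDATE admin SET re_nummer = ? WHERE id = 1;", [re_nummer_neu]);

  await db.end();
  return re_nummer_neu;
}

// export the function; callers do: const nr = await nextReNummer();
module.exports = nextReNummer;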
I'm currently making a Discord bot with discord.js v13. Right now I have a database where the guild.id and the standard prefix of a server are stored. Now I want to rewrite a command which gets triggered by the prefix and the name of the command, like this: '!somecommand'. Currently my prefix is defined in the file as a const variable, but I want the bot to check the database for the prefix the server has and use that one instead. I'm checking the prefix in the database with this part of the code:
pool.getConnection(function(err, connection) {
if (err) throw err;
let sql = `SELECT * FROM custom_prefix WHERE guild_id = '${message.guild.id}'`;
connection.query(sql, async function (err, result) {
if (err) throw err;
console.log(result[0].prefix)
connection.release();
});
});
The output is the current prefix of the server where the command was triggered, so far everything works fine.
But as I said I want the output of the code above to be the prefix with which the bot gets triggered.
I already tried to do it, but I'm always making a mistake.
Most of the time the bot checks the database too slowly and the result will be 'undefined'.
I don't know how to make the bot wait for the result from the database, check whether this result is really the prefix, and then execute the command.
I am happy about any answer :-)
If you need any more information please let me know.
I'm guessing you put the code that uses the result outside of the callback.
pool.getConnection(function(err, connection) {
if (err) throw err;
let sql = `SELECT * FROM custom_prefix WHERE guild_id = '${message.guild.id}'`;
connection.query(sql, async function (err, result) {
if (err) throw err;
console.log(result[0].prefix)
connection.release();
//
// Put your code that uses result[0].prefix here
//
});
});
//
// Do not put your code that uses result[0].prefix here
//
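As an alternative, here is a hedged sketch of wrapping the lookup in a Promise so an async message handler can await the prefix before dispatching the command (the helper name getPrefix is made up for illustration):

function getPrefix(guildId) {
  return new Promise((resolve, reject) => {
    pool.getConnection(function (err, connection) {
      if (err) return reject(err);
      const sql = 'SELECT * FROM custom_prefix WHERE guild_id = ?';
      connection.query(sql, [guildId], function (err, result) {
        connection.release();
        if (err) return reject(err);
        resolve(result[0] ? result[0].prefix : null); // null if no row was found
      });
    });
  });
}

// usage inside an async message handler (illustrative):
// const prefix = await getPrefix(message.guild.id);
// if (prefix && message.content.startsWith(prefix + 'somecommand')) { ... }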
I am trying to extract data from the following MySQL query and store it in a variable so I can use it, but when I console.log it to see if the data was successfully pulled, I see a bunch of irrelevant stuff and not the data. Any help will be greatly appreciated.
const body = req.body
const varQuery = await pool.query('SELECT id, name, email FROM registration WHERE email = ?', [body.email])
console.log(varQuery)
If you are using mysql, you have to pass a callback as an argument instead of using the async/await syntax. It would look something like this:
pool.query('SELECT 1 + 1 AS solution', function (error, results, fields) {
if (error) throw error;
console.log('The solution is: ', results[0].solution);
});
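As a small sketch of an alternative: if you switch to the mysql2 package, its promise wrapper supports the await style from the question inside an async function (the pool config values here are placeholders):

const mysql = require('mysql2/promise');
const pool = mysql.createPool({ host: 'localhost', user: 'root', database: 'mydb' });

async function findRegistration(email) {
  // pool.query resolves to [rows, fields]; destructure to get just the rows
  const [rows] = await pool.query(
    'SELECT id, name, email FROM registration WHERE email = ?',
    [email]
  );
  return rows; // the matching rows, without the driver metadata
}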
I'm writing a Node app to log some information in a Mongo database.
Below is the code snippet that is called each time I need to store a log in the Mongo database.
const mongo = {}
const mongo_cli = require('mongodb').MongoClient
module.exports = {
log (l) {
mongo_cli.connect(the_mongo_url, (error, client) => {
if (error) throw error;
mongo.cli = client;
mongo.db = client.db(the_database);
//insert and update operations
});
}
}
The code above works for now. I mean, I can insert logs and update logs already inserted, but at the price of one (or more) connections that I never close, due to my lack of control over callback functions.
So, how can I structure it better so that I have only one mongo_cli connection and don't consume too many resources?
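For what it's worth, a rough sketch of one way to connect once and reuse the client, keeping the same the_mongo_url and the_database variables as above (MongoClient.connect returns a promise when no callback is passed):

const { MongoClient } = require('mongodb');

let dbPromise = null;

function getDb() {
  if (!dbPromise) {
    // connect only once; every later call reuses the same pending/ready promise
    dbPromise = MongoClient.connect(the_mongo_url).then((client) => client.db(the_database));
  }
  return dbPromise;
}

module.exports = {
  async log (l) {
    const db = await getDb();
    // insert and update operations, e.g.:
    // await db.collection('logs').insertOne(l);
  }
};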
I'm trying to import a table from MySQL to MongoDB directly, without any schema changes.
I wrote a small Node script for that, and my issue is with the way I implemented it.
Maybe I hit some limit on MongoDB inserts by calling insert inside a loop.
I think this problem would not have come up if it were the other way around (maybe not!).
So here's the thing.
The MySQL table has more than 100,000 rows, but when the loop hits more than around 30,000, the number of inserted items just gets reduced.
So let's say there were 100,000 items in the MySQL table; after a complete import using the script below, I get only a maximum of 37,000 or so.
My strong suspicion is either the node script / node MongoDB connector, some bug in the script, or lastly a limit on concurrent MongoDB inserts.
I'm pasting the script below.
Hoping I can get a way around it.
Thanks,
var http = require('http'),
mysql = require('mysql'),
mongo = require('mongodb').MongoClient,
format = require('util').format;
var connection = mysql.createConnection({
user: "xxx",
password: "xxx",
database: "mydb"
});
connection.connect();
var query = "select * from mytable";
var mysqlrows = '';
connection.query(query, function(err,rows,fields){
if(err) throw err;
console.log(rows.length+'rows found.');
mongo.connect('mongodb://root:root@127.0.0.1:27017/mydb', function(err, db){
if (err)
throw err;
var collection = db.collection('mytable');
for(var i=0; i<rows.length;i++)
{
//console.log(JSON.stringify(rows[i]));
(function(i){
collection.insert(rows[i],function(err,docs){});
console.log(i);
})(i);
}
db.close();
});
});
connection.end();
The problem is that you're not waiting for the insert operations to complete before closing your connection to MongoDB via the db.close(); call. You need to keep track of your outstanding asynchronous requests and only call db.close(); when they have all completed.
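For example, a counting version of the loop (a sketch reusing the variable names from the script above):

var pending = rows.length;

for (var i = 0; i < rows.length; i++) {
  collection.insert(rows[i], function (err, docs) {
    if (err) console.error(err);
    pending--;
    if (pending === 0) {
      db.close();       // safe now: all inserts have completed
      connection.end(); // close MySQL only after the import is done
    }
  });
}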
To make sure that you are getting all the data from MySQL, try to access the last row. If you can get it, use MongoDB's w and j write-concern options to make sure that each call has written the data before moving on to the next. Along with the w and j options, you should consider batching the inserts by inserting multiple rows per call using an array.
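A sketch of that batching idea, assuming a driver version that provides insertMany and reusing the variable names from the script above (the batch size is arbitrary):

var BATCH = 1000;
var batches = [];
for (var i = 0; i < rows.length; i += BATCH) {
  batches.push(rows.slice(i, i + BATCH));
}

// insert one batch after another, acknowledged (w: 1) and journaled (j: true)
batches.reduce(function (chain, batch) {
  return chain.then(function () {
    return collection.insertMany(batch, { w: 1, j: true });
  });
}, Promise.resolve()).then(function () {
  db.close();
  connection.end();
}).catch(function (err) {
  console.error(err);
});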