Express: MySQL query doesn't push code to a list - javascript

I have been trying to push the results of a MySQL query to a list (shoppingList), but the shoppingList is always empty, even though I'm getting 2 results back.
I think the problem is somewhere with handling the promise, but I haven't been able to figure it out:
static async showShoppingList(userId) {
  let shoppingList = [];
  const sql =
    "SELECT item_name, family_member_name FROM shopping_list WHERE user_id = (?)";
  await db.promise().query(sql, [userId], function (err, results) {
    for (const result of results) {
      shoppingList.push(result)
      console.log(shoppingList) // Here it appears with results
    }
  });
  console.log(shoppingList) // Here it appears empty
  return shoppingList;
}

That's because the callback function (err, results) is executed asynchronously, after the database has returned the results, whereas the surrounding statements are executed synchronously:
1. db.promise().query sends the query to the database.
2. The empty shoppingList is written to the console and returned.
3. After the database has returned the results, the callback function is executed, writing the full shoppingList to the console.
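The fix is to drop the callback and await the promise's resolved value instead (note that mysql2's promise API does not take a callback; it resolves with a [rows, fields] pair). Here is a minimal self-contained sketch of the timing, with a hypothetical fakeQuery standing in for db.promise().query:

```javascript
// fakeQuery is a hypothetical stand-in for db.promise().query; like
// mysql2's promise API, it resolves with a [rows, fields] pair.
function fakeQuery(sql, params) {
  return Promise.resolve([[{ item_name: 'milk' }, { item_name: 'eggs' }], []]);
}

async function showShoppingList(userId) {
  const sql = 'SELECT item_name, family_member_name FROM shopping_list WHERE user_id = (?)';
  const [rows] = await fakeQuery(sql, [userId]); // resolves before the next line runs
  return rows; // already a plain array of rows; no manual push loop needed
}

showShoppingList(1).then((list) => console.log(list.length)); // prints 2
```

Because await suspends the function until the query resolves, the rows are guaranteed to be available on the next line, unlike the callback version above.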

Related

Creating a query that relies on results of previous query using node-postgres

The first query to the postgres database is a SELECT query that gets all shift/break data for the current day, which is used to determine the type of scan.
The second query is an INSERT query that depends on the results of the first query.
My handler function looks like this right now:
const scanEvent = (request, response) => {
  employee_id = request.body.employee_id;
  var shifts;
  Promise.all([
    pool.query('Select * FROM shifts WHERE employee_id=$1 AND date=CURRENT_DATE', [employee_id])
  ]).then(function([queryResults]) {
    shifts = queryResults.rows;
  }).catch(function(e) {
    response.status(500).send('Error retrieving data');
  })
  // based on the results of the query there will be a bunch of different cases for
  // INSERT queries to put in the proper data to the database
  if(shifts.length == 0) {
    Promise.all([
      pool.query('INSERT INTO shifts (employee_id, start_time) VALUES ($1, NOW())', [employee_id])
    ]).then(function() {
      response.status(204).send('Successfully Inserted');
    }).catch(function (e) {
      response.status(500).send("Error");
    });
  } // else if ... handle all other cases
}
My issue is that I cannot access the results of the first query as it seems that the shifts variable is local in scope to the first Promise.all
** EDIT **
I have now realized my approach was not optimal (was just learning node-postgres) A better way to solve this problem is to use async / await:
const scanEvent = async (request, response) => {
  employee_id = request.body.employee_id;
  var shifts;
  const getShifts = await pool.query('Select * FROM shifts WHERE employee_id=$1 AND date=CURRENT_DATE', [employee_id]);
  shifts = getShifts.rows;
  // based on the results of the query there will be a bunch of different cases for
  // INSERT queries to put in the proper data to the database
  if(shifts.length == 0) {
    await pool.query('INSERT INTO shifts (employee_id, start_time) VALUES ($1, NOW())', [employee_id]);
  } // else if ... handle all other cases
}
The variable shifts will not yet have a value when the if statement is executed, because it receives its value only in the .then function. Therefore, if the second half of your code relies on the value of shifts, move it into the .then function:
.then(function([queryResults]) {
  shifts = queryResults.rows;
  if(/* first scan therefore scanning in for shift */) {
    ...
  } // else if ... handle all other cases
})
(If you want two independent queries executed in parallel, see here.)
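For the parallel case, Promise.all is the right tool: issue both queries at once and destructure both results. A sketch under the assumption that fakeQuery behaves like pool.query and resolves with a { rows } object:

```javascript
// fakeQuery is a hypothetical stand-in for pool.query, resolving with
// a { rows } object the way node-postgres does.
const fakeQuery = async (sql, params) => ({ rows: [{ sql }] });

async function loadShiftsAndBreaks(employee_id) {
  // Both queries are issued immediately; await resolves when both finish.
  const [shiftsResult, breaksResult] = await Promise.all([
    fakeQuery('SELECT * FROM shifts WHERE employee_id=$1', [employee_id]),
    fakeQuery('SELECT * FROM breaks WHERE employee_id=$1', [employee_id]),
  ]);
  return { shifts: shiftsResult.rows, breaks: breaksResult.rows };
}
```

Unlike awaiting each query in sequence, this overlaps the two round trips to the database.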

NodeJS - code inside an async function executes out of order

I have a weird problem. When calling an API, I want to make a request to the PostgreSQL db with a bulk insert. First, I execute the loop which reads a file and extracts data from it to form the values array. Then, I want to generate a bulk insert request with the pg-promise library. But when the code executes, it tries to generate the request before the loop even starts, and it throws an error about an empty array. What the?
async import(req, res) {
  var rd = readline.createInterface({
    input: fs.createReadStream('.../file.csv'),
    console: false
  });
  const createQuery = `INSERT INTO
    table('columns')
    VALUES ?`;
  const values = [];
  rd.on('line', function(line) {
    //stuff
    values.push({
      //stuff
    });
  });
  const cs = new pgp.helpers.ColumnSet(['columns'], {table: 'table'});
  const query = pgp.helpers.insert(values, cs);
I've removed the details, but I hope this provides enough info. I've tried to put console logs before, in, and after the loop: first, the stuff before and after the loop gets logged, then the error is thrown, and only then does the loop execute and log. Do I not understand or am I missing something?
Just making the function async doesn't accomplish anything for you here by itself. You have to find out what asynchronous operations you can await in order to serialize your asynchronous operations.
In the latest versions of node.js, you can use the for await () structure with the asynchronous iterator in the readline interface to process all the lines.
async import(req, res) {
  var rd = readline.createInterface({
    input: fs.createReadStream('.../file.csv'),
    console: false
  });
  const createQuery = `INSERT INTO
    table('columns')
    VALUES ?`;
  const values = [];
  for await (const line of rd) {
    values.push({...})
  }
  const cs = new pgp.helpers.ColumnSet(['columns'], {table: 'table'});
  const query = pgp.helpers.insert(values, cs);
}
FYI, you can see an example of this in the readline doc.
You also don't need to use async at all as you can solve it by just putting your last two lines into an event listener for the close event:
import(req, res) {
  var rd = readline.createInterface({
    input: fs.createReadStream('.../file.csv'),
    console: false
  });
  const createQuery = `INSERT INTO
    table('columns')
    VALUES ?`;
  const values = [];
  rd.on('line', function(line) {
    //stuff
    values.push({
      //stuff
    });
  }).on('close', () => {
    // done with all line processing here
    const cs = new pgp.helpers.ColumnSet(['columns'], {table: 'table'});
    const query = pgp.helpers.insert(values, cs);
  });
}
When you call rd.on() you are just establishing an event listener with a callback to be called when the line event occurs. Your code establishes that callback but then proceeds with the rest of the function, which tries to insert the values into the database before any line events have fired. You need to move the code that inserts into the database inside the callback of your rd.on(), after you have looped through all your values and pushed them into the array.
However, I'm not familiar with what the line event is in the case of a file. If it truly fires line by line for the file, then you obviously can't bulk insert there. My suggestion at that point would be to move that step into its own async function and await the result of that function before doing the insert.

NodeJS Returning 'undefined' In MySQL Query Function

I have a function that queries SQL to get a string called Prefix.
function getPrefix(Guild) {
  let query = "SELECT Prefix FROM Guilds WHERE GuildId=?";
  Connection.query(query, [Guild.id], (err, result) => {
    if (err) throw err;
    return result[0].GuildPrefix;
  });
};
Whenever I print the Prefix out (console.log(result[0].Prefix);), it logs it fine - however, whenever I return it and then attempt to call the function, it always returns undefined.
I am using Node JS Version 10.15.1 & I am using MariaDB Version 10.1.37 on the Raspbian stretch of Debian. Please comment if I have left any other information out. Thanks.
In Node.js the MySQL functions are asynchronous: Connection.query returns immediately, and the return inside its callback never becomes the return value of getPrefix, so getPrefix itself always returns undefined.
So the solution is to use a callback function.
Eg.
function getPrefix(Guild, callback) {
  let query = "SELECT Prefix FROM Guilds WHERE GuildId=?";
  Connection.query(query, [Guild.id], (err, result) => {
    if (err) {
      return callback(JSON.stringify(err)); // return so we don't call callback twice
    }
    callback(JSON.stringify(result));
  });
};
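An alternative sketch is to wrap the query in a Promise so callers can await the prefix; Connection here is a hypothetical stub that answers the way a real mysql connection's callback would:

```javascript
// Hypothetical stub standing in for a real mysql connection.
const Connection = {
  query(sql, params, cb) {
    cb(null, [{ Prefix: '!' }]); // pretend the DB returned one row
  },
};

function getPrefix(Guild) {
  return new Promise((resolve, reject) => {
    const query = 'SELECT Prefix FROM Guilds WHERE GuildId=?';
    Connection.query(query, [Guild.id], (err, result) => {
      if (err) return reject(err);
      resolve(result[0].Prefix);
    });
  });
}

getPrefix({ id: 42 }).then((prefix) => console.log(prefix)); // prints !
```

Callers can then use await getPrefix(Guild) inside any async function instead of threading callbacks through.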

javascript- pushing resultset from database to array failed

I'm running a query to fetch the list of new users. Query is correct. It returns 15 users. I push the resultset into a javascript array but only the last record from the resultset is getting saved.
Here's my code:
var query = `SELECT *
             FROM users
             WHERE (status ='New')`;
var query = connection.query(query),
    response = []; // this array will contain the result of our db query
query
  .on('error', function (err) {
    console.log(err);
  })
  .on('result', function (res) {
    // it fills our array looping on each user row inside the db
    response.push(res);
    /*
    for (var key in res) {
      if (res.hasOwnProperty(key)) response.push(res[key]);
    }
    */
  })
  .on('end', function () {
    console.log('console')
  });
As you can see response.push(res); is the line of code where I do this. Below that I have comment a few lines. I tried that option to push each row from the resultset but it ain't giving any results.
try a for loop:
for (var i in res) {
  response.push(res[i]);
}
I may be misreading your test, but you may be checking the result in the wrong place.
You should do it in the 'end' callback, where the response array is complete:
.on('end', function () {
  console.dir(response)
});

Node.js & Node-Postgres: Putting Queries into Models

I would like to 'functionalize' my queries by putting them into functions which have apt names for the task.
I want to avoid putting everything in the req, res functions (my controllers), and instead put them in 'models' of sorts, that is, another JavaScript file that will be imported and used to run the functions that execute queries and return the results on behalf of the controller.
Assuming that I have the following setup for the queries:
UserController.js
exports.userAccount = function(req, res, next) {
  var queryText = "\
    SELECT *\
    FROM users\
    WHERE id = $1\
  ";
  var queryValues = [168];
  pg.connect(secrets.DATABASE_URL, function(err, client, done) {
    client.query(queryText, queryValues, function(err, result) {
      res.render('pathToSome/page', {
        queryResult: result.rows
      });
    });
  });
}
Here, while I'm in the query, I essentially redirect and render a page with the data. That works fine. But I want to take out all that pg.connect and client.query code and move it to a separate file to be imported as a model. I've come up with the following:
UserModel.js
exports.findUser = function(id) {
  // The user to be returned from the query
  // Local scope to 'findUser' function?
  var user = {};
  var queryText = "\
    SELECT *\
    FROM users\
    WHERE id = $1\
  ";
  var queryValues = [id];
  pg.connect(secrets.DATABASE_URL, function(err, client, done) {
    client.query(queryText, queryValues, function(err, result) {
      // There is only ever 1 row returned, so get the first one in the array
      // Apparently this is local scope to 'client.query'?
      // I want this to overwrite the user variable declared at the top of the function
      user = result.rows;
      // Console output correct; I have my one user
      console.log("User data: " + JSON.stringify(user));
    });
  });
  // I expect this to be the problem. User is empty, because it was not really
  // assigned in the user = result.rows call above.
  console.log("User outside of 'pg.connect': " + JSON.stringify(user));
  // I would like to return the user here, but it's empty!
  return user;
};
and I'm calling my model function as so:
var user = UserModel.findUser(req.user.id);
The query executes perfectly fine in this fashion - except that the user object is not being assigned correctly (I'm assuming a scope issue), and I can't figure it out.
The goal is to be able to call a function (like the one above) from the controller, have the model execute the query and return the result to the controller.
Am I missing something blatantly obvious here?
pg.connect is an asynchronous call. Instead of waiting for data to return from the database before proceeding with the next line, the program goes ahead with the rest of the function before Postgres answers. So in the code above, findUser returns a variable that has not yet been populated.
In order to make it work correctly, you have to add a callback to the findUser function. (I told you wrong in a previous edit: The done parameter in pg.connect is called in order to release the connection back to the connection pool.) The final result should look something like this:
exports.findUser = function(id, callback) {
  var user = {};
  var queryText = "SELECT * FROM users WHERE id = $1";
  var queryValues = [id];
  pg.connect(secrets.DATABASE_URL, function(err, client, done) {
    client.query(queryText, queryValues, function(err, result) {
      user = result.rows;
      done(); // Releases the connection back to the connection pool
      callback(err, user);
    });
  });
};
And you'd use it, not like this:
var user = myModule.findUser(id);
But like this:
myModule.findUser(id, function(err, user) {
  // do something with the user.
});
If you have several steps to perform, each of them dependent on data from a previous asynchronous call, you'll wind up with confusing, Inception-style nested callbacks. Several asynchronous libraries exist to help you with making such code more readable, but the most popular is npm's async module.
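With today's node-postgres, pool.query already returns a promise, so the model can simply return it and the controller can await it, avoiding both the callback parameter and the nesting. A sketch with a hypothetical stub pool standing in for the real one:

```javascript
// Hypothetical stub standing in for a node-postgres pool; pool.query
// resolves with a { rows } object, as the real library does.
const pool = {
  query: async (text, values) => ({ rows: [{ id: values[0], name: 'Ada' }] }),
};

// Model: resolves with the single user row.
async function findUser(id) {
  const result = await pool.query('SELECT * FROM users WHERE id = $1', [id]);
  return result.rows[0];
}

// Controller: awaits the model instead of expecting a synchronous return.
async function userAccount(req, res) {
  const user = await findUser(req.user.id);
  res.render('pathToSome/page', { queryResult: user });
}
```

The model stays free of req/res concerns, and each dependent step becomes one await line rather than another nesting level.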
