node.js global variable problem - javascript

I don't know JavaScript very well, and I can't get a global variable to work:
var data = 'empty';
connection.query('SELECT * FROM deneme', function(err, rows, fields) {
  if (err) throw err;
  data = rows;
});
console.log(data);
Normally, the console should print the rows' data, but it prints 'empty'. How can I get the rows out of the callback? How can I define a global variable?

The reason it is not working is that your console.log is outside the asynchronous code block. Basically, what is happening is this:
1. data is set to 'empty';
2. connection issues a database request;
3. console.log fires (data is still 'empty' at that point);
4. the database response is received;
5. the callback fires;
6. data is set to rows.
So in order to get what you want, simply put the console.log statement inside the asynchronous code block:
var data = 'empty';
connection.query('SELECT * FROM deneme', function(err, rows, fields) {
  if (err) throw err;
  data = rows;
  console.log(data);
});
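If you really need the rows outside of the callback, another option is to wrap the query in a Promise and await it. The following is only a rough sketch, assuming a callback-style connection.query with the same (err, rows, fields) signature as above:
// Minimal sketch: wrap the callback-style query in a Promise.
function queryRows(sql) {
  return new Promise(function (resolve, reject) {
    connection.query(sql, function (err, rows, fields) {
      if (err) return reject(err);
      resolve(rows);
    });
  });
}

async function main() {
  var data = await queryRows('SELECT * FROM deneme');
  console.log(data); // rows are available here, after the query has completed
}

main().catch(console.error);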

Related

Why is my ipcMain returning an empty array after reading through a database?

For my project, I am using Electron and React with ipcRenderer and ipcMain to communicate with a database. I can see it go through the database, but it returns an empty array. It is as if the array is returned before anything is read from the db.
Here is the code I am using in my ipcMain handler:
I am expecting it to return the names of the categories, but cateNames comes back blank.
ipcMain.on(channels.GET_CATS, async (event, type) => {
  console.log("made it")
  let cateNames = [];
  db.each(`SELECT name FROM categories WHERE type=?`, [type], (err, row) => {
    if (err) {
      return console.error(err.message);
    }
    console.log(row.name);
    cateNames.push(row.name);
  });
  console.log(cateNames);
  event.sender.send(channels.GET_LOGIN, cateNames);
});
I send the request with
ipcRenderer.send(channels.GET_CATS,"Donation");
with an ipcRenderer.on listener that outputs the array to the console.
JavaScript / Node.js executes commands in sequential order and does not "wait" when moving from one command to the next unless explicitly told to via the use of promises and async / await.
The reason your current code returns an empty cateNames array is that execution of the code does not "wait" for the db.each command to fire its callback. The callback is only invoked once the DB has something to return, which will either be a row or an error. This takes time. In the meantime, execution has moved on to the next command.
To make this block of code "wait" until the DB has returned all available rows (if any), we could use promises (a sketch of that approach follows the code below).
Instead, I propose a simpler method: rather than pushing row.name on every db.each iteration, just use db.all and craft the response afterwards.
ipcMain.on(channels.GET_CATS, async (event, type) => {
  console.log("made it")
  let cateNames = [];
  db.all(`SELECT name FROM categories WHERE type=?`, [type], (err, rows) => {
    if (err) {
      return console.error(err.message);
    }
    for (let row of rows) {
      cateNames.push(row.name);
    }
    console.log(cateNames);
    // Use event.reply(channel, data);
    event.reply(channels.GET_LOGIN, cateNames);
  });
});
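For completeness, the promise-based approach mentioned above could look roughly like this. It is only a sketch: it assumes the same callback-style db.all shown above and wraps it in a Promise by hand (the allRows helper is hypothetical, not part of any library):
// Hypothetical helper: wrap the callback-style db.all in a Promise
// so the handler can await the rows before replying.
function allRows(sql, params) {
  return new Promise((resolve, reject) => {
    db.all(sql, params, (err, rows) => {
      if (err) return reject(err);
      resolve(rows);
    });
  });
}

ipcMain.on(channels.GET_CATS, async (event, type) => {
  try {
    const rows = await allRows(`SELECT name FROM categories WHERE type=?`, [type]);
    const cateNames = rows.map((row) => row.name);
    event.reply(channels.GET_LOGIN, cateNames);
  } catch (err) {
    console.error(err.message);
  }
});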

Update multiple rows and columns using respective ids in mssql

I extracted data from the database in Excel format using an SQL join query. I modified a few columns' data in that Excel sheet. I want to update the modified data back into the database using the respective row ids.
When I iterate, "stack: Error: aborted" is thrown.
Below is my code:
sql.connect(db, async (err) => {
  if (err)
    console.log(err);
  var request = new sql.Request();
  var workbook = XLSX.readFile('2021-08-30-202129.xlsx');
  var sheet_name_list = workbook.SheetNames;
  var xlData = XLSX.utils.sheet_to_json(workbook.Sheets[sheet_name_list[1]]);
  console.log(xlData);
  xlData.forEach(async (item) => {
    let returnquery = await ` UPDATE [billing] SET notes='${item.notes}' WHERE id='${item.id}'`
    request.query(returnquery, function (err, result) {
      if (err) {
        console.log(err)
      }
      else {
        console.log('result', result);
      }
      sql.close();
    });
  })
});
I typically use the workbench app and open a query window to ensure I have the correct SQL syntax. (console.log your SQL statement right before you call the DB, then copy it over to the workbench.)
I usually have some small typo issue, and this allows me to easily figure that out, then go back to the code and fix the error.
I also think you have issues with the ' characters around your variables. Those should be removed since you're already using the `` template-literal style. SQL will misinterpret those.
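A rough sketch of the log-and-check idea follows. It keeps the query string from the question as-is, assumes the mssql library's promise-returning request.query, and runs the updates one at a time by awaiting each query in a for...of loop instead of firing them all from forEach, closing the pool once at the end:
// Sketch only: log each statement, await each update, close the pool once.
sql.connect(db, async (err) => {
  if (err) return console.log(err);
  const request = new sql.Request();
  const workbook = XLSX.readFile('2021-08-30-202129.xlsx');
  const sheetName = workbook.SheetNames[1];
  const xlData = XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]);

  for (const item of xlData) {
    const updateQuery = `UPDATE [billing] SET notes='${item.notes}' WHERE id='${item.id}'`;
    console.log(updateQuery); // copy this into the workbench to check the syntax
    try {
      const result = await request.query(updateQuery);
      console.log('result', result);
    } catch (queryErr) {
      console.log(queryErr);
    }
  }
  sql.close(); // close once, after all updates have finished
});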

Node/JSON Issues

So I am trying to figure out why my returned object is just [] every single time.
Here is my code:
var returnObject = [];
db.query(queryString, function(err, rows, fields) {
  if (err) throw err;
  for (var i in rows) {
    console.log('Data: ', rows[i]);
    var marker = {
      o_ID: rows[i].o_ID,
      data: rows[i].data
    };
    returnObject[i] = marker;
    console.log(chalk.red(returnObject[i].o_ID));
    console.log(chalk.red(returnObject[i].data));
  }
});
var sqsParams = {MessageBody: JSON.stringify(returnObject), QueueUrl: '---'};
For some reason, when I print the returnObject values they are correct, but when it gets to the JSON.stringify something happens and it sends just [] to my SQS queue.
I thought about getting rid of the marker variable and just assigning
returnObject[i] = {
  o_ID: rows[i].o_ID,
  data: rows[i].data
};
But that still results in the same issue. Any ideas? Thanks!
Database queries in JavaScript are usually asynchronous. This means that the code inside your callback function function(err, rows, fields) will run only after the database query has finished, whereas your assignment to sqsParams runs right after the database query has been started. The result is that the code inside your callback function has not run yet when you stringify returnObject with JSON.stringify, so it still has its original value [].
Your sqsParams variable is being set outside the db.query callback. As db.query is asynchronous, your code is going to just fall through with an empty array.
Move your sqsParams variable into the callback you are supplying to db.query, e.g.:
db.query(queryString, function(err, rows, fields) {
  if (err) throw err;
  for (var i in rows) {
    returnObject[i] = { o_ID: rows[i].o_ID, data: rows[i].data };
    console.log(chalk.red(returnObject[i].o_ID));
    console.log(chalk.red(returnObject[i].data));
  }
  var sqsParams = {MessageBody: JSON.stringify(returnObject), QueueUrl: '---'};
  // Use sqsParams here
});
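Whatever then consumes sqsParams (the actual send to SQS) also has to happen inside that callback. A rough sketch, assuming the AWS SDK v2 SQS client (the sqs object below is hypothetical; the original post does not show how the queue client is created):
// Hypothetical SQS client setup; not shown in the original post.
var AWS = require('aws-sdk');
var sqs = new AWS.SQS();

db.query(queryString, function(err, rows, fields) {
  if (err) throw err;
  var returnObject = rows.map(function(row) {
    return { o_ID: row.o_ID, data: row.data };
  });
  var sqsParams = {MessageBody: JSON.stringify(returnObject), QueueUrl: '---'};
  // Send only after the rows have been collected.
  sqs.sendMessage(sqsParams, function(sendErr, result) {
    if (sendErr) return console.error(sendErr);
    console.log('Sent message', result.MessageId);
  });
});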

can't set headers after they are sent node js when using restify module

I have used tedious to connect to SQL Server and restify for the RESTful API.
Here is the server.js:
server.get('/getInvoiceData', function (req, res, next) {
  repository.GetInvoiceData(function(data) {
    res.send(data);
    next();
  });
});
and here is the invoice.js:
exports.GetInvoiceData = function(callback) {
  var query = "SELECT * FROM [Snapdeal].[dbo].[tbl_Configuration]";
  var req = new request(query, function(err, rowcount) {
    if (err) {
      console.log(err.toString());
    } else {
      console.log(rowcount + " rows");
    }
  });
  req.on('row', function() {
    callback({customernumber: 123});
  });
  connection.execSql(req);
}
I am getting the error "Can't set headers after they are sent."
I am not 100% sure, as I am not familiar with the SQL lib you are using; however, it looks to me like the problem is that your row event is raised per row, rather than per transaction.
You are ending the response after the first row event, so if more than one row is returned the response will already have been closed (hence the error).
One way of dealing with this is to accumulate the row data as it is being retrieved and then raise the callback after you're done (a sketch of that approach is at the end of this answer).
Now that you have stated the lib you are using (Tedious), it would appear my hunch was correct. Looking at the library, here is the simplest approach you can take to return all the rows in a single callback:
exports.GetInvoiceData = function(callback) {
  var query = "SELECT * FROM [Snapdeal].[dbo].[tbl_Configuration]";
  var req = new request(query, function(err, rowcount, rows) {
    if (err) {
      console.log(err.toString());
    } else {
      callback(rows);
    }
  });
  connection.execSql(req);
}
Note: remember to set config.options.rowCollectionOnRequestCompletion to true, otherwise the rows parameter will be empty.
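If you would rather not enable rowCollectionOnRequestCompletion, the accumulate-as-you-go idea from the first paragraph could look roughly like this. This is a sketch only; it assumes the usual Tedious row event, where each column exposes value and metadata.colName:
exports.GetInvoiceData = function(callback) {
  var query = "SELECT * FROM [Snapdeal].[dbo].[tbl_Configuration]";
  var results = [];

  var req = new request(query, function(err, rowcount) {
    if (err) {
      return console.log(err.toString());
    }
    // This completion callback fires once the whole query has finished,
    // so the response callback is invoked exactly once, with all rows.
    callback(results);
  });

  req.on('row', function(columns) {
    var row = {};
    columns.forEach(function(column) {
      row[column.metadata.colName] = column.value;
    });
    results.push(row);
  });

  connection.execSql(req);
};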
My issue, using mssql, was that I had a default value or binding (in this case, (getdate())) set on one of my columns (a modified-date column). However, the data I was trying to retrieve had preset NULL values in this particular column.
I put data in those rows and I was good to go.

node.js mysql failing to insert

I'm messing around with node.js, using the Faker, underscore, mysql and randy libraries to add some test data to a web app.
So far so good, but in one of my loops the mysql call fails every time, and it also fails to generate any errors. I'm a bit stumped:
var sql = 'SELECT id FROM mytable WHERE 1';
client.query(sql, function(err, rows, fields) {
  // loop through results
  _.each(rows, function (v, i) {
    // get the insert id
    var id = v.id;
    console.log(id); // prints valid ID
    // generate other data
    var stuff_count = randy.randInt(0, 12);
    _(stuff_count).times(function(n) {
      var sql = 'INSERT INTO other_table (linked_id, item_id) VALUES (' + id + ',' + randy.randInt(1, 50) + ')';
      console.log(sql); // prints sql
      client.query(sql, function(err, rows, fields) {
        console.log(sql); // prints nothing
        if (err) {
          console.log(sql); // prints nothing
          throw err; // prints nothing
        }
      });
    });
  });
});
The looping logic is working fine and it gets all the way to where the sql should execute the INSERT against other_table, but tailing the mysql log shows nothing hitting the DB, and none of the debug statements inside the client.query block print anything, on success or failure.
Assuming you are just running this standalone, your code probably finishes before it has a chance of doing the inserts.
Place a setTimeout(..., 10000) after the _(stuff_count).times(...).
By using _().times(), you queue all the inserts but are not waiting for completion. You should use a control flow library to ensure you wait for all inserts to complete.
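A control flow library is one option; on more recent Node versions the same thing can be done with plain promises. The following is only a rough sketch, assuming the same client, randy and underscore setup as the question (client.end() is assumed to be the mysql client's close method):
// Wrap a single query in a Promise so it can be awaited.
function runQuery(sql) {
  return new Promise(function (resolve, reject) {
    client.query(sql, function (err, rows, fields) {
      if (err) return reject(err);
      resolve(rows);
    });
  });
}

runQuery('SELECT id FROM mytable WHERE 1')
  .then(function (rows) {
    var inserts = [];
    _.each(rows, function (v) {
      var stuff_count = randy.randInt(0, 12);
      _(stuff_count).times(function () {
        var sql = 'INSERT INTO other_table (linked_id, item_id) VALUES (' +
          v.id + ',' + randy.randInt(1, 50) + ')';
        inserts.push(runQuery(sql));
      });
    });
    // Wait for every queued insert before the script is allowed to exit.
    return Promise.all(inserts);
  })
  .then(function () {
    console.log('all inserts done');
    client.end(); // assumption: client.end() closes the mysql connection
  })
  .catch(function (err) {
    console.error(err);
  });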
