mongodb nodejs multiple insert issue - javascript

I'm trying to import a table from MySQL into MongoDB as-is, without any schema changes.
I wrote a small Node script for that, and my issue is with the way I implemented it.
Maybe I'm hitting some limit on MongoDB inserts by calling insert inside a loop.
I think this problem would not have come up if it were the other way around (maybe not!).
So here's the thing.
The MySQL table has more than 100,000 rows, but once the loop passes roughly 30,000 the number of inserted documents starts falling behind.
So if there are 100,000 rows in the MySQL table, after a complete import using the script below I end up with only about 37,000 documents at most.
My strong suspicion is a bug in the Node script, the Node MongoDB driver, or else some limit on concurrent inserts in MongoDB.
I'm pasting the script below.
Hoping I get a way around it.
Thanks,
var http = require('http'),
    mysql = require('mysql'),
    mongo = require('mongodb').MongoClient,
    format = require('util').format;

var connection = mysql.createConnection({
    user: "xxx",
    password: "xxx",
    database: "mydb"
});
connection.connect();

var query = "select * from mytable";
var mysqlrows = '';
connection.query(query, function(err, rows, fields) {
    if (err) throw err;
    console.log(rows.length + ' rows found.');
    mongo.connect('mongodb://root:root@127.0.0.1:27017/mydb', function(err, db) {
        if (err) throw err;
        var collection = db.collection('mytable');
        for (var i = 0; i < rows.length; i++) {
            //console.log(JSON.stringify(rows[i]));
            (function(i) {
                collection.insert(rows[i], function(err, docs) {});
                console.log(i);
            })(i);
        }
        db.close();
    });
});
connection.end();

The problem is that you're not waiting for the insert operations to complete before closing your connection to MongoDB via the db.close() call. You need to keep track of your outstanding asynchronous requests and only call db.close() once they have all completed.
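A minimal sketch of that idea, reusing rows, collection and db from the question's callback (the counter is my addition, not from the original post):

var pending = rows.length; // inserts still waiting for their callback
for (var i = 0; i < rows.length; i++) {
    collection.insert(rows[i], function(err, docs) {
        if (err) console.error(err);
        pending--;
        if (pending === 0) db.close(); // close only after the last insert has completed
    });
}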

To make sure you are getting all the data from MySQL, try to access the last row; if you can read it, the MySQL side is fine. Then use MongoDB's w and j write-concern flags so each call confirms the insert before you move on to the next one. Since acknowledged writes are slower, you should combine them with batching: insert multiple rows per call by passing an array.
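A sketch of that batched, acknowledged version, again reusing rows, collection and db from the question's script (the batch size of 1000 is an arbitrary choice, not from the original answer):

var BATCH = 1000;
function insertBatch(start) {
    if (start >= rows.length) return db.close(); // every batch acknowledged: safe to close
    var chunk = rows.slice(start, start + BATCH);
    // w:1 waits for the server to acknowledge the write, j:true for the journal
    collection.insert(chunk, { w: 1, j: true }, function(err, docs) {
        if (err) throw err;
        insertBatch(start + BATCH); // start the next batch only after this one is confirmed
    });
}
insertBatch(0);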

Related

mysql2 node two consecutive database queries where the second query depends on the result of the first query

First of all: I am new to Node and also pretty new to JS, and asynchronous behaviour seems to be something my brain refuses to understand.
I have the following issue:
I have a database table containing, amongst other information, the field "re_nummer", which stands for invoice number.
Each time my app creates an invoice, I want to read the current re_nummer from the database, increase it by 1, and then write it back to the database.
I assume this is asynchronous behaviour, and I have now spent a full day with Google and here trying to solve the issue, but it seems the brain fog is too thick.
Here is my code:
const mysql2 = require("mysql2");
const db = mysql2.createConnection({
    host: "localhost",
    user: "root",
    database: "mit_absicht_gluecklich",
});
db.connect((err) => {
    if (err) throw err;
});

/**
 * First database query: get the most recent invoice number (re_nummer)
 */
let sql = "SELECT re_nummer FROM admin WHERE id = 1;";
let re_nummer;
db.query(sql, (err, result) => {
    if (err) throw err;
    let re_nummer = result[0].re_nummer;
    console.log("Rechnungsnummer: ", re_nummer);
});

/**
 * Second database query: update database with re_nummer_neu = re_nummer + 1
 */
let re_nummer_neu = re_nummer + 1;
sql = `UPDATE admin SET re_nummer = ${re_nummer_neu} WHERE id = 1;`;
db.query(sql, (err, result) => {
    if (err) throw err;
    console.log(re_nummer_neu);
});
module.exports = re_nummer;
This code doesn't work because it starts the second database request before the first one has finished.
How do I solve this issue?
Also important to mention: this is a Node module, and I need re_nummer to be available outside the module. This is why I use module.exports = re_nummer;.
In case I am asking a question that has been asked multiple times already, I apologize in advance. I have honestly given my best to understand other threads with similar content. 100% fail!
Many thanks for some help and/or comments!
Klaus
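One way to sequence the two queries is to nest the UPDATE inside the SELECT callback and export a function instead of a value. A minimal sketch along those lines (the exported function name is made up; the rest follows the question):

const mysql2 = require("mysql2");
const db = mysql2.createConnection({
    host: "localhost",
    user: "root",
    database: "mit_absicht_gluecklich",
});

// Export a function, not a value: callers receive the number only
// after both queries have finished.
module.exports.nextInvoiceNumber = function (callback) {
    db.query("SELECT re_nummer FROM admin WHERE id = 1;", (err, result) => {
        if (err) return callback(err);
        const re_nummer_neu = result[0].re_nummer + 1;
        // The UPDATE runs inside the SELECT callback, so the order is guaranteed.
        db.query("UPDATE admin SET re_nummer = ? WHERE id = 1;", [re_nummer_neu], (err) => {
            if (err) return callback(err);
            callback(null, re_nummer_neu);
        });
    });
};

A caller would then write nextInvoiceNumber((err, nr) => { ... }) instead of reading an exported variable, because the exported variable can never be filled in synchronously.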

how to pass result from MySql query to ejs in node js

I am creating a dashboard that displays various data in different charts (e.g. horizontal bars, pie charts, etc.).
I am using Node.js as the backend, MySQL as the database, and EJS to render the data to the HTML page.
I have the MySQL queries for the different data ready. The problem I am having is encapsulating these queries inside the routing function and passing the results to EJS.
Here is the sample code
router.get('/participant-module', (req, res) => {
    let sql1 = dbModel.participant_per_module; // MySQL query string
    let sql2 = dbModel.error_per_module; // MySQL query string
    let count_emp = [];
    let count_error = [];
    db.query(sql1, (err, result) => {
        if (err) throw err;
        count_emp.push(result);
    });
    db.query(sql2, (err, result) => {
        if (err) throw err;
        count_error.push(result);
    });
    res.render('dashboard', { emp_count: count_emp, error_count: count_error }); // pass arrays to ejs
});
count_emp and count_error are displayed in different charts, but here count_emp and count_error are always empty.
I have searched for similar problems in the forum, and the cause seems to be that db.query is async, so it doesn't wait; thus, when res.render sends the data, emp_count and error_count are still empty.
So does anyone have a workaround for this?
Thank you in advance
D
Your db.query calls and res.render run asynchronously, without waiting for the query results. What happens is: your first query is fired, then immediately your second query is fired, then immediately your router returns the response. You never wait for the queries to deliver their results. That's why count_emp and count_error are [] every time the response is rendered.
Try this:
router.get('/participant-module', (req, res) => {
    let sql1 = dbModel.participant_per_module; // MySQL query string
    let sql2 = dbModel.error_per_module; // MySQL query string
    let count_emp = [];
    let count_error = [];
    db.query(sql1, (err, result) => {
        if (err) throw err;
        count_emp.push(result);
        // run the second query only after the first has returned
        db.query(sql2, (err, result) => {
            if (err) throw err;
            count_error.push(result);
            // render only after both results are in
            res.render('dashboard', { emp_count: count_emp, error_count: count_error }); // pass arrays to ejs
        });
    });
});
PS: You should try to avoid over-nesting callbacks; it is commonly referred to as callback hell. Common ways around it are: 1. using promises, 2. using named functions.
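For example, if db is a mysql2 connection, its promise wrapper flattens the same route (a sketch; whether your db object offers .promise() depends on which library you use):

router.get('/participant-module', async (req, res) => {
    try {
        // each await resolves with [rows, fields]; keep the rows
        const [count_emp] = await db.promise().query(dbModel.participant_per_module);
        const [count_error] = await db.promise().query(dbModel.error_per_module);
        res.render('dashboard', { emp_count: count_emp, error_count: count_error });
    } catch (err) {
        console.error(err);
        res.status(500).send('query failed');
    }
});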

What is the cleanest way to save and load multiple DOM elements as JSON with the FileSystem(fs) Module from Node.js?

I need to save dynamically added DOM elements (draggable DIVs) to a file and load them at a later time.
What is the cleanest way to accomplish this?
You can save them to MySQL or MongoDB. Here is a MySQL example:
var mysql = require('mysql');
var connection = mysql.createConnection({
    host     : 'localhost',
    user     : 'foo',
    password : 'bar',
    database : 'test'
});

// the callback inside connect is called when the connection is good
connection.connect(function(err) {
    var sql = "select 'Joe' as name from dual";
    connection.query(sql, function(err, rows, fields) {
        if (err) return console.log(err);
        // you need to end your connection inside here.
        connection.end();
        console.log(rows[0].name);
    });
});
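If you want to use the fs module as the question asks, the database is optional. A minimal sketch, assuming an environment where both the DOM and fs are available (e.g. Electron); the .draggable selector and the saved fields are made up for illustration:

const fs = require('fs');

// Serialize each draggable div's id, position and markup to JSON on disk.
function saveElements(path) {
    const items = Array.from(document.querySelectorAll('.draggable')).map(el => ({
        id: el.id,
        left: el.style.left,
        top: el.style.top,
        html: el.innerHTML,
    }));
    fs.writeFileSync(path, JSON.stringify(items, null, 2));
}

// Recreate the divs from the saved JSON.
function loadElements(path) {
    for (const item of JSON.parse(fs.readFileSync(path, 'utf8'))) {
        const el = document.createElement('div');
        el.className = 'draggable';
        el.id = item.id;
        el.style.left = item.left;
        el.style.top = item.top;
        el.innerHTML = item.html;
        document.body.appendChild(el);
    }
}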

Node js. Proper / Best Practice to create connection

Right now I am creating a very large application in Node.js. I am trying to keep my code clean and short (just like most developers). I've created my own JS file to handle connections to MySQL. Please see the code below.
var mysql = require('mysql');

var config = {
    'default' : {
        connectionLimit : process.env.DB_CONN_LIMIT,
        host            : process.env.DB_HOST,
        user            : process.env.DB_USER,
        password        : process.env.DB_PASS,
        database        : process.env.DB_NAME,
        debug           : false,
        socketPath      : process.env.DB_SOCKET
    }
};

function connectionFunc(query, parameters, callback, configName) {
    configName = configName || "default";
    callback = callback || null;
    if (typeof parameters == 'function') {
        callback = parameters;
        parameters = [];
    }
    //console.log("Server is starting to connect to " + configName + " configuration");
    var dbConnection = mysql.createConnection(config[configName]);
    dbConnection.connect();
    dbConnection.query(query, parameters, function(err, rows, fields) {
        //if (!err)
        callback(err, rows, fields);
        //else
        //console.log('Error while performing Query.');
    });
    dbConnection.end();
}
module.exports.query = connectionFunc;
I am using the above file in my models, like below:

var database = require('../../config/database.js');

module.exports.getData = function(successCallBack) {
    database.query('SAMPLE QUERY GOES HERE', function(err, result) {
        if (err) { console.log(err); }
        //My statements here
    });
};
Using this coding style, everything works fine, but when I create a function that calls my model's method in a loop, it breaks. Please see the sample below:
for (i = 0; i < 10000; i++) {
    myModel.getData(param, function(result) {
        return res.json({ data: result });
    });
}
It gives me ER_CON_COUNT_ERROR: Too many connections. The question is why I still get this error when every connection is ended by dbConnection.end(). I'm not sure what I am missing; I am still stuck on this.
My connection limit is 100, and I think simply raising it is a bad idea.
Because querying data from the database is async.
In your loop, myModel.getData (or, more precisely, the underlying query) will not halt/pause your code until the query is finished; it sends the query to the database server, and the callback is called as soon as the database responds.
Calling end() on dbConnection does not close the connection immediately; it just marks the connection to be closed as soon as all queries created on it have finished.
mysql: Terminating connections
Terminating a connection gracefully is done by calling the end() method. This will make sure all previously enqueued queries are still executed before sending a COM_QUIT packet to the MySQL server.
An alternative way to end the connection is to call the destroy() method. This will cause an immediate termination of the underlying socket. Additionally destroy() guarantees that no more events or callbacks will be triggered for the connection.
But with destroy() the library does not wait for the results, so they are lost; destroy() is rarely useful.
So with your given code you try to create 10000 connections at one time.
You should use only one connection per task, e.g. if a user requests data using the browser, use one connection for that request. The same goes for timed tasks: if you have some job that runs at certain intervals, give it one connection.
Here is some example code:

var database = require('./config/database.js');

function someTask(callback) {
    var conn = database.getConnection();
    myModel.getData(conn, paramsA, dataReceivedA);
    function dataReceivedA(err, data) {
        myModel.getData(conn, paramsB, dataReceivedB);
    }
    function dataReceivedB(err, data) {
        conn.end();
        callback();
    }
}
If you want to hide the database connection entirely inside your model code, then you would need to do something like this:
var conn = myModel.connect();
conn.getData(params, function(err, data) {
    conn.end();
});
How to actually solve this depends on many factors, so it is only possible to give you hints here.
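One concrete hint, not from the original answer: the mysql module's built-in pool caps concurrent connections for you, so the database.js file could shrink to a sketch like this:

var mysql = require('mysql');

// A pool keeps up to connectionLimit connections open and reuses them,
// so a 10000-iteration loop can no longer exhaust the server's limit.
var pool = mysql.createPool({
    connectionLimit : process.env.DB_CONN_LIMIT,
    host            : process.env.DB_HOST,
    user            : process.env.DB_USER,
    password        : process.env.DB_PASS,
    database        : process.env.DB_NAME
});

// pool.query() acquires a connection, runs the query, and releases the
// connection back to the pool when the callback fires.
module.exports.query = function (query, parameters, callback) {
    pool.query(query, parameters, callback);
};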

Multiple connections with node-mongodb-native

I am working on a function to insert a document into a MongoDB database using the node-mongodb-native module. Everything works, except when I insert multiple documents back-to-back. I use a for loop to test how my function reacts to multiple document inserts at the same time.
var server = new Server("xxx.xxx.xxx.xxx", 27017, { auto_reconnect: true, poolSize: 100 });
var db = new Db("testDb", server, { safe: false });

module.exports.insert = function(document) {
    var database;
    function db_open(err, db_local) {
        if (err) throw err;
        database = db_local;
        database.collection("rooms", handle_insert);
    }
    function handle_insert(err, collection) {
        if (err) throw err;
        collection.insert(document);
        database.close();
    }
    db.open(db_open);
};

for (var i = 0; i < 100; i++) {
    module.exports.insert({ name: "test" });
}
When I run this code I get the error db object already connecting, open cannot be called multiple times.
To resolve the problem, I decided to create new instances of Server and Db on each call of the function:
module.exports.insert = function(document) {
    var database;
    var server = new Server("xxx.xxx.xxx.xxx", 27017, { auto_reconnect: true, poolSize: 100 });
    var db = new Db("testDb", server, { safe: false });
    function db_open(err, db_local) {
        if (err) throw err;
        database = db_local;
        database.collection("rooms", handle_insert);
    }
    function handle_insert(err, collection) {
        if (err) throw err;
        collection.insert(document);
        database.close();
    }
    db.open(db_open);
};

for (var i = 0; i < 100; i++) {
    module.exports.insert({ name: "test" });
}
But now I'm getting connection closed, thrown by the db_open function.
I really don't understand why my connection is closing between the moment I create db and the moment my code calls db_open.
Do you have any idea what is happening?
Thank you :)
(Sorry if my English is not really good)
EDIT
EDIT: I found a website explaining that the problem can be caused by a too-long tcp_keepalive time. The problem with this solution is my workstation (Cloud9): I don't have permission to access the file /proc/sys/net/ipv4/tcp_keepalive_time.
I don't think your problem has anything to do with TCP keep-alive. As the error message says, you're simply attempting to open the same connection multiple times (every time the insert method is called). Instead of calling open() every time, call it once and make sure its callback has fired before you call insert() for the first time. There is no hard limit on the number of simultaneous inserts you can perform on the same connection, since it's all asynchronous.
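A minimal sketch of that single-open approach, using the same legacy driver API and placeholder host as the question (the queue is my addition, to buffer inserts issued before the connection is ready):

var Server = require('mongodb').Server,
    Db = require('mongodb').Db;

var db = new Db("testDb",
    new Server("xxx.xxx.xxx.xxx", 27017, { auto_reconnect: true, poolSize: 100 }),
    { safe: false });

var collection = null;
var queue = [];

// Open the connection exactly once.
db.open(function (err, database) {
    if (err) throw err;
    collection = database.collection("rooms");
    queue.forEach(function (doc) { collection.insert(doc); }); // flush buffered inserts
    queue = [];
});

module.exports.insert = function (document) {
    if (collection) collection.insert(document); // connection ready: insert directly
    else queue.push(document); // not ready yet: buffer it
};

for (var i = 0; i < 100; i++) {
    module.exports.insert({ name: "test" });
}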
