Error "Given transaction number * does not match" in mongodb and nodejs - javascript

I want to modify two schemas while adding data. For that I used a MongoDB ACID transaction with Node.js, as follows. But when I run the program it displays an error like:
(node:171072) UnhandledPromiseRejectionWarning: MongoError: Given transaction number 3 does not match any in-progress transactions. The active transaction number is 2
at MessageStream.messageHandler (/home/user/Projects/project/node_modules/mongodb/lib/cmap/connection.js:272:20)
at MessageStream.emit (events.js:375:28)
at MessageStream.emit (domain.js:470:12)
addData = async (request: Request, response: Response) => {
    const session = await stockSchema.startSession()
    try {
        const userData = request.body
        let data = {}
        const transaction = await session.withTransaction(async () => {
            try {
                userData.products.map(async (item: any) => {
                    await inventorySchema.findOneAndUpdate({ _id: item.materialID }, { $inc: { qty: -item.qty } }, { session });
                });
                data = new stockSchema(userData);
                await data.save({ session });
            } catch (error) {
                await session.abortTransaction()
                throw new Error("Could not create data. Try again.");
            }
        });
        if (transaction) {
            session.endSession()
            return returnData(response, data, 'Data created successfully.');
        } else {
            throw new Error("Could not create data. Try again.");
        }
    } catch (error: any) {
        session.endSession();
        return Error(response, error.message, {});
    }
}

You might have figured out the answer to this already, but anyway, after months of frustration and no clear answer on the internet, I finally figured it out.
The problem with your code above is that you are passing session into a database operation (the .findOneAndUpdate call) that runs inside .map. That means your transaction session is being used concurrently, which is what causes the error. Read this: https://www.mongodb.com/community/forums/t/concurrency-in-mongodb-transactions/14945 (it explains why concurrency with transactions creates a bit of a mess).
Anyway, instead of .map, use a recursive function that fires each DB operation one after another rather than concurrently, and all your problems will be solved.
You could use a function something like this:
const executeInQueue = async ({
    dataAry, // the array that you .map through
    callback, // the function that you fire inside .map
    idx = 0,
    results = [],
}) => {
    if (idx === dataAry.length) return results;
    // else process dataAry[idx] and recurse
    let d = dataAry[idx];
    try {
        let result = await callback(d, idx);
        results.push(result);
        return executeInQueue({
            dataAry,
            callback,
            idx: idx + 1,
            results,
        });
    } catch (err) {
        console.log({ err });
        results.push("error");
        return results;
    }
};
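For the code in the question, a minimal usage sketch could look like this (it reuses the question's own names such as userData, inventorySchema and session; treat it as an illustration rather than a drop-in fix):

// Inside session.withTransaction(), replace the .map(...) with a sequential run:
await executeInQueue({
    dataAry: userData.products,
    callback: (item) =>
        inventorySchema.findOneAndUpdate(
            { _id: item.materialID },
            { $inc: { qty: -item.qty } },
            { session }
        ),
});

A plain for...of loop with await inside would give the same one-at-a-time behaviour; the important part is that only one operation uses the session at a time.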

Related

Unable to run a loop to update Object Array in SQLite with React Native

So this has been troubling me for a while. I have an array of objects that I want to insert into my SQLite DB. Each of the objects has 5 parameters and I have the SQL query in place to run it. I was using a loop to iterate through the array and populate each of the objects via db transactions to SQLite. However, the db tasks are asynchronous, which leads to the loop completing before the tasks run and incorrect data being populated into the db. The while loop in the code below doesn't work, and I have tried the same thing with a for loop to no avail.
var i = 0;
while (i < rawData.length) {
    console.log(rawData[i], i)
    db.transaction(function (tx) {
        console.log(rawData, i, " YAY")
        tx.executeSql(
            'Update all_passwords SET title=?,userID=?,password=?,notes=?,category=? WHERE ID =? ',
            [rawData[i].title, rawData[i].userID, rawData[i].password, rawData[i].notes, rawData[i].category, rawData[i].id],
            (tx, results) => {
                console.log("saved all data")
                tx.executeSql(
                    "SELECT * FROM all_passwords ORDER BY id desc",
                    [],
                    function (tx, res) {
                        i++
                        console.log("Print Out Correct Data")
                        for (var i = 0; i < res.rows.length; i++) {
                            console.log(res.rows.item(i), i)
                        }
                    });
            }
        );
        console.log("EXIT")
    },
    (error) => {
        console.log(error);
    });
}
I'm not familiar with using async tasks with hooks, but I believe that might be a potential solution. My intention is to populate the rawData array of objects into the SQLite DB in one go while I use a state to maintain the loading screen.
I did refer to the sources below but wasn't able to come up with anything concrete.
react native insertion of array values using react-native-sqlite-storage
https://medium.com/javascript-in-plain-english/how-to-use-async-function-in-react-hook-useeffect-typescript-js-6204a788a435
Thanks in advance!
I made a little write-up for you on how I would solve it. Read the comments in the code. If anything is unclear, feel free to ask!
const rawData = [
    { title: "title", userID: "userID", password: "password", notes: "notes", category: "category", id: "id" },
    { title: "title_1", userID: "userID_1", password: "password_1", notes: "notes_1", category: "category_1", id: "id_1" },
    { title: "title_2", userID: "userID_2", password: "password_2", notes: "notes_2", category: "category_2", id: "id_2" }
];
// You can mostly ignore this. It's just a mock for the db
const db = {
    tx: {
        // AFAIK if there is a transaction it's possible to execute multiple statements
        executeSql: function (sql, params, success, error) {
            // just for simulating an error
            if (params.title === "title_2") {
                error(new Error("Some sql error"));
            } else {
                console.log(sql, params.title);
                success();
            }
        }
    },
    transaction: function (tx, error) {
        // simulating async
        setTimeout(() => {
            return tx(this.tx);
        }, parseInt(Math.random() * 1000));
    }
}
// Let's make a class which handles our data access in an async way
class DataAccess {
    // as transaction has callback functions it's wrapped in a promise
    // on success the transaction is resolved
    // if there is an error it will be thrown
    transaction = () => {
        return new Promise(resolve => {
            db.transaction(tx => resolve(tx), error => {
                throw error;
            });
        });
    }
    // the actual executeSql function which "hides" all the transaction stuff
    // awaits a transaction and executes the sql on it
    // if the execution was successful, resolve
    // if not, throw the error
    executeSql = async (sql, params) => {
        const tx = await this.transaction();
        tx.executeSql(sql, params, () => Promise.resolve(), error => {
            throw error;
        });
    }
}
const dal = new DataAccess();
// with executeSql, every statement that could run was executed
async function insert_with_execute() {
    // Promise.all does not guarantee execution order
    // but it is a possibility to await an array of promises (async functions)
    await Promise.all(rawData.map(async rd => {
        try {
            await dal.executeSql("sql_execute", rd);
        } catch (error) {
            console.log(error.message);
        }
    }));
}
// no sql executed because of the error - all statements are in the same transaction
async function insert_with_transaction() {
    const tx = await dal.transaction();
    for (let i = 0; i < rawData.length; i++) {
        tx.executeSql("sql_transaction", rawData[i], () => console.log("success"), error => console.log(error.message));
    }
}
async function test() {
    await insert_with_execute();
    console.log("---------------------------------")
    await insert_with_transaction();
}
test();
Apparently the best approach to take is using anonymous functions that create a separate instance of execution for each value of i. This is a good example of how to do it....
Javascript SQL Insert Loop
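To illustrate that idea, here is a hedged sketch of the anonymous-function approach applied to the question's loop (the db/tx callback signatures mirror react-native-sqlite-storage; treat it as a sketch rather than tested code):

db.transaction(function (tx) {
    for (var i = 0; i < rawData.length; i++) {
        // the IIFE captures the current row, so the async callbacks below
        // always see the value they were created with, not the final i
        (function (row) {
            tx.executeSql(
                'UPDATE all_passwords SET title=?, userID=?, password=?, notes=?, category=? WHERE ID=?',
                [row.title, row.userID, row.password, row.notes, row.category, row.id],
                function () { console.log('saved row', row.id); },
                function (_tx, error) { console.log(error); }
            );
        })(rawData[i]);
    }
}, function (error) {
    console.log(error);
});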

Issue in promise based recursive function

I am working on AWS Lambda using the Node.js environment. I have one API in which I am using a recursive function for some functionality.
Most people say to avoid recursive functions, but for my functionality I need one. My functionality is as follows.
I have one table, table1, with two columns, pid and cid, defined with a unique constraint, which means the combination of the two columns must be unique.
So if I insert a combination into table1 and that combination already exists, I get a duplicate entry error, which is correct for my functionality.
To handle this duplicate entry error I use a try...catch block. In the catch block I check whether a duplicate entry error occurred, and if so I call a recursive function which tries different combinations until a new entry is created in table1.
My recursive function is promise based. When a new entry is created successfully I resolve the promise, but the promise never resolves back at the place where I called the recursive function for the very first time, and because of that a timeout occurs.
Please suggest a solution so that my promise gets resolved and my functionality continues from the point where I called the recursive function for the very first time, so the timeout does not occur.
I am providing my code for reference.
var mysql = require('mysql');
var con = mysql.createConnection({
    "host": "somehost.com",
    "user": "myusername",
    "password": "mypassword",
    "database": "mydatabase"
});
exports.handler = async (event, context) => {
    try {
        con.connect();
    } catch (error) {
        throw error;
        return 0;
    }
    let tableId = '';
    let count = '';
    try {
        var tempUsageData = {
            user_id: userId,
            code: code,
            platform: source,
            some_id: some_id,
            count: count
        };
        dbColumns = 'user_id, code, platform, added_on, count, some_id';
        let usageData = [
            [userId, code, source, new Date(), count, some_id]
        ];
        var tableInsert = await databaseInsert(con, constants.DB_CONSTANTS.DB_USAGE, dbColumns, usageData);
        tableId = tableInsert.insertId;
    } catch (error) {
        console.log('## error insert table1 ##', error);
        if (error.errno == 1062) {
            try {
                // calling recursive function here
                let newTableData = await createTableEntry(con, tempUsageData);
                tableId = newTableData.new_usage_id;
                count = newTableData.new_count;
            } catch (error) {
                console.log('Error', error);
                return 0;
            }
        } else {
            return 0;
        }
    };
    console.log('## EXECUTION DONE ##');
    return 1;
}
var createTableEntry = (con, dataObject) => {
    return new Promise(async function (resolve, reject) {
        console.log('createTableEntry Called for count', dataObject.count);
        try {
            var newCounter = await getDataFromDatabase(con, dataObject.some_id);
            dbColumns = 'user_id, code, platform, added_on, count, some_id';
            let tableData = [
                [userId, code, source, new Date(), Number(newCounter[0].counter + 1), some_id]
            ];
            var tableInsert = await databaseInsert(con, 'table1', dbColumns, tableData);
            let response = {
                new_table_id: tableInsert.insertId,
                new_count: Number(newCounter[0].counter + 1)
            }
            return resolve(response);
            // the function does not return from here once a successful entry is made, and a timeout occurs
        } catch (error) {
            console.log('## ERROR ##', error);
            if (error.errno == 1062) {
                console.log('## CALL FUNCTION AGAIN ##');
                dataObject.count = Number(newCounter[0].counter + 1);
                await createTableEntry(con, dataObject);
            } else {
                return reject(error);
            }
        }
    });
};
My final output should be that the message "EXECUTION DONE" is displayed once execution is done.
Please suggest a good solution for this. Thanks in advance.
Update your catch block
catch (error) {
    console.log('## ERROR ##', error);
    if (error.errno == 1062) {
        console.log('## CALL FUNCTION AGAIN ##');
        dataObject.count = Number(newCounter[0].counter + 1);
        let result = await createTableEntry(con, dataObject);
        return resolve(result);
    } else {
        return reject(error);
    }
}
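The missing piece was resolving with the result of the recursive call. As an alternative sketch, the helper can also be written as a plain async function, which avoids the new Promise(...) wrapper entirely; returning the recursive call is then enough to bubble the result up to the first caller (databaseInsert, getDataFromDatabase, userId, code, source and some_id are the question's own identifiers and are assumed to be in scope):

const createTableEntry = async (con, dataObject) => {
    console.log('createTableEntry called for count', dataObject.count);
    const newCounter = await getDataFromDatabase(con, dataObject.some_id);
    const dbColumns = 'user_id, code, platform, added_on, count, some_id';
    const tableData = [
        [userId, code, source, new Date(), Number(newCounter[0].counter + 1), some_id]
    ];
    try {
        const tableInsert = await databaseInsert(con, 'table1', dbColumns, tableData);
        return {
            new_table_id: tableInsert.insertId,
            new_count: Number(newCounter[0].counter + 1)
        };
    } catch (error) {
        if (error.errno == 1062) {
            // duplicate entry: try the next counter value; the eventual result
            // propagates back through every level of the recursion
            dataObject.count = Number(newCounter[0].counter + 1);
            return createTableEntry(con, dataObject);
        }
        throw error;
    }
};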

Synchronously call a REST API in JavaScript

I am new to JavaScript and the npm world. I am trying to upload some data to my REST service via a REST POST call. I fetch the data from a CSV file. So far so good. For each fetched line I convert the data (for my needs) and call the REST API to upload it. Since I have many lines (approx. 700) the API gets called quite often in quick succession. After some calls (500 or so) I get a socket error:
events.js:136
throw er; // Unhandled 'error' event
^
Error: connect ECONNRESET 127.0.0.1:3000
at Object._errnoException (util.js:999:13)
at _exceptionWithHostPort (util.js:1020:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1207:14)
I guess this is because I call the REST API too often. What I don't understand is:
How should I make the calls synchronously in order to avoid so many connections?
Or shouldn't I?
What would be the proper solution in JS for this?
I have tried with Promises and so on, but all of that didn't help; it just moved the issue a few function calls earlier...
This is my code:
readCsv()

function readCsv() {
    var csvFile = csvFiles.pop()
    if (csvFile) {
        csv({ delimiter: ";" }).fromFile(csvFile).on('json', async (csvRow) => {
            if (/.*\(NX\)|.*\(NI\)|.*\(NA\)|.*\(WE\)|.*\(RA\)|.*\(MX\)/.test(csvRow["Produkt"])) {
                var data = await addCallLog(
                    csvRow["Datum"],
                    csvRow["Zeit"],
                    csvRow["Menge-Zeit"],
                    csvRow["Zielrufnummer"],
                    csvRow["Produkt"]);
            }
        }).on('done', (error) => {
            //console.log('end')
            readCsv()
        })
    } else {
    }
}

function addCallLog(date, time, duration, number, product) {
    return new Promise(resolve => {
        args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" }
        client.methods.addCallLog(args, (data, response) => {
            // client.methods.getCallLog((data, response) => {
            //     console.log(data)
            // })
            //console.log("addCallLog resolve")
            resolve(data)
        })
    })
}
As you can see, I had the same issue with reading more than one CSV file in parallel. I solved it by calling the readCsv function recursively and popping the next file only after the previous one was done.
You can't call things synchronously. But you can sequence the async REST calls, which is what I presume you mean.
A problem here is that await addCallLog() won't keep the next 'json' events from being generated, so you will end up with a zillion requests in flight at the same time, and apparently you have so many that you run out of resources.
One way around that is to collect the rows you want into an array and then iterate that array with a regular for loop, where await works as expected. Here's what that would look like:
readCsv()

function readCsv() {
    var csvFile = csvFiles.pop()
    if (csvFile) {
        let rows = [];
        csv({ delimiter: ";" }).fromFile(csvFile).on('json', (csvRow) => {
            if (/.*\(NX\)|.*\(NI\)|.*\(NA\)|.*\(WE\)|.*\(RA\)|.*\(MX\)/.test(csvRow["Produkt"])) {
                rows.push(csvRow);
            }
        }).on('done', async (error) => {
            for (let csvRow of rows) {
                var data = await addCallLog(
                    csvRow["Datum"],
                    csvRow["Zeit"],
                    csvRow["Menge-Zeit"],
                    csvRow["Zielrufnummer"],
                    csvRow["Produkt"]
                );
            }
            readCsv();
        })
    } else {
    }
}

function addCallLog(date, time, duration, number, product) {
    return new Promise(resolve => {
        args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" }
        client.methods.addCallLog(args, (data, response) => {
            // client.methods.getCallLog((data, response) => {
            //     console.log(data)
            // })
            //console.log("addCallLog resolve")
            resolve(data)
        })
    })
}
Your code appears to be missing error handling. client.methods.addCallLog() needs a way to communicate an error back.
You probably also need an error event handler for the csv iterator.
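As a hedged sketch of what that error propagation could look like, assuming a node-rest-client style API where the registered method returns the underlying request object and the callback receives the HTTP response:

function addCallLog(date, time, duration, number, product) {
    return new Promise((resolve, reject) => {
        args.data = { number: number, name: "", timestamp: getTimestamp(date, time), duration: getDuration(duration), type: "OUTGOING" }
        const req = client.methods.addCallLog(args, (data, response) => {
            if (response && response.statusCode >= 400) {
                reject(new Error("addCallLog failed with status " + response.statusCode));
            } else {
                resolve(data);
            }
        });
        // network-level failures (e.g. ECONNRESET) surface as 'error' events on the request
        req.on('error', reject);
    });
}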
After filling the buffer in a previous function, I check that buffer for data and upload the entries one by one using the then callback of the promise:
var callLogBuffer = []

checkForUpload()

function checkForUpload() {
    console.log("checkForUpload")
    if (callLogBuffer.length > 0) {
        addCallLog(callLogBuffer.pop()).then((data) => {
            checkForUpload()
        })
    }
}

function addCallLog(callLog) {
    return new Promise(resolve => {
        args.data = { number: callLog.number, name: "", timestamp: getTimestamp(callLog.date, callLog.time), duration: getDuration(callLog.duration), type: "OUTGOING" }
        client.methods.addCallLog(args, (data, response) => {
            // client.methods.getCallLog((data, response) => {
            //     console.log(data)
            // })
            //console.log("addCallLog resolve")
            resolve(data)
        })
    })
}

Break out of Bluebird promise chain in Mongoose

I've studied several related questions & answers and still can't find the solution for what I'm trying to do. I'm using Mongoose with Bluebird for promises.
My promise chain involves 3 parts:
Get user 1 by username
If user 1 was found, get user 2 by username
If both user 1 and user 2 were found, store a new record
If either step 1 or step 2 fail to return a user, I don't want to do step 3. Failing to return a user, however, does not cause a database error, so I need to check for a valid user manually.
I can use Promise.reject() in step 1 and it will skip step 2, but will still execute step 3. Other answers suggest using cancel(), but I can't seem to make that work either.
My code is below. (My function User.findByName() returns a promise.)
var fromU, toU;
User.findByName('robfake').then((doc) => {
    if (doc) {
        fromU = doc;
        return User.findByName('bobbyfake');
    } else {
        console.log('user1');
        return Promise.reject('user1 not found');
    }
}, (err) => {
    console.log(err);
}).then((doc) => {
    if (doc) {
        toU = doc;
        var record = new LedgerRecord({
            transactionDate: Date.now(),
            fromUser: fromU,
            toUser: toU,
        });
        return record.save()
    } else {
        console.log('user2');
        return Promise.reject('user2 not found');
    }
}, (err) => {
    console.log(err);
}).then((doc) => {
    if (doc) {
        console.log('saved');
    } else {
        console.log('new record not saved')
    }
}, (err) => {
    console.log(err);
});
Example
All you need to do is something like this:
let findUserOrFail = name =>
    User.findByName(name).then(v => v || Promise.reject('not found'));

Promise.all(['robfake', 'bobbyfake'].map(findUserOrFail)).then(users => {
    var record = new LedgerRecord({
        transactionDate: Date.now(),
        fromUser: users[0],
        toUser: users[1],
    });
    return record.save();
}).then(result => {
    // result of successful save
}).catch(err => {
    // handle errors - both for users and for save
});
More info
You can create a function:
let findUserOrFail = name =>
    User.findByName(name).then(v => v || Promise.reject('not found'));
and then you can use it like you want.
E.g. you can do:
Promise.all([user1, user2].map(findUserOrFail)).then(users => {
    // you have both users
}).catch(err => {
    // you don't have both users
});
That way is faster because you don't have to wait for the first user before fetching the second one - both can be queried in parallel - and you can scale it to more users in the future:
let array = ['array', 'with', '20', 'users'];

Promise.all(array.map(findUserOrFail)).then(users => {
    // you have all users
}).catch(err => {
    // you don't have all users
});
No need to complicate it more than that.
Move your error handling out of the inner chain to the place where you actually want to catch/handle it. As I don't have Mongo installed, here is some pseudocode that should do the trick:
function findUser1() {
    return Promise.resolve({
        user: 1
    });
}

function findUser2() {
    return Promise.resolve({
        user: 2
    });
}

function createRecord(user1, user2) {
    return Promise.resolve({
        fromUser: user1,
        toUser: user2,
    });
}

findUser1()
    .then(user1 => findUser2()
        .then(user2 => createRecord(user1, user2))) // better to nest your promises than to keep variables in your outer scope
    .then(record => console.log('record created'))
    .catch(err => console.log(err)); // errors are passed to here; every then in the chain until here gets skipped
Try it by changing findUser1 to
return Promise.reject('not found 1');
First, I would recommend using throw x; instead of return Promise.reject(x);, simply for readability reasons. Second, your error logging functions catch all the errors, which is why your promise chain continues. Try rethrowing the errors:
console.log(err);
throw err;
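Applied to the chain from the question, that advice could look roughly like this (a sketch using the question's own names; the per-step error handlers are gone, so one catch at the end sees every failure):

User.findByName('robfake').then((doc) => {
    if (!doc) throw new Error('user1 not found');
    fromU = doc;
    return User.findByName('bobbyfake');
}).then((doc) => {
    if (!doc) throw new Error('user2 not found');
    toU = doc;
    var record = new LedgerRecord({
        transactionDate: Date.now(),
        fromUser: fromU,
        toUser: toU,
    });
    return record.save();
}).then((doc) => {
    console.log('saved', doc);
}).catch((err) => {
    console.log(err); // any rejection in the chain ends up here and skips the later then blocks
});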
Don't put error logging everywhere without actually handling the error - if you pass an error handler callback you'll get back a promise that will fulfill with undefined, which is not what you want. Just use
User.findByName('robfake').then(fromUser => {
    if (fromUser) {
        return User.findByName('bobbyfake').then(toUser => {
            if (toUser) {
                var record = new LedgerRecord({
                    transactionDate: Date.now(),
                    fromUser,
                    toUser
                });
                return record.save()
            } else {
                console.log('user2 not found');
            }
        });
    } else {
        console.log('user1 not found');
    }
}).then(doc => {
    if (doc) {
        console.log('saved', doc);
    } else {
        console.log('saved nothing')
    }
}, err => {
    console.error("something really bad happened somewhere in the chain", err);
});
This will always log one of the "saved" or "something bad" messages, and possibly one of the "not found" messages before.
You can also use exceptions to achieve this, but it doesn't really get simpler:
var user1 = User.findByName('robfake').then(fromUser => {
    if (fromUser)
        return fromUser;
    else
        throw new Error('user1 not found');
});

var user2 = user1.then(() => // omit this if you want them to be searched in parallel
    User.findByName('bobbyfake').then(toUser => {
        if (toUser)
            return toUser;
        else
            throw new Error('user2 not found');
    })
);

Promise.all([user1, user2]).then(([fromUser, toUser]) => {
    var record = new LedgerRecord({
        transactionDate: Date.now(),
        fromUser,
        toUser
    });
    return record.save();
}).then(doc => {
    if (doc) {
        console.log('saved', doc);
    } else {
        console.log('saved nothing')
    }
}, err => {
    console.error(err.message);
});

Sequelize - update record, and return result

I am using sequelize with MySQL. For example if I do:
models.People.update({ OwnerId: peopleInfo.newuser },
    { where: { id: peopleInfo.scenario.id } })
    .then(function (result) {
        response(result).code(200);
    }).catch(function (err) {
        request.server.log(['error'], err.stack);
        response(err).code(200);
    });
I am not getting information back about whether the People model was successfully updated or not. The result variable is just an array with one element, 0=1.
How can I know for certain whether the record was updated or not?
Here's what I think you're looking for.
db.connections.update({
    user: data.username,
    chatroomID: data.chatroomID
}, {
    where: { socketID: socket.id },
    returning: true,
    plain: true
})
.then(function (result) {
    console.log(result);
    // result = [x] or [x, y]
    // [x] if you're not using Postgres
    // [x, y] if you are using Postgres
});
From Sequelize docs:
The promise returns an array with one or two elements. The first element x is always the number of affected rows, while the second element y is the actual affected rows (only supported in postgres with options.returning set to true.)
Assuming you are using Postgres, you can access the updated object with result[1].dataValues.
You must set returning: true option to tell Sequelize to return the object. And plain: true is just to return the object itself and not the other messy meta data that might not be useful.
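Putting that together, a minimal sketch of reading the updated row back out of the result (Postgres only; db.connections, data and socket are the same names used in the snippet above):

db.connections.update(
    { user: data.username, chatroomID: data.chatroomID },
    { where: { socketID: socket.id }, returning: true, plain: true }
).then(function (result) {
    // with returning: true and plain: true on Postgres, result[1] is the updated instance
    console.log(result[1].dataValues); // the updated record
});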
You can just find the item, update its properties, and then save it.
The save() results in an UPDATE query to the db:
const job = await Job.findOne({ where: { id, ownerId: req.user.id } });
if (!job) {
    throw Error(`Job not updated. id: ${id}`);
}
job.name = input.name;
job.payload = input.payload;
await job.save();
On Postgres:
Executing (default): UPDATE "jobs" SET "payload"=$1,"updatedAt"=$2 WHERE "id" = $3
The update function of Sequelize returns the number of affected rows (the first element of the result array).
You should call find to get the updated row:
models.People.update({ OwnerId: peopleInfo.newuser },
    { where: { id: peopleInfo.scenario.id } })
    .then(() => { return models.People.findById(peopleInfo.scenario.id) })
    .then((user) => response(user).code(200))
    .catch((err) => {
        request.server.log(['error'], err.stack);
    });
Finally I got it. returning: true won't work in MySQL; we have to use findByPk/findById to fetch the updated row instead. Hope this code helps.
return new Promise(function (resolve, reject) {
    User.update({
        subject: params.firstName, body: params.lastName, status: params.status
    }, {
        returning: true,
        where: { id: id }
    }).then(function () {
        let response = User.findById(params.userId);
        resolve(response);
    });
});
You can do the same thing with async/await, especially to avoid nested Promises.
You just need to create an async function :)
const asyncFunction = async function (req, res) {
    try {
        // update
        const updatePeople = await models.People.update({ OwnerId: peopleInfo.newuser },
            { where: { id: peopleInfo.scenario.id } })
        if (!updatePeople) throw ('Error while Updating');
        // fetch updated data
        const returnUpdatedPerson = await models.People.findById(peopleInfo.scenario.id)
        if (!returnUpdatedPerson) throw ('Error while Fetching Data');
        res(returnUpdatedPerson).code(200);
    } catch (error) {
        res.send(error)
    }
}
There is another way - use the static findByPk method and the non-static update method together. For example:
let person = await models.People.findByPk(peopleInfo.scenario.id);
if (!person) {
    // Here you can handle the case when a person is not found
    // For example, I return a "Not Found" message and a 404 status code
}
person = await person.update({ OwnerId: peopleInfo.newuser });
response(person).code(200);
Note this code must be inside an asynchronous function.
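For instance, a minimal wrapper could look like this (the handler name updateOwner is just an illustration; response is the same reply function used elsewhere in this question):

const updateOwner = async (peopleInfo, response) => {
    let person = await models.People.findByPk(peopleInfo.scenario.id);
    if (!person) {
        // person not found: reply with a 404 and stop
        return response('Not Found').code(404);
    }
    person = await person.update({ OwnerId: peopleInfo.newuser });
    return response(person).code(200);
};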
You can fetch the model to update first, and call set() then save() on it. Returning this object will give you the updated model.
Although this might not be the shortest way to do it, I prefer it because you can handle not found and update errors separately.
const instance = await Model.findOne({
    where: {
        'id': objectId
    }
});
if (instance && instance.dataValues) {
    instance.set('name', objectName);
    return await instance.save(); // promise rejection (primary key violation…) might be thrown here
} else {
    throw new Error(`No Model was found for the id ${objectId}`);
}
If you're using Postgres and updating one row:
try {
    const result = await MODELNAME.update(req.body, {
        where: { id: req.params.id },
        returning: true
    });
    if (!result) HANDLEERROR()
    const data = result[1][0].get();
    res.status(200).json({ success: true, data });
} catch (error) {
    HANDLEERROR()
}
