A little preface: I am very new to working with Node, so please bear with my ignorance.
I am trying to take some info from an array in Node.js and check whether it exists in a MongoDB document. I am still struggling to wrap my head around Node and how to work with databases asynchronously.
I have the following code:
for (i in articleTitle) {
    console.log(articleTitle[i]);
    // Use connect method to connect to the Server
    MongoClient.connect(mongoUrl, function(err, db) {
        if (err) throw err; // Throw error
        var query = { title: articleTitle[i] }; // Query Parameter
        // Perform Query
        db.collection(mongoCollection).find(query).toArray(function(err, result) {
            if (err) throw err; // Throw error
            if (result == '') {
                console.log('No results found for title:', articleTitle[i]);
            } else {
                console.log('Found an entry');
            }
            db.close(); // Close connection
        });
    });
}
In the above code I have an array of strings called articleTitle (for example: ['Title1', 'Title2', 'Title3']). I then run through each of the titles in the array (using the for loop) to check whether each title exists in the database.
The output I get is as follows:
> Title1
> Title2
> Title3
> No results found for title: Title3
> No results found for title: Title3
> No results found for title: Title3
As is evident above, it seems to be checking for the last object in the array three times. I have also tried to implement the async package but struggled with that as well.
Any help would be appreciated.
The issue is the scope of the variable i in the callback function.
Use for (let i in articleTitle) instead.
let creates a new variable i for every iteration, with its scope restricted to that iteration, so each callback sees the value from the iteration that created it.
The answers to JavaScript closure inside loops – simple practical example explain in detail why this happens, and cover scope and closures in JavaScript generally; your question is effectively a duplicate of that one.
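The difference can be demonstrated without a database. In this sketch (the array contents are borrowed from the question), setImmediate stands in for the asynchronous MongoDB calls:

```javascript
// With `var` (or an undeclared loop variable) every callback shares one
// binding of i and sees its final value; with `let` each iteration gets
// its own binding.
var titles = ['Title1', 'Title2', 'Title3'];

var withVar = [];
for (var i in titles) {
  setImmediate(function () { withVar.push(titles[i]); });
}

var withLet = [];
for (let j in titles) {
  setImmediate(function () { withLet.push(titles[j]); });
}

setImmediate(function () {
  console.log(withVar); // [ 'Title3', 'Title3', 'Title3' ]
  console.log(withLet); // [ 'Title1', 'Title2', 'Title3' ]
});
```

The `var` version reproduces the question's output exactly: by the time the callbacks run, the single shared i already holds its final value.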
Related
I'm new to NodeJS and Mongoose and this might be a duplicate question, so please don't reply negatively. I tried to find a solution but failed to fix this error.
Basically, I get this error when I try to update values in the database. The first update I perform works perfectly, but from the second one onward I get this error. The values do update, but several fields are updated at once, giving this error.
Here's my code: (and I have added my github link to the project):
User.js (model)
local: {
...,
education: [{
id: String,
degree: String,
institute: String,
dates: String,
description: String
}],
...
}
router.js
app.put('/put_education/:id', function(req, res) {
    var row_id = req.params.id; // get education id from table row
    // get the currently logged in user
    User.findById(req.user._id, function(err, doc) {
        if (err) {
            console.log('no entry found');
        }
        // match the table row id with the id in the user's education model
        doc.local.education.forEach(function(education, index) {
            console.log(education.id + " " + row_id);
            // if the ids match, replace the database document with values from the client
            if (education.id === row_id) {
                doc.local.education[index] = req.body;
                doc.save(function(err) {
                    if (err) {
                        console.log(err);
                    } else {
                        res.send("Success");
                    }
                });
            }
        });
    });
});
I added a console.log to see how the loop operates; the image below shows it's fine for the first iteration but acts strangely for the next ones:
I was thinking of breaking out of the loop after the ids match, but forEach doesn't support break. I changed it to a normal for loop and it still gives me the same error, so I don't think breaking the loop is the answer.
Edit: Added image of my website to show what happens after updating (duplicates rows)
Github: https://github.coventry.ac.uk/salmanfazal01/304CEM-Back-End
If breaking out of the iteration is your issue, why not use a simple for loop with a break statement?
let eduArray = doc.local.education;
for (let i = 0; i < eduArray.length; i++) {
    if (eduArray[i]["id"] == row_id) {
        // do your logic
        break;
    }
}
Because the first update of the education works with no problem and the problem only appears afterwards, I suspect that you are updating the education wrongly. Look at this line of code you wrote:
doc.local.education[index] = req.body;
Here you are assigning whatever is inside the request body to the education entry, but have you checked what data is actually in req.body?
Try logging req.body to see what you are actually assigning to the education entry.
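A sketch of an alternative (the helper name updateEducation and the callback shape are mine, not from the question): find the matching row, update only its fields, and call save() exactly once, outside the loop. If the error is mongoose's ParallelSaveError, it comes from calling save() more than once on the same document before the first call finishes, which the forEach above can do.

```javascript
// Assumes a document shaped like the question's User model:
// doc.local.education is an array of rows with an `id` field, and
// doc.save(cb) persists the whole document.
function updateEducation(doc, row_id, body, done) {
  var entry = doc.local.education.find(function (education) {
    return education.id === row_id;
  });
  if (!entry) return done(new Error('no matching education row'));

  Object.assign(entry, body); // update the one row's fields in place,
                              // instead of doc.local.education[index] = req.body
  doc.save(done);             // a single save, outside any loop
}
```

On a real mongoose document the same property assignments go through the schema's setters, so only the matched subdocument changes.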
I'm having problems when I insert several records using promises; sometimes it works but other times it gives me this error:
And my code is this:
return Promise.all([
    Promise.all(createBistamp),
    Promise.all(createSlstamp),
    listOfResults,
    i
]).then(function(listOfResults2) {
    for (var j = 0; j < resultArticle.length; j++) {
        if (arm === 'Arm-1') {
        }
        if (arm === 'Arm-1-11') {
        }
    }
    if (arm === 'Arm-1') {
        console.log("PROMISE ARM-1");
        return Promise.all([insertBi, insertBi2, insertSl]).then(function(insertEnd) {
            res.send("true");
        }).catch(function(err) {
            console.log(err);
        });
    }
    if (arm === 'Arm-1-11') {
        console.log("PROMISE ARM-1-11");
        return Promise.all([insertBi, insertBi2, insertSl, insertSlSaida]).then(function(insertEnd) {
            res.send("true");
        }).catch(function(err) {
            console.log(err);
        });
    }
}).catch(function(err) {
    console.log(err);
});
I removed the code inside the ifs and the for loop, but it still inserted into the database.
Example of insert:
var insertBi2 = request.query("INSERT INTO bi2 (bi2stamp,alvstamp1,identificacao1,szzstamp1,zona1,bostamp,ousrinis,ousrdata,ousrhora,usrinis,usrdata,usrhora) " +
    "VALUES ('" + bistamp + "','AB16083056009,454383576','2','Adm13010764745,450449475','1','" + bostamp + "','WWW','" + data + "','" + time + "','WWW','" + data + "','" + time + "')");
Full Code:
http://pastebin.com/DTjtXvDt
This is my structure, and I don't know if I'm working with promises correctly.
Thank you
I have also faced this problem recently.
error: RequestError: Transaction (Process ID 72) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Solution -
There was not a single index on the table, so I created a non-clustered unique index on the unique identifier column.
I was surprised when this solution worked.
There was a single update operation in the code and no select operation, so it made me curious enough to do some research. I came across the lock granularity mechanism for locking resources; in my case, locking had to happen at row level instead of page level.
Note:
For clustered tables, the data pages are stored at the leaf level of the (clustered) index structure and are therefore locked with index key locks instead of row locks.
Further Reading
https://www.sqlshack.com/locking-sql-server/
If you are inserting or updating data in a loop, it's better to build all the queries in the loop, store them, and then execute them all at once in a single transaction. You will save yourself a lot of issues.
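As a sketch of that batching idea (buildBatch is a hypothetical helper; the table and first two column names are borrowed from the question's insert), collect the statements in the loop and send them as one batch wrapped in a single transaction:

```javascript
// Builds one T-SQL batch from many rows. NOTE: real code should use
// parameterized queries rather than string concatenation, both for
// correctness and to avoid SQL injection.
function buildBatch(rows) {
  var statements = rows.map(function (r) {
    return "INSERT INTO bi2 (bi2stamp, bostamp) " +
           "VALUES ('" + r.bistamp + "', '" + r.bostamp + "')";
  });
  return 'BEGIN TRANSACTION;\n' + statements.join(';\n') + ';\nCOMMIT;';
}
```

The resulting string can then be sent with a single request.query(...) call, giving one round trip and one transaction instead of many concurrent ones contending for locks.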
Here's the relevant code:
var Results = mongoose.model('Results', resultsSchema);
var results_array = [];
_.each(matches, function(match) {
    var results = new Results({
        id: match.match_id,
        ... // more attributes
    });
    results_array.push(results);
});
callback(results_array);
});
}
], function(results_array) {
    results_array.insert(function(err) {
        // error handling
Naturally, I get a "no method found" error for results_array. However, I'm not sure what else to call the method on.
In other functions I'm passing through the equivalent of the results variable here, which is a mongoose object and has the insert method available.
How can I insert an array of documents here?
** Edit **
function(results_array) {
    async.eachLimit(results_array, 20, function(result, callback) {
        result.save(function(err) {
            callback(err);
        });
    }, function(err) {
        if (err) {
            if (err.code == 11000) {
                return res.status(409);
            }
            return next(err);
        }
        res.status(200).end();
    });
});
So what's happening:
When I clear the collection, this works fine.
However when I resend this request I never get a response.
This is happening because I have my schema to not allow duplicates that are coming in from the JSON response. So when I resend the request, it gets the same data as the first request, and thus responds with an error. This is what I believe status code 409 deals with.
Is there a typo somewhere in my implementation?
Edit 2
Error code coming out:
{ [MongoError: insertDocument :: caused by :: 11000 E11000 duplicate key error index:
test.results.$_id_ dup key: { : 1931559 }]
name: 'MongoError',
code: 11000,
err: 'insertDocument :: caused by :: 11000 E11000 duplicate key error index:
test.results.$_id_ dup key: { : 1931559 }' }
So this is as expected.
Mongo is responding with a 11000 error, complaining that this is a duplicate key.
Edit 3
if (err.code == 11000) {
    return res.status(409).end();
}
This seems to have fixed the problem. Is this a band-aid fix though?
You seem to be trying to insert various documents at once here, so you actually have a few options.
Firstly, there is no .insert() method in mongoose, as it is replaced by other wrappers such as .save() and .create(). The most basic approach is to just call "save" on each document you have created, also employing the async library to implement some flow control so everything doesn't just queue up:
async.eachLimit(results_array, 20, function(result, callback) {
    result.save(function(err) {
        callback(err);
    });
}, function(err) {
    // process when complete or on error
});
Another thing here is that .create() can just take a list of objects as its arguments and simply inserts each one as the document is created:
Results.create(results_array,function(err) {
});
That would actually be with "raw" objects, though, as they are essentially all cast as a mongoose document first. You can ask for the documents back as additional arguments in the callback signature, but constructing that is likely overkill.
Either way these shake out, the "async" form will process those in parallel and the "create" form will run in sequence, but they are both effectively issuing one "insert" to the database for each document that is created.
For true Bulk functionality you presently need to address the underlying driver methods, and the best place is with the Bulk Operations API:
mongoose.connection.on("open", function(err, conn) {

    var bulk = Results.collection.initializeUnorderedBulkOp();
    var count = 0;

    async.eachSeries(results_array, function(result, callback) {
        bulk.insert(result);
        count++;

        if (count % 1000 == 0) {
            bulk.execute(function(err, response) {
                // maybe check response
                bulk = Results.collection.initializeUnorderedBulkOp();
                callback(err);
            });
        } else {
            callback();
        }
    }, function(err) {
        // called when done
        // Check if there are still writes queued
        if (count % 1000 != 0)
            bulk.execute(function(err, response) {
                // maybe check response
            });
    });

});
Again the array here is raw objects rather than those cast as a mongoose document. There is no validation or other mongoose schema logic implemented here as this is just a basic driver method and does not know about such things.
While the array is processed in series, the above shows that a write operation will only actually be sent to the server once every 1000 entries processed or when the end is reached. So this truly does send everything to the server at once.
Unordered operations means that the err would normally not be set but rather the "response" document would contain any errors that might have occurred. If you want this to fail on the first error then it would be .initializeOrderedBulkOp() instead.
The care to take here is that you must be sure a connection is open before accessing these methods in this way. Mongoose looks after the connection with its own methods, so where a method such as .save() is reached in your code before the actual connection is made to the database, it is "queued" in a sense, awaiting this event.
So either make sure that some other "mongoose" operation has completed first or otherwise ensure that your application logic works within such a case where the connection is sure to be made. Simulated in this example by placing within the "connection open" event.
It depends on what you really want to do. Each case has its uses, with the last of course being the fastest possible way to do this, as there are limited "write" and "return result" conversations going back and forth with the server.
So I'm trying to enter data into a mongodb collection with node. As far as I can tell I have access to the collection.
var collection = db.collection("whatsGoingOnEvents");
if (collection) {
    console.log("hitting stream");
    var stream = collection.find({
        time: parsedObject.time,
        endTime: parsedObject.endTime,
        lon: parsedObject.lon,
        lat: parsedObject.lat
    }).stream();
    console.log(stream);
    stream.on("data", function(data) {
        console.log("data");
        console.log(data);
        if (!data) {
            collection.insert(parsedObject);
            console.log("hitting insert");
        }
    });
    stream.on("end", function() {
        // do something
    });
}
parsedObject may or may not have all of those fields; should that matter? I thought that if a field was not there, collection.find() would just look for time to be "undefined", which is still technically a value.
I never hit console.log("data"), so I never insert documents. I've been trying to follow this link,
and am not sure why the insert is not happening. I know that nothing is being added from db.collection.stats(), which tells me the size of the collection is 0.
Oh also, this is what I'm using to connect to Mongo-
var mongo = require('mongodb').MongoClient;
EDIT--
I tried the answer below - that resulted in this error-
lib/mongodb/connection/server.js:481
throw err;
^
Error: Cannot use a writeConcern without a provided callback
at insertAll (/Users/psanker/Google Drive/Coding/Javascript/WhatsGoingOn/node_modules/mongodb/lib/mongodb/collection.js:332:11)
at Collection.insert (/Users/psanker/Google Drive/Coding/Javascript/WhatsGoingOn/node_modules/mongodb/lib/mongodb/collection.js:91:3)
^ The above occurred because I hadn't added a callback to the insert.
If your query doesn't match any records (which would seem logical, given that you write that the collection size is 0), the data event handler will never get called (because that will only be called when there's an actual result).
I think you're better off with using findOne and a regular callback:
collection.findOne({ params }, function(err, result) {
    if (err)
        throw err;
    if (result === null) {
        collection.insert(parsedObject, { w: 0 });
    }
});
Or even an upsert.
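A sketch of that upsert (wrapped in a hypothetical upsertEvent helper so the collection is passed in; parsedObject and the query fields come from the question): $setOnInsert writes the document only when no match exists, which removes the find-then-insert race entirely.

```javascript
// With the classic node-mongodb driver, update() with { upsert: true }
// inserts the document when the query matches nothing. Passing a
// callback also avoids the "Cannot use a writeConcern without a
// provided callback" error seen above.
function upsertEvent(collection, parsedObject, done) {
  collection.update(
    { time: parsedObject.time, endTime: parsedObject.endTime,
      lon: parsedObject.lon, lat: parsedObject.lat },
    { $setOnInsert: parsedObject },  // only written on insert
    { upsert: true },
    done
  );
}
```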
[edit]: I updated the previous question with this, using a very specific reproducible example.
This is my entire program.
I create two schemas, ASchema and BSchema, with collections A and B respectively, make two objects a and b, and attempt to save them sequentially; that is, first a, then b.
var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

mongoose.connect('mongodb://localhost/test');

var ASchema = new Schema({
    text: String
});

var BSchema = new Schema({
    val: Number
});

var A = mongoose.model('A', ASchema);
var B = mongoose.model('B', BSchema);

var a = new A({text: 'this is a'});
var b = new B({val: 5});

a.save(function(err) {
    if (err) console.log(err.message);
    else {
        console.log('saved a : ', a);
        b.save(function(err) {
            if (err) console.log(err.message);
            else {
                console.log('saved b : ', b);
            }
        });
    }
});

mongoose.disconnect();
What I expect to happen: it should print saved a : followed by the document a, and then saved b : followed by the document b.
What actually happens: it prints saved a : { text: 'this is a', _id: 4ee4cab00d2c35fc04000001 } and nothing more. The program does not stop either; it stays 'stuck'.
Looking through the mongo shell, I find that a collection as (a pluralized by mongoose; that is okay) has been created, and I can see the saved document in it with db.as.find(). However, I cannot find a collection bs.
Tweak: in the saving code, swapping the places of a and b (the order of saving) causes b to be saved and a not to be saved, so the problem is not specific to a or b.
Question: Why does it not save the next document?
The answer is very simple; look at your last line:
mongoose.disconnect();
You shouldn't be doing that, since there are still queries that need to be processed and you don't know when they finish (in our case it's the second query). So the first query gets executed, Mongoose disconnects, and that hangs the second query.
Solution
Delete the last line, or put mongoose.disconnect(); after the last query has executed.
mongoose.disconnect(); // Do not do this here, as async methods keep running in the background.
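The ordering problem can be sketched without a database (save here is a plain async stand-in for mongoose's save(), and the final log entry stands in for mongoose.disconnect()): the cleanup call must sit inside the last callback, not on the line after the chain is started.

```javascript
// Each save() completes asynchronously, like mongoose's save(). Putting
// the "disconnect" on the line after save('a', ...) would run it before
// either save finished, which is exactly the bug in the question.
function save(name, log, cb) {
  setImmediate(function () {
    log.push('saved ' + name);
    cb(null);
  });
}

var log = [];
save('a', log, function () {
  save('b', log, function () {
    log.push('disconnected'); // safe: both saves have completed
  });
});
```

In the question's program, the same shape means calling mongoose.disconnect() inside the callback of the final b.save().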
Don't open and close a connection for every query.
Open the connection once, and re-use it.
I prefer the following way:
MongoDBconObj.open(function(err, db) {
    // re-use db
    // All your application code goes here...
    // ...
    http.createServer(onRequest).listen(8080);
    console.log("HTTP is listening on 127.0.0.1:8080");
});
See the link for best practice
https://groups.google.com/forum/#!topic/node-mongodb-native/5cPt84TUsVg