Entering data with MongoDb and Node.js - javascript

So I'm trying to enter data into a mongodb collection with node. As far as I can tell I have access to the collection.
var collection = db.collection("whatsGoingOnEvents");
if (collection) {
    console.log("hitting stream");
    var stream = collection.find({
        time: parsedObject.time,
        endTime: parsedObject.endTime,
        lon: parsedObject.lon,
        lat: parsedObject.lat
    }).stream();
    console.log(stream);
    stream.on("data", function(data) {
        console.log("data");
        console.log(data);
        if (!data) {
            collection.insert(parsedObject);
            console.log("hitting insert");
        }
    });
    stream.on("end", function() {
        // do something
    });
}
parsedObject may or may not have all of those fields - should it matter? I thought if the field was not there then collection.find() is just looking for time to be "undefined", which is still technically a value.
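Whether that matters depends on how the driver serializes undefined values; one way to sidestep the question entirely is to build the query only from the fields that are actually set. A minimal sketch (the buildQuery helper name is mine; plain Node, no database needed):

```javascript
// Hypothetical helper: copy only the fields that are actually
// set on the parsed object into the find() query.
function buildQuery(obj, fields) {
    var query = {};
    fields.forEach(function (field) {
        if (obj[field] !== undefined) {
            query[field] = obj[field];
        }
    });
    return query;
}

// lon and lat are missing here, so they are left out of the query
// instead of being matched against undefined.
var parsed = { time: 1000, endTime: 2000 };
var query = buildQuery(parsed, ['time', 'endTime', 'lon', 'lat']);
// query is { time: 1000, endTime: 2000 }
```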
I never hit console.log("data"), so I never insert documents. I've been trying to follow this link, and so am not sure why the insert is not happening. I know that nothing is being added from db.collection.stats(), which tells me the size of the collection is 0.
Oh also, this is what I'm using to connect to Mongo-
var mongo = require('mongodb').MongoClient;
EDIT--
I tried the answer below - that resulted in this error-
lib/mongodb/connection/server.js:481
throw err;
^
Error: Cannot use a writeConcern without a provided callback
at insertAll (/Users/psanker/Google Drive/Coding/Javascript/WhatsGoingOn/node_modules/mongodb/lib/mongodb/collection.js:332:11)
at Collection.insert (/Users/psanker/Google Drive/Coding/Javascript/WhatsGoingOn/node_modules/mongodb/lib/mongodb/collection.js:91:3)
^The above occurred because I hadn't added a callback to the insert.

If your query doesn't match any records (which would seem logical, given that you write that the collection size is 0), the data event handler will never get called (because that will only be called when there's an actual result).
I think you're better off with using findOne and a regular callback:
collection.findOne({ params }, function(err, result) {
    if (err)
        throw err;
    if (result === null) {
        collection.insert(parsedObject, { w: 0 });
    }
});
Or even an upsert.
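A sketch of what that upsert could look like, assuming MongoDB 2.4+ for $setOnInsert (the upsertEvent wrapper and its selector fields are my own invention, not part of the original code):

```javascript
// Hypothetical wrapper: a single update with upsert: true replaces
// the find-then-insert dance. If a document matching the selector
// already exists nothing changes; otherwise parsedObject is inserted.
function upsertEvent(collection, parsedObject, done) {
    var selector = {
        time: parsedObject.time,
        endTime: parsedObject.endTime,
        lon: parsedObject.lon,
        lat: parsedObject.lat
    };
    collection.update(selector, { $setOnInsert: parsedObject }, { upsert: true }, done);
}
```

This also avoids the race between the find and the insert when two identical events arrive at once.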

Related

Stuck on Node + MongoDB asynchronous query issue

A little preface: I am very new to working with Node, so please bear with my ignorance.
I am trying to pass some info from an array in Node.js, and check whether it exists in a MongoDB document. I am still struggling to wrap my head around Node and how to work with databases asynchronously.
I have the following code
for (i in articleTitle) {
    console.log(articleTitle[i]);
    // Use connect method to connect to the server
    MongoClient.connect(mongoUrl, function(err, db) {
        if (err) throw err; // Throw error
        var query = { title: articleTitle[i] }; // Query parameter
        // Perform query
        db.collection(mongoCollection).find(query).toArray(function(err, result) {
            if (err) throw err; // Throw error
            if (result == '') {
                console.log('No results found for title:', articleTitle[i]);
            } else {
                console.log('Found an entry');
            }
            db.close(); // Close connection
        });
    });
}
In the above code, I have an array of strings called articleTitle (for example: ['Title1', 'Title2', 'Title3']), I then run through each of those titles in the array (using the for() loop) to check if each title exists in the database.
The output I get is as follows:
> Title1
> Title2
> Title3
> No results found for title: Title 3
> No results found for title: Title 3
> No results found for title: Title 3
As is evident above, it seems to be checking for the last object in the array three times. I have also tried to implement the async package but struggled with that as well.
Any help would be appreciated.
The issue is the scope of the variable i inside the callback function.
Use for (let i in articleTitle) instead.
This creates a new variable i for every iteration and scope is restricted to that iteration.
The answers to the question JavaScript closure inside loops – simple practical example explain in detail why this happens, and cover scope and closures in JavaScript generally; your question is essentially a duplicate of it.
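A standalone illustration of the difference, runnable with plain Node (no MongoDB involved):

```javascript
// With var there is a single shared i: by the time the callbacks
// run, the loop has finished and every one of them sees 3.
var varFns = [];
for (var i = 0; i < 3; i++) {
    varFns.push(function () { return i; });
}
var varResults = varFns.map(function (f) { return f(); }); // [3, 3, 3]

// With let each iteration gets its own binding, so every callback
// remembers the value it was created with.
var letFns = [];
for (let j = 0; j < 3; j++) {
    letFns.push(function () { return j; });
}
var letResults = letFns.map(function (f) { return f(); }); // [0, 1, 2]
```

The same thing happens with the MongoClient.connect and toArray callbacks in your loop: they all fire after the for loop has already run to completion.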

MongoError: cursor killed or timed out - Meteor timeout settings ineffective

My Meteor 1.2.1 program threw MongoError: cursor killed or timed out in a find().forEach() loop, so I found this page that says this code prevents that:
var myCursor = db.users.find().noCursorTimeout()
However, the driver docs and my Meteor say that method doesn't exist: Object [object Object] has no method 'noCursorTimeout'
Mongo autoReconnect is enabled by default and didn't help, nor did the Meteor forum, or even .find({}, {timeout:false}) according to this comment.
2016-07-20 11:21:37 Update started
2016-07-20 11:37:21 Exception while invoking method 'updateCollections' MongoError: cursor killed or timed out
Maybe Meteor got confused by the failed SOAP call at 2016-07-20 09:34:57?
"error": {
    "errno": "ETIMEDOUT",
    "syscall": "connect",
    "code": "ETIMEDOUT"
},
Assuming maxTimeMS would help in this case, you can access it by working with the rawCollection object instead of the Meteor collection itself.
It's quite simple:
var rawCollection = Meteor.users.rawCollection();
var cursor = rawCollection.find({}).maxTimeMS(5000);
var myData = fetchCursor(cursor);
Where fetchCursor is a simple fiber-aware helper function that can be implemented like this:
var fetchCursor = Meteor.wrapAsync(function fetchCursor(cursor, cb) {
    cursor.toArray(cb);
});
Though, I am not sure if this method is exactly what you're looking for.
Edit
If you don't need the entire array of documents but you want to process each one of them independently it may be better to use each instead of toArray, e.g.
var fetchCursor = Meteor.wrapAsync(function fetchCursor(cursor, cb) {
    cursor.each(function (err, doc) {
        if (err) return cb(err);
        if (!doc) return cb(null, { done: true }); // no more documents
        // do something with the document ...
    });
});

Simple MongoDB query find item age > 10 learnyoumongo find function

I'm going through learnyoumongo and I'm stuck on part 3. A test database full of parrots is included in the challenge, and the goal is to select the parrots whose age is greater than the input. I'm getting a weird error, and Google is full of Mongo 2.x solutions to not-quite-the-same problems; I'm using Mongo 3.0.
This is the JavaScript code:
var mongo = require('mongodb').MongoClient;
var parsedInput = parseInt(process.argv[2]);
var results;

mongo.connect('mongodb://localhost:27017/learnyoumongo', function(err, db) {
    // find if a value exists
    results = db.collection('parrots').find({ age: { $gt: parsedInput } }).toArray(function(err, doc) {
        if (doc) { // if it does
            console.log(doc);
        } else {
            console.log(err);
        }
    });
    //console.log(results);
    db.close();
});
This is the weird error message:
PS C:\git\learnyoumongo> node .\test.js { [MongoError: server localhost:27017 sockets closed]
name: 'MongoError',
message: 'server localhost:27017 sockets closed' }
I tried restarting mongo, but I'm still not able to pull any of the 'parrots' data out, even with just find({}).
The problem was two-pronged. The main issue was that I expected to be able to run the query with node test.js and see the results from the parrots collection. But learnyoumongo's tests are atomic, meaning they clear the database entirely before and after each run, so the only way to test was learnyoumongo test.js; I kept getting an empty result set running the node command.
The other issue was with db.close(): you can't just call db.open and then db.close, because open is asynchronous and the connection would close right after opening, hence the "sockets closed" error. So you put db.close in the toArray callback, or in any other callback of db.open.
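A sketch of the corrected shape (the listParrots wrapper is my name for it): db.close() moves inside the toArray callback, so the socket stays open until the results have actually arrived.

```javascript
// Hypothetical wrapper around the query: close the connection only
// once toArray has delivered the documents (or an error).
function listParrots(db, minAge, done) {
    db.collection('parrots')
        .find({ age: { $gt: minAge } })
        .toArray(function (err, docs) {
            db.close(); // safe here: the query has finished
            done(err, docs);
        });
}
```

With the original placement, db.close() ran as soon as connect's callback returned, tearing the socket down while the query was still in flight.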

Express.js collection.find() return Object

I want to display every document stored in my MongoDB. I tried the following code, which simply calls collection.find() and displays the result through res.send():
router.get('/index', function(req, res) {
    var db = req.db;
    var collection = db.get('usercollection');
    var display = util.inspect(collection.find());
    res.send(display);
});
I expected it to display the actual document stored in mongodb. But instead, it displayed this object format:
{cold:{manager:{driver:[Object], helper:[Object], collection:[Object].....
Is there any other steps needed to display raw mongodb document?
If the library you are using is the official 10gen library, then you can't simply output collection.find() without unwinding it. The easiest way to do that for smaller datasets is
collection.find().toArray(function(err, results) {
    if (err) {
        // do something error-y
    } else {
        res.send(results);
    }
});
If you post more of your code, and tag your question with the libraries you are using, you'll be able to get more targeted help. If the library you are using returns a promise, this is probably how you'd unwind it:
collection.find().then(function(results) {
    res.send(results);
}).catch(function(err) {
    console.error(err);
});

Insert an array of documents into a model

Here's the relevant code:
var Results = mongoose.model('Results', resultsSchema);

var results_array = [];
_.each(matches, function(match) {
    var results = new Results({
        id: match.match_id,
        ... // more attributes
    });
    results_array.push(results);
});
callback(results_array);
});
}
], function(results_array) {
    results_array.insert(function(err) {
        // error handling
Naturally, I get a "No method found" error for results_array. However, I'm not sure what else to call the method on.
In other functions I'm passing through the equivalent of the results variable here, which is a mongoose object and has the insert method available.
How can I insert an array of documents here?
** Edit **
function(results_array) {
    async.eachLimit(results_array, 20, function(result, callback) {
        result.save(function(err) {
            callback(err);
        });
    }, function(err) {
        if (err) {
            if (err.code == 11000) {
                return res.status(409);
            }
            return next(err);
        }
        res.status(200).end();
    });
});
So what's happening:
When I clear the collection, this works fine.
However when I resend this request I never get a response.
This is happening because I have my schema to not allow duplicates that are coming in from the JSON response. So when I resend the request, it gets the same data as the first request, and thus responds with an error. This is what I believe status code 409 deals with.
Is there a typo somewhere in my implementation?
Edit 2
Error code coming out:
{ [MongoError: insertDocument :: caused by :: 11000 E11000 duplicate key error index:
test.results.$_id_ dup key: { : 1931559 }]
name: 'MongoError',
code: 11000,
err: 'insertDocument :: caused by :: 11000 E11000 duplicate key error index:
test.results.$_id_ dup key: { : 1931559 }' }
So this is as expected.
Mongo is responding with a 11000 error, complaining that this is a duplicate key.
Edit 3
if (err.code == 11000) {
    return res.status(409).end();
}
This seems to have fixed the problem. Is this a band-aid fix though?
You seem to be trying to insert various documents at once here. So you actually have a few options.
Firstly, there is no .insert() method in mongoose as this is replaced with other wrappers such as .save() and .create(). The most basic process here is to just call "save" on each document you have just created. Also employing the async library here to implement some flow control so everything just doesn't queue up:
async.eachLimit(results_array, 20, function(result, callback) {
    result.save(function(err) {
        callback(err);
    });
}, function(err) {
    // process when complete or on error
});
Another thing here is that .create() can just take a list of objects as its arguments and simply inserts each one as a document is created:
Results.create(results_array, function(err) {
    // handle error or continue
});
That would actually be with "raw" objects though, as they are essentially all cast as mongoose documents first. You can ask for the created documents back as additional arguments in the callback signature, but constructing that is likely overkill.
Either way, the "async" form will process those in parallel and the "create" form will run in sequence, but both are effectively issuing one "insert" to the database for each document created.
For true Bulk functionality you presently need to address the underlying driver methods, and the best place is with the Bulk Operations API:
mongoose.connection.on("open", function(err, conn) {

    var bulk = Results.collection.initializeUnorderedBulkOp();
    var count = 0;

    async.eachSeries(results_array, function(result, callback) {
        bulk.insert(result);
        count++;

        if (count % 1000 == 0) {
            bulk.execute(function(err, response) {
                // maybe check response
                bulk = Results.collection.initializeUnorderedBulkOp();
                callback(err);
            });
        } else {
            callback();
        }
    }, function(err) {
        // called when done
        // Check if there are still writes queued
        if (count % 1000 != 0) {
            bulk.execute(function(err, response) {
                // maybe check response
            });
        }
    });
});
Again the array here is raw objects rather than those cast as a mongoose document. There is no validation or other mongoose schema logic implemented here as this is just a basic driver method and does not know about such things.
While the array is processed in series, the above shows that a write operation is only actually sent to the server once for every 1000 entries processed, or when the end is reached. So each batch truly goes to the server as a single round trip.
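The modulo bookkeeping is the part that is easiest to get wrong, so here is the same counter logic isolated into a plain function (no database involved; the batchSizes name is mine):

```javascript
// Hypothetical helper mirroring the batching above: flush a batch
// every time the counter reaches a multiple of batchSize, then
// flush whatever remains at the end.
function batchSizes(totalDocs, batchSize) {
    var flushes = [];
    var count = 0;
    for (var i = 0; i < totalDocs; i++) {
        count++;
        if (count % batchSize === 0) {
            flushes.push(batchSize);
        }
    }
    // same check as the final bulk.execute() above
    if (count % batchSize !== 0) {
        flushes.push(count % batchSize);
    }
    return flushes;
}

// 2500 documents in batches of 1000: two full batches plus a
// trailing flush of 500.
// batchSizes(2500, 1000) is [1000, 1000, 500]
```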
With unordered operations, err will normally not be set; instead the "response" document will contain any errors that might have occurred. If you want this to fail on the first error, use .initializeOrderedBulkOp() instead.
The care to take here is that you must be sure a connection is open before accessing these methods in this way. Mongoose looks after the connection with its own methods, so where a method such as .save() is reached in your code before the actual connection is made to the database, it is "queued" in a sense, awaiting this event.
So either make sure that some other mongoose operation has completed first, or otherwise ensure that your application logic only runs once the connection is sure to be made, simulated in this example by placing the code within the "connection open" event.
It depends on what you really want to do. Each case has its uses, with the last of course being the fastest possible way to do this, as there are limited "write" and "return result" conversations going back and forth with the server.
