I'm using jQuery UI's sortable function (source) to re-arrange elements. I've built custom callbacks that create a list of those elements, so when I move an element, every element is given a new position id. It could look like this:
[{
id_of_element_in_database: 12,
new_position: 0
}, {
id_of_element_in_database: 16,
new_position: 1
}, {
id_of_element_in_database: 14,
new_position: 2
}]
I'm sending this list to my back-end by doing a simple Ajax post
$.post('/position', { data: list });
Route
router.post('/position', (req, res) => {
console.log(req.body.data); // This prints the array of objects above.
});
Schema
mongoose.Schema({
id: Number,
position: Number,
...
});
Now I can't figure out how to change the position of all documents efficiently. Looping over the array and firing a separate database request for each element can't be the best approach.
I've tried that below and it feels wrong:
for (let i in req.body.data) {
    collection.update({ id: req.body.data[i].id }, { position: req.body.data[i].position });
}
There must be something else I can do to achieve this. I've tried Google without any luck.
You could try the bulkWrite API to carry out all the updates in a single request instead of one request per element:
var callback = function(err, r){
console.log(r.matchedCount);
console.log(r.modifiedCount);
}
// Initialise the bulk operations array
var ops = req.body.data.map(function (item) {
return {
"updateOne": {
"filter": {
"id": parseInt(item.id),
"position": { "$ne": parseInt(item.position) }
},
"update": { "$set": { "position": parseInt(item.position) } }
}
}
});
// Get the underlying collection via the native node.js driver collection object
Model.collection.bulkWrite(ops, callback);
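For completeness, here is roughly how that could be wired into the Express route from the question, including a response back to the client. This is only a sketch; the shape of the JSON response is my assumption, so adjust it to whatever your front-end expects:

// Sketch: build the bulk operations from the posted list and reply with the result counts.
router.post('/position', (req, res) => {
    const ops = req.body.data.map((item) => ({
        updateOne: {
            filter: { id: parseInt(item.id, 10) },
            update: { $set: { position: parseInt(item.position, 10) } }
        }
    }));

    Model.collection.bulkWrite(ops)
        .then((r) => res.json({ matched: r.matchedCount, modified: r.modifiedCount }))
        .catch((err) => res.status(500).json({ error: err.message }));
});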
Here's how I did it with Node.js — queue every update inside the loop, then execute the bulk operation once after the loop finishes:
var forEach = require('async-foreach').forEach;
var bulk = Things.collection.initializeOrderedBulkOp();
Things.find({}).lean().exec().then(function (resp) {
    forEach(resp, function (template, index, arr) {
        var done = this.async();
        // Queue one update per document; generateShortId() is a helper from my own code.
        bulk.find({ '_id': template._id }).update({ $set: { _sid: generateShortId() } });
        done();
    }, function (notAborted, arr) {
        // A bulk operation can only be executed once, so run it after all updates are queued.
        bulk.execute(function (error) {
            res.json({ done: true, arr: arr.length });
        });
    });
});
Related
I currently have the code below set up to retrieve some data from a MongoDB database, and I would like to add a check that runs on the documents received: if the price field in a document is 10% higher than event.payload.base_price, do something, otherwise move on. I am fairly new to coding and MongoDB, so I can't figure this one out, as I am not sure how to write this check. Any help would be incredibly appreciated.
async function run() {
try {
const query = { contract: idSplit[1] , id: idSplit[2] };
const options = {
sort: { name: 1 },
projection: {_id: 0, slug: 1, contract: 1, id: 1, price: 1},
};
const cursor = listings.find(query, options);
await cursor.forEach(console.dir);
} finally {
await client.close();
}
}
run().catch(console.dir);
Once you have the cursor, you can apply the condition inside the forEach loop:
cursor.forEach((element) => {
    // True when price is more than 10% above base_price
    if (element.price > event.payload.base_price + event.payload.base_price * 0.1) {
        console.log("Event is happening", element.price);
    } else {
        console.log("Event is not happening", element.price);
    }
});
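If you only care about the documents above the threshold, another option is to push the comparison into the query itself rather than checking each document in application code. A minimal sketch, assuming price is stored as a number and event.payload.base_price is available when the query is built:

// Let MongoDB do the filtering: only fetch listings priced more than 10% above base_price.
const threshold = event.payload.base_price * 1.1;
const expensiveCursor = listings.find(
    { contract: idSplit[1], id: idSplit[2], price: { $gt: threshold } },
    options
);
await expensiveCursor.forEach((element) => {
    console.log("Event is happening", element.price);
});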
In the project that I am working on, built using Node.js & Mongo, there is a function that takes in a query and returns a set of data based on the limit & offset provided to it. Along with this data the function returns a total count of all the matched objects present in the database. Below is the function:
// options carry the limit & offset values
// mongoQuery carries a mongo matching query
function findMany(query, options, collectionId) {
const cursor = getCursorForCollection(collectionId).find(query, options);
return Promise.all([findManyQuery(cursor), countMany(cursor)]);
}
Now the problem with this is that sometimes, when I give a large limit size, I get an error saying:
Uncaught exception: TypeError: Cannot read property '_killCursor' of undefined
At first I thought I might have to increase the pool size to fix this issue, but after digging around a little more I found out that the above code results in a race condition. When I changed the code to:
function findMany(query, options, collectionId) {
    const cursor = getCursorForCollection(collectionId).find(query, options);
    return findManyQuery(cursor).then((dataSet) => {
        return countMany(cursor).then((count) => {
            return Promise.resolve([dataSet, count]);
        });
    });
}
Everything started working perfectly fine. Now, my understanding of Promise.all was that it takes an array of promises and resolves them one after the other. If the promises are executed one after the other, how can the Promise.all version result in a race condition while chaining the promises does not?
I am not able to wrap my head around it. Why is this happening?
Since I have very little information to work with, I made an assumption about what you want to achieve and came up with the following, just to demonstrate how you should use Promise.all. Note that Promise.all resolves the array of promises passed to it in no particular order, so no promise may depend on the order in which the other promises execute (read more about it here).
// A simple function to simulate findManyQuery for demo purposes
function findManyQuery(cursors) {
return new Promise((resolve, reject) => {
// Do your checks and run your code (for example)
if (cursors) {
resolve({ dataset: cursors });
} else {
reject({ error: 'No cursor in findManyQuery function' });
}
});
}
// A simple function to simulate countMany for demo purposes
function countMany(cursors) {
return new Promise((resolve, reject) => {
// Do your checks and run your code (for example)
if (cursors) {
resolve({ count: cursors.length });
} else {
reject({ error: 'No cursor in countMany' });
}
});
}
// A simple function to simulate getCursorForCollection for demo purposes
function getCursorForCollection(collectionId) {
/*
Simulating the returned cursor using an array of objects
and the Array filter function
*/
return [
    { id: 1, language: 'Javascript', collectionId: 99 },
    { id: 2, language: 'Dart', collectionId: 100 },
    { id: 3, language: 'Go', collectionId: 100 },
    { id: 4, language: 'Swift', collectionId: 99 },
    { id: 5, language: 'Kotlin', collectionId: 101 },
    { id: 6, language: 'Python', collectionId: 100 }
].filter((row) => row.collectionId === collectionId)
}
function findMany(query = { id: 1 }, options = [], collectionId = 0) {
/*
First I create a function to simulate the assumed use of
query and options parameters just for demo purposes
*/
const filterFunction = function (collectionDocument) {
return collectionDocument.collectionId === query.id && options.indexOf(collectionDocument.language) !== -1;
};
/*
Since I am working with arrays, I replaced find function
with filter function just for demo purposes
*/
const cursors = getCursorForCollection(collectionId).filter(filterFunction);
/*
Using Promise.all([]). NOTE: You should pass the result of the
findManyQuery() to countMany() if you want to get the total
count of the resulting dataset
*/
return Promise.all([findManyQuery(cursors), countMany(cursors)]);
}
// Consuming the findMany function with test parameters
const query = { id: 100 };
const collectionId = 100;
const options = ['Javascript', 'Python', 'Go'];
findMany(query, options, collectionId).then(result => {
console.log(result); // Result would be [ { dataset: [ [Object], [Object] ] }, { count: 2 } ]
}).catch((error) => {
console.log(error);
});
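As for why the original version raced: Promise.all does not run promises one after the other. A promise starts executing the moment it is created; Promise.all only waits for all of them to settle. In your code, findManyQuery(cursor) and countMany(cursor) were presumably both working on the same cursor at the same time. A tiny sketch showing that both promises start immediately:

// Both timers start as soon as the promises are created;
// Promise.all merely waits for them, it does not serialize them.
function delay(label, ms) {
    return new Promise((resolve) => {
        console.log(label + ' started');
        setTimeout(() => resolve(label), ms);
    });
}

Promise.all([delay('first', 200), delay('second', 100)])
    .then((labels) => console.log(labels));
// Logs: "first started", "second started", then [ 'first', 'second' ]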
There are ways to write this function in a "pure" way for scalability and testing.
So here's your concern:
In the project that I am working on, built using nodejs & mongo, there is a function that takes in a query and returns set of data based on limit & offset provided to it. Along with this data the function returns a total count stating all the matched objects present in the database.
Note: You'll need to take care of edge cases.
const Model = require('path/to/model');
function findManyUsingPromise(model, query = {}, offset = 0, limit = 10) {
    return new Promise((resolve, reject) => {
        // Chain skip/limit before exec; passing a callback straight to find()
        // would execute the query before the pagination options are applied.
        model.find(query)
            .skip(offset)
            .limit(limit)
            .exec((error, data) => {
                if (error) {
                    return reject(error);
                }
                resolve({
                    data,
                    total: data.length || 0
                });
            });
    });
}
// Call function
findManyUsingPromise(Model, {}, 0, 40).then((result) => {
// Do something with result {data: [object array], total: value }
}).catch((err) => {
// Do something with the error
});
I'm trying to update a MongoDB JSON document by a single field. I know it would be easy if my data was in an object array, but because of reasons and uninformed early design choices it's just objects within objects.
structure of the single document in my collection:
{
"_id": 123123,
"drivers_teams": {
"drivers" : {
"4" : { <data> },
"5" : { <data indeed> },
...
},
...
}
}
I want to add a new object, e.g.
const collection = db.get('collection');
let drivers = { };
drivers['111'] = { <new data> };
collection.update({}, {$set: { drivers_teams: { drivers } }}, { upsert: true }, function (err, doc) { ... });
But the outcome is that the original objects in "drivers_teams" are wiped out and only the new field remains.
If I try:
collection.update({}, {$set: { drivers }}, { upsert: true }, function (err, doc) { ... });
It unsurprisingly inserts a new field "drivers" outside drivers_teams:
{
"_id": 123123,
"drivers" : { <new data> },
"drivers_teams": { <still original> }
}
A similar problem (+ solution) is described here, but that JSON isn't nested like mine:
https://groups.google.com/forum/#!topic/mongodb-user/ZxdsuIU94AY
Is there a way to even accomplish what I'm trying to do? Or should I just give up and overwrite the whole document when I want to update a single field?
EDIT
A working solution here is {$set: { 'drivers_teams.drivers.111': <data> }}, but what I should have mentioned is that the example key '111' is actually unknown; it comes from the client when it sends update data. That's why the new object was created as in this solution: dynamically name mongo key field
Writing anything like { 'drivers_teams.drivers.' + key: <data> } throws an error.
The documentation for $set says that to specify a field in an embedded document or in an array, you can use dot notation. For your case you can do:
collection.update({},
{
$set: { "drivers_teams.drivers.111": { <new data> }}
},
{ upsert: true }, function (err, doc) { ... });
https://docs.mongodb.com/manual/reference/operator/update/set/
Edit:
If you are generating the field name dynamically, you can do:
const collection = db.get('collection');
let set_obj = { };
set_obj['drivers_teams.drivers.' + key] = { <new data> };
collection.update({}, { $set: set_obj }, { upsert: true }, function (err, doc) { ... });
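If your Node version supports ES6, a computed property name does the same thing without the intermediate object. A sketch under that assumption, where newData stands in for your <new data> object:

// ES6 computed property name builds the dotted path inline.
collection.update(
    {},
    { $set: { ['drivers_teams.drivers.' + key]: newData } },
    { upsert: true },
    function (err, doc) { /* ... */ }
);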
You missed a level above the drivers sub-document. In mongo shell, I believe what you require is:
db.collection.update({}, {$set: {'drivers_teams.drivers.111': {...}}})
which will update the document to be like:
{
"_id": 123123,
"drivers_teams": {
"drivers": {
"4": {...},
"5": {...},
"111": {...}
}
}
}
Please see the Embedded Documents page for more details regarding the dot notation used by MongoDB.
I'm trying to get some data from a MongoDB database with the find() method, returning only those documents that contain a specified "room". Then I want to return all the distinct values, from the found array of rooms, whose key is "variety". I tried this in two different ways, and I could be way off in my approach. The first way was to chain the collection methods find() and distinct(), which did not work.
This is what the plantList collection looks like:
[
{
"_id": {
"$oid": "56c11a761b0e60030043cbae"
},
"date added": "10/21/2016",
"variety": "Lettuce",
"room": "Room 1"
},
{
"_id": {
"$oid": "56c11a761b0e60030043cbaf"
},
"date added": "10/21/2015",
"variety": "Tomatoes",
"room": "Room 2"
}
]
server.js
//plantList = db collection
var mongojs = require('mongojs');
var MongoClient = require("mongodb").MongoClient;
MongoClient.connect(process.env.MONGOLAB_URI, function(err, db) {
var plantList = db.collection("plantList");
app.get('/gettypesbyroom/:room', function(req, res) {
var roomReq = req.params.room;
plantList
.find({"room":roomReq})
.toArray()
.distinct("variety", function(err, docs) {
if (err) throw err;
res.json(docs);
});
});
});
My second approach was to chain promises with .then() and use underscore.js to select the keys of the array of rooms (also did not work):
app.get('/gettypesbyroom/:room', function(req, res) {
var roomReq = req.params.room;
plantList
.find({"room":roomReq})
.toArray(function(err, docs) {
if (err) throw err;
return docs;
}).then(function(docs) {
_.keys(docs, function(docs) { return docs.variety; });
}).then(function(varieties) {
res.json(varieties); //not inside scope of .get function?
});
});
Is there something I could do differently to make these work or perhaps a different approach altogether?
Try calling it without toArray:
//plantList = db collection
app.get('/gettypesbyroom/:room', function(req, res) {
var roomReq = req.params.room;
plantList
.find({ room: roomReq })
.distinct('variety', function(err, docs) {
if (err) throw err;
res.json(docs);
});
});
See How Do I Query For Distinct Values in Mongoose.
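If you happen to be using Mongoose models rather than the raw driver collection, the equivalent is roughly the following. This is only a sketch and the Plant model name is my assumption:

// Mongoose: distinct(field, conditions, callback) filters and deduplicates in one call.
app.get('/gettypesbyroom/:room', function (req, res) {
    Plant.distinct('variety', { room: req.params.room }, function (err, varieties) {
        if (err) throw err;
        res.json(varieties);
    });
});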
You can do it two more ways. One is to make it easier with a projection operator - just project what you need. E.g. if you have a document that looks like:
{
room: 123,
type: 'soundproof_room',
other: 'stuff'
}
...you can project it with a query like this to just select the type:
db.rooms.find({ room: room }, { type: 1 }).toArray();
That would give you an array of objects like this:
let roomTypes = [{type: 'soundproof_room'}, {type: 'windy_room'}, {type: 'soundproof_room'}, {type: 'room without doors'}]
(Obviously, I'm making types up, I don't know what they really are.)
And then use a simple map:
return res.json(
roomTypes
// extract the type
.map(room => room.type)
// filter out duplicates
.filter((type, idx, self) => self.indexOf(type) === idx)
);
(I'm using ES6 arrow fn, hope you can read it, if not: babeljs.io/repl/)
Another thing to try is an aggregation.
db.rooms.aggregate([
    {
        // first find your room
        $match: { room: room }
    },
    {
        // group by type, basically make distinct types
        $group: {
            _id: '$type',
            count: { $sum: 1 } // inc this by 1 for each room of this type
        }
    }
]);
That one would get you your room, and it would return you only the types of rooms and the count per type, as an added bonus.
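Run from Node, that pipeline yields one document per distinct type, so pulling out just the names is a one-liner. A sketch only, with the same made-up field names as above:

// Each group document looks like { _id: 'soundproof_room', count: 2 }.
db.collection('rooms')
    .aggregate([
        { $match: { room: room } },
        { $group: { _id: '$type', count: { $sum: 1 } } }
    ])
    .toArray(function (err, groups) {
        if (err) throw err;
        // Keep just the distinct type names for the response.
        res.json(groups.map(function (g) { return g._id; }));
    });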
I need to use skip and limit for pagination, and distinct so that duplicate values are not returned.
If I use
MyModel.find().distinct('blaster', function(err, results) {
res.render('index', {
data: results
});
});
This works.
If I use
MyModel.find().sort('brand').skip((page-1)*15).limit(15).exec(function(err, results) {
res.render('index', {
data: results
});
});
This also works, but how do I use both?
If I try, this error is shown:
Error: skip cannot be used with distinct
You don't do that. .distinct() is a method that returns an "array", and therefore you cannot modify something that is not a "Cursor" with "cursor modifiers" like .limit() and .skip().
What you want is the .aggregate() method. It does much more than just adding things up:
MyModel.aggregate(
[
{ "$group": { "_id": "$blaster" } },
{ "$skip": ( page-1 ) * 15 },
{ "$limit": 15 }
],
function(err,results) {
// results skipped and limited in here
}
);
The aggregation framework provides another way to achieve "distinct" results, but in a more flexible way. See the operators $group, $skip and $limit.
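One practical note: $skip and $limit only give stable pages if the order of the groups is deterministic, so you will usually want a $sort stage before them. A sketch of the same pipeline with sorting added:

MyModel.aggregate(
    [
        { "$group": { "_id": "$blaster" } },
        { "$sort": { "_id": 1 } },      // deterministic order so pages don't overlap
        { "$skip": ( page-1 ) * 15 },
        { "$limit": 15 }
    ],
    function(err,results) {
        // distinct "blaster" values for the requested page
    }
);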