Sails.js MongoDB populate

I would like to use .populate() with Sails.js and MongoDB, so I did something like the following. I am using the latest versions of Sails, MongoDB, and Node.
My models look like this:
// User.js
module.exports = {
  attributes: {
    name: {
      type: 'string'
    },
    pets: {
      collection: 'pet',
      via: 'owner'
    }
  }
};
// Pet.js
module.exports = {
  attributes: {
    name: {
      type: 'string'
    },
    owner: {
      model: 'user'
    }
  }
};
And I used the following query, which is the same as in the documentation:
User.find()
  .populate('pets')
  .exec(function(err, users) {
    if (err) return res.serverError(err);
    res.json(users);
  });
But I get the following output:
[
  {
    "pets": [],
    "id": "58889ce5959841098d186b5a",
    "firstName": "Foo",
    "lastName": "Bar"
  }
]

If you just need the query for dogs, you could just as easily reverse the query:
Pet.find() // or whatever criteria you want, passed as an object
  .populate('owner')
  .exec(function(err, petsWithOwners) { ... });
But if not:
User.find()
  .populate('pets', { type: 'dog' })
  .exec(function(err, users) { ... });
This will return all users (User.find()) and only populate pets of type dog (populate('pets', { type: 'dog' })). In that case you'll also have users without dogs in your results.

Related

change values of nested objects by id with mongoose

I'm new to MongoDB, and I'm trying to do a very simple task, but I can't get it right.
What I want is to change the process status. I tried "findAndUpdate", "updateOne" and "findByIdAndUpdate", but none of them worked.
Maybe it has to do with my schema. Should I create a new schema for the process?
My database entry inside a MongoDB collection:
{
  _id: "622c98cfc872bcb2578b97a5",
  username: "foo",
  __v: 0,
  process: [
    {
      processname: "bar",
      process_status: "stopped",
      _id: "6230c1a401c66fc025d3cb88"
    }
  ]
}
My current Schema:
const User = new mongoose.Schema(
  {
    username: { type: String, required: true },
    process: [
      {
        processname: {
          type: String,
        },
        process_status: {
          type: String,
        },
      },
    ],
  },
  { collection: "user-data" }
);
My current code:
const startstopprocess = await User.findByIdAndUpdate(
  { _id: "6230c1a401c66fc025d3cb88" },
  { process_status: "started" }
).then(function (error, result) {
  console.log(error);
  console.log(result);
});
You can use the positional operator $ in this way:
db.collection.update(
  {
    "process._id": "6230c1a401c66fc025d3cb88"
  },
  {
    "$set": {
      "process.$.process_status": "started"
    }
  }
)
Note how, using the positional operator, you can tell Mongo: "in the object you found in the find stage, update the process_status field to started".
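For reference, a minimal Mongoose version of the same update, assuming the User model from the question (run inside an async function):
// Match the user document that contains the subdocument, then update
// only the matched array element via the positional $ operator.
const result = await User.updateOne(
  { "process._id": "6230c1a401c66fc025d3cb88" },
  { $set: { "process.$.process_status": "started" } }
);
// result.modifiedCount (Mongoose 6+) or result.nModified (older versions)
// should be 1 if the subdocument was updated.
console.log(result);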

How to seed roles and capabilities in MongoDB

I am new to working with MongoDB and Docker. I am working on an application and couldn't find a clean way to seed my database using an npm run command. First I created a file called seed.js and then wired it to an npm run seed command in the package.json file.
In the seed.js file I import Mongoose and the models, but there are two things I need to do:
Create roles, if they don't exist yet
Create capabilities, if they don't exist yet, and associate them with the roles
The roles that I want to create are:
admin (description: Administrator)
viewer (description: Viewer)
Capabilities
I need to check each endpoint of the Users service that should require authentication and create an adequate capability. Example: updateUser updates the user data. This could be done by the user themselves (so there must be an updateUserOwn capability) and by an administrator (who will have an updateUsers capability). I will have to analyse each endpoint and judge what is adequate, but I still cannot find a way to get the initial roles and capabilities into the database.
UPDATE:
On the seeding itself, the updated solution works, but it requires a lot of code and repetition that could probably be avoided with loops. I'd like to start by creating the roles, which means creating an array of objects with the data for the roles to be created. Each role has the fields role and description:
const userRole = [
  {
    role: 'admin',
    description: 'Administrator'
  },
  {
    role: 'viewer',
    description: 'Viewer'
  }
];
The idea is that if the role exists it doesn't need to be updated, but I don't know how to loop through the array and create a role only if it doesn't exist. Something like using updateOne with the upsert: true option, but with the data in $setOnInsert, since that adds the data only if a document is inserted.
I only need to create, not update, because in the future I'll edit roles directly through the API. So if a change is made to the admin role, for example, the seed will not overwrite it.
During the loop, I'll need to create an associative array called rolesIds that will store the ObjectIds of the created roles. It should result in something like this:
{
  "admin": "iaufh984whrfj203jref",
  "viewer": "r9i23jfeow9iefd0ew0"
}
Also each capability must have an array of roles it must be associated to. Example:
{
  capability: "updateUsers",
  description: "Update the data of all users",
  roles: ["admin"]
}
How do I loop through the array and, for each element, prepare it for insertion using the map of ObjectIds, so that instead of roles: ["admin"] I end up with something like roles: ["iaufh984whrfj203jref"]? Otherwise there'll be a cast error. Remember that each capability may be associated with more than one role, so I'll probably need to loop through them as well, but I cannot find a way to build that logic.
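For illustration only (this sketch is not from the question), assuming a capabilities array of objects shaped like the example above, a rolesIds map filled in during the role loop, and a roles field on the Capability schema, the mapping could look like this:
for (const cap of capabilities) {
  const { capability, roles, ...rest } = cap;
  await Capability.findOneAndUpdate(
    { capability },
    // $setOnInsert only writes these fields when a new document is inserted,
    // so existing capabilities are left untouched.
    { $setOnInsert: { ...rest, roles: roles.map((name) => rolesIds[name]) } },
    { upsert: true }
  );
}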
Users Model
const userSchema = new mongoose.Schema(
  {
    .......
    role: {
      ref: "roles",
      type: mongoose.Schema.Types.ObjectId,
    },
  }
);
module.exports = mongoose.model("User", userSchema);
Role Model:
const roles = new mongoose.Schema({
  role: {
    type: String,
    required: true,
  },
  capabilities: [
    {
      type: mongoose.Schema.Types.ObjectId,
      ref: "capabilities",
    },
  ],
});
module.exports = mongoose.model("roles", roles);
Capabilities Model:
const capabilities = new mongoose.Schema({
  capability: {
    type: String,
    required: true,
  },
  name: {
    type: String,
  },
});
module.exports = mongoose.model("capabilities", capabilities);
UPDATED: seed file:
const seedDB = async () => {
  if (!process.env.DB_URI) {
    throw new Error("Error connecting to MongoDB: DB_URI is not defined.");
  }
  try {
    await mongoose.connect(process.env.DB_URI, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
      useCreateIndex: true,
    });
    console.log("Connected to MongoDB");
    const tasks = [
      Capability.findOneAndUpdate(
        { name: "updateUserOwn" },
        { capability: "updateUser" },
        { upsert: true }
      ).exec(),
      Capability.findOneAndUpdate(
        { name: "updateUsers" },
        { capability: "updateUser" },
        { upsert: true }
      ).exec(),
      // Seed more...
    ];
    const [updateUserOwn, updateUsers] = await Promise.all(tasks);
    Role.bulkWrite([
      {
        updateOne: {
          filter: { role: "Admin" },
          update: { capabilities: [updateUsers] },
          upsert: true,
        },
      },
      {
        updateOne: {
          filter: { role: "Viewer" },
          update: { capabilities: [updateUserOwn] },
          upsert: true,
        },
      },
    ]);
    console.log("seeded data", tasks);
  } catch (error) {
    console.log(`Error connecting to MongoDB: ${error}`);
  }
};
seedDB();
You are on the right path overall.
Because capabilities are used as references, you'd have to fetch or create them (to get a ref) before assigning them to a role.
This could be your seed logic:
const tasks = [
  Capability.findOneAndUpdate(
    { name: 'updateUserOwn' }, // matches or creates this capability
    { capability: 'updateUser' }, // adds this to the object
    { upsert: true, new: true } // `new` guarantees an object is always returned
  ).exec(),
  Capability.findOneAndUpdate(
    { name: 'updateUsers' },
    { capability: 'updateUser' },
    { upsert: true, new: true }
  ).exec(),
  // Seed more...
];
const [
  updateUserOwn,
  updateUsers,
] = await Promise.all(tasks);

// We can use bulk write for the second transaction so it runs in one go
await Role.bulkWrite([
  {
    updateOne: {
      filter: { role: 'Admin' },
      update: { capabilities: [updateUsers] },
      upsert: true,
    }
  },
  {
    updateOne: {
      filter: { role: 'Viewer' },
      update: { capabilities: [updateUserOwn] },
      upsert: true,
    }
  }
]);
We seed capabilities one by one using findOneAndUpdate so we can get a reference to each capability we intend to use on the roles.
Then we use bulkWrite to seed the roles.
I might have swapped the capabilities and their names, but I hope you get the general idea.
The seed would have been simpler if there weren't references involved - you could just bulkWrite everything in one go - but in order to create an object with inner references, or add references to such an object, you first need to have the actual reference.
You can create a static mapping and loop through it, which would reduce the code a bit and make things easier. This would also allow you to skip seeding items that already exist.
Since capabilities are reused across roles I want to create them first, but it's no problem to alter the logic to first create roles and then capabilities, though it might not be as straightforward.
Also each capability must have an array of roles it must be associated to.
This is called a "many to many" relationship (as roles also have an array of references to capabilities), which would only complicate the logic. Are you sure you really need it - mongoose/mongo won't manage it automatically for you:
when you add a capability to a role you'd also need to sync and add the role inside capability.roles - manually
and the reverse - when adding a role inside capability.roles you'd need to sync this and also manually add the capability to role.capabilities
the same goes for deleting capabilities or roles - manual cleanup
it can fail and would need to recover - e.g. a capability is added to role.capabilities but for some reason execution stopped and the role was not added to capability.roles - so the whole handling might need to be wrapped in a transaction
there are ways to cross-reference roles and capabilities without having a "many to many" relationship
Here's a simple approach using save middleware to sync the many to many relationship on create/update:
Role.js
const mongoose = require('mongoose');

const roles = new mongoose.Schema({
  role: {
    type: String,
    required: true,
  },
  description: String,
  capabilities: [
    {
      type: mongoose.Schema.Types.ObjectId,
      ref: 'capabilities',
    },
  ],
});

roles.pre('save', async function save() {
  // Doesn't need to run if there are no capabilities
  if (!this.capabilities || this.capabilities.length === 0) return;
  const Capability = mongoose.model('capabilities');
  await Capability.updateMany(
    { _id: { $in: this.capabilities } },
    // Adds only if it's missing
    { $addToSet: { roles: this._id } },
  );
});
// Todo: similar logic to remove from capabilities if role is deleted
module.exports = mongoose.model("roles", roles);
Capability.js
const mongoose = require('mongoose');

const capabilities = new mongoose.Schema({
  capability: {
    type: String,
    required: true,
  },
  description: {
    type: String,
  },
  roles: [
    {
      type: mongoose.Schema.Types.ObjectId,
      ref: 'roles',
    }
  ]
});

capabilities.pre('save', async function save() {
  if (!this.roles || this.roles.length === 0) return;
  const Role = mongoose.model('roles');
  await Role.updateMany(
    { _id: { $in: this.roles } },
    { $addToSet: { capabilities: this._id } },
  );
});
// Todo: similar logic to remove from roles if capability is deleted
module.exports = mongoose.model("capabilities", capabilities);
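As a quick illustration (not part of the original answer; the values are made up), creating a role that references a capability now also back-fills that capability's roles array through the save hook:
// Inside an async function, with the two models above registered:
const Capability = mongoose.model('capabilities');
const Role = mongoose.model('roles');

const cap = await new Capability({ capability: 'viewUsers' }).save();
await new Role({ role: 'viewer', capabilities: [cap._id] }).save();

// Role's pre('save') hook ran Capability.updateMany with $addToSet,
// so the capability document now lists the new role's _id in its `roles` array.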
Here's an updated seed routine:
Seed.js
const mongoose = require('mongoose');
const Capability = require('./models/Capability');
const Role = require('./models/Role');

const CAPABILITIES = {
  UPDATE_USERS: {
    capability: 'updateUsers',
    description: 'Update the data of all users',
  },
  VIEW_USERS: {
    capability: 'viewUsers',
    description: 'View public data of users',
  },
  UPDATE_OWN_RECORD: {
    capability: 'updateUserOwn',
    description: 'Update user own data',
  }
}

const ROLES_TO_SEED = [
  {
    role: 'admin',
    description: 'Administrator',
    capabilities: [CAPABILITIES.UPDATE_USERS, CAPABILITIES.VIEW_USERS],
  },
  {
    role: 'viewer',
    description: 'Viewer',
    capabilities: [CAPABILITIES.VIEW_USERS, CAPABILITIES.UPDATE_OWN_RECORD],
  }
]

const seedDB = async () => {
  await connectToDb();
  await seedRoles();
};

const connectToDb = async () => {
  if (!process.env.DB_URI) throw new Error('DB_URI is not defined.');
  console.info('Connecting to database...');
  await mongoose.connect(process.env.DB_URI, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
    useCreateIndex: true,
    useFindAndModify: false,
  });
  console.info('Connected \n');
}

const seedRoles = async () => {
  console.log('Seeding Roles...');
  // runs sequentially to skip creating duplicate capabilities
  for (const role of ROLES_TO_SEED) {
    await findOrCreateRole(role);
  }
  console.log('Complete \n');
}

const findOrCreateRole = async ({ capabilities, role, ...defaults }) => {
  console.info('Looking for role: ', role);
  const fromDb = await Role.findOne({ role }).exec();
  if (fromDb) {
    console.info('Role already exists skipping... \n');
    return fromDb;
  }
  console.info('Role does not exist - creating new \n');
  const doc = new Role({ role, ...defaults });
  // All capabilities (per role) can be created/found in parallel
  const roleCapabilities = await Promise.all(capabilities.map(findOrCreateCapability));
  doc.capabilities = roleCapabilities.map(c => c._id);
  await doc.save();
  console.info('Role created: ', role);
  console.info('');
  return doc;
}

const findOrCreateCapability = async ({ capability, ...defaults }) => {
  console.info('Looking for capability: ', capability);
  let doc = await Capability.findOne({ capability }).exec();
  if (doc) {
    console.info(`Capability ${capability} found - using existing...`);
  } else {
    console.info(`Capability ${capability} does not exist - creating new`);
    doc = new Capability({ capability, ...defaults });
    await doc.save();
  }
  return doc;
}

seedDB()
  .then(() => {
    console.info('Exiting...: ');
    process.exit(0);
  })
  .catch(error => {
    console.error('Seed failed');
    console.error(error);
    process.exit(1);
  })
We have a dictionary of capabilities and a list of roles that we can map to db operations.
The idea is that each role should contain the full definition of a capability, so it can be used either to find the existing capability or to create it if it doesn't exist.
For each role in the list we make a query to see if it exists.
When it exists we do nothing and move to the next role.
When it doesn't exist we have all the data needed to create it, and we create/find any capabilities that it might need.
When you figure out all the roles and capabilities of the application you just put them in the ROLES_TO_SEED and CAPABILITIES static mappings.
The script relies on the above-mentioned middleware modifications in the models.
And a small bonus
You don't need a many to many relationship to match capabilities to the roles they are used in. Here's how you can aggregate that information when only the Role model has an array of capability refs. Run this after the database is seeded:
const showCapabilitiesUsages = async () => {
  const result = await Capability.aggregate([
    {
      $lookup: {
        from: 'roles',
        let: { searched: '$_id' },
        pipeline: [
          {
            $match: {
              $expr: {
                $in: ['$$searched', '$capabilities']
              }
            }
          }
        ],
        as: 'roles'
      }
    },
    {
      $project: {
        _id: 0,
        capability: 1,
        description: 1,
        usedInRoles: {
          $map: {
            input: '$roles',
            as: 'role',
            in: '$$role.role',
          }
        }
      }
    }
  ]).exec();
  console.log('Aggregate result: ', result);
}
You should get a result like:
Aggregate result: [
  {
    capability: 'updateUsers',
    description: 'Update the data of all users',
    usedInRoles: [ 'admin' ]
  },
  {
    capability: 'viewUsers',
    description: 'View public data of users',
    usedInRoles: [ 'admin', 'viewer' ]
  },
  {
    capability: 'updateUserOwn',
    description: 'Update user own data',
    usedInRoles: [ 'viewer' ]
  }
]
Try something like this, it should work:
const roles = [
  {
    name: 'admin',
    description: 'Administrator',
  },
  {
    name: 'viewer',
    description: 'Viewer',
  },
];

const capabilities = [
  // Capabilities
  {
    name: 'createCapability',
    description: 'Create a new capability',
    roles: ['admin'],
  },
  {
    name: 'deleteCapability',
    description: 'Delete a capability',
    roles: ['admin'],
  },
  // Roles
  {
    name: 'createRole',
    description: 'Create a new role',
    roles: ['admin'],
  },
  {
    name: 'deleteRole',
    description: 'Delete a role',
    roles: ['admin'],
  },
  // Users
  {
    name: 'updateUser',
    description: 'Update current user data',
    roles: ['viewer'],
  },
  {
    name: 'updateUsers',
    description: 'Update the data from any user',
    roles: ['admin'],
  },
];
const seedRoles = async (roles) => {
  if (!Array.isArray(roles) || roles.length === 0) {
    return;
  }
  console.log('');
  for (const role of roles) {
    const savedRole = await Role.findOneAndUpdate(
      { name: role.name },
      { $setOnInsert: role },
      { upsert: true, new: true, rawResult: true },
    );
    // rawResult exposes lastErrorObject.updatedExisting, which tells us
    // whether the role was already in the database or was just upserted.
    if (savedRole.lastErrorObject.updatedExisting) {
      console.log(`Role “${savedRole.value.name}” already on database.`);
    } else {
      console.log(`Role “${savedRole.value.name}” added to database.`);
    }
  }
};
const seedCapabilities = async (capabilities) => {
  if (!Array.isArray(capabilities) || capabilities.length === 0) {
    return;
  }
  console.log('');
  for (const capability of capabilities) {
    const rolesToPush = capability.roles;
    delete capability.roles;
    const addedCapability = await Capability.findOneAndUpdate(
      { name: capability.name },
      { $setOnInsert: capability },
      { upsert: true, new: true, rawResult: true },
    );
    if (addedCapability.lastErrorObject.updatedExisting) {
      console.log(
        `Capability “${addedCapability.value.name}” ` +
        `already on database.`,
      );
    } else {
      console.log(
        `Capability “${addedCapability.value.name}” ` +
        `added to database.`,
      );
      if (rolesToPush && Array.isArray(rolesToPush)) {
        // for...of so each push is actually awaited before moving on
        for (const role of rolesToPush) {
          const roleToPush = await Role.findOne({ name: role });
          if (roleToPush) {
            roleToPush.capabilities.push(addedCapability.value);
            await roleToPush.save();
          }
        }
      }
    }
  }
};
const seedDb = async (roles, capabilities, users) => {
  try {
    await seedRoles(roles);
    await seedCapabilities(capabilities);
    console.log('roles', roles);
  } catch (error) {
    console.error(error);
  }
};

module.exports = seedDb;
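For completeness, a hypothetical seed.js entry point wiring this exported seedDb to npm run seed might look like the following (the file paths and the separate data module are assumptions, not part of the answer):
// seed.js - hypothetical entry point for `npm run seed`
const mongoose = require('mongoose');
const seedDb = require('./seedDb'); // the module exported above (path assumed)
const { roles, capabilities } = require('./seedData'); // the arrays shown above (path assumed)

(async () => {
  await mongoose.connect(process.env.DB_URI, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  });
  await seedDb(roles, capabilities);
  await mongoose.disconnect();
})();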

MongoDB Aggregate is not matching specific field

I'm new to aggregation in MongoDB and I'm trying to understand the concepts by working through examples.
I'm trying to paginate my subdocuments using aggregation, but the returned result always contains values from every document instead of just the one I query for.
I want to paginate my following field, which contains an array of ObjectIds.
I have this User Schema:
const UserSchema = new mongoose.Schema({
  username: {
    type: String,
    unique: true,
    required: true
  },
  firstname: String,
  lastname: String,
  following: [{
    type: mongoose.Schema.Types.ObjectId,
    ref: 'User'
  }],
  ...
}, { timestamps: true, toJSON: { virtuals: true }, toObject: { getters: true, virtuals: true } });
Without aggregation, I am able to paginate following. I have this route, which gets the user's following by their username:
router.get(
  '/v1/:username/following',
  isAuthenticated,
  async (req, res, next) => {
    try {
      const { username } = req.params;
      const { offset: off } = req.query;
      let offset = 0;
      if (typeof off !== 'undefined' && !isNaN(off)) offset = parseInt(off);
      const limit = 2;
      const skip = offset * limit;
      const user = await User
        .findOne({ username })
        .populate({
          path: 'following',
          select: 'profilePicture username fullname',
          options: {
            skip,
            limit,
          }
        });
      res.status(200).send(user.following);
    } catch (e) {
      console.log(e);
      res.status(500).send(e);
    }
  }
);
And my pagination version using aggregate:
const following = await User.aggregate([
  {
    $match: { username }
  },
  {
    $lookup: {
      'from': User.collection.name,
      'let': { 'following': '$following' },
      'pipeline': [
        {
          $project: {
            'fullname': 1,
            'username': 1,
            'profilePicture': 1
          }
        }
      ],
      'as': 'following'
    },
  },
  {
    $project: {
      '_id': 0,
      'following': {
        $slice: ['$following', skip, limit]
      }
    }
  }
]);
Suppose I have these documents:
[
  {
    _id: '5fdgffdgfdgdsfsdfsf',
    username: 'gagi',
    following: []
  },
  {
    _id: '5fgjhkljvlkdsjfsldkf',
    username: 'kuku',
    following: []
  },
  {
    _id: '76jghkdfhasjhfsdkf',
    username: 'john',
    following: ['5fdgffdgfdgdsfsdfsf', '5fgjhkljvlkdsjfsldkf']
  },
]
And when I test my route for user john (/john/following), everything is fine, but when I test it for a different user who doesn't have any following (/gagi/following), the returned result is the same as john's following; the aggregation doesn't seem to match the user by username.
/john/following | following: 2
/kuku/following | following: 0
Aggregate result:
[
  {
    _id: '5fdgffdgfdgdsfsdfsf',
    username: 'kuku',
    ...
  },
  {
    _id: '5fgjhkljvlkdsjfsldkf',
    username: 'gagi',
    ...
  }
]
I expect /kuku/following to return an empty array [], but the result is the same as john's. Actually, every username I test returns the same result.
I'm thinking there must be something wrong with my implementation, since I've only started exploring aggregation.
Mongoose uses a DBRef to be able to populate the field after it has been retrieved.
DBRefs are only handled on the client side, MongoDB aggregation does not have any operators for handling those.
The reason the aggregation pipeline is returning all of the users is that the lookup's pipeline does not have a $match stage, so all of the documents in the collection are selected and included in the lookup.
The sample document there is showing an array of strings instead of DBRefs, which wouldn't work with populate.
Essentially, you must decide whether you want to use aggregation or populate to handle the join.
For populate, use the ref as shown in that sample schema.
For aggregate, store an array of ObjectId so you can use lookup to link with the _id field.
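To make the missing stage concrete, here is a sketch (not from the original answer) of the same pipeline with a $match added inside the $lookup, assuming the username, skip, and limit variables from the route above:
const following = await User.aggregate([
  { $match: { username } },
  {
    $lookup: {
      from: User.collection.name,
      let: { following: '$following' },
      pipeline: [
        // Keep only the users whose _id appears in the outer document's `following` array
        { $match: { $expr: { $in: ['$_id', '$$following'] } } },
        { $project: { fullname: 1, username: 1, profilePicture: 1 } }
      ],
      as: 'following'
    }
  },
  { $project: { _id: 0, following: { $slice: ['$following', skip, limit] } } }
]);
Note that this only matches if following stores plain ObjectIds, as recommended above.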

Node.js - Mongoose - Update nested array with all values in req.body

I have an object that looks like this:
{
  _id: '577fe7a842c9b447',
  name: 'Jacob\'s Bronze Badges',
  competitors: [
    {
      _id: '577fe7a842c9bd6d',
      name: 'Peter\'s Silver Badges',
      sites: [
        {
          _id: '577fe7a842c9bd6d',
          name: 'Facebook',
          url: 'fb.com/peter'
        },
        {
          _id: '577fe7a842c9bd6d',
          name: 'Google',
          url: 'google.com/peter'
        }
      ]
    },
    {
      _id: '599fe7a842c9bd6d',
      name: 'Paul\'s Gold Badges',
      sites: [
        {
          _id: '577fe7a842c9bd6d',
          name: 'Facebook',
          url: 'fb.com/paul'
        },
        {
          _id: '577fe7a842c9bd6d',
          name: 'Google',
          url: 'google.com/paul'
        }
      ]
    }
  ]
}
My goal is to reference the competitors array and update items inside with all of the values from req.body. I based this code off of this answer, as well as this other one.
Location.update(
  { 'competitors._id': req.params.competitorId, },
  { $set: { 'competitors.$': req.body, }, },
  (err, result) => {
    if (err) {
      res.status(500)
        .json({ error: 'Unable to update competitor.', });
    } else {
      res.status(200)
        .json(result);
    }
  }
);
I send my HTTP PUT to localhost:3000/competitors/577fe7a842c9bd6d to update Peter's Silver Badges. The request body is:
{
  "name": "McDonald's"
}
The problem is that when I use $set to set the competitor with _id: req.params.competitorId, I don't know what is in req.body. I want to use the entire req.body to update the object in the array, but when I do, that object is overwritten, so instead of getting a new name, Peter's Silver Badges becomes:
{
  name: 'McDonald\'s',
  sites: []
}
How can I update an object within an array when I know the object's _id with all of the fields from req.body without removing fields that I want to keep?
I believe that the sites array is empty because the object was reinitialized. In my schema I have sites: [sitesSchema] to initialize it. So I am assuming that the whole competitors[_id] object is getting overwritten with the new name and then the sites: [sitesSchema] from my schema.
You would need to use the $ positional operator in your $set. In order to assign those properties dynamically, based on what is in your req.body, you would need to build up your $set programmatically.
If you want to update the name you would do the following:
Location.update(
  { 'competitors._id': req.params.competitorId },
  { $set: { 'competitors.$.name': req.body.name } },
  (err, result) => {
    if (err) {
      res.status(500)
        .json({ error: 'Unable to update competitor.', });
    } else {
      res.status(200)
        .json(result);
    }
  }
);
One way you might programmatically build up the $set using req.body is by doing the following:
let updateObj = { $set: {} };
for (var param in req.body) {
  updateObj.$set['competitors.$.' + param] = req.body[param];
}
See this answer for more details.
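For completeness, a sketch of passing that dynamically built object to the same update call (the route handler and Location model are the ones from the question):
// `updateObj` is the object built in the loop above, e.g.
// { $set: { 'competitors.$.name': "McDonald's" } } for the sample request body.
Location.update(
  { 'competitors._id': req.params.competitorId },
  updateObj,
  (err, result) => {
    if (err) return res.status(500).json({ error: 'Unable to update competitor.' });
    return res.status(200).json(result);
  }
);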
To update an embedded document with the $ positional operator, in most cases you have to use dot notation after the $ operator.
Location.update(
  { _id: '577fe7a842c9b447', 'competitors._id': req.params.competitorId, },
  { $set: { 'competitors.$.name': req.body.name, }, },
  (err, result) => {
    if (err) {
      res.status(500)
        .json({ error: 'Unable to update competitor.', });
    } else {
      res.status(200)
        .json(result);
    }
  }
);

Sails.js query model filtering by collection attribute

Suppose that I have the following models in Sails.js v0.10:
Person.js
module.exports = {
  attributes: {
    name: 'string',
    books: {
      collection: 'book',
      via: 'person'
    }
  }
};
Book.js
module.exports = {
  attributes: {
    name: 'string',
    person: {
      model: 'person'
    }
  }
};
I want a query to return an array of people that have a certain associated book. I would like to do something like the following query, but I don't know how:
Person.find()
  .populate('books')
  .where({ books.name: 'Game of Thrones' })
  .exec(function(err, person) {
    if (err) return res.send(err, 500);
    res.json(person);
  });
Any ideas if this is possible to do using a simple query?
First off, you'll need to adjust your Book model to make it a many-to-many association:
module.exports = {
  attributes: {
    name: 'string',
    people: {
      collection: 'person',
      via: 'books'
    }
  }
};
Querying by properties of an association is not currently possible with Sails, although it's on the roadmap. But once you've set up your Book model as above, you can get what you want by querying in reverse:
Book.findOne({name: 'Game of Thrones'}).populate('people').exec(...);
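Spelled out, a sketch based on that one-liner (the response handling just mirrors the question's code):
Book.findOne({ name: 'Game of Thrones' })
  .populate('people')
  .exec(function(err, book) {
    if (err) return res.send(err, 500);
    // book.people holds the Person records associated with this book
    return res.json(book ? book.people : []);
  });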
