How to test code when dealing with MongooseArray in NodeJS

I am trying to do some integration tests using Supertest. My object schema contains an array:
const schema = new mongoose.Schema({
name: {
type: String,
required: true,
minlength: 3,
maxlength: 50
},
tags: {
type: Array,
lowercase: true
}
});
I am using Mongoose, and when running my tests I always encounter this issue of getting a MongooseArray instead of an Array, and I am not sure how to deal with it.
- Expected value
+ Received value
- CoreMongooseArray [
+ Array [
"tag1",
]
What can I do to always get an Array?
Where do I have to make my changes? In my code or in my test?
Here is an example (test) that works when I use lean():
await exec();
const updatedCategory = await Category.findById(category._id).lean();
expect(updatedCategory.name).toBe(name);
expect(updatedCategory.tags).toEqual(tags);
And here is one that doesn't work:
const res = await exec();
expect(res.body).toHaveProperty('_id', category._id.toHexString());
expect(res.body).toHaveProperty('name', category.name);
expect(res.body).toHaveProperty('tags', category.tags);
In this case I am checking whether the property exists, and it doesn't work.
edit:
Here is the exec() function for the test that works (PUT route):
const exec = async () => {
return await request(app)
.put('/api/categories/' + id)
.set('x-auth-token', token)
.send({ name: name, tags: tags });
}
Here is the exec() function for the test that fails (DELETE route):
const exec = async () => {
return await request(app)
.delete('/api/categories/' + id)
.set('x-auth-token', token)
.send();
}
edit3:
Here is the result when displaying res.body on the console:
{
"_id": "5db58e63fa9c143794484eea",
"tags": [
"tag1"
],
"name": "category1",
"__v": 0
}
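A minimal sketch of one way to make the failing assertion pass, assuming the diff comes from category.tags being a MongooseArray while res.body.tags is a plain JSON array (Jest's equality check distinguishes the two):
// Convert the Mongoose value to a plain Array before comparing:
expect(res.body).toHaveProperty('tags', [...category.tags]);
// or, using Mongoose's own conversion:
expect(res.body).toHaveProperty('tags', category.tags.toObject());
Alternatively, load the expected document with .lean() in the failing test too, as in the working example, since lean documents contain plain arrays.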

Related

Rename database field within array in MongoDB

I need to change a few database fields in my backend controller before returning the object to the frontend.
Currently, I am doing it in the front end like this with my returned object:
for (let contribution of contributions) {
contribution["title"] = "You Added One"
contribution["launchName"] = contribution.name
contribution["launchId"] = contribution._id
contribution["date"] = contribution.addedAt
contribution["content"] = contribution.description
}
But I am now trying to do this work in the backend using Mongo.
This is my controller:
const Launch = require('../models/launch')
const User = require('../models/user')
async function getRecentActivityByUserId (req, res) {
const { userId } = req.params
const user = await User.findOne({ userId }).lean() || []
const contributions = await Launch.find({ _id: { $in: user.contributions } })
return res.status(200).send(contributions.reverse())
}
So this correctly returns an object to the frontend but I still need to change the database field names.
So I tried this:
async function getRecentActivityByUserId (req, res) {
let recents = []
const { userId } = req.params
const user = await User.findOne({ userId }).lean() || []
const contributions = await Launch.find({ _id: { $in: user.contributions } }).aggregate([
{
$addFields: {
plans: {
$map:{
input: "$launch",
as: "l",
in: {
title: "You Added One",
launchName: "$$l.name",
launchId: "$$l._id",
date: "$$l.addedAt",
content: "$$l.description",
}
}
}
}
},
{
$out: "launch"
}
])
return res.status(200).send(contributions.reverse())
}
The above throws an error saying that .aggregate is not a function on .find. Even if I remove the .find, the returned object is just an empty array, so I'm obviously not aggregating correctly.
How can I combine .find with .aggregate, and what is wrong with my .aggregate function?
I also tried combining aggregate with find like this, and I get the error Arguments must be aggregate pipeline operators:
const contributions = await Launch.aggregate([
{
$match: {
_id: { $in: user.contributions }
},
$addFields: {
plans: {
$map:{
input: "$launch",
as: "l",
in: {
title: "You Added a Kayak Launch",
launchName: "$$l.name",
launchId: "$$l._id",
date: "$$l.addedAt",
content: "$$l.description",
}
}
}
}
},
{
$out: "launch"
}
])
EDIT: Just realized that I have the word plans in the aggregate function, and that is not relevant to my code. I copied this code from elsewhere, so I'm not sure what the value should be.
I figured it out. This is the solution:
async function getRecentActivityByUserId (req, res) {
let recents = []
const { userId } = req.params
const user = await User.findOne({ userId }).lean() || []
const contributions = await Launch.aggregate([
{
$match: {
_id: { $in: user.contributions }
}
},
{
$addFields: {
title: "You Added One" ,
launchName: "$name",
launchId: "$_id",
date: "$addedAt",
content: "$description"
}
}
])
if(contributions) {
recents = recents.concat(contributions);
}
return res.status(200).send(recents.reverse())
}
The actual problem from the question was a small syntax error ($match and $addFields were placed inside a single pipeline stage rather than as separate stages), which has been noted and corrected in the self-answer above.
I noted in the comments there that the current approach of issuing two separate operations (a findOne() followed by an aggregate() that uses its results) could be simplified into a single query to the database. The important thing here is that you $match against the first collection (users, or whatever the collection is named in your environment) and then use $lookup to perform the "match" against the subsequent launches collection.
Here is a playground demonstrating the basic approach. Adjust as needed to the specifics of your environment.
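Since the playground link may not render here, below is a minimal sketch of that single-query approach. It assumes the users collection stores a contributions array of launch _ids and that the Launch model is backed by a launches collection; adjust the names to your environment.
async function getRecentActivityByUserId (req, res) {
  const { userId } = req.params
  const recents = await User.aggregate([
    // match the user first, instead of a separate findOne()
    { $match: { userId } },
    // then "join" the launches referenced by user.contributions
    { $lookup: {
        from: 'launches', // assumed collection name behind the Launch model
        localField: 'contributions',
        foreignField: '_id',
        as: 'contributions'
    } },
    { $unwind: '$contributions' },
    { $replaceRoot: { newRoot: '$contributions' } },
    // same renaming as in the self-answer
    { $addFields: {
        title: 'You Added One',
        launchName: '$name',
        launchId: '$_id',
        date: '$addedAt',
        content: '$description'
    } }
  ])
  return res.status(200).send(recents.reverse())
}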

How to make a search api in nodejs and mongodb

I am trying to make a search API using Node.js and MongoDB. I tried to google this and did find something, but while trying to implement it I get an error. Honestly, I don't know anything about making a search API, so any help or suggestions would be appreciated.
This is the link to the post I found on Google: Building a simple search api.
error
{
"error": {
"message": "Cast to ObjectId failed for value \"search\" at path \"_id\" for model \"Post\"",
"name": "CastError",
"stringValue": "\"search\"",
"kind": "ObjectId",
"value": "search",
"path": "_id"
}
}
This is my code
postController.search = (req, res) => {
var response = [];
if(typeof req.query.title !== 'undefined'){
db.Post.filter(function(post) {
if(post.title === req.query.title){
console.log(req.body);
response.push(post);
console.log(post);
}
});
}
response = _.uniqBy(response, '_id');
if(Object.key(req.query).length === 0){
response = db.Post
}
res.json(response);
};
data in the collection
"data": [
{
"isDeleted": false,
"_comments": [],
"_id": "5d39122036117d2ea81b434c",
"title": "facebook post",
"link": "facebook.com",
"_creator": {
"createdAt": "2019-07-25T01:42:21.252Z",
"username": "adityakmr"
},
"createdAt": "2019-07-25T02:21:20.634Z",
"__v": 0
},
]
If you're trying to create an API to search a MongoDB collection based on title (i.e., a text field), try implementing MongoDB's text search feature: text search in MongoDB. Just create a text index on the title field, then create an API endpoint that takes in a parameter that can be queried against the title field.
Text search can be a bit tricky, but it can help you with fuzzy/partial/full-text searches; a regex can also be very useful.
Check out these links for Node.js API examples:
MongoDB NodeJs Docs
Full Text Search with MongoDB & Node.js
Text Searching with MongoDB
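For illustration, here is a minimal sketch of that text-search setup, assuming a Mongoose Post model with a title field (the names are placeholders):
// Declare a text index on the title field in the schema:
postSchema.index({ title: 'text' });
// Then query with the $text operator, which uses the index:
const results = await Post.find({ $text: { $search: req.query.title } });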
First of all, you should use async/await to modularize your code. I suggest not writing all your code in your controller.js file; an API can be structured in layers (routes - controller - utils).
postRoutes.js
postRouter.get('/search-post', postCtr.searchPost);
postController.js
const postUtils = require('./postUtils');
const postController = {};
postController.searchPost = async (req, res) => {
try {
const { title } = req.query;
const result = await postUtils.searchPost(title);
return res.status(200).json(result);
} catch (err) {
return res.status(err.code).json({ error: err.error });
}
};
module.exports = postController;
postUtils.js
const Post = require('./postModel');
const postUtils = {};
postUtils.searchPost = async (title) => {
try {
let result = [];
if(title){
// You can also use a regex in your search
result = await Post.find({ title: title });
}
return result;
} catch (err) {
const errorObj = { code: 500, error: 'Internal server error' }; // It can be dynamic
throw errorObj;
}
};
module.exports = postUtils;
postModel.js
const mongoose = require('mongoose');
const postSchema = new mongoose.Schema({
user: {
type: mongoose.Schema.Types.ObjectId,
ref: 'user',
required: true,
},
// Your fields ...
}, { collection: 'post', timestamps: true });
const post = mongoose.model('post', postSchema);
module.exports = post;
Using this structure you can easily debug your code, and it's also more maintainable.
In the link you specified above, they are using an array of objects stored in a file called store.js, not MongoDB, so they filter it directly using the Array.filter method.
But in MongoDB, using Mongoose (an object modeling tool), you can make use of the collection.find() method.
So the solution to your problem is as follows:
postController.search = async (req, res) => {
var response = [];
if (req.query.title) {
response = await db.Post.find({title: req.query.title});
}
res.json(response);
};
find() is a built-in query method that helps in querying collections; you can pass multiple properties to query against.
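As the inline comment in the first answer suggests, a regex can also be used for partial, case-insensitive matches. A minimal sketch, extending the answer above rather than part of the original:
// Matches any title containing the query string, ignoring case:
response = await db.Post.find({ title: { $regex: req.query.title, $options: 'i' } });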

Import CSV Using Mongoose Schema

Currently I need to push a large CSV file into a MongoDB collection, and the order of the values needs to determine the key for each DB entry:
Example CSV file:
9,1557,358,286,Mutantville,4368,2358026,,M,0,0,0,1,0
9,1557,359,147,Wroogny,4853,2356061,,D,0,0,0,1,0
Code to parse it into arrays:
var fs = require("fs");
var csv = require("fast-csv");
fs.createReadStream("rank.txt")
.pipe(csv())
.on("data", function(data){
console.log(data);
})
.on("end", function(data){
console.log("Read Finished");
});
Code Output:
[ '9',
'1557',
'358',
'286',
'Mutantville',
'4368',
'2358026',
'',
'M',
'0',
'0',
'0',
'1',
'0' ]
[ '9',
'1557',
'359',
'147',
'Wroogny',
'4853',
'2356061',
'',
'D',
'0',
'0',
'0',
'1',
'0' ]
How do I insert the arrays into my mongoose schema to go into MongoDB?
Schema:
var mongoose = require("mongoose");
var rankSchema = new mongoose.Schema({
serverid: Number,
resetid: Number,
rank: Number,
number: Number,
name: String,
land: Number,
networth: Number,
tag: String,
gov: String,
gdi: Number,
protection: Number,
vacation: Number,
alive: Number,
deleted: Number
});
module.exports = mongoose.model("Rank", rankSchema);
The order of the array needs to match the order of the schema; for instance, the first number 9 in the array always needs to be saved as the key "serverid", and so forth. I'm using Node.js.
You can do it with fast-csv by getting the headers from the schema definition which will return the parsed lines as "objects". You actually have some mismatches, so I've marked them with corrections:
const fs = require('mz/fs');
const csv = require('fast-csv');
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.set('debug', true);
const rankSchema = new Schema({
serverid: Number,
resetid: Number,
rank: Number,
name: String,
land: String, // <-- You have this as Number but it's a string
networth: Number,
tag: String,
stuff: String, // the empty field in the csv
gov: String,
gdi: Number,
protection: Number,
vacation: Number,
alive: Number,
deleted: Number
});
const Rank = mongoose.model('Rank', rankSchema);
const log = data => console.log(JSON.stringify(data, undefined, 2));
(async function() {
try {
const conn = await mongoose.connect(uri);
await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));
let headers = Object.keys(Rank.schema.paths)
.filter(k => ['_id','__v'].indexOf(k) === -1);
console.log(headers);
await new Promise((resolve,reject) => {
let buffer = [],
counter = 0;
let stream = fs.createReadStream('input.csv')
.pipe(csv({ headers }))
.on("error", reject)
.on("data", async doc => {
stream.pause();
buffer.push(doc);
counter++;
log(doc);
try {
if ( counter > 10000 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
} catch(e) {
stream.destroy(e);
}
stream.resume();
})
.on("end", async () => {
try {
if ( counter > 0 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
// resolve outside the check, so the promise settles even when the
// buffer was already flushed by the final "data" batch
resolve();
} catch(e) {
stream.destroy(e);
}
});
});
} catch(e) {
console.error(e)
} finally {
process.exit()
}
})()
As long as the schema actually lines up with the provided CSV, it's okay. These are the corrections I can see, but if the actual field names need to be aligned differently then you need to adjust. Basically, there was a Number in the position where there is a String, and essentially an extra field, which I'm presuming is the blank one in the CSV.
The general things are getting the array of field names from the schema and passing that into the options when making the csv parser instance:
let headers = Object.keys(Rank.schema.paths)
.filter(k => ['_id','__v'].indexOf(k) === -1);
let stream = fs.createReadStream('input.csv')
.pipe(csv({ headers }))
Once you actually do that then you get an "Object" back instead of an array:
{
"serverid": "9",
"resetid": "1557",
"rank": "358",
"name": "286",
"land": "Mutantville",
"networth": "4368",
"tag": "2358026",
"stuff": "",
"gov": "M",
"gdi": "0",
"protection": "0",
"vacation": "0",
"alive": "1",
"deleted": "0"
}
Don't worry about the "types" because Mongoose will cast the values according to schema.
The rest happens within the handler for the data event. For maximum efficiency we are using insertMany() to only write to the database once every 10,000 lines. How that actually goes to the server and processes depends on the MongoDB version, but 10,000 should be pretty reasonable based on the average number of fields you would import for a single collection in terms of the "trade-off" for memory usage and writing a reasonable network request. Make the number smaller if necessary.
The important parts are to mark these calls as async functions and await the result of the insertMany() before continuing. Also we need to pause() the stream and resume() on each item otherwise we run the risk of overwriting the buffer of documents to insert before they are actually sent. The pause() and resume() are necessary to put "back-pressure" on the pipe, otherwise items just keep "coming out" and firing the data event.
Naturally the control for the 10,000 entries requires we check that both on each iteration and on stream completion in order to empty the buffer and send any remaining documents to the server.
That's really what you want to do, as you certainly don't want to fire off an async request to the server both on "every" iteration through the data event or essentially without waiting for each request to complete. You'll get away with not checking that for "very small files", but for any real world load you're certain to exceed the call stack due to "in flight" async calls which have not yet completed.
FYI, here is the package.json used. The mz package is optional, as it's just a modernized, Promise-enabled library of standard Node "built-in" libraries that I'm simply used to using. The code is, of course, completely interchangeable with the fs module.
{
"description": "",
"main": "index.js",
"dependencies": {
"fast-csv": "^2.4.1",
"mongoose": "^5.1.1",
"mz": "^2.7.0"
},
"keywords": [],
"author": "",
"license": "ISC"
}
Actually, with Node v8.9.x and above, we can make this much simpler with an implementation of AsyncIterator through the stream-to-iterator module. It's still in Iterator<Promise<T>> mode, but it should do until Node v10.x becomes stable LTS:
const fs = require('mz/fs');
const csv = require('fast-csv');
const streamToIterator = require('stream-to-iterator');
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.set('debug', true);
const rankSchema = new Schema({
serverid: Number,
resetid: Number,
rank: Number,
name: String,
land: String,
networth: Number,
tag: String,
stuff: String, // the empty field
gov: String,
gdi: Number,
protection: Number,
vacation: Number,
alive: Number,
deleted: Number
});
const Rank = mongoose.model('Rank', rankSchema);
const log = data => console.log(JSON.stringify(data, undefined, 2));
(async function() {
try {
const conn = await mongoose.connect(uri);
await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));
let headers = Object.keys(Rank.schema.paths)
.filter(k => ['_id','__v'].indexOf(k) === -1);
//console.log(headers);
let stream = fs.createReadStream('input.csv')
.pipe(csv({ headers }));
const iterator = await streamToIterator(stream).init();
let buffer = [],
counter = 0;
for ( let docPromise of iterator ) {
let doc = await docPromise;
buffer.push(doc);
counter++;
if ( counter > 10000 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
}
if ( counter > 0 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
} catch(e) {
console.error(e)
} finally {
process.exit()
}
})()
Basically, all of the stream "event" handling and pausing and resuming gets replaced by a simple for loop:
const iterator = await streamToIterator(stream).init();
for ( let docPromise of iterator ) {
let doc = await docPromise;
// ... The things in the loop
}
Easy! This gets cleaned up in later Node implementations with for..await..of when it becomes more stable. But the above runs fine from the specified version and above.
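For reference, a sketch of the same loop using native for await...of, assuming a Node version where readable streams are async iterable (available from Node 10, stable in later releases):
// buffer, counter, stream, and Rank as defined above
for await (const doc of stream) {
  buffer.push(doc);
  counter++;
  if (counter > 10000) {
    await Rank.insertMany(buffer);
    buffer = [];
    counter = 0;
  }
}
// flush any remaining documents after the stream ends
if (counter > 0) {
  await Rank.insertMany(buffer);
}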
Note that, unlike @Neil Lunn's answer, this approach needs a header line within the CSV itself.
Example using the csvtojson module:
const csv = require('csvtojson');
const csvArray = [];
csv()
  .fromFile(filePath)
  .on('json', (jsonObj) => {
    csvArray.push({ name: jsonObj.name, id: jsonObj.id });
  })
  .on('done', (error) => {
    if (error) {
      return res.status(500).json({ error });
    }
    Model.create(csvArray)
      .then((result) => {
        return res.status(200).json({ result });
      }).catch((err) => {
        return res.status(500).json({ error: err });
      });
  });

Async await unit testing issue

This is the update function I want to test against a mocked database:
import Book from '../model/book';
function bookRepository(db) {
this.db = db;
};
bookRepository.prototype.update = async function(id, data) {
return await Book.findOneAndUpdate({ _id: id }, { $set: data });
}
export default bookRepository;
This is test script I wrote for it
import chai from 'chai';
import chaiAsPromised from 'chai-as-promised';
chai.use(chaiAsPromised);
const expect = chai.expect;
import app from '../../server';
import bookRepo from '../../repository/book';
const Book = new bookRepo(app.db);
describe('Test repository: book', () => {
describe('update', () => {
let id;
beforeEach(async() => {
let book = {
name: 'Records of the Three Kingdoms',
type: 'novel',
description: 'History of the late Eastern Han dynasty (c. 184–220 AD) and the Three Kingdoms period (220–280 AD)',
author: 'Luo Guanzhong',
language: 'Chinese'
};
let result = await Book.insert(book);
id = await result.id;
return;
});
it('Update successfully', async() => {
let data = {
type: 'history',
author: 'Chen Shou'
};
let result = await Book.update(id, data);
await expect(result).to.be.an('object');
await expect(result.type).to.be.equal('history');
return expect(result.author).to.be.equal('Chen Shou');
});
});
});
And I received this error
AssertionError: expected 'novel' to equal 'history'
+ expected - actual
When I check the mocked database, it does update the data, so why does the assertion fail? The document should already be updated once the await call completes.
The findOneAndUpdate method takes options as its third argument. One of the options is returnNewDocument: <boolean>, which is false by default. If you don't set this option to true, MongoDB updates the document but returns the old document as the result. If you set it to true, MongoDB returns the newly updated document.
From the official docs -
Returns either the original document or, if returnNewDocument: true, the updated document.
So in your update method, make the following change -
return await Book.findOneAndUpdate({ _id: id }, { $set: data }, { returnNewDocument : true });
You can read about it here.
Edit: If you're using Mongoose, use the { new: true } option instead of the option above, as Mongoose uses findAndModify underneath its findOneAndUpdate method.
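Applied to the repository from the question, the Mongoose form of the fix is a one-line change (a sketch based on the answer above):
bookRepository.prototype.update = async function(id, data) {
  // { new: true } tells Mongoose to return the updated document, not the original
  return await Book.findOneAndUpdate({ _id: id }, { $set: data }, { new: true });
}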

Execute Sequelize queries synchronously

I am building a website using Node.js and Sequelize (with a Postgres backend). I have a query that returns many objects with a foreign key, and I want to pass to the view a list of the objects that the foreign key references.
In the example, Attendances contains Hackathon keys, and I want to return a list of hackathons. Since the code is async, the following of course does not work in Node:
models.Attendance.findAll({
where: {
UserId: req.user.id
}
}).then(function (data) {
var hacks = [];
for (var d in data) {
models.Hackathon.findOne({
where: {
id: data[d].id
}
}).then(function (data1) {
hacks.push(data1);
});
}
res.render('dashboard/index.ejs', {title: 'My Hackathons', user: req.user, hacks: hacks});
});
Is there any way to do that query synchronously, meaning that I don't return the view until I have the "hacks" list filled with all the objects?
Thanks!
Use Promise.all to execute all of your queries then call the next function.
models.Attendance.findAll({
where: {
UserId: req.user.id
}
}).then(function (data) {
// get an array of the data keys, (not sure if you need to do this)
// it is unclear whether data is an object of users or an array. I assume
// it's an object as you used a `for in` loop
const keys = Object.keys(data)
// map the data keys to [Promise(query), Promise(query), {...}]
const hacks = keys.map((d) => {
return models.Hackathon.findOne({
where: {
id: data[d].id
}
})
})
// user Promise.all to resolve all of the promises asynchronously
Promise.all(hacks)
// this will be called once all promises have resolved so
// you can modify your data. it will be an array of the returned values
.then((users) => {
const [user1, user2, ...rest] = users
res.render('dashboard/index.ejs', {
title: 'My Hackathons',
user: req.user,
hacks: users
});
})
});
The Sequelize library has the include parameter, which merges models in one call. Adjust your where statement to bring the Hackathon model into Attendance. If this does not work, take the necessary time to set up Sequelize correctly; their documentation is constantly being improved. In the end, you'll save loads of time by reducing errors and making your code readable for other programmers.
Look how much cleaner this can be...
models.Attendance.findAll({
  include: [{
    model: Hackathon,
    as: 'hackathon'
  }],
  where: {
    UserId: req.user.id
  }
}).then(function (data) {
  // findAll resolves with an array of attendances; for the first one:
  // hackathon id
  console.log(data[0].hackathon.id)
  // attendance id
  console.log(data[0].id)
})
Also..
Hackathon.belongsTo(Attendance)
Attendance.hasMany(Hackathon)
sequelize.sync().then(() => {
// this is where we continue ...
})
Learn more about Sequelize includes here:
http://docs.sequelizejs.com/en/latest/docs/models-usage/
Immediately invoke asynchronous function expression
This is one of the techniques mentioned at: How can I use async/await at the top level? Top-level await is likely coming soon as of 2021, which will be even better.
Minimal runnable example:
const assert = require('assert');
const { Sequelize, DataTypes } = require('sequelize');
const sequelize = new Sequelize({
dialect: 'sqlite',
storage: 'db.sqlite',
});
const IntegerNames = sequelize.define(
'IntegerNames', {
value: { type: DataTypes.INTEGER, allowNull: false },
name: { type: DataTypes.STRING, },
}, {});
(async () => {
await IntegerNames.sync({force: true})
await IntegerNames.create({value: 2, name: 'two'});
await IntegerNames.create({value: 3, name: 'three'});
await IntegerNames.create({value: 5, name: 'five'});
// Fill array.
let integerNames = [];
integerNames.push(await IntegerNames.findOne({
where: {value: 2}
}));
integerNames.push(await IntegerNames.findOne({
where: {value: 3}
}));
// Use array.
assert(integerNames[0].name === 'two');
assert(integerNames[1].name === 'three');
await sequelize.close();
})();
Tested on Node v14.16.0, sequelize 6.6.2, sqlite3 5.0.2, Ubuntu 20.10.
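For comparison, a sketch of the same example using top-level await instead of the IIFE; this assumes Node >= 14.8 and an ES module (e.g. a .mjs file, a hypothetical name below):
// integer-names.mjs
import { Sequelize, DataTypes } from 'sequelize';

const sequelize = new Sequelize({ dialect: 'sqlite', storage: 'db.sqlite' });
const IntegerNames = sequelize.define('IntegerNames', {
  value: { type: DataTypes.INTEGER, allowNull: false },
  name: { type: DataTypes.STRING },
});

// no async wrapper needed: await is legal at module top level
await IntegerNames.sync({ force: true });
await IntegerNames.create({ value: 2, name: 'two' });
const two = await IntegerNames.findOne({ where: { value: 2 } });
console.log(two.name); // 'two'
await sequelize.close();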
