MongoDB - Mongoose - TypeError: save is not a function - javascript

I am attempting to update a MongoDB document (using Mongoose) by first using .findById to get the document, then updating the fields in that document with new values. I am still a bit new to this, so I used a tutorial to figure out how to get it working and have been adapting the code to my needs. Here is the tutorial: MEAN App Tutorial with Angular 4. The original code had a schema defined, but my requirement is for a generic MongoDB interface that will simply take whatever payload is sent to it and pass it along to MongoDB. The original tutorial had something like this:
exports.updateTodo = async function(todo){
    var id = todo.id
    try{
        //Find the old Todo object by its id
        var oldTodo = await ToDo.findById(id);
    }catch(e){
        throw Error("An error occurred while finding the Todo")
    }
    // If no old Todo object exists, return false
    if(!oldTodo){
        return false;
    }
    console.log(oldTodo)
    //Edit the Todo object
    oldTodo.title = todo.title
    oldTodo.description = todo.description
    oldTodo.status = todo.status
    console.log(oldTodo)
    try{
        var savedTodo = await oldTodo.save()
        return savedTodo;
    }catch(e){
        throw Error("An error occurred while updating the Todo");
    }
}
However, since I don't want a schema and want to allow anything through, I don't want to assign static values to specific field names like title, description, and status. So, I came up with this:
exports.updateData = async function(update){
    var id = update.id
    // Check the existence of the query parameters; if they don't exist, assign a default value
    var dbName = update.dbName ? update.dbName : 'test'
    var collection = update.collection ? update.collection : 'testing';
    const Test = mongoose.model(dbName, TestSchema, collection);
    try{
        //Find the existing Test object by its id
        var existingData = await Test.findById(id);
    }catch(e){
        throw Error("Error occurred while finding the Test document - " + e)
    }
    // If no existing Test object exists, return false
    if(!existingData){
        return false;
    }
    console.log("Existing document is " + existingData)
    //Edit the Test object
    existingData = JSON.parse(JSON.stringify(update))
    //This was another way to overwrite existing field values, but
    //it performs a "shallow copy" so it's not desirable
    //existingData = Object.assign({}, existingData, update)
    //existingData.title = update.title
    //existingData.description = update.description
    //existingData.status = update.status
    console.log("New data is " + existingData)
    try{
        var savedOutput = await existingData.save()
        return savedOutput;
    }catch(e){
        throw Error("An error occurred while updating the Test document - " + e);
    }
}
My original problem with this was that I had a lot of issues getting the new values to overwrite the old ones. Now that that's been solved, I am getting the error of "TypeError: existingData.save is not a function". I am thinking the data type changed or something, and now it is not being accepted. When I uncomment the static values that were in the old tutorial code, it works. This is further supported by my console logging before and after I join the objects, because the first one prints the actual data and the second one prints [object Object]. However, I can't seem to figure out what it's expecting. Any help would be greatly appreciated.
EDIT: I figured it out. Apparently Mongoose documents are instances of their own "Model" type, which gets lost if you replace the underlying data using things like JSON.stringify. I used Object.prototype.constructor to figure out the actual object type like so:
console.log("THIS IS BEFORE: " + existingData.constructor);
existingData = JSON.parse(JSON.stringify(update));
console.log("THIS IS AFTER: " + existingData.constructor);
And I got this:
THIS IS BEFORE: function model(doc, fields, skipId) {
    model.hooks.execPreSync('createModel', doc);
    if (!(this instanceof model)) {
        return new model(doc, fields, skipId);
    }
    Model.call(this, doc, fields, skipId);
}
THIS IS AFTER: function Object() { [native code] }
Which showed me what was actually going on. I added this to fix it:
existingData = new Test(JSON.parse(JSON.stringify(update)));
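For what it's worth, a lighter-weight variant would be to copy the incoming fields onto the document that findById returned, so it stays a Mongoose document and keeps its .save() method. This is only a sketch, and it assumes TestSchema is declared with { strict: false } so arbitrary payload fields are accepted:
exports.updateDataAlt = async function(update){
    // Hypothetical variant of updateData above, shown for illustration only.
    var dbName = update.dbName ? update.dbName : 'test'
    var collection = update.collection ? update.collection : 'testing';
    const Test = mongoose.model(dbName, TestSchema, collection);

    var existingData = await Test.findById(update.id);
    if(!existingData){
        return false;
    }

    // Strip the routing fields, then copy the rest onto the fetched document.
    // Because existingData is never replaced, it is still a Mongoose document
    // and .save() remains a function.
    const { id, dbName: _db, collection: _coll, ...fields } = update;
    existingData.set(fields);

    return existingData.save();
}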
On a related note, I should probably just use the native MongoDB driver at this point, but it's working, so I'll just put it on my to do list for now.

You've now found a solution, but I would suggest using the MongoDB driver, which would make the original issue disappear. Your code would look something along these lines:
// MongoDB Settings
const MongoClient = require(`mongodb`).MongoClient;
const mongodb_uri = `mongodb+srv://${REPLACE_mongodb_username}:${REPLACE_mongodb_password}@url-here.gcp.mongodb.net/test`;
const db_name = `test`;
let client; // MongoClient instance
let db; // allows us to reuse the database connection once it is opened

// Open MongoDB Connection
const open_database_connection = async () => {
    try {
        client = await MongoClient.connect(mongodb_uri);
    } catch (err) { throw new Error(err); }
    db = client.db(db_name);
};

exports.updateData = async update => {
    // open database connection if it isn't already open
    try {
        if (!db) await open_database_connection();
    } catch (err) { throw new Error(err); }

    // update document
    let savedOutput;
    try {
        savedOutput = await db.collection(`testing`).updateOne( // .save() is being deprecated
            { // filter
                _id: update.id // the '_id' might need to be 'id' depending on how you have set your collection up, usually it is '_id'
            },
            { // update
                $set: update // I've assumed that you are overwriting the fields you are updating, hence the '$set' operator - this assumes the update object only contains fields that should be updated
            }
            // If you want to add a new document when the id isn't found, add the line below
            // , { upsert: true }
        );
    } catch (err) { throw new Error(`An error occurred while updating the Test document - ${err}`); }
    if (savedOutput.matchedCount !== 1) return false; // if you add in '{ upsert: true }' above, then remove this line as it will create a new document
    return savedOutput;
}
The testing collection would need to be created before this code runs, but that is a one-time thing and is very easy - if you are using MongoDB Atlas you can use MongoDB Compass or your online admin console to create the collection without a single line of code.
As far as I can see, you shouldn't need to duplicate the update object. The above reduces the database calls from two to one and allows you to reuse the database connection, potentially anywhere else in the application, which helps speed things up. Also, don't store your MongoDB credentials directly in the code.
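On that last point, one common approach (just a sketch; the variable names are placeholders) is to load the credentials from environment variables, for example via a .env file and the dotenv package:
// Sketch: keep credentials out of source control.
// Assumes a .env file (ignored by git) defining MONGODB_USERNAME,
// MONGODB_PASSWORD and MONGODB_HOST - the names are placeholders.
require('dotenv').config();

const mongodb_uri = `mongodb+srv://${process.env.MONGODB_USERNAME}:${process.env.MONGODB_PASSWORD}@${process.env.MONGODB_HOST}/test`;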

Related

Override Mongoose save method to retry on `duplicate key error`

My Mongoose schema uses a custom _id value, and the code I inherited does something like this:
const sampleSchema = new mongoose.Schema({
    _id: String,
    key: String,
});

sampleSchema.statics.generateId = async function() {
    let id;
    do {
        id = randomStringGenerator.generate({length: 8, charset: 'hex', capitalization: 'uppercase'});
    } while (await this.exists({_id: id}));
    return id;
};

let SampleModel = mongoose.model('Sample', sampleSchema);
A simple usage looks like this:
let mySample = new SampleModel({_id: await SampleModel.generateId(), key: 'a' });
await mySample.save();
There are at least three problems with this:
Every save will require at least two trips to the database, one to test for a unique id and one to save the document.
For this to work, it is necessary to manually call generateId() before each save. An ideal solution would handle that for me, like Mongoose does with ids of type ObjectId.
Most significantly, there is a potential race condition that will result in a duplicate key error. Consider two clients running this code: both coincidentally generate the same id at the same time, both look in the database and find the id absent, and both try to write the record to the database. The second will fail.
An ideal solution would, on save, generate an id, save it to the database and on duplicate key error, generate a new id and retry. Do this in a loop until the document is stored successfully.
The trouble is, I don't know how to get Mongoose to let me do this.
Here's what I tried: based on this SO question, I found a rather old sample (using a very old Mongoose version) of overriding the save function to accomplish something similar, and I based this attempt on it.
// First, change generateId() to force a collision
let ids = ['a', 'a', 'a', 'b'];
let index = 0;
let generateId = function() {
    return ids[index++];
};

// Configure middleware to generate the id before a save
sampleSchema.pre('validate', function(next) {
    if (this.isNew)
        this._id = generateId();
    next();
});

// Now override the save function
SampleModel.prototype.save_original = SampleModel.prototype.save;
SampleModel.prototype.save = function(options, callback) {
    let self = this;
    let retryOnDuplicate = function(err, savedDoc) {
        if (err) {
            if (err.code === 11000 && err.name === 'MongoError') {
                self.save(options, retryOnDuplicate);
                return;
            }
        }
        if (callback) {
            callback(err, savedDoc);
        }
    };
    return self.save_original(options, retryOnDuplicate);
}
This gets me close but I'm leaking a promise and I'm not sure where.
let sampleA = new SampleModel({key: 'a'});
let sampleADoc = await sampleA.save();
console.log('sampleADoc', sampleADoc); // prints undefined, but should print the document
let sampleB = new SampleModel({key: 'b'});
let sampleBDoc = await sampleB.save();
console.log('sampleBDoc', sampleBDoc); // prints undefined, but should print the document
let all = await SampleModel.find();
console.log('all', all); // prints `[]`, but should be an array of two documents
Output
sampleADoc undefined
sampleBDoc undefined
all []
The documents eventually get written to the database, but not before the console.log calls are made.
Where am I leaking a promise? Is there an easier way to do this that addresses the three problems I outlined?
Edit 1:
Mongoose version: 5.11.15
I fixed the problem by changing the save override. The full solution looks like this:
const sampleSchema = new mongoose.Schema({
    _id: String,
    color: String,
});

let generateId = function() {
    return randomStringGenerator.generate({length: 8, charset: 'hex', capitalization: 'uppercase'});
};

sampleSchema.pre('validate', function() {
    if (this.isNew)
        this._id = generateId();
});

let SampleModel = mongoose.model('Sample', sampleSchema);

SampleModel.prototype.save_original = SampleModel.prototype.save;
SampleModel.prototype.save = function(options, callback) {
    let self = this;

    let isDupKeyError = (error, field) => {
        // Determine whether the error is a duplicate key error on the given field
        return error?.code === 11000 && error?.name === 'MongoError' && error?.keyValue[field];
    }

    let saveWithRetries = (options, callback) => {
        // save() returns undefined if used with a callback, or a Promise otherwise.
        // https://mongoosejs.com/docs/api/document.html#document_Document-save
        let promise = self.save_original(options, callback);
        if (promise) {
            return promise.catch((error) => {
                if (isDupKeyError(error, '_id')) {
                    return saveWithRetries(options, callback);
                }
                throw error;
            });
        }
    };

    let retryCallback;
    if (callback) {
        retryCallback = (error, saved, rows) => {
            if (isDupKeyError(error, '_id')) {
                saveWithRetries(options, retryCallback);
            } else {
                callback(error, saved, rows);
            }
        }
    }

    return saveWithRetries(options, retryCallback);
}
This will generate an _id repeatedly until a successful save is called and addresses the three problems outlined in the original question:
The minimum number of trips to the database has been reduced from two to one. Of course, if there are collisions, more trips will occur, but that's the exceptional case.
This implementation takes care of generating the id itself with no manual step to take before saving. This reduces complexity and removes the required knowledge of prerequisites for saving that are present in the original method.
The race condition has been addressed. It won't matter if two clients attempt to use the same key. One will succeed and the other will generate a new key and save again.
To improve this:
There ought to be a maximum number of save attempts for a single document followed by failure. In this case, you've perhaps used up all the available keys in whatever domain you're using.
The unique field may not be named _id, or you might have multiple fields that require a unique generated value. The embedded helper function isDupKeyError() could be updated to look for multiple keys, and on error you could add logic to regenerate just the failed key(s), as in the sketch below.
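For example, something along these lines might work (a rough sketch only, independent of the save() override above; the maxAttempts cap, the reliance on error.keyValue, and calling generateId() for every collided field are assumptions you would adapt to your schema):
// Sketch: cap the number of retries and regenerate only the fields that collided.
let isDupKeyError = (error) => error?.code === 11000 && error?.name === 'MongoError';
let dupKeyFields = (error) => Object.keys(error?.keyValue || {}); // e.g. ['_id'] or ['slug']

let saveWithRetries = async (doc, maxAttempts = 5) => {
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            return await doc.save();
        } catch (error) {
            if (!isDupKeyError(error)) throw error;
            // Regenerate just the colliding unique fields, then retry.
            for (const field of dupKeyFields(error)) doc[field] = generateId();
        }
    }
    throw new Error(`Could not find a free key after ${maxAttempts} attempts`);
};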

Uncaught DOMException: Failed to execute 'put' on 'IDBObjectStore': Evaluating the object store's key path did not yield a value at request.onsuccess

I'm trying to store some application data using IndexedDB.
Here is my code:
function _getLocalApplicationCache(_, payload) {
    const indexedDB = window.indexedDB || window.mozIndexedDB || window.webkitIndexedDB || window.shimIndexedDB;
    if (!indexedDB) {
        if (__DEV__) {
            console.error("IndexedDB could not be found in this browser.");
        }
    }

    const request = indexedDB.open("ApplicationCache", 1);

    request.onerror = event => {
        if (__DEV__) {
            console.error("An error occurred with IndexedDB.");
            console.error(event);
        }
        return;
    };

    request.onupgradeneeded = function () {
        const db = request.result;
        const store = db.createObjectStore("swimlane", {keyPath: "id", autoIncrement: true});
        store.createIndex("keyData", ["name"], {unique: false});
    };

    request.onsuccess = () => {
        // creating the transaction
        const db = request.result;
        const transaction = db.transaction("swimlane", "readwrite");
        // Reference to our object store that holds the swimlane data
        const store = transaction.objectStore("swimlane");
        const swimlaneData = store.index("keyData");

        payload = JSON.parse(JSON.stringify(payload));
        store.put(payload);

        const Query = swimlaneData.getAll(["keyData"]);
        Query.onsuccess = () => {
            if (__DEV__) {
                console.log("Application Cache is loaded", Query.result);
            }
        };

        transaction.oncomplete = () => {
            db.close();
        };
    };
}
If I use a version other than 1 here --> indexedDB.open("ApplicationCache", 1);
then I get an error saying the keyPath already exists. And with version 1 I get the error above.
Can someone please help me figure out where I'm going wrong?
Review the introductory materials on using indexedDB.
If you did something like connect and create a database without a schema, or created an object store without an explicit key path, then stored some objects, then edited the upgradeneeded callback to specify the keyPath, but never triggered the upgradeneeded callback to run because you kept using the current version number instead of a newer one, that would be one possible explanation for this error.
The upgradeneeded callback needs to have logic that checks for whether the object stores and indices already exist, and only create them if they do not exist. If the store does not exist, create it and its indices. If the store exists and the indices do not, add indices to the store. If the store exists and the indices exist, do nothing.
You need to trigger the upgradeneeded callback to run after changing your database schema by connecting with a higher version number. If you do not connect with a higher version number, the callback never runs, so you will end up connecting to the older version where your schema changes have not taken place.
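For illustration, a guarded upgrade handler might look something like this (a sketch only; the bumped version number 2 is just an example, and the store/index names mirror the question's code):
const request = indexedDB.open("ApplicationCache", 2); // bump the version so onupgradeneeded fires

request.onupgradeneeded = () => {
    const db = request.result;
    // Only create the store if it doesn't exist yet; otherwise reuse it via
    // the versionchange transaction that is active during the upgrade.
    const store = db.objectStoreNames.contains("swimlane")
        ? request.transaction.objectStore("swimlane")
        : db.createObjectStore("swimlane", { keyPath: "id", autoIncrement: true });
    if (!store.indexNames.contains("keyData")) {
        store.createIndex("keyData", ["name"], { unique: false });
    }
};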

Azure CosmosDb Stored Procedure IfMatch Predicate

In a DocDb stored procedure, as the first step in a process that mutates a document, I read the data and then only use it if it matches the supplied etag, like so:
collection.readDocument(reqSelf, function(err, doc) {
    if (doc._etag == requestEtag) {
        // Success - want to update
    } else {
        // CURRENTLY: Discard the read result I just paid lots of RUs to read
        // IDEALLY: check whether response `options` or similar indicates retrieval
        // was skipped due to doc not being present with that etag anymore
        ...
        // ... Continue with an alternate strategy
    }
});
Is there a way to pass options to the readDocument call such that the callback will be informed, "It's changed, so we didn't get it, as you requested"?
(My real problem here is that I can't find any documentation other than the readDocument undocumentation in the js-server docs)
Technically you can do that by creating a responseOptions object and passing it to the call.
function sample(selfLink, requestEtag) {
    var collection = getContext().getCollection();
    var responseOptions = { accessCondition: { type: "IfMatch", condition: requestEtag } };

    var isAccepted = collection.readDocument(selfLink, responseOptions, function(err, doc, options) {
        if(err){
            throw new Error('Error thrown. Check the status code for PreconditionFailed errors');
        }
        var response = getContext().getResponse();
        response.setBody(doc);
    });

    if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
However, even if the etag you provide is not the one that the document currently has, you won't get an error and you will still get the document itself back. Apparently it's just not supposed to work that way with the readDocument function in a stored procedure.
Thanks to some pushing from @Nick Chapsas, and this self-answer from @Redman, I worked out that in my case I can achieve my goal (either read the current document via the self-link, or the newer one that has replaced it bearing the same id) by instead generating an Alt link within the stored procedure like so:
var docId = collection.getAltLink() + "/docs/" + req.id;

var isAccepted = collection.readDocument(docId, {}, function (err, doc, options) {
    if (err) throw err;
    // Will be null or not depending on whether it exists
    executeUpsert(doc);
});

if (!isAccepted) throw new Error("readDocument not Accepted");

Model return values from another file not working with Sequelize.js

I have the following code that does not work currently.
var config = require('./libs/sequelize-lib.js');

var connection = config.getSequelizeConnection();//Choosing to not pass in variable this time since this should only run via script.
var models = config.setModels(connection);//Creates live references to the models.

//Alter table as needed but do NOT force the change. If an error occurs we will fix manually.
connection.sync({ alter: true, force: false }).then(function() {
    models.users.create({
        name: 'joe',
        loggedIn: true
    }).then( task => {
        console.log("saved user!!!!!");
    });
    process.exit();//close the nodeJS Script
}).catch(function(error) {
    console.log(error);
});
sequelize-lib.js
var Sequelize = require('sequelize');

exports.getSequelizeConnection = function(stage){
    var argv = require('minimist')(process.argv.slice(2)); //If this file is being used in a script, this will attempt to get information from the stage argument if it exists

    //Change connection settings based on stage variable. Assume localhost by default.
    var dbname = argv['stage'] ? argv['stage']+"_db" : 'localdb';
    var dbuser = argv['stage'] ? process.env.RDS_USERNAME : 'admin';
    var dbpass = argv['stage'] ? process.env.RDS_PASSWORD : 'local123';
    var dbhost = argv['stage'] ? "database-"+argv['stage']+".whatever.com" : 'localhost';

    //If the stage variable is passed in during require, override any arguments passed.
    if(stage){
        dbname = stage+"_db";
        dbuser = process.env.RDS_USERNAME
        dbpass = process.env.RDS_PASSWORD
        dbhost = "database-"+stage+".whatever.com"
    }

    var connection = new Sequelize(dbname, dbuser, dbpass, {
        dialect: 'mysql',
        operatorsAliases: false, //This gets rid of a sequelize deprecation warning, refer https://github.com/sequelize/sequelize/issues/8417
        host: dbhost
    });

    return connection;
}

exports.setModels = function(connection){
    //Import all the known models for the project.
    const fs = require('fs');
    const dir = __dirname+'/../models';
    var models = {}; //empty model object for adding model instances in the file loop below.

    //#JA - Wait until this function finishes ~ hence readdirSync vs regular readdir which is async
    fs.readdirSync(dir).forEach(file => {
        console.log(file);
        //Split the .js part off the filename
        var arr = file.split(".");
        var name = arr[0].toLowerCase();
        //Create a model object using the filename (without the .js) as the reference, pointing to a created sequelize instance of the file.
        models[name] = connection.import(__dirname + "/../models/" + file);
    })

    //Showcase the final model.
    console.log(models);
    return models; //This returns a model object with references to the sequelize models
}
However, I can't get the create command to work with this setup. My guess is that the variables must not be passing through correctly somehow, but I'm not sure what I'm doing wrong.
The create command definitely works, because if I modify the setModels function in sequelize-lib.js to this...
exports.setModels = function(connection){
    //Import all the known models for the project.
    const fs = require('fs');
    const dir = __dirname+'/../models';
    var models = {}; //empty model object for adding model instances in the file loop below.

    //#JA - Wait until this function finishes ~ hence readdirSync vs regular readdir which is async
    fs.readdirSync(dir).forEach(file => {
        console.log(file);
        //Split the .js part off the filename
        var arr = file.split(".");
        var name = arr[0].toLowerCase();
        //Create a model object using the filename (without the .js) as the reference, pointing to a created sequelize instance of the file.
        models[name] = connection.import(__dirname + "/../models/" + file);

        models[name].create({
            "name":"joe",
            "loggedIn":true
        });
    })

    //Showcase the final model.
    console.log(models);
    return models; //This returns a model object with references to the sequelize models
}
Then it works and I see the item added to the database! (refer to proof image below)
Take note, I am simply running create on the variable at this point. What am I doing wrong that the model object is not passing between files correctly? The weird part is that I don't get any errors thrown in the main file. It's as if everything is defined but empty, the command is never run, and nothing is added to the database.
I also tried this in the main file, with no luck:
models["users"].create({
    name: 'joe',
    loggedIn: true
}).then( task => {
    console.log("saved user!!!!!");
});
The purpose of this all is to read models automatically from the model directory and create instances that are ready to go for every model, even if new ones are added in the future.
UPDATE:
So I did another test that was interesting: it seems that the create function won't work inside the .then() function of the sync command. It does look like the model was being passed correctly, though. After changing the main file to this...
var config = require('./libs/sequelize-lib.js');

var connection = config.getSequelizeConnection();//Choosing to not pass in variable this time since this should only run via script.
var models = config.setModels(connection);//Creates live references to the models using the connection previously created.

models["users"].create({
    "name":"joe",
    "loggedIn":true
});

//Alter table as needed but do NOT force the change. If an error occurs we will fix manually.
connection.sync({ alter: true, force: false }).then(function() {
    process.exit();//close the nodeJS Script
}).catch(function(error) {
    console.log(error);
});
Doing this seems to get create to work. I'm not sure if this is good form, though, since the database might not be created at this point. I need a way to get it to work inside the sync function.
Well I answered my question finally, but I'm not sure I like the answer.
var config = require('./libs/sequelize-lib.js');

var connection = config.getSequelizeConnection();//Choosing to not pass in variable this time since this should only run via script.
var models = config.setModels(connection);//Creates live references to the models using the connection previously created.

//Alter table as needed but do NOT force the change. If an error occurs we will fix manually.
connection.sync({ alter: false, force: false }).then( () => {
    models["users"].create({
        "name":"joe",
        "loggedIn":true
    }).then( user => {
        console.log("finished, with user.name="+user.name);
        process.exit();
    }).catch( error => {
        console.log("Error Occurred");
        console.log(error);
    });
}).catch(function(error) {
    console.log(error);
});
It turns out that process.exit() was triggering before create() completed, because create() happens asynchronously. This means that all my code will have to run through callbacks... which seems like a bit of a nightmare. I wonder if there is a better way?
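One way to avoid the nesting (a sketch only, not tested against your models) is to use async/await, so the script still waits for each asynchronous call but reads top-to-bottom:
const config = require('./libs/sequelize-lib.js');

(async () => {
    const connection = config.getSequelizeConnection();
    const models = config.setModels(connection);
    try {
        await connection.sync({ alter: false, force: false });
        const user = await models.users.create({ name: 'joe', loggedIn: true });
        console.log('finished, with user.name=' + user.name);
        process.exit(0); // only runs after create() has resolved
    } catch (error) {
        console.log(error);
        process.exit(1);
    }
})();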

Parse.com master key doesn't let me write PFUser currentUser

UPDATE: In a nutshell, I would like to use the master key because I need to write another user's object with my current user, but I don't want to override all security; I just want to use it in one function. The accepted answer in this question gave a very nice starting point, however I couldn't make it work. It's the last code block in this question.
I have two separate functions. The first is pure Objective-C; it deletes users from the currentUser's firstRelation. It worked well without any problems until I added a different CloudCode function into a different view controller. The CloudCode function uses the master key and adds currentUser to otherUser's sampleRelation, and adds otherUser to currentUser's sampleRelation (firstRelation and sampleRelation are two different columns inside the User class).
So the problem is that when I delete a user from currentUser's firstRelation (with the current user) my app crashes, because the user must be authenticated via logIn or signUp. I actually don't understand this, because in this case I'm writing the currentUser with the currentUser instead of another user, so it should work without any problems (and it worked before the CloudCode).
I'm almost sure that it's because I'm using the master key with the CloudCode, but I have no idea how I can avoid it. Everything else is still working; for example, I can upload images with currentUser.
Here is the code that I'm using for the CloudCode. JavaScript is totally unknown to me; maybe somebody will see what causes the problem.
Parse.Cloud.define('editUser', function(request, response) {
    Parse.Cloud.useMasterKey();

    var userQuery = new Parse.Query(Parse.User);
    userQuery.get(request.params.userId)
        .then(function (user) {
            var relation = user.relation("sampleRelation");
            relation.add(request.user);
            // chain the promise
            return user.save();
        }).then(function (user) {
            var currentUser = request.user;
            var relation = currentUser.relation("sampleRelation");
            relation.add(user);
            // chain the new promise
            return currentUser.save();
        }).then(function () {
            response.success();
        }, function (error) {
            response.error(error);
        });
});
It crashes when I try to remove the object:
PFUser *user = [self.friends objectAtIndex:indexPath.row];
PFRelation *myFriendsRel = [self.currentUser relationForKey:@"simpleRelation"];

if ([self isFriend:user]) {
    for (PFUser *friendName in self.friends) {
        if ([friendName.objectId isEqualToString:user.objectId]) {
            [self.friends removeObject:friendName];
            break; // to exit a loop
        }
    }
    // remove from parse
    [myFriendsRel removeObject:user];
    NSLog(@"deleted: %@", user.username);
}

[self.currentUser saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
    if (error) {
        NSLog(@"Error %@ %@", error, [error userInfo]);
    }
}];
This is the newest attempt, based on Fosco's answer from the other question. It works, but behaves the same way as the earlier versions.
Parse.Cloud.define('editUser', function(request, response) {
    var userId = request.params.userId;

    var User = Parse.Object.extend('_User'),
        user = new User({ objectId: userId });

    var currentUser = request.user;

    var relation = user.relation("friendsRelation");
    relation.add(currentUser);

    user.save(null, { useMasterKey: true }).then(function(user) {
        response.success(user);
    }, function(error) {
        response.error(error)
    });
});
At a quick glance it looks like it's failing because you're trying to remove an object from an array while it is being iterated. I know this causes a crash in Objective-C regardless of whether you're using Parse objects or not.
Try re-writing this segment:
for (PFUser *friendName in self.friends) {
    if ([friendName.objectId isEqualToString:user.objectId]) {
        [self.friends removeObject:friendName];
        break; // to exit a loop
    }
}
To something like this:
NSMutableArray *tempArray = [[NSMutableArray alloc]init];
for (PFUser *friendName in self.friends) {
if (![friendName.objectId isEqualToString:user.objectId]) {
[tempArray addObject:friendName];
}
self.friends = [NSArray arrayWithArray:tempArray];
Again, I only had a quick glance, so I'm not 100% sure that is your problem, but it looks like it. Let me know if it helps.
