Conditional upsert based on custom id of nested documents in Meteor - javascript

This doesn't work, but hopefully this and the mouthful of a title get the point across:
function addToDB(account_id) {
  obj = {};
  if (!Accounts.findOne({account_id: account_id})) obj.createdAt = new Date().valueOf();
  obj.account_id = account_id;
  Accounts.upsert({account_id: account_id}, {$set: obj});
}
I need to use the account_id instead of the MongoDB object id, so it has to be indexable/unique. If it's an update, the createdAt shouldn't change.
UPDATED so it works in the original context, but I prefer the solution I comment with to the correct answer.

I doubt this is supported on the minimongo side of things, but you can always implement the logic on the server and expose it as a method. MongoDB supplies a $setOnInsert operator whose purpose is to apply its values only when the "upsert" actually inserts a new document:
Accounts.upsert(
  { "account_id": obj.account_id },
  {
    "$setOnInsert": { "created_at": new Date() },
    "$set": obj // account_id would be automatically added but it doesn't matter
  }
);
This presumes, of course, that obj does not already contain a created_at field, and that the value you actually want is a BSON "Date" rather than an epoch timestamp (BSON Dates are stored as epoch timestamps internally anyway); otherwise convert using .valueOf() as shown in your example.
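To make the $setOnInsert/$set distinction concrete, here is a plain-JavaScript sketch of the semantics. simulateUpsert is a hypothetical illustration only; the real merging happens inside MongoDB:

```javascript
// existingDoc is the matched document, or null if the upsert will insert.
function simulateUpsert(existingDoc, setOnInsert, set) {
  if (existingDoc) {
    // Update path: $setOnInsert is ignored, only $set is applied.
    return Object.assign({}, existingDoc, set);
  }
  // Insert path: both $setOnInsert and $set are applied.
  return Object.assign({}, setOnInsert, set);
}
```

On the first call the new document gets created_at; on every later call the stored created_at is left untouched, which is exactly the behavior the question asks for.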

Related

How to define a Date Scalar using GraphQL-JS?

I am trying to define a custom scalar in GraphQL so I can query & process the Dates in my MongoDB collections. I am not sure I understand 100% what a scalar is or does, but it seems to be a sort of type that I define myself. All the examples & tutorials I found were using Apollo or some other kind of notation, but I would like to see a solution using GraphQL-JS.
So far, I have defined my scalar like so:
const Date = new GraphQLScalarType({
  name: "Date",
  serialize: (value) => {
    return value; // is it correct, to just return the value? Do I need to parse it or turn it into a Date first?
  },
  parseValue: () => {
    return "serialise"; // I am just returning this string here, because I do not know what this function is for
  },
  parseLiteral(ast) {
    return null; // I am just returning null here, because I do not know what this function is for
  },
});
I am not sure I understand what each of these functions are supposed to do. And wouldn't there also have to be a deserialize function?
When I query now against my graphql endpoint I do get back something like:
{
"myDate": "2020-07-15T00:00:00.000Z"
}
I guess that my serialise function is at play here? The Date is certainly correct, but I am not sure if I should do anything else with the data before returning it from serialize? Right now I just return whatever I get from my MongoDB database.
Urigo, from The Guild, created graphql-scalars, which contains definitions for multiple common scalars used in GraphQL.
//is it correct, to just return the value? Do I need to parse it or turn it into a Date first?
It would be wise to validate that value is a Date before returning it.
And yes, just return the value.
//I am just returning null here, because I do not know what this function is for
This is the entry from the abstract syntax tree (ast).
See Urigo's code below to see how the ast object is accessed (ast.kind, ast.value).
Additionally, take a look at this SO post that describes the difference between parseValue and parseLiteral
Take a look at localDate and that may provide you the example you need to answer your question :)
https://github.com/Urigo/graphql-scalars/blob/master/src/scalars/LocalDate.ts#L34
export const GraphQLLocalDate = /*#__PURE__*/ new GraphQLScalarType({
  name: 'LocalDate',
  description:
    'A local date string (i.e., with no associated timezone) in `YYYY-MM-DD` format, e.g. `2020-01-01`.',
  serialize(value) {
    // value sent to client as string
    return validateLocalDate(value);
  },
  parseValue(value) {
    // value from client as json
    return validateLocalDate(value);
  },
  parseLiteral(ast) {
    // value from client in ast
    if (ast.kind !== Kind.STRING) {
      throw new GraphQLError(
        `Can only validate strings as local dates but got a: ${ast.kind}`,
      );
    }
    return validateLocalDate(ast.value);
  },
});
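Adapting that pattern to the original question, the three hooks for a Date scalar could look roughly like this. This is a sketch assuming dates travel to the client as ISO strings; the functions would be passed to GraphQLScalarType's config, and "StringValue"/"IntValue" are the ast.kind values graphql-js uses. (Side note: naming the scalar constant Date, as in the question, shadows the global Date constructor, which is why this sketch uses standalone functions.)

```javascript
// serialize: internal value (e.g. a JS Date from MongoDB) -> JSON output
function serialize(value) {
  if (!(value instanceof Date) || isNaN(value.getTime())) {
    throw new TypeError("Date scalar can only serialize Date objects");
  }
  return value.toISOString();
}

// parseValue: value arriving through query variables -> internal JS Date
function parseValue(value) {
  const date = new Date(value);
  if (isNaN(date.getTime())) {
    throw new TypeError("Invalid date value: " + value);
  }
  return date;
}

// parseLiteral: value written inline in the query document, as an AST node
function parseLiteral(ast) {
  if (ast.kind === "StringValue") return parseValue(ast.value);
  if (ast.kind === "IntValue") return new Date(parseInt(ast.value, 10));
  return null; // graphql-js treats a null return as an invalid literal
}
```

So to the question in the comment: yes, serialize is what produced "2020-07-15T00:00:00.000Z" in your result, and validating that the value really is a Date before returning it is the only extra step worth adding.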

Method for converting all Firestore Timestamps in a snapshot to JS dates?

How would you write a method to convert all Firestore Timestamps in a snapshot to JavaScript Dates?
For example, a snapshot from a posts collection might return a couple of Firestore Timestamps (createDtTm, modifyDtTm):
[
  {
    text: 'My new post',
    uid: 'nzjNp3Q',
    createDtTm: Timestamp { seconds: 1596239999, nanoseconds: 999000000 },
    modifyDtTm: Timestamp { seconds: 1596239999, nanoseconds: 999000000 },
  },
  {
    text: 'Another post',
    uid: 'nzjNp3Q',
    createDtTm: Timestamp { seconds: 1596239999, nanoseconds: 999000000 },
    modifyDtTm: Timestamp { seconds: 1596239999, nanoseconds: 999000000 },
  },
]
Converting the individual dates is simple enough by mapping over the array and using the toDate() method for each Timestamp (e.g., createDtTm.toDate())
But what is a more generalized approach for converting these two (or any arbitrary number of) Firestore timestamps, without explicitly specifying the Timestamp fields?
For instance, do Firestore Timestamps have a unique type that could be used for identification? Would an assumption of naming conventions be required (e.g., field name contains DtTm)? Other?
Previous questions answer how to convert a single Timestamp or a single timestamp in multiple documents in a snapshot. But I haven't found a generalized approach for converting all Timestamps within a snapshot when multiple Timestamps exist. More specifically, I'm interested in an approach for use within a React Provider that would pass JavaScript dates (not Firestore Timestamps) to its Consumers, while also not creating a dependency to update the Provider each time a new Timestamp field is added to the data model / collection.
I don't think there is any global method for this, but it's easy to create a function that analyzes the snapshot and converts it. It's not very complicated; in Node.js I have done it like this:
function globalToData(snapshot) {
  for (const field in snapshot) {
    if (snapshot[field] instanceof Firestore.Timestamp) {
      snapshot[field] = snapshot[field].toDate();
    } else if (snapshot[field] instanceof Object) {
      globalToData(snapshot[field]);
    }
  }
  return snapshot;
}
If you get a DocumentSnapshot as, say, snap, you can call it like:
globalToData(snap.data())
This should convert all Timestamps at every level of the document snapshot (I tested to the 3rd level of nesting, with mixed maps and arrays). It's not a ready-made solution, but you can implement it somewhere in the middle of your app.
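On the "unique type" part of the question: Timestamps are instances of the SDK's Timestamp class, which is what the instanceof check above relies on. If you'd rather not import the Firestore SDK just for that check, a duck-typed sketch (assumption: any object exposing a toDate() method is treated as a Timestamp) could look like this; unlike the version above it returns a new structure instead of mutating in place:

```javascript
function convertTimestamps(value) {
  // Duck typing: anything with a toDate() method is treated as a Timestamp
  if (value && typeof value.toDate === "function") {
    return value.toDate();
  }
  if (Array.isArray(value)) {
    return value.map(convertTimestamps);
  }
  if (value && typeof value === "object") {
    const out = {};
    for (const key of Object.keys(value)) {
      out[key] = convertTimestamps(value[key]);
    }
    return out;
  }
  return value; // primitives pass through unchanged
}
```

This avoids any naming-convention assumption (no DtTm matching) and keeps working when new Timestamp fields are added to the data model, which fits the React Provider use case described above.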

Meteor.js / MongoDB between dates query not returning data

I have the following piece of code for getting results back from a Mongo Collection.
var currentDate = moment().toISOString();
// RETURNING: 2016-12-10T20:36:04.494Z
var futureDate = moment().add(10, "days").toISOString();
// RETURNING: 2016-12-20T20:36:04.495Z
return agenda = Agendas.find({
  "agendaDate": { '$gte': currentDate, '$lte': futureDate }
});
And the date is stored in MongoDB Collection like below;
{
  "_id" : ObjectId("584877e56466dd236cd95f15"),
  "agendaDate" : ISODate("2016-12-12T17:28:25.000+0000"),
  "agendaTime" : "20:59",
  "agendaEvent" : "Test event"
}
However, I am not getting any results returned at all. I have set up 3 test documents: 2 in the range, 1 outside.
Can anyone explain what I'm doing wrong and help rectify the code?
You need to compare dates against actual date objects, not strings representing them.
That is, you need to get the date from your moment objects, using the toDate() method.
var futureDate = moment().add(10, "days").toDate();
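Putting it together, the selector would be built from Date objects. In this sketch the moment calls are replaced with plain Date arithmetic only to show they produce the same kind of value; moment().toDate() and moment().add(10, "days").toDate() are equivalent:

```javascript
var currentDate = new Date();
// +10 days in milliseconds; same value moment().add(10, "days").toDate() gives
var futureDate = new Date(Date.now() + 10 * 24 * 60 * 60 * 1000);
// This selector is what you would pass to Agendas.find(...)
var selector = {
  agendaDate: { $gte: currentDate, $lte: futureDate }
};
```

Since agendaDate is stored as an ISODate, MongoDB compares these Date objects correctly, whereas the ISO strings from the question never match.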
Well, actually moment.toISOString() returns a string, so you can't use it to compare with a date object in your MongoDB query.
You should consider creating a Date object for that.

Is there a native feature to convert string based JSON into Mongoose Schema object instance?

I am using Express and I am looking for a convenient way to convert this kind of object (which comes on the request req.body.myObject):
{
  "name": "Foo",
  "someNumber": "23",
  "someBoolean": "on"
}
Into an instance of this Schema:
var myObjectSchema = new Schema({
  name: String,
  someNumber: Number,
  someBoolean: Boolean
});
Notice that the first object comes from the request, so it's made entirely of Strings.
Is there some nice way to achieve this? If not, would you have any suggestions on how to implement this feature as middleware?
By referring to the thread Mongoose: Inserting JS object directly into db, I figured out that yes, there's a built-in feature for this.
You simply build a new model passing request values (from the form) as parameters:
function add(req, res) {
  new Contact(req.body.contact).save(function (err) {
    if (err) return res.status(500).send(err);
    console.log("Item added");
    res.send();
  });
}
It automatically converts stuff for you!
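If you ever need to do that casting by hand (say, outside a model), a minimal sketch for the question's three fields might look like this. castBody is a hypothetical helper, not a Mongoose API; the "on" convention comes from how HTML checkboxes are submitted:

```javascript
function castBody(body) {
  return {
    name: String(body.name),
    someNumber: Number(body.someNumber),    // "23" -> 23
    someBoolean: body.someBoolean === "on"  // checkbox "on" -> true
  };
}
```

With Mongoose, though, you can usually skip this entirely and let the schema do the casting, as shown above.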
I know this answer has already been accepted, but I wanted to point out that mongoose takes care of most of the casting for you... most of the time. While it's convenient that mongoose does this, it abstracts away the true behavior of mongo. For example, mongoose lets you do something like this:
PersonModel.findById("4cdf00000000000000007822", ...);
However, if you tried to query the database directly (without mongoose), this would not work:
PersonCollection.find({_id: "4cdf00000000000000007822"}, ...);
This is because ObjectIds are not strings... they are objects. Internally, mongoose converts that string to an ObjectId and then performs a query against the database so that the final query looks kinda like this:
PersonCollection.find({_id: ObjectId("4cdf00000000000000007822")}, ...);
Also, each path in a schema has a "caster" method. This is a private method, but it's darn handy when you need it. PLEASE NOTE THAT THE caster METHODS DESCRIBED BELOW ARE UNDOCUMENTED AND CAN CHANGE WITHOUT WARNING. USE AT YOUR OWN RISK (sorry for yelling):
// Use PersonModel.schema.paths() to get all paths and loop over them if you want
var key = "name";
var pathObj = PersonModel.schema.path(key);
if (!pathObj) pathObj = PersonModel.schema.virtualpath(key);
if (!pathObj) { /* not found: return, continue, exit, whatever */ }
// UNDOCUMENTED: USE AT YOUR OWN RISK
var caster = pathObj.caster || pathObj;
var castedValue = caster.cast(req.body.name);
Why do I know this? Because if you want to use some of the more advanced features of mongo such as aggregation, you will need to cast your own values as you build the pipeline. I have also needed to manually cast values for certain queries which used the $in operator... maybe this is not needed any more. Point is, if you are having trouble getting the results you expect, try casting the values yourself.
Provided the schema is static, one could theoretically go the lazy, non-sophisticated way and just cast the values by hand instead of relying on the schema:
var someObject = {
  name: "Foo",
  someNumber: "23",
  someBoolean: "on"
};
var castedObject = {
  name: someObject.name,
  someNumber: parseInt(someObject.someNumber, 10),
  someBoolean: (someObject.someBoolean == "on")
};
Possibly not the answer you were looking for, but might be something to consider if nothing better is available.

Couch DB filter by key and sort by another field

In CouchDB I need to filter by key, which is done like this:
{
  "_id": "_design/test",
  "_rev": "6-cef7048c4fadf0daa67005fefe",
  "language": "javascript",
  "views": {
    "all": {
      "map": "function(doc) { if (doc.blogId) {emit(doc.key, doc);} }"
    }
  }
}
However, the results should be ordered by another field (doc.anotherkey). So, using the same function, how do I achieve both filtering by key and ordering by that other field?
Thank you
If you only need to query by single key, you can use the following map:
function (doc) {
  if (doc.blogId) {
    emit([doc.key, doc.anotherkey], 1);
  }
}
and query for "KEY" with ?startkey=["KEY"]&endkey=["KEY",{}]&include_docs=true.
Here, according to the collation specification of CouchDB:
["KEY"] is a value lesser than any ["KEY","OTHER"] value (because longer arrays sort after their prefixes), but greater than any ["KEY2","OTHER"] with "KEY2" < "KEY";
and ["KEY",{}] is a value greater than any ["KEY","OTHER"] value, provided doc.anotherkey is never a JSON object (because JSON objects come after any other JSON value), but lesser than any ["KEY2","OTHER"] with "KEY2" > "KEY".
Of course this is not limited to strings. Any type of value will work, as long as the collation is right.
Remember to URL encode the values in startkey and endkey. For example, using curl and assuming your database is "DB":
curl 'http://localhost:5984/DB/_design/test/_view/all?startkey=%5B%22KEY%22%5D&endkey=%5B%22KEY%22,%7B%7D%5D&include_docs=true'
Note that I've used the include_docs query parameter, instead of emitting the entire document with emit(..., doc), to save disk space. Query parameters are documented on CouchDB documentation.
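For completeness, the same URL can be assembled programmatically rather than hand-encoded; this sketch assumes the DB name and view path used throughout the answer, and lets encodeURIComponent handle the brackets, quotes, and braces in the keys:

```javascript
const base = "http://localhost:5984/DB/_design/test/_view/all";
// JSON.stringify produces the literal key arrays, then the whole value is escaped
const startkey = encodeURIComponent(JSON.stringify(["KEY"]));
const endkey = encodeURIComponent(JSON.stringify(["KEY", {}]));
const url = base + "?startkey=" + startkey + "&endkey=" + endkey + "&include_docs=true";
```

The resulting URL is equivalent to the curl example above, so you can fetch it with any HTTP client.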
To sort results in descending order, use the descending=true query parameter and swap the values of startkey and endkey as documented in the definitive guide book.
curl 'http://localhost:5984/DB/_design/test/_view/all?endkey=%5B%22KEY%22%5D&startkey=%5B%22KEY%22,%7B%7D%5D&include_docs=true&descending=true'
