Am I using Mongo's $and and $expr incorrectly? - javascript

Here is my query:
ctas.updateMany({
  $and: [
    { $expr: { $lt: ['$schedule.start', () => Date.now()] } },
    { $expr: { $gt: ['$schedule.end', () => Date.now()] } }
  ]
},
{
  $set: { isActive: true }
}).then(res => {
  const { matchedCount, modifiedCount } = res;
  console.log(`Successfully matched ${matchedCount} and modified ${modifiedCount} items.`)
}).catch(e => console.error(e));
I'm absolutely positive that start is less than Date.now() and end is greater than Date.now(), but I'm not getting any matches. Is my syntax wrong?
A snippet of my document in mongo:
schedule: {
  start: 1642564718042,
  end: 3285129434744
}
Edit: In case it makes a difference, I'm writing this code as a mongo scheduled trigger.
Update: If I replace the second expression with an obviously true expression, { isActive: false }, it matches all the documents. Obviously Date.now()*2 (what I used to set schedule.end) is greater than Date.now(), so why is that second expression failing?

Make sure your field paths include the leading $: $schedule.start and $schedule.end. Also note that () => Date.now() is an unevaluated arrow function, not a value; call it (or use new Date()) so the query contains an actual timestamp. Another concern is that both schedule.start and schedule.end hold numeric timestamp values, so you need to cast them to dates via $toDate before comparing against new Date().
db.collection.update({
  $and: [
    { $expr: { $lt: [{ $toDate: "$schedule.start" }, new Date()] } },
    { $expr: { $gt: [{ $toDate: "$schedule.end" }, new Date()] } }
  ]
},
{
  $set: { isActive: true }
})
Sample Mongo Playground
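Since the stored values are millisecond timestamps, a numeric comparison also works. A minimal sketch (assuming schedule.start and schedule.end always hold millisecond numbers), evaluating Date.now() once so an actual number, not a function, is sent with the query:
// Compare the stored millisecond timestamps directly against the current time.
const now = Date.now();

ctas.updateMany(
  {
    $and: [
      { $expr: { $lt: ['$schedule.start', now] } },
      { $expr: { $gt: ['$schedule.end', now] } }
    ]
  },
  { $set: { isActive: true } }
);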

Related

Mongoose return grouped by date

My model is the following:
const scheduleTaskSchema = new Schema({
  activity: { type: Object, required: true },
  date: { type: Date, required: true },
  crew: Object,
  vehicle: Object,
  pickups: Array,
  details: String,
});
pickups is an array of objects with the following structure:
{
  time: //iso date,
  ...rest //irrelevant
}
I want to return my data grouped by date, and every group also has to be sorted by pickups[0].time, so it will finally return something like this example:
{
  "2022-08-28T00:00:00.000+00:00": [
    {
      date: "2022-08-28T00:00:00.000+00:00",
      pickups: [
        {
          time: "2022-08-28T07:30:00.000Z",
          ...rest //irrelevant
        }
      ],
      ...rest //irrelevant
    },
    {
      date: "2022-08-28T00:00:00.000+00:00",
      pickups: [
        {
          time: "2022-08-28T09:30:00.000Z",
          ...rest //irrelevant
        }
      ],
      ...rest //irrelevant
    }
  ],
  "2022-08-29T00:00:00.000+00:00": [
    {
      date: "2022-08-29T00:00:00.000+00:00",
      pickups: [
        {
          time: "2022-08-29T10:00:00.000Z",
          ...rest //irrelevant
        }
      ],
      ...rest //irrelevant
    },
    {
      date: "2022-08-29T00:00:00.000+00:00",
      pickups: [
        {
          time: "2022-08-29T11:30:00.000Z",
          ...rest //irrelevant
        }
      ],
      ...rest //irrelevant
    }
  ]
}
You need to use the aggregation framework for this.
Assuming the dates are exactly the same (down to the millisecond), your code would look something like this.
const scheduledTasksGroups = await ScheduledTask.aggregate([{
  $group: {
    _id: '$date',
    scheduledTasks: { $push: '$$ROOT' }
  }
}]);
The output will be something like:
[
{ _id: "2022-08-29T10:00:00.000Z", scheduledTasks: [...] },
{ _id: "2022-08-28T10:00:00.000Z", scheduledTasks: [...] }
]
If you want to group by day instead of by millisecond, your pipeline would look like this:
const scheduledTasksGroups = await ScheduledTask.aggregate([{
  $group: {
    // format date first to `YYYY-MM-DD`, then group by the new format
    _id: { $dateToString: { format: '%Y-%m-%d', date: '$date' } },
    scheduledTasks: { $push: '$$ROOT' }
  }
}]);
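The question also asks for every group to be ordered by pickups[0].time. One way to get that, sketched below (assuming the stored time values sort correctly as-is), is to sort by the first pickup's time before grouping, since $push generally preserves the incoming document order after a $sort:
const scheduledTasksGroups = await ScheduledTask.aggregate([
  // Expose the first pickup's time as a top-level field so it can be sorted on.
  { $addFields: { firstPickupTime: { $arrayElemAt: ['$pickups.time', 0] } } },
  { $sort: { firstPickupTime: 1 } },
  {
    $group: {
      _id: { $dateToString: { format: '%Y-%m-%d', date: '$date' } },
      scheduledTasks: { $push: '$$ROOT' }
    }
  }
]);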
For what it's worth, this is a MongoDB feature; the grouping happens on the MongoDB server side. Mongoose doesn't do anything special here; it just sends the command to the server, and the server is responsible for grouping the data and returning it.
Also, keep in mind that mongoose does not cast aggregation pipelines by default, but this plugin makes mongoose cast automatically whenever possible.
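For example, if a pipeline filters on an ObjectId field, the value has to be cast by hand. A minimal sketch (the crew._id filter and the crewId variable are hypothetical):
const mongoose = require('mongoose');

const tasks = await ScheduledTask.aggregate([
  // Without explicit casting, a plain string here would not match stored ObjectIds.
  { $match: { 'crew._id': new mongoose.Types.ObjectId(crewId) } },
  { $group: { _id: '$date', scheduledTasks: { $push: '$$ROOT' } } }
]);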

Aggregate match doesn't work once I add more than 1 match?

I'm having some trouble with this aggregate function. It works correctly when I only have a single match argument (created_at); however, when I add a second one (release_date) it never returns any results, even though it should. I've also tried the matches with the '$and' parameter, with no luck.
Here is the code. Anyone know what I'm doing wrong?
Thanks!
db.collection('votes').aggregate([
  {
    $match: {
      $and: [
        { created_at: { $gte: ISODate("2021-01-28T05:37:58.549Z") } },
        { release_date: { $gte: ISODate("2018-01-28T05:37:58.549Z") } }
      ]
    }
  },
  {
    $group: {
      _id: '$title',
      countA: { $sum: 1 }
    }
  },
  {
    $sort: { countA: -1 }
  }
])

Running sequelize with two where conditions

I have a MySQL db instance with a table consisting of various fields. The relevant fields are start, startTime, and status:
start: YYYY-MM-DD
startTime: HH:mm:ss
status: ENUM('cancelled', 'scheduled', etc.)
If I want to get a list of all entries that don't have status = 'cancelled' and that occur today or after, I would write this:
return {
  where: {
    status: {
      $ne: 'cancelled'
    },
    $or: {
      start: { $gte: moment().utc().format('YYYY-MM-DD') },
      $and: {
        isRepeating: 1,
        $or: [{
          end: {
            $gte: moment().format(),
          }
        },
        {
          end: {
            $eq: null,
          }
        }]
      },
    }
  },
I am trying to modify this query to not only give me entries that occur today or after, but also later than right now (time-wise, UTC). My attempt was to first filter based on startTime and then filter based on the start date, but it does not seem to be working:
return {
  where: {
    status: {
      $ne: 'cancelled'
    },
    $or: {
      startTime: { $gt: moment.utc().format('HH:mm:ss') },
      $and: {
        start: { $gte: moment().utc().format('YYYY-MM-DD') },
        $and: {
          isRepeating: 1,
          $or: [{
            end: {
              $gte: moment().format(),
            }
          },
          {
            end: {
              $eq: null,
            }
          }]
        }
      },
    }
  },
(does not work, because it just returns everything!)
I also cannot do something simpler like
where: {
startTime: { $gt: moment.utc().format('HH:mm:ss') },
start: { $gte: moment().utc().format('YYYY-MM-DD') },
}
Because then it will ignore, for example, entries that occur tomorrow date-wise but earlier in the day than the current timestamp.
Thanks!
You can use the Op.and operator to combine those conditions.
const { Op } = require("sequelize");
...
where: {
  [Op.and]: [
    { startTime: { $gt: moment.utc().format('HH:mm:ss') } },
    { start: { $gte: moment().utc().format('YYYY-MM-DD') } }
  ]
}
...
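As the question points out, simply ANDing the date and time conditions would still drop entries on a later date that start earlier in the day. A sketch of one way to express "later date, or today with a later time" (assuming start holds only the date and startTime only the time, both compared as strings):
const { Op } = require("sequelize");
const today = moment.utc().format('YYYY-MM-DD');
const now = moment.utc().format('HH:mm:ss');
...
where: {
  status: { [Op.ne]: 'cancelled' },
  [Op.or]: [
    { start: { [Op.gt]: today } },   // any date after today, regardless of time
    {
      [Op.and]: [                    // or today, but later than right now
        { start: today },
        { startTime: { [Op.gt]: now } }
      ]
    }
  ]
}
...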

Mongoose aggregate does not work with $or

I am working on a Chat application. My schema looks like this:
{
  from: String,
  to: String,
  message: String,
  attachment: {
    name: String,
    size: Number,
    type: String,
  },
  unread: Boolean,
  sent: Date,
  seen: Date,
}
The following code works and returns the latest messages:
Query 1:
ChatDB.aggregate([
{ $match: {
$or: [
{ from, to },
{ from: to, to: from },
],
}},
{ $sort: { sent: -1 }},
{ $limit: messageBatchSize },
{ $sort: { sent: 1 }},
]);
But, when I try to paginate by including a timestamp in the query, it does not work anymore:
Query 2:
ChatDB.aggregate([
{ $match: {
sent: { $lt: new Date(beforeTimestamp) },
$or: [
{ from, to },
{ from: to, to: from },
],
}},
{ $sort: { sent: -1 }},
{ $limit: messageBatchSize },
{ $sort: { sent: 1 }},
]);
If I remove the $or portion and keep only the timestamp check on sent, things work, but (of course) it returns results for all users, which is not what I want:
Query 3:
ChatDB.aggregate([
{ $match: {
sent: { $lt: new Date(beforeTimestamp) },
}},
{ $sort: { sent: -1 }},
{ $limit: messageBatchSize },
{ $sort: { sent: 1 }},
]);
At first I thought it had something to do with not converting the ids from string to ObjectId, and changed my code to use Types.ObjectId accordingly. But even that did not help. I mean, Query 1 works correctly without any conversion.
Any idea what is going on? My mongoose version:
"mongoose": "^5.8.2",
Edit:
I tried running the query in the mongo console and it returned the results correctly:
> db.chats.aggregate([
... {
... $match: {
... $or: [
... { from: '5f0319f87278d056876952d5', to: 'org' },
... { to: '5f0319f87278d056876952d5', from: 'org' },
... ],
... sent: { $lt: new Date('2020-07-08T17:05:34.288Z') }
... }
... },
... { $sort: { sent: -1 }},
... { $limit: 20 },
... { $sort: { sent: 1 }}
... ]);
I feel kinda stupid for posting this in the first place.
The problem turned out to be that the values in from and to were of type Types.ObjectId because they were being retrieved from a different collection, while the values stored in ChatDB were strings.
Because of this, the query from the mongo console worked fine (because I was providing strings) and the one with mongoose in the code did not.
However, I still don't know why Query 1 worked.
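In other words, the value types in the query have to match what is stored. A minimal sketch of the fix (assuming from and to arrive from the other collection as ObjectIds, while ChatDB stores them as strings) is to stringify them before building the $match:
// Convert the ObjectIds to plain strings so they compare equal to the stored values.
const fromStr = from.toString();
const toStr = to.toString();

ChatDB.aggregate([
  { $match: {
    sent: { $lt: new Date(beforeTimestamp) },
    $or: [
      { from: fromStr, to: toStr },
      { from: toStr, to: fromStr },
    ],
  }},
  { $sort: { sent: -1 }},
  { $limit: messageBatchSize },
  { $sort: { sent: 1 }},
]);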

Returning count w/ data in MongoDB Aggregation

I've written a MongoDB aggregation query that uses a number of stages. At the end, I'd like the query to return my data in the following format:
{
data: // Array of the matching documents here
count: // The total count of all the documents, including those that are skipped and limited.
}
I'm going to use the skip and limit features to eventually pare down the results. However, I'd like to know the count of the number of documents returned before I skip and limit them. Presumably, the pipeline stage would have to occur somewhere after the $match stage but before the $skip and $limit stages.
Here's the query I've currently written (it's in an express.js route, which is why I'm using so many variables):
const {
minDate,
maxDate,
filter, // Text to search
filterTarget, // Row to search for text
sortBy, // Row to sort by
sortOrder, // 1 or -1
skip, // rowsPerPage * pageNumber
rowsPerPage, // Limit value
} = req.query;
db[source].aggregate([
  {
    $match: {
      date: {
        $gt: minDate, // Filter out by time frame...
        $lt: maxDate
      }
    }
  },
  {
    $match: {
      [filterTarget]: searchTerm // Match search query....
    }
  },
  {
    $sort: {
      [sortBy]: sortOrder // Sort by date...
    }
  },
  {
    $skip: skip // Skip the first X number of documents...
  },
  {
    $limit: rowsPerPage
  },
]);
Thanks for your help!
We can use $facet to run parallel pipelines on the same data and then merge the output of each pipeline.
The following is the updated query:
db[source].aggregate([
  {
    $match: {
      date: {
        $gt: minDate, // Filter out by time frame...
        $lt: maxDate
      }
    }
  },
  {
    $match: {
      [filterTarget]: searchTerm // Match search query....
    }
  },
  {
    $set: {
      [filterTarget]: { $toLower: `$${filterTarget}` } // Necessary to ensure that sort works properly...
    }
  },
  {
    $sort: {
      [sortBy]: sortOrder // Sort by date...
    }
  },
  {
    $facet: {
      "data": [
        { $skip: skip },
        { $limit: rowsPerPage }
      ],
      "info": [
        { $count: "count" }
      ]
    }
  },
  {
    $project: {
      "_id": 0,
      "data": 1,
      "count": {
        $let: {
          "vars": {
            "elem": { $arrayElemAt: ["$info", 0] }
          },
          "in": { $trunc: "$$elem.count" }
        }
      }
    }
  }
]).pretty()
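For reference, the $project stage above leaves a single result document, so the call resolves to something like this (values hypothetical):
[
  { data: [ /* up to rowsPerPage matching documents */ ], count: 42 }
]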
I think I figured it out. But if someone knows that this answer is slow, or at least faulty in some way, please let me know!
The idea is to add a $group stage, passing null as the _id, pushing each document ($$ROOT) into the data array, and incrementing count by 1 for each one with the $sum operator.
Then, in the next $project stage, I simply remove the _id property and slice down the array.
db[source].aggregate([
  {
    $match: {
      date: {
        $gt: minDate, // Filter out by time frame...
        $lt: maxDate
      }
    }
  },
  {
    $match: {
      [filterTarget]: searchTerm // Match search query....
    }
  },
  {
    $set: {
      [filterTarget]: { $toLower: `$${filterTarget}` } // Necessary to ensure that sort works properly...
    }
  },
  {
    $sort: {
      [sortBy]: sortOrder // Sort by date...
    }
  },
  {
    $group: {
      _id: null,
      data: { $push: "$$ROOT" }, // Push each document into the data array.
      count: { $sum: 1 }
    }
  },
  {
    $project: {
      _id: 0,
      count: 1,
      data: {
        $slice: ["$data", skip, rowsPerPage]
      },
    }
  }
]).pretty()
