The right way to do pagination in MongoDB - JavaScript

I am trying to create a blog website with pagination using MongoDB and Express, where users can request pages with a limit on how many posts appear on each page, sorted from newest to oldest.
My implementation of the pagination is as follows...
For example, if a client wants to request the posts for the third page, with a page size of 5 posts per page, they will make the following request to the server:
GET {serverUrl}/posts?pagesize=5&pagenumber=3
This is the code in Node/Express:
app.get('/posts', async function(req, res, next) {
  // query params arrive as strings; default to page 1 with 5 posts per page
  const pagesize = parseInt(req.query.pagesize, 10) || 5;
  const pagenumber = parseInt(req.query.pagenumber, 10) || 1;
  const postsToSkip = (pagenumber - 1) * pagesize; // (3 - 1) * 5 = 10
  try {
    // skip 10 and limit 5; the collection handle must not shadow the result variable
    const posts = await postsCollection.find()
      .sort({ date: -1 })
      .skip(postsToSkip)
      .limit(pagesize)
      .toArray();
    return res.status(200).send(posts);
  } catch (error) {
    return res.status(500).send('something went wrong');
  }
});
The code works perfectly fine; the problem is as follows.
For simplicity, let's say there are 100 posts in the database, each with an ID from 1 to 100, so the oldest post has an ID of 1 and the newest post has an ID of 100.
Let's say a user visits the blog website (built with React, for example).
React makes a request to the server:
GET {serverUrl}/posts?pagesize=5&pagenumber=1
The server then queries MongoDB for the newest 5 posts (IDs 96-100):
posts.find().skip(0).limit(5)
and sends them back to React, which renders them on the page.
Now suppose that before the user goes to the next page, 5 new posts are added to the database by a different user (they get IDs 101-105). When the user goes to the next page, React requests the second page:
GET {serverUrl}/posts?pagesize=5&pagenumber=2
The server now asks MongoDB for page number 2:
posts.find().skip(5).limit(5)
But the second page will be identical to the first page. When the user first visited the site, the 5 newest posts had IDs 96-100; after the 5 new posts were added, the 5 newest posts are IDs 101-105, so the second-newest 5 posts, the ones skip(5) now returns, are posts 96-100, the same posts the user already saw on page one.
I would like to know if there is any implementation that overcomes this weird behavior.
This is the first question I am posting to stackoverflow.com; I would like to hear your feedback.
Thank you very much 😊😊😊

Try this example:
app.get('/posts', async function (req, res, next) {
  try {
    const pagesize = 10; // 10 records per page
    const pageNumber = parseInt(req.query.pageNumber, 10) || 1;
    const posts = await postsCollection.find()
      .skip((pageNumber - 1) * pagesize)
      .limit(pagesize)
      .toArray();
    return res.status(200).send(posts);
  } catch (error) {
    return res.status(500).send('something went wrong');
  }
});
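Note that skip/limit alone cannot fix the duplicate-page problem the question describes, because the offsets shift whenever new posts arrive. A common alternative is keyset (cursor-based) pagination: the client sends back the sort key of the last post it saw, and the server queries for posts strictly older than it. A minimal sketch, assuming a postsCollection handle and a before query parameter (both names are illustrative, not from the original post):
app.get('/posts', async function (req, res, next) {
  const pagesize = parseInt(req.query.pagesize, 10) || 5;
  // "before" is the date of the last post on the previous page (ISO string);
  // omit it to get the first page
  const filter = req.query.before
    ? { date: { $lt: new Date(req.query.before) } }
    : {};
  try {
    const posts = await postsCollection.find(filter)
      .sort({ date: -1 })
      .limit(pagesize)
      .toArray();
    return res.status(200).send(posts);
  } catch (error) {
    return res.status(500).send('something went wrong');
  }
});
Because each page is anchored to a concrete document rather than an offset, posts inserted after the first request cannot shift the boundaries of later pages; ties on date can be broken by also comparing _id.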

Related

Show long API call progress with a progress bar in NextJS

When the user clicks a button on my NextJS website, a NextJS API is called. There, a Puppeteer client is started, an external API is called, and the code loops through this response and crawls through some data.
This takes a long time, and I wanted to give the user some kind of information on how the progress is going.
For instance: I get several pages and items on each page from the external API — let's say, 3 pages with 100 items each. Then I'd show the user "processing item 1 of 300". As the items go by, this number would be updated.
The problem is that right now, I'm using res.send, and it closes the connection with a 200 status. I wanted to send back this data without closing.
Some people told me to research HTTP Streaming, but I couldn't find any practical explanation on how to do it — especially using NextJS.
Pseudocode:
// api/index.ts
export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<Data>,
) {
  // Start crawler instance
  const { page } = await crawler.up()
  const items = await getItems()
  // Close crawler before ending
  await crawler.down(page)
  res.status(200).json(items)
}

// getItems.ts
export const getItems = async () => {
  const response = await fetch('external-url')
  const items = await response.json()
  const result = []
  for (let index = 0; index < items.length; index++) {
    // instead of this console.log, I wanted to send this as a message to the website, so it could update a progress bar
    console.log(`Processing ${index + 1} of ${items.length}`)
    const processed = await processResult(items[index]) // this will take a while
    result.push(processed)
  }
  return result
}
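The HTTP streaming idea mentioned in the question can be sketched: a Next.js API route's res is a plain Node response, so you can res.write one line per processed item and only call res.end when everything is done; the browser reads the chunks incrementally via response.body.getReader(). A hedged sketch (the processResult stand-in and the newline-delimited JSON message shape are illustrative assumptions, not from the original post):
// pages/api/progress.js — a sketch, not the poster's actual route
const processResult = async (item) => item // stand-in for the slow per-item work

export default async function handler(req, res) {
  res.writeHead(200, { 'Content-Type': 'application/x-ndjson' })
  const items = await (await fetch('external-url')).json()
  const result = []
  for (let index = 0; index < items.length; index++) {
    result.push(await processResult(items[index]))
    // each write is sent to the client while the connection stays open
    res.write(JSON.stringify({ done: index + 1, total: items.length }) + '\n')
  }
  res.write(JSON.stringify({ items: result }) + '\n')
  res.end() // only now does the request complete
}
On the client, fetch('/api/progress') plus a ReadableStream reader can parse each newline-delimited message and update the progress bar; Server-Sent Events or a WebSocket would give the same shape with more machinery.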

Pagination with many elements is incredibly slow in Mongoose and Keystone

Versions: Keystone v4
I have a Mongo database with >20k items. What I want is a paginator that allows the user to quickly scroll through the database 25 elements at a time. Currently, this feature is implemented, but the server takes >40 seconds to return the results because it queries the entire (20k item) database. However, only 25 elements are displayed on a single page, so I feel like if it just fetched 25 results instead of 20k, it should be quicker. How could I implement this? I know about the .limit() function, but I can't seem to figure out pagination in Keystone while using it.
Current Code:
var q = Items.model.find();
q.exec(function(err, newss) {
  console.log('There are %d', newss.length); // Prints out 20k number
  ...//skip
  locals.cnts = newss;
  // console.log(newss[0])
  locals.pagerr = pager({
    page: parseInt(req.query.page, 10) || 1,
    perPage: 25,
    total: newss.length
  });
  locals.itemsss = locals.cnts.slice(
    locals.pagerr.first - 1,
    locals.pagerr.last
  );
  next();
})
In its current implementation, it takes >40 seconds to return the paginated results. How can I fix this?
The model.find() function you're using here is equivalent to the Mongoose find() function. As you're calling it without any filters, this code is retrieving all 20k+ items from the database each time it runs. This data is transferred to the web server/node process, where the body of your function(err, newss) {...} callback runs. Only then are the 25 items you're after extracted from the set.
Instead, if you want to use offset-based pagination like this, you should be using the query.limit() and query.skip() functions. If you need to count the total items first, do so in a separate query using query.count().
I haven't tested this code (and it's been a while since I used Mongoose), but I think you want something like this:
// Warning! Untested example code
Items.model.find().count(function (err, count) {
  console.log('There are %d', count);
  locals.pager = pager({
    page: parseInt(req.query.page, 10) || 1,
    perPage: 25,
    total: count
  });
  Items.model.find()
    .skip(locals.pager.first - 1) // pager's "first" is 1-based, skip is 0-based
    .limit(25)
    .exec(function(err, results) {
      locals.results = results;
      next();
    });
});
On a more general note – if you like Keystone and want to use Mongo, keep an eye on the Keystone 6 updates. Keystone 6 uses Prisma 2 as its ORM layer, and they recently released support for Mongo. As soon as that functionality is production ready, we'll be supporting it in Keystone too.

Fetch remaining data on last page in pagination

I'm making a simple pagination for the comments on the project I'm working on, but I have an issue where I can keep requesting more comments and end up with a blank page.
My API URL for fetching comments is: {{URL}}/responses?id={{id}}&skip={{skip}}&take=10
nextComments = () => {
  if (this.state.skip <= this.state.responses.total) {
    this.setState((prevState) => ({
      skip: prevState.skip + 10 // take=10, so advance the offset by one page
    }), async () => {
      const responsesbyId = await getResponsesbyOrgId(this.state.orgId, this.state.skip);
      this.setState({
        responses: responsesbyId
      })
      console.log(this.state.responses);
    });
  }
};
I've tried setting a max, but then another issue is that when there are e.g. 16 comments, I can skip 10, then 10 more, and end up with a blank page again.
Is there a smarter way to deal with this, so that I only fetch the remaining 6 comments when there are fewer than 10 left?
I hope my question is clear.
I'm assuming you have control over your backend based on the question, so ideally, this is something your API should make the front end aware of. The API should return the total number of results from a query, as well as the number of results per page and what page you're currently viewing (starts at 1). Then the front end can look at those values and dynamically control the pagination logic.
So for example, API response could be something like:
const res = {
  results: [...],
  resultCount: 16,
  page: 1,
  resultsPerPage: 10
};
(You'd be storing the current page in the front end state of course; I just added it to the backend response since it usually doesn't hurt to return the request params.)
Then in your front end, where you're storing the current page value, the logic could be
if ((currentPage * resultsPerPage) < resultCount) { /* You can let them fetch another page */}
This would satisfy your requirements of not letting them view more pages if they shouldn't be able to, and also lets the results per page variable change in the backend without the front end having to do a thing.
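Applied to the nextComments handler from the question, the check might look like the following sketch (page, resultsPerPage, and resultCount are the hypothetical response fields proposed above):
nextComments = () => {
  const { page, resultsPerPage, resultCount } = this.state.responses;
  // only advance when results exist beyond the current page
  if (page * resultsPerPage < resultCount) {
    this.setState((prevState) => ({
      skip: prevState.skip + resultsPerPage
    }), async () => {
      const responses = await getResponsesbyOrgId(this.state.orgId, this.state.skip);
      this.setState({ responses });
    });
  }
};
The last page then simply returns however many comments remain (skip/take already behaves that way), and the guard prevents requesting a page past the end.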

Post times are incorrect on deployment, but not when using the app locally

So I ran into this strange issue. I have a feed where you can add posts. One of the post types is an event post, where you pick a time for when the event happens. I tested this locally and it seems to work fine: it shows the time I picked. However, when I deploy the app, it suddenly displays the wrong time when I post an event. E.g. I posted an event that took place at 3:30PM, and when displayed, the event time is set to 09:30PM.
I am currently in Mexico City, and I assume this is 6 hours behind UTC. I just don't understand why it is correct when I test it locally and incorrect when the app is deployed. Can someone help me out? I'll post some of my images below to show what the issue is.
(Maybe it helps to mention that I work on this project together with several other people internationally, in different timezones.)
This is the post when I try it locally: the content is the time I intended it to be, and as you can see it displays the correct time.
The database also seems to save the correct time (see the highlighted line).
Here is the app when it's deployed. As you can see, I intended to post an event set for 3:30PM, but it ended up showing 9:30PM. So locally it displays correctly, but not when deployed.
This is the code that handles adding the post to the database. Below you'll see the result of the first console.log in this piece of code:
const add = async (req, res, next) => {
  try {
    const postData = req.body;
    console.log('req.body', req.body);
    // // If it's an event post, convert due_to date to UTC before storing
    // if (postData.type === 'event') {
    //   postData['event.due_to'] = moment.utc(postData['event.due_to']).format();
    // }
    const post = await Post.create(postData);
    if (post._content_mentions.length !== 0) {
      // Create Notification for mentions on post content
      notifications.newPostMentions(post);
      // start the process to send an email to every user mentioned
      post._content_mentions.forEach((user, i) => {
        sendMail.userMentionedPost(post, user, i);
      });
    }
    // Send Email notification after post creation
    switch (post.type) {
      case 'task':
        await notifications.newTaskAssignment(post);
        await sendMail.taskAssigned(post);
        break;
      case 'event':
        await notifications.newEventAssignments(post);
        await sendMail.eventAssigned(post);
        break;
      default:
        break;
    }
    return res.status(200).json({
      message: 'New post created!',
      post
    });
  } catch (err) {
    return sendErr(res, err);
  }
};
Output of the console.log(req.body) in the code above:
req.body { 'event._assigned_to': '5c34e8b57b82e019b09f569d',
content: '<p>2:30PM</p>',
type: 'event',
_posted_by: '5c34e8b57b82e019b09f569d',
_group: '5c34e8b57b82e019b09f569e',
'event.due_to': '2019-01-09 02:30:00.000' }
Part of the Angular front end that handles the time settings before sending the new post to the server:
// create date object for this event
const date = new Date(this.model_date.year, this.model_date.month - 1, this.model_date.day, this.model_time.hour, this.model_time.minute);
const post = {
  content: this.post.content,
  type: this.post.type,
  _posted_by: this.user_data.user_id,
  _group: this.group_id,
  event: {
    due_date: moment(date).format('YYYY-MM-DD'),
    due_time: moment(date).format('hh:mm:ss.SSS'),
    due_to: moment(date).format('YYYY-MM-DD hh:mm:ss.SSS'),
    // problem: assignedUsers will always be empty
    _assigned_to: assignedUsers,
    _content_mentions: this.content_mentions
  },
  files: this.filesToUpload
};
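Two things in this snippet are worth flagging: moment's hh token is 12-hour format (which is why a 2:30PM event is stored as 02:30), and the due_to string carries no timezone, so whichever server parses it applies its own local zone, Mexico City locally and presumably UTC on the deployed host. A hedged sketch of sending an unambiguous instant instead, reusing the question's date object (the display-side conversion is shown as a comment):
// create the date from the picker values exactly as before...
const date = new Date(this.model_date.year, this.model_date.month - 1, this.model_date.day, this.model_time.hour, this.model_time.minute);
// ...but transmit it as a UTC ISO string, which every server parses the same way
const due_to = date.toISOString(); // e.g. '2019-01-09T20:30:00.000Z'
// when rendering, convert back to the viewer's local time:
// moment(post.event.due_to).local().format('YYYY-MM-DD HH:mm')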

Implementing "coins" to my users on passportjs and changing them

I actually have 2 questions. The first is: what is the best way to add other information to my user object? I have a very simple PassportJS login/register system, and I want to expand it and add things like coins and other data specific to a user's account. Currently, I just added a new property called "coins" alongside the password/email and so on.
My second question is: with the coins property on my user object, how do I edit it? Currently I just have a simple request that adds a coin for the user, and it isn't working.
router.get('/coin', function(req, res) {
  User.getUserByUsername(req.user.username, function(err, user) {
    user.coins = user.coins + 1
    console.log(user.coins)
  });
  res.redirect('/')
});
It just console logs 1 every time I press the button (the button calls /coin).
You are loading the user and incrementing one coin, but you are not saving it, so every time you call the function you load the previous state of the User.
You should call User.update (or whatever the save function of your API is called) to store the user data before redirecting.
router.get('/coin', function(req, res) {
  User.getUserByUsername(req.user.username, function(err, user) {
    user.coins = user.coins + 1
    User.update(user) // Stores modified data.
    console.log(user.coins)
    res.redirect('/') // redirect only after the update has been issued
  });
});
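If the User model is backed by Mongoose (an assumption; the question only shows a custom getUserByUsername helper), the read-modify-write can be replaced with a single atomic $inc, which also avoids lost updates when two requests race:
// a sketch assuming a Mongoose User model
router.get('/coin', function(req, res) {
  User.updateOne(
    { username: req.user.username },
    { $inc: { coins: 1 } }, // MongoDB increments atomically on the server
    function(err) {
      res.redirect('/');
    }
  );
});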
