Access last N documents of Google Firestore collection - javascript

I want to access the last N documents from a Google Firestore collection. The document keys are timestamps in ISO format. The collection first needs to be sorted on the timestamp keys, and then the last N documents updated are to be retrieved. This operation has to be real-time and fast.
events (collection) => documents as below:

2019-01-07T08:21:18.600Z => {
  symbolname: '10900CE',
  price: '10825.0',
  timestamp: '2019-01-07T13:51:18Z',
  quantity: '2.0',
  side: 'SELL' }

2019-01-07T08:21:28.614Z => {
  symbolname: '10800PE',
  price: '10825.0',
  timestamp: '2019-01-07T13:51:28Z',
  quantity: '2.0',
  side: 'BUY' }

2019-01-07T09:45:24.783Z => {
  side: 'SELL',
  symbolname: '10800PE',
  price: '10805.0',
  timestamp: '2019-01-07T15:15:24Z',
  quantity: '2.0' }
I currently loop through the collection, add the keys to an array, sort the array, and then get the last N documents by those keys, as follows:
var ids = []
db.collection('events').get()
  .then((snapshot) => {
    snapshot.forEach((doc) => {
      ids.push(doc.id)
    });
  })
  .then(() => {
    //sortIds(ids);
    console.log(JSON.stringify(ids[ids.length - 2])) // second-last entry: 2019-01-07T08:21:28.614Z
  })
  .catch((err) => {
    console.log('Error getting documents', err);
  });
This operation takes a long time as the collection grows. Is there a more optimal way of achieving this (ordering the collection by timestamp descending and fetching the last N documents)?
Thanks

Perhaps something like this:
db.collection('events').orderBy('timestamp', 'desc').limit(n)
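For the last N documents specifically, here is a minimal sketch of the full query (hedged: n is a placeholder for however many documents you need, and the timestamp field is assumed to exist on every document). Descending order returns the newest documents first, so reverse the results if you need chronological order:

const n = 10; // hypothetical value

db.collection('events')
  .orderBy('timestamp', 'desc') // newest first
  .limit(n)                     // only the last n documents
  .get()
  .then((snapshot) => {
    // reverse to get the documents oldest-to-newest
    const lastN = snapshot.docs.map((doc) => doc.data()).reverse();
    console.log(lastN);
  })
  .catch((err) => {
    console.log('Error getting documents', err);
  });

Since the question mentions real-time, the same query can also be passed to onSnapshot() instead of get() to receive live updates as new documents arrive.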

Related

Add multiple records without overwriting or updating existing ones in a MongoDB database

I have a .csv file containing 1.7 million records. Some of these records are already in my database collection (2.2 million records), others are not. I want to add the records that are not yet in the collection without overwriting or updating the existing records. Each record has an enterprise number as a unique property.
So far I have only found solutions that first create all the operations and then use the model.bulkWrite() method to perform them. Since these operations use updateOne(), the already existing records will be updated.
So I'm looking for a way to add new records in bulk without updating or overwriting existing records.
Code
// Read data
const data = await readCSV('data/companies.csv');

// Format data
const formatted_data = data.map(record => ({
  name: record.name,
  enterprise_number: record.enterprise_number,
  vat_number: record.vat_number,
  legal_form: record.juridical_form || null,
  activity_codes: (!record.nace_codes || record.nace_codes === '') ? [] : JSON.parse(record.nace_codes.replace(/'/g, '"')),
  address: {
    street: record.street,
    house_number: record.house_number,
    additional: record.additional,
    postal_code: record.postal_code,
    city: record.city,
    country: record.country
  },
  phone: record.tel || null,
  mobile: record.mobile || null,
  email: record.email || null,
  establishment_date: record.start_date,
  status: record.is_active === '1' ? 'active' : 'inactive',
}));

const batchSize = 10000;
// Create batches of 10,000 records
for (let i = 0; i < formatted_data.length; i += batchSize) {
  let batch = formatted_data.slice(i, i + batchSize);
  let operations = batch.map((row) => ({
    updateOne: {
      filter: { enterprise_number: row.enterprise_number },
      update: row,
      upsert: true
    }
  }));
  try {
    let result = await Company.bulkWrite(operations);
    console.log(`Batch ${i} done: ${result.modifiedCount} records modified, ${result.upsertedCount} records upserted`);
  } catch (error) {
    console.error(`Error updating batch ${i}: ${error}`);
  }
}
console.log('All done');
Based on the answer from @Fraction, which was a link to the topic "Mongodb how to insert ONLY if does not exists (no update if exist)?", I updated my code to the following:
for (let i = 0; i < formatted_data.length; i += batch_size) {
  let batch = formatted_data.slice(i, i + batch_size);
  let operations = batch.map((row) => ({
    updateOne: {
      filter: { enterprise_number: row.enterprise_number },
      update: { $setOnInsert: row },
      upsert: true
    }
  }));
  try {
    let result = await Kbo.bulkWrite(operations);
    console.log(`Batch ${i} done: ${result.result.nUpserted} records inserted`);
  } catch (error) {
    console.error(`Error inserting batch ${i}: ${error}`);
  }
}
This is kind of working. It upserts new records and does not update the properties of an existing record, BUT it still updates the updated_at property of all matching records.
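One likely cause: if the schema uses Mongoose's timestamps option, Mongoose applies updatedAt to every matched updateOne, even when $setOnInsert leaves the rest of the document untouched. A possible fix, sketched under the assumption that your Mongoose version supports the per-operation timestamps flag in bulkWrite:

let operations = batch.map((row) => ({
  updateOne: {
    filter: { enterprise_number: row.enterprise_number },
    update: { $setOnInsert: row },
    upsert: true,
    timestamps: false // assumption: skip automatic createdAt/updatedAt handling for this op
  }
}));

Note that this would also skip timestamps on newly inserted documents, so you may want to set created_at/updated_at explicitly inside $setOnInsert.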

how to take result of aggregation/JSON to update a collection document?

I am making a polling website. Currently I have two collections, Polls and Votes. I am using the aggregation pipeline to get the number of votes for each movie. I am having difficulty wrapping my head around updating the poll based on the votes collection. This is the vote schema:
poll: ObjectId
votedMovies: Array
  0: Object
    id: ObjectId
    title: String
This is my poll schema:
_id: ObjectId
pollType: String
totalVotes: Number
movies: Array
  0: Object
    id: ObjectId
    title: String
    votes: Number
So far I have an aggregation pipeline that does the following:
let voteCollection = await db.collection('votes').aggregate([
  {
    $match: { poll: id }
  },
  {
    $unwind: "$votedMovies"
  },
  {
    $group: { _id: "$votedMovies.id", totalVotes: { $sum: 1 } }
  }
]).toArray()
That spits out something like this:
[{"_id":10674,"totalVotes":2},
{"_id":99861,"totalVotes":1},
{"_id":299534,"totalVotes":4},
{"_id":637157,"totalVotes":3},
{"_id":24428,"totalVotes":5}]
How do I update the poll document so that it has the current number of votes? Am I on the right track with the aggregation pipeline?
You should be able to update each movie's votes with:
for (const vote of voteCollection) {
  await db.collection('polls').updateOne(
    {
      _id: id, // poll id
      'movies.id': vote._id, // matches the id field of the movies array elements
    },
    {
      $set: { 'movies.$.votes': vote.totalVotes },
    }
  );
}
Note the $set: without an update operator, updateOne rejects the update document.
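If voteCollection has many entries, the per-movie round trips can also be collapsed into a single bulkWrite; a sketch under the same assumptions about the schema's field names:

const operations = voteCollection.map((vote) => ({
  updateOne: {
    filter: { _id: id, 'movies.id': vote._id },
    // the positional $ operator targets the array element matched in the filter
    update: { $set: { 'movies.$.votes': vote.totalVotes } },
  },
}));
await db.collection('polls').bulkWrite(operations);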

How to extract data from a nested map in firestore?

I have a collection users and it has multiple documents. Inside each document, there is a field called contacts. It is of map type. Inside contacts, I have another map of data.
My database:
I am trying to store my contacts data in contactDetailsArr array like:
[{ userId: GA3yfqLaaTaDugSrFOQujnj34Y13,
   lastMessage: "Here there!!!",
   time: t {seconds: 1663220632, nanoseconds: 36000000}
 },
 { userId: TZjQb8yoYfQbowloQk1uLRCCPck1,
   lastMessage: "How are you?",
   time: t {seconds: 1663306634, nanoseconds: 859000000}
 }]
but I am getting my array as
[{ userId: GA3yfqLaaTaDugSrFOQujnj34Y13,
   lastMessage: undefined,
   time: undefined
 },
 { userId: TZjQb8yoYfQbowloQk1uLRCCPck1,
   lastMessage: undefined,
   time: undefined
 }]
Here is my code:
export const getUserContacts = () => {
  const contactDetailsArr = [];
  db.collection("users").doc(userId).get() // userId is equal to "3aTGU..."
    .then(docs => {
      const contactsObject = docs.data().contacts;
      for (let contact in contactsObject) {
        contactDetailsArr.push({
          userId: contact,
          lastMessage: contact.lastMsg,
          time: contact.lastMsgTime
        })
      }
    })
    .catch(err => {
      console.log(err);
    })
}
If maps are objects, then why am I not able to extract the data the way we do with objects?
Please guide me on what I am doing wrong.
If you log the values in your for loop, it is easy to see the problem:
for (let contact in contactsObject) {
  console.log(contact);
}
This logs:
0
1
So you need to still look up the object at the index:
for (let i in contactsObject) {
  console.log(contactsObject[i]);
}
Or with forEach:
contactsObject.forEach((contact) => {
  console.log(contact, contact.userId, contact.lastMessage, contact.time);
});
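If contacts is instead a map keyed by user ID (as the desired output suggests), for...in yields the keys rather than numeric indices. In that case, a sketch using Object.entries (assuming each value holds the lastMsg and lastMsgTime fields used in the question's code) would build the desired array:

const contactDetailsArr = Object.entries(contactsObject).map(
  ([userId, data]) => ({
    userId,                    // the map key is the contact's user ID
    lastMessage: data.lastMsg, // field names assumed from the question's code
    time: data.lastMsgTime,
  })
);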

Reduce function returning NaN when pulling object from array - MongoDB database, React frontend

I have the following Schema:
const SubmitDebtSchema = new Schema({
  balance: [{
    balanceDate: Date,
    newBalance: Number
  }],
});
I am attempting to loop through my database entries, pull the 'newBalance' out from each object in the balance array, and then reduce / sum them together.
However, it is returning 'NaN' - and I can't figure out why.
Here is my Axios call to get the data:
componentDidMount() {
  axios.get("/api/fetch/fetchDebtCards")
    .then((response) => {
      this.setState({
        debts: response.data
      })
      console.log(this.state.debts.balance.newBalance)
    })
}
The console log in there successfully retrieves the database entries.
And here is my reduce function:
const sumBalance = this.state.debts.reduce(function(previousValue, currentValue) {
  return (
    previousValue + currentValue.balance.newBalance
  )
}, 0)
You can see, I'm attempting to tap into 'balance.newBalance' to access the newBalance within each of the balance objects.
Can anyone point out what I'm doing wrong?
EDIT: My console log, with two entries. What I want to do is get the newBalance from the last element (balance.length - 1) of each entry, and sum those together by reducing.
[Log] Array
  0 Object
    balance: [{_id: "5fbbddd1077c56000828973c", balanceDate: "2020-11-23T16:05:36.124Z", newBalance: 400}]
  1 Object
    balance: [{_id: "5fbc06f58b2f98000865df54", balanceDate: "2020-11-23T19:01:07.789Z", newBalance: 300}]
if "console.log(this.state.debts.balance.newBalance)" works then debts is not an array, how are you using map on debts then? Map can only be used on arrays.
I'm not sure how your debts object/array is exactly. Maybe a log of it would be helpful.
If it is an array, then this might work.
const sumBalance = this.state.debts.map(x => x.balance).flat().reduce((a,b) => a + b.newBalance, 0)
whereas if it's an object, then this might work
const sumBalance = this.state.debts.balance.reduce((a,b) => a+b.newBalance, 0)
If none of these work, just log "this.state.debts" and let us see what you have there.
Edit: Ok so you only need the last values of the balance arrays (the latest balance), something like this?
const sumBalance = this.state.debts.map(x => x.balance[x.balance.length-1].newBalance).reduce((a,b) => a + b, 0)

how to add items in a database for billing systems

I'm developing a cloud-based billing system and I have two tables in my database, namely bill_history and sold_items. I want to store the bill number, date, customer name, phone number and total amount, then return the bill number from bill_history and store the array of objects containing item no, item name, price, quantity and amount, with the returned bill number, in sold_items. I'm using the following code:
app.post('/billed', (req, res) => {
  const { items, total, date } = req.body;
  console.log(items, total, date);
  db.transaction(trx => {
    db.insert({
      total: total,
      date: date,
    }).into('billhead')
      .transacting(trx)
      .returning('billno')
      .then(num => {
        for (var i = 0; i < items.length; i++) {
          trx.insert({
            billno: num,
            prodname: items[i].name,
            quantity: items[i].quantity,
            netprice: items[i].amount
          }).into('billdetails')
        }).then(trx.commit())
        .catch(trx.rollback())
  })
})
Now entries are found in bill_history but not in sold_items, and I can't find the mistake! Help me with this error; the console and terminal show no errors.
An important thing to remember when working with knex queries: they are promises, and they will only execute if:
You call then on the knex object itself, or
You return the knex query inside a promise chain and call then somewhere down the chain.
Inside your for loop, you have only stated what the knex object should do, and because of syntax errors you didn't call then on the knex object itself:
.into('billdetails').then(inserts => { /* ... */ })
It does work if you return trx.insert()...
That being said, it wouldn't suit your use case: when inserting multiple values inside a transaction, you need to make sure all inserts have been successful. Using for loops the way you did, in async fashion, is dangerous and won't guarantee that all individual inserts have completed without errors and that it's safe to commit the transaction.
One way of achieving this in a safe manner would be modifying this section of your code:
// ...
.returning('billno')
.then(num => {
  // We create an array of individual inserts.
  // Each element in the array will be a single knex
  // object/promise that inserts one row into the database.
  const billDetailInserts = items.map(item => trx.insert({
    billno: num,
    prodname: item.name,
    quantity: item.quantity,
    netprice: item.amount
  }).into('billdetails'))

  // We utilize the Promise.all method, which will resolve when
  // all individual inserts have completed successfully.
  return Promise.all(billDetailInserts);
})
.then(inserts => {
  // ... commits, rollbacks, logging etc.
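For completeness, a rough sketch of how the whole route could be assembled (hedged: table and column names are copied from the question, and the response handling is only a guess). Returning the promise chain from the transaction callback lets knex commit when it resolves and roll back when it rejects:

app.post('/billed', (req, res) => {
  const { items, total, date } = req.body;
  db.transaction(trx => {
    // returning the chain lets knex commit/rollback automatically
    return trx.insert({ total: total, date: date })
      .into('billhead')
      .returning('billno')
      .then(([num]) => { // returning() resolves to an array, so destructure it
        const billDetailInserts = items.map(item =>
          trx.insert({
            billno: num,
            prodname: item.name,
            quantity: item.quantity,
            netprice: item.amount
          }).into('billdetails')
        );
        return Promise.all(billDetailInserts);
      });
  })
    .then(() => res.json({ status: 'ok' })) // hypothetical response shape
    .catch(err => {
      console.error(err);
      res.status(500).json({ error: 'insert failed' });
    });
})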
