Function returned undefined, expected Promise or value and unable to delete old data from firebase database using cloud functions - javascript

I'm trying to delete multiple nodes in my database that are older than 12 hours. I'm using a pub/sub function to trigger this event. I don't know if my code is actually looping through all nodes, as I'm not using the onWrite or onCreate database triggers on a specific node.
This is the pub/sub code:
exports.deletejob = functions.pubsub.topic('Oldtask').onPublish(() => {
deleteOldItem();
})
and the deleteOldItem function
function deleteOldItem(){
const CUT_OFF_TIME = 12 * 60 * 1000; // 12 Hours in milliseconds.
//var ref = admin.database().ref(`/articles/${id}`);
const ref = admin.database().ref(`/articles`);
const updates = {};
ref.orderByChild('id').limitToLast(100).on('value', function (response) {
var index = 0;
response.forEach(function (child) {
var element = child.val();
const datetime = element.timestamp;
const now = Date.now();
const cutoff = now - datetime;
if (CUT_OFF_TIME < cutoff){
updates[element.key] = null;
}
});
//This is supposed to be the returned promise
return ref.child(response.key).update(updates);
});
}
If there's something I'm doing wrong, I'd like to know. The pub/sub topic is triggered by a job already set up on Google Cloud Scheduler.

You had several problems in your code that were giving you trouble.
The handling of promises wasn't correct. In particular, your top-level function never actually returned a promise; it just called deleteOldItems().
You should use the promise form of once() instead of calling on() with a callback. You don't want to install a listener in this case; you just need the result a single time, and you want to handle it as part of a promise chain.
To delete nodes, you should call remove() on a reference to that node. It also returns a promise for you to use here.
You didn't calculate 12 hours in milliseconds properly; you calculated 12 minutes in milliseconds :)
Here's what I came up with. It uses an HTTP function instead of a pub/sub function and adds a log statement for my testing, but the modification you need is trivial (just change the function signature and remove the response handling after deleteOldItems, and make sure you keep returning the result of deleteOldItems()):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
function deleteOldItems() {
const CUT_OFF_TIME = 12 * 60 * 60 * 1000; // 12 Hours in milliseconds.
const ref = admin.database().ref('/articles');
return ref.orderByChild('id').limitToLast(100).once('value')
.then((response) => {
const updatePromises = [];
const now = Date.now();
response.forEach((child) => {
const datetime = child.val().timestamp;
const cutoff = now - datetime;
console.log(`processing ${datetime} my cutoff is ${CUT_OFF_TIME} and ${cutoff}`);
if (CUT_OFF_TIME < cutoff){
updatePromises.push(child.ref.remove())
}
});
return Promise.all(updatePromises);
});
}
exports.doIt = functions.https.onRequest((request, response) => {
return deleteOldItems().then(() => { return response.send('ok') });
});
While I have not tested it, I'm pretty sure this will work inside your original pub/sub function for Cloud Scheduler:
exports.deletejob = functions.pubsub.topic('Oldtask').onPublish(() => {
return deleteOldItems();
})
Of course, this is still more complicated than you need, since ordering by id doesn't really gain you anything here. Instead, why not just have the query return the earliest items before the cut-off time (i.e. exactly the ones you want to remove)? I've also switched to limitToFirst to ensure the earliest entries get thrown out, which seems more natural and ensures fairness:
function deleteOldItems() {
const cutOffTime = Date.now() - (12 * 60 * 60 * 1000); // 12 Hours earlier in milliseconds.
const ref = admin.database().ref('/articles');
return ref.orderByChild('timestamp').endAt(cutOffTime).limitToFirst(100).once('value')
.then((response) => {
const updatePromises = [];
response.forEach((child) => {
updatePromises.push(child.ref.remove())
});
return Promise.all(updatePromises);
});
}
If you do this on more than a few items, of course, you probably want to add an index on the timestamp field so the range query is more efficient.
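For the Realtime Database, that means adding an .indexOn rule for the articles node in your database security rules, roughly like this (a sketch assuming the structure above, with the cut-off field named timestamp):

{
  "rules": {
    "articles": {
      ".indexOn": ["timestamp"]
    }
  }
}

Without it the SDK typically warns about an unspecified index and ends up sorting the data on the client.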



Is it possible to count how many items a collection has using the new Firebase database, Cloud Firestore?
If so, how do I do that?
2023 Update
Firestore now supports aggregation queries.
Node SDK
const collectionRef = db.collection('cities');
const snapshot = await collectionRef.count().get();
console.log(snapshot.data().count);
Web v9 SDK
const coll = collection(db, "cities");
const snapshot = await getCountFromServer(coll);
console.log('count: ', snapshot.data().count);
Notable Limitation - You cannot currently use count() queries with real-time listeners and offline queries. (See below for alternatives)
Pricing - Pricing depends on the number of matched index entries rather than the number of documents: you are billed one document read per batch of up to 1,000 matched index entries, which makes this far cheaper than reading the documents individually to count them.
Old Answer
As with many questions, the answer is - It depends.
You should be very careful when handling large amounts of data on the front end. On top of making your front end feel sluggish, Firestore also charges you $0.60 per million reads you make.
Small collection (less than 100 documents)
Use with care - Frontend user experience may take a hit
Handling this on the front end should be fine as long as you are not doing too much logic with this returned array.
db.collection('...').get().then(snap => {
size = snap.size // will return the collection size
});
Medium collection (100 to 1000 documents)
Use with care - Firestore read invocations may cost a lot
Handling this on the front end is not feasible as it has too much potential to slow down the user's system. We should handle this logic server-side and only return the size.
The drawback to this method is you are still invoking Firestore reads (equal to the size of your collection), which in the long run may end up costing you more than expected.
Cloud Function:
db.collection('...').get().then(snap => {
res.status(200).send({length: snap.size});
});
Front End:
yourHttpClient.post(yourCloudFunctionUrl).toPromise().then(snap => {
size = snap.length // will return the collection size
})
Large collection (1000+ documents)
Most scalable solution
FieldValue.increment()
As of April 2019 Firestore now allows incrementing counters, completely atomically, and without reading the data prior. This ensures we have correct counter values even when updating from multiple sources simultaneously (previously solved using transactions), while also reducing the number of database reads we perform.
By listening to any document deletes or creates we can add to or remove from a count field that is sitting in the database.
See the firestore docs - Distributed Counters
Or have a look at Data Aggregation by Jeff Delaney. His guides are truly fantastic for anyone using AngularFire but his lessons should carry over to other frameworks as well.
Cloud Function:
export const documentWriteListener = functions.firestore
.document('collection/{documentUid}')
.onWrite((change, context) => {
if (!change.before.exists) {
// New document Created : add one to count
db.doc(docRef).update({ numberOfDocs: FieldValue.increment(1) }); // docRef: path of the document that holds the counter (not defined in this snippet)
} else if (change.before.exists && change.after.exists) {
// Updating existing document : Do nothing
} else if (!change.after.exists) {
// Deleting document : subtract one from count
db.doc(docRef).update({ numberOfDocs: FieldValue.increment(-1) });
}
return;
});
Now on the frontend you can just query this numberOfDocs field to get the size of the collection.
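For example, reading that counter from the front end could look like this (a sketch; stats/collectionCounts is a hypothetical stand-in for whatever path docRef points at in the function above):

db.doc('stats/collectionCounts')
  .get()
  .then(snap => {
    const size = snap.data().numberOfDocs; // maintained by the Cloud Function above
    console.log(size);
  });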
The simplest way to do so is to read the size of a "querySnapshot".
db.collection("cities").get().then(function(querySnapshot) {
console.log(querySnapshot.size);
});
You can also read the length of the docs array inside "querySnapshot".
querySnapshot.docs.length;
Or check whether a "querySnapshot" is empty by reading its empty property, which returns a boolean value.
querySnapshot.empty;
As far as I know there is no built-in solution for this, and it is only possible in the Node SDK right now.
If you have a
db.collection('someCollection')
you can use
.select([fields])
to define which field you want to select. If you do an empty select() you will just get an array of document references.
example:
db.collection('someCollection').select().get().then(
(snapshot) => console.log(snapshot.docs.length)
);
This solution is only an optimization over the worst case of downloading all documents, and it does not scale to large collections!
Also have a look at this:
How to get a count of number of documents in a collection with Cloud Firestore
Aggregate count query just landed as a preview in Firestore.
Announced at the 2022 Firebase Summit: https://firebase.blog/posts/2022/10/whats-new-at-Firebase-Sumit-2022
Excerpt:
[Developer Preview] Count() function: With the new count function in
Firstore [sic], you can now get the count of the matching documents when you
run a query or read from a collection, without loading the actual
documents, which saves you a lot of time.
During the Q&A, someone asked about pricing for aggregated queries, and the answer the Firebase team provided was that it'll cost 1 / 1000th of the price of a read (rounded up to the nearest read, see comments below for more details), but will count all records that are part of the aggregate.
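As a rough worked example under that pricing: a count() that matches 1,500,000 index entries would be billed as about 1,500 reads (1,500,000 / 1,000, rounded up), while still returning the full count of all matched records.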
Be careful counting the number of documents for large collections. It is a little bit complex with Firestore if you want to have a precalculated counter for every collection.
Code like this doesn't work in this case:
export const customerCounterListener =
functions.firestore.document('customers/{customerId}')
.onWrite((change, context) => {
// on create
if (!change.before.exists && change.after.exists) {
return firestore
.collection('metadatas')
.doc('customers')
.get()
.then(docSnap =>
docSnap.ref.set({
count: docSnap.data().count + 1
}))
// on delete
} else if (change.before.exists && !change.after.exists) {
return firestore
.collection('metadatas')
.doc('customers')
.get()
.then(docSnap =>
docSnap.ref.set({
count: docSnap.data().count - 1
}))
}
return null;
});
The reason is that every Cloud Firestore trigger has to be idempotent, as the Firestore documentation says: https://firebase.google.com/docs/functions/firestore-events#limitations_and_guarantees
Solution
So, in order to prevent multiple executions of your code, you need to track events and use transactions. This is my particular way to handle large collection counters:
const executeOnce = (change, context, task) => {
const eventRef = firestore.collection('events').doc(context.eventId);
return firestore.runTransaction(t =>
t
.get(eventRef)
.then(docSnap => (docSnap.exists ? null : task(t)))
.then(() => t.set(eventRef, { processed: true }))
);
};
const documentCounter = collectionName => (change, context) =>
executeOnce(change, context, t => {
// on create
if (!change.before.exists && change.after.exists) {
return t
.get(firestore.collection('metadatas')
.doc(collectionName))
.then(docSnap =>
t.set(docSnap.ref, {
count: ((docSnap.data() && docSnap.data().count) || 0) + 1
}));
// on delete
} else if (change.before.exists && !change.after.exists) {
return t
.get(firestore.collection('metadatas')
.doc(collectionName))
.then(docSnap =>
t.set(docSnap.ref, {
count: docSnap.data().count - 1
}));
}
return null;
});
Use cases here:
/**
* Count documents in articles collection.
*/
exports.articlesCounter = functions.firestore
.document('articles/{id}')
.onWrite(documentCounter('articles'));
/**
* Count documents in customers collection.
*/
exports.customersCounter = functions.firestore
.document('customers/{id}')
.onWrite(documentCounter('customers'));
As you can see, the key to preventing multiple executions is the property called eventId in the context object. If the function is invoked multiple times for the same event, the event ID will be the same in all cases. Unfortunately, you must have an "events" collection in your database.
In 2020 this is still not available in the Firebase SDK. It is available in Firebase Extensions (Beta), but that is pretty complex to set up and use...
A reasonable approach
Helpers... (create/delete seems redundant but is cheaper than onUpdate)
export const onCreateCounter = () => async (
change,
context
) => {
const collectionPath = change.ref.parent.path;
const statsDoc = db.doc("counters/" + collectionPath);
const countDoc = {};
countDoc["count"] = admin.firestore.FieldValue.increment(1);
await statsDoc.set(countDoc, { merge: true });
};
export const onDeleteCounter = () => async (
change,
context
) => {
const collectionPath = change.ref.parent.path;
const statsDoc = db.doc("counters/" + collectionPath);
const countDoc = {};
countDoc["count"] = admin.firestore.FieldValue.increment(-1);
await statsDoc.set(countDoc, { merge: true });
};
export interface CounterPath {
watch: string;
name: string;
}
Exported Firestore hooks
export const Counters: CounterPath[] = [
{
name: "count_buildings",
watch: "buildings/{id2}"
},
{
name: "count_buildings_subcollections",
watch: "buildings/{id2}/{id3}/{id4}"
}
];
Counters.forEach(item => {
exports[item.name + '_create'] = functions.firestore
.document(item.watch)
.onCreate(onCreateCounter());
exports[item.name + '_delete'] = functions.firestore
.document(item.watch)
.onDelete(onDeleteCounter());
});
In action
The buildings root collection and all subcollections will be tracked.
The counters end up under the /counters/ root path.
Now collection counts will update automatically and eventually! If you need a count, just use the collection path and prefix it with counters.
const collectionPath = 'buildings/138faicnjasjoa89/buildingContacts';
const collectionCount = await db
.doc('counters/' + collectionPath)
.get()
.then(snap => snap.get('count'));
Limitations
As this approach uses a single database and document, it is limited to the Firestore constraint of 1 Update per Second for each counter. It will be eventually consistent, but in cases where large amounts of documents are added/removed the counter will lag behind the actual collection count.
I agree with @Matthew: it will cost a lot if you perform such a query.
[ADVICE FOR DEVELOPERS BEFORE STARTING THEIR PROJECTS]
Since we have foreseen this situation at the beginning, we can actually make a collection named counters with a document that stores all the counters in fields of type number.
For example:
For each CRUD operation on the collection, update the counter document:
When you create a new document in the collection/subcollection: (+1 in the counter) [1 write operation]
When you delete a document in the collection/subcollection: (-1 in the counter) [1 write operation]
When you update an existing document in the collection/subcollection, do nothing on the counter document: (0)
When you read an existing document in the collection/subcollection, do nothing on the counter document: (0)
Next time, when you want to get the number of documents in the collection, you just read that document field. [1 read operation]
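A minimal sketch of that idea (assuming a v8-style Firestore instance db and a hypothetical counters/posts document):

// hypothetical counter document for the 'posts' collection
const counterRef = db.collection('counters').doc('posts');

// +1 when creating a post, increment(-1) when deleting one;
// merge: true creates the counter document if it doesn't exist yet
counterRef.set(
  { total: firebase.firestore.FieldValue.increment(1) },
  { merge: true }
);

// later: getting the number of posts is a single document read
counterRef.get().then(snap => console.log(snap.data().total));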
In addition, you can store the collection names in an array, but this will be tricky; the way Firebase handles arrays is shown below:
// we send this
['a', 'b', 'c', 'd', 'e']
// Firebase stores this
{0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'e'}
// since the keys are numeric and sequential,
// if we query the data, we get this
['a', 'b', 'c', 'd', 'e']
// however, if we then delete a, b, and d,
// they are no longer mostly sequential, so
// we do not get back an array
{2: 'c', 4: 'e'}
So, if you are not going to delete collections, you can actually use an array to store the list of collection names instead of querying all the collections every time.
Hope it helps!
As of October 2022, Firestore has introduced a count() method in the client SDKs. Now you can get the count for a query without downloading the documents.
It charges 1 document read for every 1,000 documents counted.
Web (v9)
Introduced in Firebase 9.11.0:
const collectionRef = collection(db, "cities");
const snapshot = await getCountFromServer(collectionRef);
console.log('count: ', snapshot.data().count);
Web V8
Not Available.
Node (Admin)
const collectionRef = db.collection('cities');
const snapshot = await collectionRef.count().get();
console.log(snapshot.data().count);
Android (Kotlin)
Introduced in firestore v24.4.0 (BoM 31.0.0):
val query = db.collection("cities")
val countQuery = query.count()
countQuery.get(AggregateSource.SERVER).addOnCompleteListener { task ->
if (task.isSuccessful) {
val snapshot = task.result
Log.d(TAG, "Count: ${snapshot.count}")
} else {
Log.d(TAG, "Count failed: ", task.getException())
}
}
Apple Platforms (Swift)
Introduced in Firestore v10.0.0:
do {
let query = db.collection("cities")
let countQuery = query.countAggregateQuery
let snapshot = try await countQuery.aggregation(source: AggregateSource.server)
print(snapshot.count)
} catch {
print(error)
}
Increment a counter using admin.firestore.FieldValue.increment:
exports.onInstanceCreate = functions.firestore.document('projects/{projectId}/instances/{instanceId}')
.onCreate((snap, context) =>
db.collection('projects').doc(context.params.projectId).update({
instanceCount: admin.firestore.FieldValue.increment(1),
})
);
exports.onInstanceDelete = functions.firestore.document('projects/{projectId}/instances/{instanceId}')
.onDelete((snap, context) =>
db.collection('projects').doc(context.params.projectId).update({
instanceCount: admin.firestore.FieldValue.increment(-1),
})
);
In this example we increment an instanceCount field in the project each time a document is added to the instances sub collection. If the field doesn't exist yet it will be created and incremented to 1.
The incrementation is transactional internally but you should use a distributed counter if you need to increment more frequently than every 1 second.
It's often preferable to implement onCreate and onDelete rather than onWrite as you will call onWrite for updates which means you are spending more money on unnecessary function invocations (if you update the docs in your collection).
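A rough sketch of the distributed counter idea mentioned above, along the lines of the sharded counters in the Firestore docs (the counters/pageViews document, its shards subcollection, and NUM_SHARDS are all hypothetical choices):

const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

const NUM_SHARDS = 10; // more shards = more sustained writes per second

// write path: bump one random shard instead of a single hot document
function incrementCounter() {
  const shardId = String(Math.floor(Math.random() * NUM_SHARDS));
  return db.collection('counters').doc('pageViews')
    .collection('shards').doc(shardId)
    .set({ count: admin.firestore.FieldValue.increment(1) }, { merge: true });
}

// read path: sum all the shards
async function getCount() {
  const shards = await db.collection('counters').doc('pageViews')
    .collection('shards').get();
  return shards.docs.reduce((total, shard) => total + (shard.data().count || 0), 0);
}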
No, there is no built-in support for aggregation queries right now. However there are a few things you could do.
The first is documented here. You can use transactions or cloud functions to maintain aggregate information:
This example shows how to use a function to keep track of the number of ratings in a subcollection, as well as the average rating.
exports.aggregateRatings = functions.firestore
.document('restaurants/{restId}/ratings/{ratingId}')
.onWrite(event => {
// Get value of the newly added rating
var ratingVal = event.data.get('rating');
// Get a reference to the restaurant
var restRef = db.collection('restaurants').doc(event.params.restId);
// Update aggregations in a transaction
return db.runTransaction(transaction => {
return transaction.get(restRef).then(restDoc => {
// Compute new number of ratings
var newNumRatings = restDoc.data().numRatings + 1;
// Compute new average rating
var oldRatingTotal = restDoc.data().avgRating * restDoc.data().numRatings;
var newAvgRating = (oldRatingTotal + ratingVal) / newNumRatings;
// Update restaurant info
return transaction.update(restRef, {
avgRating: newAvgRating,
numRatings: newNumRatings
});
});
});
});
The solution that jbb mentioned is also useful if you only want to count documents infrequently. Make sure to use the select() statement to avoid downloading all of each document (that's a lot of bandwidth when you only need a count). select() is only available in the server SDKs for now so that solution won't work in a mobile app.
UPDATE 11/20
I created an npm package for easy access to a counter function: https://code.build/p/9DicAmrnRoK4uk62Hw1bEV/firestore-counters
I created a universal function using all these ideas to handle all counter situations (except queries).
The only exception would be when doing so many writes a second, it
slows you down. An example would be likes on a trending post. It is
overkill on a blog post, for example, and will cost you more. I
suggest creating a separate function in that case using shards:
https://firebase.google.com/docs/firestore/solutions/counters
// trigger collections
exports.myFunction = functions.firestore
.document('{colId}/{docId}')
.onWrite(async (change: any, context: any) => {
return runCounter(change, context);
});
// trigger sub-collections
exports.mySubFunction = functions.firestore
.document('{colId}/{docId}/{subColId}/{subDocId}')
.onWrite(async (change: any, context: any) => {
return runCounter(change, context);
});
// add change the count
const runCounter = async function (change: any, context: any) {
const col = context.params.colId;
const eventsDoc = '_events';
const countersDoc = '_counters';
// ignore helper collections
if (col.startsWith('_')) {
return null;
}
// simplify event types
const createDoc = change.after.exists && !change.before.exists;
const updateDoc = change.before.exists && change.after.exists;
if (updateDoc) {
return null;
}
// check for sub collection
const isSubCol = context.params.subDocId;
const parentDoc = `${countersDoc}/${context.params.colId}`;
const countDoc = isSubCol
? `${parentDoc}/${context.params.docId}/${context.params.subColId}`
: `${parentDoc}`;
// collection references
const countRef = db.doc(countDoc);
const countSnap = await countRef.get();
// increment size if doc exists
if (countSnap.exists) {
// createDoc or deleteDoc
const n = createDoc ? 1 : -1;
const i = admin.firestore.FieldValue.increment(n);
// create event for accurate increment
const eventRef = db.doc(`${eventsDoc}/${context.eventId}`);
return db.runTransaction(async (t: any): Promise<any> => {
const eventSnap = await t.get(eventRef);
// do nothing if event exists
if (eventSnap.exists) {
return null;
}
// add event and update size
await t.update(countRef, { count: i });
return t.set(eventRef, {
completed: admin.firestore.FieldValue.serverTimestamp()
});
}).catch((e: any) => {
console.log(e);
});
// otherwise count all docs in the collection and add size
} else {
const colRef = db.collection(change.after.ref.parent.path);
return db.runTransaction(async (t: any): Promise<any> => {
// update size
const colSnap = await t.get(colRef);
return t.set(countRef, { count: colSnap.size });
}).catch((e: any) => {
console.log(e);
});
}
}
This handles events, increments, and transactions. The beauty of this is that if you are not sure about the accuracy of a counter document (probably while still in beta), you can delete the counter to have it automatically recount the collection on the next trigger. Yes, that recount costs reads, so don't delete it otherwise.
Same kind of thing to get the count:
const collectionPath = 'buildings/138faicnjasjoa89/buildingContacts';
const colSnap = await db.doc('_counters/' + collectionPath).get();
const count = colSnap.get('count');
Also, you may want to create a cron job (scheduled function) to remove old events to save money on database storage. You need at least the Blaze plan, and there may be some more configuration. You could run it every Sunday at 11pm, for example.
https://firebase.google.com/docs/functions/schedule-functions
This is untested, but should work with a few tweaks:
exports.scheduledFunctionCrontab = functions.pubsub.schedule('5 11 * * *')
.timeZone('America/New_York')
.onRun(async (context) => {
// get yesterday
const yesterday = new Date();
yesterday.setDate(yesterday.getDate() - 1);
const eventFilter = db.collection('_events').where('completed', '<=', yesterday);
const eventFilterSnap = await eventFilter.get();
eventFilterSnap.forEach(async (doc: any) => {
await doc.ref.delete();
});
return null;
});
And last, don't forget to protect the collections in firestore.rules:
match /_counters/{document} {
allow read;
allow write: if false;
}
match /_events/{document} {
allow read, write: if false;
}
Update: Queries
Adding to my other answer if you want to automate query counts as well, you can use this modified code in your cloud function:
if (col === 'posts') {
// counter reference - user doc ref
const userRef = after ? after.userDoc : before.userDoc;
// query reference
const postsQuery = db.collection('posts').where('userDoc', "==", userRef);
// add the count - postsCount on userDoc
await addCount(change, context, postsQuery, userRef, 'postsCount');
}
return delEvents();
This will automatically update postsCount on the user document. You could easily add other one-to-many counts this way. This just gives you ideas of how you can automate things. I also gave you another way to delete the events: you have to read each event to delete it, so deleting them later doesn't really save anything, it just makes the function slower.
/**
* Adds a counter to a doc
* @param change - change ref
* @param context - context ref
* @param queryRef - the query ref to count
* @param countRef - the counter document ref
* @param countName - the name of the counter on the counter document
*/
const addCount = async function (change: any, context: any,
queryRef: any, countRef: any, countName: string) {
// events collection
const eventsDoc = '_events';
// simplify event type
const createDoc = change.after.exists && !change.before.exists;
// doc references
const countSnap = await countRef.get();
// increment size if field exists
if (countSnap.get(countName)) {
// createDoc or deleteDoc
const n = createDoc ? 1 : -1;
const i = admin.firestore.FieldValue.increment(n);
// create event for accurate increment
const eventRef = db.doc(`${eventsDoc}/${context.eventId}`);
return db.runTransaction(async (t: any): Promise<any> => {
const eventSnap = await t.get(eventRef);
// do nothing if event exists
if (eventSnap.exists) {
return null;
}
// add event and update size
await t.set(countRef, { [countName]: i }, { merge: true });
return t.set(eventRef, {
completed: admin.firestore.FieldValue.serverTimestamp()
});
}).catch((e: any) => {
console.log(e);
});
// otherwise count all docs in the collection and add size
} else {
return db.runTransaction(async (t: any): Promise<any> => {
// update size
const colSnap = await t.get(queryRef);
return t.set(countRef, { [countName]: colSnap.size }, { merge: true });
}).catch((e: any) => {
console.log(e);
});
}
}
/**
* Deletes events over a day old
*/
const delEvents = async function () {
// get yesterday
const yesterday = new Date();
yesterday.setDate(yesterday.getDate() - 1);
const eventFilter = db.collection('_events').where('completed', '<=', yesterday);
const eventFilterSnap = await eventFilter.get();
eventFilterSnap.forEach(async (doc: any) => {
await doc.ref.delete();
});
return null;
}
I should also warn you that universal functions will run on every onWrite call, period. It may be cheaper to only run the function on onCreate and onDelete events of your specific collections. Like the NoSQL database we are using, repeated code and data can save you money.
There is no direct option available. You can't do db.collection("CollectionName").count().
Below are the two ways by which you can find the count of number of documents within a collection.
1: Get all the documents in the collection and then read its size. (Not the best solution)
db.collection("CollectionName").get().subscribe(doc=>{
console.log(doc.size)
})
With the above code your document reads will be equal to the number of documents in the collection, which is why you should avoid this solution.
2: Create a separate document within your collection which stores the count of documents in the collection. (Best solution)
db.collection("CollectionName").doc("counts").get().subscribe(doc=>{
console.log(doc.count)
})
Above we created a document named counts to store all the count information. You can update the count document in the following way:
Create Firestore triggers that maintain the counts document:
Increment the count property of the counts document when a new document is created.
Decrement the count property of the counts document when a document is deleted.
With respect to price (1 document read) and fast data retrieval, the above solution is good.
A workaround is to:
write a counter in a Firebase doc, which you increment within a transaction every time you create a new entry
store the count in a field of your new entry (e.g. position: 4)
create an index on that field (position DESC)
do a skip+limit with a query: .Where("position", "<", x).OrderBy("position", DESC)
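A rough sketch of that workaround with the Node Admin SDK (collection and field names here are hypothetical):

const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

// Creates an entry with a monotonically increasing 'position' field.
// The transaction keeps the counter document and the new entry consistent.
function addEntryWithPosition(data) {
  const counterRef = db.collection('meta').doc('entriesCounter');
  const entryRef = db.collection('entries').doc();
  return db.runTransaction(async (t) => {
    const counterSnap = await t.get(counterRef);
    const position = (counterSnap.exists ? counterSnap.data().count : 0) + 1;
    t.set(counterRef, { count: position }, { merge: true });
    t.set(entryRef, Object.assign({}, data, { position }));
    return position; // also the current total number of entries
  });
}

// "skip + limit" style page: entries with position below x, newest first
function entriesBelow(x, pageSize) {
  return db.collection('entries')
    .where('position', '<', x)
    .orderBy('position', 'desc')
    .limit(pageSize)
    .get();
}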
Hope this helps!
I have tried a lot of different approaches, and finally I improved on one of the methods.
First you need to create a separate collection and save all events there.
Second you need to create a new function triggered on a schedule. This function counts the events in the events collection and clears the event documents.
Code details are in this article:
https://medium.com/@ihor.malaniuk/how-to-count-documents-in-google-cloud-firestore-b0e65863aeca
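The article is behind the link, but the general idea can be sketched roughly like this (the collection names, schedule, and counter field are assumptions, not taken from the article):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

// runs hourly: folds the buffered events into a counter, then clears them
exports.flushEventCounter = functions.pubsub.schedule('every 60 minutes').onRun(async () => {
  const events = await db.collection('events').get();
  if (events.empty) {
    return null;
  }
  const batch = db.batch(); // note: a batch is limited to 500 writes, chunk if needed
  batch.set(
    db.collection('counters').doc('myCollection'),
    { count: admin.firestore.FieldValue.increment(events.size) },
    { merge: true }
  );
  events.docs.forEach((doc) => batch.delete(doc.ref));
  return batch.commit();
});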
One fast, money-saving trick:
Make a doc and store a 'count' variable in Firestore. When a user adds a new doc to the collection, increase that variable, and when a user deletes a doc, decrease it. e.g.
updateDoc(doc(db, "Count_collection", "Count_Doc"), {count: increment(1)});
Note: use increment(-1) for decreasing and increment(1) for increasing the count.
How it saves money and time:
You (and Firebase) don't need to loop through the collection, nor does the browser need to load the whole collection to count the docs.
All the counts are saved in a doc with just one variable named "count" (or whatever), so less than 1 KB of data is used, and it costs only 1 read in Firestore.
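Reading the count back later is a single document read in the same modular style (a sketch mirroring the names above):

import { doc, getDoc } from "firebase/firestore";

async function readCount(db) {
  const countSnap = await getDoc(doc(db, "Count_collection", "Count_Doc"));
  return countSnap.data().count; // total maintained by the increments above
}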
Solution using pagination with offset & limit:
public int collectionCount(String collection) {
Integer page = 0;
List<QueryDocumentSnapshot> snaps = new ArrayList<>();
findDocsByPage(collection, page, snaps);
return snaps.size();
}
public void findDocsByPage(String collection, Integer page,
List<QueryDocumentSnapshot> snaps) {
try {
Integer limit = 26000;
FieldPath[] selectedFields = new FieldPath[] { FieldPath.of("id") };
List<QueryDocumentSnapshot> snapshotPage;
snapshotPage = fireStore()
.collection(collection)
.select(selectedFields)
.offset(page * limit)
.limit(limit)
.get().get().getDocuments();
if (snapshotPage.size() > 0) {
snaps.addAll(snapshotPage);
page++;
findDocsByPage(collection, page, snaps);
}
} catch (InterruptedException | ExecutionException e) {
e.printStackTrace();
}
}
findDocsByPage is a recursive method that walks through every page of the collection
selectedFields optimizes the query by fetching only the id field instead of the full document body
limit is the max size of each query page
page defines the initial page for the pagination
From the tests I did, it worked well for collections with up to approximately 120k records!
Firestore is introducing a new Query.count() that fetches the count of a query without fetching the docs.
This allows you to simply query all collection items and get the count of that query.
Ref:
Firebase 10 iOS SDK
JS SDK PR: https://github.com/firebase/firebase-js-sdk/pull/6608
There's a new built-in function since version 9.11.0 called getCountFromServer(), which fetches the number of documents in the result set without actually downloading the documents.
https://firebase.google.com/docs/reference/js/firestore_#getcountfromserver
Took me a while to get this working based on some of the answers above, so I thought I'd share it for others to use. I hope it's useful.
'use strict';
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();
exports.countDocumentsChange = functions.firestore.document('library/{categoryId}/documents/{documentId}').onWrite((change, context) => {
const categoryId = context.params.categoryId;
const categoryRef = db.collection('library').doc(categoryId)
let FieldValue = require('firebase-admin').firestore.FieldValue;
if (!change.before.exists) {
// new document created : add one to count
categoryRef.update({numberOfDocs: FieldValue.increment(1)});
console.log("%s numberOfDocs incremented by 1", categoryId);
} else if (change.before.exists && change.after.exists) {
// updating existing document : Do nothing
} else if (!change.after.exists) {
// deleting document : subtract one from count
categoryRef.update({numberOfDocs: FieldValue.increment(-1)});
console.log("%s numberOfDocs decremented by 1", categoryId);
}
return 0;
});
This uses a counter to create a numeric unique ID. In my use case I will never decrement, even when the document the ID was needed for is deleted.
Upon creation of a document that needs a unique numeric value:
Designate a collection appData with one document, with document ID "only"
Set uniqueNumericIDAmount to 0 in the Firebase Firestore console
Use doc.data().uniqueNumericIDAmount + 1 as the unique numeric ID
Update the appData document's uniqueNumericIDAmount with firebase.firestore.FieldValue.increment(1)
firebase
.firestore()
.collection("appData")
.doc("only")
.get()
.then(doc => {
var foo = doc.data();
foo.id = doc.id;
// your collection that needs a unique ID
firebase
.firestore()
.collection("uniqueNumericIDs")
.doc(user.uid)// user id in my case
.set({// I use this in login, so this document doesn't
// exist yet, otherwise use update instead of set
phone: this.state.phone,// whatever else you need
uniqueNumericID: foo.uniqueNumericIDAmount + 1
})
.then(() => {
// upon success of new ID, increment uniqueNumericIDAmount
firebase
.firestore()
.collection("appData")
.doc("only")
.update({
uniqueNumericIDAmount: firebase.firestore.FieldValue.increment(
1
)
})
.catch(err => {
console.log(err);
});
})
.catch(err => {
console.log(err);
});
});
var variable=0
variable=variable+querySnapshot.count
Then, if you need it as a String:
let stringVariable= String(variable)
Along with my npm package adv-firestore-functions above, you can also just use Firestore rules to enforce a good counter:
Firestore Rules
function counter() {
let docPath = /databases/$(database)/documents/_counters/$(request.path[3]);
let afterCount = getAfter(docPath).data.count;
let beforeCount = get(docPath).data.count;
let addCount = afterCount == beforeCount + 1;
let subCount = afterCount == beforeCount - 1;
let newId = getAfter(docPath).data.docId == request.path[4];
let deleteDoc = request.method == 'delete';
let createDoc = request.method == 'create';
return (newId && subCount && deleteDoc) || (newId && addCount && createDoc);
}
function counterDoc() {
let doc = request.path[4];
let docId = request.resource.data.docId;
let afterCount = request.resource.data.count;
let beforeCount = resource.data.count;
let docPath = /databases/$(database)/documents/$(doc)/$(docId);
let createIdDoc = existsAfter(docPath) && !exists(docPath);
let deleteIdDoc = !existsAfter(docPath) && exists(docPath);
let addCount = afterCount == beforeCount + 1;
let subCount = afterCount == beforeCount - 1;
return (createIdDoc && addCount) || (deleteIdDoc && subCount);
}
and use them like so:
match /posts/{document} {
allow read;
allow update;
allow create: if counter();
allow delete: if counter();
}
match /_counters/{document} {
allow read;
allow write: if counterDoc();
}
Frontend
Replace your set and delete functions with these:
set
async setDocWithCounter(
ref: DocumentReference<DocumentData>,
data: {
[x: string]: any;
},
options: SetOptions): Promise<void> {
// counter collection
const counterCol = '_counters';
const col = ref.path.split('/').slice(0, -1).join('/');
const countRef = doc(this.afs, counterCol, col);
const countSnap = await getDoc(countRef);
const refSnap = await getDoc(ref);
// don't increase count if edit
if (refSnap.exists()) {
await setDoc(ref, data, options);
// increase count
} else {
const batch = writeBatch(this.afs);
batch.set(ref, data, options);
// if count exists
if (countSnap.exists()) {
batch.update(countRef, {
count: increment(1),
docId: ref.id
});
// create count
} else {
// will only run once, should not use
// for mature apps
const colRef = collection(this.afs, col);
const colSnap = await getDocs(colRef);
batch.set(countRef, {
count: colSnap.size + 1,
docId: ref.id
});
}
batch.commit();
}
}
delete
async delWithCounter(
ref: DocumentReference<DocumentData>
): Promise<void> {
// counter collection
const counterCol = '_counters';
const col = ref.path.split('/').slice(0, -1).join('/');
const countRef = doc(this.afs, counterCol, col);
const countSnap = await getDoc(countRef);
const batch = writeBatch(this.afs);
// if count exists
batch.delete(ref);
if (countSnap.exists()) {
batch.update(countRef, {
count: increment(-1),
docId: ref.id
});
}
/*
if ((countSnap.data() as any).count == 1) {
batch.delete(countRef);
}*/
batch.commit();
}
see here for more info...
This feature is now supported in Firestore, albeit in beta.
Here are the official Firebase docs
With the new version of Firebase, you can now run aggregated queries!
Simply write
.count().get();
after your query.
As it stands, firebase only allows server-side count, like this
const collectionRef = db.collection('cities');
const snapshot = await collectionRef.count().get();
console.log(snapshot.data().count);
Please note this is for Node.js.
New feature available in Firebase/Firestore provides a count of documents in a collection:
See this thread to see how to achieve it, with an example.
How To Count Number of Documents in a Collection in Firebase Firestore With a WHERE query in react.js
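For reference, a minimal Web v9 sketch of such a count with a where() filter (collection and field names are placeholders):

import { getFirestore, collection, query, where, getCountFromServer } from "firebase/firestore";

const db = getFirestore();

async function countCompletedTasks(userId) {
  const q = query(
    collection(db, "tasks"),
    where("userId", "==", userId),
    where("completed", "==", true)
  );
  const snapshot = await getCountFromServer(q);
  return snapshot.data().count;
}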
According to this documentation, Cloud Firestore supports the count() aggregation query, available in preview.
The Flutter/Dart code was missing (at the time of writing this) so I played around with it and the following function seems to work:
Future<int> getCount(String path) async {
var collection = _fireStore.collection(path);
var countQuery = collection.count();
var snapShot = await countQuery.get(source: AggregateSource.server);
return snapShot.count;
}
firebaseFirestore.collection("...").addSnapshotListener(new EventListener<QuerySnapshot>() {
@Override
public void onEvent(QuerySnapshot documentSnapshots, FirebaseFirestoreException e) {
int Counter = documentSnapshots.size();
}
});
So my solution for this problem is a bit non-technical, not super precise, but good enough for me.
Those are my documents. As I have a lot of them (100k+), the law of large numbers applies: I can assume there is a more-or-less equal number of items whose ID starts with 0, 1, 2, etc.
So what I do is scroll my list until I get to IDs starting with 1, or with 01, depending on how far you have to scroll.
Now, having scrolled that far, I open the inspector, see how far I scrolled, and divide that by the height of a single element.
I had to scroll 82,000px to get to items with IDs starting with 1. The height of a single element is 32px.
That means I have about 2,500 items with IDs starting with 0, so now I multiply by the number of possible 'starting characters'. In Firebase that can be A-Z, a-z, 0-9, which is 26 + 26 + 10 = 62.
That gives roughly 2,500 * 62, or about 155,000 items in my collection.
Summarizing: What is wrong with you firebase?

Delete same value from multiple locations Firebase Functions

I have a firebase function that deletes old messages after 24 hours as in my old question here. I now have just the messageIds stored in an array under the user, such that the path is /User/objectId/myMessages with an array of all the messageIds under myMessages. All of the messages get deleted after 24 hours, but the IDs under the user's profile stay there. Is there a way to extend the function so that it also deletes the messageIds from the array under the user's account?
I'm new to Firebase functions and javascript so I'm not sure how to do this. All help is appreciated!
Building upon @frank-van-puffelen's accepted answer on the old question, this will now delete the message IDs from their sender's user data as part of the same atomic delete operation without firing off a Cloud Function for every message deleted.
Method 1: Restructure for concurrency
Before you can use this method, you must restructure how you store entries in /User/someUserId/myMessages to follow best practices for concurrently modified lists, like the following:
{
"/User/someUserId/myMessages": {
"-Lfq460_5tm6x7dchhOn": true,
"-Lfq483gGzmpB_Jt6Wg5": true,
...
}
}
This allows you to modify the previous function to:
// Cut off time. Child nodes older than this will be deleted.
const CUT_OFF_TIME = 24 * 60 * 60 * 1000; // 24 hours in milliseconds.
exports.deleteOldMessages = functions.database.ref('/Message/{chatRoomId}').onWrite(async (change) => {
const rootRef = admin.database().ref(); // needed top level reference for multi-path update
const now = Date.now();
const cutoff = (now - CUT_OFF_TIME) / 1000; // convert to seconds
const oldItemsQuery = rootRef.child('Message').orderByChild('seconds').endAt(cutoff);
const snapshot = await oldItemsQuery.once('value');
// create a map with all children that need to be removed
const updates = {};
snapshot.forEach(messageSnapshot => {
let senderId = messageSnapshot.child('senderId').val();
updates['Message/' + messageSnapshot.key] = null; // to delete message
updates['User/' + senderId + '/myMessages/' + messageSnapshot.key] = null; // to delete entry in user data
});
// execute all updates in one go and return the result to end the function
return rootRef.update(updates);
});
Method 2: Use an array
Warning: This method falls prey to concurrency issues. If a user were to post a new message during the delete operation, its ID could be removed while evaluating the deletion. Use method 1 where possible to avoid this.
This method assumes your /User/someUserId/myMessages object looks like this (a plain array):
{
"/User/someUserId/myMessages": {
"0": "-Lfq460_5tm6x7dchhOn",
"1": "-Lfq483gGzmpB_Jt6Wg5",
...
}
}
The leanest, most cost-effective, anti-collision function I can come up for this data structure is the following:
// Cut off time. Child nodes older than this will be deleted.
const CUT_OFF_TIME = 24 * 60 * 60 * 1000; // 24 hours in milliseconds.
exports.deleteOldMessages = functions.database.ref('/Message/{chatRoomId}').onWrite(async (change) => {
const rootRef = admin.database().ref(); // needed top level reference for multi-path update
const now = Date.now();
const cutoff = (now - CUT_OFF_TIME) / 1000; // convert to seconds
const oldItemsQuery = rootRef.child('Message').orderByChild('seconds').endAt(cutoff);
const snapshot = await oldItemsQuery.once('value');
// create a map with all children that need to be removed
const updates = {};
const messagesByUser = {};
snapshot.forEach(messageSnapshot => {
updates['Message/' + messageSnapshot.key] = null; // to delete message
// cache message IDs by user for next step
let senderId = messageSnapshot.child('senderId').val();
if (!messagesByUser[senderId]) { messagesByUser[senderId] = []; }
messagesByUser[senderId].push(messageSnapshot.key);
});
// Get each user's list of message IDs and remove those that were deleted.
let pendingOperations = [];
for (let [senderId, messageIdsToRemove] of Object.entries(messagesByUser)) {
pendingOperations.push(admin.database().ref('User/' + senderId + '/myMessages').once('value')
.then((messageArraySnapshot) => {
let messageIds = messageArraySnapshot.val() || [];
messageIds = messageIds.filter((id) => !messageIdsToRemove.includes(id));
updates['User/' + senderId + '/myMessages'] = messageIds; // to update array with non-deleted values
}));
}
// wait for each user's new /myMessages value to be added to the pending updates
await Promise.all(pendingOperations);
// execute all updates in one go and return the result to end the function
return rootRef.update(updates);
});
Update: DO NOT USE THIS ANSWER (I will leave it as it may still be handy for detecting a delete operation for some other need, but do not use for the purpose of cleaning up an array in another document)
Thanks to @samthecodingman for providing an atomic and concurrency-safe answer.
If using Firebase Realtime Database you can add an onChange event listener:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
exports.onDeletedMessage = functions.database.ref('Message/{messageId}').onChange(async event => {
// Exit if this item exists... if so it was not deleted!
if (event.data.exists()) {
return;
}
const userId = event.data.userId; //hopefully you have this in the message document
const messageId = event.data.messageId;
//once('value') useful for data that only needs to be loaded once and isn't expected to change frequently or require active listening
const myMessages = (await admin.database().ref('/users/' + userId).once('value')).val().myMessages;
if(!myMessages || !myMessages.length) {
//nothing to do, myMessages array is undefined or empty
return;
}
var index = myMessages.indexOf(messageId);
if (index === -1) {
//nothing to delete, messageId is not in myMessages
return;
}
//splice returns the removed element, which we do not need
myMessages.splice(index, 1);
const vals = {
'myMessages': myMessages
};
await admin.database().ref('/users/' + userId).update(vals);
});
If using Cloud Firestore can add an event listener on the document being deleted to handle cleanup in your user document:
exports.onDeletedMessage = functions.firestore.document('Message/{messageId}').onDelete(async event => {
const data = event.data();
if (!data) {
return;
}
const userId = data.userId; //hopefully you have this in the message document
const messageId = data.messageId;
//now you can do clean up for the /user/{userId} document like removing the messageId from myMessages property
const userSnapShot = (await admin.firestore().collection('users').doc(userId).get()).data();
if(!userSnapShot.myMessages || !userSnapShot.myMessages.length) {
//nothing to do, myMessages array is undefined or empty
return;
}
var index = userSnapShot.myMessages.indexOf(messageId);
if (index === -1) {
//nothing to delete, messageId is not in myMessages
return;
}
//splice returns the removed element, which we do not need
userSnapShot.myMessages.splice(index, 1);
const vals = {
'myMessages': userSnapShot.myMessages
};
//To update some fields of a document without overwriting the entire document, use the update() method
await admin.firestore().collection('users').doc(userId).update(vals);
});

Firebase Cloud function update all entries in database

I have a bunch of comments in a Firebase database and I want to do some updates to the comments via a Cloud Function (this is a simplified example; I will be doing some logic which does require a Cloud Function).
What i need to do is go through all the comments in the database, adjust its rating node and then update the database with adjusted comments.
I spent a lot of time researching this, but I am completely new to Cloud Functions, so I'm having a really hard time figuring this out.
I am assuming I want to store all the changes to all the comments (there can be thousands of them) in an array or object and then do the update in one go instead of for each comment separately?
By the way, this code is not working; I assume the array and the return are completely wrong.
exports.increaseRating = functions.database.ref('/comments/')
.onUpdate((snapshot) => {
var updates = [];
snapshot.before.forEach((element) => {
var comment = element.val();
comment.rating += 1000;
updates.push(comment);
});
return updates;
})
This is the code I am using to update one entry. I need to do the same thing for all the comments at one time.
exports.increaseRating = functions.database.ref('/comments/{commentId}')
.onUpdate((snapshot, context) => {
const comment = snapshot.before.val();
const newRating = comment.rating += 1000;
const now = new Date().getTime();
if (comment.lastUpdate) {
if (comment.lastUpdate > now - (30 * 1000)) {
return null;
}
}
return admin.database().ref(`/comments/${context.params.commentId}`).update({
"rating": newRating,
"lastUpdate": now
})
})
If you want to update all child nodes, you can do something like this:
var ref = firebase.database().ref("comments"); // or admin.database().ref("comments")
ref.once("value").then((snapshot) => {
var updates = {};
snapshot.forEach((commentSnapshot) => {
var comment = commentSnapshot.val();
var newRating = comment.rating + 1000;
updates[commentSnapshot.key+"/rating"] = newRating;
});
ref.update(updates);
})
This performs a single multi-location update for all comments. Note that the performance benefit over performing separate updates is quite small, since Firebase pipelines the multiple requests over a single connection.
Also note that you should not put this in a Cloud Functions trigger on /comments, since that will lead to an endless loop: every time the comments get written, your function triggers, which updates the comments, which triggers the function again.
If you need this in Cloud Functions, you'll want to use a HTTP-triggered function, which is triggered by HTTP calls instead of database writes.
exports.updateCommentRatings = functions.https.onRequest((req, res) => {
var ref = admin.database().ref("comments")
ref.once("value").then((snapshot) => {
var updates = {};
snapshot.forEach((commentSnapshot) => {
var comment = commentSnapshot.val();
var newRating = comment.rating + 1000;
updates[commentSnapshot.key+"/rating"] = newRating;
});
ref.update(updates).then(() => {
res.status(200).send("Comment ratings updated");
});
})
})
You can then periodically call this URL/function with a service like cron-job.org. For more on this see Cloud Functions for Firebase trigger on time?.

Detect changes in one ref and write in another Firebase Cloud

I'm trying Firebase Cloud Functions in my app using this code to remove data after 2 hours of created.
exports.deleteOldItems = functions.database.ref('/Rooms/{pushId}')
.onWrite(event => {
var ref = event.data.ref.parent; // reference to the items
var now = Date.now();
var cutoff = now - 2 * 60 * 60 * 1000;
var oldItemsQuery = ref.orderByChild('timestampCreated/timestamp').endAt(cutoff);
return oldItemsQuery.once('value', function(snapshot) {
// create a map with all children that need to be removed
var updates = {};
snapshot.forEach(function(child) {
updates[child.key] = null
});
// execute all updates in one go and return the result to end the function
return ref.update(updates);
});
});
This works. Now I want to write in another ref (for example: /Users/{userID}/) every time data is deleted. Regards
Depending on whether you want the update to run as the current user or as an administrator, you can use event.data.ref or event.data.adminRef and work from there:
exports.deleteOldItems = functions.database.ref('/Rooms/{pushId}')
.onWrite(event => {
...
var ref = event.data.ref.root;
return ref.child("/Users/123").set("New value");
});
Things have changed in version 1.0: adminRef is deprecated (you should just use ref for admin access) and event has been replaced by snapshot and context; see the Cloud Functions 1.0 API changes documentation.
Frank's example becomes:
exports.deleteOldItems = functions.database.ref('/Rooms/{pushId}')
.onWrite((snapshot,context) => {
...
var ref = snapshot.ref.root;
return ref.child("/Users/123").set("New value");
});

Delete firebase data older than 2 hours

I would like to delete data that is older than two hours. Currently, on the client-side, I loop through all the data and run a delete on the outdated data. When I do this, the db.on('value') function is invoked every time something is deleted. Also, things will only be deleted when a client connects, and what might happen if two clients connect at once?
Where can I set up something that deletes old data? I have a timestamp inside each object created by a JavaScript Date.now().
Firebase does not support queries with a dynamic parameter, such as "two hours ago". It can however execute a query for a specific value, such as "after August 14 2015, 7:27:32 AM".
That means that you can run a snippet of code periodically to clean up items that are older than 2 hours at that time:
var ref = firebase.database().ref('/path/to/items/');
var now = Date.now();
var cutoff = now - 2 * 60 * 60 * 1000;
var old = ref.orderByChild('timestamp').endAt(cutoff).limitToLast(1);
var listener = old.on('child_added', function(snapshot) {
snapshot.ref.remove();
});
As you'll note I use child_added instead of value, and I limitToLast(1). As I delete each child, Firebase will fire a child_added for the new "last" item until there are no more items after the cutoff point.
Update: if you want to run this code in Cloud Functions for Firebase:
exports.deleteOldItems = functions.database.ref('/path/to/items/{pushId}')
.onWrite((change, context) => {
var ref = change.after.ref.parent; // reference to the items
var now = Date.now();
var cutoff = now - 2 * 60 * 60 * 1000;
var oldItemsQuery = ref.orderByChild('timestamp').endAt(cutoff);
return oldItemsQuery.once('value', function(snapshot) {
// create a map with all children that need to be removed
var updates = {};
snapshot.forEach(function(child) {
updates[child.key] = null
});
// execute all updates in one go and return the result to end the function
return ref.update(updates);
});
});
This function triggers whenever data is written under /path/to/items, so child nodes will only be deleted when data is being modified.
This code is now also available in the functions-samples repo.
I have an HTTP-triggered Cloud Function that deletes nodes, depending on when they were created and their expiration date.
When I add a node to the database, it needs two fields: timestamp to know when it was created, and duration to know when the offer must expire.
Then, I have this http triggered cloud function:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
/**
* @function HTTP trigger that, when triggered by a request, checks every message of the database to delete the expired ones.
* @type {HttpsFunction}
*/
exports.removeOldMessages = functions.https.onRequest((req, res) => {
const timeNow = Date.now();
const messagesRef = admin.database().ref('/messages');
return messagesRef.once('value').then((snapshot) => {
const deletions = [];
snapshot.forEach((child) => {
if ((Number(child.val()['timestamp']) + Number(child.val()['duration'])) <= timeNow) {
deletions.push(child.ref.set(null)); // remove expired message
}
});
return Promise.all(deletions);
}).then(() => res.status(200).end());
});
You can create a cron job that every X minutes makes a request to the URL of that function: https://cron-job.org/en/
But I prefer to run my own script, that makes a request every 10 seconds:
watch -n10 curl -X GET https://(your-zone)-(your-project-id).cloudfunctions.net/removeOldMessages
In the latest version of the Firebase API, ref() has changed to ref:
var ref = new Firebase('https://yours.firebaseio.com/path/to/items/');
var now = Date.now();
var cutoff = now - 2 * 60 * 60 * 1000;
var old = ref.orderByChild('timestamp').endAt(cutoff).limitToLast(1);
var listener = old.on('child_added', function(snapshot) {
snapshot.ref.remove();
});
In case someone has the same problem but in Firestore: I made a little script that first logs the documents to the console and then deletes the documents older than 24h from a messages collection. Use https://cron-job.org/en/ to hit the page every 24h and that's it. The code is below.
var yesterday = firebase.firestore.Timestamp.fromMillis(Date.now() - 24 * 60 * 60 * 1000);
console.log("Test");
db.collection("messages").where("date",">",yesterday)
.get().then(function(querySnapshote) {
querySnapshote.forEach(function(doc) {
console.log(doc.id," => ",doc.data());
});
})
.catch(function(error) {
console.log("Error getting documents: ", error);
});
db.collection("messages").where("date","<",yesterday)
.get().then(function(querySnapshote) {
querySnapshote.forEach(element => {
element.ref.delete();
});
})
You could look into Scheduling Firebase Functions with Cron Jobs. That link shows you how to schedule a Firebase Cloud Function to run at a fixed rate. In the scheduled Firebase Function you could use the other answers in this thread to query for old data and remove it.
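For instance, a scheduled function combining that approach with the query from the accepted answer might look roughly like this (an untested sketch; the path and field names mirror the examples above):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.cleanupOldItems = functions.pubsub.schedule('every 60 minutes').onRun(async () => {
  const cutoff = Date.now() - 2 * 60 * 60 * 1000; // two hours ago
  const ref = admin.database().ref('/path/to/items');
  const snapshot = await ref.orderByChild('timestamp').endAt(cutoff).once('value');
  const updates = {};
  snapshot.forEach((child) => {
    updates[child.key] = null; // mark each expired child for deletion
  });
  return ref.update(updates);
});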
