Can't change data in Firebase Realtime Database when using Cloud Functions - javascript

I want to reset a specific value in my Firebase Realtime Database every day at 12:00 AM. I'm using Firebase Cloud Functions to do this. This is the code that I have:
exports.dailyReset = functions.pubsub.schedule('0 0 * * *').onRun((context) => {
  exports.resetValue = functions.database.ref('/users/{userId}')
    .onWrite((change, context) => {
      var ref = change.data.ref;
      ref.update({
        "daily": 0
      });
      return ref.update({"daily": 0});
    });
  return null;
});
Upon testing, the script doesn't work: it's not resetting the value in my Firebase Realtime Database. Can someone tell me how to fix this?

It's not possible to use the Cloud Functions SDK to write to the database. The functions SDK can only be used to establish triggers that run when the triggering event occurs.
In other words, functions.database.ref('/users/{userId}').onWrite() isn't going to write anything at all.
If you want to write to the Realtime Database from a Node.js program, you should use the Firebase Admin SDK to write data.

The Cloud Functions triggers in your index.js file have to be known and fixed when you run firebase deploy. That means you can't dynamically create triggers from inside a Cloud Function, as you're trying to do.
The common approaches for dynamic scheduling are:
Have a single Cloud Function that runs periodically and then executes the tasks for the past time period.
Dynamically schedule Cloud Functions with Cloud Tasks, as Doug describes in his blog post How to schedule a Cloud Function to run in the future with Cloud Tasks (to build a Firestore document TTL).
But in your case, why do you even need the onWrite trigger? Can't you just import the Admin SDK, use that to read all users, and then reset their daily values?
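For reference, here is a minimal sketch of that approach, assuming the data lives under /users/{userId} with a daily field as in the question, and that the Admin SDK is initialized in the same index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.dailyReset = functions.pubsub.schedule('0 0 * * *').onRun(async (context) => {
  // Read all users once via the Admin SDK, then reset every "daily" field in one multi-path update.
  const snapshot = await admin.database().ref('/users').once('value');
  const updates = {};
  snapshot.forEach((child) => {
    updates[`${child.key}/daily`] = 0;
  });
  return admin.database().ref('/users').update(updates);
});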

Related

Issue using Cloud Functions to sync realtime database with Firestore

I'm using an extremely simple Cloud Function to (try to) keep my realtime database in sync with the Firestore.
exports.copyDocument = functions.database.ref('/invoices/{companyId}/{documentId}')
  .onWrite((change, context) => {
    if (!change.after.exists()) {
      return null;
    }
    return admin.firestore().collection('companies').doc(context.params.companyId)
      .collection('invoices')
      .doc(context.params.documentId)
      .set(change.after.val());
  });
Unfortunately, I am seeing issues where sometimes the Cloud Firestore document does not have the latest copy of the realtime DB data. It's infrequent, but nonetheless impacting my end users.
Why is this happening?
Two ideas I had were:
1. Firestore is possibly unavailable to write to from the Cloud Function, and I don't have retries enabled, so it could just be erroring out. Enabling retries would solve my problem. However, I have absolutely zero error logs of this happening.
2. User makes change A, then makes change B 2-3 seconds later. In this case it's possible, I guess, that the Cloud Function trigger for change A hasn't executed yet, the trigger for change B quickly executes, and then the change A trigger runs and copies 'stale' data. Possible remedies would be to fetch the latest version again from the Realtime Database in my Cloud Function (not ideal, it's nice using change.after.val()), or perhaps to keep an incrementing integer on my document and copy it to Firestore using a transaction that compares versions instead of a simple set(), as sketched below.
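A rough sketch of that second remedy, placed inside the onWrite handler above, might look like the following. It assumes each Realtime Database record carries a numeric version field that clients increment on every write; that field is an assumption, not something in the code above:
return admin.firestore().runTransaction(async (tx) => {
  const docRef = admin.firestore()
    .collection('companies').doc(context.params.companyId)
    .collection('invoices').doc(context.params.documentId);
  const existing = await tx.get(docRef);
  const incoming = change.after.val();
  // Only overwrite the Firestore copy if the incoming data is at least as new as what is stored.
  if (!existing.exists || (existing.data().version || 0) <= incoming.version) {
    tx.set(docRef, incoming);
  }
});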
The only error message I do see in my Cloud Function error logs is:
The request was aborted because there was no available instance. Additional troubleshooting documentation can be found at: https://cloud.google.com/functions/docs/troubleshooting#scalability
But those docs indicate that this error is always retried, even without explicitly enabling function retries.
I am leaning towards idea 1: Firestore just 'blips' sometimes, retries aren't enabled, and I'm not logging the failed promise properly. If this is the case, how can I fix this logging?
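If it is indeed the logging, one minimal sketch (not a definitive fix) is to attach a catch that logs and rethrows, so a failed write shows up in the function logs and still marks the invocation as failed:
return admin.firestore().collection('companies').doc(context.params.companyId)
  .collection('invoices').doc(context.params.documentId)
  .set(change.after.val())
  .catch((err) => {
    console.error('Failed to copy invoice', context.params.documentId, err);
    throw err;
  });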

Best way to have a Node.js server keep updated with a Firebase database in real time?

I currently have a Node.js server set up that is able to read and write data from a Firebase database when a request is made by a user.
I would like to implement time based events that result in an action being performed at a certain date or time. The key thing here though, is that I want to have the freedom to do this in seconds (for example, write a message to console after 30 seconds have passed, or on Friday the 13th at 11:30am).
A way to do this would be to store the date/time an action needs to be performed at in the database, then read from the database every second and compare the current date/time with the stored events so we know if an action needs to be performed at this moment. As you can imagine, though, this would be a lot of unnecessary calls to the database, and it really feels like a poor way to implement this system.
Is there a way I can stay synced with the database without having to call every second? Perhaps I could then store a version of the events table locally and update this when a change is made to the database? Would that be a better idea? Is there another solution I am missing?
Any advice would be greatly appreciated, thanks!
EDIT:
How I currently initialise the database:
firebase.initializeApp(firebaseConfig);
var database = firebase.database();
How I then get data from the database:
await database.ref('/').once('value', function(snapshot) {
  snapshot.forEach(function(childSnapshot) {
    if (childSnapshot.key === userName) {
      userPreferences = childSnapshot.val().UserPreferences;
    }
  });
});
The Firebase once() API reads the data from the database once, and then stops observing it.
If you instead use the on() API, it will continue observing the database after getting the initial value, and call your code whenever the database changes.
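For example, a minimal sketch of the same read from the question rewritten with on() (assuming the same database and userName variables) would be:
database.ref('/').on('value', function(snapshot) {
  snapshot.forEach(function(childSnapshot) {
    if (childSnapshot.key === userName) {
      // This callback now re-runs every time the data under '/' changes.
      userPreferences = childSnapshot.val().UserPreferences;
    }
  });
});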
It sounds like you're looking to develop an application for scheduling. If that's the case you should check out node-schedule.
Node Schedule is a flexible cron-like and not-cron-like job scheduler for Node.js. It allows you to schedule jobs (arbitrary functions) for execution at specific dates, with optional recurrence rules. It only uses a single timer at any given time (rather than reevaluating upcoming jobs every second/minute).
You can then use the database to keep the "state" of the application, so on start-up you read all the upcoming jobs, load them into node-schedule, and let node-schedule do the rest.
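A minimal sketch of that start-up step, assuming the events are stored under a hypothetical /events path with a runAt millisecond timestamp on each one:
const schedule = require('node-schedule');

// On start-up, load the stored events once and register each with node-schedule.
database.ref('/events').once('value').then(function(eventsSnapshot) {
  eventsSnapshot.forEach(function(eventSnapshot) {
    const event = eventSnapshot.val();
    schedule.scheduleJob(new Date(event.runAt), function() {
      console.log('Running scheduled event', eventSnapshot.key);
    });
  });
});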
The Google Cloud solution for scheduling a single item of future work is Cloud Tasks. Firebase is part of Google Cloud, so this is the most natural product to use. You can use this to avoid polling the database by simply specifying exactly when some Cloud Function should run to do the work you want.
I've written a blog post that demonstrates how to set up a Cloud Task to call a Cloud Function to delete a document in Firestore with an exact TTL.

Is there a way to set up a self-destructing value on Firebase Realtime Database or Firestore? [duplicate]

I am looking for a way to schedule Cloud Functions for Firebase or in other words trigger them on a specific time.
Update 2019-04-18
There is now a very simple way to deploy scheduled code on Cloud Functions through Firebase.
You can either use a simple text syntax:
export const scheduledFunctionPlainEnglish =
  functions.pubsub.schedule('every 5 minutes').onRun((context) => {
    console.log('This will be run every 5 minutes!');
  });
Or the more flexible cron table format:
export const scheduledFunctionCrontab =
  functions.pubsub.schedule('5 11 * * *').onRun((context) => {
    console.log('This will be run every day at 11:05 AM UTC!');
  });
To learn more about this, see:
The Scheduling Cloud Functions for Firebase blog post introducing the feature.
The documentation on scheduled functions.
Note that your project needs to be on a Blaze plan for this to work, so I'm leaving the alternative options below for reference.
If you want to schedule a single invocation of a Cloud Function on a delay from within the execution of another trigger, you can use Cloud Tasks to set that up. Read this article for an extended example of how that can work.
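As a rough illustration of that pattern (not taken from the article), a Cloud Tasks client can enqueue an HTTP call to a function at a chosen time; the queue name, region, and target function URL below are hypothetical:
const {CloudTasksClient} = require('@google-cloud/tasks');
const tasksClient = new CloudTasksClient();

async function scheduleCallback(projectId, delaySeconds) {
  // 'my-queue' and the callback URL are placeholders for your own queue and HTTP function.
  const parent = tasksClient.queuePath(projectId, 'us-central1', 'my-queue');
  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url: `https://us-central1-${projectId}.cloudfunctions.net/myCallback`,
    },
    scheduleTime: {seconds: Math.floor(Date.now() / 1000) + delaySeconds},
  };
  const [response] = await tasksClient.createTask({parent, task});
  return response.name;
}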
Original answer below...
There is no built-in runat/cron type trigger yet.
For the moment, the best option is to use an external service to trigger an HTTP function periodically. See this sample in the functions-samples repo for more information. Or use the recently introduced Google Cloud Scheduler to trigger Cloud Functions through Pub/Sub or HTTPS.
I also highly recommend reading this post on the Firebase blog: How to Schedule (Cron) Jobs with Cloud Functions for Firebase and this video: Timing Cloud Functions for Firebase using an HTTP Trigger and Cron.
That last link uses cron-job.org to trigger Cloud Functions, and works for projects that are on a free plan. Note that this allows anyone to call your function without authorization, so you may want to include some abuse protection mechanism in the code itself.
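A minimal sketch of such a protection mechanism, using a hypothetical CRON_SECRET environment variable as a shared secret:
exports.periodicTask = functions.https.onRequest((req, res) => {
  // Reject callers that don't present the shared secret (CRON_SECRET is an assumed env var).
  if (req.query.key !== process.env.CRON_SECRET) {
    res.status(403).send('Forbidden');
    return;
  }
  // ... do the scheduled work here ...
  res.status(200).send('OK');
});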
What you can do is spin up an App Engine instance that is triggered by a cron job and emits to Pub/Sub. I wrote a blog post specifically on that; you might want to take a look:
https://mhaligowski.github.io/blog/2017/05/25/scheduled-cloud-function-execution.html
It is important to first note that, according to the documentation, the default timezone your functions will execute in is America/Los_Angeles. You may find a list of timezones here if you'd like to trigger your function(s) in a different timezone.
NB: here's a useful website to assist with cron table formats.
Here's how you'd go about it:
(Assuming you'd like to use Africa/Johannesburg as your timezone)
export const executeFunction = functions.pubsub.schedule("10 23 * * *")
  .timeZone('Africa/Johannesburg').onRun(() => {
    console.log("successfully executed at 23:10 Johannesburg Time!!");
  });
Otherwise if you'd rather stick to the default:
export const executeFunction = functions.pubsub.schedule("10 23 * * *")
  .onRun(() => {
    console.log("successfully executed at 23:10 Los Angeles Time!!");
  });

Does Cloud Spanner not manage sessions properly?

I have looked up this issue but could not find sufficient information about it.
The Google Cloud Spanner client libraries handle sessions automatically, and the limit is 10,000 sessions per node; no problem so far.
I have a microservice-based application which also has Google Cloud Functions. I am doing some specific database jobs in Cloud Functions and I'm also calling those functions continuously. After a little while, Cloud Spanner starts to throw an error:
Too many active sessions in database, limit is 10000. Increase the node count to allow more sessions.
I know about the limits, but there is no operation that should cause my app to exceed those limits.
After I noticed this, I have two questions for which I could not find any answer:
1- Does Cloud Functions create a new session for every call? (I am using an HTTP trigger.)
Here is what I have done so far:
1- Here is an example Cloud Function declaration of mine:
exports.myFunction = function myFunction(req, res) {}
I was declaring my database instance outside of this scope before I realized this issue:
const db = Spanner({projectId: '[my-project]'}).instance('[my-cs-instance]').database('[my-database]');
exports.myFunction = function myFunction(req, res) {}
After this issue, I put it inside the function scope like this, and closed the database connection when I'm done:
exports.myFunction = function myFunction(req, res) {
  const db = Spanner({projectId: '[my-project]'}).instance('[my-cs-instance]').database('[my-database]');
  // codes
  db.close();
}
That didn't change anything; it still exceeds the session limit after a while.
Do you have any experience of what causes this? Is this related to Cloud Functions or to Cloud Spanner itself?
2- If every transaction object uses one connection at a time, what happens in this scenario?
I have a REST endpoint other than these Cloud Functions. It creates a database instance when it starts listening on HTTP endpoints, and I am not creating any other instance during its lifecycle. At that endpoint I am doing CRUD operations, and I am using transactions; they all use the same instance which I created at the start of the process. My experience is:
Sometimes transactions or other CRUD operations work with a bit of delay, which does not happen all the time.
My question is:
Is that because, when a transaction starts, it locks the connection and all other operations have to wait until it ends? If so, should I create independent database instances for transactions on that endpoint?
Thanks in advance
This has now been fixed per the issue opened at #89 and the fix at #91, and logged as #71987137 in the Google Issue Tracker.
If any issue persists, please report it at the Google Issue Tracker and they will re-open it to examine.

