Cloud Functions - How to instantiate global functions/variables only once? - javascript

I have a Firebase application that uses Cloud Functions to talk to a Google Cloud SQL instance. These Cloud Functions are used to perform CRUD actions. I would like the database to always match what the function code expects, so I want to run migration code every time I push new function code.
I do this with an IIFE in the global scope:
const functions = require('firebase-functions');
const pg = require('pg');

// Create database if it does not exist
(function() {
  console.log('create db...');
})();
exports.helloWorld = functions.https.onRequest((request, response) => {
  console.log('Hello from Firebase function log!');
  response.send('Hello from Firebase!');
});

exports.helloWorld2 = functions.https.onRequest((request, response) => {
  console.log('Hello from Firebase function log 2!');
  response.send('Hello from Firebase 2!');
});
This console log then runs twice when I deploy.
Now I understand that there is no way of knowing how many instances Cloud Functions will spin up for the functions, as stated in their docs:
"The global scope in the function file, which is expected to contain the function definition, is executed on every cold start, but not if the instance has already been initialized."
If I add a third function, this console log is shown 3 times in the logs instead of 2, once for each function. Would I be correct in saying that a new instance is spun up for every single function deployed? I am trying to understand what happens under the hood when I upload a set of Cloud Functions.
If so, is there no reliable way to run migration code in the global scope of Cloud Functions?

What you're doing isn't a supported use case for Cloud Functions. Cloud Functions code runs in response to events that occur in your project. There are no "one-time" function invocations that happen on deployment. If you need to run code a single time, just run it from your desktop or some other server you control.
You should also strive to minimize the amount of work that happens in the global scope of your functions. Globals will be instantiated and run once for each allocated server instance running a function in your app, as each function runs in full isolation from the others, and each has its own copy of everything. Watch my video about function scaling and isolation to better understand this behavior.
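A minimal sketch of that advice (not from the answer), assuming a hypothetical pg connection pool as the expensive global; the lazy getter means instances that only ever serve helloWorld2 never pay for it:

const functions = require('firebase-functions');

// Lazily create the pool on first use instead of in global scope.
let pool;
function getPool() {
  if (!pool) {
    const { Pool } = require('pg');
    pool = new Pool(); // connection settings come from PG* environment variables
  }
  return pool;
}

exports.helloWorld = functions.https.onRequest(async (request, response) => {
  const { rows } = await getPool().query('SELECT NOW()');
  response.send(`Hello from Firebase at ${rows[0].now}!`);
});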

Related

Can't change data in Firebase Realtime Database when using Cloud Functions

I want to reset a specific value in my Firebase Realtime Database every day at 12:00 AM. I'm using Firebase Cloud Functions to do this. This is the code that I have:
exports.dailyReset = functions.pubsub.schedule('0 0 * * *').onRun((context) => {
  exports.resetValue = functions.database.ref('/users/{userId}')
    .onWrite((change, context) => {
      var ref = change.data.ref;
      ref.update({ "daily": 0 });
      return ref.update({ "daily": 0 });
    });
  return null;
});
Upon testing the script, it doesn't work. It's not resetting the value in my Firebase Realtime Database. Can someone tell me how to fix this?
It's not possible to use the Functions SDK to write to the database. The functions SDK can only be used to establish triggers that run when the triggering event occurs.
In other words, functions.database.ref('/users/{userId}').onWrite() isn't going to write anything at all.
If you want to write to Realtime Database from a nodejs program, you should use the Firebase Admin SDK to write data.
The Cloud Functions triggers in your index.js file have to be known and fixed when you run firebase deploy. That means you can't dynamically create triggers from inside a Cloud Function, as you're trying to do.
The common approaches for dynamic scheduling are:
Have a single Cloud Function that runs periodically and then executes the tasks for the past time period.
Dynamically schedule Cloud Functions with Cloud Tasks, as Doug describes in his blog post How to schedule a Cloud Function to run in the future with Cloud Tasks (to build a Firestore document TTL).
But in your case, why do you even need the onWrite trigger? Can't you just import the Admin SDK, use that to read all users, and then reset their values?
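A minimal sketch of that approach, assuming each user keeps its counter at /users/{userId}/daily as in the question (the multi-location update shape is an assumption):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.dailyReset = functions.pubsub.schedule('0 0 * * *').onRun(async (context) => {
  // Read all users once, then reset every daily counter in a single
  // multi-location update through the Admin SDK.
  const snapshot = await admin.database().ref('/users').once('value');
  const updates = {};
  snapshot.forEach((child) => {
    updates[`${child.key}/daily`] = 0;
  });
  return admin.database().ref('/users').update(updates);
});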

Memory Space issue while adding more images for generating word document using docx node library

const imageResponse = await axios.get(url[0], {
  responseType: "arraybuffer",
});
const buffer = Buffer.from(imageResponse.data, "utf-8");
const image = Media.addImage(doc, buffer);
I'm using the above code inside a loop that executes 100 times, because there are 100 images. Each image is at most 150 KB. I deployed the Cloud Function with 256 MB of memory, and I'm getting "Error: memory limit exceeded. Function invocation was interrupted".
Problem statement:
I need to add 250 images to a Word document, and I'm getting the memory limit exceeded error.
Q&A
Is there any way to fetch one image, add it to the Word document, and then clear the memory used by that image?
How can I effectively use this library in a Firebase Cloud Function with Cloud Storage for images?
Environment:
Firebase Cloud Function (NodeJs)
Memory: 256 MB
Word doc generating library: docx (https://docx.js.org/#/)
For the kind of scenario you are describing, as Doug mentions, you should consider increasing your resources to better handle the requests to your functions.
You can set the memory using the --memory flag of the gcloud command used to deploy your functions, for example:
gcloud beta functions deploy my_function --runtime=python37 --trigger-event=providers/cloud.firestore/eventTypes/document.write --trigger-resource=projects/project_id/databases/(default)/documents/messages/{pushId} --memory=AmountOfMemory
I recommend you take a look at the best practices for Cloud Functions document, which explains:
"Local disk storage in the temporary directory is an in-memory
filesystem. Files that you write consume memory available to your
function, and sometimes persist between invocations. Failing to
explicitly delete these files may eventually lead to an out-of-memory
error and a subsequent cold start."
For a better perspective on how Cloud Functions manages requests, check this document, which mentions:
"Cloud Functions handles incoming requests by assigning them to
instances of your function. Depending on the volume of requests, as
well as the number of existing function instances, Cloud Functions may
assign a request to an existing instance or create a new one
Each instance of a function handles only one concurrent request at a
time. This means that while your code is processing one request, there
is no possibility of a second request being routed to the same
instance. Thus the original request can use the full amount of
resources (CPU and memory) that you requested."
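If you deploy with the Firebase CLI rather than gcloud, the same memory setting can be declared in code with the firebase-functions runWith option; a sketch with an illustrative function name and values:

const functions = require('firebase-functions');

// Raise memory and timeout for the document-generation function only.
exports.generateDoc = functions
  .runWith({ memory: '1GB', timeoutSeconds: 300 })
  .https.onRequest(async (request, response) => {
    // ... fetch images and build the .docx here ...
    response.send('done');
  });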

How can I parameterize a Firebase API call within a closure?

I'm using Firebase Cloud Functions to sync my Firestore db to a 3rd party db. Since my Cloud Functions make calls to a 3rd party API, I want to parameterize the API call as an input so I can use dependency injection while testing. The only way I can think of doing this is to put the cloud function inside a regular function, i.e.:
function foo(apiCall = api) {
  exports.bar = functions.firestore
    .document(doc_name)
    .onCreate(snapshot => apiCall(snapshot));
  return exports.bar;
}
foo();
When I try to deploy with firebase deploy --only functions I'm told firebase doesn't see the functions as existing in the local source code.
I've additionally tried something like the following, to no effect:
function foo(apiCall = api) {
  return functions.firestore
    .document(doc_name)
    .onCreate(snapshot => apiCall(snapshot));
}
exports.bar = foo();
When I put exports.bar outside of the function, it deploys fine.
This is my first time using Firebase so I'm not too familiar with the syntax and such, but I don't know why wrapping the cloud function with a regular function wouldn't work - any suggestions?
What you're trying to do isn't possible. The exports must be defined statically at the top level of index.js so that the CLI can find and deploy them. They can't be exported dynamically through a function.
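A workaround consistent with that constraint is to keep the trigger static and inject the dependency through a separately exported handler factory; a sketch where the document path and the ./api module are hypothetical:

const functions = require('firebase-functions');
const { api } = require('./api'); // the real 3rd party call

// Tests can call makeHandler(fakeApi) and invoke the returned handler
// directly, without ever touching the deployed trigger.
function makeHandler(apiCall = api) {
  return (snapshot) => apiCall(snapshot);
}
exports.makeHandler = makeHandler;

// The trigger itself stays statically defined at the top level of index.js.
exports.bar = functions.firestore
  .document('collection/{docId}')
  .onCreate(makeHandler());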

Trying to understand Google Cloud Platform functions

I want to create a REST API for a mobile application.
I wanted to try GCP Functions to see if it could fit my needs.
Now I got some problems. I think I misunderstood something.
When I try a function locally with the firebase-tools, the server is recreated every time a request reaches the function instance. I thought the instance would keep my server alive for some time.
I know that each instance can only process one request at a time. But I worry that the time lost recreating the server on every request adds up.
I am sure there is something wrong in my understanding.
I just want to know how it works to make the best of it.
Thank you :)
Here is my main function with a nestjs server
import { NestFactory } from '@nestjs/core'
import { AppModule } from './module'
import { loggerMiddleware } from './middlewares/logger.middleware'
import { ExpressAdapter } from '@nestjs/platform-express'
import * as functions from 'firebase-functions'
import * as express from 'express'

const server: express.Application = express()

export async function createNestServer(server) {
  const app = await NestFactory.create(AppModule, new ExpressAdapter(server))
  app.use(loggerMiddleware)
  return app.init()
}

createNestServer(server)
  .then(() => console.log('Nest ok..'))
  .catch(error => console.log('Nest broken', error))

export const api = functions.https.onRequest(server)
Edit:
[Screenshot: initialization logs when a request reaches the GCP function instance]
As you can see, the function ends before the NestJS server finishes initializing, and this initialization happens on every request to the URL. Even once the NestJS server is up, its state is not kept for the next call.
If your function takes a long time to execute, then Cloud Functions will spin up a new instance whenever there is no existing instance idle and ready to accept the connection. Eventually, when your function finishes, the instance that handled it will go idle.
Code that runs in the global scope of your code will be executed as part of the first incoming request. It does not come "for free" in any way.
For HTTP type connections, the concurrent access limit is based on the bandwidth required by your function, not its compute time. You could have hundreds of instances concurrently operating on longer-running functions, as long as you're willing to pay the cost of each one.
It would be good to understand the documented limits and quotas of Cloud Functions.
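The log behavior above suggests requests can be served before the Nest initialization finishes. One way to avoid that, sketched against the question's own createNestServer helper (the promise-caching pattern is an assumption, not from the answer), is to cache the init promise in global scope and await it per request:

// Reuses `server` and `createNestServer` from the question's file.
let ready = null

export const api = functions.https.onRequest(async (request, response) => {
  // The first request on a cold instance kicks off initialization; every
  // request then awaits that same promise before being handled.
  if (!ready) {
    ready = createNestServer(server)
  }
  await ready
  server(request, response)
})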

Does not Cloud Spanner manage sessions properly?

I have looked this issue up but could not find sufficient information about it.
The Google Cloud Spanner client libraries handle sessions automatically, and the limit is 10,000 sessions per node; no problem so far.
I have a microserviced application which also has Google Cloud Functions. I am doing some specific database jobs in Cloud Functions, and I'm calling those functions continuously. After a little while, Cloud Spanner starts to throw an error:
Too many active sessions in database, limit is 10000. Increase the node count to allow more sessions.
I know about the limits, but there is no operation that should cause my app to exceed them.
After noticing this, I have two questions I could not find any answer to:
1- Does Cloud Functions create a new session for every call? (I am using an HTTP trigger)
Here is what I have done so far.
1- Here is an example Cloud Functions declaration of mine:
exports.myFunction = function myFunction(req, res) {}
I was declaring my database instance outside of this scope before I realized this issue:
const db = Spanner({projectId: '[my-project]'}).instance('[my-cs-instance]').database('[my-database]');
exports.myFunction = function myFunction(req, res) {}
After this issue, I put it inside the function scope like this, and closed the database after I was done:
exports.myFunction = function myFunction(req, res) {
  const db = Spanner({projectId: '[my-project]'}).instance('[my-cs-instance]').database('[my-database]');
  // codes
  db.close();
}
That didn't change anything; it still exceeds the session limit after a while.
Do you have any experience with what causes this? Is it related to Cloud Functions or to Cloud Spanner itself?
2- If every transaction object uses one connection at a time, what happens in this scenario?
I have a REST endpoint other than these Cloud Functions. It creates a database instance when it starts listening for HTTP requests, and I never create another instance during its lifecycle. At that endpoint, I perform CRUD operations and use transactions, and they all use the same instance created at the start of the process. My experience is:
Sometimes transactions or other CRUD operations run with a slight delay, which does not happen all the time.
My question is:
Is that because a transaction locks the connection while it runs, so all other operations have to wait until it ends? If so, should I create independent database instances for transactions on that endpoint?
Thanks in advance
This has now been fixed per the issue opened at #89 and the fix at #91, and it is logged as #71987137 in the Google Issue Tracker.
If any issue persists, please report it at the Google issue tracker and they will re-open it to examine.
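Independent of that client library fix, the general guidance from the earlier answers applies here too: create the Spanner client once in global scope so warm instances reuse the same session pool instead of opening and closing sessions on every invocation. A minimal sketch with hypothetical project, instance, and function names:

const { Spanner } = require('@google-cloud/spanner');

// Created once per instance; warm invocations reuse the same session pool.
const spanner = new Spanner({ projectId: 'my-project' });
const db = spanner.instance('my-cs-instance').database('my-database');

exports.myFunction = async (req, res) => {
  const [rows] = await db.run({ sql: 'SELECT 1' });
  res.json(rows);
};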
