Where should I declare a new Firebase database instance? - javascript

When using const db = firebase.database(), does it matter where I declare this in a Cloud Functions script?
For example, in index.ts, which contains all of my cloud functions, should I declare it at the top or in each individual function?
const db = firebase.database()
export const functionA = functions.https.onCall(async (data, context) => {
// use db here
});
export const functionB = functions.https.onCall(async (data, context) => {
// use db here
});
OR
export const functionA = functions.https.onCall(async (data, context) => {
const db = firebase.database()
});
export const functionB = functions.https.onCall(async (data, context) => {
const db = firebase.database()
});
Or does this not matter?

The first approach creates the db instance when the code loads.
The second approach creates the db instance when the code runs.
Neither is inherently better than the other, and the firebase.database() operation is very lightweight, so it's likely to make little difference in practice.
What does make a difference is whether you load the database SDK to begin with. Some of the SDKs in Firebase are quite big, and not each Cloud Function needs all SDKs. So a common trick to speed up load/cold-start times is to move the require('firebase-...') statements into the body of the Cloud Function(s) that require them.
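For example, the lazy-loading pattern looks like this in plain Node (the module name and handler are illustrative, and a stand-in object replaces the real require so the sketch is self-contained):

```javascript
// Sketch: instead of requiring a heavy SDK at the top of index.js,
//   const heavy = require('some-heavy-sdk'); // paid on every cold start
// require it inside the one function that needs it, and cache the result.

let heavySdk = null; // cached after first use
let loadCount = 0;   // only here to demonstrate the module loads once

function getHeavySdk() {
  if (!heavySdk) {
    loadCount += 1;
    // heavySdk = require('some-heavy-sdk'); // real code would require here
    heavySdk = { doWork: () => 'done' };     // stand-in so the sketch runs
  }
  return heavySdk;
}

// Only the handlers that actually call getHeavySdk() pay the loading cost,
// and repeated calls within a warm instance reuse the cached module:
const resultA = getHeavySdk().doWork();
const resultB = getHeavySdk().doWork();
```

Functions that never touch the SDK start faster, and warm invocations skip the require entirely.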

Normally, if you want to interact, from a Cloud Function, with the Realtime Database you just need to initialize the Admin SDK and get the Database service for the default app (or a given app), as explained here in the doc.
So you would do something like:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
// Get the Database service for the default app
const db = admin.database();
To get more details on what Frank explains about Cold Start in his answer, you should read the following article: https://medium.com/firebase-developers/organize-cloud-functions-for-max-cold-start-performance-and-readability-with-typescript-and-9261ee8450f0
However in your case, since your two Cloud Functions use the Admin SDK, there shouldn't be any difference between the two approaches presented in your question, as Frank mentioned.

Related

How can I invoke a firebase storage function locally and manually?

I am fairly familiar with invoking Firebase Firestore functions manually for testing. I have done this in the past with the docs here. In short, I make a wrappedOnWrite variable, assign it testEnv.wrap(my Firebase function), and then invoke it with an object that represents the Firestore collection before and after the call. Example:
const testEnv = functions({ projectId: **** }, '****');

let wrappedOnWrite:
  | WrappedScheduledFunction
  | WrappedFunction<Change<QueryDocumentSnapshot>>;

beforeAll(() => {
  // wrap the firebase function, ready to be invoked
  wrappedOnWrite = testEnv.wrap(moderateCompetition);
});

const changeDoc: Change<QueryDocumentSnapshot> = {
  before: testEnv.firestore.makeDocumentSnapshot(dataBefore, path),
  after: testEnv.firestore.makeDocumentSnapshot(dataAfter, path),
};

// call firebase function with the changes inputted to the function
await wrappedOnWrite(changeDoc);
This is great for testing my Firestore collections with Jest; however, I have never seen this done with Firebase Storage, and I haven't been able to find many useful resources either. I have a pretty basic .onFinalize function for Storage:
export const blurImages = functions.storage.object().onFinalize(async (object) => {});
Is there any way to manually test it like in the example I gave above? When I initially deployed the function, it ran recursively and charged my company a couple of dollars, and I need to be able to test it periodically so that the recursion doesn't happen in the live function. Thanks in advance :)
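One way to get there without deploying (a sketch; the handler internals, the 'blurred/' prefix, and the metadata fields below are assumptions, not your actual code): extract the handler body into a plain async function, then invoke it directly with a hand-built object, the same way wrappedOnWrite is invoked above. firebase-functions-test can also wrap storage-triggered functions (check your version's docs), but the plain-function route needs no emulation at all:

```javascript
// Handler logic pulled out into a plain async function so it can be
// invoked directly in tests (names and fields are illustrative).
async function handleFinalize(object) {
  // Guard against the recursive trigger: skip files we already produced.
  if (object.name && object.name.startsWith('blurred/')) {
    return { skipped: true };
  }
  // ...blur the image, upload it under 'blurred/' + object.name, etc.
  return { skipped: false, processed: object.name };
}

// In the real module, the export would just delegate to the plain function:
// exports.blurImages = functions.storage.object().onFinalize(handleFinalize);

// Manual invocation with hand-built metadata, no deploy needed:
async function demo() {
  const fresh = await handleFinalize({ name: 'uploads/cat.png', contentType: 'image/png' });
  const already = await handleFinalize({ name: 'blurred/uploads/cat.png', contentType: 'image/png' });
  return { fresh, already };
}
```

The guard clause is also what stops the runaway recursion in production: the function's own output no longer re-triggers real work.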

How do I create a javascript helper function / class that can perform an async task before each exposed method

I've got a React app that needs to call the AWS S3 API a couple times. Login for my app uses Cognito, set up via Amplify. It's all working fine, but I'm trying to clean up some code and am struggling with the right way to structure the code.
What I am currently doing is importing Auth from aws-amplify, as well as AWS from the aws-sdk.
This doesn't seem right, but it is the way I've solved it. Essentially, I had to create an async function to get the credentials in a format that works for the AWS SDK. That function looks like this:
const getCreds = async () => {
  let curCred = await Auth.currentCredentials()
  return Auth.essentialCredentials(curCred)
}
In a useEffect hook when the page loads, I am downloading a JSON file from S3. In order to do so, I have to do this:
const creds = await getCreds()
const myS3 = new AWS.S3({ credentials: creds })
const content = JSON.parse((await myS3.getObject(config).promise()).Body.toString('utf-8'))
//process content...
I have similar code in other parts of the app to do things like myS3.putObject or myS3.getSignedURL etc. But each time I need to get new creds and create a new S3 object.
What I'd like to do is store all of that in a file, so I could import myS3 from './myS3.js', then create some help methods like
...
get = async (config) => JSON.parse((await S3.getObject(config).promise()).Body.toString('utf-8'))
put = ...
sign = ...
and ideally create it in a way that when I call the get method it automatically refreshes the creds, then runs the get etc.
I'm just going in circles with exports vs. classes vs. functions. Is there a right / sane way to do this?
You can create a file with exported helper functions like this:
export const get = async (config) =>
  JSON.parse((await S3.getObject(config).promise()).Body.toString('utf-8'));
export const put = () => console.log('put');
export const sign = () => console.log('sign');
And use them like this on your components:
import {get, put, sign} from './myS3';
get();
put();
sign();
Or:
import * as s3 from './myS3';
s3.get();
s3.put();
s3.sign();
Turns out what I was trying to do was create a Singleton.
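To make that concrete, here is a minimal sketch of such a singleton module in plain JavaScript, with Auth and S3 stubbed out so it runs standalone (in the real module they would come from aws-amplify and aws-sdk). Each helper rebuilds the client from fresh credentials before delegating, which is the "refresh then run" behavior asked about:

```javascript
// myS3.js (sketch) -- stubs stand in for aws-amplify's Auth and aws-sdk's S3.
const Auth = {
  currentCredentials: async () => ({ accessKeyId: 'stub' }), // stub
  essentialCredentials: (c) => c,                            // stub
};
function S3(opts) { // stub for: new AWS.S3({ credentials })
  this.getObject = (config) => ({
    promise: async () => ({ Body: Buffer.from(JSON.stringify({ hello: 'world' })) }),
  });
}

const getCreds = async () => Auth.essentialCredentials(await Auth.currentCredentials());

// Build the client from fresh credentials on every call, so they are never stale:
const client = async () => new S3({ credentials: await getCreds() });

const get = async (config) =>
  JSON.parse((await (await client()).getObject(config).promise()).Body.toString('utf-8'));

// module.exports = { get, put, sign }; // the real module would export the helpers
```

Because Node caches modules, every `import { get } from './myS3'` across the app shares this one instance, which is the singleton property that was wanted.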

How to update a document when a document is added to a specified collection in cloud firestore?

I have 2 collections in my Firestore (global and local), and when I add a doc to local I need to increment a field in the global doc by 1.
Below is the code I have for this. I am very new to this, so I might have some syntactical errors as well; please do highlight any you find.
const functions = require("firebase-functions");
const admin = require("firebase-admin");

exports.helloWorld = functions.https.onRequest((request, response) => {
  response.send("Hello world");
}); // For testing, even this is not being deployed

exports.updateGlobal = functions.firestore
  .document("/local/{id}")
  .onCreate((snapshot, context) => {
    console.log(snapshot.data());
    return admin
      .firebase()
      .doc("global/{id}")
      .update({
        total: admin.firestore.FieldValue.increment(1),
      });
  });
The Terminal says "function failed on loading user code"
Before this, it showed something along the lines of "admin is undefined" or "cannot access firestore of undefined" which I'm unable to replicate now.
This is part of a React app which has normal Firestore access working through the firebase npm module.
If any other info regarding the issue is needed, I'll edit the question accordingly; thank you so much for the help.
In addition to loading the firebase-functions and firebase-admin modules, you need to initialize an admin app instance from which Cloud Firestore changes can be made, as follows:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
//...
I see two other problems in your CF. First, admin.firebase() is not a function; to get the Firestore service you call admin.firestore(). Second, you need to use the context object to get the value of id.
exports.updateGlobal = functions.firestore
  .document("/local/{id}")
  .onCreate((snapshot, context) => {
    const docId = context.params.id;
    return admin
      .firestore()
      .doc("global/" + docId)
      .update({
        total: admin.firestore.FieldValue.increment(1),
      });
  });
You can also use template literals as follows:
return admin
  .firestore()
  .doc(`global/${docId}`)
//...

Access Firebase Child Node in Node.js - Firebase Cloud Functions

Here is how my Firebase Schema is laid out:
I am able to index everything except _geoloc: into my Algolia Index with this code:
'use strict';
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
// Authenticate to Algolia Database.
// TODO: Make sure you configure the `algolia.app_id` and `algolia.api_key` Google Cloud environment variables.
const algoliasearch = require('algoliasearch');
const client = algoliasearch(functions.config().algolia.app_id, functions.config().algolia.api_key);
// Name of the Algolia index for content.
const ALGOLIA_POSTS_INDEX_NAME = 'businessSearch';
exports.indexentry = functions.database.ref('/businessSearch/{uid}/').onWrite(event => {
  const index = client.initIndex(ALGOLIA_POSTS_INDEX_NAME);
  const firebaseObject = Object.assign({}, event.data.val(), {
    objectID: event.params.uid,
  });
  return index.saveObject(firebaseObject); // .then or .catch as well
});
How can I index the _geoloc: child node into my Algolia Index with Node.js?
I heard somewhere that it is not possible to index a nested object with Object.assign, but I just wanted to see if it was.
There was a timing issue between when the database was written and what got indexed by Algolia. The code included in my question does work with nested objects.
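For what it's worth, Object.assign itself handles nested objects fine: it copies top-level properties, so a nested child like _geoloc comes across as a reference to the same object (the sample record below is illustrative):

```javascript
// Object.assign copies top-level keys; nested objects are carried over by reference.
const record = { name: 'Some Business', _geoloc: { lat: 40.7, lng: -74.0 } };
const indexed = Object.assign({}, record, { objectID: 'abc123' });
// indexed now has name, _geoloc, and objectID; indexed._geoloc is the same
// object as record._geoloc, which is exactly what Algolia needs for geo search.
```

So if _geoloc is missing from the index, the cause is what the snapshot contained at write time, not the merge itself.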

Where should I initialize pg-promise

I just started to learn Node.js with Postgres and found the pg-promise package.
I read the docs and examples, but I don't understand where I should put the initialization code. I'm using Express and I have many routes.
Do I have to put the whole initialization (including the pg-monitor init) into every single file where I want to query the db, or should I include and initialize/configure them only in server.js?
If I initialize them only in server.js, what should I include in the other files where I need a db query?
In other words, it's not clear to me whether pg-promise and pg-monitor configuration/initialization is a global or a local action.
It's also unclear whether I need to create a db variable and end pgp for every single query:
var db = pgp(connection);
db.query(...).then(...).catch(...).finally(pgp.end);
You need to initialize the database connection only once. If it is to be shared between modules, then put it into its own module file, like this:
const initOptions = {
  // initialization options;
};

const pgp = require('pg-promise')(initOptions);

const cn = 'postgres://username:password@host:port/database';
const db = pgp(cn);

module.exports = {
  pgp, db
};
See supported Initialization Options.
UPDATE-1
And if you try creating more than one database object with the same connection details, the library will output a warning into the console:
WARNING: Creating a duplicate database object for the same connection.
at Object.<anonymous> (D:\NodeJS\tests\test2.js:14:6)
This points out that your database usage pattern is bad, i.e. you should share the database object, as shown above, not re-create it all over again. And since version 6.x it became critical, with each database object maintaining its own connection pool, so duplicating those will additionally result in poor connection usage.
Also, it is not necessary to export pgp (the initialized library instance). Instead, you can just do:
module.exports = db;
And if in some module you need to use the library's root, you can access it via property $config:
const db = require('../db'); // your db module
const pgp = db.$config.pgp; // the library's root after initialization
UPDATE-2
Some developers have been reporting (issue #175) that certain frameworks, like NextJS, manage to load modules in a way that breaks the singleton pattern, which results in the database module being loaded more than once and produces the duplicate database warning, even though from the NodeJS point of view it should just work.
Below is a work-around for such integration issues, by forcing the singleton into the global scope, using Symbol. Let's create a reusable helper for creating singletons...
// generic singleton creator:
export function createSingleton<T>(name: string, create: () => T): T {
  const s = Symbol.for(name);
  let scope = (global as any)[s];
  if (!scope) {
    scope = {...create()};
    (global as any)[s] = scope;
  }
  return scope;
}
Using the helper above, you can modify your TypeScript database file into this:
import * as pgLib from 'pg-promise';

const pgp = pgLib(/* initialization options */);

interface IDatabaseScope {
  db: pgLib.IDatabase<any>;
  pgp: pgLib.IMain;
}

export function getDB(): IDatabaseScope {
  return createSingleton<IDatabaseScope>('my-app-db-space', () => {
    return {
      db: pgp('my-connect-string'),
      pgp
    };
  });
}
Then, in the beginning of any file that uses the database you can do this:
import {getDB} from './db';
const {db, pgp} = getDB();
This will ensure a persistent singleton pattern.
A "connection" in pgp is actually an auto-managed pool of multiple connections. Each time you make a request, a connection will be grabbed from the pool, opened up, used, then closed and returned to the pool. That's a big part of why vitaly-t makes such a big deal about only creating one instance of pgp for your whole app. The only reason to end your connection is if you are definitely done using the database, i.e. you are gracefully shutting down your app.
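To illustrate the borrow/use/return cycle described above (a toy model only, not pg-promise's actual internals), a pool of two "connections" can serve any number of sequential queries without ever opening a third:

```javascript
// Toy connection pool illustrating borrow/use/return (not pg-promise internals).
class ToyPool {
  constructor(size) {
    this.idle = Array.from({ length: size }, (_, i) => ({ id: i }));
    this.opened = size; // physical connections ever created
  }
  async query(fn) {
    const conn = this.idle.pop();          // borrow a connection
    if (!conn) throw new Error('pool exhausted (toy model has no waiting queue)');
    try {
      return await fn(conn);               // use it for one request
    } finally {
      this.idle.push(conn);                // return it to the pool
    }
  }
}

const pool = new ToyPool(2);
async function demoPool() {
  // Ten sequential "queries" are served by the same 2 physical connections.
  for (let i = 0; i < 10; i++) {
    await pool.query(async (conn) => `row from conn ${conn.id}`);
  }
  return pool.opened;
}
```

This is why duplicating the database object is wasteful: each duplicate maintains its own pool of physical connections, while a shared singleton lets one pool serve the whole app.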
