firebase storage: storage.ref is not a function - javascript

I would like to use the Storage service from Firebase with a Node.js API (hosted on Firebase Functions) to allow users to upload their avatars.
So I read the doc from https://firebase.google.com/docs/storage/web/start
and I do:
admin.js
const admin = require('firebase-admin');
const config = require('./config.js');
admin.initializeApp(config);
const db = admin.firestore();
const storage = admin.storage();
module.exports = { admin, db, storage };
user.js
const { admin, db, storage } = require('../util/admin');
exports.postAvatar = async (request, response) => {
const storageRef = storage.ref();
}
but I have the following error: storage.ref is not a function
Is something missing from the documentation?
The console.log of the storage const is:
Storage {
INTERNAL: StorageInternals {},
storageClient: Storage {...},
appInternal: FirebaseApp {...}
}

admin.storage() returns a Storage object. If you want to use it to refer to a file in your default storage bucket, use its bucket() method with no parameters, and it will give you a Bucket object from the Google Cloud Node.js SDK.
There is no method called ref() anywhere in that SDK. It's not much like the JavaScript web client SDK; you will have to learn a different but similar API to work with content using the Cloud Storage Node.js SDK. The Admin SDK essentially just wraps this API.
const file = storage.bucket().file('/path/to/file');
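For example, here is a minimal sketch of uploading a local file with the Bucket API, reusing the admin module from the question (the paths and names are illustrative):
const { admin } = require('../util/admin');

async function uploadAvatar() {
  const bucket = admin.storage().bucket();
  // bucket.upload() streams a local file to Cloud Storage
  await bucket.upload('/tmp/avatar.png', {
    destination: 'avatars/avatar.png',
    metadata: { contentType: 'image/png' },
  });
  // a File object is the closest analogue to a web-SDK ref
  const file = bucket.file('avatars/avatar.png');
  const [url] = await file.getSignedUrl({ action: 'read', expires: '2030-01-01' });
  return url;
}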

Try the code below.
const bucket = storage.bucket() // you can also pass your bucket id from config
const file = bucket.file('path/to/file')
Also check this answer: TypeError: firebase.storage is not a function

Related

How to refresh Instagram Basic Display API token automatically?

I'm trying to set up an Instagram feed (just images and links) of a public Instagram account for my Next.js app.
I know I need to use the Instagram Basic Display API and get a long-lived access token, but it expires after 60 days and I don't want to have to refresh it manually. Does anyone know a good, preferably free, way of doing this automatically?
I have looked at instagram-token-agent but that setup uses Heroku and an add-on that costs $30 a month which seems high.
Any ideas or links would be really helpful, thanks!
I eventually ended up using Google Cloud Secret Manager.
Overview: Secret Manager stores the long-lived token, and every rotation publishes a Pub/Sub message that in turn triggers a Cloud Function. The Cloud Function exchanges the old token for a new one and adds the new token as a new version of the secret.
Create New Secret
Name it "instagram-token" and add your long lived token as the secret value. For now leave everything else default and create secret.
Create a service account for secret manager
In your terminal:
gcloud auth login
then
gcloud beta services identity create --service "secretmanager.googleapis.com" --project "YOUR_GCP_PROJECT_ID"
It may ask you to install the gcloud beta component.
IMPORTANT: Make sure you note down the full name of the service account returned in the terminal. If you have lost it, run the same command again.
Create pub/sub topic
Create a new topic and name it "instagram-token-refresh", untick 'add a default subscription'.
Give secret manager permission to publish pub/sub
In your new pub/sub topic go to Permissions -> Add Principal.
Search for and add the service account name noted above: service-{id}@gcp-sa-secretmanager.iam.gserviceaccount.com. Add the role Pub/Sub Publisher.
Add rotation and pub/sub to secret
Go to your "instagram-token" secret and "edit secret".
Rotation -> custom -> every 50 days
Notifications -> Add Topic -> Select "instagram-token-refresh"
Save
Now every 50 days your "instagram-token-refresh" pub/sub will be triggered.
Create Cloud Function
Search cloud functions -> enable -> create cloud function
Function name: "Refresh-Instagram-Token"
Trigger: pub/sub -> Select "instagram-token-refresh"
click next
Entry Point: "refreshInstaToken"
Edit files:
You might need to enable the Cloud Build API.
package.json
{
"name": "refresh-instagram-token",
"version": "0.0.1",
"dependencies": {
"#google-cloud/pubsub": "^0.18.0",
"#google-cloud/secret-manager": "^3.10.1",
"axios": "^0.24.0"
}
}
index.js
// Import the Secret Manager client
const { SecretManagerServiceClient } = require("@google-cloud/secret-manager");
const axios = require('axios');
// name of function is the same as entry point
exports.refreshInstaToken = async (event, context) => {
// check the pub/sub message is a rotation event to prevent infinite looping
const event_type = event && event.attributes.eventType;
//allowing SECRET_VERSION_ENABLE lets you manually trigger this function by disabling the secret and then enabling it (rather than waiting for rotation trigger)
if (event_type != "SECRET_ROTATE" && event_type != "SECRET_VERSION_ENABLE") {
return null;
}
// secret name
const parent = event.attributes.secretId;
const name = parent + "/versions/latest";
// Instantiates a client
const client = new SecretManagerServiceClient();
// get latest secret
const [version] = await client.accessSecretVersion({
name: name,
});
// Extract the payload as a string.
const secret = version.payload.data.toString();
// refresh token
const requesturl = `https://graph.instagram.com/refresh_access_token?grant_type=ig_refresh_token&access_token=${secret}`;
const response = await axios.get(requesturl);
const data = response.data;
// data = {"access_token", "token_type", "expires_in"}
// check access_token isn't null
if (data && data.access_token) {
// Payload is the plaintext data to store in the secret
const newSecret = Buffer.from(data.access_token, "utf8");
// add new secret version (the refreshed token)
const [newVersion] = await client.addSecretVersion({
parent: parent,
payload: {
data: newSecret,
},
});
console.log(`Added new secret version ${newVersion.name}`);
// get new secret version number
let newVersionN = newVersion.name.split("/");
newVersionN = Number(newVersionN[newVersionN.length - 1]);
if (newVersionN > 1) {
// if this is not the first version, destroy the version before it
const nameToDestroy = parent + "/versions/" + (newVersionN - 1);
const [deletedVersion] = await client.destroySecretVersion({
name: nameToDestroy,
});
console.info(`Destroyed ${deletedVersion.name}`);
}
}
};
Adding/Accessing Secrets Ref
Consume event notifications with Cloud Functions Ref
Give Cloud Functions permission to access the secret
Go to your secret -> Permissions
Add -> {project-id}@appspot.gserviceaccount.com
Add the role "Secret Manager Admin"
Accessing Secret Manager from a service account
Create a new service account named "instagram-token".
In the new service account -> Keys -> Add key -> save it to your desktop
Go to your secret -> Permissions -> Add -> "instagram-token...gserviceaccount.com" and give it the role "Secret Manager Secret Accessor"
Set up the credentials environment variable
Create a .env.local file in the Next.js root directory
Add a new empty value GOOGLE_APPLICATION_CREDENTIALS=
Convert the JSON file to a Base64 key and copy it to the clipboard (macOS):
openssl base64 < /Users/{username}/Desktop/service-account.json | tr -d '\n' | pbcopy
Convert the JSON file to Base64 (Windows):
certutil -encode service-account.json encoded.txt
Paste it into the variable so you have something like GOOGLE_APPLICATION_CREDENTIALS=faGdfdSytDsdcDg...
Authenticating GCP in Next.js
Install @google-cloud/secret-manager: npm i @google-cloud/secret-manager
const {
SecretManagerServiceClient
} = require("#google-cloud/secret-manager");
export const getInstagramToken = async() => {
// parse your base 64 env variable to a JSON object
const credentials = JSON.parse(
Buffer.from(process.env.GOOGLE_APPLICATION_CREDENTIALS, "base64").toString()
);
// TO DO -> CHANGE
const projectId = "eleanor-daisy";
const secretId = "instagram-token";
// set up credentials config
const config = {
projectId,
credentials,
};
// init secret manager with credentials
const client = new SecretManagerServiceClient(config);
const secretName = `projects/${projectId}/secrets/${secretId}/versions/latest`;
// Access the secret.
const [accessResponse] = await client.accessSecretVersion({
name: secretName,
});
const instaToken = accessResponse.payload.data.toString("utf8");
return instaToken;
};
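For reference, here is a hypothetical usage from a Next.js page, assuming getInstagramToken is exported from a local module (the paths, field list, and revalidation interval are illustrative):
// pages/index.js (illustrative)
import { getInstagramToken } from "../lib/instagram";

export async function getStaticProps() {
  const token = await getInstagramToken();
  const res = await fetch(
    `https://graph.instagram.com/me/media?fields=id,media_url,permalink&access_token=${token}`
  );
  const feed = await res.json();
  // re-generate the page periodically so a rotated token is picked up
  return { props: { feed }, revalidate: 3600 };
}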
Add GOOGLE_APPLICATION_CREDENTIALS and the key to Vercel when deploying.
Done! I might make a video tutorial on this as there's not much out there, let me know if that would be helpful :)

Upload files to firebase storage via function

I am trying to allow users to select a file and pass it to a Firebase function to store in Storage.
I am uploading the file in a React client like the following:
const formData = new FormData();
formData.append("myFile", aFile);
const aRequesObject= {
method: "POST",
body: formData,
};
const response = await fetch(aUrl, aRequesObject);
Then I have a serverless function like the following, where I want to save this file to Cloud Storage.
import firebase from "firebase";
import "firebase/storage";
import { config } from "./Config";
firebase.initializeApp(config);
const file = request.body.myFile;
const ref = firebase.storage().ref().child(file.name);
ref.put(file).then(() => {
console.log("Uploaded file", file.name);
});
I have tried several variations from the Firebase documentation. All the examples I have found upload directly to Storage from the client, as opposed to passing the file to a function, extracting it from the request, and then saving it to Storage. I am looking for a simple example of this, or a link to where someone has done this scenario.
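For what it's worth, here is a minimal sketch of the function side, assuming an HTTPS Cloud Function and the busboy package (v1.x) to parse the multipart body; the function and field names are illustrative:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const busboy = require('busboy');
admin.initializeApp();

exports.postAvatar = functions.https.onRequest((req, res) => {
  const bb = busboy({ headers: req.headers });
  const bucket = admin.storage().bucket();
  const uploads = [];
  // one 'file' event fires per file field in the form data
  bb.on('file', (name, stream, info) => {
    const { filename, mimeType } = info;
    const out = bucket.file(filename).createWriteStream({
      metadata: { contentType: mimeType },
    });
    stream.pipe(out);
    uploads.push(new Promise((resolve, reject) => {
      out.on('finish', resolve);
      out.on('error', reject);
    }));
  });
  // 'close' fires once busboy has finished parsing (busboy v1 API)
  bb.on('close', () => {
    Promise.all(uploads)
      .then(() => res.status(200).send('Uploaded'))
      .catch((err) => res.status(500).send(err.message));
  });
  // Cloud Functions buffers the request; hand busboy the raw bytes
  bb.end(req.rawBody);
});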

Google Cloud Function/google-auth-library: Cannot read property 'user' of undefined

Following this: https://medium.com/@nedavniat/how-to-perform-and-schedule-firestore-backups-with-google-cloud-platform-and-nodejs-be44bbcd64ae
Code is:
const functions = require('firebase-functions'); // is installed automatically when you init the project
const { auth } = require('google-auth-library'); // is used to authenticate your request
async function exportDB () {
const admin = await auth.getClient({
scopes: [ // scopes required to make a request
'https://www.googleapis.com/auth/datastore',
'https://www.googleapis.com/auth/cloud-platform'
]
});
const projectId = await auth.getProjectId();
const url = `https://firestore.googleapis.com/v1beta1/projects/${projectId}/databases/(default):exportDocuments`;
return admin.request({
url,
method: 'post',
data: {
outputUriPrefix: 'gs://name-of-the-bucket-you-created-for-backups'
}
});
}
const backup = functions.pubsub.topic('YOUR_TOPIC_NAME_HERE').onPublish(exportDB);
module.exports = { backup };
When I go to deploy via:
gcloud functions deploy backup --runtime nodejs8 --trigger-topic YOUR_TOPIC_NAME_HERE
I get error:
ERROR: (gcloud.functions.deploy) OperationError: code=3,
message=Function failed on loading user code. Error message: Code in
file index.js can't be loaded. Is there a syntax error in your code?
Detailed stack trace: TypeError: Cannot read property 'user' of
undefined
Is this something with google-auth-library?
I assume that you are trying to deploy a GCF function triggered by an HTTP request. I suggest you check this link [1]; it seems to be the same use case and can help you use Google Cloud Datastore with Node.js on GCF.
[1] How to return an entire Datastore table by name using Node.js on a Google Cloud Function

Create File or Blob from local path in Javascript

I have searched extensively and systematically on Stack Overflow but haven't been able to find an answer that fits my needs.
I am trying to upload a number of files to Firebase Storage, which requires a File or Blob object.
var file = ... // use the Blob or File API
ref.put(file).then(function(snapshot) {
console.log('Uploaded a blob or file!');
});
I have a folder in my project with all the files I want to upload, and I'm trying to create such objects with their paths. However, none of my attempts have worked.
I tried importing the file:
let file = require('./Images/imagename.jpg');
and I researched using 'fs', the File API and other options, but none seem to have a way for me to get the file into an object using only the path.
In short: is there any simple way to get the object from a local path?
Here is how you can upload a file from the local drive to Firebase Storage:
let bucket = admin.storage().bucket();
let uploadRes = await bucket.upload(filePath, options);
You can find a description of the options in the Google Cloud Storage docs.
You will most likely create a key in the Google Cloud console with the appropriate permissions and export it as a JSON file. You will find this in the Google Cloud Platform console -> IAM & admin -> Service accounts -> Create service account (create it with a key). Once you have exported the JSON, set the environment variable like this:
export GOOGLE_APPLICATION_CREDENTIALS='./the_exported_file.json'
PLEASE STORE THIS FILE SECURELY ON YOUR SERVER AS IT HAS READ AND WRITE ACCESS!
Below is a full example of an upload function that also saves the file under its hash name.
import admin from "firebase-admin";
import path from 'path'
const sha256File = require('sha256-file');
const firebaseConfig = {
apiKey: "...",
authDomain: "abc.firebaseapp.com",
databaseURL: "https://abc.firebaseio.com",
projectId: "abc",
storageBucket: "abc.appspot.com",
messagingSenderId: "123",
appId: "1:123:web:xxx",
measurementId: "G-XXX"
};
admin.initializeApp(firebaseConfig);
async function uploadFile(filePath: string, uploadFolder: string, contentType: string, hash: string | undefined = undefined,)
: Promise<string> {
if (!hash) {
hash = await sha256File(filePath);
}
let bucket = admin.storage().bucket();
const ext = path.extname(filePath);
const uploadPath = uploadFolder + hash + ext;
const options = {destination: uploadPath};
console.debug("starting upload");
let uploadRes = await bucket.upload(filePath, options);
console.debug("finished upload");
let newMetadata = {
contentType: contentType
};
if(uploadRes) {
let file = uploadRes[0];
file.setMetadata(newMetadata).then(() => {
// the updated metadata is returned in the Promise
}).catch(function (error) {
console.error(error)
});
}
return uploadPath;
}
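A hypothetical call, assuming a local JPEG and an 'images/' folder in the bucket (run inside an async context):
const dest = await uploadFile('./Images/imagename.jpg', 'images/', 'image/jpeg');
console.log(dest); // e.g. images/<sha256-of-file>.jpg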
This should work in recent versions of Node.js:
import fs from "fs";
import { Blob } from "buffer";
let buffer = fs.readFileSync("./your_file_name");
let blob = new Blob([buffer]);
First of all, tell me which technology you're using for the front end.
If you are using Angular, you just need to capture the $event each time, e.g. from a change event.
Then you need to create a FormData object.
Pass that object to Node, and on the Node side use multer to store the file.
If you need a demo, let me know and I can help you.
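To illustrate the multer side, here is a minimal sketch assuming an Express server (the route and field name are placeholders):
const express = require('express');
const multer = require('multer');

// keep the upload in memory as a Buffer instead of writing to disk
const upload = multer({ storage: multer.memoryStorage() });
const app = express();

// 'myFile' must match the FormData field name used on the client
app.post('/upload', upload.single('myFile'), (req, res) => {
  // req.file.buffer holds the bytes; req.file.originalname the file name
  res.send(`Received ${req.file.originalname} (${req.file.size} bytes)`);
});

app.listen(3000);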

How to access multiple Realtime Database instances in Cloud Functions for Firebase

I'm using multiple databases in a Firebase project. Cloud functions for the main (default) database work great, however, I cannot make them work for a secondary database. For example I want to make a read request on a node with admin privileges:
//this works
admin.database().ref(nodePath).once('value')...
This works in the main database, however, if I want to execute the command on another database, it doesn't work:
//this doesn't work
admin.database(secondaryDatabaseUrl).ref(nodePath).once('value')...
Although the functions are deployed, I get an error on the console when trying to execute the cloud function.
Here's the code for the cloud function with an https trigger:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
const secureCompare = require('secure-compare');
exports.testFunction= functions.https.onRequest((req, res) => {
const key = req.query.key;
// Exit if the keys don't match
if (!secureCompare(key, functions.config().cron.key)) {
console.error('keys do not match');
res.status(403).send('error1');
return;
}
//test read request
//the line below crashes the function
return admin.database('https://secondary_db_url.firebaseio.com').ref(`/testNode`).once('value').then(dataSnapshot=> {
console.log('value', dataSnapshot.val());
return;
}).catch(er => {
console.error('error', er);
res.status(403).send('error2');
});
});
Below is the error log in the Firebase console:
TypeError: ns.ensureApp(...).database is not a function
at FirebaseNamespace.fn (/user_code/node_modules/firebase-admin/lib/firebase-namespace.js:251:42)
at exports.testFunction.functions.https.onRequest (/user_code/index.js:16:16)
at cloudFunction (/user_code/node_modules/firebase-functions/lib/providers/https.js:26:41)
at /var/tmp/worker/worker.js:671:7
at /var/tmp/worker/worker.js:655:9
at _combinedTickCallback (internal/process/next_tick.js:73:7)
at process._tickDomainCallback (internal/process/next_tick.js:128:9)
If I don't specify the secondary database URL, the function will make the read request on my main database which works great:
//this works
return admin.database().ref(`/testNode`).once('value').then(dataSnapshot=> {
...
I'm using the latest SDK versions: "firebase-admin": "^5.5.1" and "firebase-functions": "^0.7.3"
So, how do I get an instance of a secondary database in cloud functions using admin privileges?
Here's how to access database by URL using Admin SDK:
let app = admin.app();
let ref = app.database('https://secondary_db_url.firebaseio.com').ref();
Here's an example from Admin SDK integration tests: https://github.com/firebase/firebase-admin-node/blob/master/test/integration/database.js#L52
Now, with Cloud Functions > 1.1, here is the documentation link that saved my life on this issue:
https://firebase.google.com/docs/database/usage/sharding#connect_your_app_to_multiple_database_instances
So it looks like this at the top of my Cloud Functions index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const dev = admin.initializeApp({
databaseURL: "https://appdev.firebaseio.com"
}, 'dev');
const v2 = admin.initializeApp({
databaseURL: "https://appv2.firebaseio.com"
}, 'v2');
and then, in my Cloud Functions code I can do:
//will change stuff on default database
admin.database().ref().child(`stuff/${stuffId}`).set(myStuff)
//will change stuff on my dev database
admin.database(dev).ref().child(`stuff/${stuffId}`).set(myStuff)
//will change stuff on my v2 database
admin.database(v2).ref().child(`stuff/${stuffId}`).set(myStuff)
So it looks like you are trying to access multiple databases using the JavaScript web client API. Passing the URL of the database to the API like this doesn't work with the Admin SDK:
admin.database('https://secondary_db_url.firebaseio.com').ref(`/testNode`)
Instead, you have to initialize a second app, give it a name, and pass that app around to the Admin SDK APIs. Here's a complete sample that writes the same data to two different database instances in the same project:
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp(functions.config().firebase)
const otherConfig = Object.assign({}, functions.config().firebase)
otherConfig.databaseURL = 'https://your-other-db.firebaseio.com/'
const otherApp = admin.initializeApp(otherConfig, 'otherAppName')
exports.foo = functions.https.onRequest((req, res) => {
const data = { foo: 'bar' }
const p1 = admin.database().ref('data').set(data)
const p2 = admin.database(otherApp).ref('data').set(data)
Promise.all([p1, p2]).then(() => {
res.send("OK")
})
.catch(error => {
res.status(500).send(error)
})
})
Updating this while on Firebase Functions v3.14.0. None of these answers worked for me, so I implemented this solution:
instance: Registers a function that triggers on events from a specific Firebase Realtime Database instance.
functions.database.instance('my-app-db-2').ref('/foo/bar')
Use the name of the database instance and it works; there is no need for the URL. functions.database.ref used without instance watches the default instance for events.
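For example, a minimal sketch of a trigger bound to a named instance (the instance name and path are illustrative):
const functions = require('firebase-functions');

// fires only for writes on the 'my-app-db-2' instance
exports.onBarWrite = functions.database.instance('my-app-db-2')
  .ref('/foo/bar')
  .onWrite((change, context) => {
    console.log('new value:', change.after.val());
    return null;
  });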
In case both of the answers above don't work: what happened with me is that both methods worked without any error, but the second database instance was not getting updated. I updated npm and the Firebase CLI, and then it worked.
Also, @Doug Stevenson: passing the URL of the database to the API like this does work with the Admin SDK.
And this is a good blog post from Firebase about the same:
Firebase Blog: Easier scaling with multi-database support!
