error uploading to cloud storage using a cloud function - javascript

I am trying to upload files to Google Cloud Storage using a Cloud Function triggered by HTTP. However, when the cloud function sends the file to be uploaded, I often (although not always) get the following error:
ERROR uploading to storage: { ApiError: Anonymous caller does not have storage.objects.create access to bucket_name/folder/test.jpg.
I am not sure why this error occurs, or why it happens only some of the time.
Here is the code:
const storage = require('@google-cloud/storage')();

function uploadToStorage(filepath, folder, filename) {
  const options = {
    // destination accepts a path string within the bucket
    destination: `${folder}/${filename}`,
    public: false,
    resumable: false
  };
  // Returning the promise lets the caller (and the Cloud Function runtime)
  // wait for the upload to finish; BUCKET_NAME is assumed defined elsewhere.
  return storage
    .bucket(BUCKET_NAME)
    .upload(filepath, options)
    .then(function () {
      console.log(`${filename} uploaded to ${BUCKET_NAME}`);
    })
    .catch((err) => {
      console.error('ERROR uploading to storage: ', err);
    });
}
Thanks

I had the same error after adding a return statement at the end of my function that performed file deletes on storage objects. This is what I was doing:
Make a database call to get some data
Once that request comes back, delete some files out of cloud storage (GCS)
The code structurally looked like this:
function deleteStuffOutStorage() {
  admin.firestore().doc(`My-doc-ref`).get()
    .then(snapshot => {
      // Do the deleting here {Interacting with GCS}
      return deleteFile(snapshot.data().path); // Deletes file
    })
    .then(success => {
      // Worked
    })
    .catch(error => {
      // Error = ApiError: Anonymous caller does not have storage.objects...
    });
  return; // This statement was creating the problems
}
When I removed the return statement, I no longer got the error. I thought that in my case it might have something to do with the firebase-admin object instance (or at least its GCS auth token) being deallocated and re-allocated between the asynchronous operations in steps 1 and 2 above. Returning before the promise chain resolves tells Cloud Functions the work is done, so any still-pending GCS calls may run after the environment has been torn down.
All Firebase Cloud Function (FCF) instances should have access to GCS via an auto-generated service account. You can confirm this in the GCP console: https://console.cloud.google.com/iam-admin/serviceaccounts/
From the code snippet you posted I can't see anything that would cause the same issue I was getting, but maybe have a think about any time-based events that could cause this behaviour. That may explain the inconsistent behaviour you allude to.
Hope that's some sort of help.
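For reference, the same flow with the early return removed just returns the promise chain instead, so the function stays alive until the asynchronous work completes. A minimal sketch using the same names as the snippet above:

function deleteStuffOutStorage() {
  // Returning the chain keeps the function alive until the Firestore read
  // and the GCS delete have both finished.
  return admin.firestore().doc(`My-doc-ref`).get()
    .then(snapshot => deleteFile(snapshot.data().path))
    .catch(error => {
      console.error(error);
    });
}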

Related

firebase httpsCallable is not called / does not respond

I'm facing a strange issue developing a react-native application connected to Firebase as the backend.
I try to call a Firebase Cloud Function via httpsCallable.
If I'm in debug mode everything works fine and the saveImage() function returns a value.
But if I disable debug, then randomly (maybe 50% of the time) the function just hangs at await functions().httpsCallable('directUpload').
I already tried to get output by showing an alert with the result, because I cannot use console.log without debug, but it's not working. Same for the error. It seems like it's waiting on the await forever.
Even on the server side, I can see in the log that the function is never called.
saveImage = async item => {
  try {
    let result = await functions().httpsCallable('directUpload')({ // it's "frozen" here
      uid: this.state.user.uId,
      mimeType: item.mimeType,
      ext: item.ext,
    });
    helper.showAlert(result); // never gets called
    return { success: true };
  } catch (error) {
    helper.showAlert(error); // never gets called
    return { success: false };
  }
};
Does anyone have any idea where the problem is coming from or what it could be?
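One way to make the hang observable without console.log is to race the callable against a timeout, so a silent hang becomes a catchable error you can show in an alert. This is only a debugging sketch; withTimeout is a hypothetical helper, not part of the Firebase SDK:

// Hypothetical helper: rejects if the wrapped promise does not settle in time.
const withTimeout = (promise, ms) =>
  Promise.race([
    promise,
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms)
    ),
  ]);

// Usage inside saveImage:
// let result = await withTimeout(
//   functions().httpsCallable('directUpload')({ uid, mimeType, ext }),
//   15000
// );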

Want client-side Firebase logs to show up in StackDriver, not users' browsers

I'm using firebase-functions/lib/logger to log client-side firebase/firestore activity, like this:
const { log, error } = require("firebase-functions/lib/logger");

export const addData = async (userId, dataId) => {
  try {
    const collectionRef = firestore.collection("docs");
    await collectionRef.add({
      dataId,
    });
    log(`Data added`, { userId, dataId });
  } catch (err) {
    error(`Unable to add new data`, { userId, dataId });
    throw new Error(err);
  }
};
When I run this locally, the log shows up in my browser console. Will this happen in non-local environments, i.e. for real users? Will these logs also show up automatically in Stackdriver, or are they stuck on the client side? I want to be able to view the logs either in Stackdriver or the Firebase console, but have them not show up in the browser for real users. How should I accomplish this?
Messages logged in Cloud Functions will not show up in the client app at all (that would probably be a security hole for your app). They will show up in the Cloud Functions console in the log tab, and in StackDriver.
Any messages logged in your app will not show up in any Google Cloud product. They are constrained to the device that generated them. Cloud Functions does not collect client-side logs, so if you want cloud logging from the client, you will need to investigate other solutions or build something yourself.
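If you do want client events to reach StackDriver, one option is to relay them through a Cloud Function and log them there. A minimal sketch, assuming a callable function named clientLog (the name and payload shape are my own, not an existing API):

const functions = require("firebase-functions");
const { log } = require("firebase-functions/lib/logger");

// The client invokes this callable; the log line is written server-side, so it
// lands in the Functions log tab / StackDriver, never in the user's browser.
exports.clientLog = functions.https.onCall((data, context) => {
  log("client event", {
    uid: context.auth ? context.auth.uid : null,
    ...data,
  });
  return { ok: true };
});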

FirebaseError when logging out of web app, exception thrown by user callback, document references must have an even number of segments

I've built a Javascript web app using Firestore and Firebase. When logging the user out, I am getting console errors. The errors reference the firebase-database.js and firebase-firestore.js scripts, though, so I can't really tell what is happening:
[2020-05-22T12:32:58.436Z] #firebase/database: FIREBASE WARNING:
Exception was thrown by user callback.
Hr#https://www.gstatic.com/firebasejs/7.6.1/firebase-firestore.js:1:48219
firebase-database.js:1:11297
FirebaseError: Invalid document reference. Document references must
have an even number of segments, but user has 1
firebase-firestore.js:1:48219
This is my log out function:
$('.logout').on('click', function(){
  firebase.auth().signOut()
    .catch(function(error){
      console.log(error.code);
      console.log(error.message);
    });
});
Then I have a listener for firebase.auth().onAuthStateChanged which triggers this:
firestoredb.collection('user').doc(uid).update({
  status: false,
  last_changed: firebase.firestore.FieldValue.serverTimestamp()
})
.then(function(){
  uid = '';
  $('#screenname').html('');
  window.location = 'https://www.example.com/your-account.asp?task=logout&afterlogin=%2Fv2';
})
.catch(function(error){
  console.log(error.code);
  console.log(error.message);
});
What might be my strategy for tracking down this error, since the console logs are not that helpful? The error does not really affect the performance of the app, since the user is logged out anyway (and redirected via Javascript), but it bothers me that there is an error.
EDIT: I am wondering if the cloud script that is running could be the problem. That might explain why I cannot identify the line number and why the error message is so vague. Here is my cloud script; can it be modified so that a missing UID value is ignored? This is basically the script provided by Google for combining Firebase and Firestore to maintain the session state of the user.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const firestore = admin.firestore();

exports.onUserStatusChanged = functions.database.ref('user/{uid}').onUpdate(
  async (change, context) => {
    const eventStatus = change.after.val();
    const userStatusFirestoreRef = firestore.doc(`user/${context.params.uid}`);
    const statusSnapshot = await change.after.ref.once('value');
    const status = statusSnapshot.val();
    if (status.last_changed > eventStatus.last_changed) {
      return null;
    }
    eventStatus.last_changed = new Date(eventStatus.last_changed);
    return userStatusFirestoreRef.update(eventStatus);
  }
);
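The error text itself is a useful clue: "user has 1" segment means doc() received an empty ID (the collection name user is one segment, and an empty document ID adds nothing), which points at the client-side code rather than this cloud script. A minimal guard sketch, assuming the uid variable from the snippets above is cleared on logout:

firebase.auth().onAuthStateChanged(function(user){
  // After signOut() this fires with user == null; calling doc('') here is
  // what throws "Document references must have an even number of segments".
  if (!user || !uid) {
    return;
  }
  // ... existing presence update via firestoredb.collection('user').doc(uid)
});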

Firebase Callable Cloud Function CORS Error

The following is the client side code to call the cloud function:
var getShippingRate = firebase
  .functions()
  .httpsCallable("shippo-generateShippingRate");

getShippingRate({ address: shippo })
  .then(function(result) {
    // Read result of the Cloud Function.
    console.log("THE SHIPPING RATES", result.data.shipment);
  })
  .catch(function(error) {
    // Getting the Error details.
    console.log("ERROR SHIPPING: ", error);
    var code = error.code;
    var message = error.message;
    var details = error.details;
  });
The cloud function:
exports.generateShippingRate = functions.https.onCall(async (data, context) => {
  const customer_address = data.address;
  return generateShipmentObject(customer_address);
});
generateShipmentObject returns this:

new Promise(resolve => {
  // A value returned from inside a Node-style callback never reaches the caller,
  // so the shippo call is wrapped in a Promise that the onCall handler can await.
  shippo.shipment.create(
    {
      address_from: addressFrom,
      address_to: addressTo,
      parcels: [parcel],
      async: true
    },
    (err, shipment) => {
      // asynchronously called
      if (err) {
        resolve({ error: err });
      } else {
        resolve({ result: shipment });
      }
    }
  );
});
I get the standard CORS Error, but a callable Cloud Function should handle this automatically:
Access to fetch at ... from origin 'http://localhost:5000' has been blocked by CORS policy:
EDIT
I'm using firebase serve --only hosting to test on localhost.
The Cloud Functions are deployed with firebase deploy --only functions
I'm calling other similar Cloud Functions on the same site which do not have that issue.
Temp fix:
In the Cloud Console functions page, select the function to show the info panel. In the Permissions tab, select ADD MEMBER. In the 'New members' field, type allUsers. In the role dropdown, select Cloud Functions, then Cloud Functions Invoker, and save.
It actually sort of makes sense for a function to have restricted permissions when it's first created; however, I'm used to the default permissions being present, so this is a bug (or new feature) that definitely threw me off. Of course this doesn't fix the underlying problem, but I hope it helps.
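The same grant can also be scripted. A sketch with the gcloud CLI, assuming the deployed function is named shippo-generateShippingRate and the default project/region:

# Mirrors the console steps above: grants public invoke access.
gcloud functions add-iam-policy-binding shippo-generateShippingRate \
  --member="allUsers" \
  --role="roles/cloudfunctions.invoker"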

NodeJS - Return a single status code from an async function using Array.prototype.map() to save multiple files in different locations in a database

I am a little stumped on how to handle this the best way possible. I've decided to rewrite this controller, and I think I need to make use of Promise.all() here.
Premise:
In this application, the Admin user must be able to bulk upload a bunch of .pdf's at once that belong to multiple users. The .pdf's adhere to a specific naming convention, from which my backend upload controller pulls out a first and last name using a regex. These .pdf's are auto-generated by a program that always names them exactly the same way, so there is no human error in misspelling names.
Each call to the database and to an AWS S3 bucket is made inside an Array.prototype.map() callback that loops through the files, uploads each one to the S3 bucket, takes the Key name of the file returned from s3.upload(), and saves that Key to a user model in MongoDB as a reference to the user's file(s) within the S3 bucket.
Example Code:
This is what I currently have (and it does work, somewhat). This is the block of code responsible for what I described above. employeeFiles is created further up in the controller and contains an array of objects that each have a file and id property. The file name destructuring and user matching happen further up in the controller as well, and the employeeFiles array is a result of that. The id property contains the Mongo _id of the employee, and the file property contains the file to be saved. This all works perfectly, and I don't think that code is needed for context here. filetype is a variable available within the scope of the controller:
const employeeFileUploadToDb = () => {
  employeeFiles.map((employee, i) => {
    const { file, id } = employee;
    const params = {
      Bucket: S3_BUCKET_NAME,
      Body: file.buffer,
      Key: `${filetype}/${file.originalname}`
    };
    s3.upload(params, (err, data) => {
      if (err) {
        next(err);
      }
      if (data) {
        // Save reference to Employee model
        let dataObj = {
          key: data.key,
          fileName: file.originalname,
          date: Date.now()
        };
        Employee.findOneAndUpdate(
          { _id: id },
          { $push: { [`${filetype}`]: dataObj } }
        )
          .then(resp => res.send(200)) // note: this sends a response on every iteration
          .catch(err => next(err));
      }
    });
  });
};
I am making use of next() to handle any errors within the s3.upload() and findOneAndUpdate() functions (I do realize findOneAndUpdate() is deprecated) moving forward. My idea here is that if there is an error in one of the functions, next() will send it to my error-handler middleware and keep going, rather than ending the process and halting all of it.
Inside every iteration of s3.upload(), I make a call to my database so that I can save the reference to the file uploaded to the S3 bucket. Inside a then() method of Employee.findOneAndUpdate(), I return a (200) response to let my client know everything has been uploaded to S3 and saved in my DB. So on each iteration of this map() function, I am returning a 200. If I have 10 files, I am returning 200 ten times.
I feel that I can convert this into an async function and make use of Promise.all() to return a single status code upon completion. Returning that many status codes seems a bit crazy to me. But I am not too sure how to approach this while using a map() function to loop and make an async call on every iteration.
Hope this makes sense, and thank you in advance for looking at this!
I would split it up into a 2-step process: upload in bulk, and then save to Mongo if it all worked out.
const employeeFileUploadToDb = () => {
  // Step 1: map every file to a Promise that settles when its S3 upload does
  const uploadFiles = files =>
    files.map((employee, i) => new Promise((resolve, reject) => {
      // ... build params from employee, as in the original code
      s3.upload(params, (err, data) => {
        if (err) {
          return reject(err);
        }
        resolve(data);
      });
    }));

  // Step 2: wait for all uploads, then do the Mongo writes and respond once
  Promise.all(uploadFiles(employeeFiles))
    .then(results => {
      // Handle saving to mongo
    })
    .catch(err => next(err));
};
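A sketch of how the Mongo step could fill in that then(), reusing employeeFiles for each file's id and name (Promise.all preserves input order, so results[i] pairs with employeeFiles[i]):

Promise.all(uploadFiles(employeeFiles))
  .then(results =>
    Promise.all(results.map((data, i) => {
      const { file, id } = employeeFiles[i];
      return Employee.findOneAndUpdate(
        { _id: id },
        { $push: { [`${filetype}`]: {
          key: data.key,
          fileName: file.originalname,
          date: Date.now()
        } } }
      );
    }))
  )
  .then(() => res.sendStatus(200)) // a single response for the whole batch
  .catch(err => next(err));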
