firebase functions upload file from web via stream only partially working - javascript

I am trying to create a firebase function that downloads a photo off the web via URL and uploads it to firebase storage.
I'm using Axios to stream the file to the bucket. It appears the file gets uploaded, but I can't download or view it.
This is my code:
let fileURL = 'https://www.example.file.path.png'
let myFile = await axios.get(fileURL, { responseType: 'stream' })
let destFile = bucket.file(photoId).createWriteStream({ contentType: myFile.headers['content-type']})
myFile.data.pipe(destFile)
And here is the storage console from firebase:
I have messed around with the Storage API and attempted to use the upload and save functions. Using an axios get and write streams is the closest that I've gotten to making this work.
Reading the example in the docs only adds to my confusion, because the file itself is never referenced in the upload function, just the file name?
I feel like I'm almost there, considering the file, or rather the name of the file, is there, along with the size and the type. Just not the content?

This problem is related to the Firebase console only. There is no problem downloading the file with the command gsutil cp gs://<your bucket>/yourfile . (doc), and it works in the Google Cloud Console Storage browser as well (direct link).
However, a file uploaded like this indeed cannot be downloaded or previewed in the Firebase console.
After many tries I realized that this is related to the custom metadata entry "Access token", which underneath is called firebaseStorageDownloadTokens. It appears automatically if, for example, you upload the file directly through the Firebase console.
Anyway, I noticed that the value of this metadata is irrelevant. According to my tests, if you change your createWriteStream parameters to:
let destFile = bucket.file(photoId)
  .createWriteStream({
    metadata: {
      contentType: myFile.headers['content-type'],
      metadata: {
        firebaseStorageDownloadTokens: 'canBeAnyThing'
      }
    }
  });
The problem disappears...
A file already uploaded to Firebase Storage and affected by this issue can be fixed by adding the same metadata. In the screenshot you provided, you can see "File Location"; if you open it, you will see the link "Create new access token". Alternatively, you can add the metadata manually in the GCP Storage browser via "Edit metadata" for that object (remember to refresh the Firebase console in your browser).
At this point we can wonder why it works like this. I found some interesting information on GitHub here.

Related

Problems with Resumable Upload For WebApps script

I was looking for a solution so that people with access to a spreadsheet could upload files through it. Researching, I found some solutions, but since I will need these people to upload videos through the spreadsheet, some solutions that used Blob ended up being discarded.
Searching, I found this script made by Tanaike, which apparently solves practically all of these problems. I thought of pulling it into the spreadsheet using an HTML alert, thus allowing people to upload files larger than 50 MB.
The script can be found here:
Resumable Upload For WebApps
The issue is that I'm having some problems getting it to work; basically, I'm getting this error when trying to upload a file:
Error: <HTML> <HEAD> <TITLE>Not Implemented</TITLE> </HEAD> <BODY BGCOLOR="#FFFFFF" TEXT="#000000"> <H1>Not Implemented</H1> <H2>Error 501</H2> </BODY> </HTML>
Other than that, I have a few questions I'd like to clear up:
1 - I'm not sure if, with this script, people with other accounts would be able to upload files to my Google Drive. Is that possible?
2 - Is it possible to implement it in a button on a spreadsheet and have the file uploaded to the same folder as that spreadsheet?
Sorry for the number of questions; JavaScript and GAS are not very present in my daily life, so I have some difficulty with them.
Checking the developer console, the error returned from the server is accessNotConfigured. This happens when the GCP Project doesn't have the necessary APIs enabled. To fix this you need to create a new Cloud Project:
In the Google Cloud console, go to Menu > IAM & Admin > Create a Project.
In the Project Name field, enter a descriptive name for your project.
In the Location field, click Browse to display potential locations for your project. Then, click Select.
Click Create. The console navigates to the Dashboard page and your project is created within a few minutes.
After that you need to enable the Drive API:
In the Google Cloud console, go to Menu > More products > Google Workspace > Product Library.
Click the API that you want to turn on.
Click Enable.
Finally you need to attach the GCP project to your Apps Script Project:
Determine the Project number of your Cloud project.
Open the script whose Cloud project you want to replace.
At the left, click Project Settings.
Under Google Cloud Platform (GCP) Project, click Change project.
Enter the new project number and click Set project.
After attaching the standard project the error stopped showing up for me. The reason for this is that Google changed the way Apps Script creates GCP projects, so now scripts may have Default or Standard projects. Default projects are essentially more restricted, so you may have to create a Standard project in certain scenarios. One of the scenarios listed in the documentation is "To create a file-open dialog". Tanaike's code uses the same technique as the file-open dialogs to retrieve the access token from the server, which I believe is the cause of the error.
As for your other questions:
I'm not sure if with this script people with other accounts would be able to upload the files to my Google Drive, is it possible?
Only if you deploy it as a Web App, setting it to execute with your account and available to "Anyone with a Google account". This uses your account's access token to authorize so other users will upload to your account.
Is it possible implement it in a button on a spreadsheet and request that the file be uploaded in the same folder as that spreadsheet?
As I mentioned under 1., doing it within the spreadsheet probably won't work, but you can add the parents property to the request body on the HTML side to specify the folder. You can also retrieve it dynamically by calling google.script.run. Here's a sample I modified to do this:
google.script.run.withSuccessHandler(function(at) {
  var xhr = new XMLHttpRequest();
  xhr.open("POST", "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable");
  xhr.setRequestHeader('Authorization', "Bearer " + at.token);
  xhr.setRequestHeader('Content-Type', "application/json");
  xhr.send(JSON.stringify({
    mimeType: fileType,
    name: fileName,
    parents: at.parent
  }));
  xhr.onload = function() {
    doUpload({
      location: xhr.getResponseHeader("location"),
      chunks: chunks,
    });
  };
  xhr.onerror = function() {
    console.log(xhr.response);
  };
}).getAt();
That's a part of the init() function in the index.html file. The at variable originally only received the access token retrieved from the server. I just modified it so it receives an object with both the access token and the folder ID, and I included the parent folder ID in the API call. You also need to modify the getAt() function in Code.gs to actually return the folder ID:
function getAt() {
  var id = SpreadsheetApp.getActiveSpreadsheet().getId();
  var folder = DriveApp.getFileById(id).getParents().next().getId();
  return { token: ScriptApp.getOAuthToken(), parent: [folder] };
}
There's a lot to unpack here so I advise you to check the documentation. I think you'll have to go through the Web App route if you want other users to upload the files to your Drive.
Reference:
Web Apps
Communicate with Server Functions
Files.insert
Default and Standard Projects

Firebase storage onUpload incorrect path name

I'm having trouble when uploading an image file to Firebase Storage. The path names seem to be incorrect: the 'avatar' segment is showing up there with no known origin at all.
Upload script
uploadTask = storageRef
.child('complaints')
.child(this.uid)
.child(this.file.name)
.put(this.file)
Result
I've searched my whole code base for any kind of unexpected mutation of the firebase.storage function but couldn't find anything that could be responsible for this kind of behaviour.
It looks like you might have the Resize Images Extension installed, which automatically creates (in your case) a 128x128 version of the image in the avatar folder.

Why am I not able to get the download URL of my Firebase storage file?

To summarize the problem, users are uploading lots of PDF files to a storage bucket. After upload, the users have a button they can press that gets the download URL of the file they have selected and opens it in a new window.
const storageRef = Storage.ref(`Inbound_Forms/${selectedInbound.id}/${row.FileName}`);
storageRef.getDownloadURL().then(url => {
window.open(url, '__blank');
});
The opening of the URL works about 95% of the time, but every once in a while when the user clicks the button to open the URL, this error gets thrown:
This would lead me to believe that the file does not exist, but when I open the bucket in the Firebase console, the file exists, and I can download it via its link in the console:
The path included: /Inbound_Forms/5eE2Oytwrpc7FTkmH4gy
The workaround for the time being is to have the users send me an email with a link to the page where the file is. I then track the original down in the bucket, download the file, and upload the exact same file suffixed with 'copy', as seen in the image above. The new file works without issue, while the original keeps throwing the same error. I just inform the user that a copy has been uploaded, and they access it and continue on with their work.
There have been a few possible causes I have looked into; those worth mentioning are:
1) The file names have spaces. This does not seem to cause the issue, as even the copy I upload works without problems. And as stated before, this process works 95% of the time, in many cases with spaces in the file name.
2) Storage rules. The same user has been able to open many other files governed by the same rules, including the newly uploaded copy.
Currently, I am unable to reproduce this bug, as it happens so rarely. I thank anyone in advance for any leads on what is going on here.
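Since the failure is intermittent rather than permanent, one pragmatic mitigation while the root cause is unknown is to retry getDownloadURL a few times with a short backoff before giving up. This is only a sketch of the idea; withRetry is a hypothetical helper of mine, not a Firebase API:

```javascript
// Retry an async operation a few times, waiting a little longer
// after each failure before the next attempt.
async function withRetry(fn, attempts = 3, delayMs = 200) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      await new Promise(res => setTimeout(res, delayMs * (i + 1)));
    }
  }
  throw lastErr; // All attempts failed; surface the last error.
}

// Hypothetical usage with the code from the question:
// const url = await withRetry(() => storageRef.getDownloadURL());
// window.open(url, '__blank');
```

If the retries never succeed for a given file, that at least confirms the failure is not a transient backend hiccup.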

firebase cloud functions onFinalize trigger with not predefined bucket?

Is it possible to write a Firebase Cloud Function that triggers when a new file is created in Firebase Storage (onFinalize), when we don't know the exact bucket in advance?
Inside my Firebase Storage I have a folder 'loads', and inside it I have folders named with a load ID, like:
/loads/-Lw1UySdunYyFMrrV6tl
/loads/-LwisA5hkNl_uxw3k36f
/loads/-LwkPUm-q7wNv-wZ49Un
https://ibb.co/NnQkTyC here's a screenshot of storage
I want to trigger a Cloud Function when a new file is created inside one of these folders. I don't know in advance where the file will be created, and I don't know if this is even possible. That's why I need advice.
My main goal is to merge two PDF files into one within Cloud Functions. In my app (a TMS written with Vue.js), on the frontend I create confirmationOrder.pdf using html2canvas/jsPDF and then save it to storage/loads/${loadID}. Later on, a user can manually upload POD.pdf to the same bucket. When that happens I want my Cloud Function to merge these two PDFs into one new file (in the same storage bucket). But again, I don't know the bucket in advance.
Here's how I upload PDFs in frontend:
async uploadConfPDF({ commit }, payload) {
  const PDF = payload.PDF;
  const loadID = payload.loadID;
  const fileName = payload.fileName;
  const fileData = await firebase
    .storage()
    .ref(`loads/${loadID}/${fileName}.pdf`)
    .put(PDF);
  const confOrderURL = await fileData.ref.getDownloadURL();
  return confOrderURL;
},
Any help is highly appreciated. Sorry if my explanation seems unclear; English is not my native language.
EDIT FOLLOWING YOUR QUESTION RE-WORKING
Based on your code and on the screenshot of your Cloud Storage console, you are working in the default bucket of your project, whose URL is gs://alex-logistics.appspot.com.
As we can see in the screenshot, the files in your bucket are presented in a hierarchical structure, just like the file system on your local hard disk. However, this is just a way of presenting the files: there are no genuine folders/directories in a bucket; the Cloud Storage console just uses the different parts of the file paths to "simulate" a folder structure.
So, based on the above paragraphs, I think your question can be re-phrased to "In a Cloud Function, how can I extract the different parts of the path of a file that is uploaded to the default bucket of my Firebase project?".
Answer:
In a Cloud Function that is triggered when a file is added to the default bucket, you can get the file path as follows:
exports.yourCloudFunction = functions.storage.object().onFinalize(async (object) => {
  const filePath = object.name; // File path in the bucket.
  //...
});
Since we use an onFinalize event handler, you get the path of the new file through the name property of the object parameter, which is of type ObjectMetadata.
You can then use some String methods to, for example, extract from this path the ${loadID} you refer to in your question.
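For example, assuming paths always have the shape loads/${loadID}/${fileName} as in your upload code, that extraction could look like this (parseLoadPath is my own illustrative helper, and the guard against files outside loads/ is an assumption about your layout):

```javascript
// Split a storage object path of the form "loads/<loadID>/<fileName>"
// into its parts. Returns null for any path outside that layout.
function parseLoadPath(filePath) {
  const parts = filePath.split('/');
  if (parts.length !== 3 || parts[0] !== 'loads') {
    return null; // Not a file inside a loads/<loadID>/ "folder".
  }
  return { loadID: parts[1], fileName: parts[2] };
}

console.log(parseLoadPath('loads/-Lw1UySdunYyFMrrV6tl/POD.pdf'));
// → { loadID: '-Lw1UySdunYyFMrrV6tl', fileName: 'POD.pdf' }
```

Inside the onFinalize handler you would call it with object.name and simply return when it yields null.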

What is the downloadUrl property in a File object of the drive rest api?

I am working on a project where I'd like to upload a file to Google Drive using a URL. One property I stumbled upon was downloadUrl. I'm not sure if it's what I need yet, but in the documentation here (https://developers.google.com/drive/v2/reference/files) the description column is blank.
What is this property used for?
This property allows you to download the file. If it is present in the file's metadata, you can use it to issue an HTTP GET request and download the file.
You can do this using a token granted by the domain administrator or by the file's owner.
downloadUrl lets you download regular files that are stored in Google Drive; Google Documents can instead be exported in a few different formats using the exportLinks property.
To get a downloadUrl, you need to fetch the file's metadata with a GET request (files.get). The method returns a file resource that includes the downloadUrl property.
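A sketch of that two-step flow against the Drive v2 API; the fetch calls are shown only as comments since they need a valid OAuth access token, and driveV2MetadataUrl is my own helper name, not part of any SDK:

```javascript
// Build the Drive v2 files.get URL that returns a file's metadata,
// including downloadUrl (binary files) or exportLinks (Google Docs).
function driveV2MetadataUrl(fileId) {
  return 'https://www.googleapis.com/drive/v2/files/' + encodeURIComponent(fileId);
}

// With a valid OAuth access token you would then do something like:
// const res = await fetch(driveV2MetadataUrl(fileId), {
//   headers: { Authorization: 'Bearer ' + accessToken }
// });
// const meta = await res.json();
// const file = await fetch(meta.downloadUrl, {
//   headers: { Authorization: 'Bearer ' + accessToken }
// });
```

Note that the downloadUrl itself also requires the Authorization header; it is not a public link.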
