Upload file to google drive using javascript sdk - javascript

I searched a lot and found nothing about how to send files. Even in the Google documentation there is nothing about sending a file using the JavaScript SDK.
See here https://developers.google.com/drive/v3/web/manage-uploads
So right now I'm converting the Node.js script to browser JavaScript. They used fs to get the read stream, and I have no idea how to do that in the browser. The closest I can get is this...
var file = uploadButton.files[0];
var fileName = uploadButton.files[0].name;
var fileMetadata = {
  'name': fileName
};
var media = {
  mimeType: 'image/jpeg',
  body: file
};
gapi.client.drive.files.create({
  resource: fileMetadata,
  media: media.result,
  fields: 'id'
}).execute();
The above code creates an empty file with the given fileName and no content inside it.

In order to upload a file to your Google Drive you need to use a Google request object and 'POST' the file. You can see an example in this answer. Keep in mind that you need to get your API keys in order to initialise your Google Drive client object.
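For reference, here is a minimal sketch of that idea (not taken from the linked answer): it reads the selected file with a FileReader and POSTs it as a multipart upload through gapi.client.request, assuming gapi.client is already initialised and authorised with a Drive scope.

var file = uploadButton.files[0];
var reader = new FileReader();
reader.onload = function () {
  var boundary = '-------314159265358979323846';
  var metadata = {
    name: file.name,
    mimeType: file.type || 'application/octet-stream'
  };
  // Build the multipart/related body: a JSON metadata part
  // plus the base64-encoded file content part.
  var base64Data = btoa(reader.result);
  var body =
    '--' + boundary + '\r\n' +
    'Content-Type: application/json; charset=UTF-8\r\n\r\n' +
    JSON.stringify(metadata) + '\r\n' +
    '--' + boundary + '\r\n' +
    'Content-Type: ' + metadata.mimeType + '\r\n' +
    'Content-Transfer-Encoding: base64\r\n\r\n' +
    base64Data + '\r\n' +
    '--' + boundary + '--';
  gapi.client.request({
    path: '/upload/drive/v3/files',
    method: 'POST',
    params: { uploadType: 'multipart', fields: 'id' },
    headers: { 'Content-Type': 'multipart/related; boundary="' + boundary + '"' },
    body: body
  }).then(function (resp) {
    console.log('Created file id: ' + resp.result.id);
  });
};
reader.readAsBinaryString(file);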

Related

firebase functions upload file from web via stream only partially working

I am trying to create a Firebase function that downloads a photo off the web via URL and uploads it to Firebase Storage.
I am using Axios to stream the file to the bucket. It appears the file gets uploaded, but I can't download or view it.
This is my code:
let fileURL = 'https://www.example.file.path.png'
let myFile = await axios.get(fileURL, { responseType: 'stream' })
let destFile = bucket.file(photoId).createWriteStream({ contentType: myFile.headers['content-type']})
myFile.data.pipe(destFile)
And here is the storage console from firebase:
I have messed around with the Storage API and attempted using the upload and save functions. Using an axios GET and write streams is the closest that I've got to getting this to work.
Reading the example in the docs only adds to my confusion, because the file is never referenced in the upload function... just the file name??
I feel like I'm almost there, considering the file, or rather the name of the file, is there, along with the size and the type... just not the content?
This problem is related to the Firebase console only. There is no problem with downloading via the command gsutil cp gs://<your bucket>/yourfile . (doc), and it also works in the Google Cloud Console Storage browser (direct link).
However, a file uploaded like this indeed cannot be downloaded or previewed in the Firebase console.
After many tries I have realized that this is related to the custom metadata "Access token", which underneath is called firebaseStorageDownloadTokens. It appears automatically if, for example, you upload the file directly in the Firebase console.
Anyway, I noticed that the value of the metadata is irrelevant. According to my test, if you change your createWriteStream method parameters to:
let destFile = bucket.file(photoId)
  .createWriteStream({
    metadata: {
      contentType: myFile.headers['content-type'],
      metadata: {
        firebaseStorageDownloadTokens: 'canBeAnyThing'
      }
    }
  });
The problem disappears...
A file already uploaded to Firebase Storage and affected by the issue can be fixed by adding the same metadata. In the screenshot you have provided you can see "File Location"; if you open it you will see the link "Create new access token", or you can add it manually in the GCP Storage browser via "Edit metadata" for that object (remember to refresh the Firebase console in your browser).
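A programmatic alternative (a sketch of mine, not from the original answer, assuming the same bucket and photoId used in the question and the Cloud Storage Node.js client) is to add the token to an already-uploaded object with setMetadata:

// Adds the download token to an object that was uploaded without it.
await bucket.file(photoId).setMetadata({
  metadata: {
    firebaseStorageDownloadTokens: 'canBeAnyThing'
  }
});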
At this point we may wonder why it works like this. I have found interesting information on GitHub here.

firebase cloud functions onFinalize trigger with not predefined bucket?

Is it possible to write a Firebase Cloud Function that triggers when a new file is created in Firebase Storage - onFinalize (but we don't know the exact bucket in advance)?
Inside my Firebase Storage I have a folder 'loads' and inside I have folders named with a load id, like:
/loads/-Lw1UySdunYyFMrrV6tl
/loads/-LwisA5hkNl_uxw3k36f
/loads/-LwkPUm-q7wNv-wZ49Un
https://ibb.co/NnQkTyC here's a screenshot of storage
I want to trigger a Cloud Function when a new file has been created inside one of these folders. I don't know in advance where the file will be created. I don't know if this is even possible. That's why I need advice.
My main goal is to merge 2 PDF files into one within Cloud Functions. In my app (a TMS written with Vue.js), on the frontend I create confirmationOrder.pdf using html2canvas/jsPDF and then save it to storage/loads/${loadID}. Later on, a user can manually upload POD.pdf to the same bucket. When that happens I want my Cloud Function to merge these two PDFs into one new file (in the same storage bucket). But again, I don't know the bucket in advance.
Here's how I upload PDFs in frontend:
async uploadConfPDF({ commit }, payload) {
  const PDF = payload.PDF;
  const loadID = payload.loadID;
  const fileName = payload.fileName;
  const fileData = await firebase
    .storage()
    .ref(`loads/${loadID}/${fileName}.pdf`)
    .put(PDF);
  const confOrderURL = await fileData.ref.getDownloadURL();
  return confOrderURL;
},
Any help is highly appreciated. Sorry if my explanation doesn't seem clear enough; English is not my native language.
EDIT FOLLOWING YOUR QUESTION RE-WORKING
Based on your code and on the screenshot of your Cloud Storage console, you are working in the default bucket of your project, whose location URL is gs://alex-logistics.appspot.com.
As we can see in the screenshot of your Cloud Storage console, the files in your bucket are presented in a hierarchical structure, just like the file system on your local hard disk. However, this is just a way of presenting the files: there aren't genuine folders/directories in a bucket; the Cloud Storage console just uses the different parts of the file paths to "simulate" a folder structure.
So, based on the above paragraphs, I think your question can be re-phrased as "In a Cloud Function, how can I extract the different parts of the path of a file that is uploaded to the default bucket of my Firebase project?".
Answer:
In a Cloud Function that is triggered when a file is added to the default bucket, you can get the file path as follows:
exports.yourCloudFunction = functions.storage.object().onFinalize(async (object) => {
  const filePath = object.name; // File path in the bucket.
  //...
});
Since we use an onFinalize event handler, you get the path of this new file through the name property of the object parameter, which is of type ObjectMetadata.
You can then use some String methods to, for example, extract from this path the ${loadID} you refer to in your question, as sketched below.
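Here is a minimal sketch of that extraction (the function name and the assumption that files live under loads/${loadID}/ are mine, not part of the answer):

exports.onLoadFileUploaded = functions.storage.object().onFinalize(async (object) => {
  const filePath = object.name;      // e.g. "loads/-Lw1UySdunYyFMrrV6tl/POD.pdf"
  const parts = filePath.split('/'); // ["loads", "-Lw1UySdunYyFMrrV6tl", "POD.pdf"]

  if (parts[0] !== 'loads' || parts.length < 3) {
    return null;                     // ignore files outside the loads/ "folders"
  }

  const loadID = parts[1];
  const fileName = parts[2];
  console.log(`New file ${fileName} for load ${loadID}`);
  // ...download the two PDFs for this loadID, merge them and upload the result...
  return null;
});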

Share uploaded files with specific user using google Drive API

I am trying to share an uploaded file using the Google Drive API. I am following this tutorial.
Uploading files/folders and listing Drive files are working fine, but I want to add share functionality so that a file can be shared with a specific user.
I didn't find any solution. How do I share a file?
You want to share files in Google Drive using the Drive API with JavaScript.
You have already been able to get and put files on Google Drive using the Drive API.
If my understanding is correct, how about this answer? Please think of this as just one of several answers.
In order to give permissions for the file, the method "Permissions: create" in the Drive API is used.
Sample script:
In this sample script, I prepared the function createPermissions for your situation.
function createPermissions(fileId, body) {
  gapi.client.load('drive', 'v3', function() {
    gapi.client.drive.permissions.create({
      fileId: fileId,
      resource: body,
    })
    .then(function(res) {
      console.log(res);
      // do something
    })
    .catch(function(err) {
      console.log(err);
      // do something
    });
  });
}
When you use this function, please use it as follows.
const fileId = "###"; // Please set the file ID.
const body = {
  role: "writer",
  type: "user",
  emailAddress: "###" // Please set the email address of the user you want to share the file with.
};
createPermissions(fileId, body);
When you use this script, please prepare the file ID of the file you want to share.
In the above script, the file is shared as a writer with the user (type "user") who has the email address of ###. For detailed information about the parameters, please check the official document.
Note:
Unfortunately, in your actual situation, I'm not sure where you want to use the above sample script in your script. So please add it to your script.
Reference:
Permissions: create
If I misunderstood your question and this was not the direction you want, I apologize.

AWS Transfer Acceleration with pre-signed URLs using JavaScript SDK

Simply, is it possible to use transfer acceleration (TA) with pre-signed URLs generated using the AWS-SDK for JavaScript?
Turning on TA for a specific S3 bucket gives a URL with the format {bucket}.s3-accelerate.amazonaws.com. However, when specifying the parameters for a request, the only valid options seem to be {Bucket: 'bucket', Key: 'key', Body: 'body', Expires: 60}, and there doesn't seem to be a way to say I want to use TA. The resulting URL is in the usual format {bucket}.s3-{region}.amazonaws.com, which is wrong for TA.
The documentation does not seem to offer much information with regards to pre-signed URLs.
Yes, but this is still undocumented and nowhere to be found in their docs or anywhere else (up until now :) ). We got it working by searching the source code of the SDK. You need to load S3 like this:
var s3 = new AWS.S3({useAccelerateEndpoint: true});
Then the SDK will use the accelerated endpoint.
As it happens, there is a documented way of enabling the S3 transfer acceleration feature in the AWS SDK for JavaScript. It can be done by specifying the same property mentioned by @Luc Hendriks, but in the AWS.Config class, as follows:
AWS.config.update({
  useAccelerateEndpoint: true
});
var s3 = new AWS.S3();
Documentation reference: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Config.html
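To tie this back to the pre-signed URL part of the question, here is a minimal sketch (the bucket and key names are placeholders of mine): once useAccelerateEndpoint is set, getSignedUrl produces URLs on the accelerate endpoint.

AWS.config.update({ useAccelerateEndpoint: true });
var s3 = new AWS.S3();

var params = { Bucket: 'my-bucket', Key: 'my-key', Expires: 60 };
s3.getSignedUrl('putObject', params, function (err, url) {
  if (err) return console.error(err);
  console.log(url); // should point at my-bucket.s3-accelerate.amazonaws.com
});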

Chrome Extension access Files

I want my Chrome extension to access a SQLite file which is part of the extension. I looked up the chrome.storage API, but it doesn't really help me! And my JavaScript knowledge isn't enough to access a file and read its contents.
Also, is it possible to start my extension when a specific file type is loaded?
This is how I read a JSON file named env.json in the root of the extension directory.
Add the following configuration to the manifest.json. You need to add your SQLite file to this configuration.
"web_accessible_resources": [
"*.json"
],
I defined a function to load a JSON file. You will need to tweak the response handling, because the SQLite file is binary data rather than text (a binary variant is sketched after the usage example below).
// The path is relative to the extension directory.
let loadData = async path => {
  let url = chrome.runtime.getURL(path);
  let resp = await fetch(url);
  let json = await resp.json();
  return json;
};
Usage example
// Use the above function to load extension file
let conf = await loadData('env.json')
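For the SQLite file itself, a minimal binary variant could look like this (the helper name and the .sqlite file name are assumptions of mine, not from the answer):

// Reads an extension file as raw bytes instead of JSON.
let loadBinary = async path => {
  let url = chrome.runtime.getURL(path);
  let resp = await fetch(url);
  return await resp.arrayBuffer(); // e.g. pass these bytes to a SQLite library
};

// Usage example
let dbBytes = await loadBinary('mydata.sqlite');

Remember to list the .sqlite file (or a matching pattern) in web_accessible_resources as well, as described above.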
