I am building a simple static website selling a single PDF file using the Stripe Checkout API.
I would like to be able to generate an expiring download link after the customer has successfully purchased the PDF.
I'm really not sure how to do this, but I was thinking of using Firebase to store the file in a bucket and somehow use a Cloud Function to generate a temporary download link that expires after some time. I'm not sure how to go about this (or whether this is even the best solution).
Can anyone point me in the right direction here?
Firebase's download URLs don't expire, although you can revoke them from the Firebase console.
But a better fit for your use-case might be to use Cloud Storage's signed URLs, which have a built-in expiration attribute. You can generate these with the Cloud Storage SDKs or Firebase's Admin SDKs, all of which should only be run in trusted environments - such as your development machine, a server you control, or Cloud Functions.
Also see:
A guide to Firebase Storage download URLs and tokens.
Get Download URL from file uploaded with Cloud Functions for Firebase
Related
I am trying to make a web-based audio player, and I use Firebase Storage to keep the audio files. I'm having trouble because Firebase doesn't give me a plain HTTP link. Can anyone explain how to use a Firebase Storage download URL or gs:// path for a web-based audio player?
Firebase Storage provides a download URL for your files. You can manually get the download URL by clicking on the file, then opening the File Location tab on the right, then clicking on Download URL.
You can get the download URL from your JavaScript with getDownloadURL(). Or you can get the download URL from Firebase Cloud Functions with file.getSignedUrl(). Note that getSignedUrl() doesn't work until you set the Storage Object Creator role for your appspot service account on the IAM & admin permissions page in the Google Cloud Platform console.
It's a two-step process. First, you upload the audio file.
Second, use a Cloud Functions .onCreate trigger to get the info about the newly added audio file and write it to Firestore (the database). URL shortening can be automated inside this Cloud Function.
End state is that you have the audio file stored in Cloud Storage and the info about the audio file stored in Firestore.
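As a sketch of that flow, the pure helper below builds the Firestore document an .onCreate trigger might write for each uploaded file. `objectToDoc` and the `audio` collection are made-up names; the trigger itself (commented out, assuming the v1 firebase-functions API) only runs inside Cloud Functions.

```javascript
// Build the Firestore document for a newly uploaded Storage object.
// The fields mirror the object metadata Cloud Functions hands the trigger.
function objectToDoc(object) {
  return {
    name: object.name,                 // e.g. "audio/track01.mp3"
    contentType: object.contentType,   // e.g. "audio/mpeg"
    size: Number(object.size),         // metadata reports size as a string
    gsPath: `gs://${object.bucket}/${object.name}`,
    uploadedAt: object.timeCreated,
  };
}

// Inside Cloud Functions the trigger would look roughly like (not run here):
// exports.onAudioUpload = functions.storage.object().onCreate(async (object) => {
//   const [url] = await admin.storage().bucket(object.bucket)
//     .file(object.name)
//     .getSignedUrl({ action: 'read', expires: '2100-01-01' });
//   await admin.firestore().collection('audio')
//     .add({ ...objectToDoc(object), url });
// });
```

The web player then only queries Firestore for the stored URL, rather than touching Storage directly.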
I have a web service involving the attachment of multiple documents to one of many "objects". To simplify this process and make it easier to edit each file individually if the user so desires, I want the user to be able to synchronise all of these files onto a directory on his/her computer (much like that of Google Drive or Dropbox). If the user were to change one of these files, remove a file, or add a file, this would reflect on my web service, and hence affect the files that are attached to this "object".
What would be the best choice of services to do this? I am currently using a Node.js back-end, although I suspect this will do little to influence the choice of storage. I'm looking for a service that allows the user the flexibility of full filesystem-level CRUD, whilst synchronising the files in a secure manner to a subset of a larger object storage collection, much like synchronising (and only providing access to) a subdirectory of an AWS S3 bucket, hence the title of this question.
I'm currently looking into (somehow) doing this with AWS S3, although I am open to using another storage service.
Thanks in advance.
AWS Storage Gateway (cached gateway) allows you to edit files locally, and the Gateway will automatically synchronize the updates to S3.
You will need to install a small VM on the client machine. Typically, if your clients have a private data centre or server, this configuration allows a shared folder (or a NAS) to be synchronized with S3.
I am working with the Google Cloud Storage Transfer Service to transfer data from Twilio recording URLs to a Google Cloud bucket. While implementing this, I learned that to transfer a file from a URL you must have the MD5 hash of that object.
Twilio doesn't provide the MD5 hash value. Is it possible to do this anyway, and is there any other way to transfer the contents of a URL directly to a Google Cloud bucket?
Also, I don't want to tie up my server for a long time; it should be quick, ideally scheduled so that I can track it, or with some kind of callback when it completes.
Twilio developer evangelist here.
It looks like whatever you do, you're going to need to download the recording files to your own server at some point during this process.
You could loop through the files, download them, and generate the MD5 hash for each one, discarding each file but building up the TSV of URLs and hashes as you go.
But, if you do that, you've done half the work in downloading the file, so you might as well continue to upload the file to Google Cloud Storage from that point, using gsutil or the JSON API.
I just set up a simple JS photo uploader on my site. It uploads photos directly to a Google Cloud Storage bucket.
I used the code from their official JavaScript library
But if you look at that code, you'll see that it requires authentication. I'm authenticated and able to upload photos, but I want everyone to be able to just upload their files without signing in to their Google Accounts.
Is that possible?
You could use a POST policy doc: https://cloud.google.com/storage/docs/xml-api/post-object#policydocument
This will let you build a signed request with various constraints built in (key prefix, content length range, etc.), allowing unauthenticated users to upload to your bucket within the given constraints.
I'm trying to learn the Dropbox API, and I started by creating a Drop-in Chooser app. I created the app and it works, but before choosing a file, the user needs to log in to Dropbox. I want to tie my app to my own account, so that every user who opens my app can choose files from my Dropbox account without logging in to Dropbox themselves. I hope you understand what I mean...
The Drop-ins are part of the Dropbox web site, and are built to show the user their own accounts, so it's not possible to use the Drop-ins with a single pre-defined account for all users.
The Dropbox Core API was also designed with the intention that each user would link their own Dropbox account, in order to interact with their own files. However, it is technically possible to connect to just one account. The SDKs don't offer explicit support for it and we don't recommend doing so, for various technical and security reasons.
However if you did want to go this route, instead of kicking off the authorization flow, you would manually use an existing access token for your app. (Just be careful not to revoke it, e.g. via https://www.dropbox.com/account/security .)
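A minimal sketch of what that looks like as a raw API v2 call, using the real `/2/files/list_folder` endpoint; `ACCESS_TOKEN` is a placeholder for the token generated once from your app's settings page (keep it server-side, since anyone holding it has access to the whole account).

```javascript
// Build the request for POST https://api.dropboxapi.com/2/files/list_folder,
// authenticating with a fixed, pre-generated access token instead of running
// the OAuth flow. Pass the result to https.request (or fetch) to execute it.
function listFolderRequest(accessToken, path) {
  return {
    hostname: 'api.dropboxapi.com',
    path: '/2/files/list_folder',
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ path }), // e.g. "/shared" inside your own Dropbox
  };
}
```

Every visitor's request then runs under your single token, which is exactly why the answer above recommends caution: there is no per-user isolation at all.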