Google Storage Transfer Service with Twilio - JavaScript

I am working with the Google Cloud Storage Transfer Service to transfer data from a Twilio recording URL to a Google Cloud bucket. While implementing this I learned that to transfer a file from a URL you must have an MD5 hash of that object.
Twilio doesn't provide me with the MD5 hash value. Is it possible to do the above, and is there any other way to transfer the content of a URL directly to a Google Cloud bucket?
Also, I don't want to tie up my server for a long time; it should be quick, something I can schedule and track, or with some kind of callback when it completes.

Twilio developer evangelist here.
It looks like whatever you do, you're going to need to download the recording files to your own server at some point during this process.
You could loop through the files, download each one and generate its MD5 hash, then discard the file while building up the TSV of URLs and hashes as you go.
But, if you do that, you've done half the work in downloading the file, so you might as well continue to upload the file to Google Cloud Storage from that point, using gsutil or the JSON API.
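If you do continue the upload yourself, a minimal Node.js sketch might look like the following. It assumes the @google-cloud/storage client library; the bucket name and destination path are placeholders, not anything from the question:

```js
const crypto = require('crypto');
const https = require('https');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('my-recordings-bucket'); // hypothetical bucket name

// Download a Twilio recording over HTTPS, hash it, and upload it straight to GCS.
function transferRecording(recordingUrl, destName) {
  return new Promise((resolve, reject) => {
    https.get(recordingUrl, (res) => {
      const hash = crypto.createHash('md5');
      const chunks = [];
      res.on('data', (chunk) => {
        hash.update(chunk);
        chunks.push(chunk);
      });
      res.on('end', () => {
        const md5 = hash.digest('base64'); // MD5 as base64, the form GCS reports
        bucket.file(destName)
          .save(Buffer.concat(chunks))
          .then(() => resolve(md5))
          .catch(reject);
      });
      res.on('error', reject);
    }).on('error', reject);
  });
}
```

Since the file is already in memory at that point, the MD5 is essentially a free by-product of the upload rather than a separate pass.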

Related

Generate expiring download link for newly purchased item

I am building a simple static website selling a single PDF file using the Stripe Checkout API.
I would like to be able to generate an expiring download link after the customer has successfully purchased the PDF.
I am really not sure how to do this, but I was thinking about using Firebase to store the file in a bucket and somehow use a Cloud Function to generate a temporary download link that expires after some time, but I am not sure how to go about this (or if it is even the best solution).
Can anyone give me some help about which direction to go here?
Firebase's download URLs don't expire, although you can revoke them from the Firebase console.
But a better fit for your use-case might be to use Cloud Storage's signed URLs, which have a built-in expiration attribute. You can generate these with the Cloud Storage SDKs or Firebase's Admin SDKs, all of which should only be run in trusted environments - such as your development machine, a server you control, or Cloud Functions.
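For example, a minimal Cloud Function-style sketch using the Firebase Admin SDK might look like this. It assumes a default storage bucket is configured and that the runtime has credentials able to sign URLs; the file path and expiry are placeholders:

```js
const admin = require('firebase-admin');

admin.initializeApp(); // assumes a default storage bucket is configured

// Generate a download link that expires 15 minutes from now.
// Run this only in a trusted environment (e.g. a Cloud Function that your
// Stripe webhook calls after a successful purchase).
async function getExpiringDownloadLink() {
  const file = admin.storage().bucket().file('downloads/my-ebook.pdf'); // hypothetical path
  const [url] = await file.getSignedUrl({
    action: 'read',
    expires: Date.now() + 15 * 60 * 1000,
  });
  return url;
}
```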
Also see:
A guide to Firebase Storage download URLs and tokens.
Get Download URL from file uploaded with Cloud Functions for Firebase

Javascript storage in offline application

I'm working on an offline application in JavaScript that will convert a <div> to a <canvas> and save it as an image to a location on the local disk.
I'd prefer if the saving has no dialog and to save to the same location (set in configuration) every time.
I'm still going through the documentation on Cache API but there doesn't seem to be any answers in regards to setting absolute paths. All the examples show relative paths. I'd like to set it to something like C:/Users/Work/Presentation/file1.jpg, and have it overwrite itself every save.
Is this possible with the Cache API, or is there another offline storage API (localStorage won't work because it only stores key-value pairs) that would better suit this use case? Is there a library that already exists to make this implementation easier?
Why not save the image data as a base64-encoded image or a Blob in localStorage, and handle the encoding/decoding in your code?
You can try these packages:
https://www.npmjs.com/package/base64-img
https://www.npmjs.com/package/blob-util
And, Yes. You should change your Username. ;)
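A minimal sketch of the base64-in-localStorage idea, assuming the <canvas> already exists; the storage key is just an example:

```js
// Render the <canvas> to a base64 data URL and persist it in localStorage
// (localStorage only stores strings, so a data URL fits).
const canvas = document.querySelector('canvas');
const dataUrl = canvas.toDataURL('image/jpeg'); // "data:image/jpeg;base64,..."
localStorage.setItem('presentation-file1', dataUrl); // hypothetical key

// Later, restore it (e.g. into an <img>) without any file dialog.
const saved = localStorage.getItem('presentation-file1');
if (saved) {
  const img = new Image();
  img.src = saved;
  document.body.appendChild(img);
}
```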
Have you tried PouchDB? From their home page:
PouchDB was created to help web developers build applications that work as well offline as they do online. It enables applications to store data locally while offline, then synchronize it with CouchDB and compatible servers when the application is back online, keeping the user's data in sync no matter where they next login.
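If you go the PouchDB route, a minimal sketch of storing the exported image locally and syncing it later might look like this; the database name, attachment id, and replication URL are placeholders, and imageBlob is assumed to be a Blob produced from the canvas:

```js
// Assumes PouchDB is loaded and `imageBlob` is a Blob produced from the canvas.
const db = new PouchDB('presentation-images'); // hypothetical database name

db.put({
  _id: 'file1.jpg',
  createdAt: new Date().toISOString(),
  _attachments: {
    'file1.jpg': { content_type: 'image/jpeg', data: imageBlob },
  },
}).then(() => {
  // When back online, replicate to a CouchDB-compatible server (placeholder URL).
  return db.replicate.to('https://example.com/db/presentation-images');
}).catch(console.error);
```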

Allow full synchronized filesystem read/write access over a subdirectory of an S3 bucket?

I have a web service involving the attachment of multiple documents to one of many "objects". To simplify this process and make it easier to edit each file individually if the user so desires, I want the user to be able to synchronise all of these files onto a directory on his/her computer (much like that of Google Drive or Dropbox). If the user were to change one of these files, remove a file, or add a file, this would reflect on my web service, and hence affect the files that are attached to this "object".
What would be the best choice of services to do this? I am currently using a Node.js back-end, although I suspect this will do little to influence the choice of storage. I'm looking for a service that allows the user the flexibility of full filesystem-level CRUD, whilst synchronising the files in a secure manner to a subset of a larger object storage collection, much like synchronising (and only providing access to) a subdirectory of an AWS S3 bucket, hence the title of this question.
I'm currently looking into (somehow) doing this with AWS S3, although I am open to using another storage service.
Thanks in advance.
AWS Storage Gateway (cached gateway) lets you edit files locally, and the gateway automatically synchronizes the updates over to S3.
You will need to install a small VM on the client machine. Typically, if your clients have a private data centre or server, this configuration allows a shared folder (or a NAS) to be synchronized with S3.

Upload blob to dropbox from client-side javascript

I have an app that runs in the client browser and doesn't have any server side (HTTP/JS is served, but nothing posts to the server). The app is redeployed on many servers (IIS, Apache, Nginx, sometimes localhost, sometimes on an intranet) and is served over HTTP (not HTTPS). My app generates files such as zips and PDFs in the client's browser as blobs BEFORE I want to save them, so having users navigate away from the page and back again defeats the purpose; I can't post the generated data to Dropbox any more, since they would have to start over. I want to be able to send these blobs directly to files in the end user's Dropbox (and later Google Drive).
https://www.dropbox.com/developers-v1/dropins/saver performs exactly as I would like. It pops up. It lets the user authenticate in the popup. It lets the user choose where they want to put my file. But I can't send it a data URI, or base64-encoded data, or a byte array, or anything else: it only works with files previously saved somewhere accessible on the net. So it does not work for me.
https://www.newfangled.com/direct-javascript-dropbox-api-usage/ shows how I could embed the OAuth data, which I don't have.
https://blogs.dropbox.com/developers/2013/12/writing-a-file-with-the-dropbox-javascript-sdk/ seems like it should work, except that it tries to perform an OAuth session and it uses the same window as my app (which is undesired).
These are the tabs I'm currently looking at (they include entries from a few years ago, so things might have since changed). Some articles indicate that it isn't possible. Other articles indicate that it IS possible. This particular comment, https://github.com/dropbox/dropbox-js/issues/144#, doesn't help me much, and neither does "I'll be sure to pass this along as feedback" - was it passed along? To whom?
https://github.com/dropbox/dropbox-js/issues/144
https://stackoverflow.com/questions/30094403/save-input-text-to-dropbox
https://blogs.dropbox.com/developers/2015/06/programmatically-saving-a-url-to-dropbox/
How can I upload files to dropbox using JavaScript?
upload file to dropBox using /files_put javascript
https://github.com/morrishopkins/DropBox-Uploader/blob/master/js/reader.js
https://www.dropbox.com/developers/saver
https://www.dropboxforum.com/hc/en-us/community/posts/202339309-Can-I-save-a-JSON-stream-object-to-Dropbox-file-with-Dropbox-Post-Rest-API-
https://github.com/smarx/othw
Can Dropbox Saver accept data from createObjectURL()?
It sounds like the code from https://blogs.dropbox.com/developers/2013/12/writing-a-file-with-the-dropbox-javascript-sdk/ works fine for you, but you want to do the auth in a separate browser window/tab.
If so, I'd suggest just changing that code to use the Popup auth driver instead.
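With the old dropbox-js library used in that blog post, the change might look roughly like this; the app key, receiver page URL, file name, and the generatedBlob variable are placeholders for whatever your app already has:

```js
// Hypothetical app key, receiver page, and file name.
var client = new Dropbox.Client({ key: 'YOUR_APP_KEY' });

// Use the Popup auth driver so OAuth happens in a separate window and the
// app page (with its already-generated blobs) never navigates away.
client.authDriver(new Dropbox.AuthDriver.Popup({
  receiverUrl: 'https://example.com/oauth_receiver.html'
}));

client.authenticate(function (authError, authedClient) {
  if (authError) { return console.error(authError); }
  // `generatedBlob` is the zip/pdf Blob the app already produced in memory.
  authedClient.writeFile('export.zip', generatedBlob, function (writeError, stat) {
    if (writeError) { return console.error(writeError); }
    console.log('Saved to Dropbox as ' + stat.path);
  });
});
```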

indexedDb backup in the cloud

I would like my client web app to be able to back up its IndexedDB database.
I do not have a web server.
I found this: Exporting and importing IndexedDB data
which says: "you can call an export callback passing the privileged array of objects representing a backup of your object store"
So I have an array of objects in JavaScript which is my backup...
How can I:
1. turn it into a file?
2. back it up in the cloud?
Thank you for your help.
Michael (Belgium)
NB: I would love to use the Dropbox API; I found this http://code.google.com/p/dropbox-js/source/browse/#svn/trunk but don't know if it works (I will try it and let you know).
NB: I am also planning to use pokki.com to deploy the app on clients' desktops.
From https://developer.mozilla.org/en-US/docs/IndexedDB/Basic_Concepts_Behind_IndexedDB
Synchronizing. The API is not designed to take care of synchronizing with a server-side database. You have to write code that synchronizes a client-side indexedDB database with a server-side database.
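For step 1, a minimal sketch: serialize the exported array to JSON, wrap it in a Blob, and offer it to the user as a download (the file name is arbitrary). For step 2 you would upload that same Blob to whichever cloud API you choose, such as Dropbox.

```js
// `backup` is the array of objects exported from the IndexedDB object store.
function downloadBackup(backup) {
  const blob = new Blob([JSON.stringify(backup)], { type: 'application/json' });
  const url = URL.createObjectURL(blob);

  // Offer the backup to the user as a plain file download (hypothetical name).
  const a = document.createElement('a');
  a.href = url;
  a.download = 'indexeddb-backup.json';
  document.body.appendChild(a);
  a.click();
  a.remove();
  URL.revokeObjectURL(url);
}
```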
