To stream file data directly from Dropbox to webpage - javascript

I want to stream data from my Dropbox to a webpage in real time, but I don't know how to do it.

It's usually a bad idea, because Dropbox can throttle your download speed or stop sharing a file when it's accessed from many locations.
You can install Dropbox on your server and sync a folder with your Dropbox account:
https://www.dropbox.com/install
Streaming from a local folder is a much easier task.
But if you really need to get files from Dropbox in real time, you can use their API. They have libraries for many languages. For example, here is the PHP one, with a tutorial:
https://www.dropbox.com/developers-v1/core/start/php

Related

Node.js - Pipe a file stream from one server to another

I'm trying to use Node.js to get a file from a remote URL, then send it to another server (using an API provided by each of the two websites). I already managed to successfully upload a local file to the remote server using fs.createReadStream("file.png"). Remote files seem to be a different story, however: I can't simply put "https://website.com/file.png" in there; I need an equivalent of createReadStream for remote files.
Obviously I could use a separate command to download the file locally and upload it using createReadStream then delete the local file, but I want my code to be efficient and not rely on manually downloading temporary files, plus this is a good learning experience. I'd thus like to know the simplest way to pipe files as streams between two different servers.
Also I would like to avoid using extra dependencies if possible, as I'm writing a simple script which I'd rather not make reliant on too many npm packages. I rely on require("https") and require("fs") primarily. I'm curious if this can be achieved through a simple https.get() call.

How to download Image from browser & upload to Amazon S3

How can I allow the client to upload an image in the browser, and then I upload it to Amazon S3? I have been looking around a lot and have found no resources explaining how to do this.
Are there any tutorials that I could follow?
Are there any libraries that I should use for this?
I am using AngularJS on the frontend and Node.js on the backend.
In short, look for two different tutorials. One for uploading from a client to a server, one for uploading from a server to S3.
StackOverflow discourages linking to specific tutorials, but there are lots of them out there, so it shouldn't be too tricky to track down.
For the client-to-server, you'll want to do a basic HTML form upload up to the server, then snag the data. You can temporarily write it to your file system (if you're on Linux, the /tmp directory is a good place to stash it).
After that, just upload from your server to S3. Amazon itself has some good documentation on that. The s3 package for Node also has good examples: https://www.npmjs.com/package/s3
It's also possible to go straight from the browser to S3, which may be better depending on your use case. Check this out: http://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-example-photo-album.html
You're going to need the AWS SDK for Node. They have a pretty comprehensive developer guide. You may have to read up on credential management, too.
The procedure would be as follows:
user uploads the image from the browser to your server (I'd recommend a plain form upload, unless you feel OK with uploading via AJAX)
then your server uses the SDK to save the file to S3
you display info back to the user (a link to the image, upload status?).
You could also use pre-signed POSTs, but that seems more advanced, and I haven't seen much info on it for Node.
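The server-to-S3 step can be sketched roughly like this, assuming the AWS SDK v2 (the 'aws-sdk' npm package) with credentials configured via the environment; the bucket name and key scheme here are placeholders:

```javascript
// Build the parameter object for an S3 upload. Key uses a timestamp
// prefix to avoid collisions between files with the same name.
function buildS3Params(bucket, filename, body, contentType) {
  return {
    Bucket: bucket,
    Key: 'uploads/' + Date.now() + '-' + filename,
    Body: body,
    ContentType: contentType,
    ACL: 'public-read', // only if the image should be publicly readable
  };
}

function uploadToS3(params, callback) {
  // Lazy require, so buildS3Params stays usable without the SDK installed.
  const AWS = require('aws-sdk');
  const s3 = new AWS.S3();
  // s3.upload() switches to multipart uploads for large bodies automatically.
  s3.upload(params, callback);
}
```

The Body can be a Buffer you snagged from the form upload, or a fs.createReadStream of the temporary file in /tmp.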

Uploading large image files and video to Google Cloud Storage

I am using the standard python app engine environment and currently looking at how one goes about uploading multiple large media files to Google Cloud Storage (Public Readable) using App Engine or the Client directly (preferred).
I currently send a batch of smaller images (at most 20 at a time, between 30 and 100 KB on average) directly via a POST to the server. These images are provided by the client and put in my project's default bucket. I handle the request's images using a separate thread, write them one at a time to Cloud Storage, and then associate them with an ndb object. This is all fine and dandy while the images are small and do not cause the request to run out of memory or hit a DeadlineExceededError.
But what is the best approach for large image files of 20 MB+ apiece, or video files of up to 1 GB in size? Are there efficient ways to do this from the client directly? Would this be possible via the JSON API, with a resumable upload, for example? If so, are there any clear examples of how to do this purely in JavaScript on the client? I have looked at the docs, but it's not intuitively obvious, at least to me.
I have been looking at the possibilities for a day or two, but nothing offers a clear, linear description or approach. I notice in the Google docs there is a way to upload via a POST directly from the client using PHP: https://cloud.google.com/appengine/docs/php/googlestorage/user_upload ... Is this relevant only to using PHP on App Engine, or is there an equivalent to createUploadUrl for Python or JavaScript?
Anyway, I'll keep exploring but any pointers would be greatly appreciated.
Cheers
Try the Blobstore API with Cloud Storage, or the Images service.
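On the client-side question: the Cloud Storage JSON API does support resumable uploads from browser JavaScript. A rough sketch of the flow, assuming you already have an OAuth access token and CORS is configured on the bucket (bucket and object names here are placeholders):

```javascript
// Resumable-upload flow against the Cloud Storage JSON API:
// 1) POST to start a session and get a session URI,
// 2) PUT the file in chunks with Content-Range headers.
function contentRangeHeader(offset, chunkSize, total) {
  const end = Math.min(offset + chunkSize, total) - 1;
  return `bytes ${offset}-${end}/${total}`;
}

async function resumableUpload(file, bucket, objectName, token) {
  // Step 1: initiate the session; the Location header is the session URI.
  const start = await fetch(
    `https://www.googleapis.com/upload/storage/v1/b/${bucket}/o` +
      `?uploadType=resumable&name=${encodeURIComponent(objectName)}`,
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        'X-Upload-Content-Type': file.type,
      },
    }
  );
  const sessionUri = start.headers.get('Location');

  // Step 2: send the file in chunks. Chunk sizes must be multiples of
  // 256 KiB, except for the final chunk.
  const chunkSize = 8 * 256 * 1024;
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    await fetch(sessionUri, {
      method: 'PUT',
      headers: {
        'Content-Range': contentRangeHeader(offset, chunkSize, file.size),
      },
      body: file.slice(offset, offset + chunkSize),
    });
  }
}
```

A production version would also check each PUT's status (308 means "resume incomplete") and retry failed chunks, which is the whole point of the resumable protocol.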

Download files from authenticated API with javascript

I have found several threads where the same question has been asked, but I suspect that the top answer in most of them are outdated.
My problem
I have a frontend JavaScript app communicating with an OAuth-authenticated API. This API contains files I want my users to be able to download. Because the API requires authentication, I cannot show the user a regular link to initiate the download.
Instead, I have to send an XHR request to initiate the download (so I can add the necessary authentication header).
In my case, the files will usually be pretty large (>1GB), so keeping them in memory is not a solution.
Reading this article, I'm wondering if it might be possible to stream the file from the API to the filesystem through the Javascript file API. Does anyone have a suggestion on how I might make this work?
Isn't this a pretty common problem in 2016?
It is somewhat hack-ish, but I've used it before and it works wonders.
From Downloading file from ajax result using blob
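The blob approach from that answer, sketched with fetch: request the file with the Authorization header, then hand the resulting Blob to an anchor element. Note this buffers the whole file in memory, which is exactly the "hack-ish" caveat, so it won't suit the multi-gigabyte case:

```javascript
// Pull the suggested filename out of a Content-Disposition header,
// falling back to a default if the header is absent or unparseable.
function filenameFromDisposition(header, fallback) {
  const match = /filename="?([^";]+)"?/.exec(header || '');
  return match ? match[1] : fallback;
}

async function downloadWithAuth(url, token) {
  const res = await fetch(url, {
    headers: { Authorization: 'Bearer ' + token },
  });
  const blob = await res.blob(); // buffers the entire body in memory
  const name = filenameFromDisposition(
    res.headers.get('Content-Disposition'), 'download'
  );
  const a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = name; // triggers a save dialog instead of navigation
  a.click();
  URL.revokeObjectURL(a.href);
}
```

For the >1 GB case, the cleaner alternatives are a short-lived signed download URL issued by the API, so a regular link works, or streaming the response to disk, which plain XHR/fetch of 2016 could not do.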

Writing to a txt file inside Dropbox with processing.js/javascript

I'm creating an application using processing.js and hosting the result via a Dropbox public folder. The idea is to use a .txt file generated by that platform to communicate certain data to a local 3D modelling environment (Rhinoceros/Grasshopper). Is it possible to write to a .txt file hosted in the public folder in Dropbox, directly from the sketch running on the web?
I mean, using: saveStrings("test.txt","this is an example");
The HTML containing the sketch, the .txt file, and the sketch file itself are all stored in the same public folder in Dropbox. You can see the site here: https://dl.dropboxusercontent.com/u/97841548/kinetica%20App/KineticaAppHTML.html
thanks in advance
You can't directly access your local filesystem from a web page like that, and you don't get write access to your Dropbox account just by virtue of the KineticaAppHTML.html page being hosted on Dropbox.
One way to do this however would be to use the Dropbox API. You could either proxy the file writes to a server you control and then have that server make the API calls to Dropbox, or you could use the Dropbox API directly from JavaScript on your KineticaAppHTML.html page.
Dropbox offers a JavaScript SDK you can use:
https://www.dropbox.com/developers/datastore/sdks/js
There's a tutorial here, though it focuses on datastores functionality and not files:
https://www.dropbox.com/developers/datastore/tutorial/js
The basics for authenticating the user are relevant though.
Once authenticated, you can use this method to write new contents to the file in your Dropbox:
https://www.dropbox.com/developers/datastore/docs/js#Dropbox.Client.writeFile
Note, however, that this strategy only results in you yourself, i.e., in your own browser, being able to make Dropbox API calls. If you want other users to use this too, this setup would have them connecting to their own Dropbox accounts. If you need them to connect to only your own account, you'd need to host and use your own actual web app, as mentioned earlier, where you could safely make API calls to your own account. (You could technically avoid this by embedding an access token in your web page, but that is highly discouraged due to the security implications.)
Short answer: no.
Web pages in your browser do not have read/write access to your file system, only to the web.
Long answer: yes, but not the way you describe.
If you have a server running with a RESTful API that you can call by URL (like any other API on the web), then you can use any "ajax" approach to communicate with it, giving it the data it needs to generate those files and making it responsible for putting them in the right Dropbox folder.
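From the sketch's side, that "ajax" call could look something like this. The endpoint URL and payload shape are assumptions; your server would receive the POST and write test.txt into the Dropbox folder (or call the Dropbox API) itself:

```javascript
// Serialize the lines you would have passed to saveStrings() into a
// JSON payload for a hypothetical server-side file-writing endpoint.
function buildPayload(filename, lines) {
  return JSON.stringify({ filename: filename, contents: lines.join('\n') });
}

function saveToServer(filename, lines) {
  const xhr = new XMLHttpRequest();
  xhr.open('POST', 'https://your-server.example.com/api/files'); // placeholder URL
  xhr.setRequestHeader('Content-Type', 'application/json');
  xhr.send(buildPayload(filename, lines));
}

// In the sketch, instead of saveStrings("test.txt", "this is an example"):
// saveToServer('test.txt', ['this is an example']);
```

Grasshopper would then poll the file from wherever the server put it, instead of the Dropbox public URL.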
