Resumable upload from client JavaScript? - javascript

I'm trying to understand whether there is currently any way to do resumable uploads (for example to a Google Cloud Storage bucket) from a web client. Looking at FileReader, it does not look possible (for big files). Am I missing something?
https://developer.mozilla.org/en-US/docs/Web/API/FileReader

You may want to check the official Cloud Storage documentation for resumable uploads, either for the JSON API or the XML API. You'll basically need to request a resumable session URI from Storage in a first HTTP request and then actually upload the file to that URI in a second request, for example via jQuery's ajax method.
You'll see that you need to authenticate the request for the resumable session URI with a bearer token. As explained in this SO answer:
You'll either need to have your customers use their own Google credentials (unusual, but makes sense for a third party tool for managing someone else's Google Cloud resources) or use some form of signed URL or similar feature.
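For illustration, here is a minimal, untested sketch of that two-request flow against the JSON API using fetch. BUCKET is a placeholder, and accessToken must come from whichever auth approach you settle on above; note that reading the Location header cross-origin only works if Storage exposes it to your origin.

// Untested sketch: request a resumable session URI (first request), then
// upload the file to it (second request).
async function startResumableSession(file, accessToken) {
  const res = await fetch(
    'https://storage.googleapis.com/upload/storage/v1/b/BUCKET/o' +
      '?uploadType=resumable&name=' + encodeURIComponent(file.name),
    {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer ' + accessToken,
        'X-Upload-Content-Type': file.type,
        'X-Upload-Content-Length': String(file.size),
      },
    }
  );
  // Storage returns the resumable session URI in the Location header.
  return res.headers.get('Location');
}

async function uploadResumable(file, accessToken) {
  const sessionUri = await startResumableSession(file, accessToken);
  // If this second request fails partway, the same session URI can be
  // queried for progress and the upload resumed where it left off.
  return fetch(sessionUri, { method: 'PUT', body: file });
}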

The part I had missed in the documentation: there is a "slice" method that can be used here, but it lives on the File object (inherited from Blob), not on FileReader. See for example "Reading local files in JavaScript - HTML5 Rocks", https://www.html5rocks.com/en/tutorials/file/dndfiles/
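To make the slice part concrete, here is an untested sketch of pushing one chunk at a time to an existing resumable session URI; sessionUri comes from the first request above, and CHUNK_SIZE is an assumption (Cloud Storage wants chunk sizes in multiples of 256 KiB).

// Untested sketch: upload a File in chunks via File.slice (inherited from
// Blob). The Content-Range header tells Storage which bytes these are.
const CHUNK_SIZE = 8 * 1024 * 1024; // assumption: 8 MiB, a multiple of 256 KiB

async function uploadInChunks(sessionUri, file) {
  for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
    const chunk = file.slice(offset, offset + CHUNK_SIZE);
    const end = offset + chunk.size - 1;
    const res = await fetch(sessionUri, {
      method: 'PUT',
      headers: { 'Content-Range': `bytes ${offset}-${end}/${file.size}` },
      body: chunk,
    });
    // 308 ("resume incomplete") means Storage expects more chunks;
    // 200/201 means the object has been finalized.
    if (res.status !== 308 && !res.ok) {
      throw new Error('Chunk upload failed: ' + res.status);
    }
  }
}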

Related

How to set body for file upload from javascript to Zoho Creator

I need to upload an image file from JavaScript to Zoho Creator using the Zoho Creator REST API, which is documented here (https://www.zoho.com/creator/help/api/rest-api/rest-api-upload-file.html). But the documentation lists no parameter for attaching the file. From my research, it seems the file has to be sent in the request body, so I need to know how to construct that body and upload the file to Zoho Creator. Please help me! I am using the following lines to capture the image, and it downloads to my local disk fine, but now I need to upload it to Zoho Creator instead:
var strmime = "image/jpeg";
var dataUrl = renderer.domElement.toDataURL(strmime, 0.75); // JPEG-encoded data URL of the canvas
Please look into the jQuery documentation for the $.post method, or the documentation of your respective XHR library. There you should find the information you need on how to add the required headers for this API.
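For illustration only (I have no Zoho account to test against), the body could be built by turning the data URL into a Blob and sending it as multipart/form-data; the URL, the auth header, and the "file" field name below are placeholders, so check Zoho's documentation for the real names.

// Rough, untested sketch: convert the canvas data URL to a Blob and POST
// it as multipart/form-data. uploadUrl, authToken, and the "file" field
// name are placeholders, not Zoho's actual parameters.
async function uploadCanvasImage(dataUrl, uploadUrl, authToken) {
  const blob = await (await fetch(dataUrl)).blob(); // decode the data URL
  const form = new FormData();
  form.append('file', blob, 'capture.jpg');
  return fetch(uploadUrl, {
    method: 'POST',
    headers: { 'Authorization': authToken }, // whatever header the API requires
    body: form, // the browser sets the multipart boundary itself
  });
}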
It would also be possible to tunnel the request through your own PHP server as a proxy, as described in the comments above. That way you might be able to work around the problem, since server-to-server requests are not subject to CORS.
But as I already said: it is technically impossible to get rid of the CORS error from your JavaScript code alone. This is a security feature implemented in all relevant browsers. The developer of the API, Zoho, must grant your domain awesomesimple.96.lt CORS access to their API, otherwise it will not work from the browser directly. Please contact their technical support in that case.

Download files from authenticated API with javascript

I have found several threads where the same question has been asked, but I suspect that the top answers in most of them are outdated.
My problem
I have a frontend JavaScript app communicating with an OAuth-authenticated API. This API contains files I want my users to be able to download. Because the API requires authentication, I cannot show the user a regular link to initiate the download.
Instead, I have to send an XHR request to initiate the download (so I can add the necessary authentication header).
In my case, the files will usually be pretty large (>1GB), so keeping them in memory is not a solution.
Reading this article, I'm wondering if it might be possible to stream the file from the API to the filesystem through the JavaScript File API. Does anyone have a suggestion on how I might make this work?
Isn't this a pretty common problem in 2016?
It is somewhat hack-ish, but I've used it before and it works wonders.
From Downloading file from ajax result using blob
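For reference, the trick from that answer boils down to something like this untested sketch; note that it buffers the whole response in memory, so on its own it won't help with the >1 GB files from the question.

// Untested sketch: fetch with the auth header, wrap the response in an
// object URL, and click a temporary <a download> element to save it.
async function downloadAuthenticated(url, token, filename) {
  const res = await fetch(url, {
    headers: { 'Authorization': 'Bearer ' + token },
  });
  const blobUrl = URL.createObjectURL(await res.blob()); // whole file in memory
  const a = document.createElement('a');
  a.href = blobUrl;
  a.download = filename;
  document.body.appendChild(a);
  a.click();
  a.remove();
  URL.revokeObjectURL(blobUrl);
}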

Upload from Client Browser to Google Cloud Storage Using JavaScript

I am using Google Cloud Storage. To upload to Cloud Storage I have looked at different methods. The most common method I've found is that the file is sent to the server, and from there it is sent to Google Cloud Storage.
I want to move the file directly from the user's web browser to Google Cloud Storage. I can't find any tutorials related to this. I have read through the Google API Client SDK for JavaScript.
Going through the Google API reference, it states that files can be transferred using an HTTP request. But I am confused about how to do that using the API client library for JavaScript.
I know people here will ask me to share some code, but I haven't written any yet; I have failed to find a method for doing this job.
EDIT 1: Untested Sample Code
So I got really interested in this, and had a few minutes to throw some code together. I decided to build a tiny Express server to get the access token, but still do the upload from the client. I used fetch to do the upload instead of the client library.
I don't have a Google cloud account, and thus have not tested this, so I can't confirm that it works, but I can't see why it shouldn't. Code is on my GitHub here.
Please read through it and make the necessary changes before attempting to run it. Most notably, you need to specify the location of the private key file, as well as ensure that it's there, and you need to set the bucket name in index.html.
End of edit 1
Disclaimer: I've only ever used the Node.js Google client library for sending emails, but I think I have a basic grasp of Google's APIs.
In order to use any Google service, we need access tokens to verify our identity; however, since we are looking to allow any user to upload to our own Cloud Storage bucket, we do not need to go through the standard OAuth process.
Google provides what they call a service account: an account that we use to identify instances of our own apps accessing our own resources. In a standard OAuth process, we'd need to identify our app to the service, have the user consent to using our app (and thus grant us permission), get an access token for that specific user, and then make requests to the service. With a service account, we can skip the user-consent step, since we are, in a sense, our own user. This lets us simply use the credentials generated from the Google API console to create a JWT (JSON Web Token), exchange that JWT for an access token, and use the access token to make requests to the Cloud Storage service. See here for Google's guide on this process.
In the past, I've used packages like this one to generate JWTs, but I couldn't find any client-side libraries for encoding JWTs, mostly because they are generated almost exclusively on servers. However, I found this tutorial, which, at a cursory glance, seems sufficient for writing our own encoding algorithm.
I'd like to point out here that opening an app to allow the public free access to your Google resources may prove detrimental to you or your organization in the future, as I'm sure you've considered. This is a major security risk, which is why all the tutorials you've seen so far have implemented two consecutive uploads (client to server, then server to Cloud Storage).
If it were me, I would at least do the first part of the authentication process on my server: when the user is ready to upload, I would send a request to my server to generate the access token for Google services using my service account's credentials, and then I would send each user a new access token that my server generated. This way, I have an added layer of security between the outside world and my Google account, as the burden of the authentication lies with my server, and only the uploading gets done by the client.
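To sketch that server-side step (untested, and assuming Express plus the google-auth-library npm package; key.json is a placeholder path to your service-account credentials):

// Untested sketch of a token endpoint: the server signs the JWT with the
// service account's key and trades it for a short-lived access token.
const express = require('express');
const { JWT } = require('google-auth-library');
const serviceAccount = require('./key.json'); // placeholder path

const app = express();

app.get('/token', async (req, res) => {
  const client = new JWT({
    email: serviceAccount.client_email,
    key: serviceAccount.private_key,
    scopes: ['https://www.googleapis.com/auth/devstorage.read_write'],
  });
  const { access_token } = await client.authorize();
  res.json({ access_token }); // hand only the token to the browser
});

app.listen(3000);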
Anyways, once we have the access token, we can utilize the CORS feature that Google provides to upload files to our bucket. This feature allows us to use standard XHR 2 requests to use Google's services, and is essentially designed to be used in place of the JavaScript client library. I would prefer to use the CORS feature over the client library only because I think it's a little more straightforward, and slightly more flexible in its implementation. (I haven't tested this, but I think fetch would work here just as well as XHR 2.)
From here, we'd need to get the file from the user, along with any information we want about it (read: the file name), and then make a POST request to https://www.googleapis.com/upload/storage/v1/b/<BUCKET_NAME_HERE>/o (replacing <BUCKET_NAME_HERE> with the name of your bucket, of course). Add the access token to the URL as per the "Making authenticated requests" section of the CORS feature page, plus whatever other parameters you wish to include in the body or query string, as per the Cloud Storage API documentation on inserting an object. An API listing for the Cloud Storage service can be found here for reference.
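Putting that together, the client-side request might look like this untested sketch, using uploadType=media for a simple single-request upload and passing the token in a header rather than the URL; bucketName and accessToken are assumptions supplied by the caller.

// Untested sketch: upload the raw file bytes to the bucket in one request.
async function uploadToBucket(file, accessToken, bucketName) {
  const url = 'https://www.googleapis.com/upload/storage/v1/b/' + bucketName +
    '/o?uploadType=media&name=' + encodeURIComponent(file.name);
  return fetch(url, {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer ' + accessToken,
      'Content-Type': file.type || 'application/octet-stream',
    },
    body: file,
  });
}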
As I've never done this before, and I don't have the ability to test it, the sketches above are untested, but I hope that my post is clear enough that putting together working code should be relatively straightforward from here.
Just to set the record straight, I've always found OAuth to be pretty confusing, and have generally shied away from playing with it due to my fear of its unknowns. However, I think I've finally mastered it, especially after this post, so I can't wait to get a free hour to play around with it.
Please let me know if anything I said is not clear or coherent.

Upload files from browser to S3

I have a pre-signed URL that allows you to PUT objects into S3.
http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadObjectPreSignedURLDotNetSDK.html
How do I PUT a file from the browser to S3 using JavaScript? I am using Angular, but I am open to using any JavaScript library.
I believe I could POST to my server, and then PUT the object on the Amazon server, but I would prefer to do it from the browser.
I have changed the CORS settings on S3 to allow PUTs.
I was planning to use angular-file-upload, but it is hard-coded to POST, not PUT.
https://github.com/danialfarid/angular-file-upload
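For the direct PUT the question describes, once the pre-signed URL exists, the request itself can be as small as this untested sketch; the Content-Type generally has to match whatever was signed into the URL.

// Untested sketch: PUT the file straight to the pre-signed URL.
async function putToS3(presignedUrl, file) {
  const res = await fetch(presignedUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type }, // must match the signed headers
    body: file,
  });
  if (!res.ok) throw new Error('S3 PUT failed: ' + res.status);
  return res;
}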
Amazon has a guide (here) that describes how to POST-upload a file into your S3 bucket. It relies on a <form> signed with your AWS secret key. You can specify restrictions on the target directory, as well as file-size restrictions.
It's a bit annoying to use, because you have to duplicate most of the fields in both the <form> and the signed policy, but it seems to work.
After the POST, S3 will redirect the browser to a URL you specify in the form (with parameters specifying the name of the uploaded file, etc.). This isn't ideal for Angular web sites, which tend to be "applications" rather than a set of discrete pages, but you could probably work with it.
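If you drive that POST from JavaScript rather than a literal <form>, the request can be assembled with FormData, roughly like this untested sketch; every field value (key, policy, signature, AWSAccessKeyId, ...) is a placeholder that must come from the policy you sign server-side.

// Untested sketch: signed-POST upload assembled with FormData. The
// "fields" object holds the policy fields generated server-side.
function postToS3(bucketUrl, fields, file) {
  const form = new FormData();
  for (const [name, value] of Object.entries(fields)) {
    form.append(name, value); // key, policy, signature, AWSAccessKeyId, acl...
  }
  form.append('file', file); // S3 requires the file field to come last
  return fetch(bucketUrl, { method: 'POST', body: form });
}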
On my Angular site, I did the POST in JavaScript using $http.post() and passed all the appropriate form data. Unfortunately, I was always getting "cancelled" errors, even though the uploads were actually successful. In my case, I just double-checked by downloading the file with $http.get() and comparing it to the original data... but this was only a viable solution because my files were only a couple of KB.

Uploading files on web from client without revealing API key

I'm trying to upload a file from a web application to an external source (such as Scribd, for example). To upload the file I need to send the API key as well. However, if I send the API key from the client, it will be revealed to users who look for it in the client-side code.
How could I upload from the client using an API key that I don't want to reveal to users? It seems redundant to upload the file to my server and then to the external source.
As redundant as it may be to pass through your server, it's the only way. You can't use the key client-side and hide it from the client, and if you don't use HTTPS it can easily be intercepted too. As a side note, I don't know about Scribd, but stolen API keys are usually not very useful, so you may just live with the "risk".
Edit:
Apparently Scribd offers a way to sign requests so that your API key can't be deduced from them (you have to generate these signatures server-side and send them to the client, of course). See http://www.scribd.com/developers/api?method_name=Signing
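As a generic illustration of that pattern only (not Scribd's actual signing algorithm, which is described at the link above): the server keeps the secret, signs an expiring set of parameters, and hands the result to the client.

// Untested Node.js sketch of generic server-side request signing; the
// parameter layout and HMAC scheme are assumptions, not Scribd's scheme.
const crypto = require('crypto');

function signRequest(params, secretKey) {
  const expires = Math.floor(Date.now() / 1000) + 300; // valid for 5 minutes
  const payload = Object.entries({ ...params, expires })
    .sort(([a], [b]) => a.localeCompare(b)) // canonical parameter order
    .map(([k, v]) => k + '=' + v)
    .join('&');
  const sig = crypto
    .createHmac('sha256', secretKey)
    .update(payload)
    .digest('hex');
  return { ...params, expires, sig }; // safe to hand to the client
}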
