Uploading files to Azure storage container from HTML5 directly - JavaScript

For my scenario, I want to let a user drag and drop files onto a webpage and have JavaScript upload them to a container, similar to how WordPress media uploading works on the administrative side. The problem is that while I found code for creating a SAS URL for the container,
//Set the expiry time and permissions for the container.
//In this case no start time is specified, so the shared access signature becomes valid immediately.
SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24);
sasConstraints.Permissions = SharedAccessBlobPermissions.Write;
//Generate the shared access signature on the container, setting the constraints directly on the signature.
string sasContainerToken = container.GetSharedAccessSignature(sasConstraints);
//Return the URI string for the container, including the SAS token.
return container.Uri + sasContainerToken;
but all of the examples I found seem to indicate that I have to generate a SAS URL for each blob,
Microsoft.WindowsAzure.Storage.Blob.CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
//Get a reference to a container to use for the sample code, and create it if it does not exist.
Microsoft.WindowsAzure.Storage.Blob.CloudBlobContainer container = blobClient.GetContainerReference(containerName);
container.CreateIfNotExists();
//Create a new shared access policy and define its constraints.
Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy sharedPolicy = new Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy()
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10),
    Permissions = Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPermissions.Write
};
//Get the container's existing permissions.
Microsoft.WindowsAzure.Storage.Blob.BlobContainerPermissions permissions = container.GetPermissions();
//Get a reference to the blob and return its URI with the SAS token appended.
Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
return blob.Uri.AbsoluteUri + blob.GetSharedAccessSignature(sharedPolicy);
instead, as if I were uploading only one file.
An administrator can upload any number of files, so having to generate a blob SAS via a Web API call for each of these files seems very inefficient. I would prefer to generate a SAS for the container and allow the user to upload to that container for a specified time, say 3 hours. Also, I would like to use chunking to upload each file. Would this be possible, or would I have to generate a blob SAS URL for each file?

An administrator can upload any number of files, so having to generate a blob SAS via a Web API call for each of these files seems very inefficient. I would prefer to generate a SAS for the container and allow the user to upload to that container for a specified time, say 3 hours.
It is certainly possible. When you create a SAS on a blob container with Write permission (for upload purposes), the same SAS can be used for uploading multiple blobs into that container. You just have to construct the blob URI based on the file being uploaded and append the SAS token. For example, if you created a SAS token for the mycontainer container in the myaccount storage account and are uploading a file named myfile.png, your SAS URL would be https://myaccount.blob.core.windows.net/mycontainer/myfile.png?SAS-Token.
I noticed in your code that you're returning the container URI with the SAS token. In this case, you just have to insert the file name after the container name to get the blob upload URI.
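For illustration, here is a minimal client-side sketch of that idea. The container URL, SAS token string, and drop handling are hypothetical placeholders, and CORS must be enabled on the storage account; the only Azure-specific requirement is the x-ms-blob-type header on the Put Blob request:
// Hypothetical values returned by your Web API: container URL + SAS token.
var containerUrl = "https://myaccount.blob.core.windows.net/mycontainer";
var sasToken = "?sv=...";  // the string returned by GetSharedAccessSignature

function uploadFile(file) {
    // Insert the file name between the container name and the SAS token.
    var blobUrl = containerUrl + "/" + encodeURIComponent(file.name) + sasToken;
    return fetch(blobUrl, {
        method: "PUT",
        headers: {
            // Required by the Put Blob REST operation.
            "x-ms-blob-type": "BlockBlob",
            "Content-Type": file.type || "application/octet-stream"
        },
        body: file
    });
}

// Usage: upload every dropped file with the same container SAS.
document.addEventListener("dragover", function (e) { e.preventDefault(); });
document.addEventListener("drop", function (e) {
    e.preventDefault();
    Array.prototype.forEach.call(e.dataTransfer.files, uploadFile);
});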
Also, I would like to use chunking to upload each file.
It is again possible. I wrote a few blog posts about this some time back which you may find useful:
http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/ (This was written before CORS support in Azure Storage so please ignore my comments about CORS in the post).
http://gauravmantri.com/2013/12/01/windows-azure-storage-and-cors-lets-have-some-fun/
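As a rough sketch of the chunking approach those posts describe, using the Put Block and Put Block List REST operations (the per-file SAS URL and block size here are placeholder assumptions):
// Upload a File in chunks via Put Block, then commit with Put Block List.
// blobUrlWithSas is the per-file URL built as above (hypothetical placeholder).
async function uploadInChunks(file, blobUrlWithSas, chunkSize = 4 * 1024 * 1024) {
    const blockIds = [];
    for (let offset = 0, i = 0; offset < file.size; offset += chunkSize, i++) {
        // Block IDs must be base64-encoded and all the same length.
        const blockId = btoa("block-" + String(i).padStart(6, "0"));
        blockIds.push(blockId);
        await fetch(blobUrlWithSas + "&comp=block&blockid=" + encodeURIComponent(blockId), {
            method: "PUT",
            body: file.slice(offset, offset + chunkSize)
        });
    }
    // Commit the uploaded blocks in order.
    const blockListXml =
        '<?xml version="1.0" encoding="utf-8"?><BlockList>' +
        blockIds.map(id => "<Latest>" + id + "</Latest>").join("") +
        "</BlockList>";
    await fetch(blobUrlWithSas + "&comp=blocklist", {
        method: "PUT",
        headers: { "Content-Type": "application/xml" },
        body: blockListXml
    });
}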

Related

Is it possible to store files in the browser's temporary folder and retrieve them using JavaScript?

I'm using a PDF viewer on the frontend which can only read .pdf URLs (it's not able to read a blob directly). So I'll need to download those files to the user's computer first. Hypothetically, this is what I want to do:
// download the blob
let pdf = await fetch('https://mysite.com/my-file.pdf');
// store it in a temporary folder,
// maybe "%AppData%\Local\Google\Chrome\User Data\Default\Cache"?
browser.storeFile(pdf); // (imaginary API)
// retrieve the file from the temporary folder and pass it to the PDF viewer
myPDFViewer.load('%AppData%\\Local\\Google\\Chrome\\User Data\\Default\\Cache\\my-file.pdf')
The PDF viewer I'm using can load offline files perfectly fine, as long as I have an absolute path to the file (ending with .pdf). I have tried all sorts of things but am still not able to generate such .pdf URLs from the fetched blob.
So, is it possible to download files, store them in a temporary folder internal to the browser (which will get cleaned up automatically), and generate absolute local paths to those files?
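For context, the closest browser-native mechanism is an object URL, which keeps the data inside the browser but yields a blob: URL rather than a path ending in .pdf; a minimal sketch (the URL is the same hypothetical placeholder, myPDFViewer is from the question):
// Fetch the PDF and expose it through a temporary in-browser URL.
// The result looks like "blob:https://mysite.com/<uuid>", so it does not
// satisfy a viewer that insists on a ".pdf" suffix.
const response = await fetch('https://mysite.com/my-file.pdf');
const blob = await response.blob();
const url = URL.createObjectURL(blob);
myPDFViewer.load(url); // works only if the viewer accepts blob: URLs
// Release the object URL when done to free memory:
// URL.revokeObjectURL(url);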

Within a JavaScript web app, start the download of a large file served from a RESTful endpoint that requires setting a header for the user's desired format

Question in brief
From within a JavaScript web application, how can I have a user's browser download a large (on the order of gigabytes), dynamically generated file from a RESTful endpoint that requires setting headers?
Question at length
From within a JavaScript web application I need to provide users with the ability to download a potentially large (~GBs), dynamically generated file in one of several formats (e.g. text/flatfile, application/vnd.ms-excel, and others) from a RESTful API. The desired format is indicated in a header of the request. For example, to download a fasta-format version of the file:
curl -s -H 'accept:text/fasta' 'https://.../api/download?query=human'
The back end will run the query, retrieve the results, and return a file in the fasta format. An issue arises when I want the front end application to provide a download link to the user. Disregarding the need to set headers, I can create a download link like so:
const link = document.createElement('a');
link.href = "https://.../api/download?query=human";
link.setAttribute('download', 'download');
document.body.appendChild(link);
link.click();
or equivalently as HTML:
<a download href="https://.../api/download?query=human">
download human fasta
</a>
However, as far as I know, I don't have a means of setting the header to specify the format. I could make a request within the app itself:
axios.get(
    'https://.../api/download?query=human',
    {
        headers: {
            Accept: 'text/fasta',
        },
    }
)
.then(response => {
    // provide response.data to the client as a download link
});
However, I can see several problems with this:
1. This will fill up the user's browser's memory (and these can be large downloads).
2. There will be no indication of progress (e.g. 50% done).
3. It will be tied to the web app's browser tab, as it is not a separate download process; if the tab is closed, the download is cancelled.
Potential front end solution
Problems 1 and 2 could potentially be solved with the Streams API, but problem 3 would still exist. A rough sketch of that streaming approach is shown below.
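This sketch reads the response body chunk by chunk and reports progress (the endpoint is the same placeholder as above; note the chunks still accumulate in memory, so problem 1 is only partially addressed without a Service Worker handing the stream to the browser's download manager):
// Stream the response body, logging percentage progress as chunks arrive.
async function downloadWithProgress() {
    const response = await fetch('https://.../api/download?query=human', {
        headers: { Accept: 'text/fasta' },
    });
    const total = Number(response.headers.get('Content-Length')); // may be absent for dynamic files
    const reader = response.body.getReader();
    const chunks = [];
    let received = 0;
    for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        chunks.push(value);
        received += value.length;
        if (total) console.log(Math.round((received / total) * 100) + '% done');
    }
    return new Blob(chunks); // still held in browser memory
}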
Potential back end solution
As the back-end team who manage the API are open to modifications, a potential (and less RESTful) solution is to request a particular file format by using a format parameter in the URL. Then, to access the fasta format of the "human" query, I can do:
<a download href="https://.../api/download?query=human&format=fasta">
download human fasta
</a>
Do other solutions or approaches exist?

How to upload a file to S3 without the AWS JavaScript SDK?

The AWS SDK for JavaScript (even if only the S3 component is included) is hugely bulky for just a few sporadic file uploads in my web app. Is there a leaner way to upload files to an S3 bucket directly from the client's browser, given that I have the bucket name, accessKeyId and secretAccessKey at my disposal?
To upload files from the browser to S3 you can use a presigned PUT. This way you will not be disclosing the AWS secret key to the browser. You can use the minio-js library to generate the presigned PUT URL.
On the server side you can generate the presigned PUT URL like this:
var Minio = require('minio')

// find your S3 endpoint here:
// http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
var s3Client = new Minio.Client({
    endPoint: '<your-s3-endpoint>',
    accessKey: 'YOUR-ACCESSKEYID',
    secretKey: 'YOUR-SECRETACCESSKEY'
})

// the presigned URL expires in 1 day (24*60*60 seconds)
s3Client.presignedPutObject('bucket', 'object', 24*60*60, function(err, presignedUrl) {
    if (err) return console.log(err)
    console.log(presignedUrl)
})
You can pass this presigned URL to the browser, which can then do a simple HTTP PUT to Amazon S3. The PUT request will succeed because the signature is part of the presigned URL.
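On the browser side, the upload is then just a PUT of the file body to that URL. A sketch, where presignedUrl is the value generated above and the file input selector is a placeholder:
// PUT the selected file straight to S3 using the presigned URL.
var file = document.querySelector('input[type="file"]').files[0];
fetch(presignedUrl, { method: 'PUT', body: file })
    .then(function (res) {
        if (!res.ok) throw new Error('Upload failed: ' + res.status);
        console.log('Uploaded', file.name);
    });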
Alternatively, you can use a presigned POST to upload. Presigned POST gives much more control over the upload: for example, you can limit the size of the uploaded object, its content type, etc.
S3 supports uploads from the browser using a form POST upload, with no special code needed at the browser. It involves a specific form design and a signed policy document that allows the user to upload only files matching constraints you impose, without exposing your secret key. It can optionally redirect the browser back to your site after the upload.
http://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-post-example.html
Try the Extended S3 Browser (a Chrome extension).

Getting the MD5 of a remote blob with the Azure Node.js SDK

I'm writing a backup script that will pull a full copy of every file in a specific blob container in our Windows Azure blob storage. These files are not uploaded by me; I'm just writing a script that traverses the blob storage and downloads the files.
To speed up this process and skip unnecessary downloads, I'd like to request MD5s for the files before downloading them and compare them with the files already stored locally.
My problem: I can't find documentation anywhere detailing how to do this. I'm pretty sure the API supports it, I'm finding docs and answered questions related to other languages everywhere, but not for the Node.js Azure SDK.
My question: Is it possible (and if so, how) to request an MD5 for a remote blob through the Azure Node.js SDK before downloading it? And is it faster than just downloading the file?
It is certainly possible to get a blob's MD5 hash. When you list blobs, you'll get the MD5 in the blob's properties. See the sample code below:
var azure = require('azure');
var blobService = azure.createBlobService("accountname", "accountkey");
blobService.listBlobs("containername", function(error, blobs){
    if(!error){
        for(var index in blobs){
            console.log(blobs[index].name);
            console.log(blobs[index].properties['content-md5']);
        }
    }
});
Obviously the catch is that the blob must have this property set. If it is not set, an empty string is returned.
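As for the speed question: comparing hashes avoids the download entirely when files match. A sketch of the local side of the comparison (the backup path is a placeholder); note that content-md5 is base64-encoded, so the local hash must be encoded the same way:
var crypto = require('crypto');
var fs = require('fs');

// Compute a local file's MD5, base64-encoded to match the content-md5 property.
function localMd5(filePath) {
    return crypto.createHash('md5').update(fs.readFileSync(filePath)).digest('base64');
}

// Inside the listBlobs callback above, skip blobs whose hash matches the local copy:
// if (blobs[index].properties['content-md5'] === localMd5('backup/' + blobs[index].name)) {
//     console.log('Unchanged, skipping:', blobs[index].name);
// }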

FakePath issue in Chrome browser

I am making a browser-based audio player. To build a playlist from a local directory, I am using:
<input type="file" id="getFile" />
Then I use a button to confirm the playlist. On clicking the button, I call a JavaScript function that changes the src of the audio tag to play the newly selected audio file. I want the exact path of the file from the file input to use in the HTML5 audio player, but it comes through as C://Fakepath/filename.mp3. Can someone help me with this?
This is a security feature, by design. You should not be able to read the original file path of a file input into a browser form. File input is for reading file contents only, not metadata like path on the user's file system.
The good news is that you don't need the original file path. You can use FileReader's readAsDataURL to convert the file contents into a base64-encoded data URL and use that as the audio src. To read from #myUploadInput and output through #myAudioElement:
var reader = new FileReader();
reader.onload = function (event) {
    document.getElementById("myAudioElement").src = event.target.result;
};
reader.readAsDataURL(document.getElementById("myUploadInput").files[0]);
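For large audio files, an object URL may be a lighter-weight alternative to a data URL, since it avoids base64-encoding the whole file into a string (same element IDs as above):
// Point the audio element at the selected file without reading it into a string.
var file = document.getElementById("myUploadInput").files[0];
document.getElementById("myAudioElement").src = URL.createObjectURL(file);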
If the user is building the playlist from files they have locally, you could provide a browse field (or several) where they select the local audio files, take the contents of the field (which should include the paths to those files), build an array of the count/ID, filename, and path, and then, based on what is chosen to play, reassemble the full local path and play that file.
That is the approach I would try to see if it works. The necessary piece here is getting the user to disclose the paths to the audio files, but I'm still not 100% sure it would work, given the security feature the earlier answer described.
If this were part of an application the user had approved for local installation, you could refer to an application directory and copy the file to that safe location. But since it's web based, letting an unapproved web function know your local directory structure opens up a whole can of worms. Good luck, and let me know if you find a solution.
