Uploading a blob URL to Azure storage? - javascript

I have the URL to a blob that I'm trying to upload to Azure storage. There doesn't seem to be an obvious way of doing this, as none of the APIs handle uploading a blob URL directly.
I'm trying to do something like this:
blobService.createBlockBlobFromLocalFile('taskcontainer', 'myfile.png', blobUrl, (error, result, response) => {
});
This doesn't work. I've also tried to find a way to read the blob URL into a readable stream and upload that, but I haven't gotten very far either.
I basically have a file selected by the user using react-dropzone, which provides me with a blob URL (which can look like this: blob:http://localhost:3000/cd8ba70e-5877-4112-8131-91c594be8f1e) pointing to the local file. My goal is to upload that blob URL to an Azure container.
Firebase storage has a 'put' function which allows you to upload the blob from a url: https://firebase.google.com/docs/storage/web/upload-files
This is the closest I have gotten:
var blobUrl = acceptedFiles[0].preview;
var xhr = new XMLHttpRequest();
xhr.open("GET", blobUrl);
xhr.responseType = "text"; // force the HTTP response, response-type header to be blob
xhr.onload = function () {
  const Stream = require('stream')
  const readable = new Stream.Readable()
  readable.push(xhr.responseText);
  readable.push(null);
  blobService.createBlockBlobFromStream('taskcontainer', 'myblob.png', readable, xhr.responseText.length, (error, result, response) => {
    var ok = 0;
  })
}
xhr.send();
The file (or parts of it?) seems to get uploaded, but the end result is that the content type is lost and I can't view the uploaded PNG.

You could try the following:
var azure = require('azure-storage');
var blobService = azure.createBlobService('', '');
blobService.createBlockBlobFromLocalFile('nodecontainer', 'AzureDC', 'azure_center.png', function (error, result, response) {
  if (!error) {
    console.log(response);
  } else {
    console.log(error);
  }
});
EDIT
Check this code snippet to upload a blob to Azure storage.
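For the blob: URL case from the question, a minimal sketch (assuming the browser bundle of azure-storage and its createBlockBlobFromBrowserFile helper, which you should verify against your SDK version) is to fetch the URL back into a Blob, or simply use the File object that react-dropzone already gives you, and upload that:
// Hedged sketch: turn the blob: URL back into a Blob, then upload it.
// 'taskcontainer' / 'myfile.png' are the names from the question;
// createBlockBlobFromBrowserFile is assumed from the azure-storage browser bundle.
var blobUrl = acceptedFiles[0].preview;
fetch(blobUrl)
  .then(function (res) { return res.blob(); })
  .then(function (blob) {
    blobService.createBlockBlobFromBrowserFile(
      'taskcontainer',
      'myfile.png',
      blob, // a File (e.g. acceptedFiles[0]) works here as well
      { contentSettings: { contentType: blob.type } },
      function (error, result, response) {
        if (!error) {
          console.log('Upload finished', result);
        } else {
          console.log(error);
        }
      }
    );
  });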

Related

Uploading a file with FormData and XMLHttpRequest with Formidable on a Node.js backend shows empty files, fields

I am uploading a CSV file using FormData and XMLHttpRequest. Here is the code for that.
I have a form wrapped around an HTML file input, and I am executing this code in its onchange event. I have tried sending the form directly as well, and also reading the form element into the FormData object.
let formData = new FormData();
let file = e.target.files[0];
var blob = new Blob([file], {type: 'text/csv'});
formData.append("payoutUpload", blob, 'processed.csv');
let uri = encodeURI(`${window.serviceUri}${path}`);
var req = new XMLHttpRequest();
req.onload = (result) => {
  if (req.status === 500 && result && result.code === 'ECONNRESET') {
    console.log('Connection was reset, hence retry the sendRequest function');
  } else if (req.status === 200) {
  } else {
    console.log("Error while retrieving data");
  }
}
req.onerror = (e) => {
  console.log('There was an error while retrieving data from service', e);
};
req.open('POST', uri, true);
req.setRequestHeader('Content-Type', 'multipart/form-data');
req.setRequestHeader('Authorization', 'Bearer ' + token);
req.send(formData);
When I send the request, I can see that the file is being sent as the request payload.
On the Node.js backend, I am running Express and Formidable. I am not using body-parser; I am using Express's built-in json and urlencoded middleware.
Here is the formidable part:
const form = formidable({multiples: true});
form.parse(req, (err, fields, files) => {
  console.log(`error is ${JSON.stringify(err)}`);
  console.log(`fields is ${JSON.stringify(fields)}`);
  console.log(`files JSON: ${JSON.stringify(files)}`);
  console.log('file in request: ' + files.payoutUpload);
  console.log(`req.body: ${req.body}`);
  options.file = files.payoutUpload;
});
I get err, fields, and files as empty. I have searched through all similar questions and set the request headers correctly (which is usually the issue). I can see that req.body still has the file payload on the server end, but Formidable does not parse it. Can anyone tell me what I am doing wrong?
UPDATE: I have tried other packages for parsing the file, like multer and express-fileupload, and all of them return files as empty. I have also tried the fetch API to send my request, but with no luck.
req.setRequestHeader('Content-Type', 'multipart/form-data')
When you send multipart/form-data you must include a boundary parameter in the Content-Type header; however, you can't know what value you need to set for it.
Don't set the Content-Type header at all. Allow XMLHttpRequest to generate it automatically from the FormData object.
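For illustration, a minimal corrected version of the client-side upload from the question might look like the following; the only substantive change is that the Content-Type header is omitted so the browser can generate it, boundary included, from the FormData object (the endpoint, field name, and token are taken as given from the question):
let formData = new FormData();
let file = e.target.files[0];
formData.append("payoutUpload", new Blob([file], { type: "text/csv" }), "processed.csv");
let req = new XMLHttpRequest();
req.open("POST", encodeURI(`${window.serviceUri}${path}`), true);
// Note: no Content-Type header here; XMLHttpRequest adds
// "multipart/form-data; boundary=..." automatically for FormData bodies.
req.setRequestHeader("Authorization", "Bearer " + token);
req.onload = () => console.log(req.status, req.responseText);
req.onerror = (e) => console.log("There was an error while sending the file", e);
req.send(formData);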

Implement file download from REST service in webapp

I have a simple webapp in which I intend to serve a file download from a REST API. The file is of type .xlsx.
My naive attempt to accomplish this uses a design pattern that I have copied from other data pulls from the REST API, for example:
var requestData = JSON.stringify({level: plevel, yearMonthStart: beginningYearmo, yearMonthEnd: endingYearmo});
var url = 'http://localhost:8181/v2/file/download';
d3.request(url)
  .header("X-Requested-With", "XMLHttpRequest")
  .header("Content-Type", "application/octet-stream")
  .mimeType("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet")
  .send("POST", requestData, function(rawData){
    console.log(rawData);
  });
The response from the server is a 415 error code (Unsupported Media Type).
I have tried to set the appropriate headers for the filestream as can be seen above.
My expected behaviour is that the request is accepted without error and the browser initiates a file download. Any guidance here on how to better accomplish this would be appreciated.
I found some examples in other posts that light the way.
You can do this using blob files. Here's an example:
function downloadReport(level, beginningYearmo, endingYearmo){
  var requestData = JSON.stringify({plevel: level, yearMonthStart: beginningYearmo, yearMonthEnd: endingYearmo});
  var url = 'http://localhost:8181/file/download';
  d3.request(url)
    .header("X-Requested-With", "XMLHttpRequest")
    .header("Content-Type", "application/json")
    .header("Accept", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet")
    .header("Cache-Control", "no-cache")
    .header("Accept-Charset", "utf-8")
    .on("load", function(data){
      var blob = new Blob([data], {type: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"});
      var objectUrl = URL.createObjectURL(blob);
      window.open(objectUrl);
      console.log("Download request was successful.");
    })
    .on("error", function(error){ alert("Error: " + error); })
    .send("POST", requestData);
}
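If d3.request proves awkward for binary payloads, a hedged alternative is to make the same POST with the fetch API, which can return the body as a Blob directly (the URL and request body are the ones from the question; the saved file name is an assumption):
function downloadReportWithFetch(level, beginningYearmo, endingYearmo) {
  fetch("http://localhost:8181/file/download", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ plevel: level, yearMonthStart: beginningYearmo, yearMonthEnd: endingYearmo })
  })
    .then(function (res) {
      if (!res.ok) throw new Error("Download failed: " + res.status);
      return res.blob();
    })
    .then(function (blob) {
      // Trigger a download via a temporary object URL instead of window.open
      var objectUrl = URL.createObjectURL(blob);
      var a = document.createElement("a");
      a.href = objectUrl;
      a.download = "report.xlsx"; // assumed file name
      document.body.appendChild(a);
      a.click();
      a.remove();
      URL.revokeObjectURL(objectUrl);
    })
    .catch(function (error) { alert("Error: " + error); });
}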

What is the output of a piped file stream?

Perhaps the question is not worded in the best way, but here's some more context. Using GridFSBucket, I'm able to store a file in Mongo and obtain a download stream for that file. Here's my question: let's say I wanted to send that file back as the response to my HTTP request.
I do:
downloadStream.pipe(res);
On the client side, when I print the responseText, I get a long string with some funky characters that look to be encrypted. What is the format/type of this string/stream? How do I set up my response so that I can get the streamed data as an ArrayBuffer on the client side?
Thanks
UPDATE:
I haven't solved the problem yet; however, the suggestion by @Jekrb gives exactly the same output as doing console.log(this.responseText). It looks like the string is not a buffer. Here is the output from these two lines:
console.log(this.responseText.toString('utf8'))
var byteArray = new Uint8Array(arrayBuffer);
UPDATE 2 - THE CODE SNIPPETS
Frontend:
var savePDF = function(blob){
  //fs.writeFile("test.pdf", blob);
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function() {
    if (this.readyState === XMLHttpRequest.DONE && this.status === 200){
      //TO DO: Handle the file in the response, which will be displayed.
      console.log(this.responseText.toString('utf8'));
      var arrayBuffer = this.responseText;
      if (arrayBuffer) {
        var byteArray = new Uint8Array(arrayBuffer);
      }
      console.log(arrayBuffer);
    }
  };
  xhr.open("POST", "/pdf", true);
  xhr.responseType = 'arrayBuffer';
  xhr.send(blob);
};
BACKEND:
app.post('/pdf', function(req, res){
  MongoClient.connect("mongodb://localhost:27017/test", function(err, db) {
    if (err) return console.dir(err);
    console.log("Connected to Database");
    var bucket = new GridFSBucket(db, { bucketName: 'pdfs' });
    var CHUNKS_COLL = 'pdfs.chunks';
    var FILES_COLL = 'pdfs.files';
    // insert file
    var uploadStream = bucket.openUploadStream('test.pdf');
    var id = uploadStream.id;
    uploadStream.once('finish', function() {
      console.log("upload finished!")
      var downloadStream = bucket.openDownloadStream(id);
      downloadStream.pipe(res);
    });
    // This pipes the POST data to the file
    req.pipe(uploadStream);
  });
});
My guess is that either the response is being output as plain binary that is not Base64 encoded (still a buffer), or it is a compressed (gzip) response that needs to be uncompressed first.
Hard to pinpoint the issue without seeing the code though.
UPDATE:
Looks like you're missing the proper response headers.
Try setting these headers before the downloadStream.pipe(res):
res.setHeader('Content-disposition', 'attachment; filename=test.pdf');
res.set('Content-Type', 'application/pdf');
Your stream is likely already a buffer. You might be able to call responseText.toString('utf8') to convert the streamed data into a readable string.
I solved it!!!
Basically, preset the response type to "arraybuffer" before you make the request, using
req.responseType = "arraybuffer"
Now, once you receive the response, don't use responseText; use response instead. response contains the ArrayBuffer with the data for the file.
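Putting that fix together, a minimal corrected version of the front-end request might look like this; the only changes from the question's snippet are the lower-case "arraybuffer" value and reading response instead of responseText:
var savePDF = function (blob) {
  var xhr = new XMLHttpRequest();
  xhr.open("POST", "/pdf", true);
  xhr.responseType = "arraybuffer"; // must be spelled exactly like this
  xhr.onload = function () {
    if (xhr.status === 200) {
      // xhr.response is an ArrayBuffer; responseText is not usable in this mode
      var byteArray = new Uint8Array(xhr.response);
      console.log("Received", byteArray.length, "bytes");
    }
  };
  xhr.send(blob);
};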

How to send an image to the server with an HTTP POST in JavaScript and store Base64 in MongoDB

I have trouble getting into HTTP requests on the client side and storing images on the server side using MongoDB. I would really appreciate some help. I need a simple example of how to add an image file as data to an HTTP POST request such as XMLHttpRequest. Let's say I know the URL of the server method. The source of the image is defined in
imgsrc
and the name of the file is stored in
name
I have this at the moment:
var httpPost = new XMLHttpRequest();
httpPost.onreadystatechange = function(err) {
  if (httpPost.readyState == 4 && httpPost.status == 200){
    console.log(httpPost.responseText);
  } else {
    console.log(err);
  }
}
var path = "http://127.0.0.1:8000/uploadImage/" + name;
httpPost.open("POST", path, true);
// I guess I have to add the image data to the httpPost here, but I don't know how
httpPost.send(null);
Then on the server side, at that path, the following method will be called, and I want to store the URL of the Base64-encoded image in MongoDB. How do I access the image from the POST request?
function postNewImageType(req, res, next){
  var newImageTypeData = {
    name: req.params.name,
    image: "placeholder.png"
  }
  var data = // how to access the image?
  var imageBuffer = decodeBase64Image(data);
  fs.writeFile(cfg.imageFolger + newImageTypeData._id + '.jpeg', imageBuffer.data, function(err){
    if (err) return new Error(err);
    newImageTypeData.set({image: newImageTypeData._id + '.jpeg'});
    var image = new ImageType(newImageData);
  });
  imagetype.save(function (err) {
    if (error) {return next(new restify.InvalidArgumentError(JSON.stringify(error.errors)));}
    else { res.send(201, imagetype);}
  });
}
There are a number of ways that you can send your image data in the request to the server, but all of them will involve calling the send method of your XMLHttpRequest object with the data you wish to send as its argument.
The send method both dispatches the request to the remote server, and sets its argument as the body of that request. Since you're expecting Base64 encoded image data on your server, you'll first need to convert your image file to Base64 data on the client.
The simplest way to convert an image to Base64 on the client is by loading the image as an image element, drawing it to a canvas element, and then getting the Base64 representation of the canvas's image data.
That might look something like the following (given that the URL for the original image is stored in a variable named imgsrc, and the desired name is stored in name as stated):
// This function accepts three arguments: the URL of the image to be
// converted, the mime type of the Base64 image to be output, and a
// callback function that will be called with the data URL as its argument
// once processing is complete
var convertToBase64 = function(url, imagetype, callback){
  var img = document.createElement('IMG'),
      canvas = document.createElement('CANVAS'),
      ctx = canvas.getContext('2d'),
      data = '';
  // Set the crossOrigin property of the image element to 'Anonymous',
  // allowing us to load images from other domains so long as that domain
  // has cross-origin headers properly set
  img.crossOrigin = 'Anonymous';
  // Because image loading is asynchronous, we define an event listener that
  // will be called when the image has been loaded
  img.onload = function(){
    // When the image is loaded, this function is called with the image
    // element as its 'this' value
    canvas.height = this.height;
    canvas.width = this.width;
    ctx.drawImage(this, 0, 0);
    data = canvas.toDataURL(imagetype);
    callback(data);
  };
  // We set the source of the image tag to start loading its data. We define
  // the event listener first, so that if the image has already been loaded
  // on the page or is cached the event listener will still fire
  img.src = url;
};
// Here we define the function that will send the request to the server.
// It accepts the image name and the Base64 data as arguments
var sendBase64ToServer = function(name, base64){
  var httpPost = new XMLHttpRequest(),
      path = "http://127.0.0.1:8000/uploadImage/" + name,
      data = JSON.stringify({image: base64});
  httpPost.onreadystatechange = function(err) {
    if (httpPost.readyState == 4 && httpPost.status == 200){
      console.log(httpPost.responseText);
    } else {
      console.log(err);
    }
  };
  httpPost.open("POST", path, true);
  // Set the content type of the request to JSON since that's what's being sent;
  // request headers must be set after open()
  httpPost.setRequestHeader('Content-Type', 'application/json');
  httpPost.send(data);
};
// This wrapper function will accept the name of the image, the url, and the
// image type and perform the request
var uploadImage = function(src, name, type){
  convertToBase64(src, type, function(data){
    sendBase64ToServer(name, data);
  });
};
// Call the function with the provided values. The mime type could also be png
// or webp
uploadImage(imgsrc, name, 'image/jpeg');
When the request is received by your server, the request body will contain the JSON string with your Base64 image within it. Since you haven't provided the server framework or database driver you're using for Mongo, I've adapted your code assuming that you're using Express and Mongoose with an ImageType model already defined in your application.
Since you can always construct the file name of the image record from its _id property and your image folder path, it doesn't necessarily make sense to save that as a property on the record, but I've preserved that functionality here, which will require you to save your record twice in one request cycle.
I've also changed the way any errors from the filesystem call are handled. The 'err' you get back from a filesystem error is already an Error object, and will need to be handled by your server in some way.
function postNewImageType(req, res, next){
  var json = JSON.parse(req.body),
      newImageTypeData = {
        name: json.name,
        image: "placeholder.png"
      },
      imageBuffer = decodeBase64Image(json.image),
      newImageType = new ImageType(newImageTypeData);
  // First we save the image to Mongo to get an id
  newImageType.save(function(err){
    if (err) return next(new restify.InvalidArgumentError(JSON.stringify(err.errors)));
    var fileName = cfg.imageFolder + newImageType._id + '.jpeg';
    fs.writeFile(fileName, imageBuffer.data, function(err){
      // Handle the error in the next middleware function somehow
      if (err) return next(err);
      newImageType.set({image: newImageType._id + '.jpeg'});
      newImageType.save(function(err){
        if (err) return next(new restify.InvalidArgumentError(JSON.stringify(err.errors)));
        res.send(201, newImageType);
      });
    });
  });
}
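decodeBase64Image is referenced but never defined in either snippet. A minimal sketch of such a helper, assuming the client sends a data URL of the form data:image/jpeg;base64,..., could be:
function decodeBase64Image(dataUrl) {
  // Split "data:image/jpeg;base64,AAAA..." into its mime type and payload
  var matches = dataUrl.match(/^data:(.+);base64,(.+)$/);
  if (!matches || matches.length !== 3) {
    throw new Error('Invalid data URL');
  }
  return {
    type: matches[1],
    data: Buffer.from(matches[2], 'base64') // Node.js Buffer, usable by fs.writeFile above
  };
}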

Chrome extension: how to pass ArrayBuffer or Blob from content script to the background without losing its type?

I have this content script that downloads some binary data using XHR, which is sent later to the background script:
var self = this;
var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.responseType = 'arraybuffer';
xhr.onload = function(e) {
  if (this.status == 200) {
    self.data = {
      data: xhr.response,
      contentType: xhr.getResponseHeader('Content-Type')
    };
  }
};
xhr.send();
... later ...
sendResponse({data: self.data});
After receiving this data in the background script, I'd like to form another XHR request that uploads this binary data to my server, so I do:
var formData = new FormData();
var bb = new WebKitBlobBuilder();
bb.append(data.data);
formData.append("data", bb.getBlob(data.contentType));
var req = new XMLHttpRequest();
req.open("POST", serverUrl);
req.send(formData);
The problem is that the file uploaded to the server contains just this string: "[object Object]". I guess this happens because the ArrayBuffer type is lost somehow while transferring it from the content script to the background page? How can I solve that?
Messages passed between a Content Script and a background page are JSON-serialized.
If you want to transfer an ArrayBuffer object through a JSON-serialized channel, wrap the buffer in a view, before and after transferring.
Below is an isolated example, so that the solution is generally applicable and not specific to your case. The example shows how to pass around ArrayBuffers and typed arrays, but the method can also be applied to File and Blob objects by using the FileReader API.
// In your case: self.data = { data: new Uint8Array(xhr.response), ...
// Generic example:
var example = new ArrayBuffer(10);
var data = {
  // Create a view
  data: Array.apply(null, new Uint8Array(example)),
  contentType: 'x-an-example'
};
// Transport over a JSON-serialized channel. In your case: sendResponse
var transportData = JSON.stringify(data);
// "{"data":[0,0,0,0,0,0,0,0,0,0],"contentType":"x-an-example"}"
// At the receiver's end. In your case: chrome.extension.onRequest
var receivedData = JSON.parse(transportData);
// receivedData.data is a plain Array, NOT an ArrayBuffer or Uint8Array
receivedData.data = new Uint8Array(receivedData.data).buffer;
// Now, receivedData.data is the expected ArrayBuffer object
This solution has been tested successfully in Chrome 18 and Firefox.
new Uint8Array(xhr.response) is used to create a view of the ArrayBuffer, so that the individual bytes can be read.
Array.apply(null, <Uint8Array>) is used to create a plain array, using the keys from the Uint8Array view. This step reduces the size of the serialized message. WARNING: This method only works for small amounts of data. When the size of the typed array exceeds 125836, a RangeError will be thrown. If you need to handle large pieces of data, use other methods to do the conversion between typed arrays and plain arrays.
At the receiver's end, the original buffer can be obtained by creating a new Uint8Array and reading its buffer attribute.
Implementation in your Google Chrome extension:
// Part of the Content script
self.data = {
  data: Array.apply(null, new Uint8Array(xhr.response)),
  contentType: xhr.getResponseHeader('Content-Type')
};
...
sendResponse({data: self.data});
// Part of the background page
chrome.runtime.onMessage.addListener(function(data, sender, callback) {
  ...
  data.data = new Uint8Array(data.data).buffer;
Documentation
MDN: Typed Arrays
MDN: ArrayBuffer
MDN: Uint8Array
MDN: <Function> .apply
Google Chrome Extension docs: Messaging > Simple one-time requests
"This lets you send a one-time JSON-serializable message from a content script to extension, or vice versa, respectively"
SO bonus: Upload a File in a Google Chrome Extension - Using a Web worker to request, validate, process and submit binary data.
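As noted above, the same approach covers File and Blob objects via the FileReader API; a small illustrative sketch (the helper name is not from the original answer):
// Read a Blob or File into an ArrayBuffer, then serialize it the same way
function blobToPlainArray(blob, callback) {
  var reader = new FileReader();
  reader.onload = function () {
    // reader.result is an ArrayBuffer; wrap it in a view as before
    callback(Array.apply(null, new Uint8Array(reader.result)));
  };
  reader.readAsArrayBuffer(blob);
}
// Usage in a content script:
// blobToPlainArray(someBlob, function (plain) {
//   sendResponse({ data: plain, contentType: someBlob.type });
// });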
For Chromium extensions using Manifest V3, the URL.createObjectURL() approach no longer works because it is prohibited in service workers.
The easiest way to pass data from a service worker to a content script (and vice versa) is to convert the blob into a Base64 representation.
const fetchBlob = async url => {
  const response = await fetch(url);
  const blob = await response.blob();
  const base64 = await convertBlobToBase64(blob);
  return base64;
};
const convertBlobToBase64 = blob => new Promise(resolve => {
  const reader = new FileReader();
  reader.readAsDataURL(blob);
  reader.onloadend = () => {
    const base64data = reader.result;
    resolve(base64data);
  };
});
Then send the base64 to the content script.
Service worker:
chrome.tabs.sendMessage(sender.tab.id, { type: "LOADED_FILE", base64: base64 });
Content script:
chrome.runtime.onMessage.addListener(async (request, sender) => {
if (request.type == "LOADED_FILE" && sender.id == '<your_extension_id>') {
// do anything you want with the data from the service worker.
// e.g. convert it back to a blob
const response = await fetch(request.base64);
const blob = await response.blob();
}
});
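Once the content script has rebuilt the Blob it runs in a normal page context, so URL.createObjectURL is available again there; a short hedged follow-up to the listener above:
// e.g. inside the LOADED_FILE branch, after "const blob = await response.blob();"
const objectUrl = URL.createObjectURL(blob);
// ...use objectUrl (as an <img> src, a download link, etc.), then release it:
URL.revokeObjectURL(objectUrl);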
