Fetch API is modifying request body inconsistently - javascript

I'm trying to upload an arbitrary list of files:
for (let i = 0; i < files.length; i++) {
  let fileTypeToUse = files[i].type;
  fetchWithRetry(url, 1000, 2, {
    method: 'POST',
    credentials: 'same-origin',
    headers: {
      'Content-Type': fileTypeToUse
    },
    body: files[i],
  });
}
This works for most file types (including images) by taking the bytes and sending them in the body of the request. But when I try to upload audio of type "audio/mpeg" my server receives a file which is about 60% larger than I expected.
I initially assumed this meant the file was being base64 encoded by the browser, so I tried to decode the file. Unfortunately, this hypothesis seemed to be incorrect. I received a decoding error on the server: base64.StdEncoding.Decode: illegal base64 data at input byte 3
For reference, here is a screenshot of the files object I am trying to upload:
And here is the oversized object being sent by the browser to the server:
Related issue: https://stackoverflow.com/a/40826943/12713117 Unfortunately, that answer encourages people to upload the file as multipart/form-data instead of uploading the file directly.
I have 2 questions. First, why is the uploaded object larger than the file accessible in javascript? Second, how do I know if an arbitrary file will be sent as-is (which is the case with images) or if it will be encoded and need decoding on the server?
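For anyone debugging the same symptom, a minimal way to rule out browser-side re-encoding is to read the file into an ArrayBuffer and send those exact bytes, then compare the sizes. This is a diagnostic sketch, not part of the original question; uploadRaw is a hypothetical helper.

// Hypothetical diagnostic: send the raw bytes explicitly, so any size change
// observed on the server must come from the transport, not from how the
// body was read.
async function uploadRaw(url, file) {
  const buffer = await file.arrayBuffer(); // exact bytes of the File/Blob
  console.log('local size:', file.size, 'buffer size:', buffer.byteLength);
  return fetch(url, {
    method: 'POST',
    credentials: 'same-origin',
    headers: { 'Content-Type': file.type || 'application/octet-stream' },
    body: buffer, // ArrayBuffer bodies are sent byte-for-byte
  });
}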
Fetch With Retry Code
function fetchWithRetry(url, delay, tries, fetchOptions = {}) {
  return fetch(url, fetchOptions).catch((err) => {
    let triesLeft = tries - 1;
    if (triesLeft == null || triesLeft < 1) {
      throw err;
    }
    return wait(delay).then(() => fetchWithRetry(url, delay * 2, triesLeft, fetchOptions));
  });
}
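The snippet assumes a wait helper that isn't shown in the question; presumably a promisified setTimeout along these lines:

// Assumed helper: resolve after ms milliseconds.
function wait(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}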

Related

Downloading a zip file from a byte array

I have an API which sends a zip file as a byte array (not the byte arrays of the individual files, but the zipped file as a whole). When I trigger the API in Postman, I get random characters (as shown below).
When I download this response in Postman (via the "send to a file and download" option) as a zip file, I am able to unzip it and extract the actual files. My goal is to achieve the same thing in Angular and TypeScript.
I have tried converting the response to a blob and downloading it, as suggested in multiple places online, including this question. So I did something like:
const blob = new Blob([response], { type: 'application/zip' });
const url = window.URL.createObjectURL(blob);
window.open(url);
But the resultant zip file I download says 'unable to open: empty archive'. I am not sure what I am missing here. I also tried converting the response to an ArrayBuffer first (using this) before applying the steps above, as suggested elsewhere online, but that hasn't been of use either.
Can someone please help me understand what I'm doing wrong? Thanks.
I am calling the API in a js file:
function downloadAzureRT(params) {
  return $http({
    method: 'GET',
    url: API.public('protectionSources/downloadArtFile'),
    params: params || {},
  }).then(function downloadAwsARTResp(resp) {
    return resp.data || {};
  });
}
And then I call this function in a TS file:
downloadART() {
  this.ajsPubSourceService.downloadAzureRT({
    filePath: ART_FILE_PATH,
    fileName: ART_FILE_NAME,
  })
  .then((response) => {
    const blob = new Blob([response], { type: 'application/zip' });
    const url = window.URL.createObjectURL(blob);
    window.open(url);
  });
}
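A likely culprit (an assumption, since the question doesn't show it): $http decodes the response as text by default, which corrupts binary zip data. Asking XHR for an ArrayBuffer via the responseType config keeps the bytes intact, and dropping the || {} fallback avoids wrapping a plain object in the Blob. A minimal sketch against the same service function:

function downloadAzureRT(params) {
  return $http({
    method: 'GET',
    url: API.public('protectionSources/downloadArtFile'),
    params: params || {},
    responseType: 'arraybuffer', // hand back raw bytes, not a decoded string
  }).then(function downloadAwsARTResp(resp) {
    return resp.data; // ArrayBuffer, safe to wrap in a Blob
  });
}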

Reactjs | How to make boundary static when content type is multipart/form-data

We are facing an issue when making a file-upload POST call using ReactJS: in dev-tools, under form data, we see a browser-generated boundary.
------WebKitFormBoundarypSTl3xdAHAJgTN8A
Because of this random boundary, our call to a third-party API fails.
Is there any way to set this boundary to some fixed value? Something like this:
----somefixedvalue.
Here is the js code:
function doupload() {
  let data = document.getElementById("file").files[0];
  console.log('doupload', data);
  let formData = new FormData();
  formData.append("file", data);
  fetch('http://localhost:8081/upload/multipart', {
    method: 'POST',
    body: formData,
    headers: {
      'Content-Type': 'multipart/form-data; boundary=----somefixedboundary'
    }
  }).then(res => {
    for (const header of res.headers) {
      console.log(`resHeaderName: ${header[0]}, Value: ${header[1]}`);
    }
    // run these after the upload finishes, not before
    alert('your file has been uploaded');
    location.reload();
  });
}
Can someone help me solve this? I'm quite confused here, as I have set the content type along with a static boundary, but no luck.
You can convert the file to a binary string and prepare the body yourself instead of using FormData, as shown in this answer.
But keep in mind that if the ----somefixedvalue substring appears in the file, it will be treated as a boundary, causing a body-parsing error on the receiving side. With FormData, the browser takes care of this and prevents it.
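A minimal sketch of that approach (the boundary value and endpoint are taken from the question; everything else is illustrative), assembling the multipart body as a Blob so the file bytes are never pushed through a binary string:

// Hand-build the multipart body around a fixed boundary. Per the multipart
// format, each part starts with "--" + boundary, and the body ends with
// "--" + boundary + "--".
function uploadWithFixedBoundary(file) {
  const boundary = '----somefixedboundary';
  const body = new Blob([
    `--${boundary}\r\n` +
    `Content-Disposition: form-data; name="file"; filename="${file.name}"\r\n` +
    `Content-Type: ${file.type || 'application/octet-stream'}\r\n\r\n`,
    file, // raw file bytes, no base64 and no binary-string conversion
    `\r\n--${boundary}--\r\n`
  ]);
  return fetch('http://localhost:8081/upload/multipart', {
    method: 'POST',
    headers: { 'Content-Type': `multipart/form-data; boundary=${boundary}` },
    body
  });
}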

How to send file details to back end using angularjs?

Hi, I am developing a web application in AngularJS. I am working on a file-upload module and have the array of file details below.
// below code to get array of files
$scope.showPicker = function () {
  var client = filestack.init('AGeDIRvVZTRWgtmFbfGuZz');
  client.pick({
  }).then(function (result) {
    arrMakes.push(result.filesUploaded);
  });
}
The image above shows my array; it holds three files.
Below is my Angular code to send the details to the API:
var files = new FormData();
angular.forEach(arrMakes, function (value, index) {
  console.log(value, index);
  files.append(index, value);
  files.append('data', angular.toJson(index).replace(/['"]+/g, ''));
});
return $http.post(this.uploadUrl, files, {
  transformRequest: angular.identity,
  headers: {
    'Content-Type': undefined,
  }
})
The problem is that I am not receiving the files on the server side. The line below gives me 0 files on the server:
System.Web.HttpFileCollection hfc = System.Web.HttpContext.Current.Request.Files;
Am I sending the correct data to the server? Can someone help me fix this? Any help would be greatly appreciated. Thank you.
You are not uploading any files to the server, only strings.
You can't append objects to a FormData, apart from Blob and File objects. See what will happen:
const fd = new FormData();
fd.append(2, { foo: 'bar' });
fd.append('data', 5);
new Response(fd).text().then(console.log);
// you get [object Object]
Why do you stringify the index in "data"? It will be cast to a string automatically. And what is there to replace?
If I were you, I would simply send the whole arrMakes to the server and download all the files from their URLs on the backend; otherwise the client has to download them and then upload them to the server, wasting bandwidth and time.
Besides, you don't need Angular's forEach loop; arrays have that method built in:
arrMakes.forEach(function (value, index) {
...
})
You won't even have to use any loop if you just pass arrMakes to the server.
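A minimal sketch of that suggestion (assuming the same uploadUrl, and that each entry in arrMakes carries the Filestack URL of an uploaded file):

// Send the picked-file metadata as JSON; the backend then fetches each
// file from its Filestack URL instead of receiving the bytes from the client.
return $http.post(this.uploadUrl, { files: arrMakes }, {
  headers: { 'Content-Type': 'application/json' }
});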

Node js - How to serve multiple SVG files to a browser

I'm new to Node and server-side code and am trying to fetch multiple SVG files which are stored on the server.
Here is my client-side code, using jQuery:
$.ajax({
  url: someURL,
  data: someData
})
.done(function (data) {
  console.log('data got', data);
  callback(null, data);
})
.fail(function () {
  callback(new Error('Cannot access files'));
});
And here is my server-side code:
// links is an array of links to the different svg files
var svgs = [];
async.eachSeries(links, function (link, next) {
  fs.readFile(link, function (err, svg) {
    svgs.push(svg);
    next(err);
  });
}, function (err) {
  if (err) {
    response.writeHead(500);
    response.end(JSON.stringify(err));
    return;
  }
  response.writeHead(200);
  response.end(svgs); // Doesn't work
  // response.end(svgs[0]); // Works
});
As long as I send only one file to the browser (which seems to be a Buffer instance), everything works fine, but when I try to send multiple files as an array, the request succeeds but the returned data is empty. That may be related to the MIME type of what I'm trying to send, but I couldn't find how to handle that.
You'll have to convert svgs into a String or Buffer first. One option is to stringify it as JSON:
response.writeHead(200, {
  'Content-Type': 'application/json'
});
response.end(JSON.stringify(svgs));
This is because response.write(), which response.end() calls to handle the data (svgs), doesn't accept arrays. From the Node documentation:
chunk can be a string or a buffer. If chunk is a string, the second parameter specifies how to encode it into a byte stream. By default the encoding is 'utf8'.
Whereas each svg provided by fs.readFile() is a Buffer, so it has no issues writing svgs[0].
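One caveat with the JSON route (an editor's note, not part of the original answer): each element is a Buffer, so JSON.stringify serializes it as { type: 'Buffer', data: [...] } rather than as markup. Converting to strings first keeps the payload directly usable on the client:

// Convert each Buffer to its UTF-8 markup before serializing, so the
// client receives an array of SVG strings it can inject into the DOM.
response.writeHead(200, { 'Content-Type': 'application/json' });
response.end(JSON.stringify(svgs.map(function (svg) { return svg.toString('utf8'); })));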

Google Drive resumable upload with javascript

I'm trying to upload files to Google Drive using Google APIs Client Library for JavaScript and resumable upload type.
I authenticate and get the upload URI successfully, but I ran into problems while sending the actual data. If the file contains only ASCII characters, the file is sent successfully to Drive, but in the case of special characters (åäö) or a binary file (such as a PNG), the file gets corrupted. My guess would be that somewhere in the process the file is encoded to Unicode on the client side.
If I use btoa() to encode the raw data to base64 and add the header Content-Encoding: base64 to the data-sending request, the file uploads fine. Using this method, however, increases the overhead by 33%, which is quite a lot when the planned file sizes are 100 MB to 1 GB.
Here are some code examples:
Getting the resumable upload URI:
// Authentication is already done
var request = gapi.client.request({
  "path": DRIVE_API_PATH, // "/upload/drive/v2/files"
  "method": "POST",
  "params": {
    "uploadType": "resumable"
  },
  "headers": {
    "X-Upload-Content-Type": self.file.type,
    //"X-Upload-Content-Length": self.file.size
    // If this is uncommented, the upload fails because the file size is
    // different (corrupted file). Manually setting it to the corrupted file
    // size doesn't give 400 Bad Request.
  },
  "body": {
    // self.file is the file object from <input type="file">
    "title": self.file.name,
    "mimeType": self.file.type,
    "Content-Length": self.file.size,
  }
});
Sending the whole file in one go:
// I read the file using FileReader and readAsBinaryString.
// body is reader.result (or btoa(reader.result)),
// and this code runs after the file has been read.
var request = gapi.client.request({
  "path": self.resumableUrl, // URI from the previous request
  "method": "PUT",
  "headers": {
    //"Content-Encoding": "base64", // Uploading with base64 works
    "Content-Type": self.file.type
  },
  "body": body
});
Am I missing something? Is it possible to upload the file as a binary stream? I am new to uploading files in HTML and JavaScript, and I haven't found any examples using the Google JavaScript library with resumable upload. There is a similar question on SO with no answers.
Blob types are a hot topic for XMLHttpRequest implementations and they are not truly mature. I'd recommend you stick with base64 encoding. Google's JavaScript client library doesn't support resumable uploads because it's very unlikely that a client-side browser app would upload very large files directly to Google Drive.
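A minimal sketch of that base64 route (an illustration, assuming the resumable session URI from the question; readAsDataURL hands back base64 directly, avoiding the binary-string read that corrupts the data):

// Read the file as a data URL, strip the "data:<mime>;base64," prefix,
// and PUT the base64 payload with Content-Encoding: base64 as in the question.
function uploadBase64(resumableUrl, file) {
  var reader = new FileReader();
  reader.onload = function () {
    var base64Body = reader.result.split(',')[1]; // keep only the base64 part
    gapi.client.request({
      "path": resumableUrl,
      "method": "PUT",
      "headers": {
        "Content-Encoding": "base64",
        "Content-Type": file.type
      },
      "body": base64Body
    }).then(function (resp) { // calling then() triggers execution
      console.log("upload finished:", resp);
    });
  };
  reader.readAsDataURL(file);
}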
What works
To upload a binary blob, use github/googleapi's cors-upload-sample or my gist fork, UploaderForGoogleDrive, which will grab the access_token out of the gapi client for you.
Here is an ugly mixture of Promise and callback code that works for me. As prerequisites, gapi, UploaderForGoogleDrive, and JSZip need to be loaded via <script> tags. The snippet also omits gapi initialization and the API secrets, which are also necessary.
function bigCSV() { // makes a string for a 300k-row CSV file
  const rows = new Array(300 * 1000).fill('').map((v, j) => {
    return [j, 2 * j, j * j, Math.random(), Math.random()].join(',');
  });
  return rows.join("\n");
}
function bigZip() { // makes a ZIP file blob, about 8MB
  const zip = new window.JSZip();
  zip.folder("A").file("big.csv", bigCSV());
  return zip.generateAsync({ type: "blob", compression: "DEFLATE" });
  // returns Promise<blob>
}
function upload2(zipcontent) {
  'use strict';
  const parent = 'root';
  const spaces = 'drive';
  const metadata = {
    name: 'testUpload2H.zip',
    mimeType: 'application/zip',
    parents: [parent]
  };
  const uploader = new window.UploaderForGoogleDrive({
    file: zipcontent,
    metadata: metadata,
    params: {
      spaces,
      fields: 'id,name,mimeType,md5Checksum,size'
    },
    onProgress: function (x) {
      console.log("upload progress:", Math.floor(100 * x.loaded / x.total));
    },
    onComplete: function (x) {
      if (typeof (x) === 'string') x = JSON.parse(x);
      // do something with the file metadata in x
      console.log("upload complete: ");
    },
    onError: function (e) { console.log("upload error: ", e); }
  });
  uploader.upload();
}
function uploadZipFile() {
  'use strict';
  (bigZip()
    .then(upload2)
  );
}
What doesn't work
As of Nov 2017, uploading a binary blob with the gapi.client.request call is not going to work, because of an issue where gapi removes the PUT payload.
I've also tried using base64 with gapi, which works but deposits base64 files, not true binaries; and the fetch API in CORS mode, which half-worked but produced CORS-related errors and response hiding, at least for me.
