Writing UTF-8 to a file and downloading it with FileSaver.js - javascript

I've been working on a Distributed File System and have hit a wall today trying to write the contents of a returned file so the browser can download it.
My system sends a request for a file's data and gets back a string containing the entire binary contents, UTF-8 encoded. I'm then trying to use FileSaver.js to write and download this content.
It only works for .txt files at the moment. I suspect the problem is the encoding and how the Blob object is made: the response is simply being written as UTF-8 text into each file, which is why .txt files survive and nothing else does. I've tried converting the response to byte arrays and using a File object instead, but nothing works.
Here's the code doing the request and handling the response:
$.ajax({
  type: "POST",
  url: "http://localhost:3040/read",
  data: {
    data: data
  },
  xhrFields: {
    withCredentials: true
  },
  success: function(response) {
    var FileSaver = require('file-saver');
    var blob = new Blob([response], {type: "application/octet-stream"});
    FileSaver.saveAs(blob, filename);
  }
});
A small snippet of the response output (in this case for a PNG image):
PNG
\u1a
\u0\u0\u1a
IHDR\u0\u0\u2î\u0\u0\u56Eµ¤Q\u0\u0\u0\u1sRGB\u0®Î\u1cé...
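For reference, a minimal untested sketch of requesting the response as binary rather than text, keeping the same endpoint, credentials, and form-encoded payload as above; $.ajax decodes the response body as text by default, which is what mangles non-text files:
// Sketch only: fetch keeps the payload binary instead of decoding it as text.
var FileSaver = require('file-saver');

fetch("http://localhost:3040/read", {
  method: "POST",
  credentials: "include",                    // same effect as withCredentials: true
  body: new URLSearchParams({ data: data })  // same form-encoded payload as the $.ajax data option
})
  .then(function (res) { return res.blob(); })  // keep the bytes as a Blob, no text decoding
  .then(function (blob) {
    FileSaver.saveAs(blob, filename);
  });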

Related

Downloading a zip file from a byte array

I have an API which sends a zip file as a byte array (not the byte arrays of the individual files, but the zipped file as a whole). When I trigger the API in Postman, I get random characters back.
When I download this response to a zip file (using Postman's "send to a file and download" option), I am able to unzip it and extract the actual files. My goal is to achieve the same thing in Angular and TypeScript.
I have tried converting the response to a blob and downloading it, as suggested in multiple places online, including this question. So I did something like this:
const blob = new Blob([response], { type: 'application/zip' });
const url = window.URL.createObjectURL(blob);
window.open(url);
But the resulting zip file I download says 'unable to open: empty archive'. I am not sure what I am missing here. I also tried converting the response to an ArrayBuffer (using this) first before applying the steps above, as suggested elsewhere, but that hasn't been of use either.
Can someone please help me understand what I'm doing wrong? Thanks.
I am calling the API in a js file:
function downloadAzureRT(params) {
  return $http({
    method: 'GET',
    url: API.public('protectionSources/downloadArtFile'),
    params: params || {},
  }).then(function downloadAwsARTResp(resp) {
    return resp.data || {};
  });
}
And then I call this function in a TypeScript file:
downloadART() {
  this.ajsPubSourceService.downloadAzureRT({
    filePath: ART_FILE_PATH,
    fileName: ART_FILE_NAME,
  })
  .then((response) => {
    const blob = new Blob([response], { type: 'application/zip' });
    const url = window.URL.createObjectURL(blob);
    window.open(url);
  });
}
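Not from the original post, but a minimal sketch of the change that is likely relevant here, assuming the same endpoint: without an explicit responseType, $http hands back resp.data as a decoded string, and wrapping a string in a Blob produces the "empty archive" zip.
function downloadAzureRT(params) {
  return $http({
    method: 'GET',
    url: API.public('protectionSources/downloadArtFile'),
    params: params || {},
    responseType: 'blob',  // ask the browser for binary data instead of a decoded string
  }).then(function downloadAwsARTResp(resp) {
    return resp.data;      // resp.data is now a Blob
  });
}
The caller can then build the object URL from the returned Blob exactly as in the TypeScript snippet above.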

Is it possible to read object from AJAX request on server side?

Actually, I'm trying to send a video file in base64, but when the file is large the AJAX request never completes and I get a 400 error (small files work fine).
So I thought of sending a file object like the one below so that I can read the object on the server side. But I don't know if that is possible. Or is there another way I can handle a large video file upload?
[object FileReader]
And here is my AJAX code:
var reader = new FileReader();
// this function is triggered once a call to readAsDataURL returns
reader.onload = async function(event) {
  var fileData = new FormData();
  var fileType = ".avi";
  // console.log(my_script_vars.postID);
  // fileData.append("file", event.target);
  fileData.append("file", event.target.result);
  fileData.append("action", "myaction");
  fileData.append("filetype", fileType);
  fileData.append("post_id", my_script_vars.postID);
  jQuery.ajax({
    url: 'https://www.reelme.app/sign-up/wp-admin/admin-ajax.php',
    processData: false,
    contentType: false,
    cache: false,
    data: fileData,
    type: 'POST',
    .......
    .......
    .......
  });
}
Please help. Thanks in advance.
You shouldn't read the file into base64 and store everything in memory; skip that FileReader entirely.
You are doing the right thing by using FormData, but FormData can also append Blobs and Files directly:
// Simulate a file you would normally get from a file input or drag n drop
const file = new File(['abc'], 'sample.txt', { type: 'text/plain' })
const fd = new FormData()
fd.append('file', file)
Then, to upload it, I would suggest using fetch instead of jQuery, which requires all that processData and other config:
const url = 'https://www.reelme.app/sign-up/wp-admin/admin-ajax.php'
fetch(url, { method: 'POST', body: fd })
.then(console.log, console.error)
JSON isn't meant to handle large binary data; it's not a good streaming format.
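To make that concrete, here is a hedged sketch wiring the FormData/fetch approach to the fields from the question (the input element id is made up for illustration; action, filetype, and post_id are taken from the original code):
// Sketch only: the <input type="file"> id below is hypothetical.
const input = document.querySelector('#video-input');
const file = input.files[0];                 // the raw File; no FileReader, no base64

const fd = new FormData();
fd.append('file', file);                     // the browser streams the bytes for you
fd.append('action', 'myaction');
fd.append('filetype', '.avi');
fd.append('post_id', my_script_vars.postID);

fetch('https://www.reelme.app/sign-up/wp-admin/admin-ajax.php', {
  method: 'POST',
  body: fd                                   // no processData/contentType juggling needed
})
  .then(res => res.text())
  .then(console.log, console.error);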

Upload a javascript generated PDF file to server

Please note I am uploading a file generated at runtime, NOT a user-submitted file. This is where my problem lies: taking that file and sending it along has proven to be a real challenge.
I'm using PDFMake to generate a PDF file. It took me A LONG time to get this code working; the tl;dr is that I convert the PDF document into a buffer, make a new File object from that buffer, attach that File to a FormData object, and send it to my server.
Here is the code:
var pdfGen = pdfMake.createPdf(docDef);
pdfGen.getBuffer(function(buffer) {
  var pdf = new File(buffer, "testpdf.pdf", {type: "application/pdf"});
  var data = new FormData();
  data.append('upload', pdf);
  data.append('content-type', 'application/pdf');
  $.ajax({
    method: "POST",
    url: "/index.cfm?action=admin:service.upload_pdf",
    data: data,
    processData: false,
    contentType: false,
    success: function(msg) {
      console.log(msg);
    },
    error: function(msg) {
      console.log(msg);
    }
  });
});
The problem is that the file I get on my server isn't a valid PDF; its content type is listed as application/octet-stream. The server-side function I'm passing the File to SHOULD be rejecting everything except PDFs, so I'm not sure where the error is exactly.
I've been at this for two days now and I'm at a loss, so any help is GREATLY appreciated.
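One thing that might be worth checking (my observation, not confirmed in the thread): the File constructor expects an array of parts as its first argument, so wrapping the buffer could look like this:
pdfGen.getBuffer(function(buffer) {
  // Wrap the buffer in an array so it is treated as a single BlobPart.
  var pdf = new File([buffer], "testpdf.pdf", { type: "application/pdf" });
  // ...append to FormData and POST exactly as in the snippet above
});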
Have you tried converting your generated PDF into Base64 and then sending the result to your server?
Here's what I mean:
var pdfGen = pdfMake.createPdf(docDef);
pdfGen.getBase64((data) => {
  PostToServer(data);
});
Source: https://github.com/bpampuch/pdfmake
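PostToServer isn't defined in the answer; here is a minimal sketch of one way it could look, reusing the upload URL from the question and assuming the server side is adjusted to accept a base64 string instead of a file part:
// Hypothetical helper: posts the base64 string produced by getBase64().
function PostToServer(base64Pdf) {
  var data = new FormData();
  data.append('upload', base64Pdf);              // base64 text, not a File; the server must decode it
  data.append('content-type', 'application/pdf');

  return $.ajax({
    method: "POST",
    url: "/index.cfm?action=admin:service.upload_pdf",
    data: data,
    processData: false,
    contentType: false
  });
}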

Send binary of file from ajax to ashx file as application/octet-stream

I need to send a file's binary content with AJAX to an .ashx handler (C# .NET environment).
When I send it, the request payload must have Content-Type: "application/octet-stream"; this is the requirement from the server side. My problem is that when I run my script, it sends the file with its normal type instead of "application/octet-stream". If the file is a JPG, it is sent as Content-Type: "image/jpeg", which is no good for me. I want to send the file in binary format.
Right now what I have is this:
var formData = new FormData();
formData.append('file', file);
$.ajax({
  url: gatewayUrl,
  type: "POST",
  data: formData,
  processData: false,
  contentType: false,
  success: function(data) {},
  error: function() {}
});
I read about FileReader and about File.getAsBinary(), but none of them is working for me. I am struggling with ArrayBuffer and I'm not sure if I'm heading in the right direction.
This is the direction I'm thinking about, but I can't figure out whether it's the correct way to go:
function get_file_binary(file) {
  var fileReader = new FileReader();
  fileReader.readAsArrayBuffer(file);
  fileReader.onload = function(e) {
    var res = fileReader.result;
    var res2 = new Int8Array(res);
    return res2;
  };
}
But when I try this I get "undefined" as the "Request Payload" in Chrome.
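One direction that may be simpler (a sketch, not a tested answer): a File is already a Blob, so it can be sent as the raw request body with the Content-Type forced to application/octet-stream, with no FileReader or ArrayBuffer step at all. Note that the return inside onload above never reaches the caller, which is why the payload shows up as undefined.
// Sketch: send the raw bytes as the request body instead of a multipart form.
$.ajax({
  url: gatewayUrl,
  type: "POST",
  data: file,                               // the File object itself; XHR sends its bytes as-is
  processData: false,                       // don't let jQuery serialize it
  contentType: "application/octet-stream",  // what the .ashx side expects
  success: function(data) {},
  error: function() {}
});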

Google Drive resumable upload with javascript

I'm trying to upload files to Google Drive using the Google APIs Client Library for JavaScript and the resumable upload type.
I authenticate and get the upload URI successfully, but I ran into problems while sending the actual data. If the file contains only ASCII characters, the file is sent to Drive successfully, but in the case of special characters (åäö) or a binary file (such as a PNG), the file gets corrupted. My guess would be that somewhere in the process the file is encoded to Unicode on the client side.
If I use btoa() to encode the raw data to base64 and add the header "Content-Encoding: base64" to the data-sending request, the file uploads fine. This method, however, increases the overhead by 33%, which is quite a lot when the planned file sizes are 100 MB to 1 GB.
Here are some code examples:
Getting the resumable upload URI:
// Authentication is already done
var request = gapi.client.request({
  "path": DRIVE_API_PATH, // "/upload/drive/v2/files"
  "method": "POST",
  "params": {
    "uploadType": "resumable"
  },
  "headers": {
    "X-Upload-Content-Type": self.file.type,
    //"X-Upload-Content-Length": self.file.size
    // If this is uncommented, the upload fails because the file size is
    // different (corrupted file). Manually setting it to the corrupted file
    // size doesn't give 400 Bad Request.
  },
  "body": {
    // self.file is the file object from <input type="file">
    "title": self.file.name,
    "mimeType": self.file.type,
    "Content-Lenght": self.file.size,
  }
});
Sending the whole file in one go:
// I read the file using FileReader and readAsBinaryString
// body is the reader.result (or btoa(reader.result))
// and this code is run after the file has been read
var request = gapi.client.request({
  "path": self.resumableUrl, // URI got from the previous request
  "method": "PUT",
  "headers": {
    //"Content-Encoding": "base64", // Uploading with base64 works
    "Content-Type": self.file.type
  },
  "body": body
});
Am I missing something? Is it possible to upload the file as a binary stream? I am new to uploading files in HTML and JavaScript, and I haven't found any examples using the Google JavaScript library with resumable upload. There is a similar question on SO with no answers.
Blob types are a hot topic for XMLHttpRequest implementations and they are not truly mature. I'd recommend you stick with base64 encoding. Google's JavaScript client library doesn't support resumable uploads because it's very unlikely that a client-side browser app uploads very large files directly to Google Drive.
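For what it's worth, here is a minimal sketch of the base64 route this answer recommends, reusing self.file and self.resumableUrl from the question; the readAsDataURL step and the prefix split are my assumptions, not something taken from the answer:
// Sketch: get base64 without btoa() by letting FileReader produce a data URL
// and stripping the "data:<mime>;base64," prefix.
var reader = new FileReader();
reader.onload = function() {
  var base64Body = reader.result.split(',')[1];  // the raw base64 payload
  gapi.client.request({
    "path": self.resumableUrl,
    "method": "PUT",
    "headers": {
      "Content-Encoding": "base64",              // as in the question's working variant
      "Content-Type": self.file.type
    },
    "body": base64Body
  }).execute(function(resp) {
    console.log(resp);
  });
};
reader.readAsDataURL(self.file);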
What works
To upload a binary blob, use github/googleapi's cors-upload-sample or use my gist fork, UploaderForGoogleDrive, which will grab access_token out of the gapi client for you.
Here is an ugly mixture of Promise and callback code that works for me. As a prerequisite, gapi, UploaderForGoogleDrive, and JSZip need to be loaded via <script> tags. The snippet also omits gapi initialization and the API secrets, which are also necessary.
function bigCSV() { // makes a string for a 300k row CSV file
  const rows = new Array(300*1000).fill('').map((v, j) => {
    return [j, 2*j, j*j, Math.random(), Math.random()].join(',');
  });
  return rows.join("\n");
}

function bigZip() { // makes a ZIP file blob, about 8MB
  const zip = new window.JSZip();
  zip.folder("A").file("big.csv", bigCSV());
  return zip.generateAsync({type: "blob", compression: "DEFLATE"});
  // returns Promise<blob>
}

function upload2(zipcontent) {
  'use strict';
  const parent = 'root';
  const spaces = 'drive';
  const metadata = {
    name: 'testUpload2H.zip',
    mimeType: 'application/zip',
    parents: [parent]
  };
  const uploader = new window.UploaderForGoogleDrive({
    file: zipcontent,
    metadata: metadata,
    params: {
      spaces,
      fields: 'id,name,mimeType,md5Checksum,size'
    },
    onProgress: function(x) {
      console.log("upload progress:", Math.floor(100*x.loaded/x.total));
    },
    onComplete: function(x) {
      if (typeof(x) === 'string') x = JSON.parse(x);
      // do something with the file metadata in x
      console.log("upload complete: ");
    },
    onError: function(e) { console.log("upload error: ", e); }
  });
  uploader.upload();
}

function uploadZipFile() {
  'use strict';
  (bigZip()
    .then(upload2)
  );
}
What doesn't work
As of Nov 2017, uploading a binary blob with the gapi.client.request call is not going to work, because of an issue where gapi removes the PUT payload.
I've also tried using base64 with gapi, which works but deposits base64 files, not true binaries, and the fetch API in CORS mode, which half-worked but produced CORS-related errors and response hiding, at least for me.
