Using btoa to parse .xls from JavaScript to C# - javascript

I have written a program that sends an Excel file uploaded by the user from JavaScript to C#, where it is then formatted. The code works well for .xlsx files; however, when I try to parse .xls files, I receive the following error:
System.ArgumentNullException: 'Value cannot be null. Parameter name: s'.
After some further testing with breakpoints, I believe I have found the problem, but I cannot find a solution. The following code is the JavaScript used to encode the file:
document.getElementById('selectedFile').addEventListener('change', function (event) {
    var reader = new FileReader();
    reader.onload = function () {
        filename = event.target.files[0].name;
        var uint8Array = new Uint8Array(this.result);
        fileContent = btoa(String.fromCharCode.apply(null, uint8Array));
    };
    reader.readAsArrayBuffer(this.files[0]);
}, false);
When the uploaded file is of type .xlsx, fileContent ends up with the correct value; however, if an .xls file is uploaded, it ends up null, which breaks the code further on.
Is there a way to make this code work for .xls files? Alternatively, is there a way I can convert the file to .xlsx before I pass it to the backend?

.xlsx is a ZIP container of XML parts, while .xls is the legacy binary (BIFF) format, so the two produce very different byte patterns, and I would not expect string-based handling to behave identically for both.
You can use Excel itself to convert between the formats. The conversion is non-trivial and, while not impossible to do from JavaScript, probably not worth spending a couple of months on.
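If the immediate goal is just a base64 string for the C# side, it is worth ruling out one JavaScript-level failure first: String.fromCharCode.apply(null, uint8Array) passes every byte as a separate function argument, and a large file can exceed the engine's argument limit. A minimal sketch of a chunked conversion, assuming (as in the question) that filename and fileContent are globals consumed elsewhere:

document.getElementById('selectedFile').addEventListener('change', function (event) {
    var file = event.target.files[0];
    var reader = new FileReader();
    reader.onload = function () {
        var bytes = new Uint8Array(reader.result);
        var binary = '';
        var chunkSize = 0x8000; // stay well below the argument-count limit
        for (var i = 0; i < bytes.length; i += chunkSize) {
            binary += String.fromCharCode.apply(null, bytes.subarray(i, i + chunkSize));
        }
        filename = file.name;
        fileContent = btoa(binary); // same base64 payload for .xls and .xlsx
    };
    reader.readAsArrayBuffer(file);
}, false);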

Related

"Error: Corrupted zip: can't find end of central directory" when trying to read a slice of a .xlsx file - XLSX

I am trying to read a .xlsx file using FileReader and the XLSX module. If I read the whole .xlsx file, it works fine, but I only want to read the top 5 to 10 lines. The original file can be quite large, sometimes over 100 MB, so instead of passing it whole I pass only a slice of the file to the FileReader, hoping it will read faster; I do not want to process a whole 100 MB file just to read the top 10 lines. Here is the code:
onFile(event: any) {
    let data = event.target.files[0];
    let fileReader = new FileReader();
    fileReader.onload = (e) => {
        let arrayBuffer = e.target.result;
        console.log(arrayBuffer);
        let workbook = XLSX.read(arrayBuffer, { type: "array" });
        let first_sheet_name = workbook.SheetNames[0];
        let worksheet = workbook.Sheets[first_sheet_name];
        console.log(XLSX.utils.sheet_to_json(worksheet, { raw: true }));
    };
    fileReader.readAsArrayBuffer(data.slice(0, 500));
}
So I think the problem is in the slicing: I take the first 500 bytes with the slice method and pass them to the FileReader. My question is how to slice a .xlsx file properly, because with the current approach the file comes out corrupted.
Note: other files like .csv or .tsv work fine even when sliced this way. I was also able to read .xlsx files the same way a few months ago, but I don't know why I am facing this issue now.
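The underlying issue is the container format: a .xlsx file is a ZIP archive, and ZIP readers locate its entries via the end-of-central-directory record at the tail of the file, so an arbitrary 500-byte prefix can never parse (.csv and .tsv survive slicing because they are plain text). One alternative sketch, assuming a SheetJS build whose read options include sheetRows for capping how many rows are materialized:

onFile(event: any) {
    let data = event.target.files[0];
    let fileReader = new FileReader();
    fileReader.onload = (e) => {
        // Parse the full buffer, but only keep the first 10 rows of each sheet.
        let workbook = XLSX.read(e.target.result, { type: "array", sheetRows: 10 });
        let worksheet = workbook.Sheets[workbook.SheetNames[0]];
        console.log(XLSX.utils.sheet_to_json(worksheet, { raw: true }));
    };
    // Read the whole file: a byte-slice of a ZIP cannot be parsed.
    fileReader.readAsArrayBuffer(data);
}

This does not avoid reading the whole file into memory, but it keeps the parsed output small.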

Uploading files using js fileReader results in corrupted files

I am trying to upload a file to my server using the JS FileReader and AJAX.
I used the File API and FileReader to read the file, convert it to a string, and then send it to the server via an AJAX request.
Here is my client-side JS code:
function upload() {
    var file = document.getElementById('periodExcel').files[0];
    if (file) {
        console.log(file);
        var reader = new FileReader();
        reader.readAsText(file, file.type);
        console.log(reader);
        reader.onload = uploadAndRun;
    }
}

function uploadAndRun(event) {
    console.log(event);
    var result = event.target.result;
    $('#js').html(result);
    var fileName = document.getElementById('periodExcel').files[0].name; // Should be 'picture.jpg'
    $.post('./upload.php', { data: result, name: fileName }, function (result2) {
        $('#php').html(result2);
    });
}
Here is the upload PHP script:
file_put_contents('upload/' . $_POST['name'], $_POST['data']);
It just writes the file using PHP's file_put_contents function.
My problem is that the uploaded file is corrupted and has a different size than the original file (it is larger).
I tried using PHP's file_get_contents function to read the same file and write it again with file_put_contents, and the resulting file was fine, identical to the original one.
I then compared the two strings (the one that comes from the FileReader and the one that comes from file_get_contents) using strcmp, which shows that the string coming from the FileReader is larger than the one from file_get_contents.
So, what is the problem with my code, and how can I use the FileReader to upload a file this way while using the readAsText function?
You are using the wrong collection in PHP. To access the uploaded file stream, use $_FILES.
See here:
http://php.net/manual/en/reserved.variables.files.php
and here: http://php.net/manual/en/features.file-upload.post-method.php
In short, the PHP runtime takes care of reading the upload stream from the HTTP request, stores it locally in a temp folder, and exposes the above map for you to access the temp file and possibly move it to another location (or do whatever else you need to do with it).
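For $_FILES to be populated, the request has to be a real multipart file upload rather than a plain string POST. A minimal sketch of that client side using FormData (the field name 'periodExcel' and the './upload.php' URL come from the question; the rest is illustrative, not the poster's confirmed fix):

function upload() {
    var file = document.getElementById('periodExcel').files[0];
    if (file) {
        var formData = new FormData();
        formData.append('periodExcel', file, file.name); // raw bytes, no string conversion
        $.ajax({
            url: './upload.php',
            type: 'POST',
            data: formData,
            processData: false, // let the browser build the multipart body
            contentType: false, // so the multipart boundary header is set automatically
            success: function (result2) {
                $('#php').html(result2);
            }
        });
    }
}

On the server the upload then shows up as $_FILES['periodExcel'], and move_uploaded_file can place it wherever it is needed.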

Converting base64 data from json response into ms excel file using javascript

I've seen SO questions similar to my use case with Angular and other server-side platforms, but not for pure JavaScript.
I have an app where I do a $.ajax GET call to an API, which returns a previously converted Excel file (Excel to base64); I need to convert this base64 data back into its original form, i.e. into an Excel file. I tried retracing the steps I took to convert the Excel file to base64, reversing some of them, but I'm not able to regenerate the original file. An Excel file IS being generated, but it still contains base64 data and therefore opens with errors, in a corrupted state.
Has anyone else done this successfully?
Below is my code and fiddle link (I didn't include the base64 JSON data (responseData) here since it's large, but it's in the fiddle):
var bindata = window.atob(responseData);

function DownloadExcel() {
    window.location.href = "data:application/vnd.ms-excel;base64, bindata";
}

var blob = new Blob([responseData], { type: 'application/vnd.ms-excel' });
if (window.navigator && window.navigator.msSaveBlob) {
    window.navigator.msSaveBlob(blob);
} else {
    var objectUrl = URL.createObjectURL(blob);
    window.open(objectUrl);
}
jsfiddle link: https://jsfiddle.net/damon_matt/2ofz6xrd/
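Two details in the snippet above explain the corruption: the data: URL embeds the literal text "bindata" instead of the variable's value, and the Blob is constructed from the still-encoded base64 string rather than decoded bytes. A minimal sketch of the usual decode-then-Blob approach, assuming responseData holds the raw base64 payload from the API:

var binary = window.atob(responseData); // base64 -> binary string
var bytes = new Uint8Array(binary.length);
for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i); // binary string -> raw bytes
}
var blob = new Blob([bytes], { type: 'application/vnd.ms-excel' });
if (window.navigator && window.navigator.msSaveBlob) {
    window.navigator.msSaveBlob(blob, 'report.xls'); // 'report.xls' is an illustrative name
} else {
    var objectUrl = URL.createObjectURL(blob);
    window.open(objectUrl);
}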

Compress dcm files with JSZip

I'm currently working on a DICOM file upload system that uploads .dcm files with the jQuery file uploader. It is working fine, but since DICOM data sets can get very large, I want to compress the files with JSZip before the upload.
I simply pass the file object to a zip function that returns the zipped file object. This works fine with commonly known file types, but not with DICOM files. I've already tried encoding the files to a base64 string before zipping, but that doesn't work either.
JSZip always throws me the following error:
Uncaught Error: The data of 'IM-0001-0001.dcm' is in an unsupported format !
I am using the following file compress function:
compressFile: function (data) {
    var zip = new JSZip();
    var file = data.files[0];
    zip.file(file.name, file, { binary: false });
    content = zip.generate({
        compression: 'DEFLATE'
    });
    return content;
}
I have also tried base64 and binary in the .file options, but that didn't do the trick.
Does anyone have a clue how to get this working? I'm a beginner at JS, so I'm sorry for the noobish questions ^^
Kind regards
You need to use a FileReader to read the content of data.files[0] first:
var reader = new FileReader();
reader.onload = function (e) {
    var zip = new JSZip();
    zip.file(data.files[0].name, e.target.result); // add the raw ArrayBuffer content
    var result = zip.generate({
        compression: 'DEFLATE'
    });
    // do something with result
};
reader.readAsArrayBuffer(data.files[0]);
See also this example.
Warning: FileReader is asynchronous, so you can't make your function return the result.
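Since the zipped content only exists once onload fires, one option is to give compressFile a callback instead of a return value. A minimal sketch along those lines (the callback parameter is an illustrative addition, not part of the original uploader):

compressFile: function (data, callback) {
    var reader = new FileReader();
    reader.onload = function (e) {
        var zip = new JSZip();
        zip.file(data.files[0].name, e.target.result); // raw DICOM bytes
        callback(zip.generate({ compression: 'DEFLATE' }));
    };
    reader.readAsArrayBuffer(data.files[0]);
}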

JPEG data obtained from FileReader doesn't match data in file

I'm trying to select a local JPEG file in the web browser via the HTML5 FileReader so I can submit it to a server without reloading the page. All the mechanics are working and I think I'm transferring and saving the exact data that JavaScript gave me, but the result is an invalid JPEG file on the server. Here's the basic code that demonstrates the problem:
<form name="add_photos">
    <input type="file" name="photo" id="photo" /><br />
    <input type="button" value="Upload" onclick="upload_photo();" />
</form>
<script type="text/javascript">
    function upload_photo() {
        file = document.add_photos.photo.files[0];
        if (file) {
            fileReader = new FileReader();
            fileReader.onload = upload_photo_ready;
            fileReader.readAsBinaryString(file);
        }
    }

    function upload_photo_ready(event) {
        data = event.target.result;
        // alert(data);
        URL = "submit.php";
        ajax = new XMLHttpRequest();
        ajax.open("POST", URL, 1);
        ajax.setRequestHeader("Ajax-Request", "1");
        ajax.send(data);
    }
</script>
Then my PHP script does this:
$data = file_get_contents("php://input");
$filename = "test.jpg";
file_put_contents($filename, $data);
$result = imagecreatefromjpeg($filename);
That last line throws a PHP error "test.jpg is not a valid JPEG file." If I download the data back to my Mac and try to open it in Preview, Preview says the file "may be damaged or use a file format that Preview doesn’t recognize."
If I open both the original file on my desktop and the uploaded file on the server in text editors to inspect their contents, they are almost but not quite the same. The original file starts like this:
ˇÿˇ‡JFIFˇ˛;CREATOR: gd-jpeg v1.0 (using IJG JPEG v62), quality = 90
But the uploaded file starts like this:
ÿØÿàJFIFÿþ;CREATOR: gd-jpeg v1.0 (using IJG JPEG v62), quality = 90
Interestingly, if I view the data in a JavaScript alert with the commented-out line above, it looks just like the uploaded file's data, so it seems as if the FileReader isn't giving the correct data at the very beginning, as opposed to a problem that is introduced while transferring or saving the data on the server. Can anyone explain this?
I'm using Safari 6 and I also tried Firefox 14.
UPDATE: I just figured out that if I skip the FileReader code and change ajax.send(data) to ajax.send(file), the image is transferred and saved correctly on the server. So my problem is basically solved, but I'll award the answer points to anyone who can explain why my original approach with readAsBinaryString didn't work.
Your problem lies with readAsBinaryString. This transfers the binary data byte-for-byte into a string, so you end up sending a text string to your PHP file. Now, a text string always has an encoding, and when you use XMLHttpRequest to upload a string, by default it will be encoded as UTF-8.
So each character, which was originally supposed to represent one byte, is encoded as UTF-8... which uses multiple bytes for each character with a code point above 127!
Your best bet is to use readAsArrayBuffer instead of readAsBinaryString. This avoids all the character-set conversions that are necessary when dealing with strings.
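That also explains the poster's working workaround, since XMLHttpRequest sends both File and ArrayBuffer payloads byte-for-byte. A minimal sketch of the handler rewritten around readAsArrayBuffer (same form and endpoint as the question):

function upload_photo() {
    var file = document.add_photos.photo.files[0];
    if (file) {
        var fileReader = new FileReader();
        fileReader.onload = function (event) {
            var ajax = new XMLHttpRequest();
            ajax.open("POST", "submit.php", true);
            ajax.setRequestHeader("Ajax-Request", "1");
            ajax.send(event.target.result); // ArrayBuffer: bytes are sent unchanged
        };
        fileReader.readAsArrayBuffer(file);
    }
}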
