var profileImage = fileInputInByteArray;
$.ajax({
  url: 'abc.com/',
  type: 'POST',
  dataType: 'json',
  data: {
    // Other data
    ProfileImage: profileImage
    // Other data
  },
  success: function (response) {
  }
})
// Code in WebAPI
[HttpPost]
public HttpResponseMessage UpdateProfile([FromUri]UpdateProfileModel response) {
    //...
    return Request.CreateResponse(HttpStatusCode.OK, response);
}
public class UpdateProfileModel {
    // ...
    public byte[] ProfileImage { get; set; }
    // ...
}
<input type="file" id="inputFile" />
I am using an ajax call to post the byte[] value of a type="file" input to a Web API endpoint that receives it as byte[]. However, I am having difficulty getting the byte array on the client side. I am expecting that we can get the byte array through the File API.
Note: I need to store the byte array in a variable first before passing it through the ajax call.
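For reference, most of the answers below follow the same basic pattern: read the file with a FileReader, wrap the resulting ArrayBuffer in a Uint8Array, and convert that into something the server can bind. A minimal sketch of the whole flow, reusing the url and model from the question (whether Web API will bind a JSON number array to byte[] ProfileImage depends on your server configuration, so treat that part as an assumption):
document.querySelector('#inputFile').addEventListener('change', function () {
  var reader = new FileReader();
  reader.onload = function () {
    // Convert the ArrayBuffer into a plain array of numbers (0-255)
    var byteArray = Array.prototype.slice.call(new Uint8Array(reader.result));
    $.ajax({
      url: 'abc.com/',
      type: 'POST',
      dataType: 'json',
      data: { ProfileImage: byteArray },
      success: function (response) {
        console.log(response);
      }
    });
  };
  reader.readAsArrayBuffer(this.files[0]);
}, false);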
[Edit]
As noted in the comments above, even though it is still available in some UA implementations, the readAsBinaryString method never made it into the specs and should not be used in production.
Instead, use readAsArrayBuffer and loop through its buffer to get back the binary string:
document.querySelector('input').addEventListener('change', function() {
  var reader = new FileReader();
  reader.onload = function() {
    var arrayBuffer = this.result,
        array = new Uint8Array(arrayBuffer),
        binaryString = String.fromCharCode.apply(null, array);
    console.log(binaryString);
  }
  reader.readAsArrayBuffer(this.files[0]);
}, false);
<input type="file" />
<div id="result"></div>
For a more robust way to convert your arrayBuffer into a binary string, you can refer to this answer.
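If you don't want to follow the link, the usual trick (a sketch of the idea, not necessarily the exact code in that answer) is to build the string in chunks, so String.fromCharCode.apply never receives more arguments than the engine allows:
function arrayBufferToBinaryString(arrayBuffer) {
  var bytes = new Uint8Array(arrayBuffer),
      chunkSize = 0x8000, // 32768 bytes per call keeps apply() under the argument limit
      result = '';
  for (var i = 0; i < bytes.length; i += chunkSize) {
    result += String.fromCharCode.apply(null, bytes.subarray(i, i + chunkSize));
  }
  return result;
}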
[old answer] (modified)
Yes, the File API does provide a way to convert the File in your <input type="file"/> to a binary string, thanks to the FileReader object and its readAsBinaryString method.
[But don't use it in production!]
document.querySelector('input').addEventListener('change', function(){
  var reader = new FileReader();
  reader.onload = function(){
    var binaryString = this.result;
    document.querySelector('#result').innerHTML = binaryString;
  }
  reader.readAsBinaryString(this.files[0]);
}, false);
<input type="file"/>
<div id="result"></div>
If you want an array buffer, then you can use the readAsArrayBuffer() method:
document.querySelector('input').addEventListener('change', function(){
  var reader = new FileReader();
  reader.onload = function(){
    var arrayBuffer = this.result;
    console.log(arrayBuffer);
    document.querySelector('#result').innerHTML = arrayBuffer + ' ' + arrayBuffer.byteLength;
  }
  reader.readAsArrayBuffer(this.files[0]);
}, false);
<input type="file"/>
<div id="result"></div>
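Tying this back to the original question: to post the result as a C# byte[], one option (an assumption about what the server-side model binder will accept, not something shown in this answer) is to turn the ArrayBuffer into a plain array of numbers first:
var byteArray = Array.from(new Uint8Array(arrayBuffer)); // Array.from needs an ES2015-capable browser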
$(document).ready(function(){
  (function (document) {
    var input = document.getElementById("files"),
        output = document.getElementById("result"),
        fileData; // We need fileData to be visible to getBuffer.

    // Event handler for the file input.
    function openfile(evt) {
      var files = input.files;
      // Pass the file to the blob, not the input[0].
      fileData = new Blob([files[0]]);
      // Pass getBuffer to the promise.
      var promise = new Promise(getBuffer);
      // Wait for the promise to be resolved, or log the error.
      promise.then(function(data) {
        // Here you can pass the bytes to another function.
        output.innerHTML = data.toString();
        console.log(data);
      }).catch(function(err) {
        console.log('Error: ', err);
      });
    }

    /*
      Create a function which will be passed to the promise
      and resolve it when FileReader has finished loading the file.
    */
    function getBuffer(resolve) {
      var reader = new FileReader();
      reader.readAsArrayBuffer(fileData);
      reader.onload = function() {
        var arrayBuffer = reader.result;
        var bytes = new Uint8Array(arrayBuffer);
        resolve(bytes);
      }
    }

    // Event listener for the file input.
    input.addEventListener('change', openfile, false);
  }(document));
});
<!DOCTYPE html>
<html>
<head>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
</head>
<body>
<input type="file" id="files"/>
<div id="result"></div>
</body>
</html>
Modern browsers now have the arrayBuffer method on Blobs:
document.querySelector('input').addEventListener('change', async (event) => {
  const buffer = await event.target.files[0].arrayBuffer()
  console.log(buffer)
}, false)
🎉 🎉
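If you need the individual byte values rather than the raw ArrayBuffer, you can wrap the result in a typed array (the same snippet, extended by one line):
document.querySelector('input').addEventListener('change', async (event) => {
  const buffer = await event.target.files[0].arrayBuffer()
  const bytes = new Uint8Array(buffer) // byte values 0-255; bytes[0] is the first byte of the file
  console.log(bytes)
}, false)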
This is a long post, but I was tired of all these examples that weren't working for me because they used Promise objects or an errant this that has a different meaning when you are using React. My implementation uses a DropZone with React, and I got the bytes using a framework similar to what is posted at https://www.mokuji.me/article/drop-upload-tutorial-1, when nothing else above would work. There were two keys, for me:
You have to get the bytes from the event object, inside a FileReader's onload function.
I tried various combinations, but in the end, what worked was:
const bytes = e.target.result.split('base64,')[1];
Where e is the event. React requires const; you could use var in plain JavaScript. But that gave me the base64-encoded byte string.
So I'm just going to include the applicable lines for integrating this as if you were using React, because that's how I was building it, but I'll try to also generalize it and add comments where necessary to make it applicable to a vanilla JavaScript implementation - with the caveat that I did not test it in such a construct.
These would be your bindings at the top, in your constructor, in a React framework (not relevant to a vanilla Javascript implementation):
this.uploadFile = this.uploadFile.bind(this);
this.processFile = this.processFile.bind(this);
this.errorHandler = this.errorHandler.bind(this);
this.progressHandler = this.progressHandler.bind(this);
And you'd have onDrop={this.uploadFile} in your DropZone element. If you were doing this without React, this is the equivalent of adding the onclick event handler you want to run when you click the "Upload File" button.
<button onclick="uploadFile(event);">Upload File</button>
Then the function (applicable lines... I'll leave out my resetting my upload progress indicator, etc.):
uploadFile(event){
// This is for React, only
this.setState({
files: event,
});
console.log('File count: ' + this.state.files.length);
// You might check that the "event" has a file & assign it like this
// in vanilla Javascript:
// var files = event.target.files;
// if (!files || files.length === 0)
//   files = (event.dataTransfer ? event.dataTransfer.files :
//            event.originalEvent.dataTransfer.files);
// You cannot use "files" as a variable in React, however:
const in_files = this.state.files;
// iterate, if files length > 0
if (in_files.length > 0) {
for (let i = 0; i < in_files.length; i++) {
// use this, instead, for vanilla JS:
// for (var i = 0; i < files.length; i++) {
const a = i + 1;
console.log('in loop, pass: ' + a);
const f = in_files[i]; // or just files[i] in vanilla JS
const reader = new FileReader();
reader.onerror = this.errorHandler;
reader.onprogress = this.progressHandler;
reader.onload = this.processFile(f);
reader.readAsDataURL(f);
}
}
}
There was this question on that syntax, for vanilla JS, on how to get that file object:
JavaScript/HTML5/jQuery Drag-And-Drop Upload - "Uncaught TypeError: Cannot read property 'files' of undefined"
Note that React's DropZone will already put the File object into this.state.files for you, as long as you add files: [], to your this.state = { .... } in your constructor. I added syntax from an answer on that post on how to get your File object. It should work, or there are other posts there that can help. But all that Q/A told me was how to get the File object, not the blob data itself. And even if I did fileData = new Blob([files[0]]); like in sebu's answer (which didn't include var with it, for some reason), it didn't tell me how to read that blob's contents, or how to do it without a Promise object. So that's where the FileReader came in, though I tried their readAsArrayBuffer and couldn't use it to any avail.
You will have to have the other functions that go along with this construct - one to handle onerror, one for onprogress (both shown farther below), and then the main one, onload, that actually does the work once a method on reader is invoked in that last line. Basically you are passing your event.dataTransfer.files[0] straight into that onload function, from what I can tell.
So the onload method calls my processFile() function (applicable lines, only):
processFile(theFile) {
return function(e) {
const bytes = e.target.result.split('base64,')[1];
}
}
And bytes should have the base64 bytes.
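If you later need the actual byte values rather than the base64 string, one way (a sketch, not part of the original flow) is to decode it with atob:
function base64ToBytes(base64) {
  const binaryString = atob(base64);       // decode base64 into a binary string
  const bytes = new Uint8Array(binaryString.length);
  for (let i = 0; i < binaryString.length; i++) {
    bytes[i] = binaryString.charCodeAt(i); // each char code is one byte value
  }
  return bytes;
}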
Additional functions:
errorHandler(e){
switch (e.target.error.code) {
case e.target.error.NOT_FOUND_ERR:
alert('File not found.');
break;
case e.target.error.NOT_READABLE_ERR:
alert('File is not readable.');
break;
case e.target.error.ABORT_ERR:
break; // no operation
default:
alert('An error occurred reading this file.');
break;
}
}
progressHandler(e) {
if (e.lengthComputable){
const loaded = Math.round((e.loaded / e.total) * 100);
let zeros = '';
// Percent loaded in string
if (loaded >= 0 && loaded < 10) {
zeros = '00';
}
else if (loaded < 100) {
zeros = '0';
}
// Display progress in 3-digits and increase bar length
document.getElementById("progress").textContent = zeros + loaded.toString();
document.getElementById("progressBar").style.width = loaded + '%';
}
}
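As a side note, the zero-padding above could also be written in one line with padStart, assuming your target browsers support it:
document.getElementById("progress").textContent = loaded.toString().padStart(3, '0');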
And applicable progress indicator markup:
<table id="tblProgress">
<tbody>
<tr>
<td><b><span id="progress">000</span>%</b> <span className="progressBar"><span id="progressBar" /></span></td>
</tr>
</tbody>
</table>
And CSS:
.progressBar {
background-color: rgba(255, 255, 255, .1);
width: 100%;
height: 26px;
}
#progressBar {
background-color: rgba(87, 184, 208, .5);
content: '';
width: 0;
height: 26px;
}
EPILOGUE:
Inside processFile(), for some reason, I couldn't add bytes to a variable I carved out in this.state. So, instead, I set it directly to the variable, attachments, that was in my JSON object, RequestForm - the same object as my this.state was using. attachments is an array so I could push multiple files. It went like this:
const fileArray = [];
// Collect any existing attachments
if (RequestForm.state.attachments.length > 0) {
for (let i=0; i < RequestForm.state.attachments.length; i++) {
fileArray.push(RequestForm.state.attachments[i]);
}
}
// Add the new one to this.state
fileArray.push(bytes);
// Update the state
RequestForm.setState({
attachments: fileArray,
});
Then, because this.state already contained RequestForm:
this.stores = [
RequestForm,
]
I could reference it as this.state.attachments from there on out. That's a React feature that isn't applicable in vanilla JS. You could build a similar construct in plain JavaScript with a global variable, and push accordingly, which is much easier:
var fileArray = new Array(); // place at the top, before any functions
// Within your processFile():
var newFileArray = [];
if (fileArray.length > 0) {
for (var i=0; i < fileArray.length; i++) {
newFileArray.push(fileArray[i]);
}
}
// Add the new one
newFileArray.push(bytes);
// Now update the global variable
fileArray = newFileArray;
Then you always just reference fileArray, enumerate it for any file byte strings, e.g. var myBytes = fileArray[0]; for the first file.
This is a simple way to convert files to Base64 and avoid the "Maximum call stack size exceeded at FileReader.reader.onload" error when the file is large.
document.querySelector('#fileInput').addEventListener('change', function () {
var reader = new FileReader();
var selectedFile = this.files[0];
reader.onload = function () {
var comma = this.result.indexOf(',');
var base64 = this.result.substr(comma + 1);
console.log(base64);
}
reader.readAsDataURL(selectedFile);
}, false);
<input id="fileInput" type="file" />
document.querySelector('input').addEventListener('change', function(){
var reader = new FileReader();
reader.onload = function(){
var arrayBuffer = this.result,
array = new Uint8Array(arrayBuffer),
binaryString = String.fromCharCode.apply(null, array);
console.log(binaryString);
console.log(arrayBuffer);
document.querySelector('#result').innerHTML = arrayBuffer + ' '+arrayBuffer.byteLength;
}
reader.readAsArrayBuffer(this.files[0]);
}, false);
<input type="file"/>
<div id="result"></div>
Here is one answer to get the actual final byte array, using just FileReader and ArrayBuffer:
const test_function = async () => {
  ... ... ...
  const get_file_array = (file) => {
    return new Promise((resolve, reject) => {
      const reader = new FileReader();
      reader.onload = (event) => { resolve(event.target.result) };
      reader.onerror = (err) => { reject(err) };
      reader.readAsArrayBuffer(file);
    });
  }
  const temp = await get_file_array(files[0])
  console.log('here we finally have the file as an ArrayBuffer: ', temp);
  const fileb = new Uint8Array(temp)
  ... ... ...
}
where file is the File object you want to read; this has to be done in an async function...
I had a requirement to read user data from a file; the file can be of any format (.csv, .xls, ...).
The user data looks like this:
Name Email Number Image
Yaswanth yas....#gmail.com 0123456789 file:///home/yash/Desktop/.....jpg
When I reach the image value cells I should pick the image file from the local path and convert it to base64, but I was not able to read it; instead I get the following error:
Not allowed to load local resource
Later I thought I could have a hidden file input, dynamically set its value, and handle its onchange event, but for security reasons the file input's value is read-only, so I dropped that idea and am trying with FileReader.
Here is the code which I tried:
changeListener($event,uploadType) : void {
this.readThis($event.target,uploadType);
}
readThis(inputValue: any,uploadType: any): void {
if(uploadType === 'image') {
}
if(uploadType === 'file') {
var file:File = inputValue.files[0];
this.readThisFile(inputValue);
}
}
readThisFile(inputValue: any): void {
const target: DataTransfer = <DataTransfer>(inputValue);
if (target.files.length !== 1) throw new Error('Cannot use multiple files');
const reader: FileReader = new FileReader();
reader.onload = (e: any) => {
/* read workbook */
const bstr: string = e.target.result;
const wb: XLSX.WorkBook = XLSX.read(bstr, { type: 'binary' });
/* grab first sheet */
const wsname: string = wb.SheetNames[0];
const ws: XLSX.WorkSheet = wb.Sheets[wsname];
/* save data */
this.data = <AOA>(XLSX.utils.sheet_to_json(ws));
console.log(this.data);
this.uploadData(this.data)
};
reader.readAsBinaryString(target.files[0]);
}
uploadData(dataToUpload:any[]) {
var doublemulUserList = []
for (let index = 1; index <= dataToUpload.length; index++) {
this.userObject = {
"Name": "",
"Email": "",
"Number": "",
"Image": ""
};
this.userObject.Name = dataToUpload[index-1]['Name'];
this.userObject.Email = dataToUpload[index-1]['Email']
this.userObject.Number = dataToUpload[index-1]['Number']
var xhr = new XMLHttpRequest();
xhr.open("GET", dataToUpload[index-1]['Image'], true);
xhr.responseType = "blob";
xhr.onload = function (e) {
console.log(this.response);
var reader = new FileReader();
reader.onload = function(event) {
var res = event.target.result;
console.log(res)
}
var file = this.response;
reader.readAsDataURL(file)
};
xhr.send()
}
}
Can someone share some links or an idea on how to go about this problem?
To make a long story short:
How do I asynchronously write an ArrayBuffer directly to a file using nsIArrayBufferInputStream in a Firefox extension?
It seems that MDN does not have any documentation on nsIArrayBufferInputStream.
I know I can use nsIStringInputStream and convert the ArrayBuffer to a String, but this poses a big performance hit.
Also, converting an ArrayBuffer to a string using this code:
String.fromCharCode.apply(null, new Uint16Array(buf));
does not work if the buffer is 500 KB or bigger, so we must loop over it one char at a time:
for (let i = 0; i < buf16.length; i++){
  s += String.fromCharCode(buf16[i]);
}
Or, I can use nsIBinaryOutputStream.writeByteArray but it cannot be used with NetUtil.asyncCopy (or can it?)
//this works ok, but is synchronous :-(
function writeBinFile(aFile, data){
  Components.utils.import("resource://gre/modules/FileUtils.jsm");
  let nsFile = Components.Constructor("@mozilla.org/file/local;1", Ci.nsILocalFile, "initWithPath");
  if (typeof aFile === 'string') aFile = nsFile(aFile);
  var stream = FileUtils.openSafeFileOutputStream(aFile, FileUtils.MODE_WRONLY | FileUtils.MODE_CREATE);
  var binaryStream = Cc["@mozilla.org/binaryoutputstream;1"].createInstance(Ci.nsIBinaryOutputStream);
  binaryStream.setOutputStream(stream);
  binaryStream.writeByteArray(data, data.length);
  FileUtils.closeSafeFileOutputStream(stream);
}
And the long story is...
I have been trying to use nsIArrayBufferInputStream
http://dxr.mozilla.org/mozilla-central/source/netwerk/base/public/nsIArrayBufferInputStream.idl
but with no success. The code I tried:
function fileWrite(file, data, callback) {
  Cu.import("resource://gre/modules/FileUtils.jsm");
  Cu.import("resource://gre/modules/NetUtil.jsm");
  let nsFile = Components.Constructor("@mozilla.org/file/local;1", Ci.nsILocalFile, "initWithPath");
  if (typeof file == 'string') file = new nsFile(file);
  let ostream = FileUtils.openSafeFileOutputStream(file)
  let istream = Cc["@mozilla.org/io/arraybuffer-input-stream;1"].createInstance(Ci.nsIArrayBufferInputStream);
  istream.setData(data, 0, data.length);
  let bstream = Cc["@mozilla.org/binaryinputstream;1"].createInstance(Ci.nsIBinaryInputStream);
  bstream.setInputStream(istream);
  //NetUtil.asyncCopy(istream, ostream,
  NetUtil.asyncCopy(bstream, ostream,
    function(status) {
      if (callback) callback(Components.isSuccessCode(status));
    }
  );
}
The ArrayBuffer data param is the response from an XMLHttpRequest:
function getBinFile(url, dir) {
  let nsFile = Components.Constructor("@mozilla.org/file/local;1", Ci.nsILocalFile, "initWithPath");
  let oReq = new XMLHttpRequest();
  oReq.open("GET", url, true);
  oReq.responseType = "arraybuffer";
  oReq.onload = function(oEvent) {
    var arrayBuffer = oReq.response;
    if (arrayBuffer) {
      //let byteArray = new Uint8Array(arrayBuffer);
      let byteArray = arrayBuffer;
      dir = /\\$/.test(dir) ? dir : dir + '\\';
      let file = nsFile(dir + decodeURIComponent(url.split('/').pop()));
      fileWrite(file, byteArray);
    }
  };
  oReq.send(null);
}
calling like this:
getBinFile( 'http://....', 'c:\\demo\\');
A file is created but with no contents!
I'm answering myself in case anyone stumbles upon this question...
With help from Josh Matthews (of Mozilla) I found the answer:
use byteLength instead of length
istream.setData(data, 0, data.byteLength);
I have a list of files I need to save, and in addition to the name I need to send the readAsDataURL result to the server as well.
The problem is I am not sure how to do this given the async nature of readAsDataURL. To save the data URL to the array I need to look up the name of the file, which is in the files list, and I cannot pass the file to the async method of readAsDataURL. How do you write this properly so it works? The end result is I want a list of files sent to the server in one JSZip file.
function saveFileList(files)
{
  for (var i = 0, file; file = files[i]; i++) {
    var fr = new FileReader();
    fr.onload = function(e){
      if (e.target.readyState == FileReader.DONE) {
        var tt = e.target.result.split(",")[1];
        //update the record in the list with the result
      }
    };
    var pp = fr.readAsDataURL(file);
  }
}
If you have a FileList and you need to get an array of base64 strings, you need to do this:
export async function fileListToBase64(fileList) {
// create function which return resolved promise
// with data:base64 string
function getBase64(file) {
const reader = new FileReader()
return new Promise(resolve => {
reader.onload = ev => {
resolve(ev.target.result)
}
reader.readAsDataURL(file)
})
}
// here will be array of promisified functions
const promises = []
// loop through fileList with for loop
for (let i = 0; i < fileList.length; i++) {
promises.push(getBase64(fileList[i]))
}
// array with base64 strings
return await Promise.all(promises)
}
use it like this
const arrayOfBase64 = await fileListToBase64(yourFileList)
You need another function around it, so you can pass the file in. Try this:
var reader = new FileReader();
reader.onload = (function(theFile) {
  return function(e) {
    if (reader.readyState == FileReader.DONE)
      alert(theFile.name); // The file that was passed in.
  };
})(file);
reader.readAsDataURL(file);
An alternative to Russell G's answer:
var reader = new FileReader();
reader.onload = function(event){
var payload = event.target.result;
var filename = file.name, filetype = file.type;//etc
//trigger a custom event or execute a callback here to send your data to server.
};
reader.onerror = function(event){
//handle any error in here.
};
reader.readAsDataURL(file);
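For example, this could be wrapped in a small helper that takes a callback (a sketch; sendToServer is a hypothetical placeholder for whatever upload function you use):
function readFileAsDataURL(file, onDone, onError) {
  var reader = new FileReader();
  reader.onload = function (event) {
    // Hand the data URL and the file metadata back to the caller
    onDone(event.target.result, file.name, file.type);
  };
  reader.onerror = onError;
  reader.readAsDataURL(file);
}

// Usage:
readFileAsDataURL(file, function (payload, filename, filetype) {
  sendToServer({ name: filename, type: filetype, data: payload });
}, function () {
  console.error('Could not read ' + file.name);
});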