I'm looking for a way to split up any text/data file on the front end, in the browser, so it can be uploaded as multiple files. My limit is 40KB per upload, so if a user uploads a 400KB file, it would be split into 10 separate chunks, i.e. 10 separate files, on the front end before being uploaded to the server.
Currently, I'm doing it by converting the file into a base64 string and then splitting that string into 40KB pieces, which comes out to 10 separate chunks. From there, I upload each chunk with a filename of chunk-1-of-10, chunk-2-of-10...
When pulling these files back down, I just concatenate all the chunks and decode the result from base64 back into the original file format.
Is there a better way of doing this? Is there a library that handles all of this instead of writing it from scratch? I'm not sure if the base64 route is the best way to do this.
There is no need to read the content into RAM with a FileReader.
Using base64 will only increase the size of what you need to upload; base64 takes up roughly 33% more space.
Use Blob.slice to get chunks.
Blob slices (chunks) do not increase memory use; a slice is just a new reference to the original blob with a new offset and size telling the browser where to start reading.
When fetch sends the data, it can be streamed directly from disk to the network without ever touching the main thread.
// Simulate a file from an <input>
const file = new File(['a'.repeat(1000000)], 'test.txt')
const chunkSize = 40000
const url = 'https://httpbin.org/post'
for (let start = 0; start < file.size; start += chunkSize) {
  // slice()'s end index is exclusive, so no +1 is needed
  // (the original +1 made each chunk overlap the next by one byte)
  const chunk = file.slice(start, start + chunkSize)
  const fd = new FormData()
  fd.set('data', chunk)
  // requires an async context (module top level or an async function)
  await fetch(url, { method: 'post', body: fd }).then(res => res.text())
}
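When pulling the chunks back down, the same idea works in reverse with no base64 step: collect each chunk as a Blob and let the File constructor concatenate them. A minimal sketch, where chunkUrls is a hypothetical list of the uploaded chunks' locations, not something from the original post:
// Reassemble downloaded chunks; chunkUrls is a hypothetical list of chunk URLs
const parts = []
for (const chunkUrl of chunkUrls) {
  const res = await fetch(chunkUrl)
  parts.push(await res.blob())
}
// The File constructor concatenates the blob parts in order
const restored = new File(parts, 'test.txt')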
You could avoid having to base64-encode by using a FileReader and then sending the data as binary:
const url = 'http://www.example.com/upload';
document.getElementById('file-uploader').addEventListener('change', function (e) {
    const size = 40000;
    var reader = new FileReader();
    var file = document.getElementById('file-uploader').files[0];
    reader.onload = function (e) {
        var buf = new Uint8Array(e.target.result);
        // name the chunks "file-1-of-N", "file-2-of-N", ... as in the question
        // (the original used the byte offset and total byte count instead)
        var totalChunks = Math.ceil(buf.length / size);
        for (var i = 0; i < buf.length; i += size) {
            var fd = new FormData();
            fd.append('fname', [file.name, i / size + 1, 'of', totalChunks].join('-'));
            fd.append('data', new Blob([buf.subarray(i, i + size)]));
            var oReq = new XMLHttpRequest();
            oReq.open("POST", url, true);
            oReq.onload = function (oEvent) {
                // Uploaded.
            };
            oReq.send(fd);
        }
    };
    reader.readAsArrayBuffer(file);
});
<input type="file" id="file-uploader"/>
Related
I am working on a project where I have to upload an image as form data along with other text fields. I have my file as a Base64 string at first; then I convert it into a File before uploading it to the server.
const data = await fetch(base64String);
const blob = await data.blob();
const file = new File([blob], 'avatar', { type: 'image/png' });
I logged the base64String on the client side before uploading it to the server, then uploaded it as a File. When I log it as a base64 string again on the server side, before saving it to MongoDB, I see that the string is not the same as before. I feel like I am doing something wrong while converting the base64 string to a file on the client side. Please help me out.
I have figured out my problem. When I take an image file as input from my computer, I get a base64 string like the one below:
data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAA...
But when I convert it back into a file, the conversion expects only the raw base64 payload, like below:
/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAA....
So, basically, I had to trim the string to match the expected format, and I wrote a base64-to-file conversion function following this answer.
Here is my function to convert a base64 string to an image file:
export function getFileFromBase64(string64: string, fileName: string) {
  // strip the data-URL prefix so only the raw base64 payload remains
  const trimmedString = string64.replace('data:image/jpeg;base64,', '');
  const imageContent = atob(trimmedString);
  const buffer = new ArrayBuffer(imageContent.length);
  const view = new Uint8Array(buffer);
  for (let n = 0; n < imageContent.length; n++) {
    view[n] = imageContent.charCodeAt(n);
  }
  const type = 'image/jpeg';
  const blob = new Blob([buffer], { type });
  return new File([blob], fileName, { lastModified: new Date().getTime(), type });
}
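For completeness, here is how the helper might be wired into an upload; the '/api/profile' endpoint and the 'avatar' field name are hypothetical stand-ins, not from the original post:
// Hypothetical usage: endpoint and field name are examples only
const file = getFileFromBase64(base64String, 'avatar.jpg');
const formData = new FormData();
formData.append('avatar', file);
await fetch('/api/profile', { method: 'POST', body: formData });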
I have implemented a chunk upload feature in my component. For this, I used the file.slice method to split the file, like below:
var start = 0, end = chunkSize;
while (start < fileSize) {
    var blob = file.slice(start, end - 1);
    formData.append('chunkFile', blob, file.name);
    var xhr = new XMLHttpRequest();
    xhr.addEventListener('load', function (e) { console.log(e); }, false);
    xhr.open('POST', '../saveUrl');
    xhr.send(formData);
    start = end;
    end += chunkSize;
}
But the generated chunk files all look like repeats of the first chunk. For example, I uploaded a 5MB image file and split it into 1MB chunks, but all 5 resulting chunk files are identical to the first chunk.
Could you please guide me on how to get the other chunks?
Thanks in advance.
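For reference, a likely cause, assuming formData is created once outside the loop as the snippet suggests: each iteration appends another part to the same FormData, so every request resends the earlier chunks, and the server may only keep the first part it sees. The end - 1 also drops a byte per chunk, since slice's end index is already exclusive. A minimal corrected sketch under those assumptions:
var start = 0, end = chunkSize;
while (start < fileSize) {
    var blob = file.slice(start, end); // end is exclusive, so no -1
    var formData = new FormData();     // fresh FormData per request
    formData.append('chunkFile', blob, file.name);
    var xhr = new XMLHttpRequest();
    xhr.addEventListener('load', function (e) { console.log(e); }, false);
    xhr.open('POST', '../saveUrl');
    xhr.send(formData);
    start = end;
    end += chunkSize;
}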
I need to modify existing frontend (angular) code that involves uploading files to a server. Now the files need to be encrypted before being uploaded.
The current approach uses FormData to append a number of files and send them in a single request as shown below:
function uploadFiles(wrappers){
    var data = new FormData();
    // Add each file
    for(var i = 0; i < wrappers.length; i++){
        var wrapper = wrappers[i];
        var file = wrapper.file;
        data.append('file_' + i, file);
    }
    $http.post(uri, data, requestCfg).then(
        /* ... */);
}
I have been using Forge in other projects, but never in this sort of context, and I don't really see how to encrypt files on the fly and still append them as FormData contents.
Forge provides an easy API:
var key = forge.random.getBytesSync(16);
var iv = forge.random.getBytesSync(8);
// encrypt some bytes
var cipher = forge.rc2.createEncryptionCipher(key);
cipher.start(iv);
cipher.update(forge.util.createBuffer(someBytes));
cipher.finish();
var encrypted = cipher.output;
The backend receives files using Formidable, and all the file handling is already wired. I would thus like to stick to the existing front-end logic and simply insert the encryption step. That is, it's not the entire FormData that must be encrypted... I haven't found a good lead on how to approach this yet.
Suggestions are very welcome!
OK, I found a solution and added the decryption code as well. This adds a layer of async code.
function appendFile(aFile, idx){
    // Encrypt if a key was provided for this protocol test
    if (!key) {
        data.append('dicomfile_' + idx, aFile);
        appendedCount++;
        onFileAppended();
    }
    else {
        var reader = new FileReader();
        reader.onload = function () {
            // 1. Read bytes
            var arrayBuffer = reader.result;
            var bytes = new Uint8Array(arrayBuffer); // byte array aka uint8
            // 2. Encrypt
            var cipher = forge.cipher.createCipher('AES-CBC', key);
            cipher.start({ iv: iv });
            cipher.update(forge.util.createBuffer(bytes));
            cipher.finish();
            // 3. To blob (File extends Blob)
            var encryptedByteCharacters = cipher.output.getBytes(); // similar to an atob(b64) output
            // var asB64 = forge.util.encode64(encryptedBytes);
            // var encryptedByteCharacters = atob(asB64);
            // Convert to a Blob object; byteCharsToBlob is a helper (defined elsewhere)
            // that turns a binary string into a Blob in slices of the given size
            var blob = byteCharsToBlob(encryptedByteCharacters, "application/octet-stream", 512);
            // 4. Append blob
            data.append('dicomfile_' + idx, blob, aFile.name);
            // Decrypt for the sake of testing
            if (true) {
                var fileReader = new FileReader();
                fileReader.onload = function () {
                    arrayBuffer = this.result;
                    var bytez = new Uint8Array(arrayBuffer);
                    var decipher = forge.cipher.createDecipher('AES-CBC', key);
                    decipher.start({ iv: iv });
                    decipher.update(forge.util.createBuffer(bytez));
                    decipher.finish();
                    var decryptedByteCharacters = decipher.output.getBytes();
                    // note: bytes is a Uint8Array and decryptedByteCharacters is a binary
                    // string, so this strict comparison is always false as written
                    var truz = bytes === decryptedByteCharacters;
                    var blob = byteCharsToBlob(decryptedByteCharacters, "application/octet-stream", 512);
                    data.append('decrypted_' + idx, blob, aFile.name + '.decrypted');
                    appendedCount++;
                    onFileAppended();
                };
                fileReader.readAsArrayBuffer(blob);
            }
            else {
                // z. Resume processing
                appendedCount++;
                onFileAppended();
            }
        };
        // Read file
        reader.readAsArrayBuffer(aFile);
    }
}
function onFileAppended(){
    // Only proceed when all files were appended and optionally encrypted (async)
    if (appendedCount !== wrappers.length) return;
    /* resume processing, upload or do whatever */
}
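As an aside, a similar encrypt-then-append step can be written with the browser's built-in Web Crypto API instead of Forge. This is a minimal sketch under assumptions not in the original answer: the raw 16-byte key and a 16-byte IV are already available as Uint8Arrays (AES-CBC in Web Crypto expects a 16-byte IV, unlike the 8-byte IV in the RC2 example above):
// Alternative sketch with Web Crypto (not the original Forge-based code)
async function appendEncryptedFile(data, aFile, idx, rawKey, iv) {
    // rawKey and iv are assumed to be 16-byte Uint8Arrays
    const key = await crypto.subtle.importKey('raw', rawKey, 'AES-CBC', false, ['encrypt']);
    const plaintext = await aFile.arrayBuffer();
    const ciphertext = await crypto.subtle.encrypt({ name: 'AES-CBC', iv: iv }, key, plaintext);
    data.append('dicomfile_' + idx, new Blob([ciphertext]), aFile.name);
}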
In one of my views, I have a file upload control. It supports file uploading either via drag and drop, or via the standard file dialog opened after a button click.
How can I do this in my e2e tests?¹
¹ Just one of the two options will be enough.
You can upload files using JavaScript blobs. This requires the File API, which isn't supported by older browsers (http://caniuse.com/fileapi). But since you mentioned drag-and-drop uploads, which also rely on the File API, it shouldn't matter too much.
There are two ways you can upload files using the blob API. One is very easy and the other is simply a continuation of the first.
Using JavaScript, you can create a new blob with:
var blob = new Blob(["content"], {type: contentType});
For example, this will create a blob object that contains the text "Hello World!":
var foo = new Blob(["Hello World!"], {type: "text/plain"});
The following method is better for non-plaintext files, such as PDFs. You have to convert the file to Base64 (you can use something like this) and create the blob from the Base64 data.
Use this function (a slightly modified version of this) to create the blob.
function b64toBlob(b64Data, contentType, sliceSize) {
b64Data = b64Data.replace(/\s/g, '');
contentType = contentType || '';
sliceSize = sliceSize || 1024;
function charCodeFromCharacter(c) {
return c.charCodeAt(0);
}
var byteCharacters = atob(b64Data);
var byteArrays = [];
for (var offset = 0; offset < byteCharacters.length; offset += sliceSize) {
var slice = byteCharacters.slice(offset, offset + sliceSize);
var byteNumbers = Array.prototype.map.call(slice, charCodeFromCharacter);
var byteArray = new Uint8Array(byteNumbers);
byteArrays.push(byteArray);
}
var blob = new Blob(byteArrays, {type: contentType});
return blob;
}
For example, this will create a PDF blob object.
var pdf = "JVBERi0xLjQKJcfsj6IKNSAwIG9...=="; //base64 encoded file as a String
var pdfBlob = b64toBlob(pdf, "application/pdf", 1024);
After you create the blob with one of the methods above, it can be treated as a file. For example, you could put the file into a FormData object (if you're doing uploads like this):
var fd = new FormData();
fd.append("uploadedFile", pdfBlob, "My PDF.pdf"*);
*Filename parameter only seems to work on Chrome as of now.
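To actually get such a generated file into the page under test, one approach is to assign it to the file input programmatically and dispatch a change event. This is a sketch under assumptions not in the original answer: it requires a browser with a working DataTransfer constructor, and 'file-uploader' is a hypothetical input id:
// Sketch: inject a generated File into a file input from test code
var file = new File(["Hello World!"], "hello.txt", { type: "text/plain" });
var input = document.getElementById("file-uploader"); // hypothetical input id
var dt = new DataTransfer();
dt.items.add(file);
input.files = dt.files;
input.dispatchEvent(new Event("change", { bubbles: true }));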