If you are sending data that is base64-encoded and compressed (using, say, Python's zlib.compress()), you can use the native Chrome function window.atob() to convert from base64 to binary data. Is there any similar native JavaScript function to decompress the zlib-compressed data? Is there some hack to do this?
I know that code to decompress data is already in the browser because it can receive HTML sent with gzip headers.
I am not looking for a JavaScript library to do decompression.
If you can come up with a decompression scheme on the browser, I can compress to that format for transmission. In other words, any decompression routine is acceptable.
Here's a hack that paints a PNG containing the compressed data into a canvas and reads the data back, pixel by pixel: Compression using Canvas and PNG-embedded data. If you want something that uses the browser's native compression library, here's one option. Unfortunately, you have to convert the ImageData to a string within JavaScript.
There is no such function exposed.
2020 Update
Chrome 80+ supports CompressionStream and DecompressionStream APIs
Gzip-compress a stream
const compressedReadableStream = inputReadableStream.pipeThrough(new CompressionStream('gzip'));
Deflate-compress an ArrayBuffer
function compressArrayBuffer(input) {
  const stream = new Response(input).body
    .pipeThrough(new CompressionStream('deflate'));
  return new Response(stream).arrayBuffer();
}
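A matching decompression helper, for symmetry (a sketch; it assumes the input ArrayBuffer holds deflate-compressed data):
function decompressArrayBuffer(input) {
  const stream = new Response(input).body
    .pipeThrough(new DecompressionStream('deflate'));
  return new Response(stream).arrayBuffer();
}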
Gzip-decompress a Blob to a Blob
This treats the input as a gzip file regardless of the mime-type. The output Blob has an empty mime-type.
async function DecompressBlob(blob) {
  const ds = new DecompressionStream('gzip');
  const decompressedStream = blob.stream().pipeThrough(ds);
  return await new Response(decompressedStream).blob();
}
https://github.com/WICG/compression/blob/master/explainer.md
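To tie this back to the original question: Python's zlib.compress() emits zlib-wrapped deflate data (RFC 1950), which is the 'deflate' format these streams expect, so you can chain atob() into a DecompressionStream. A sketch (the function name and the b64 parameter are illustrative):
async function inflateBase64(b64) {
  // atob() yields a binary string; map each char code to a byte first
  const bytes = Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));
  const stream = new Response(bytes).body
    .pipeThrough(new DecompressionStream('deflate'));
  return new Response(stream).arrayBuffer();
}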
Try window.btoa.
Related
I've been trying to use JS's XMLHttpRequest class for file uploading. I initially tried something like this:
const file = thisFunctionReturnsAFileObject();
const request = new XMLHttpRequest();
request.open('POST', '/upload-file');
const rawFileData = await file.arrayBuffer();
request.send(rawFileData);
The above code works (yay!), and sends the raw binary data of the file to my server.
However... it uses a TON of memory (because the whole file gets stored in memory, and JS isn't particularly memory-friendly). I found out that on my machine (16GB RAM), I couldn't send files larger than ~100MB, because JS would allocate too much memory and the Chrome tab would crash with a SIGILL code.
So, I thought it would be a good idea to use ReadableStreams here. It has good enough browser compatibility in my case (https://caniuse.com/#search=ReadableStream) and my TypeScript compiler told me that request.send(...) supports ReadableStreams (I later came to the conclusion that this is false). I ended up with code like this:
const file = thisFunctionReturnsAFileObject();
const request = new XMLHttpRequest();
request.open('POST', '/upload-file');
const fileStream = file.stream();
request.send(fileStream);
But my TypeScript compiler betrayed me (which hurt) and I received "[object ReadableStream]" on my server ಠ_ಠ.
I still haven't explored the above method too much, so I'm not sure if there might be a way to do this. I'd also appreciate help on this very much!
Splitting the request into chunks would be an optimal solution: once a chunk has been sent, we can free it from memory before the whole request has even been received.
I have searched and searched, but haven't found a way to do this yet (which is why I'm here...). Something like this in pseudocode would be optimal:
const file = thisFunctionReturnsAFileObject();
const request = new XMLHttpRequest();
request.open('POST', '/upload-file');
const fileStream = file.stream();
const fileStreamReader = fileStream.getReader();
const sendNextChunk = async () => {
  const chunk = await fileStreamReader.read();
  if (!chunk.done) { // chunk.done implies that there is no more data to be read
    request.writeToBody(chunk.value); // chunk.value is a Uint8Array; writeToBody is the API I wish existed
    sendNextChunk(); // keep going until the stream is drained
  } else {
    request.end(); // also an imaginary API
  }
}
sendNextChunk();
I'd expect this code to send the request in chunks and end the request once all chunks are sent.
The most helpful resource I tried, but didn't work:
Method for streaming data from browser to server via HTTP
Didn't work because:
I need the solution to work in a single request
I can't use RTCDataChannel, it must be in a plain HTTP request (is there another way to do this than XMLHttpRequest?)
I need it to work in modern Chrome/Firefox/Edge etc. (no IE support is fine)
Edit: I don't want to use multipart form data (the FormData class). I want to send the actual binary data read from the file stream, in chunks.
You can't do this with XHR afaik. But the more modern fetch API does support passing a ReadableStream for the request body. In your case:
const file = thisFunctionReturnsAFileObject();
const response = await fetch('/upload-file', {
  method: 'POST',
  body: file.stream(),
  duplex: 'half', // Chrome requires this option for streaming request bodies
});
However, I'm not certain whether this will actually use chunked encoding. (Note: Chrome's implementation of upload streaming requires the duplex: 'half' option shown above and an HTTP/2 or HTTP/3 connection; it does not fall back to HTTP/1.1 chunked encoding.)
You are facing a Chrome bug where they set a hard limit of 256MB on the size of ArrayBuffers that can be sent.
But anyway, sending an ArrayBuffer will create a copy of the data, so you should rather send your data as a File directly: this will read the File exactly like you wanted, as a stream, by small chunks.
So, taking your first code block, that would give:
const file = thisFunctionReturnsAFileObject();
const request = new XMLHttpRequest();
request.open('POST', '/upload-file');
request.send(file);
And this will work in Chrome too, even with files of a few GB. The only limit you would face here would come earlier, in whatever processing you do on that File.
Regarding posting ReadableStreams, this will eventually come, but as of today, July 13th 2020, only Chrome has started working on its implementation, we web devs still can't play with it, and the specs are still having a hard time settling on something stable.
But that's not a problem for you, since you would not gain anything by doing so anyway: posting a ReadableStream made from a static File is pointless, as both fetch and xhr already do this internally.
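For completeness, the same direct-File upload with fetch (a minimal sketch; as above, the browser streams the File from disk rather than copying it into memory):
const file = thisFunctionReturnsAFileObject();
const response = await fetch('/upload-file', {
  method: 'POST',
  body: file, // a File/Blob body is read in chunks, no ArrayBuffer copy
});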
A Python backend reads a binary file, base64 encodes it, inserts it into a JSON doc and sends it to a JavaScript frontend:
#Python
with open('some_binary_file', 'rb') as in_file:
    return base64.b64encode(in_file.read()).decode('utf-8')
The JavaScript frontend fetches the base64 encoded string from the JSON document and turns it into a binary blob:
#JavaScript
const b64_string = response['b64_string'];
const decoded_file = atob(b64_string);
const blob = new Blob([decoded_file], {type: 'application/octet-stream'});
Unfortunately, when downloading the blob, the encoding seems to be wrong, but I'm not sure where the problem is. For example, an Excel file can't be opened anymore. On the Python side I've tried different decoders ('ascii', 'latin1') but that doesn't make a difference. Is there a problem with my code?
I found the answer here. The problem was on the JavaScript side: just applying atob to the base64-encoded string doesn't work for binary data. You have to convert it into a typed byte array. What I ended up doing (LiveScript):
byte_chars = atob base64_str
byte_numbers = [byte_chars.charCodeAt(index) for bc, index in byte_chars]
byte_array = new Uint8Array byte_numbers
blob = new Blob [byte_array], {type: 'application/octet-stream'}
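For reference, the same fix in plain JavaScript (assumes base64_str holds the encoded string from the response):
const byteChars = atob(base64_str);
const byteNumbers = new Array(byteChars.length);
for (let i = 0; i < byteChars.length; i++) {
  byteNumbers[i] = byteChars.charCodeAt(i); // each char code is one byte value
}
const byteArray = new Uint8Array(byteNumbers);
const blob = new Blob([byteArray], {type: 'application/octet-stream'});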
I'm not 100% sure, but from what I've read, when I send a blob (binary data) over a WebSocket, the blob does not contain any file information. (The official specification also states that WebSockets only send the raw binary.) So the following get lost:
the filesize
the mimetype
user info (explained later)
I'm using https://github.com/websockets/ws
Testing:
Sending the blob from a file input directly.
ws.send(this.files[0]) // this should already contain the info
Creating a new blob from the file with the native JavaScript API, setting the proper mimetype.
ws.send(new Blob([this.files[0]], {type: this.files[0].type})); // also this
In both cases, the receiving side gets only the raw blob, without any other information.
Is it possible to prepend, let's say, 4kb of predefined JSON data, also converted to binary, that contains important information like the mimetype and the filesize,
and then just split off the 4kb when needed?
{"mime":"txt/plain","size":345}____________4KB_REST_OF_THE_BINARY
OR
ws.send({"mime":"txt\/plain","size":345})
ws.send(this.files[0])
Even if the first one is the worst solution ever, it would allow me to send everything in one go.
The second one has a big problem:
it's a chat that also allows sending files like documents, images, music, videos.
I could write some sort of handshaking system that sends the file/user info before the binary data.
BUT
if another person also sends a file, since it's all async, the handshaking system has no way to determine which file belongs to which user and mimetype.
So how do you properly send a binary file in a multiuser async environment?
I know I can convert to base64, but that's about 33% bigger.
Btw, totally disappointed with Apple... while Chrome shows every kind of binary data properly, my iOS devices are not able to handle blobs; only images will show in blob or base64 format, not even a simple txt file. Basically only an <img> tag can read dynamic files.
How everything works (now):
user sends a file
Node.js gets the binary data, also user info... but not the mimetype, filename, or size.
Node.js broadcasts the raw binary file to all the users (can't specify user & file info).
clients create a blob URL (who sent that? XD).
EDIT
what i have now:
client 1 (sends a file)CHROME
fileInput.addEventListener('change', function(e) {
  var file = this.files[0];
  ws.send(new Blob([file], {
    type: file.type // <- SET MIMETYPE
  }));
  // file.size
}, false);
note: the file is already a blob... but this is how you would normally create a new blob, specifying the mimetype.
server (broadcasts the binary data to the other clients)NODEJS
aaaaaand the mimetype is gone...
ws.addListener('message', function(binary) {
  var b = 0, c = wss.clients.length;
  while (b < c) {
    wss.clients[b++].send(binary);
  }
});
client 2 (recieves the binary)CHROME
ws.addEventListener('message', function(msg) {
  var blob = new Blob([msg.data], {
    type: 'application/octet-stream' // <- LOST
  });
  var file = window.URL.createObjectURL(blob);
}, false);
note: msg.data is already a blob... but this is how you would normally create a new blob, specifying the mimetype, which is lost.
In client 2 I need the mimetype, and naturally I also need the info about the user, which can be retrieved from client 1 or from the server (not a good choice)...
You're a bit out of luck with this because Node doesn't support the Blob interface, so any data you send or receive as binary with Node is just binary. You would need something that knows how to interpret a Blob object.
Here's an idea; let me know if this works. Reading through the documentation for websockets/ws, it says it supports sending and receiving ArrayBuffers, which means you can use TypedArrays.
Here's where it gets nasty. You set a fixed number of bytes at the beginning of every TypedArray to signal the mime type, encoded in utf8 or what have you, and the rest of your TypedArray contains your file's bytes.
I would recommend using Uint8Array, because mime-type characters are plain ASCII and therefore single bytes in utf8, so your text will be readable when encoded that way. As for the file bits, you'll probably just end up writing those down somewhere and appending an extension to them.
Also note, this method of interpretation works both ways whether from Node or in the Browser.
This solution is really just a form of type casting and you might get some unexpected results. The fixed length of your mime type field is crucial.
Here it is illustrated. Copy, paste, set the image file to whatever you want, and run it. You'll see the mime type I set pop out.
var fs = require('fs');

// https://stackoverflow.com/questions/8609289/convert-a-binary-nodejs-buffer-to-javascript-arraybuffer
function toUint8Array(buffer) {
  var ab = new ArrayBuffer(buffer.length);
  var array = new Uint8Array(ab);
  for (var i = 0; i < buffer.length; ++i) {
    array[i] = buffer[i];
  }
  return array;
}

// data is a raw Buffer object
fs.readFile('./ducklings.png', function (err, data) {
  var mime = Buffer.from('image/png');
  var allBuffed = Buffer.concat([mime, data]);
  var array = toUint8Array(allBuffed);
  var mimeBytes = array.subarray(0, 9); // 'image/png' is 9 characters long
  console.log(String.fromCharCode.apply(null, mimeBytes));
});
Here's how you do it on the client side:
SOLUTION A: GET A PACKAGE
Get buffer, an implementation of Node's Buffer API for browsers. The solution for concatenating byte buffers will work exactly as before. You can append fields like To: and whatnot as well. The way you format your headers to best serve your clients will be an evolving process, I'm sure.
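A sketch of what that could look like in the browser (the 16-byte header width and the fileArrayBuffer variable are assumptions for illustration):
const { Buffer } = require('buffer/'); // trailing slash: the browser package, not Node's built-in
const header = Buffer.from('image/png'.padEnd(16)); // fixed-width mime field
const payload = Buffer.concat([header, Buffer.from(fileArrayBuffer)]);
ws.send(payload);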
SOLUTION B: OLD SCHOOL
STEP 1: Convert your Blob to an ArrayBuffer
Notes: How to convert a String to an ArrayBuffer
var fr = new FileReader();
fr.addEventListener('loadend', function () {
  // Asynchronous action in part 2.
  var message = concatenateBuffers(headerStringAsBuffer, fr.result);
  ws.send(message);
});
fr.readAsArrayBuffer(blob);
STEP 2: Concatenate ArrayBuffers
function concatenateBuffers(buffA, buffB) {
  var byteLength = buffA.byteLength + buffB.byteLength;
  var resultBuffer = new ArrayBuffer(byteLength);
  // wrap the ArrayBuffers in typed arrays/views
  var resultView = new Uint8Array(resultBuffer);
  var viewA = new Uint8Array(buffA);
  var viewB = new Uint8Array(buffB);
  // copy 8-bit integers, AKA bytes
  resultView.set(viewA);
  resultView.set(viewB, viewA.byteLength);
  return resultView.buffer;
}
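For example, building the message from Step 1 with a fixed-width header (the 16-byte width is an assumption; fr.result is the ArrayBuffer from the FileReader above):
var headerStringAsBuffer = new TextEncoder().encode('image/png'.padEnd(16)).buffer;
var message = concatenateBuffers(headerStringAsBuffer, fr.result);
ws.send(message);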
STEP 3: Receive and Reblob
I won't repeat how to convert the concatenated string bytes back into a string, since I've done that in the server example, but turning the file bytes into a blob of your mime type is fairly simple:
new Blob([buffer.slice(offset)], {type: mimetype});
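Putting the receive side together (a sketch, assuming the same fixed 16-byte header as above and that the socket's binaryType is set to 'arraybuffer'):
function reblob(buffer) {
  // decode the fixed-width header, then wrap the remaining file bytes
  var mimetype = new TextDecoder().decode(new Uint8Array(buffer, 0, 16)).trim();
  return new Blob([buffer.slice(16)], {type: mimetype});
}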
This Gist by robnyman goes into further details on how you would use an image transmitted via XHR, put it into localstorage, and use it in an image tag on your page.
I liked @Breedly's idea of prepending a fixed-length byte array to indicate the mime type of the ArrayBuffer, so I created this npm package that I use when dealing with websockets; maybe others might find it useful.
Example usage
const {
  arrayBufferWithMime,
  arrayBufferMimeDecouple
} = require('arraybuffer-mime')

// some image array buffer
const uint8 = new Uint8Array(1)
uint8[0] = 1
const ab = uint8.buffer

const abWithMime = arrayBufferWithMime(ab, 'image/png')

const {mime, arrayBuffer} = arrayBufferMimeDecouple(abWithMime)
console.log(mime) // "image/png"
console.log(arrayBuffer) // ArrayBuffer
I am encoding an MP3 file to Base64 in Node.js using this method:
encodebase64 = function(mp3file) {
  var bitmap = fs.readFileSync(mp3file);
  var encodedstring = new Buffer(bitmap).toString('base64');
  fs.writeFileSync('encodedfile.bin', encodedstring);
};
and then I want to reconstruct the MP3 file from the Base64 .bin file, but the created file is missing some headers, so obviously there's a problem with the decoding.
The decoding function is:
decodebase64 = function(encodedfile) {
  var bitmap = fs.readFileSync(encodedfile);
  var decodedString = new Buffer(bitmap, 'base64');
  fs.writeFileSync('decodedfile.mp3', decodedString);
};
I wondered if anyone can help
Thanks.
Perhaps it is an issue with the encoding parameter. See this answer for details. Try using utf8 when decoding to see if that makes a difference. What platforms are you running your code on?
@Noah mentioned an answer about base64 decoding using Buffers, but if you use the code from that answer and try to create MP3 files with it, they won't play and their file size will be larger than the originals, just like you experienced in the beginning.
We should write the buffer directly to the MP3 file we want to create, without converting it (the buffer) to an ASCII string:
// const buff = Buffer.from(audioContent, 'base64').toString('ascii'); // don't
const buff = Buffer.from(audioContent, 'base64');
fs.writeFileSync('test2.mp3', buff);
More info about fs.writeFile / fs.writeFileAsync
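A minimal round trip with these fixes applied (a sketch; note the 'utf8' argument when reading the encoded file back, so the base64 text is decoded as a string rather than wrapped as a raw Buffer):
const fs = require('fs');

// encode: binary file -> base64 text
const b64 = fs.readFileSync('original.mp3').toString('base64');
fs.writeFileSync('encodedfile.bin', b64);

// decode: read the base64 text back as a string, then straight to a Buffer
const buff = Buffer.from(fs.readFileSync('encodedfile.bin', 'utf8'), 'base64');
fs.writeFileSync('decodedfile.mp3', buff);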
I am writing an Adobe AIR app in HTML/JavaScript and I am trying to base64-encode an image so I can add it to an XML-RPC request. I have tried many methods and nothing seems to work.
I see that ActionScript has a Base64Encoder class that looks like it would work; is there any way to utilize this from JavaScript?
Thanks @some for the link.
I used the btoa() function to base64 encode image data like this:
var loader = new air.URLLoader();
loader.dataFormat = air.URLLoaderDataFormat.BINARY;
loader.addEventListener(air.Event.COMPLETE, function(e) {
  var base64image = btoa(loader.data);
});
var req = new air.URLRequest('file://your_path_here');
loader.load(req);
I was trying to upload an image using metaWeblog.newMediaObject, but it turns out that the data doesn't need to be base64 encoded, so the binary value was all that was needed.