Decode JavaScript-packed binary data on the PHP server side - javascript

I have an HTML form with two controls: a file input #resume and a submit button #action_cv.
Most of the files I can (and want to) upload to the server are binary.
Here is the code that reads and packs (encodes) this data on the front end (JavaScript):
function readFile(file, callback) {
    let reader = new FileReader();
    reader.onload = callback;
    reader.readAsArrayBuffer(file);
}

$(document).ready(function () {
    $('#action_cv').on('submit', function (event) {
        let input_data = {
            resume: null,
            resume_data: null
        };
        let resume = null;
        if (($("#resume"))[0].files.length > 0) {
            resume = ($("#resume"))[0].files[0];
            input_data['resume'] = resume.name;
            readFile(resume, function (evt) {
                let data = evt.target.result;
                let bs = String.fromCharCode.apply(null, new Uint8Array(data));
                input_data['resume_data'] = bs;
            });
        }
        // AJAX call with input_data skipped here...
    })
});
Here data contains the raw binary data from the file, and bs is that data packed for AJAX submission to the server, which is written in PHP. My question is very simple: how do I unpack this encoded (packed) binary data in PHP to get the original file on the server side? (No need to show the file-writing part - that seems evident.)

Related

Why is nodeJs not reading entire binary file from disk?

I have a PDF file which I want to read into memory using NodeJS. Ideally I'd like to encode it as base64 for transfer. But somehow the read function does not seem to read the full PDF file, which makes no sense to me. The original PDF was generated using pdfKit, and it is fine and viewable in a PDF reader program.
The original file test.pdf is 90 kB on disk. But if I read it and write it back to disk, only 82 kB come out and the new PDF test-out.pdf is broken. The PDF viewer says:
Unable to open document. The pdf document is damaged.
The base64 encoding therefore also does not work correctly. I tested it using this webservice. Does anyone know why this is happening, and how to resolve it?
I found this post already.
fs = require('fs');
let buf = fs.readFileSync('test.pdf'); // returns raw buffer binary data
// buf = fs.readFileSync('test.pdf', {encoding:'base64'}); // for the base64 encoded data
// ...transfer the base64 data...
fs.writeFileSync('test-out.pdf', buf); // should be pdf again
EDIT MCVE:
const fs = require('fs');
const PDFDocument = require('pdfkit');

let filepath = 'output.pdf';

class PDF {
    constructor() {
        this.doc = new PDFDocument();
        this.setupdocument();
        this.doc.pipe(fs.createWriteStream(filepath));
    }

    setupdocument() {
        var pageNumber = 1;
        this.doc.on('pageAdded', () => {
            this.doc.text(++pageNumber, 0.5 * (this.doc.page.width - 100), 40, {width: 100, align: 'center'});
        });

        this.doc.moveDown();
        // draw some headline text
        this.doc.fontSize(25).text('Some Headline');
        this.doc.fontSize(15).text('Generated: ' + new Date().toUTCString());
        this.doc.moveDown();
        this.doc.font('Times-Roman', 11);
    }

    report(object) {
        this.doc.moveDown();
        this.doc
            .text(object.location + ' ' + object.table + ' ' + Date.now())
            .font('Times-Roman', 11)
            .moveDown()
            .text(object.name)
            .font('Times-Roman', 11);
        this.doc.end();
        let report = fs.readFileSync(filepath);
        return report;
    }
}

let pdf = new PDF();
let buf = pdf.report({location: 'athome', table: 'wood', name: 'Bob'});
fs.writeFileSync('outfile1.pdf', buf);
The encoding option for fs.readFileSync() is for telling the readFile function what encoding the file already is, so the code reading the file knows how to interpret the data it reads. It does not convert the data into that encoding.
In this case, your PDF is binary - it's not base64 - so you are telling it to try to convert it from base64 into binary, which messes up the data.
You should not be passing the encoding option at all; you will then get the RAW binary buffer (which is what a PDF file is - raw binary). If you then want to convert that to base64 for some reason, you can do buf.toString('base64') on it. But that is not its native format, and if you write that converted data back out to disk, it won't be a legal PDF file.
To just read and write the same file out to a different filename, leave off the encoding option entirely:
const fs = require('fs');
let buf = fs.readFileSync('test.pdf'); // get raw buffer binary data
fs.writeFileSync('test-out.pdf', buf); // write out raw buffer binary data
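For completeness, a minimal sketch of the base64 round trip described above (the round-trip output file name is invented here):

const fs = require('fs');

let buf = fs.readFileSync('test.pdf');             // raw binary Buffer
let b64 = buf.toString('base64');                  // base64 string, safe to transfer as text
let restored = Buffer.from(b64, 'base64');         // decode back to raw bytes
fs.writeFileSync('test-roundtrip.pdf', restored);  // byte-for-byte copy of the original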
After a lot of searching I found this GitHub issue. The problem in my question seems to be the call to doc.end(), which for some reason doesn't wait for the stream to finish (the finish event of the write stream). Therefore, as suggested in the GitHub issue, the following approaches work:
callback based:
doc = new PDFDocument();
writeStream = fs.createWriteStream('filename.pdf');
doc.pipe(writeStream);
doc.end();
writeStream.on('finish', function () {
    // do stuff with the PDF file
});
or promise based:
const stream = fs.createWriteStream(localFilePath);
doc.pipe(stream);
.....
doc.end();
await new Promise<void>(resolve => {
    stream.on("finish", function() {
        resolve();
    });
});
or, even nicer, instead of calling doc.end() directly, call the function savePdfToFile below:
function savePdfToFile(pdf : PDFKit.PDFDocument, fileName : string) : Promise<void> {
    return new Promise<void>((resolve, reject) => {
        // To determine when the PDF has finished being written successfully
        // we need to confirm the following 2 conditions:
        //
        //   1. The write stream has been closed
        //   2. PDFDocument.end() was called synchronously without an error being thrown
        let pendingStepCount = 2;

        const stepFinished = () => {
            if (--pendingStepCount == 0) {
                resolve();
            }
        };

        const writeStream = fs.createWriteStream(fileName);
        writeStream.on('close', stepFinished);
        pdf.pipe(writeStream);

        pdf.end();
        stepFinished();
    });
}
This function should correctly handle the following situations:
PDF generated successfully
Error is thrown inside pdf.end() before write stream is closed
Error is thrown inside pdf.end() after write stream has been closed
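A usage sketch (the generate wrapper and file name are made up; savePdfToFile itself pipes the document and calls end() for you):

const fs = require('fs');
const PDFDocument = require('pdfkit');

async function generate() {
    const doc = new PDFDocument();
    doc.fontSize(25).text('Some Headline');
    await savePdfToFile(doc, 'output.pdf'); // resolves only after the write stream has closed
    return fs.readFileSync('output.pdf');   // safe to read the finished file back now
}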

How do I send image to server via socket.io?

I've been beating my head over this and I can't find a proper solution.
I want to be able to upload images to the server via a socket.io emit and save them to a MongoDB database later. How do I do this? I've seen people doing it with base64 encoding, but I can't figure out how exactly that works. There are other questions on this website asking about sending an image from the server to the client via socket.io, but none about this. All help is appreciated. <3
Goal: To upload an image to server with socket.emit('image', someimagefile) or similar.
I'd really appreciate it if you could also show a similar way to send an image to the client.
As you mentioned, you can convert the image to base64 using FileReader.readAsDataURL and send the encoded string, and decode it on the server:
document.getElementById('file').addEventListener('change', function() {
    const reader = new FileReader();
    reader.onload = function() {
        const base64 = this.result.replace(/.*base64,/, '');
        socket.emit('image', base64);
    };
    reader.readAsDataURL(this.files[0]);
}, false);

socket.on('image', async image => {
    const buffer = Buffer.from(image, 'base64');
    await fs.writeFile('/tmp/image', buffer).catch(console.error); // fs.promises
});
Or, better, use FileReader.readAsArrayBuffer to get an array of bytes that you'll send to the server:
document.getElementById('file').addEventListener('change', function() {
    const reader = new FileReader();
    reader.onload = function() {
        const bytes = new Uint8Array(this.result);
        socket.emit('image', bytes);
    };
    reader.readAsArrayBuffer(this.files[0]);
}, false);

socket.on('image', async image => {
    // image is an array of bytes
    const buffer = Buffer.from(image);
    await fs.writeFile('/tmp/image', buffer).catch(console.error); // fs.promises
});
To receive from the server:
// Server side
socket.emit('image', image.toString('base64')); // image should be a buffer
// Client side
socket.on('image', image => {
    // create image with
    const img = new Image();
    // change image type to whatever you use, or detect it in the backend
    // and send it if you support multiple extensions
    img.src = `data:image/jpg;base64,${image}`;
    // Insert it into the DOM
});
Base64 can work, but one more thing to keep in mind is that the Socket.IO buffer size limit is 1 MB by default (this can be increased, according to the docs).
So I guess if the file size is huge, it's better to stream it with something like socket.io-stream.
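For reference, a sketch of raising that limit on the server side (assumes Socket.IO v3+, where the option is called maxHttpBufferSize; httpServer is whatever HTTP server you attach to):

const { Server } = require('socket.io');

// default is ~1 MB; raise it to roughly 10 MB for larger uploads
const io = new Server(httpServer, {
    maxHttpBufferSize: 10 * 1024 * 1024
});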
I don't know if anyone is still looking for this, but I made it possible to send media via socket.io... here is the code:
// sending media from client side
$("#send_media").change(function (e) {
var data = e.originalEvent.target.files[0];
var reader = new FileReader();
reader.onload = function (evt) {
var msg = {};
msg.file = evt.target.result;
msg.fileName = data.name;
socket.emit("base64 file", msg);
console.log(msg)
};
reader.readAsDataURL(data);
});
// client side: showing received media in the UI
socket.on("base64 image", (msg) => {
    console.log("as", msg);
    $(".messages").append(`<img src=${msg.file} alt="Red dot" />`);
    scrollToBottom();
});

// server side: receive the file and broadcast it to the room
socket.on("base64 file", function (msg) {
    console.log("received base64 file from client: " + msg.fileName);
    socket.username = msg.username;
    io.to(roomId).emit("base64 image", {
        file: msg.file,
        fileName: msg.fileName,
    });
    // or io.sockets.emit("base64 file", {...}) to send to every connected client
});

Sending file to server from a HTML input file

I have a web page where I upload a file from an HTML file input element.
Then I need to send that file by URL with Ajax to a REST web service on a Java server, where it will be processed.
I tried the FileReader.readAsDataURL function in JavaScript: the file is encoded in base64 and then sent to the server by URL. But when I try to decode it in Java, it fails.
How can I achieve this?
My client-side code:
var file = document.getElementById('add_attach').files[0];
var filename = document.getElementById('add_attach').value;
if (file) {
    var reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = function(e) {
        var name = encodeURIComponent(filename);
        var file_content = e.target.result;
        //Ajax request to server with sending file_content
    };
}
My server-side code:
file_content = file_content.substring(file_content.indexOf("base64"));
file_content = file_content.replace("base64,", "");
byte[] decodedBytes = Base64.getDecoder().decode(file_content);
The error I get:
java.lang.IllegalArgumentException: Illegal base64 character 20

Python / Django fails at decoding file encoded as base64 by javascript

I'm using this, in react, to base64 encode an image file:
fileToBase64 = (filename, filepath) => {
    return new Promise(resolve => {
        var file = new File([filename], filepath);
        var reader = new FileReader();
        reader.onload = function(event) {
            resolve(event.target.result);
        };
        reader.readAsDataURL(file);
    });
};
Which gets called by this:
handleChangeFile = event => {
    const { name, files } = event.target;
    if (files.length) {
        const file = files[0];
        let fields = this.state.fields;
        this.fileToBase64(file).then(result => {
            fields[name].value = result;
        });
        fields[name].isFilled = true;
        this.setState({
            fields: fields
        });
    }
};
And the whole fields variable gets posted to a Django server; no issues so far.
On the Python/Django end:
str_encoded = request.data["file"]
str_decoded = base64.b64decode(str_encoded)
The second line raises binascii.Error: Invalid base64-encoded string: length cannot be 1 more than a multiple of 4. I've googled and read that this is probably a padding issue, but I don't know how to fix it.
You will have to strip the base64 string of the prefix added by JavaScript.
The prefix is something like data:{type};base64,{actual-base64-string-follows}
In PHP, where I had the same issue, I tested whether the string starts with the "data:" prefix, and if so I strip it from the start of the string up to the position of the ; (semicolon) plus 8 characters (to remove the final ";base64,").
Then you can use Python to decode the remaining base64 string, as it is now valid base64.
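Equivalently, the prefix can be stripped on the JavaScript side before posting, so the server receives a plain base64 string. A sketch (dataUrlToBase64 is just a made-up helper name):

// Strip the "data:<type>;base64," prefix from a data URL, leaving plain base64.
function dataUrlToBase64(dataUrl) {
    const marker = ';base64,';
    return dataUrl.substring(dataUrl.indexOf(marker) + marker.length);
}

// e.g. in the FileReader onload handler from the question:
//   fields[name].value = dataUrlToBase64(event.target.result);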

image files corrupt with websocket file transfer

I am using WebSockets for file transfer. While downloading a file I receive the data as-is, but when I open an image file it is corrupted. Data files download fine. The code goes as follows:
try {
    fileEntry = fs.root.getFile(filename, { create: creat_file });

    var byteArray = new Uint8Array(data.data.length);
    for (var i = 0; i < data.data.length; i++) {
        byteArray[i] = data.data.charCodeAt(i) & 0xff;
    }

    BlobBuilderObj = new WebKitBlobBuilder();
    BlobBuilderObj.append(byteArray.buffer);

    if (!writer) {
        writer = fileEntry.createWriter();
        pos = 0;
    }
    //self.postMessage(writer.position);
    writer.seek(pos);
    writer.write(BlobBuilderObj.getBlob());
    pos += 4096;
}
catch (e) {
    errorHandler(e);
}
It looks like you are reading data from a WebSocket as a string, converting it to a Blob, and then writing this to a file.
If you have control of the WebSocket server then the best thing would be to send the data as binary frames instead of UTF-8 text data. If you can get the server to send the data as binary frames then you can just tell the WebSocket to deliver the data as Blobs:
ws.binaryType = "blob";
ws.onmessage = function (event) {
    if (event.data instanceof Blob) {
        // event.data is a Blob
    } else {
        // event.data is a string
    }
};
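On the server side, sending binary frames could look like this (a sketch assuming a Node server using the ws npm module and a made-up file name; adapt to your actual server):

const fs = require('fs');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });
wss.on('connection', function (ws) {
    const data = fs.readFileSync('image.png');  // raw binary Buffer
    ws.send(data, { binary: true });            // arrives in the browser as a Blob
});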
If that is not an option and you can only send text frames from the server, then you will need to encode the binary data to text before sending it from the server and then decode the text on the other end. If you try to send binary data directly as text frames over WebSockets, then doing charCodeAt(x) & 0xff will result in corrupt data.
For example you could base64 encode the data at the server and then base64 decode the data in the client:
ws.onmessage = function (event) {
    raw = window.atob(event.data);
};
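If you then need actual bytes rather than a string (for example, to build a Blob to write with the FileWriter from the question), a sketch of the conversion:

ws.onmessage = function (event) {
    const raw = window.atob(event.data);     // "binary string": one char per byte
    const bytes = new Uint8Array(raw.length);
    for (let i = 0; i < raw.length; i++) {
        bytes[i] = raw.charCodeAt(i);        // each char code is already 0-255 after atob
    }
    const blob = new Blob([bytes]);          // ready to be written to a file
};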
Update:
There is a very well performing pure JavaScript base64 decode/encode contained in websockify. It decodes to an array of numbers from 0-255, but it could easily be modified to return a string instead if that is what you require. (Disclaimer: I made websockify.)
