I am getting a broken image icon when displaying a Base64-encoded PNG image returned from a REST API.
JavaScript:
function getcap() {
    var http = new XMLHttpRequest();
    http.open("GET", "http://localhost:8888/newcaptcha", true);
    http.setRequestHeader("Content-Type", "text/plain;charset=UTF-8");
    http.setRequestHeader("Access-Control-Allow-Origin", "http://localhost:8888");
    http.send();
    http.onload = () => {
        var resp = unescape(encodeURIComponent(http.responseText));
        var b64Response = window.btoa(resp);
        console.log('data:image/png;base64,' + b64Response);
        document.getElementById("capimg").src = 'data:image/png;base64,' + b64Response;
    }
}
HTML:
<div id="newCaptcha" onclick="getcap()"><h5>new captcha:</h5><img id="capimg" width="30" height="30"/></div>
Base64-encoded response: (attached)
Server code (Java/Spring):
@CrossOrigin(origins = "http://localhost:8080")
@RequestMapping(value = "/newcaptcha", method = RequestMethod.GET, produces = "image/png")
public @ResponseBody byte[] getnewCaptcha() {
    try {
        Random random = new Random();
        imgkey = random.nextInt(3);
        InputStream is = this.getClass().getResourceAsStream("/" + captcheMap.get(imgkey) + ".png");
        BufferedImage img = ImageIO.read(is);
        ByteArrayOutputStream bao = new ByteArrayOutputStream();
        ImageIO.write(img, "png", bao);
        return bao.toByteArray();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
The attached Base64 response doesn't actually load as an image if I open it in a browser.
Secondly, one problem that can cause this is the reloading of the img DOM element; if it's not handled by a framework, you may have to intervene manually. To check this, test with a local image and load that. If that doesn't work, you have found your root cause; if it does work, then the Base64 response is the issue.
Also, check the console for any errors and update the question here.
As I pointed out in the comments, you probably don't need Base64 at all: the endpoint already serves image/png, so the img src can simply point straight at it, as in the sketch below.
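A minimal sketch of the no-Base64 route; the cache-busting timestamp parameter is my own addition, to force a fresh captcha on every click:
function getcap() {
    // Point the image directly at the PNG endpoint; the query parameter
    // (my addition) defeats browser caching so each click gets a new captcha.
    document.getElementById("capimg").src = "http://localhost:8888/newcaptcha?t=" + Date.now();
}
However, if you really want Base64, read on.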
There are tons of questions on Stack Overflow on this subject, and few complete answers. I have put all the pieces together.
The point is that btoa() supports binary data badly: it expects a string whose characters all fit in the Latin-1 range, so arbitrary bytes get mangled.
Here: convert binary data to base-64 javaScript you find the suggestion to use an arraybuffer as the responseType, instead of just text.
Here: ArrayBuffer to base64 encoded string you find a function that converts arraybuffers to Base64.
Putting it all together:
function getcap() {
    var http = new XMLHttpRequest();
    http.open("GET", "/newcaptcha", true);
    http.responseType = 'arraybuffer';
    http.send();
    http.onload = () => {
        console.log(http.response);
        var b64Response = _arrayBufferToBase64(http.response);
        document.getElementById("capimg").src = 'data:image/png;base64,' + b64Response;
    };
}

function _arrayBufferToBase64(buffer) {
    var binary = '';
    var bytes = new Uint8Array(buffer);
    var len = bytes.byteLength;
    for (var i = 0; i < len; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    return window.btoa(binary);
}
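For what it's worth, on current browsers you can skip the manual Base64 step entirely with fetch and an object URL. A sketch of the same idea, not part of the original solution:
function getcap() {
    fetch("/newcaptcha")
        .then(function (res) { return res.blob(); })
        .then(function (blob) {
            // The browser builds a URL straight from the bytes; no Base64 needed.
            document.getElementById("capimg").src = URL.createObjectURL(blob);
        });
}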
I'm trying to save JPG files with Cloud Code on Parse Server.
On Android I can do it this way:
Bitmap bitmap = ((BitmapDrawable) myImageView.getDrawable()).getBitmap();
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte[] byteArrayPhotoUpdate = stream.toByteArray();
final ParseFile pictureFileParse = new ParseFile(newUserInfo.getObjectId() + ".JPEG", byteArrayPhotoUpdate);
newUserInfo.put("profile_picture", pictureFileParse);
newUserInfo.saveInBackground();
But I have no idea how to do this in Cloud Code. I call my Cloud Code functions like this:
HashMap<String, String> params = new HashMap<>();
ParseCloud.callFunctionInBackground("myCloudFuncion", params, new FunctionCallback<String>() {
    @Override
    public void done(String aFloat, ParseException e) {
    }
});
but I have no idea how to pass a bitmap in the HashMap params.
I have already searched the internet, but nothing I found helped; the links that refer to something useful are old and outdated, from the era of the old Parse.
In the Parse docs I found this:
var base64 = "V29ya2luZyBhdCBQYXJzZSBpcyBncmVhdCE=";
var file = new Parse.File("myfile.txt", { base64: base64 });
This confused me, because I don't know whether the two "base64" occurrences refer to the variable or to a base64 type.
Should I convert my bitmap to Base64 and send it as a parameter to the Cloud Code function?
If you have been through this and know how, I will be very happy to hear your solution.
Thank you!
You need to convert your image bitmap to Base64, like this:
Bitmap bitmap = ((BitmapDrawable) img.getDrawable()).getBitmap();
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte[] byteArrayPhotoUpdate = stream.toByteArray();
String encodedfile = new String(Base64.encodeBase64(byteArrayPhotoUpdate), "UTF-8");
Then send your Base64 string in the params, like this:
HashMap<String, String> params = new HashMap<>();
params.put("fileInfo", encodedfile);
ParseCloud.callFunctionInBackground("saveParseUserInfo", params, new FunctionCallback<String>() {
    @Override
    public void done(String aFloat, ParseException e) {
        Log.i("ewaeaweaweaweawe", "done: " + aFloat);
    }
});
Now, in your Cloud Code, use this:
Parse.Cloud.define("saveParseUserInfo", function(request, response) {
var userId = request.user.id;
var base64 = request.params.fileInfo;
var userClass = Parse.Object.extend("User");
//create a user object to set ACL
var userObject = userClass.createWithoutData(userId);
//create new ParseObject
var userPublicClass = Parse.Object.extend("userPublic");
var userPublic = new userPublicClass();
var aclAction = new Parse.ACL(userObject);
aclAction.setPublicReadAccess(true);
userPublic.setACL(aclAction);
userPublic.set("name", "name random");
userPublic.set("username", "username_random");
//Now create a Parse File object
var file = new Parse.File("photo.jpeg", { base64: base64 });
//set file object in a colum profile_picture
userPublic.set("profile_picture",file);
//save
userPublic.save(null, { useMasterKey: true,
success: function(actionSuccess) {
response.success("saved!!");
},
error: function(action, error) {
// Execute any logic that should take place if the save fails.
// error is a Parse.Error with an error code and message.
response.error(error.message);
}
});
});
I hope it helps you.
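Note that on newer Parse Server versions the success/error save options above no longer exist; save() returns a promise instead. A minimal sketch of the same save in that style (assuming a recent SDK):
userPublic.save(null, { useMasterKey: true })
    .then(function (saved) {
        response.success("saved!!");
    }, function (error) {
        response.error(error.message);
    });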
This answer works if you do not wish to use the java.util.Base64 class, which requires API 26 and above on Android.
I know João Armando has answered this question, but this is for the benefit of others who, like me, support versions before API 26 on Android.
P.S. Base64.encodeBase64(...) is deprecated; Base64.getEncoder()... is used now, which requires API 26.
There are three key parts to the solution:
1. Convert your bitmap to a byte array
2. Send this byte array directly as params when calling your cloud function
3. Format this byte array in the Cloud Code itself
In Android:
1. Convert the bitmap to byte[]
Bitmap bitmap = <Your source>;
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
2. Send as params when calling the cloud function
HashMap<String, Object> params = new HashMap<>();
params.put("imageInByteArray", byteArray);
ParseCloud.callFunctionInBackground("yourCloudFunction", params, new FunctionCallback<Map>() {
    @Override
    public void done(Map object, ParseException e) {
        if (e == null) {
            // Success
        } else {
            // Failed
        }
    }
});
3. In the cloud function/code
Depending on the JavaScript version you use, the code may differ. I am using a backend-as-a-service provider, which has moved on from promise-style code. The logic should still be applicable regardless.
Parse.Cloud.define("reportId", async request => {
// Retrieve and set values from client app
const imageInByteArray = request.params.imageInByteArray;
// Format as ParseFile
var file = new Parse.File("image.png", imageInByteArray);
// Initialize your class, etc.
....
// Save your object
await yourImageObject.save(null, {useMasterKey:true});
});
I always get this error for the downloaded zip file C:\Users\me\Downloads\test.zip: "Unexpected end of archive".
My current code is:
var blob = new Blob([data], { // data here is the binary content
type: 'octet/stream',
});
var zipUrl = window.URL.createObjectURL(blob);
var fileName = orderNo;
fileName += '.zip';
downloadFile(null, fileName, null, zipUrl, null); // just creates a hidden anchor tag and triggers the download
The response of the call is binary (I think): Binary Content Here
But the preview is Base64: Base64 Content. And it is the correct one; the way I verify it is by using this fiddle.
You can refer to the screenshot of the network tab here.
I put the Base64 content in this line: var sampleBytes = base64ToArrayBuffer(''); and the downloaded zip opens just fine.
Things I have tried so far:
Adding these headers to the GET call:
var headers = {
Accept: 'application/octet-stream',
responseType: 'blob',
};
But I get: "Request header field responseType is not allowed by Access-Control-Allow-Headers in preflight response."
We're using an existing ajax.service.js in our AngularJS project.
From this answer
var blob = new Blob([yourBinaryDataAsAnArrayOrAsAString], {type: "application/octet-stream"});
var fileName = "myFileName.myExtension";
saveAs(blob, fileName);
There are other things I have tried that I have not listed; I will edit the question once I find them again.
But here is where I am right now: the preview is the correct Base64 of the binary file. Is it possible to use that instead of the binary? (If it is, I won't need to dig up the other methods I've tested.) I tried some binary-to-Base64 converters, but they don't work.
So I just went and ditched our ajax.service.js for this specific call.
I used the xhr snippet from this answer. I just added the headers necessary for our call: tokens and auth stuff.
And I used this code snippet for the conversion thing.
And the code looks like this:
fetchBlob(url, function (blob) {
    // Array buffer to Base64:
    var base64 = btoa(String.fromCharCode.apply(null, new Uint8Array(blob)));

    var blob = new Blob([base64ToArrayBuffer(base64)], {
        type: 'octet/stream',
    });
    var zipUrl = window.URL.createObjectURL(blob);
    var fileName = orderNo;
    fileName += ' Attachments ';
    fileName += moment().format('DD-MMM-YYYY');
    fileName += '.zip';
    downloadFile(null, fileName, null, zipUrl, null); // create a hidden anchor tag and trigger the download
});
function fetchBlob(uri, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', uri, true);
    xhr.responseType = 'arraybuffer';
    var x = AjaxService.getAuthHeaders();
    xhr.setRequestHeader('auth_stuff', x['auth_stuff']);
    xhr.setRequestHeader('token_stuff', x['token_stuff']);
    xhr.setRequestHeader('Accept', 'application/octet-stream');
    xhr.onload = function (e) {
        if (this.status == 200) {
            var blob = this.response;
            if (callback) {
                callback(blob);
            }
        }
    };
    return xhr.send();
}
function base64ToArrayBuffer(base64) {
    var binaryString = window.atob(base64);
    var binaryLen = binaryString.length;
    var bytes = new Uint8Array(binaryLen);
    for (var i = 0; i < binaryLen; i++) {
        var ascii = binaryString.charCodeAt(i);
        bytes[i] = ascii;
    }
    return bytes;
}
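In hindsight, the ArrayBuffer-to-Base64-and-back round trip above is redundant; the response bytes can feed the Blob directly. A shorter sketch of the same callback, untested against our service:
fetchBlob(url, function (arrayBuffer) {
    // Build the Blob straight from the ArrayBuffer response,
    // skipping the Base64 round trip entirely.
    var blob = new Blob([arrayBuffer], { type: 'application/zip' });
    var zipUrl = window.URL.createObjectURL(blob);
    downloadFile(null, orderNo + '.zip', null, zipUrl, null);
});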
I'm getting a file from a server with AJAX (Angular). The file is a simple XLSX document, sent like this:
ob_start();
$file = \PHPExcel_IOFactory::createWriter($xls, 'Excel2007');
$file->save('php://output');
$response->setContent(ob_get_clean());
$response->headers->replace(array(
    'Content-Type' => 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
    'Content-Disposition' => 'attachment; filename="file.xlsx"'
));
When I make the request from the frontend, I use the Accept header too. Then I save the file with angular-file-saver, using FileSaver.js and Blob.js.
But the received file is corrupt and I can't open it in Excel: its size is (for example) 12446 bytes, but Chrome's DevTools Network tab shows the response's Content-Length header as 7141 bytes.
How can I solve this problem?
UPD:
I'm sending a request like this:
$http.get(baseURL + '/' + entity + '/export/?' + condition + sort, {
headers: {'Accept': 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet; charset=utf-8'}
});
and download the file like this:
var data = new Blob([response.data], {type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet;charset=utf-8'});
FileSaver.saveAs(data, 'file.xlsx');
The way I got around the problem was to use plain JS AJAX instead of AngularJS. (There might be a problem with how AngularJS and jQuery handle binary responses.)
This should work:
var request = new XMLHttpRequest();
request.open('GET', 'http://yourserver/yourpath', true);
request.responseType = 'blob';
request.onload = function (e) {
    if (this.status === 200) {
        var blob = this.response;
        if (window.navigator.msSaveOrOpenBlob) {
            var fileNamePattern = /filename[^;=\n]*=((['"]).*?\2|[^;\n]*)/;
            window.navigator.msSaveBlob(blob, fileNamePattern.exec(request.getResponseHeader("content-disposition"))[1]);
        } else {
            var downloadLink = window.document.createElement('a');
            var contentTypeHeader = request.getResponseHeader("Content-Type");
            var b = new Blob([blob], { type: contentTypeHeader });
            downloadLink.href = window.URL.createObjectURL(b);
            var fileNamePattern = /filename[^;=\n]*=((['"]).*?\2|[^;\n]*)/;
            downloadLink.download = fileNamePattern.exec(request.getResponseHeader("content-disposition"))[1];
            document.body.appendChild(downloadLink);
            downloadLink.click();
            document.body.removeChild(downloadLink);
            // revokeObjectURL takes the object URL string, not the Blob itself
            window.URL.revokeObjectURL(downloadLink.href);
        }
    }
};
request.send();
Code is based on this and this.
FYI, I found that new Blob([response.data], ...) produces almost double the size of response.data when response.data is returned not as a blob but as text/plain or application/vnd.openxmlformats-officedocument.spreadsheetml.sheet. (The binary-as-text string gets UTF-8 encoded, which inflates every byte above 0x7F to two bytes.) To get around it, you need to pass it an array of bytes instead:
var i, l, d, array;
d = this.result;
l = d.length;
array = new Uint8Array(l);
for (i = 0; i < l; i++) {
    array[i] = d.charCodeAt(i);
}
var b = new Blob([array], { type: 'application/octet-stream' });
window.location.href = URL.createObjectURL(b);
Code is from here.
Anyway, since the AJAX response is not correct when using AngularJS, you won't get a valid XLSX file this way. You need to go with vanilla JS.
To make a long story short:
How do I asynchronously write an ArrayBuffer directly to a file using nsIArrayBufferInputStream in a Firefox extension?
It seems that MDN does not have any documentation on nsIArrayBufferInputStream.
I know I can use nsIStringInputStream and convert the ArrayBuffer to a string, but this incurs a big performance hit.
Also, converting an ArrayBuffer to a string using this code:
String.fromCharCode.apply(null, new Uint16Array(buf));
does not work if the buffer is 500 KB or bigger (apply overflows the call stack), so we must loop over it one character at a time:
for (let i = 0; i < buf.length; i++) {
    s += String.fromCharCode(buf16[i]);
}
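(For reference, a middle ground I did not use: call apply in fixed-size chunks, which stays under the argument limit without going one character at a time. My sketch, under the same Uint16Array assumption as above:)
function bufferToString(buf) {
    var CHUNK = 0x8000; // 32K characters per apply() call
    var view = new Uint16Array(buf);
    var parts = [];
    for (var i = 0; i < view.length; i += CHUNK) {
        parts.push(String.fromCharCode.apply(null, view.subarray(i, i + CHUNK)));
    }
    return parts.join('');
}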
Or, I can use nsIBinaryOutputStream.writeByteArray but it cannot be used with NetUtil.asyncCopy (or can it?)
// this works OK, but is synchronous :-(
function writeBinFile(aFile, data) {
    Components.utils.import("resource://gre/modules/FileUtils.jsm");
    let nsFile = Components.Constructor("@mozilla.org/file/local;1", Ci.nsILocalFile, "initWithPath");
    if (typeof aFile === 'string') aFile = nsFile(aFile);
    var stream = FileUtils.openSafeFileOutputStream(aFile, FileUtils.MODE_WRONLY | FileUtils.MODE_CREATE);
    var binaryStream = Cc["@mozilla.org/binaryoutputstream;1"].createInstance(Ci.nsIBinaryOutputStream);
    binaryStream.setOutputStream(stream);
    binaryStream.writeByteArray(data, data.length);
    FileUtils.closeSafeFileOutputStream(stream);
}
And the long story is...
I have been trying to use nsIArrayBufferInputStream:
http://dxr.mozilla.org/mozilla-central/source/netwerk/base/public/nsIArrayBufferInputStream.idl
but with no success. The code I tried:
function fileWrite(file, data, callback) {
    Cu.import("resource://gre/modules/FileUtils.jsm");
    Cu.import("resource://gre/modules/NetUtil.jsm");
    let nsFile = Components.Constructor("@mozilla.org/file/local;1", Ci.nsILocalFile, "initWithPath");
    if (typeof file == 'string') file = new nsFile(file);
    let ostream = FileUtils.openSafeFileOutputStream(file);
    let istream = Cc["@mozilla.org/io/arraybuffer-input-stream;1"].createInstance(Ci.nsIArrayBufferInputStream);
    istream.setData(data, 0, data.length);
    let bstream = Cc["@mozilla.org/binaryinputstream;1"].createInstance(Ci.nsIBinaryInputStream);
    bstream.setInputStream(istream);
    //NetUtil.asyncCopy(istream, ostream,
    NetUtil.asyncCopy(bstream, ostream,
        function(status) {
            if (callback) callback(Components.isSuccessCode(status));
        }
    );
}
The ArrayBuffer data param is the response from an XMLHttpRequest:
function getBinFile(url, dir) {
    let nsFile = Components.Constructor("@mozilla.org/file/local;1", Ci.nsILocalFile, "initWithPath");
    let oReq = new XMLHttpRequest();
    oReq.open("GET", url, true);
    oReq.responseType = "arraybuffer";
    oReq.onload = function(oEvent) {
        var arrayBuffer = oReq.response;
        if (arrayBuffer) {
            //let byteArray = new Uint8Array(arrayBuffer);
            let byteArray = arrayBuffer;
            dir = /\\$/.test(dir) ? dir : dir + '\\';
            let file = nsFile(dir + decodeURIComponent(url.split('/').pop()));
            fileWrite(file, byteArray);
        }
    };
    oReq.send(null);
}
Calling it like this:
getBinFile( 'http://....', 'c:\\demo\\');
A file is created but with no contents!
I'm answering myself in case anyone stumbles upon this question...
With help from Josh Matthews (of Mozilla) I found the answer: use byteLength instead of length. An ArrayBuffer has a byteLength property rather than length, so data.length was undefined and the stream had nothing to copy, which is why the file came out empty:
istream.setData(data, 0, data.byteLength);
I'm trying to POST an image (with metadata) to Picasa Web Albums from within a Chrome extension. Note that a regular POST with Content-Type image/xyz works, as I described here. However, I wish to include a description/keywords, and the protocol specification describes a multipart/related format with an XML part and a data part.
I'm getting the data through the HTML5 FileReader and a user file input. I retrieve a binary string using:
FileReader.readAsBinaryString(file);
Assume this is my callback code once the FileReader has loaded the string:
function upload_to_album(binaryString, filetype, albumid) {
    var method = 'POST';
    var url = 'http://picasaweb.google.com/data/feed/api/user/default/albumid/' + albumid;
    var request = gen_multipart('Title', 'Description', binaryString, filetype);

    var xhr = new XMLHttpRequest();
    xhr.open(method, url, true);
    xhr.setRequestHeader("GData-Version", '3.0');
    xhr.setRequestHeader("Content-Type", 'multipart/related; boundary="END_OF_PART"');
    xhr.setRequestHeader("MIME-version", "1.0");
    // Add OAuth Token
    xhr.setRequestHeader("Authorization", oauth.getAuthorizationHeader(url, method, ''));
    xhr.onreadystatechange = function(data) {
        if (xhr.readyState == 4) {
            // .. handle response
        }
    };
    xhr.send(request);
}
The gen_multipart function just generates the multipart body from the input values and the XML template, and produces exactly the same output as the specification (apart from ..binary image data..), but for the sake of completeness, here it is:
function gen_multipart(title, description, image, mimetype) {
    var multipart = ['Media multipart posting', " \n", '--END_OF_PART', "\n",
        'Content-Type: application/atom+xml', "\n", "\n",
        "<entry xmlns='http://www.w3.org/2005/Atom'>", '<title>', title, '</title>',
        '<summary>', description, '</summary>',
        '<category scheme="http://schemas.google.com/g/2005#kind" term="http://schemas.google.com/photos/2007#photo" />',
        '</entry>', "\n", '--END_OF_PART', "\n",
        'Content-Type:', mimetype, "\n\n",
        image, "\n", '--END_OF_PART--'];
    return multipart.join("");
}
The problem is that the POST payload differs from the raw image data, and thus leads to a Bad Request (Picasa won't accept the image), although it worked fine when using:
xhr.send(file) // with Content-Type set to file.type
My question is: how do I get the real binary image data into the multipart body? I assume it gets mangled by simply being appended to the XML string, but I can't seem to fix it.
Note that due to an old bug in Picasa, Base64 is not the solution.
The XMLHttpRequest specification states that data sent using the .send() method is converted to Unicode and encoded as UTF-8.
The recommended way to upload binary data is through the FormData API. However, since you're not just uploading a file but wrapping the binary data within XML, this option is not useful here.
The solution can be found in the source code of the FormData for Web Workers polyfill, which I wrote when I encountered a similar problem. To prevent the Unicode conversion, all data is added to an array and finally transmitted as an ArrayBuffer. Per the specification, the byte sequences are not touched on transmission.
The code below is a specific derivative, based on the FormData for Web Workers polyfill:
function gen_multipart(title, description, image, mimetype) {
    var multipart = ["..."].join(''); // See question for the source
    var uint8array = new Uint8Array(multipart.length);
    for (var i = 0; i < multipart.length; i++) {
        uint8array[i] = multipart.charCodeAt(i) & 0xff;
    }
    return uint8array.buffer; // <-- This is an ArrayBuffer object!
}
The script becomes more efficient when you use .readAsArrayBuffer instead of .readAsBinaryString:
function gen_multipart(title, description, image, mimetype) {
    image = new Uint8Array(image); // Wrap in view to get data
    var before = ['Media ... ', 'Content-Type:', mimetype, "\n\n"].join('');
    var after = '\n--END_OF_PART--';
    var size = before.length + image.byteLength + after.length;
    var uint8array = new Uint8Array(size);
    var i = 0;

    // Append the string.
    for (; i < before.length; i++) {
        uint8array[i] = before.charCodeAt(i) & 0xff;
    }
    // Append the binary data.
    for (var j = 0; j < image.byteLength; i++, j++) {
        uint8array[i] = image[j];
    }
    // Append the remaining string
    for (var j = 0; j < after.length; i++, j++) {
        uint8array[i] = after.charCodeAt(j) & 0xff;
    }
    return uint8array.buffer; // <-- This is an ArrayBuffer object!
}
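To show how the pieces fit together, here is a sketch of the calling side (the reader wiring is mine; file, albumid, and upload_to_album are from the question, with upload_to_album now assumed to take an ArrayBuffer):
var reader = new FileReader();
reader.onload = function () {
    // reader.result is an ArrayBuffer; gen_multipart wraps it and returns
    // another ArrayBuffer, which xhr.send() transmits byte-for-byte.
    upload_to_album(reader.result, file.type, albumid);
};
reader.readAsArrayBuffer(file);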