In my Angular 2 application, I want to show an image that I receive as a Uint8Array, but I am getting a 'Maximum call stack size exceeded' error. Images of around ~48 KB render with no error, but once the image size goes above ~300 KB I get this error. This is how I am rendering the image:
(<HTMLInputElement>document.getElementById("imagePreview")).src = "data:image/" + type + ";base64," +
btoa(String.fromCharCode.apply(null, objData.Body));
Can someone please tell me whether I am doing this the right way? If not, please tell me how to do it correctly.
String.fromCharCode() will run into a 'Maximum call stack size exceeded' error with large data, because apply() passes every single byte as a separate argument.
To convert the object to base64 you need to build the string with a loop over the data length. Something like this comes to mind:
let img: HTMLInputElement = (<HTMLInputElement>document.getElementById("imagePreview"));
let bin : string = '';
for (let i = 0, l = objData.Body.length; i < l; i++) {
    bin += String.fromCharCode(objData.Body[i]);
}
img.src = "data:image/" + type + ";base64," + btoa(bin);
It is probably more efficient to build the string in larger chunks than one character at a time, but finding the fastest approach is left to you :)
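For what it's worth, a rough sketch of that chunked idea might look like this (the helper name and the 8 KB chunk size are my own choices, not from the answer):

function uint8ToBase64(bytes: Uint8Array): string {
    const CHUNK_SIZE = 0x2000; // 8K characters per fromCharCode call keeps the argument list small
    let binary = '';
    for (let i = 0; i < bytes.length; i += CHUNK_SIZE) {
        const chunk = bytes.subarray(i, i + CHUNK_SIZE);
        binary += String.fromCharCode.apply(null, Array.from(chunk));
    }
    return btoa(binary);
}

// img.src = "data:image/" + type + ";base64," + uint8ToBase64(objData.Body);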
I had this issue when rendering a base64 image at 16K resolution from the DB, but it had nothing to do with the answer above.
In my case the error was caused by Angular's sanitization process, so I had to mark the image as trusted to prevent the sanitization check from running.
This line solved the issue for me; hope it helps someone:
const safeUrl = domSanitizer.bypassSecurityTrustUrl(base64string)
Then bind it in the template like this:
<img [src]="safeUrl">
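For context, a minimal Angular sketch of how this could be wired together, assuming the data URL is already built as a string (the component and method names here are illustrative only):

import { Component } from '@angular/core';
import { DomSanitizer, SafeUrl } from '@angular/platform-browser';

@Component({
    selector: 'image-preview',
    template: `<img [src]="safeUrl">`
})
export class ImagePreviewComponent {
    safeUrl: SafeUrl;

    constructor(private domSanitizer: DomSanitizer) {}

    // base64string is expected to be a complete data URL, e.g. "data:image/png;base64,..."
    setImage(base64string: string): void {
        this.safeUrl = this.domSanitizer.bypassSecurityTrustUrl(base64string);
    }
}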
For example, I have this string here:
'x���'
which you may not be able to see properly, depending on the device you're using. It is the number 2024000250 encoded as a 32-bit signed big-endian integer, which I've generated using Node:
let buffer = new Buffer(4);
buffer.writeInt32BE(2024000250);
buffer.toString();
I'm receiving the 4 bytes in question on the client side but I can't seem to find how to turn them back into an integer...
I might be dead wrong here, but as far as I remember Unicode characters can take between 1 and 4 bytes when encoded as UTF-8. When you transfer your binary data as text to the client side you risk corrupting it, because the client is going to interpret the bytes as Unicode text.
If I were to convert that text to a blob on client side:
var b = new Blob(['x���'],{type:"application/octet-stream"});
b.size; //10
As you can see I receive 10 bytes, which is wrong, it should have been 4.
Since you are using Node, you can build the buffer on the server side like this:
function createBuffer(v){
    var b = new ArrayBuffer(4),
        vw = new DataView(b);
    vw.setInt32(0, v);
    return b;
}
This will create your buffer. You cannot just send it to the client as it is; you have to represent it either as JSON or as a binary string. To represent it as a binary string you don't need the function above, you could simply have done:
("0".repeat(32) + (2024000250).toString(2)).slice(-32); //"01111000101000111100101011111010"
If you want JSON, you can do:
function convertBuffToBinaryStr(buff){
    var res = [],
        l = buff.byteLength,
        v = new DataView(buff);
    for (var i = 0; i < l; ++i){
        res.push(v.getUint8(i));
    }
    return JSON.stringify(res);
}
Now try seeing what this outputs:
convertBuffToBinaryStr(createBuffer(2024000250)); //"[120,163,202,250]"
Back on the client-side you have to interpret this:
function interpret(json){
    json = JSON.parse(json);
    return parseInt(json.map((d)=>("0".repeat(8) + d.toString(2)).slice(-8)).join(""), 2);
}
Now try:
interpret("[120,163,202,250]"); //2024000250
Note: since you are using signed integers, the interpret function above won't work for all cases. For a general version you have to write the bytes back with DataView.setUint8 and then read the value with getInt32 at the end.
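As a rough sketch of that DataView variant (my own code, following the note above rather than anything in the original answer):

// Rebuild the 4 bytes with a DataView so the sign bit is handled for us,
// instead of parsing a hand-built binary string.
function interpretSigned(json: string): number {
    const bytes: number[] = JSON.parse(json);     // e.g. [120, 163, 202, 250]
    const view = new DataView(new ArrayBuffer(4));
    bytes.forEach((b, i) => view.setUint8(i, b)); // write the raw bytes back
    return view.getInt32(0);                      // read them as a signed big-endian int32
}

// interpretSigned("[120,163,202,250]"); // 2024000250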
Well I finally got around to getting this to work.
It is not quite what I started off with but I'm just gonna post this for some other lost souls.
It is worth mentioning that ibrahim's answer contains most of the necessary information, but it addresses the XY problem that my question ended up being.
I just send my binary data as binary:
let buffer = new Buffer(4);
buffer.writeInt32BE(2024000250);
// websocket connection
connection.send(buffer);
Then in the browser
// message listener
let reader = new FileReader();
reader.addEventListener('loadend', () => {
    let view = new DataView(reader.result);
    // there goes the precious data
    console.log(view.getInt32(0));
});
reader.readAsArrayBuffer(message.data);
In all honesty this tickles my gag reflex. Why am I using a file reader to get some data out of a binary message? There is a good chance a better way of doing this exists, if so please add to this answer.
Other methods I found are the fetch API, which is no better than the FileReader in terms of hack rating, and Blob.prototype.arrayBuffer, which is not yet fully supported.
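Since the answer explicitly invites alternatives: one possible sketch, assuming a plain browser WebSocket, is to set binaryType to 'arraybuffer' so that message.data arrives as an ArrayBuffer and the FileReader step can be skipped (the URL below is a placeholder):

// Ask the browser WebSocket to deliver binary frames as ArrayBuffers.
const socket = new WebSocket('ws://example.com/stream'); // placeholder URL
socket.binaryType = 'arraybuffer';

socket.addEventListener('message', (event: MessageEvent) => {
    const view = new DataView(event.data as ArrayBuffer);
    console.log(view.getInt32(0)); // big-endian signed 32-bit int, e.g. 2024000250
});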
I have a Kafka stream with Avro messages, using the confluent.io Kafka package. This works fine for the Java applications, but I am now trying to read these messages in JavaScript.
I've been attempting to use the kafka-node + avsc packages to decode the messages from a buffer to a string, using the schema. I know that Confluent puts 5 header bytes in front: a magic byte (0) plus 4 bytes for the schema id.
So I slice the Buffer to remove those bytes and attempt to send the rest to avsc to decode, but I get an error:
return this.buf.utf8Slice(pos, pos + len);
RangeError: out of range index
at RangeError (native)
at Tap.readString (C:\git\workflowapps\workItemsApp\node_modules\avsc\lib\utils.js:452:19)
at StringType._read (C:\git\workflowapps\workItemsApp\node_modules\avsc\lib\types.js:612:58)
Attempting to decode this manually also leaves lots of non-UTF-8 characters, and I am losing data that way.
Sample Code:
consumer.on('message', function(message) {
    var val = message.value.slice(4);
    sails.log.info('val buffer', val, val.length);
    sails.log.info('hex', val.toString('hex'));
    var type = avro.parse({
        "type": "record",
        "name": "StatusEvent",
        "fields": [
            {"name": "ApplicationUUID", "type": "string"},
            {"name": "StatusUUID", "type": "string"},
            {"name": "Name", "type": "string"},
            {"name": "ChangedBy", "type": "string"},
            {"name": "ChangedByUUID", "type": "string"},
            {"name": "ChangedAt", "type": "long"}
        ]
    });
    var decodedValue = type.fromBuffer(val);
    sails.log.info('Decoded', decodedValue);
});
Your slice(4) should be slice(5) (otherwise you're only skipping 4 out of the 5 header bytes). You might also be able to find helpful information here.
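A small sketch of what that might look like in the question's consumer, assuming the Confluent wire format described above (1 magic byte plus a 4-byte big-endian schema id); the header-reading lines are my own addition:

consumer.on('message', function (message) {
    const buf = message.value;           // Node Buffer delivered by kafka-node
    const magicByte = buf.readUInt8(0);  // should be 0 for the Confluent wire format
    const schemaId = buf.readInt32BE(1); // 4-byte schema id, big-endian
    const payload = buf.slice(5);        // the Avro body starts after the 5 header bytes

    const decodedValue = type.fromBuffer(payload); // `type` as parsed in the question
    sails.log.info('magic', magicByte, 'schema', schemaId, 'decoded', decodedValue);
});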
This should be straightforward, but I am not sure why I am getting the error. I am using the constructor with an ArrayBuffer as the parameter, as shown on MDN, but I am getting an 'invalid arguments' error. (P.S. I have checked with a DataView; the data is Int16 only.)
The code is:
var view = new DataView(arrayBuf);
console.log('arrayBuf.byteLength : '+arrayBuf.byteLength);
console.log('data at 0 : '+view.getInt16(0));
console.log('data at 1 : '+view.getInt16(1));
var int16arry = new Int16Array(arrayBuf);
The console output is:
"arrayBuf.byteLength : 117"
"data at 0 : 22720"
"data at 1 : -16315"
Error: invalid arguments
What is my mistake?
The short answer is that your ArrayBuffer is the wrong size. You can use:
var int16Array = new Int16Array(arrayBuf, 0, Math.floor(arrayBuf.byteLength / 2));
to hack away the problem.
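To illustrate why the constructor complains in the first place (a standalone sketch, not part of the original answer): an Int16Array view over an ArrayBuffer needs a byte length that is a multiple of 2, so a 117-byte buffer is rejected unless you pass an explicit element count.

const arrayBuf = new ArrayBuffer(117);  // odd byte length, as in the question

// new Int16Array(arrayBuf);            // RangeError: byte length is not a multiple of 2

const safeLength = Math.floor(arrayBuf.byteLength / 2);     // 58 elements, final byte ignored
const int16Array = new Int16Array(arrayBuf, 0, safeLength); // view over the first 116 bytes
console.log(int16Array.length);                             // 58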
Case-specific comment:
I have tried reading the source for your library, but I am unable to see why you are getting that extra byte (or what is missing).
The data you are getting is supposed to be 16-bit ints, but for some reason you have other data there that takes up an uneven number of bytes. According to the source, as far as I can tell, there should be some doubles (JavaScript floats) in there as well, meaning that "hacking" away the problem might not work.
I'm having some trouble receiving packet data from a TCP stream. I think this is due in part to not understanding the server's responses.
My code (Objective-C):
unsigned type=0;
unsigned bufferFirstByte=0;
unsigned bufferSecondByte=0;
unsigned bufferThirdByte=0;
NSScanner *hexToInt = [NSScanner scannerWithString:[mutableBuffer objectAtIndex:0]];
[hexToInt scanHexInt:&bufferFirstByte];
hexToInt = [NSScanner scannerWithString:[mutableBuffer objectAtIndex:1]];
[hexToInt scanHexInt:&bufferSecondByte];
hexToInt = [NSScanner scannerWithString:[mutableBuffer objectAtIndex:2]];
[hexToInt scanHexInt:&bufferThirdByte];
hexToInt = [NSScanner scannerWithString:[mutableBuffer objectAtIndex:0]];
[hexToInt scanHexInt:&type];
int len = (bufferSecondByte<<8)+bufferSecondByte;
if (![mutableBuffer count]<(3+len)) {
    NSArray *payload = [mutableBuffer subarrayWithRange:NSMakeRange(2,([mutableBuffer count] - 2))];
    NSLog(@"length %d",len);
    [self processReceive:type length:len payload:payload];
}
It is somewhat modelled on this JavaScript code:
self.receive = function (itemName, data) {
    self.log("Receiving: " + self.toHex(data));
    self.ourData += data;
    while (self.ourData.length >= 3) {
        var type = self.ourData.charCodeAt(0);
        var len = (self.ourData.charCodeAt(1) << 8) + self.ourData.charCodeAt(2);
        if (self.ourData.length < (3 + len)) { // sanity check: buffer doesn't contain all the data advertised in the packet
            break;
        }
        var payload = self.ourData.substr(3, len);
        self.ourData = self.ourData.substr(3 + len);
        self.processMessage(type, len, payload); // process payload
    }
};
The reason for the modelling is that the CommandFusion JavaScript project talks to the same server that I am talking to (a Crestron controller).
However, I could never get the len calculation to work, and I think that's what's causing my problem. When looking at a sample packet (05:00:06:00:00:03:00:52:00), len would equal 1280 (see the math in the method above), even though the data portion is only 9 bytes.
Currently my code will work, but it misses certain data. This happens because of the streaming that TCP does (some packets are conjoined while others are fragmented). Without knowing the data segment size I cannot fix the issue, and I believe the answer to that is the len variable, but I don't see how to implement it properly.
My question comes down to this: how can I determine the size of the data segment from this len variable, or control my receive method so it only accepts one data segment at a time (which from my research is not possible, since TCP is a stream)?
I have a feeling there will be questions, so I'm going to attempt to answer a few of them here.
A. How do you come up with 1280? Look at the math in the method ((self.ourData.charCodeAt(1) << 8) + self.ourData.charCodeAt(2)): (5<<8)+0 = 1280 decimal.
B. Why are you using different indexes?
You will notice that the indexes for what data goes where (payload, len, type) differ. This is merely because they keep their payload/data bytes as a string and mine is an array; in the end it is the same data being referenced.
Use the following logic:
1) Do you have enough bytes to determine the length? If not, receive some more bytes and try again.
2) Do you have enough bytes to have one complete unit? If not, receive some more bytes and try again.
3) Extract one complete unit using the decoded length from step 1 and process it.
4) If we have no leftover bytes, we are done.
5) Return to step 1 to process the leftover bytes.
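A minimal sketch of that loop, assuming the framing from the question (1 type byte, then a 2-byte big-endian length, then the payload); the buffering details are illustrative only:

// Stub standing in for the question's processMessage handler.
function processMessage(type: number, len: number, payload: number[]): void {
    console.log('type', type, 'len', len, 'payload', payload);
}

let pending: number[] = []; // bytes accumulated from the stream so far

function onBytes(chunk: number[]): void {
    pending = pending.concat(chunk);

    while (pending.length >= 3) {                  // 1) enough bytes for the header?
        const type = pending[0];
        const len = (pending[1] << 8) + pending[2];

        if (pending.length < 3 + len) break;       // 2) wait for one complete unit

        const payload = pending.slice(3, 3 + len); // 3) extract one complete unit
        pending = pending.slice(3 + len);          // 4/5) keep the leftovers for the next pass

        processMessage(type, len, payload);
    }
}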
OK, so I got some help from (this group), which you might not be able to see without a login for the group. In any case, there is a 3-byte header. So my len, which is 6 and not 1280 like I thought, becomes 9 once the 3 bytes for the header are added, and that gets me the value I was looking for (9), since the data segment is 9 bytes.
Thanks for the suggestions, David; one up for some good basic knowledge.
I'm looking into converting a Flash application I have into JavaScript but was told that it probably wouldn't be possible due to the number of objects I would have to have on the go.
Is this true and, if it is, what are the limits?
JavaScript memory limit shows that you can allocate at least 20 MB of memory in Firefox.
There is definitely a limit, though I doubt you'll hit the limit on memory. It is more likely that your performance will be too poor if you are converting a very dynamic Flash application.
Flash is very efficient at moving objects around, since that's its primary function. Using JavaScript to move objects around in HTML is going to be way, way slower. Nevertheless, quite amazing things can be achieved with JavaScript.
See Lemmings.
An improved version of the script at link text. This is faster since it uses join, and it lets the browser have some time to update the page every now and then.
function allocate_mem() {
    var mega = [];
    // Strings are stored as UTF-16 = 2 bytes per character.
    // Below a 1 MiB string is created.
    for (var i = 0; i < 65536; i++) {
        mega.push('12345678');
    }
    mega = mega.join("");
    var x = document.getElementById("max_mem");
    var size = 0;
    var large = [];
    function allocate() {
        ++size;
        //if (size>400) {alert(large.join("").length/1048576); return; }
        large.push("." + mega.slice(0));
        x.innerHTML = "max memory = " + size + " MB";
        setTimeout(allocate, size % 10 ? 0 : 200);
    }
    allocate();
}
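To run this, the page needs an element with id max_mem for the counter to be written into; a minimal harness might look like this (my own wiring, not part of the original snippet):

// Create the counter element the script writes into, then start allocating.
const counter = document.createElement('div');
counter.id = 'max_mem';
document.body.appendChild(counter);

allocate_mem(); // keeps appending ~1 MB strings until the browser gives up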