Using a hex string as a Map key - best-performing way - javascript

I am getting MongoDB ObjectIds, which are 12 bytes long; their hex string looks like this: 62d34be4f8cd489f6d5b8e0e.
I want to use the object id as a map key like this:
new Map().set(ObjectId('62d34be4f8cd489f6d5b8e0e'),"value")
That doesn't really work because each ObjectId instance is a distinct object, so two ObjectIds with the same id are not equal as Map keys.
There are a few ways I can think of:
Use the hex string as map key:
new Map().set("62d34be4f8cd489f6d5b8e0e", "value")
Use a buffer / Uint8Array, together with some special Map implementation that compares arrays by content:
new Map().set(Buffer.from("62d34be4f8cd489f6d5b8e0e", "hex"), "value")
Turn the buffer into a 12-character binary string (using the 'latin1' encoding, since the default UTF-8 toString() replaces invalid bytes and would lose information), which looks like bÓKäøÍH\x9Fm[\x8E\x0E:
new Map().set(Buffer.from("62d34be4f8cd489f6d5b8e0e", "hex").toString("latin1"), "value")
Turn the buffer into a number (readInt16BE only reads the first 2 bytes of the 12-byte buffer, and a JavaScript number can't hold 12 bytes exactly anyway, so this seems a little weird):
new Map().set(Buffer.from("62d34be4f8cd489f6d5b8e0e", 'hex').readInt16BE(), "value")
Which method has the best performance?
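One way to settle this is to measure it directly. Below is a rough micro-benchmark sketch (Node.js assumed; crypto.randomBytes stands in for real ObjectId bytes, and the numbers will vary by machine and V8 version), comparing the hex-string key with the latin1 binary-string key:
const { randomBytes } = require('crypto');

// 100,000 fake 12-byte ids standing in for ObjectIds
const ids = Array.from({ length: 100000 }, () => randomBytes(12));

function bench(label, toKey) {
  const map = new Map();
  const start = process.hrtime.bigint();
  for (const id of ids) map.set(toKey(id), 'value');
  for (const id of ids) map.get(toKey(id));
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(1)} ms`);
}

bench('hex string key', (buf) => buf.toString('hex'));
bench('latin1 string key', (buf) => buf.toString('latin1'));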

Related

Nodejs Uint8Array,Uint16Array, Uint32Array [duplicate]

I'm trying to figure out how to work with a buffer's contents represented as integers.
Simple code
var buffer = Buffer.alloc(4, 'a');
var interface16 = new Uint16Array(buffer);
var interface8 = new Uint8Array(buffer);
console.log(interface16);
console.log(interface8);
Why does the log output the same arrays (same values and same length)? After all, I'm asking for representations with 1 and 2 bytes per integer. The manuals say that Uint8Array takes 1 byte from the buffer per element, and I think of the buffer as a sequence of bits, in this case 32 (4 * 8). So I can accept that it becomes an array with 4 elements.
But Uint16Array takes 2 bytes per integer, so I would expect different numbers and an array of size 2.
What am I getting wrong, and what kind of buffer should I pass to these constructors to actually see the difference?
I suspect that the returned array always has as many elements as the buffer has bytes, or that this is just how the console formats the output, but I don't have enough knowledge to understand why.
Thank you for your attention.
P.S. If you can also recommend some approachable literature on exactly this topic, that would be great.
In your example, both typed arrays end up having 4 elements, but the Uint8Array only uses 4 bytes to represent its contents, while the Uint16Array uses 8. You can see this by inspecting the byteLength property:
var buffer = Buffer.alloc(4, 'a');
var interface16 = new Uint16Array(buffer);
var interface8 = new Uint8Array(buffer);
console.log('Uint16 length: %d', interface16.length);
console.log('Uint16 byteLength: %d', interface16.byteLength);
console.log('Uint8 length: %d', interface8.length);
console.log('Uint8 byteLength: %d', interface8.byteLength);
Output
Uint16 length: 4
Uint16 byteLength: 8
Uint8 length: 4
Uint8 byteLength: 4
So the typed array constructors are creating typed arrays with the same number of elements as the source buffer, but the Uint16Array constructor is using two bytes instead of one for each element, filling the high-order byte of each element with zero.
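To actually get a 2-element Uint16Array over the same 4 bytes, you can pass the Buffer's underlying ArrayBuffer instead of the Buffer itself. A short sketch: the Buffer argument is treated as an array-like and copied element by element, while the ArrayBuffer argument is reinterpreted as raw bytes.
var buffer = Buffer.alloc(4, 'a'); // <Buffer 61 61 61 61>

// Passing the Buffer copies it element by element (4 elements).
var copied = new Uint16Array(buffer);
// Passing the underlying ArrayBuffer reinterprets the 4 bytes as 2 uint16 values.
var viewed = new Uint16Array(buffer.buffer, buffer.byteOffset, 2);

console.log(copied); // Uint16Array(4) [ 97, 97, 97, 97 ]
console.log(viewed); // Uint16Array(2) [ 24929, 24929 ] (each element is 0x6161)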
First you create a buffer with four 'a' characters inside.
Then you call the Uint16Array constructor with that buffer.
The documentation says: new Uint16Array(object);
When called with an object argument, a new typed array is created as if by the TypedArray.from() method.
See more: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray/from
So it creates an array of four uint16 elements, using each byte of the buffer as one element:
a = 0x61 = 97
You then get four 97s.
Here is a Stack Overflow question with a successful buffer-to-Uint16Array conversion:
convert nodejs buffer to Uint16Array

convert Uint8Array into Uint16Array in NodeJs

I have a Uint8Array that I would like to convert into a Uint16Array, so that each consecutive pair of elements in the Uint8Array becomes one Uint16 value. The resulting Uint16Array is therefore half the length of the Uint8Array. I want to reinterpret the whole array, not convert each value.
The only solution I found keeps the same length and changes the underlying type of each element of the array. So it does not 'cast' the array but each element of the array, which is not what I want.
If I understood the question correctly, the most native way that comes to my mind (which still isn't a cast, but only takes a few lines) is converting to a Buffer and then creating a Uint16Array over its memory.
The following snippet seems to achieve the desired result:
const ar8 = new Uint8Array([0, 1, 1, 0, 0, 2, 2, 0]);
const buf = Buffer.from(ar8); // new Buffer() is deprecated
const ar16 = new Uint16Array(buf.buffer, buf.byteOffset, buf.byteLength / Uint16Array.BYTES_PER_ELEMENT);
console.log(ar8, buf, ar16);
On my platform the conversion is little-endian; typed array views use the platform's native byte order, so on a big-endian platform the pairs would combine the other way around.
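If you need a specific byte order regardless of platform, a DataView lets you state it explicitly. A small sketch reading each pair of bytes as a big-endian 16-bit value:
const bytes = new Uint8Array([0, 1, 1, 0, 0, 2, 2, 0]);
const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);

const bigEndian = [];
for (let i = 0; i < view.byteLength; i += 2) {
  // second argument: false = big-endian, true = little-endian
  bigEndian.push(view.getUint16(i, false));
}
console.log(bigEndian); // [ 1, 256, 2, 512 ]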
You can instantiate a Uint16Array from the Uint8Array's underlying buffer:
const uint8Array = new Uint8Array([1, 164, 5, 57])
const uint16Array = new Uint16Array(uint8Array.buffer)
// Uint16Array(2) [ 41985, 14597 ]
Your current solution should work fine. The "underlying type" of each element in a Uint8Array is number, just like in a Uint16Array. Therefore, casting the whole array should preserve the types of the values.

How do you decide which typed array to use?

I am trying to create a view of an ArrayBuffer object in order to JSONify it.
var data = { data: new Uint8Array(arrayBuffer) }
var json = JSON.stringify(data)
It seems that the size of the ArrayBuffer does not matter, even with the smallest view (Uint8Array); I have not gotten any RangeError so far :) If so, how do I decide which typed array to use?
You decide based on the data stored in the buffer, or, better said, based on your interpretation of that data.
Also, a Uint8Array is not an 8-bit array; it's an array of unsigned 8-bit integers, and it can have any length. A Uint8Array created over the same ArrayBuffer as a Uint16Array is going to be twice as long, because every byte in the ArrayBuffer becomes one element of the Uint8Array, while for the Uint16Array each pair of bytes becomes one element.
A good way to see what happens is to think in binary. Try running this:
var buffer = new ArrayBuffer(2);
var uint8View = new Uint8Array(buffer);
var uint16View = new Uint16Array(buffer);
uint8View[0] = 2;
uint8View[1] = 1;
console.log(uint8View[0].toString(2));
console.log(uint8View[1].toString(2));
console.log(uint16View[0].toString(2));
The output is going to be
10
1
100000010
because, displayed as unsigned 8-bit integers in binary, 2 is 00000010 and 1 is 00000001 (toString(2) strips leading zeroes).
A Uint8Array represents an array of bytes; as said above, each element is an unsigned 8-bit integer, and we just wrote two bytes into it.
In memory those two bytes are stored side by side as 00000010 00000001 (byte 0 first, binary form again used to make things clearer).
Now when you initialize a Uint16Array over the same buffer, it contains the same bytes, but because an element is an unsigned 16-bit integer (two bytes), accessing uint16View[0] takes the first two bytes together. On a little-endian platform the second byte is the high-order byte, so the value is 0000000100000010, which is 100000010 with the leading zeroes stripped.
Interpreted as a decimal integer, 0000000100000010 in binary is 258.
Neither Uint8Array nor Uint16Array store any data themselves. They are simply different ways of accessing bytes in an ArrayBuffer.
How does one choose which one to use? It's not based on preference but on the underlying data. An ArrayBuffer is typically used when you receive binary data from some external source (a web socket, maybe) and you already know what the data represents. It might be a list of unsigned 8-bit integers, or one of signed 16-bit ones, or even a mixed layout where you know the first element is an 8-bit integer and the next one is a 16-bit one; then you can use a DataView to read typed items from it.
If you don't know what the data represents you can't choose what to use.
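As a small sketch of that last case, assuming a hypothetical layout of one unsigned 8-bit field followed by one unsigned 16-bit field, a DataView can read each field at its own offset and width:
const buffer = new ArrayBuffer(3);
const writer = new DataView(buffer);
writer.setUint8(0, 42);          // first field: unsigned 8-bit
writer.setUint16(1, 1000, true); // second field: unsigned 16-bit, little-endian

const reader = new DataView(buffer);
console.log(reader.getUint8(0));        // 42
console.log(reader.getUint16(1, true)); // 1000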

Convert a byte for char A

In javascript I would like to convert this string: '0x7f, 0x88, 0x88, 0x88, 0x88, 0x7f'
to an object that look like this:
['B00000000','B01111111','B10001000','B10001000','B10001000','B10001000','B01111111','B00000000']
Both display the char A on an 8*8 LED matrix. How can I do this in JavaScript?
I won't write your code, as you should really figure it out for yourself, but I will give you the (possible) process:
1. Split the string into an array using the .split function.
2. Loop through the array, convert each hex value to an int, and store the results in a new array.
3. Loop through the new array and convert each int to the required B01111111 format. You can do this by starting with 128 and shifting right (or halving) on each iteration: if the int is at least as large as the current value, you get a 1 and subtract that value from the int; otherwise you get a 0. Append these 1s and 0s to a string and you have the binary representation. Add these strings into a new array, which will be your final result (sketched below).
You can create an array with var arrayName = []; and then add to it with arrayName.push(someString);
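A minimal sketch of these steps (it uses toString(2) with padding instead of the manual halving loop, and assumes the leading and trailing B00000000 rows in your desired output are fixed padding around the six data bytes):
var input = '0x7f, 0x88, 0x88, 0x88, 0x88, 0x7f';

var rows = input
  .split(',')                                                          // ['0x7f', ' 0x88', ...]
  .map(function (s) { return parseInt(s.trim(), 16); })                // hex string -> int
  .map(function (n) { return 'B' + n.toString(2).padStart(8, '0'); }); // int -> 'B01111111'

var result = ['B00000000'].concat(rows, ['B00000000']);
console.log(result);
// ['B00000000','B01111111','B10001000','B10001000','B10001000','B10001000','B01111111','B00000000']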
Good luck!

How to create an Int32Array view that starts on an arbitrary byte of the buffer

Given some ArrayBuffer like:
var data = new ArrayBuffer(64);
I want to be able to write 32-bit integers at any byte position (not only at 32-bit-aligned offsets).
For example:
[0][1][2][3][4][5][6][7][8][9] ... byte data
[__________][__________] ... I want to create an Int32Array with 1 byte offset
Is that possible?
Unfortunately, the byteOffset attribute is read-only and if it's set when creating the view, it only accepts multiples of 4 (for int32s).
If you need to read or write values of various types at arbitrary offsets, a DataView is handier: it has no alignment requirements.
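For example, a sketch of writing and then reading a 32-bit integer at byte offset 1 of the buffer above:
var data = new ArrayBuffer(64);
var view = new DataView(data);

view.setInt32(1, 123456789, true);   // write at byte offset 1, little-endian
console.log(view.getInt32(1, true)); // 123456789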
