Binary string of length 60 - most compact way to store - javascript

I have binary strings of length 60 representing yes/no states for the minutes of an hour, and I would like to write them to a file in Java. My three objectives are that this should be:
1. compact (better than saving the raw character string)
2. easy to rebuild the binary strings when reading the file from JavaScript
3. free of third-party libraries
My first thought was to convert the string to a Long (8 bytes) and save it as such, but it seems complicated to get my binary string back when reading the file in JavaScript, due to JavaScript's floating-point number format and mantissa length. What is a good way to do this?

JavaScript can handle integers correctly up to 2^53 - 1, so you can use standard methods if you split the 60-bit data in two, and store it as two 32-bit integers.
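For illustration, here is a minimal sketch of that split-and-rejoin round trip (the helper names are mine, and the split mirrors the 32 + 28 layout of the hex example below):
function splitBits(binaryStr60) {
  // both halves fit well within JavaScript's safe integer range
  var high = parseInt(binaryStr60.substr(0, 32), 2); // first 32 bits
  var low = parseInt(binaryStr60.substr(32), 2);     // remaining 28 bits
  return [high, low];
}
function joinBits(high, low) {
  // pad each half back to its original width before concatenating
  return high.toString(2).padStart(32, "0") + low.toString(2).padStart(28, "0");
}
var original = "1010".repeat(15); // 60 characters
var parts = splitBits(original);
console.log(joinBits(parts[0], parts[1]) === original); // true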
Alternatively, you could store the data e.g. as a 15-character hexadecimal string, and recode it into a binary string with something like this:
function hex2bin(s) {
  return ("0000000000000000000000000000000" + parseInt(s.substr(0, 8), 16).toString(2)).substr(-32)
       + ("000000000000000000000000000" + parseInt(s.substr(8, 7), 16).toString(2)).substr(-28);
}
document.write(hex2bin("123456789ABCDEF"));
Or you could use a base-64 string to reduce the data size to 10 characters, and decode it with something like this:
function base642bin(s) {
  var b = "", e = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
  for (var i = 0; i < 10; i++) b += ("00000" + e.indexOf(s.charAt(i)).toString(2)).substr(-6);
  return b;
}
document.write(base642bin("EjRWeJq83v"));
If you use a built-in Java function for base-64 encoding, check which encoding alphabet it uses (in some variants, other characters are substituted for + and /).
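If the writing side were also JavaScript, or just to sanity-check the round trip, the encoding direction could look like this (bin2base64 is my own helper name; it reuses the alphabet of base642bin above):
function bin2base64(b) {
  var s = "", e = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
  // consume the 60-bit string six bits at a time, emitting one character per chunk
  for (var i = 0; i < 60; i += 6) s += e.charAt(parseInt(b.substr(i, 6), 2));
  return s;
}
document.write(bin2base64(base642bin("EjRWeJq83v"))); // writes "EjRWeJq83v" again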

Long binary strings can be converted to BigInt and back to binary.
const big = BigInt('0b' + a);
// store, perform math, etc.
const binaryStr = big.toString(2);
This supports values greater than 2^53 - 1 (Number.MAX_SAFE_INTEGER). E.g. here is a 95-character binary string being converted:
BigInt('0b' + '10100000100100110110010000010101111011011001101110111111111101000000101111001110001111100001101')
24847893154024981730169397005n
The prefix 0b tells the constructor that this is a binary representation.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt
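One detail worth noting for the 60-minute use case: toString(2) drops leading zeros, so pad the result back to the known width when rebuilding the original string. A small sketch:
const a = "000011" + "1".repeat(54); // 60-bit example that starts with zeros
const big = BigInt("0b" + a);
const restored = big.toString(2).padStart(60, "0");
console.log(restored === a); // true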

Related

JavaScript BigInt print unsigned binary representation

How do you print an unsigned integer when using JavaScript's BigInt?
BigInts can be printed in binary representation using toString(2). However, for negative values this function just prepends a - sign to the output.
BigInt(42).toString(2)
// output => 101010
BigInt(-42).toString(2)
// output => -101010
How do I print the unsigned representation of BigInt(-42)? I know that with regular numbers you can do (-42 >>> 0).toString(2), but the unsigned right shift does not seem to be implemented for BigInt, resulting in an error:
(BigInt(-42) >>> BigInt(0)).toString(2)
// TypeError: BigInts have no unsigned right shift, use >> instead
An easy way to get the two's complement representation for negative BigInts is to use BigInt.asUintN(bit_width, bigint):
> BigInt.asUintN(64, -42n).toString(2)
'1111111111111111111111111111111111111111111111111111111111010110'
Note that:
You have to define the number of bits you want (64 in my example); there is no "natural"/automatic value for that.
Given only that string of binary digits, there is no way to tell whether it is meant to be a positive BigInt (with a value close to 2n**64n) or the two's complement representation of -42n. So if you want to reverse the conversion later, you'll have to provide this information somehow (e.g. by writing your code such that it implicitly assumes one or the other option); a sketch of the reverse conversion follows after these notes.
Relatedly, this is not how -42n is stored internally in current browsers. (But that doesn't need to worry you, since you can create this output whenever you want/need to.)
You could achieve the same result with a subtraction: ((2n ** 64n) - 42n).toString(2) -- again, you can specify how many bits you'd like to see.
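As mentioned in the second note above, reversing the conversion requires knowing the width; given that, BigInt.asIntN is the counterpart that recovers the signed value. A minimal sketch:
const bits = BigInt.asUintN(64, -42n).toString(2); // 64 binary digits
const signed = BigInt.asIntN(64, BigInt("0b" + bits));
console.log(signed); // -42n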
Is there something like bitAtIndex for BigInt?
No, because there is no specification for how BigInts are represented. Engines can choose to use bits in any way they want, as long as the resulting BigInts behave as the specification demands.
@Kyroath:
"negative BigInts are represented as infinite-length two's complement"
No, they are not: the implementations in current browsers represent BigInts as "sign + magnitude", not as two's complement. However, this is an unobservable implementation detail: implementations could change how they store BigInts internally, and BigInts would behave just the same.
What you probably meant to say is that the two's complement representation of any negative integer (big or not) is conceptually an infinite stream of 1-bits, so printing or storing that in finite space always requires defining a number of characters/bits after which the stream is simply cut off. When you have a fixed-width type, that obviously defines this cutoff point; for conceptually-unlimited BigInts, you have to define it yourself.
Here's a way to convert 64-bit BigInts into binary strings:
// take two's complement of a binary string
const twosComplement = (binaryString) => {
  let complement = BigInt('0b' + binaryString.split('').map(e => e === "0" ? "1" : "0").join(''));
  return decToBinary(complement + BigInt(1));
}

const decToBinary = (num) => {
  let result = "";
  const isNegative = num < 0;
  if (isNegative) num = -num;
  while (num > 0) {
    result = (num % BigInt(2)) + result;
    num /= BigInt(2);
  }
  if (result.length > 64) result = result.substring(result.length - 64);
  result = result.padStart(64, "0");
  if (isNegative) result = twosComplement(result);
  return result;
}
console.log(decToBinary(BigInt(5))); // 0000000000000000000000000000000000000000000000000000000000000101
console.log(decToBinary(BigInt(-5))); // 1111111111111111111111111111111111111111111111111111111111111011
This code doesn't do any validation, however.
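For comparison, here is a shorter route to the same 64-bit strings, assuming the built-in BigInt.asUintN is acceptable for your use case:
const to64BitBinary = (num) => BigInt.asUintN(64, num).toString(2).padStart(64, "0");
console.log(to64BitBinary(BigInt(5)));  // same output as decToBinary(BigInt(5)) above
console.log(to64BitBinary(BigInt(-5))); // same output as decToBinary(BigInt(-5)) above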

Conversion issue for a long string of integers in JavaScript

I'm trying to convert a long string which has only integers to numbers.
var strOne = '123456789123456789122';
parseInt(strOne, 10);
// => 123456789123456800000
var strTwo = '1234567891234567891232';
parseInt(strTwo, 10);
// => 1.234567891234568e+21
The expected output should match strOne and strTwo, but that isn't happening here: converting the string to a number changes the value.
What's the best way to fix this issue?
BigInt is now available in browsers.
BigInt is a built-in object that provides a way to represent whole
numbers larger than 2^53 - 1, which is the largest number JavaScript can
reliably represent with the Number primitive.
value: The numeric value of the object being created. May be a string or an integer.
var strOne = '123456789123456789122';
var intOne = BigInt(strOne);
var strTwo = '1234567891234567891232';
var intTwo = BigInt(strTwo);
console.log(intOne, intTwo);
Your number is unfortunately too large, so precision is lost when the conversion is done.
The largest integer you can safely express in JavaScript is 2^53 - 1; it is given by Number.MAX_SAFE_INTEGER, see the MDN doc here.
The reasoning behind that number is that JavaScript uses double-precision floating-point format numbers as specified in IEEE 754 and can only safely represent numbers between -(2^53 - 1) and 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);
If you want to work with numbers bigger than this limit, you'll have to use a different representation than Number such as String and use a library to handle operations (see the BigInteger library for example).
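As an aside (my own suggestion, not part of the original answer), Number.isSafeInteger gives a quick runtime check for whether a parsed value is still exact:
console.log(Number.isSafeInteger(parseInt('123456789', 10)));             // true
console.log(Number.isSafeInteger(parseInt('123456789123456789122', 10))); // false: precision was already lost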

javascript hex codes with TCP/IP communication

I’m using the Node module ‘net’ to create a client application that sends data through a TCP socket. The server-side application accepts this message if it starts and ends with a correct hex code; just for example, the data packet would start with hex “0F” and end with hex “0F1C”. How would I create these hex codes with JavaScript? Here's some JavaScript I've used to convert a UTF-8 string into hex, but I'm not sure this is what I'm looking for, as I don't have much experience with TCP/IP socket connections. Does anyone have experience with TCP/IP transfers and/or JavaScript hex codes?
function toHex(str, hex) {
  try {
    hex = unescape(encodeURIComponent(str))
      .split('').map(function (v) {
        return v.charCodeAt(0).toString(16)
      }).join('')
  } catch (e) {
    hex = str
    console.log('invalid text input: ' + str)
  }
  return hex
}
First of all, you do not need to convert your data string into hex values in order to send it over TCP. Every string in node.js is converted to bytes when sent over the network.
Normally, you'd send over a string like so:
var data = "ABC";
socket.write(data); // will send bytes 65 66 67, or in hex: 41 42 43
Node.JS also allows you to pass Buffer objects to functions like .write().
So probably the easiest way to achieve what you wish is to create an appropriate buffer to hold your data.
var data = "ABC";
var prefix = 0x0F; // JavaScript allows hex numbers.
var suffix = 0x0F1C;
var dataSize = Buffer.byteLength(data);
// compute the required buffer length
var bufferSize = 1 + dataSize + 2;
var buffer = Buffer.alloc(bufferSize); // zero-filled allocation (new Buffer(size) is deprecated)
// store first byte on index 0;
buffer.writeUInt8(prefix, 0);
// store string starting at index 1;
buffer.write(data, 1, dataSize);
// stores last two bytes, in big endian format for TCP/IP.
buffer.writeUInt16BE(suffix, bufferSize - 2);
socket.write(buffer);
Explanation:
The prefix hex value 0F requires 1 byte of space. The suffix hex value 0F1C actually requires two bytes (a 16-bit integer).
When computing the number of bytes required for a string (JavaScript strings are UTF-16 encoded!), str.length is not accurate most of the time, especially when the string has non-ASCII characters in it. For this, the proper way of getting the byte size of a string is to use Buffer.byteLength().
Buffers in node.js have static allocations, meaning you can't resize them after you created them. Hence, you'll need to compute the size of the buffer -in bytes- before creating it. Looking at our data, that is 1 (for our prefix) + Buffer.byteLength(data) (for our data) + 2 (for our suffix).
After that (imagine buffers as arrays of bytes, i.e. 8-bit values), we'll populate the buffer like so:
1. Write the first byte (the prefix) using writeUInt8(byte, offset), with offset 0 in our buffer.
2. Write the data string using .write(string[, offset[, length]][, encoding]), starting at offset 1 in our buffer, with length dataSize.
3. Write the last two bytes using .writeUInt16BE(value, offset), with offset bufferSize - 2. We're using writeUInt16BE to write the 16-bit value in big-endian byte order, which is the network byte order used by TCP/IP.
Once we've filled our buffer with the correct data, we can send it over the network, using socket.write(buffer);
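For completeness, here is a hypothetical receive-side counterpart (not from the original answer; parseFrame is an illustrative name) that checks the framing bytes from the question and recovers the payload:
function parseFrame(buffer) {
  var prefix = buffer.readUInt8(0);
  var suffix = buffer.readUInt16BE(buffer.length - 2);
  if (prefix !== 0x0F || suffix !== 0x0F1C) throw new Error('unexpected framing bytes');
  // everything between the prefix byte and the two suffix bytes is the payload
  return buffer.toString('utf8', 1, buffer.length - 2);
}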
Additional tip:
If you really want to convert a large string to bytes (e.g. to later print it as hex), then Buffer is also great:
var buf = Buffer.from('a very large string');
// now you have a byte representation of the string.
Since bytes are all 0-255 decimal values, you can easily print them as hex values in console, like so:
for (let i = 0; i < buf.length; i++) {
const byte = buf[i];
const hexChar = byte.toString(16); // convert the decimal `byte` to hex string;
// do something with hexChar, e.g. console.log(hexChar);
}
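If all you need is the full hex dump rather than per-byte processing, Buffer can also produce it in one call:
var buf = Buffer.from('a very large string');
console.log(buf.toString('hex')); // 612076657279206c6172676520737472696e67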

Dealing With Binary / Bitshifts in JavaScript

I am trying to perform some bitshift operations and dealing with binary numbers in JavaScript.
Here's what I'm trying to do. A user inputs a value and I do the following with it:
// Square Input and mod with 65536 to keep it below that value
var squaredInput = (inputVal * inputVal) % 65536;
// Figure out how many bits is the squared input number
var bits = Math.floor(Math.log(squaredInput) / Math.log(2)) + 1;
// Convert that number to a 16-bit number using bitshift.
var squaredShifted = squaredInput >>> (16 - bits);
As long as the number is larger than 46, it works. Once it is less than 46, it does not work.
I know the problem is in the bitshift. Now coming from a C background, I know this would be done differently, since all numbers would be stored in 32-bit format (given it is an int). Does JavaScript do the same (since its vars are not typed)?
If so, is it possible to store a 16-bit number? If not, can I treat it as 32-bits and do the required calculations to assume it is 16-bits?
Note: I am trying to extract the middle 4-bits of the 16-bit value in squaredInput.
Another note: When printing out the var, it just prints out the value without the padding so I couldn't figure it out. Tried using parseInt and toString.
Thanks
Are you looking for this?
function get16bitnumber(inputVal) {
  return ("0000000000000000" + (inputVal * inputVal).toString(2)).substr(-16);
}
This function returns the last 16 bits of the (inputVal * inputVal) value. By having the binary string, you can work with any range of bits.
Don't use bitshifting in JS if you don't absolutely have to. The specs mention at least four number formats
IEEE 754
Int32
UInt32
UInt16
It's really confusing to know which is used when.
For example, ~ applies a bitwise inversion while converting to Int32. UInt16 seems to be used only in String.fromCharCode. Using bitshift operators converts the operands to either UInt32 or to Int32.
In your case, the right shift operator >>> forces conversion to UInt32.
When you type
a >>> b
this is what you get:
ToUInt32(a) >>> (ToUInt32(b) & 0x1f)
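To get at the middle 4 bits specifically (assuming "middle" means bits 6 through 9 of the 16-bit value, counted from the least significant bit), plain masking avoids most of the coercion surprises; a small sketch:
function middleFourBits(inputVal) {
  var squaredInput = (inputVal * inputVal) % 65536; // keep it within 16 bits
  return (squaredInput >> 6) & 0xF;                 // drop the low 6 bits, keep the next 4
}
console.log(middleFourBits(46).toString(2).padStart(4, "0")); // "0001"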

JavaScript - Convert Long Number to String

"" + 237498237498273908472390847239084710298374901823749081237409273492374098273904872398471298374
> '2.3749823749827392e+92'
I calculate IDs in a beautiful and arcane way:
time = new Date().getTime()
pid = process.pid
host = 0; (host +=s.charCodeAt(0) for s in os.hostname())
counter = MIPS.unique_id()
"#{host}#{pid}#{time}#{counter}"
Unfortunately, somewhere along the way the IDs (for example 11207648813339434616800) get treated as numbers, which means they sometimes turn into 1.1207648813339434e+22.
UPDATE:
It seems to be a "bug/feature" of Redis; never expected that.
# Bug with Big Numbers on zadd
redis = require 'redis'
r = redis.createClient()
r.zadd 'zset', '342490809809999998098', 'key', ->
  r.zscore 'zset', 'key', (_, results) ->
    console.log typeof results # string
    console.log results        # 3.4249080981000002e+20
JavaScript uses 8-byte doubles to store numbers, which gives 53 bits of precision. In your case, the value is far beyond 53 bits, so you should use a big-number library, which can store big numbers precisely. Try javascript-bignum.
Your number is already stored as the floating-point approximation 2.3749823749827392e+92 before you concatenate it with the string to convert it.
The only solution is to use a container format that accepts an arbitrary number of digits, which is either a string or an array.
Can you provide us with a few more details as to how you are obtaining this number?
