How to handle a number with more than 20 digits (big integer)? - javascript

In my Angular program, I need to pass a number with more than 20 digits in an API request.
num: any;
this.num = 2019111122001424290521878689;
console.log(this.num); // It displays "2.0191111220014244e+27"
I tried converting the number to a string as below:
console.log(this.num.toString()); // It displays "2.0191111220014244e+27"
My expectation is to pass the original big integer in the API request, but if I pass it like this, it goes through as "2.0191111220014244e+27".
BTW, I tried BigInt(this.num), which gives a different number.
Please suggest a solution.

In JavaScript, big integer literals have the letter n as a suffix:
var bigNum = 2019111122001424290521878689n;
console.log(bigNum);
For more information, see
MDN JavaScript Reference - BigInt
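One caveat when building the request: JSON.stringify() does not serialize BigInt values (it throws a TypeError), so the usual approach is to send the digits as a string. A minimal sketch, where the payload field name id is just an example:
const id = 2019111122001424290521878689n;
const payload = JSON.stringify({ id: id.toString() }); // send the exact digits as a string
console.log(payload); // {"id":"2019111122001424290521878689"}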

If you got a large number (greater than Number.MAX_SAFE_INTEGER) from an API, in JSON format, and you want to get the exact value as a string, you unfortunately can't use JSON.parse(), as it will use the Number type and lose precision.
There are alternative JSON parsers out there like LosslessJSON that might solve your problem.
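To see why, compare what JSON.parse() does with a large JSON number against parsing the digits from a string yourself (a small demonstration, independent of any particular parser library):
const raw = '{"num": 2019111122001424290521878689}';
console.log(JSON.parse(raw).num);                    // 2.0191111220014244e+27, precision already lost
console.log(BigInt("2019111122001424290521878689")); // 2019111122001424290521878689n, exact when built from a string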

You can use BigInt.
BigInt is a built-in object that provides a way to represent whole numbers larger than 2^53 - 1, which is the largest number JavaScript can reliably represent with the Number primitive. BigInt can be used for arbitrarily large integers.
const theBiggestInt = 9007199254740991n;
const alsoHuge = BigInt(9007199254740991);
// ↪ 9007199254740991n
const hugeString = BigInt("9007199254740991");
// ↪ 9007199254740991n
const hugeHex = BigInt("0x1fffffffffffff");
// ↪ 9007199254740991n
const hugeBin = BigInt("0b11111111111111111111111111111111111111111111111111111");
// ↪ 9007199254740991n
BigInt is similar to Number in some ways, but also differs in a few key matters — it cannot be used with methods in the built-in Math object and cannot be mixed with instances of Number in operations; they must be coerced to the same type. Be careful coercing values back and forth, however, as the precision of a BigInt may be lost when it is coerced to a Number.
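For example (a quick illustration of the restrictions described above):
try {
  console.log(1n + 2);                    // throws: cannot mix BigInt and Number operands
} catch (e) {
  console.log(e.message);                 // exact wording varies by engine
}
console.log(1n + BigInt(2));              // 3n, once both operands are BigInt
console.log(Number(9007199254740993n));   // 9007199254740992, precision lost above 2^53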
Refer to
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt

The problem is that the number you have there is not a safe integer. JavaScript can only store integers exactly up to the value given by Number.MAX_SAFE_INTEGER, which is 9007199254740991 (this is defined by the language, not by the browser).
The number you have is actually stored as a floating-point value, and converting between floating point and integer will lose some precision.
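You can check this directly in the console:
console.log(Number.MAX_SAFE_INTEGER);                             // 9007199254740991
console.log(Number.isSafeInteger(2019111122001424290521878689));  // false: the literal has already been rounded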

Related

Number.toString returns "incorrect" hexadecimal value

In order to utilize a third party API, I have to convert the ID number to a hexadecimal. I converted the ID with https://www.rapidtables.com/convert/number/decimal-to-hex.html and got 711DD21A11FA9223FEB43849FF1F3569DC024DCE000000000000150000000001. This works when I use it with the API.
My understanding is that you can perform the same conversion with JS with Number().toString(16). However, when I use that function I get 711dd21a11fa9400000000000000000000000000000000000000000000000000.
The latter value does not work with the API. Any insight into why the JS function returns a different value?
Your number is too big for JavaScript.
The Number.MAX_SAFE_INTEGER constant represents the maximum safe integer in JavaScript.
I advise you to use the BigInt data type.
BigInt is a primitive wrapper object used to represent and manipulate primitive bigint values - which are too large to be represented by the number primitive.
Example:
// Your number in decimal
const decimalNumber = BigInt("10000000000000000");
// Your number in hex
const hexNumber = decimalNumber.toString(16);
console.log(`Decimal number: ${decimalNumber}`);
console.log(`Hex number: ${hexNumber}`);
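If you want to check the value from the question itself, BigInt also parses hex strings with a 0x prefix, so you can verify the round trip (a quick sketch using the hex value quoted above):
const fromHex = BigInt("0x711DD21A11FA9223FEB43849FF1F3569DC024DCE000000000000150000000001");
console.log(fromHex.toString(16).toUpperCase()); // 711DD21A11FA9223FEB43849FF1F3569DC024DCE000000000000150000000001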

parseFloat stripping last digits and converting to zeros

I have a scenario where I need to parseFloat a 19-digit string into a number.
e.g. parseFloat("1000000000100000043") gives me 1000000000100000000
but the expected output required is 1000000000100000043
This is likely a precision overflow error.
The Number data type (like int and float in other languages) has a finite number of bits available to represent a number, typically enough for around 15-16 significant decimal digits.
When the length of the original number in the string exceeds the available precision, that number can no longer be represented exactly by the target data type.
In this case parseFloat fails silently. If you want to catch this situation, you need to add code to check the incoming data, or use another function, possibly a custom one.
Alternatively, you can convert the numeric value back to a string and compare it with the original to detect a discrepancy.
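A minimal sketch of that round-trip check (the function name is just an illustration; it also rejects strings with leading zeros or other formatting that does not survive the round trip):
function toNumberExact(str) {
  const n = Number(str);
  if (String(n) !== str) {
    throw new RangeError(`"${str}" cannot be represented exactly as a Number`);
  }
  return n;
}
console.log(toNumberExact("123456"));              // 123456
console.log(toNumberExact("1000000000100000043")); // throws RangeError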
See also a question regarding double.Parse
You are running into how JavaScript numbers are stored. See, e.g., here: https://www.w3schools.com/js/js_numbers.asp
You can use a library like decimal.js to work with large, exact numbers. These libraries store the number as a string but still let you do mathematical operations on it.
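For example, a rough sketch with decimal.js (assuming its documented Decimal constructor, plus method, and toString; adjust the import to your build setup):
// npm install decimal.js
import Decimal from "decimal.js";

const exact = new Decimal("1000000000100000043"); // constructed from a string, so no precision is lost
console.log(exact.toString());          // "1000000000100000043"
console.log(exact.plus(1).toString());  // "1000000000100000044"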

Using a float in Javascript in a hash function

I have a hash function like this.
class Hash {
  static rotate (x, b) {
    return (x << b) ^ (x >> (32 - b));
  }
  static pcg (a) {
    let b = a;
    for (let i = 0; i < 3; i++) {
      a = Hash.rotate((a ^ 0xcafebabe) + (b ^ 0xfaceb00c), 23);
      b = Hash.rotate((a ^ 0xdeadbeef) + (b ^ 0x8badf00d), 5);
    }
    return a ^ b;
  }
}
// source Adam Smith: https://groups.google.com/forum/#!msg/proceduralcontent/AuvxuA1xqmE/T8t88r2rfUcJ
I use it like this.
console.log(Hash.pcg(116)); // Output: -191955715
As long as I send an integer in, I get an integer out. Now here comes the problem: if the input is a floating-point number, rounding happens, so Hash.pcg(1.1) and Hash.pcg(1.2) will yield the same result. I want different inputs to yield different results. A possible solution could be to multiply the input so the decimal part is not rounded away, but is there a more elegant and flexible solution?
Is there a way to convert a floating-point number to a unique integer, so that each floating-point number results in a different integer?
Performance is important.
This isn't quite an answer, but I was running out of room to make it a comment. :)
You'll hit a problem with integers outside of the 32-bit range as well as with non-integer values.
JavaScript handles all numbers as 64-bit floating point. This gives you exact integers over the range -9007199254740991 to 9007199254740991 (±(2^53 - 1)), but the bit-wise operators used in your hash algorithm (^, <<, >>) only work in a 32-bit range.
Since there are far more non-integer numbers possible than integers, no one-to-one mapping is possible with ordinary numbers. You could work something out with BigInts, but that will likely lead to comparatively much slower performance.
If you're willing to deal with the performance hit, you can use JavaScript buffer functions to get at the actual bits of a floating point number. (I'd say more now about how to do that, but I've got to run!)
Edit... back from dinner...
You can convert JavaScript's standard number type, which is 64-bit floating point, to a BigInt like this:
let dv = new DataView(new ArrayBuffer(8));
dv.setFloat64(0, Math.PI);
console.log(dv.getFloat64(0), dv.getBigInt64(0), dv.getBigInt64(0).toString(16).toUpperCase())
The output from this is:
3.141592653589793 4614256656552045848n "400921FB54442D18"
The first item shows that the number was properly stored as byte array, the second shows the BigInt created from the same bits, and the last is the same BigInt over again, but in hex to better show the floating point data format.
Once you've converted a number like this to a BigInt (which is not the same numeric value, but it is the same string of bits) every possible value of number will be uniquely represented.
The same bit-wise operators you used in your algorithm above will work with BigInts, but without the 32-bit limitation. I'm guessing that for best results you'd want to change the 32 in your code to 64, and use 16-digit (instead of 8-digit) hex constants as hash keys.
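A hedged sketch (an assumption, not code from the answer) of what that 64-bit BigInt variant could look like; the widened hex constants are placeholders, simply doubled from the originals rather than taken from any reference implementation:
const MASK64 = (1n << 64n) - 1n;
class HashBig {
  static rotate(x, b) {
    x &= MASK64; // keep x within 64 bits; BigInts never overflow on their own
    return ((x << b) ^ (x >> (64n - b))) & MASK64;
  }
  static pcg(a) {
    a &= MASK64;
    let b = a;
    for (let i = 0; i < 3; i++) {
      a = HashBig.rotate((a ^ 0xcafebabecafebaben) + (b ^ 0xfaceb00cfaceb00cn), 23n);
      b = HashBig.rotate((a ^ 0xdeadbeefdeadbeefn) + (b ^ 0x8badf00d8badf00dn), 5n);
    }
    return a ^ b;
  }
}
// Feed it the raw bits of a float, obtained with the DataView trick shown above:
const view = new DataView(new ArrayBuffer(8));
view.setFloat64(0, 1.1);
console.log(HashBig.pcg(view.getBigUint64(0)).toString(16));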

I can't get proper uint32 number in javascript

I am trying to convert a long number to a uint in JavaScript, but the result I get is different from the one I already have in C#.
c#:
var tt=431430059159441001;
var t=(UInt32)tt;//1570754153
js:
var arr =new Uint32Array(1);
arr[0]=431430059159441001;//1570754176
So could anybody explain why there is a difference?
That's because your number literal is really a 64-bit integer, and that cannot be represented exactly in JavaScript's regular Number type. The Number type is a 64-bit floating-point number, which can only represent integer values exactly up to around 2**53. So I would recommend not using such a huge number literal.
A recent development in the JavaScript world is BigInts. If you can afford to use them, then your code is easy to fix:
var t = Number(BigInt.asUintN(32, 431430059159441001n));
console.log(t); // 1570754153
This is not about uints, but about floats. JavaScript uses floating point numbers, and your number exceeds the maximum range of integers that can safely be represented:
console.log(431430059159441001)
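Concretely, the literal rounds to the nearest representable double, 431430059159441024, and that value modulo 2^32 is exactly what the Uint32Array stored (a quick check of the arithmetic):
console.log(431430059159441001 === 431430059159441024); // true: both literals round to the same double
console.log(431430059159441024 % 2**32);                // 1570754176, matching the Uint32Array result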
You cannot convert 431430059159441001 to an unsigned integer in C#. The max value of UInt32 is 4294967295, so the assignment var t = (UInt32)431430059159441001; gives a compiler error.
Also, 431430059159441001 is larger than Number.MAX_SAFE_INTEGER (JavaScript holds numbers in floating-point format), so it cannot be stored exactly.

Conversion issue for a long string of integers in JavaScript

I'm trying to convert a long string that contains only digits into a number.
var strOne = '123456789123456789122';
parseInt(strOne, 10);
// => 123456789123456800000
var strTwo = '1234567891234567891232';
parseInt(strTwo, 10);
// => 1.234567891234568e+21
The expected output should be the same as strOne and strTwo but that isn't happening here. While converting the string to a number, the output gets changed.
What's the best way to fix this issue?
BigInt is now available in browsers.
BigInt is a built-in object that provides a way to represent whole numbers larger than 2^53 - 1, which is the largest number JavaScript can reliably represent with the Number primitive.
value: The numeric value of the object being created. May be a string or an integer.
var strOne = '123456789123456789122';
var intOne = BigInt(strOne);
var strTwo = '1234567891234567891232';
var intTwo = BigInt(strTwo);
console.log(intOne, intTwo);
Your number is unfortunately too large, and precision is lost when the conversion is done.
The largest integer you can express in JavaScript is 2^53-1, it is given by Number.MAX_SAFE_INTEGER, see the MDN doc here.
The reasoning behind that number is that JavaScript uses double-precision floating-point format numbers as specified in IEEE 754 and can only safely represent numbers between -(2^53 - 1) and 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);
If you want to work with numbers bigger than this limit, you'll have to use a different representation than Number such as String and use a library to handle operations (see the BigInteger library for example).
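If your target runtimes support it, the built-in BigInt shown in the first answer also covers the arithmetic, and you can keep the value as a plain string at the API boundary (a minimal sketch, assuming the inputs are always integer strings):
const a = BigInt("123456789123456789122");
const b = BigInt("1234567891234567891232");
console.log((a + b).toString()); // "1358024680358024680354", exact and safe to send as a string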
