Converting large number to string in Javascript/Node - javascript

I have seen other related questions, but they did not solve my problem, or maybe I somehow missed the exact same resolved queries.
Here is the problem: the service I call returns a JSON response in which some keys have large numbers as values, and I want to pass them on to my view and display them. The issue is that they are getting rounded off, which I don't want. The response arrives in a Buffer, from which I am currently doing:
JSON.parse(res.body.toString()) // res.body is a Buffer
and sending the result to the view. How can I retain the whole number as a string, so that exactly the same value is made available to the UI?
I thought maybe a replacer would help, but it doesn't work either:
const replacer = (key, value) => {
  if (typeof value === 'number') {
    return JSON.stringify(value);
  }
  return value;
};

// 78787878977987787897897897123456786747398
const obj = {
  name: 'John',
  income: 78787878977987787897897897123456786747398,
  car: null
};

var buf = Buffer.from(JSON.stringify(obj));
console.log(buf.toString());
// console.log(JSON.stringify(buf.toString()))
// console.log('func res: ', replacer('key', 78787878977987787897897897123456786747398))
// console.log(obj.income.toString())
console.log(JSON.stringify(obj, replacer));
You can recommend a trusted external library, or better, suggest a solution in direct code.
Edit:
The outcome, in short: convert the number to a string before returning it from the server. Once it gets into JS (a Buffer, in my case) and is parsed, the conversion has already occurred, meaning that from the application side nothing can be done to retrieve the original value.
Please let me know if there's a real solution to this without modifying the server response.

Unfortunately, the number is larger than Number.MAX_SAFE_INTEGER, so once it gets parsed as a number, it won't be reliable, even if it's converted back to a string later (such as with the reviver function, the second parameter to JSON.parse). But luckily, since you have the JSON string (in the Buffer), you can replace numeric values with string values before JSON.parse-ing it. For example:
const resBody = '{"foo":"bar", "objs":[{"name":"John", "income": 78787878977987787897897897123456786747398}]}';
const resBodyReplaced = resBody.replace(/: *(\d+)/g, ':"$1"');
console.log(JSON.parse(resBodyReplaced).objs[0].income);
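Once the values are quoted, a reviver can hand the strings that look like oversized integers to BigInt, so downstream code gets real numeric values rather than strings. A sketch building on the snippet above (the digit-only check and the safe-integer threshold are one possible set of criteria, not the only one):

```javascript
// Quote bare integer values in the raw JSON text, then revive the
// oversized ones as BigInt instead of leaving them as strings.
const resBody = '{"foo":"bar", "objs":[{"name":"John", "income": 78787878977987787897897897123456786747398}]}';
const quoted = resBody.replace(/: *(\d+)/g, ':"$1"');

const parsed = JSON.parse(quoted, (key, value) =>
  typeof value === 'string' && /^\d+$/.test(value) && !Number.isSafeInteger(Number(value))
    ? BigInt(value) // too big for a plain number: keep full precision
    : value         // everything else passes through untouched
);

console.log(typeof parsed.objs[0].income); // "bigint"
console.log(parsed.objs[0].income.toString());
```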

Related

How to make JSON.parse() to treat all the Numbers as BigInt?

I have some numbers in json which overflow the Number type, so I want it to be bigint, but how?
{"foo":[[0],[64],[89],[97]],"bar":[[2323866757078990912,144636906343245838,441695983932742154,163402272522524744],[2477006750808014916,78818525534420994],[18577623609266200],[9008333127155712]]}
TLDR;
You may employ the JSON.parse() reviver parameter.
Detailed Solution
To control JSON.parse() behavior that way, you can make use of the second parameter of JSON.parse() (the reviver): a function that post-processes key-value pairs (and may pass the desired values to BigInt()).
Yet the values recognized as numbers will still be coerced (the credit for pinpointing this issue goes to @YohanesGultom).
To get around this, you may enquote your big numbers (turning them into strings) in your source JSON string, so that their values are preserved upon conversion to bigint.
As long as you wish to convert only certain numbers to bigint, you will need to pick appropriate criteria (e.g. checking whether the value exceeds Number.MAX_SAFE_INTEGER with Number.isSafeInteger(), as @PeterSeliger has suggested).
Thus, your problem may be solved with something like this:
// source JSON string
const input = `{"foo":[[0],[64],[89],[97]],"bar":[[2323866757078990912,144636906343245838,441695983932742154,163402272522524744],[2477006750808014916,78818525534420994],[18577623609266200],[9008333127155712]]}`

// function that implements desired criteria
// to separate *big numbers* from *small* ones
//
// (works for input parameter num of type number/string)
const isBigNumber = num => !Number.isSafeInteger(+num)

// function that enquotes *big numbers* matching
// desired criteria into double quotes inside
// JSON string
//
// (function checking for *big numbers* may be
// passed as a second parameter for flexibility)
const enquoteBigNumber = (jsonString, bigNumChecker) =>
  jsonString.replaceAll(
    /([:\s\[,]*)(\d+)([\s,\]]*)/g,
    (matchingSubstr, prefix, bigNum, suffix) =>
      bigNumChecker(bigNum)
        ? `${prefix}"${bigNum}"${suffix}`
        : matchingSubstr
  )

// parser that turns matching *big numbers* in
// source JSON string to bigint
const parseWithBigInt = (jsonString, bigNumChecker) =>
  JSON.parse(
    enquoteBigNumber(jsonString, bigNumChecker),
    (key, value) =>
      !isNaN(value) && bigNumChecker(value)
        ? BigInt(value)
        : value
  )

// resulting output
const output = parseWithBigInt(input, isBigNumber)

console.log("output.foo[1][0]: \n", output.foo[1][0], `(type: ${typeof output.foo[1][0]})`)
console.log("output.bar[0][0]: \n", output.bar[0][0].toString(), `(type: ${typeof output.bar[0][0]})`)
Note: you may find the RegExp pattern used to match strings of digits among JSON values not quite robust, so feel free to come up with your own (mine was the quickest I managed to pick off the top of my head for demo purposes).
Note: you may still opt for some library, as suggested by @YohanesGultom, yet adding 10k to your client bundle or 37k to your server-side dependencies (and possibly to docker image size) for that sole purpose may not be quite reasonable.
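Going the other way is the mirror problem: JSON.stringify() throws a TypeError on bigint values. If parsed objects need to be serialized back out, a replacer that downgrades bigint to string is a minimal sketch (the quotes reappear in the output, so the receiver needs the same enquoting convention):

```javascript
// JSON.stringify cannot serialize bigint values natively,
// so convert them to strings in a replacer.
const data = { id: 2323866757078990912n, label: "foo" };

const json = JSON.stringify(data, (key, value) =>
  typeof value === 'bigint' ? value.toString() : value
);

console.log(json); // {"id":"2323866757078990912","label":"foo"}
```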

How to define a Date Scalar using GraphQL-JS?

I am trying to define a custom scalar in GraphQL so I can query & process the Dates in my MongoDB collections. I am not sure I understand 100% what a scalar is or does, but it seems to be a sort of type that I define myself. All the examples & tutorials I found were using Apollo or some other type of notation, but I would like to see a solution using GraphQL-JS
So far, I have defined my scalar like so:
const Date = new GraphQLScalarType({
  name: "Date",
  serialize: (value) => {
    return value; // is it correct to just return the value? Do I need to parse it or turn it into a Date first?
  },
  parseValue: () => {
    return "serialise"; // I am just returning this string here, because I do not know what this function is for
  },
  parseLiteral(ast) {
    return null; // I am just returning null here, because I do not know what this function is for
  },
});
I am not sure I understand what each of these functions are supposed to do. And wouldn't there also have to be a deserialize function?
When I query now against my graphql endpoint I do get back something like:
{
"myDate": "2020-07-15T00:00:00.000Z"
}
I guess that my serialise function is at play here? The Date is certainly correct, but I am not sure if I should do anything else with the data before returning it from serialize? Right now I just return whatever I get from my MongoDB database.
Urigo, from The Guild, created graphql-scalars, which contains definitions for multiple common scalars used in GraphQL.
//is it correct, to just return the value? Do I need to parse it or turn it into a Date first?
It would be wise to validate that value is a Date before returning it.
And yes, just return the value.
//I am just returning null here, because I do not know what this function is for
This is the entry from the abstract syntax tree (AST). See Urigo's code below to see how the ast object is accessed (ast.kind, ast.value).
Additionally, take a look at this SO post that describes the difference between parseValue and parseLiteral
Take a look at localDate and that may provide you the example you need to answer your question :)
https://github.com/Urigo/graphql-scalars/blob/master/src/scalars/LocalDate.ts#L34
export const GraphQLLocalDate = /*#__PURE__*/ new GraphQLScalarType({
  name: 'LocalDate',
  description:
    'A local date string (i.e., with no associated timezone) in `YYYY-MM-DD` format, e.g. `2020-01-01`.',
  serialize(value) {
    // value sent to client as string
    return validateLocalDate(value);
  },
  parseValue(value) {
    // value from client as json
    return validateLocalDate(value);
  },
  parseLiteral(ast) {
    // value from client in ast
    if (ast.kind !== Kind.STRING) {
      throw new GraphQLError(
        `Can only validate strings as local dates but got a: ${ast.kind}`,
      );
    }
    return validateLocalDate(ast.value);
  },
});
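To answer the original question directly, here is a sketch of the three hooks for a Date scalar, written as plain functions so the logic is visible without the graphql package; with graphql-js you would pass these as the serialize/parseValue/parseLiteral options to new GraphQLScalarType({...}). No separate deserialize is needed: serialize covers the output direction, and the two parse* functions cover input. The 'StringValue' kind check mirrors Kind.STRING above; the validation details are one reasonable choice, not the library's prescribed behavior.

```javascript
// serialize: internal value (e.g. a Date from MongoDB) -> JSON-safe output
function serialize(value) {
  if (!(value instanceof Date) || isNaN(value.getTime())) {
    throw new TypeError('Date scalar can only serialize valid Date objects');
  }
  return value.toISOString(); // e.g. "2020-07-15T00:00:00.000Z"
}

// parseValue: value arriving via JSON variables -> internal Date
function parseValue(value) {
  const date = new Date(value);
  if (isNaN(date.getTime())) {
    throw new TypeError(`Value is not a valid date: ${value}`);
  }
  return date;
}

// parseLiteral: value written inline in the query document, as an AST node
function parseLiteral(ast) {
  if (ast.kind !== 'StringValue') { // Kind.STRING in graphql-js
    throw new TypeError(`Can only parse strings to dates but got: ${ast.kind}`);
  }
  return parseValue(ast.value);
}

console.log(serialize(new Date(Date.UTC(2020, 6, 15)))); // "2020-07-15T00:00:00.000Z"
```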

How to parse and format strings out of irregular CSV in javascript?

I've scraped this allergy data string from a public website:
Cedar 679 gr/m3 High, Grass 20 gr/m3 Medium, Trees 80 gr/m3 Medium, Molds Low.
Sometimes the number of items is fewer, but the general format for trees and grasses is always like this, with commas separating each type of allergen:
TYPE AMOUNT g/m3 LEVEL
Molds is the exception; assume it will always be a string of text. Assume we don't require the molds data at all.
What library or technique would you use to parse this into a neat JSON object, for example:
{
  "Cedar": "679",
  "Grass": "20",
  "Trees": "80"
}
As Sam stated in the comments, it'd be ideal to utilize npmjs.com/package/csv-parser
However, if you want to use vanilla JS, I wrote a basic script that works given your input:
// function takes a csv string and returns an object,
// including only values with 4 parts
function parseCsv(csvString) {
  let out = {};
  let spacedValues = csvString.split(/,\s*/);
  let values = spacedValues.map(str => str.split(" "));
  values.forEach((value) => {
    if (value.length === 4) {
      // you can change the value from an object to value[1] if you only need the amount
      out[value[0]] = {
        AllergenAmount: value[1],
        AllergenUnits: value[2],
        AllergenLevel: value[3]
      };
    }
    // add an else if here if you want to keep values with more/less than 4 parts of the string
  });
  return out;
}
//wrapper that implements the builtin JSON.stringify method
const csvToJSONString = csvString => JSON.stringify(parseCsv(csvString));
To use it, just pass the csv string into the csvToJSONString function, and it will return a JSON string. You can also change the properties from an object to value[1] if you only needed the amount (commented in code).
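A compact vanilla alternative, assuming the TYPE AMOUNT gr/m3 LEVEL shape from the sample string, is a single regex with matchAll; entries without an amount (like Molds) simply never match:

```javascript
const input = 'Cedar 679 gr/m3 High, Grass 20 gr/m3 Medium, Trees 80 gr/m3 Medium, Molds Low.';

// capture the allergen name and amount from each "TYPE AMOUNT gr/m3 LEVEL" chunk
const result = {};
for (const [, type, amount] of input.matchAll(/(\w+) (\d+) gr\/m3/g)) {
  result[type] = amount;
}

console.log(result); // { Cedar: '679', Grass: '20', Trees: '80' }
```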
I worked with the "csvtojson" module before in a similar situation and it helped a lot.
https://www.npmjs.com/package/csvtojson
You should try csv-parse I'm using it in my current project and it works like a charm.

Javascript object with arrays to search param style query string

Looking for a clean way to convert a JavaScript object containing arrays as values to a search-param-compatible query string, serializing one element from each array before moving to the next index.
Using libraries such as querystring or qs converts the object just fine, but handles each array independently. Passing the resulting string to the server (which I cannot change) causes an error in handling of the items, as each previous value is overwritten by the next. Using any kind of array notation in the query string is not supported. The only option I have not tried is a custom sort function, but that seems like it would be worse than writing a custom function to parse the object. Any revision to the object that would generate the expected result is welcome as well.
var qs = require("qs")
var jsobj = {
  origString: ['abc', '123'],
  newString: ['abcd', '1234'],
  action: 'compare'
}
qs.stringify(jsobj,{encode:false})
qs.stringify(jsobj,{encode:false,indices:false})
qs.stringify(jsobj,{encode:false,indices:false,arrayFormat:'repeat'})
Result returned is
"origString=abc&origString=123&newString=abcd&newString=1234&action=compare"
Result desired would be
"origString=abc&newString=abcd&origString=123&newString=1234&action=compare"
I tried reordering your JSON:
> var jsobj = [{origString: 'abc', newString: 'abcd' }, {origString: '123',
newString: '1234' }, {action:'compare'}]
> qs.stringify(jsobj,{encode:false})
'0[origString]=abc&0[newString]=abcd&1[origString]=123&1[newString]=1234&2[action]=compare'
But I don't know if this is a good alternative for your problem.
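If the server really did require the interleaved order, a hand-rolled serializer over the original object shape is small enough to avoid a library entirely. A sketch, assuming values are either arrays or scalars, with scalars appended last as in the desired output:

```javascript
// Emit one element from each array per pass, then append scalar values.
function interleaveStringify(obj) {
  const entries = Object.entries(obj);
  const arrays = entries.filter(([, v]) => Array.isArray(v));
  const scalars = entries.filter(([, v]) => !Array.isArray(v));
  const parts = [];
  const maxLen = Math.max(...arrays.map(([, v]) => v.length));
  for (let i = 0; i < maxLen; i++) {
    for (const [key, values] of arrays) {
      if (i < values.length) parts.push(`${key}=${encodeURIComponent(values[i])}`);
    }
  }
  for (const [key, value] of scalars) {
    parts.push(`${key}=${encodeURIComponent(value)}`);
  }
  return parts.join('&');
}

const jsobj = {
  origString: ['abc', '123'],
  newString: ['abcd', '1234'],
  action: 'compare'
};
console.log(interleaveStringify(jsobj));
// origString=abc&newString=abcd&origString=123&newString=1234&action=compare
```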
Chalk this up to a misunderstanding of the application. After spending some more time with the API, I realized my mistake; as posted above by others, order does not matter. Not sure why my first several attempts failed, but the question is 'answered'.

How can I get integers from bigquery nodejs api?

I am fetching data from BigQuery which I need to store in MongoDB as integers, so that I can perform operations on that data in Mongo. Even though the data type of the columns in BigQuery is Integer, its Node.js API is returning strings in the JavaScript objects. E.g. I'm getting results that look like [{row1:'3',row2:'4',row3:'5'},{row1:'13',row2:'14',row3:'15'}...]
typeof gives string for each element of the object. I could run a loop and convert each element to an integer, but that does not scale with the data set. Also, I don't want all strings converted to integers, only the ones stored as integers in BigQuery. I'm using the gcloud module in Node.js to fetch the data.
Assuming you know where the type property is on the response, something like this would work:
var response = [{type: 'Integer', value: '13'} /* other objects.. */];
var mappedResponse = response.map(function(item) {
  // Put your logic here
  // This implementation just bails
  if (item.type != 'Integer') return item;
  // This just converts the value to an integer, but beware
  // it returns NaN if the value isn't actually a number
  item.value = parseInt(item.value);
  // you MUST return the item after modifying it.
  return item;
});
This still loops over each item, but immediately bails out if it's not what we're looking for. Could also compose multiple maps and filters to generalize this out.
The only way to get around this is by first applying a filter, but that basically achieves the same thing as our initial type check:
var mappedResponse = response
  // Now we only deal with integers in the map function
  .filter(x => x.type == 'Integer')
  .map(function(item) {
    // This just converts the value to an integer, but beware
    // it returns NaN if the value isn't actually a number
    item.value = parseInt(item.value);
    // you MUST return the item after modifying it.
    return item;
  });
BigQuery deliberately encodes integers as strings when returning them via API to avoid loss of precision for large values. For now, the only option is to parse them on the client side.
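If the table schema is available (the BigQuery API returns it alongside query results), the conversion can be limited to columns actually typed INTEGER. A sketch using a hypothetical schema/rows shape, since the exact response format depends on the client library version:

```javascript
// Hypothetical shapes: schema.fields describing column types,
// rows as plain objects keyed by column name.
const schema = {
  fields: [
    { name: 'row1', type: 'INTEGER' },
    { name: 'row2', type: 'STRING' }
  ]
};
const rows = [{ row1: '13', row2: 'x' }, { row1: '15', row2: 'y' }];

const intColumns = new Set(
  schema.fields.filter(f => f.type === 'INTEGER').map(f => f.name)
);

// Note: Number() still loses precision above Number.MAX_SAFE_INTEGER;
// switch to BigInt for columns that can hold very large values.
const converted = rows.map(row =>
  Object.fromEntries(
    Object.entries(row).map(([key, value]) =>
      intColumns.has(key) ? [key, Number(value)] : [key, value]
    )
  )
);

console.log(converted); // [ { row1: 13, row2: 'x' }, { row1: 15, row2: 'y' } ]
```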
