I've got a JSON file I'm working with. I'll mostly be doing data transformations on the contents of the data property. I'd like to be able to easily do things like .filter on restaurantName and joined without the hassle of delving into the JSON structure each time.
const data = [
  {
    position: 1,
    title: "Queues Near You",
    data: [
      [
        {
          restaurantName: "Tonkotsu",
          joined: false,
        },
        {
          restaurantName: "BurgerVille",
          joined: false,
        },
      ],
    ],
  },
  {
    position: 2,
    title: "Restaurants Near You",
    data: [
      [
        {
          restaurantName: "Seoreni",
          joined: false,
        },
        {
          restaurantName: "Jinmu",
          joined: false,
        },
      ],
    ],
  },
];
So I decided to make a facade toRestaurantsArray that returns an array of arrays of restaurantName values, i.e. [["Tonkotsu", "BurgerVille"], ["Seoreni", "Jinmu"]], which is much easier to work with. Then I created a fromRestaurantsArray that recreates the JSON structure from this new array.
Am I approaching this problem the right way with the to and from functions? If not, what's a better route, and if I am, how can I refactor the code below in terms of logic (ignore the any types)? I can tell the code in fromRestaurantsArray that just repeatedly loops over the JSON structure is terrible, but I'm unsure of a better approach.
type RestaurantDetails = {
  restaurantName: string;
  joined: boolean;
};

type Restaurant = {
  position: number;
  title: string;
  data: Array<Array<RestaurantDetails>>;
};

// Pairs up elements of the two arrays by index, like Python's zip
function* zip(arrayOne: any[], arrayTwo: any[]) {
  const len = arrayOne.length;
  for (let i = 0; i < len; i++) {
    yield [arrayOne[i], arrayTwo[i]];
  }
}
// Expected output is to return just the list inside data
const toRestaurantsArray = (initialRestaurantsData: Restaurant[]) => {
  const allRestaurantNames: any = [];
  for (const eachRestaurantData of initialRestaurantsData) {
    const [restaurants] = eachRestaurantData.data;
    const restaurantNames = restaurants.map(
      ({ restaurantName }) => restaurantName
    );
    allRestaurantNames.push(restaurantNames);
  }
  return allRestaurantNames;
};
const fromRestaurantsArray = (
  restaurantsArray: any[],
  initialRestaurantsData: any[]
) => {
  const reconstructedRestaurantsData: any = [];
  for (const [restaurantNestedArray, eachRestaurantData] of zip(
    restaurantsArray,
    initialRestaurantsData
  )) {
    const dataProperty: any = [];
    for (const [restaurantName, eachRestaurantsInnerData] of zip(
      restaurantNestedArray,
      eachRestaurantData.data[0]
    )) {
      dataProperty.push({
        ...eachRestaurantsInnerData,
        restaurantName: restaurantName,
      });
    }
    // Wrap dataProperty in an array so the shape matches the original
    // nested data property (Array<Array<RestaurantDetails>>)
    const myTotal = { ...eachRestaurantData, data: [dataProperty] };
    reconstructedRestaurantsData.push(myTotal);
  }
  return reconstructedRestaurantsData;
};
const restoArray = toRestaurantsArray(data);
const filteredArray = restoArray.filter((arr: string[]) => arr.includes("Tonkotsu"));
const reconstructedArray = fromRestaurantsArray(filteredArray, data);
console.log(reconstructedArray);
I cannot offer a direct solution to your question.
JSON formatting notwithstanding, I think what you are attempting to do is similar to what Firebase's query language and other database products do with JSON.
Here's an article entitled Jolt: JSON to JSON Transformation Library.
Not really my area, but my understanding is that these technologies rely on metadata to index the "searchable" parts of a JSON blob. I once had a passing interest in JSON schemas but, so far, haven't had a practical need to implement one myself.
Your question here reminded me a bit of schemas and made me think about a practical use case for them beyond validation.
(Back in the day, I used to regularly transform XML documents in a similar manner to what you are attempting to do with JSON. There are specialized libraries for transforming XML.)
Related
I receive JSON data from the service, but the keys change in the data with each request, below I will give an example in three versions.
Example 1:
{
    "trackingPayloads": {
        "Rltyn4gLRIWRKj9kS0YpWXytG81GZwcPWjEE7f31ALlq": "{\"title\":\"Red Shoes\",\"index\":3,\"id\":\"17777\",\"type\":\"category\"}",
        "ywtA6OyM0hzVZZvnUjxoxJDI1Er9ArfNr8XKyi1D5Zzk": "{\"title\":\"White Shoes\",\"index\":3,\"id\":\"17777\",\"type\":\"category\"}"
    }
}
Example 2:
{
    "trackingPayloads": {
        "36tW7DqZ3H9KKBEAumZmowmUwmDRmVCjQgv5zi9GM3Kz": "{\"title\":\"Red Shoes\",\"index\":3,\"id\":\"17777\",\"type\":\"category\"}",
        "OgtE51n3YtvrVXWLFjPmpnRt2k5DExF7ovxmBTZrZ6wV": "{\"title\":\"White Shoes\",\"index\":3,\"id\":\"17777\",\"type\":\"category\"}"
    }
}
Example 3:
{
    "trackingPayloads": {
        "k2toY29glt2JEp9Wi1X5M7ocno0E0mS4JQVyDuGyQ2rM": "{\"title\":\"Red Shoes\",\"index\":3,\"id\":\"17777\",\"type\":\"category\"}",
        "5ef2ec3c3573eebecc9690b69619ec7b9c93b609": "{\"title\":\"White Shoes\",\"index\":3,\"id\":\"17777\",\"type\":\"category\"}"
    }
}
As you can see, the data stored under the keys does not change, since I am requesting the same information, but the keys change with each request.
Please help: what are the options to get the title, index, and any other content under these keys using Node.js?
Only one option came to my mind: rename the keys on receipt to 1, 2, 3 ... and then read the data from them. But this would need to happen dynamically, since about 120 requests per minute are made and the data has to be retrieved quickly; saving it to a file is not an option (I couldn't work out how).
UPDATE: added my code.
I am attaching an example of my code. The idea is to eventually get the data I need from the right keys in trackingPayloads. Please help with the code <3
const AwaitAPIResponse = await ProductAPI(product_sku);
const $ = cheerio.load(AwaitAPIResponse);

const JSONDATA = [];
$('pre').each(function() {
    JSONDATA.push($(this).text());
});

const ProductJson = JSON.parse(JSONDATA[0]); // this is where I get all the data
const MainJson = ProductJson["trackingPayloads"]; // here I go to the trackingPayloads you saw above
How can I get the data I need?
You can use Object.keys() to get all the keys of an object and use a loop to go through them.
You can therefore rework this code so that each value is stored as an element of an array, which may make the data easier to work with:
const convert = object => {
    const ret = [];
    for (const key of Object.keys(object)) {
        ret.push(object[key]);
    }
    return ret;
};
This will give you the following result for your use case:
[{"title":"Red Shoes","index":3,"id":"17777","type":"category"},
 {"title":"White Shoes","index":3,"id":"17777","type":"category"}]
The way you'd call this is as follows:
const some_parsed_json = {
    "k2toY29glt2JEp9Wi1X5M7ocno0E0mS4JQVyDuGyQ2rM": {
        title: "Red Shoes",
        index: 3,
        id: "17777",
        type: "category"
    },
    "5ef2ec3c3573eebecc9690b69619ec7b9c93b609": {
        title: "White Shoes",
        index: 3,
        id: "17777",
        type: "category"
    }
};

const json_object_values = convert(some_parsed_json);
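Note that in the payload from your question the values under trackingPayloads are JSON strings rather than objects, so each element would likely need to be parsed as well. A minimal sketch, assuming the strings are valid JSON and reusing MainJson from your code above:

const stringValues = convert(MainJson);
// Wrap JSON.parse in an arrow so map's extra arguments aren't passed to it
const parsedValues = stringValues.map(value => JSON.parse(value));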
If you don't care about the keys you could use Object.values on the received object to get the values:
Object.values(payload)
// With your example it will return:
// [{"title":"Red Shoes","index":3,"id":"17777","type":"category"},
//  {"title":"White Shoes","index":3,"id":"17777","type":"category"}]
or, in a more complete example:
async function getParsedValues() {
    const responseString = await service.getData(); // '{"trackingPayloads":{"Rltyn4gLRIWRKj9kS0YpWXytG81GZwcPWjEE7f31ALlq":{"title":"Red Shoes","index":3,"id":"17777","type":"category"},"ywtA6OyM0hzVZZvnUjxoxJDI1Er9ArfNr8XKyi1D5Zzk":{"title":"White Shoes","index":3,"id":"17777","type":"category"}}}'
    const parsedResponse = JSON.parse(responseString);
    const values = Object.values(parsedResponse.trackingPayloads); // [{ title: 'Red Shoes', index: 3, id: '17777', type: 'category' }, { title: 'White Shoes', index: 3, id: '17777', type: 'category' }]
    return values;
}
I've been pondering the best way to handle grouping in my app. It's a video editing app, and I am introducing the ability to group layers, much like in Figma or any other design/video editing program.
To keep this simple: in the app, the video data is a map
const map = {
  "123": {
    uid: "123",
    top: 25,
    type: "text"
  },
  "345": {
    uid: "345",
    top: 5,
    type: "image"
  },
  "567": {
    uid: "567",
    top: 25,
    type: "group",
    children: ["345", "123"]
  }
};
Then I am grouping them inside a render function (this feels expensive)
const SomeComponent = () => {
  const objects = useMemo(() => makeTrackObjects(map), [map]);

  return (
    <div>
      {objects.map(object => {
        // key added so React can track layers across re-renders
        return <div key={object.uid}>Some layer that will change the data causing re-renders</div>;
      })}
    </div>
  );
};
Here is the function that does the grouping
const makeTrackObjects = (map) => {
  // converts map to array
  const objects = Object.keys(map).map((key: string) => ({ ...map[key] }));

  // flat array of all objects to be grouped by their key/id
  const objectsInGroup = objects
    .filter((object) => object.type === "group")
    .map((object) => object.children)
    .flat();

  // filter out objects that are nested/grouped
  const filtered = objects.filter((object) => !objectsInGroup.includes(object.uid));

  // insert objects as children during render
  const grouped = filtered.map((object) => {
    const children = object.children
      ? {
          children: object.children
            .map((o, i) => {
              return {
                ...map[o]
              };
            })
            .flat()
        }
      : {};

    return {
      ...object,
      ...children
    };
  });

  // the core data is flat but now nested for the UI. Is this inefficient?
  return grouped;
};
Ideally I would like to keep the data flat, since I have a lot of code that I would have to update to go deeper into the data. It feels nice to have the data flat, with transformers in certain areas where needed.
The main question: does this make sense, is it efficient, and if not, why?
If you are running into performance issues, one area you may want to investigate is how you are chaining array functions (map, filter, flat, etc.). Each of these calls creates an intermediate collection from the array it receives (for instance, chaining two map calls loops through the full array twice). You could improve performance by using a single loop and pushing items into one collection. (Here's an article that touches on this being a motivation for transducers.)
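For instance, a quick generic sketch of the difference (items, active, and name here are just illustrative):

// Two passes over the data, with an intermediate array produced by filter:
const names = items.filter(item => item.active).map(item => item.name);

// One pass, pushing straight into the result:
const names2 = [];
for (const item of items) {
  if (item.active) names2.push(item.name);
}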
I haven't encountered a performance issue with this before, but you may also want to remove spread (...) when unnecessary.
Here is my take on those adjustments to makeTrackObjects.
Update
I also noticed that you are using includes while iterating through an array. This is effectively O(n^2) time complexity because each item will be scanned against the full array. One way to mitigate is to instead use a Set to check if that content already exists, turning this into O(n) time complexity.
const map = {
  "123": {
    uid: "123",
    top: 25,
    type: "text"
  },
  "345": {
    uid: "345",
    top: 5,
    type: "image"
  },
  "567": {
    uid: "567",
    top: 25,
    type: "group",
    children: ["345", "123"]
  }
};
const makeTrackObjects = (map) => {
  // converts map to array
  const objects = Object.keys(map).map((key) => map[key]);

  // set of all objects to be grouped by their key/id
  const objectsInGroup = new Set();
  objects.forEach(object => {
    if (object.type === "group") {
      object.children.forEach(child => objectsInGroup.add(child));
    }
  });

  // filter out objects that are nested/grouped
  const filtered = objects.filter((object) => !objectsInGroup.has(object.uid));

  // insert objects as children during render
  const grouped = filtered.map((object) => {
    const children = {};
    if (object.children) {
      children.children = object.children.map(child => map[child]);
    }
    return {
      ...object,
      ...children
    };
  });

  // the core data is flat but now nested for the UI
  return grouped;
};
console.log(makeTrackObjects(map));
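For the sample map above, only the group survives the filter, with its children embedded inline, so the logged result looks like this:

// [
//   {
//     uid: "567",
//     top: 25,
//     type: "group",
//     children: [
//       { uid: "345", top: 5, type: "image" },
//       { uid: "123", top: 25, type: "text" }
//     ]
//   }
// ]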
This is in the context of a Node Express route. I receive a GET request with a query param that is a list of IDs. I need to make a callout for each ID and store the result in an array or object: each element of the ID array needs to be mapped to its corresponding callout result. I don't have a way to modify the endpoint that I'm hitting from this route, so I have to make a single call per ID. I've done some research, and so far I have a mixture of code and pseudocode like this:
const ids = req.query.ids;

const idMembers = Promise.all(ids.map(async id => {
  // here I'd like to create a single object or associative array
  [ id: await callout(id); ]
}));
When all the promises have resolved, I need the final result of idMembers to look like the following. (The real response is an object with nested arrays and objects; I've simplified it for this post, but I need to grab it from res.payload.)
{
  '211405': { name: 'name1', email: 'email1@test.co' },
  '441120': { name: 'name2', email: 'email2@test.co' },
  '105020': { name: 'name3', email: 'email3@test.co' }
}
Oh, and of course I need to handle callout and promise failures, and that's where my lack of experience with JavaScript becomes a real issue. I appreciate your help in advance!
An extra thought I'm having: I could map the results of the resolved promises to their IDs, and then, in a separate iteration, create my final array/object mapping the IDs to the actual payloads. That still doesn't answer any of my questions, though; I'm just trying to provide as much information as I've gathered and thought of.
Promise.all returns an array of results (one item per promise).
Having this temporary structure it is possible to build the needed object.
// (inside your async route handler, so await is available)
const arrayOfMembers = await Promise.all(ids.map(async id => {
  // ...
  return { id, value: await callout(id) }; // shorthand for { id: id, value: ... } (see https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Object_initializer)
}));
// arrayOfMembers = [
//   { id: 211405, value: { name: 'name1', email: 'email1@test.co' } },
//   ...
// ]
In pure JS it can be done with a for loop or a .forEach() call to iterate:
const res = {};
arrayOfMembers.forEach(el => {
  const { id, value } = el;
  res[id] = value;
});
or by using a single reduce() call
const res = arrayOfMembers.reduce((accumulator, el) => {
  const { id, value } = el;
  return { ...accumulator, [id]: value };
}, {});
in both cases res will be:
// res = {
//   '211405': { name: 'name1', email: 'email1@test.co' },
//   ...
// }
P.S.
There is a handy library called lodash. It has tons of small methods for data manipulation.
For example, _.fromPairs() can build an object from [[key1, value1], [key2, value2]] pairs.
As you mentioned you have lodash, so the following should work:
const arrayOfKeyValuePairs = await Promise.all(ids.map(async id => {
  // ...
  return [ id, await callout(id) ]; // an array here so it matches what fromPairs needs
}));

const res = _.fromPairs(arrayOfKeyValuePairs);
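Since you also asked about handling callout failures: one option (a sketch, assuming you want to keep partial results) is Promise.allSettled, which waits for every promise and reports each outcome instead of rejecting the whole batch on the first failure:

const settled = await Promise.allSettled(ids.map(async id => [id, await callout(id)]));

const res = {};
for (const outcome of settled) {
  if (outcome.status === 'fulfilled') {
    const [id, value] = outcome.value;
    res[id] = value;
  } else {
    // outcome.reason holds the error; log it, or collect the failed ids for a retry
    console.error('callout failed:', outcome.reason);
  }
}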
I'm using the normalizr util to process an API response that is not based on an ids model. As far as I know, normalizr typically works with an ids model, but maybe there is some way to generate ids "on the go"?
My API response example:
```
// input data:
const inputData = {
  doctors: [
    {
      name: "Jon",
      post: "chief"
    },
    {
      name: "Marta",
      post: "nurse"
    },
    //....
  ]
};

// expected output data:
const outputData = {
  entities: {
    nameCards: {
      uniqueID_0: { id: "uniqueID_0", name: "Jon", post: "uniqueID_3" },
      uniqueID_1: { id: "uniqueID_1", name: "Marta", post: "uniqueID_4" }
    },
    positions: {
      uniqueID_3: { id: "uniqueID_3", post: "chief" },
      uniqueID_4: { id: "uniqueID_4", post: "nurse" }
    }
  },
  result: "uniqueID_0"
};
```
P.S.
I had heard about normalizr generating IDs "under the hood" for cases like mine, but I did not find such a solution.
As mentioned in this issue:
Normalizr is never going to be able to generate unique IDs for you. We don't do any memoization or anything internally, as that would be unnecessary for most people.
Your working solution is okay, but will fail if you receive one of these entities again later from another API endpoint.
My recommendation would be to find something that's constant and unique on your entities and use that as something to generate unique IDs from.
And then, as mentioned in the docs, you need to set idAttribute to replace 'id' with another key:
const data = { id_str: '123', url: 'https://twitter.com', user: { id_str: '456', name: 'Jimmy' } };

const user = new schema.Entity('users', {}, { idAttribute: 'id_str' });
const tweet = new schema.Entity('tweets', { user: user }, {
  idAttribute: 'id_str',
  // Apply everything from entityB over entityA, except for "favorites"
  mergeStrategy: (entityA, entityB) => ({
    ...entityA,
    ...entityB,
    favorites: entityA.favorites
  }),
  // Remove the URL field from the entity
  processStrategy: (entity) => omit(entity, 'url')
});

const normalizedData = normalize(data, tweet);
EDIT
You can always provide unique ids using an external lib or by hand:
inputData.doctors = inputData.doctors.map((doc, idx) => ({
  ...doc,
  id: `doctor_${idx}`
}));
Alternatively, have a processStrategy, which is basically a function, and assign your ids inside it, i.e. value.id = uuid(). Visit the link below to see an example: https://github.com/paularmstrong/normalizr/issues/256
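For the doctors example above, a minimal sketch of the assign-ids-by-hand route (the uid helper and the nameCards/positions schema names are illustrative, chosen to match the expected output in the question; they're not built into normalizr):

import { normalize, schema } from 'normalizr';

// Hypothetical id generator; anything unique and stable will do
let nextId = 0;
const uid = () => `uniqueID_${nextId++}`;

// Pre-assign ids and nest each post so it can become its own entity
const doctorsWithIds = inputData.doctors.map(doc => ({
  id: uid(),
  name: doc.name,
  post: { id: uid(), post: doc.post }
}));

const position = new schema.Entity('positions');
const nameCard = new schema.Entity('nameCards', { post: position });

const normalized = normalize(doctorsWithIds, [nameCard]);
// normalized.entities.nameCards and normalized.entities.positions now hold
// the entities keyed by the generated ids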
I am wondering if there is a way to construct Mongo queries to take advantage of ES6 default parameters. I have the following method. I want to return all the data if make, model, and year are not specified. I am trying to find an elegant solution, but so far all I can think of is a manual if/else.
getStyles({ make = '', model = '', year = '-1' }) {
  return this.db
    .collection('styles')
    .find({ 'make.niceName': make, 'model.niceName': model, 'year.year': parseInt(year) })
    .toArray();
}
Note:
This is causing some confusion, so to be clear: I am using destructuring on purpose. The problem is not how to write this function; the problem is how to construct a Mongo query so that it ignores empty values.
Assuming getStyles is your own method, sure, you can give make, model, and year defaults. You can also give a default for the whole object you're destructuring so caller doesn't have to pass anything:
function getStyles({make = '', model = '', year = '-1'} = {}) {
// Overall default ------------------------------------^^^^^
  return // ...
}
The question is not how to organize/write my function but how to use ES6 features to write cleaner code that works with Mongo. I.e., if the user didn't pass anything, I want to return all the styles, but Mongo actually looks for empty fields, so it doesn't return anything.
It sounds to me like you don't want default parameters (except perhaps the overall default). Instead, you want to automate how you build the object you pass find.
Given your code example, you can readily do that with Object.keys on your object. So accept as an object, e.g.:
function getStyles(options = {}) {
...and then build your find options based on options:
const findParams = {};
Object.keys(options).forEach(key => {
  findParams[key + ".niceName"] = options[key];
});
Live example:
function getStyles(options = {}) {
  const findParams = {};
  Object.keys(options).forEach(key => {
    findParams[key + ".niceName"] = options[key];
  });
  console.log(`find options: ${JSON.stringify(findParams)}`);
}
let results = getStyles({make: "Ford", model: "Mustang"});
results = getStyles({make: "Ford", model: "Mustang", year: 2017});
If the mapping of the name you accept (make) to the name you need for find (make.niceName) isn't as easy as just appending .niceName, it's easy enough to have a Map (or just object) you build once:
const paramNames = new Map([
  ["make", "make.niceName"],
  ["model", "model.niceName"],
  ["year", "year.year"]
]);
...and then use:
const findParams = {};
Object.keys(options).forEach(key => {
  const paramName = paramNames.get(key);
  if (paramName) {
    findParams[paramName] = options[key];
  }
});
Live example:
const paramNames = new Map([
  ["make", "make.niceName"],
  ["model", "model.niceName"],
  ["year", "year.year"]
]);

function getStyles(options = {}) {
  const findParams = {};
  Object.keys(options).forEach(key => {
    const paramName = paramNames.get(key);
    if (paramName) {
      findParams[paramName] = options[key];
    }
  });
  console.log(`find options: ${JSON.stringify(findParams)}`);
}

let results = getStyles({make: "Ford", model: "Mustang"});
results = getStyles({make: "Ford", model: "Mustang", year: 2017});
Side note: Defaults don't have to be strings, so if you use numbers for year rather than strings, your default would just be -1, not '-1'.
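Putting it together with your original method (a sketch; this assumes the Map-based paramNames above and, per your original query, that 'year.year' stores a number):

getStyles(options = {}) {
  const findParams = {};
  for (const [key, value] of Object.entries(options)) {
    const paramName = paramNames.get(key);
    if (!paramName) continue; // ignore unknown keys
    // 'year.year' holds a number, so coerce string input
    findParams[paramName] = key === "year" ? parseInt(value, 10) : value;
  }
  // An empty findParams ({}) matches every document, so calling
  // getStyles() with no arguments returns all the styles
  return this.db.collection('styles').find(findParams).toArray();
}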