I have my data as follows:
{
meta: {
format: "csv",
info: "desc",
columns: [
{
id: "Name",
type: "Text",
length: 32
},
{
id: "Text",
type: "Text",
length: 128
}]
},
rows: [
["John","xxxx"],
["Alpha","yyyy"],
["Beta","wwww"],
["Gamma","zzzz"]]
}
Now, I am struggling to map the records to a Table control as columns and rows. The columns are straightforward (a direct map), but since the rows lack a mapping to the columns, I wonder what the simplest way to do this would be.
Approach Steps:
Make a keys[] array from the column.id of each columns record.
Traverse the rows[].
On each iteration, loop over keys and build an object as { keys[j]: row[j] }.
Push it to an array.
Recreate the original JSON, but replace the rows arrays with the objects.
I am really struggling to translate this into code, especially the rows[] parsing and object creation. Is there (I am sure there must be) an efficient way to achieve this?
Here is what you could do, using Array.map and forEach.
var input = {
meta: {
format: "csv",
info: "desc",
columns: [{
id: "Name",
type: "Text",
length: 32
}, {
id: "Text",
type: "Text",
length: 128
}]
},
rows: [
["John", "xxxx"],
["Alpha", "yyyy"],
["Beta", "wwww"],
["Gamma", "zzzz"]
]
};
// collect the column ids, in order: ["Name", "Text"]
var columns = input.meta.columns.map((column) => {
return column.id
});
// turn each row array into an object keyed by the column ids
var rows = input.rows.map((row) => {
var obj = {};
row.forEach((column, idx) => {
obj[columns[idx]] = column;
});
return obj;
});
input.rows = rows;
console.log(input);
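After this runs, input.rows holds one object per row, keyed by the column ids:
[
  { Name: "John", Text: "xxxx" },
  { Name: "Alpha", Text: "yyyy" },
  { Name: "Beta", Text: "wwww" },
  { Name: "Gamma", Text: "zzzz" }
]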
Related
I have a problem with my JavaScript project.
I have an array with values like this:
exmpArr = ["PX1","PX2","PX3"];
and I want to loop over it and push the values into an object like this:
sections = [
{
rows: [
{ title: exmpArr[i], rowId: exmpArr[i] },
],
},
];
The final value must look like this:
sections = [
{
rows: [
{ title: "PX1", rowId: "PX1" },
{ title: "PX2", rowId: "PX2" },
{ title: "PX3", rowId: "PX3" },
],
},
];
What should I do?
What I did was put a for loop inside the object literal, and it does not work.
map returns a new array from the one you're mapping over. So you can immediately assign that array as the property value when you build your object.
const exmpArr = ['PX1', 'PX2', 'PX3'];
const sections = [
{
rows: exmpArr.map(el => {
return { title: el, rowId: el };
})
}
];
console.log(sections);
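If you'd rather keep the loop-and-push approach you started with, a plain loop works as well, for example:
const exmpArr = ['PX1', 'PX2', 'PX3'];
const sections = [{ rows: [] }];
for (const el of exmpArr) {
  // push one row object per array entry
  sections[0].rows.push({ title: el, rowId: el });
}
console.log(sections);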
I have a JSON tree structure that I want to normalize into something like a hashmap, and then denormalize back to a tree if needed.
I have a very dynamic tree that I want to use as state in my react-redux project, but for that I somehow need to transform the data so that I can access it without having to search elements recursively in the tree each time I want to update/access the state.
const actual = {
id: "1",
type: 'Container',
data: {},
items: [
{
id: "2",
type: "Subcontainer",
data: {
title: "A custom title",
text: "A random Headline"
},
items: []
},
{
id: "3",
type: "Subcontainer",
data: {
title: "A custom title",
text: "A random Headline"
},
items: []
}
]
};
Now I want to transform it into something like:
const expected = {
1: {
id: "1",
type: 'Container',
data: {},
items: ["2", "3"]
},
2: {
id: "2",
type: "Subcontainer",
data: {
title: "A custom title",
text: "A random Headline"
},
items: []
},
3: {
id: "3",
type: "Subcontainer",
data: {
title: "A custom title",
text: "A random Headline"
},
items: []
}
};
I found a JS lib called Normalizr, but I absolutely don't get how to create the schemas for it to work.
This was my last attempt; it returns only the two inner items, and only their data objects directly, without the surrounding id and items:
const data = new schema.Entity("data");
const item = new schema.Object({ data });
item.define({ items: new schema.Array(item) });
const items = new schema.Array(item);
const normalizedData = normalize(mock, items);
I'm not going to worry too much about the types, since you can alter those to meet your needs. Going off your example, I will define
interface Tree {
id: string;
type: string;
data: {
title?: string;
text?: string;
items: Tree[];
}
}
interface NormalizedTree {
[k: string]: {
id: string;
type: string;
data: {
title?: string;
text?: string;
items: string[]
}
}
}
and we want to implement function normalize(tree: Tree): NormalizedTree and function denormalize(norm: NormalizedTree): Tree.
The normalize() function is fairly straightforward since you can recursively walk the tree and collect the normalized trees into one big normalized tree:
function normalize(tree: Tree): NormalizedTree {
return Object.assign({
[tree.id]: {
...tree,
data: {
...tree.data,
items: tree.data.items.map(v => v.id)
}
},
}, ...tree.data.items.map(normalize));
}
In English, we are making a single normalized tree with a property with key tree.id and a value that's the same as tree except the data.items property is mapped to just the ids. And then we are mapping each element of data.items with normalize to get a list of normalized trees that we spread into that normalized tree via the Object.assign() method. Let's make sure it works:
const normalizedMock = normalize(mock);
console.log(normalizedMock);
/* {
"1": {
"id": "1",
"type": "Container",
"data": {
"items": [
"2",
"3"
]
}
},
"2": {
"id": "2",
"type": "Subcontainer",
"data": {
"title": "A custom title",
"text": "A random Headline",
"items": []
}
},
"3": {
"id": "3",
"type": "Subcontainer",
"data": {
"title": "A custom title",
"text": "A random Headline",
"items": []
}
}
} */
Looks good.
The denormalize() function is a little trickier, because we need to trust that the normalized tree is valid and actually represents a tree with a single root and no cycles. And we need to find and return that root. Here's one approach:
function denormalize(norm: NormalizedTree): Tree {
// make Trees with no children
const treeHash: Record<string, Tree> =
Object.fromEntries(Object.entries(norm).
map(([k, v]) => [k, { ...v, data: { ...v.data, items: [] } }])
);
// keep track of trees with no parents
const parentlessTrees =
Object.fromEntries(Object.entries(norm).map(([k, v]) => [k, true]));
Object.values(norm).forEach(v => {
// hook up children
treeHash[v.id].data.items = v.data.items.map(k => treeHash[k]);
// trees that are children do have parents, remove from parentlessTrees
v.data.items.forEach(k => delete parentlessTrees[k]);
})
const parentlessTreeIds = Object.keys(parentlessTrees);
if (parentlessTreeIds.length !== 1)
throw new Error("uh oh, there are " +
parentlessTreeIds.length +
" parentless trees, but there should be exactly 1");
return treeHash[parentlessTreeIds[0]];
}
In English... first we copy the normalized tree into a new treeHash object where all the data.items are empty. This will eventually hold our denormalized trees, but right now there are no children.
Then, in order to help us find the root, we make a set of all the ids of the trees, from which we will remove any ids corresponding to trees with parents. When we're all done, there should hopefully be a single id left, that of the root.
Then we start populating the children of treeHash's properties, by mapping the corresponding data.items array from the normalized tree to an array of properties of treeHash. And we remove all of these child ids from parentlessTrees.
Finally, parentlessTreeIds should contain exactly one id. If not, we have some kind of forest, or a cycle, and we throw an error. But assuming we do have a single parentless tree, we return it.
Let's test it out:
const reconstitutedMock = denormalize(normalizedMock);
console.log(reconstitutedMock);
/* {
"id": "1",
"type": "Container",
"data": {
"items": [
{
"id": "2",
"type": "Subcontainer",
"data": {
"title": "A custom title",
"text": "A random Headline",
"items": []
}
},
{
"id": "3",
"type": "Subcontainer",
"data": {
"title": "A custom title",
"text": "A random Headline",
"items": []
}
}
]
}
} */
Also looks good.
I would recommend .flatMap for this kind of transformation:
const flattenTree = element => [
  element,
  // recurse into the children and flatten the result
  ...element.data.items.flatMap(flattenTree)
]
This moves you from this shape:
{
  id: 1,
  data: { items: [
    {
      id: 2,
      data: { items: [
        { id: 3, data: { items: [] } },
      ] }
    }
  ] }
}
to this one:
[
{ id: 1, data: {...}},
{ id: 2, data: {...}},
{ id: 3, data: {...}},
]
Then once you have a flat array, you can transform it further to remove the references and create an object from entries:
const normalizedTree = element => {
let flat = flattenTree(element)
// only keep the id of each items:
// [{ id: 1, data:{...}}] -> [1]
// for (const el of flat) {
// el.data.items = el.data.items.map(child => child.id)
// }
// note that the for loop will change the initial data
// to preserve it you can achieve the same result with
// a map that will copy every elements:
const noRefs = flat.map(el => ({
...el,
data: {
...el.data,
items: el.data.items.map(child => child.id),
},
}))
// then if you need an object, 2 steps, get entries, [id, elem]:
const entries = noRefs.map(({ id, ...element }) => [id, element])
// then the built-in `Object.fromEntries` do all the work for you
// using the first part of the entry as key and last as value:
return Object.fromEntries(entries)
// if you have multiple objects with the same id's, only the last one
// will be in your returned object
}
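Putting it together, usage would look something like this (a small sketch, assuming a tree whose children live under data.items as in the shapes above):
const tree = {
  id: 1,
  data: { items: [
    { id: 2, data: { items: [
      { id: 3, data: { items: [] } },
    ] } },
  ] },
}
console.log(normalizedTree(tree))
// { '1': { data: { items: [2] } },
//   '2': { data: { items: [3] } },
//   '3': { data: { items: [] } } }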
I have an array containing 7 objects, all of them articles. I need to be able to show only the first 3 articles.
const myArray = [
{
id: "article_1",
type: "articles"
},
{
id: "article_2",
type: "articles"
},
{
id: "article_3",
type: "articles"
},
{
id: "article_4",
type: "articles"
},
{
id: "article_5",
type: "articles"
},
{
id: "article_6",
type: "articles"
},
{
id: "article_7",
type: "articles"
}
]
const filteredArticles = myArray.filter(article => myArray.length > 3)
console.log(filteredArticles)
Unfortunately, it returns an empty array instead of an array with articles.
I know a solution could be to use if(myArray.length > 3) { //show only index 0, 1, 2, 3 } else { // blah blah}
But I am trying to use the JS .filter method, which should easily hide the articles beyond the first 3.
Try
myArray.filter((_,i) => i < 3)
const myArray = [
{
id: "article_1",
type: "articles"
},
{
id: "article_2",
type: "articles"
},
{
id: "article_3",
type: "articles"
},
{
id: "article_4",
type: "articles"
},
{
id: "article_5",
type: "articles"
},
{
id: "article_6",
type: "articles"
},
{
id: "article_7",
type: "articles"
}
]
const filteredArticles = myArray.filter((_,i) => i < 3);
console.log(filteredArticles);
The second argument of the callback passed to filter is the current index. So you can do something like:
const filteredArticles = myArray.filter((article, i) => {
return i < 3;
});
That would be truthy for only the first three elements, so your filtered array would contain just the first three. Keep in mind that filter will still check the rest of the array, so there might be a more performant way of doing this.
EDIT: As some commenters have mentioned, .slice would be a better way of doing this, since you don't then have to iterate over the rest of the array as in the filter solution.
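For example, the same result with slice:
const firstThree = myArray.slice(0, 3);
console.log(firstThree); // the first three article objects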
I have an array of objects, each object is similar to:
{ word: 'intentional',
definition: 'done by intention or design',
type: 'adjective',
Synonyms: [ 'conscious', 'deliberate', 'intended', 'knowing', ] }
I am trying to convert the whole array into following json format:
{
"conscious": {
"data": ["done by intention or design"],
"type": "adjective",
"Synonym For": ["intentional"]
},
"deliberate": {
"data": ["done by intention or design"],
"type": "adjective",
"Synonym For": ["intentional"]
},
...
}
This json format is an input to another program, which I do not control.
I am running it on node.js.
How can I declare an object and then loop through the array to fill it as intended?
var obj = { word: 'intentional',
definition: 'done by intention or design',
type: 'adjective',
Synonyms: [ 'conscious', 'deliberate', 'intended', 'knowing' ] },
res = obj.Synonyms.reduce(function(s,a) {
s[a] = { data: [obj.definition], type: obj.type, 'Synonym For': [obj.word] };
return s;
}, {});
console.log(res);
var jsonObj = {};
wordArray.forEach((word) => {
word.Synonyms.forEach((synonym) => {
jsonObj[synonym] = {
data: [word.definition],
type: word.type,
'Synonym For': [word.word]
};
})
})
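If the same synonym can appear under more than one word, the arrays in the target format suggest accumulating rather than overwriting; here is a minimal sketch of that variant (still assuming the input array is called wordArray):
var jsonObj = {};
wordArray.forEach((word) => {
  word.Synonyms.forEach((synonym) => {
    // reuse the existing entry if this synonym was already seen
    var entry = jsonObj[synonym] || { data: [], type: word.type, 'Synonym For': [] };
    entry.data.push(word.definition);
    entry['Synonym For'].push(word.word);
    jsonObj[synonym] = entry;
  });
});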
I'm trying to handle JSON with a nested structure in ExtJS 4. Please do not answer like here,
because that is the wrong answer. I use expandData: true with model mappings, and it works really well for me.
The problem I am having is with one field that is an array of objects. So, here is my code sample:
Ext.define('EdiWebUI.model.Document', {
extend: 'Ext.data.Model',
fields: [
{name: 'document_header_documentReceiveDateTime', mapping: 'document.header.documentReceiveDateTime', type: 'string'},
{name: 'document_header_documentProcessDateTime', mapping: 'document.header.documentProcessDateTime', type: 'string'},
{name: 'document_header_documentID', mapping: 'document.header.documentID', type: 'string'},
...
{name: 'lines', type: 'auto'},
...
{name: 'attachments_documentFile_fileName', mapping: 'attachments.documentFile.fileName', type: 'string'},
{name: 'attachments_documentFile_content', mapping: 'attachments.documentFile.content', type: 'string'}
],
hasMany: [
{model: 'DocumentLines', name: 'lines', associationKey: 'lines'}
],
proxy: {
type: 'rest',
url: '/document',
reader: {
type: 'json',
root: 'data'
},
writer: {
expandData: true,
writeAllFields: true,
nameProperty: 'mapping'
}
}
});
Ext.define('DocumentLines',{
extend: 'Ext.data.Model',
fields: [
{'name': 'line_lineItem_lineNumber', mapping: 'line.lineItem.lineNumber', type: 'string'},
{'name': 'line_lineItem_orderedQuantity', mapping: 'line.lineItem.orderedQuantity', type: 'string'},
{'name': 'line_lineItem_orderedUnitPackSize', mapping: 'line.lineItem.orderedUnitPackSize', type: 'string'},
...
});
So, it works well when reading JSON like this:
{
"data": {
"document": {
"header": {
"documentReceiveDateTime": "2014-03-25T08:34:24",
"documentProcessDateTime": "2014-03-25T08:44:51",
"documentID": "83701540",
...,
"lines": [
{
"line": {
"lineItem": {
"lineNumber": "1",
"orderedQuantity": "5.000",
"orderedUnitPackSize": "1.000"
}
}
},
{
"line": {
"lineItem": {
"lineNumber": "2",
"orderedQuantity": "4.000",
"orderedUnitPackSize": "1.000"
}
}
}
]
...
but I can't make the writer parse the lines. When I try to save my document, I get output like this:
{ lines:
[ { line_lineItem_lineNumber: 1,
line_lineItem_ean: '4352345234523',
line_lineItem_orderedQuantity: '45'} ],
(other parts of document are expanded well)
So, here is the question: is there a way to make it work as I need?
...or should I resort to a trick on the server side (as I actually do now)?
Thanks in advance.
You have two choices here:
The proper way, which is to use the store's capabilities: define your own data writer and code your own function in order to get the JSON you want.
Don't use the store to update your records; create the JSON you want and use an Ajax request to update the records you need to update.
Both ways use Ajax anyway; the first one should be preferred.
I would define my writer in the same file as the store, something like:
Ext.define('MyApp.custom.Writer', {
    extend: 'Ext.data.writer.Json',
/*
* Formats the data for each record before sending it to the server.
* This method should be overridden to format the data in a way that differs from the default.
*/
getRecordData: function(record) {
var data = {};
/*
* Parse your record and give it whatever structure you need here..
*/
data.lines = [];
return data;
}
});
Although you seem to have one extra level of indirection in your JSON: the "lineItem" is not necessarily needed, since you already have a one-to-one relationship between line and lineItem, and between lineItem and the object it contains. But that is a different question.
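For illustration only, without that extra level each line would look something like this (a sketch of the flatter shape, not something your current mappings expect):
"lines": [
    {
        "line": {
            "lineNumber": "1",
            "orderedQuantity": "5.000",
            "orderedUnitPackSize": "1.000"
        }
    }
]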
I've used the answer above, but wanted to share the code to make it a little bit easier for people who try the same thing.
Dr. Leevsey's code from above worked for me, but had the disadvantage that it puts everything inside an array. For my project it worked better if it returned an object (with the child objects) and didn't return an array when the base object is not an array.
Here is the code:
Ext.define('MyApp.util.customWriter',
{
extend: 'Ext.data.writer.Json',
getRecordData: function (record, operation) {
var data = record;
var me = this;
var toObject = function (name, value) {
var o = {};
o[name] = value;
return o;
};
var itemsToObject = function (item) {
for (var prop in item) {
if (Array.isArray(item[prop])) {
me.getRecordData(item[prop]);
}
else {
if (item.hasOwnProperty(prop)) {
var nameParts = prop.split('.');
var j = nameParts.length - 1;
if (j > 0) {
var tempObj = item[prop];
for (; j > 0; j--) {
tempObj = toObject(nameParts[j], tempObj);
}
item[nameParts[0]] = item[nameParts[0]] || {};
Ext.Object.merge(item[nameParts[0]], tempObj);
delete item[prop];
}
}
}
}
};
if (!Array.isArray(data)) {
data = data.getData();
itemsToObject(data);
}
else {
var dataLength = data.length;
for (var i = 0; i < dataLength; i++) {
itemsToObject(data[i]);
}
}
return data;
}
});
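To actually use it, the model's proxy would reference this writer instead of the default JSON writer; a minimal sketch, assuming the class name defined above:
proxy: {
    type: 'rest',
    url: '/document',
    reader: {
        type: 'json',
        root: 'data'
    },
    // instantiate the custom writer directly, since the class declares no alias
    writer: Ext.create('MyApp.util.customWriter', {
        writeAllFields: true
    })
}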