This is the result I want to achieve:
dataset: [
dataset: [
{
seriesname: "",
data: [
{
value: "123",
},
{
value: "123",
},
]
},
]
]
My problem right now is that the second dataset gets duplicated.
This is how I am setting it (val is an integer and allYears is an array of integers):
this.grphColumn.dataSource.dataset[0].dataset = this.allYears.map(el => {
return {
seriesname: "Planned",
data: [{value: val}, {value: val}]
}
});
How can I make it so the dataset doesn't get duplicated?
You have to map the values separately if you don't want the seriesname to be repeated:
const yearsMap = this.allYears.map((el) => { return { value: el } });
this.grphColumn.dataSource.dataset[0].dataset = [{
  seriesname: "Planned",
  data: yearsMap
}];
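Assuming allYears is, say, [2019, 2020] (the years here are only for illustration), dataset[0].dataset then ends up with the nested shape from your question, with the seriesname appearing only once:

[
  {
    seriesname: "Planned",
    data: [{ value: 2019 }, { value: 2020 }]
  }
]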
I have a complex data structure with multiple nested arrays.
Below is the current structure:
var contentData = {
data: {
content: [
{
type: "column",
sections: [
{
sub: [
{
type: "heading-1",
text: "Heading Text"
}
]
}
]
},
{
type: "acc-item",
sections: [
{
sub: [
{
type: "heading-1",
text: "Heading Text"
},
{
type: "ordered-item",
text: "Item 1"
},
{
type: "unordered-item",
text: "Item 2"
}
]
}
]
},
{
type: "acc-item",
sections: [
{
sub: [
{
type: "heading-1",
text: "Heading Text 2"
}
]
}
]
}
]
}
}
What I want is:
I want to group all the ordered-item & unordered-item entries into a new object like {type: 'list', items: [all list items]}.
I also need to extract all the items inside sub, push them into a new embedded property, and place that at the root level, like below:
{type:"acc-item",embedded:[{type:"heading-1",text:"Heading Text 2"}]};
What I've done so far:
I am able to group the acc-item entries, but not the ordered-item & unordered-item ones.
So my final expected result should look like this:
[{
"type": "column",
"embedded": [
{
"type": "heading-1",
"text": "Heading Text"
}
]
},
{
"type": "acc-group",
"items": [
{
"type": "acc-item",
"embedded": [
{
"type": "heading-1",
"text": "Heading Text"
},
{
"type": "list",
"items": [
{
"type": "ordered-item",
"text": "Item 1"
},
{
"type": "unordered-item",
"text": "Item 2"
}
]
}
]
},
{
"type": "acc-item",
"embedded": [
{
"type": "heading-1",
"text": "Heading Text 2"
}
]
}
]
}]
Below is my code:
var group,contentData={data:{content:[{type:"column",sections:[{sub:[{type:"heading-1",text:"Heading Text"}]}]},{type:"acc-item",sections:[{sub:[{type:"heading-1",text:"Heading Text"},{type:"ordered-item",text:"Item 1"},{type:"unordered-item",text:"Item 2"}]}]},{type:"acc-item",sections:[{sub:[{type:"heading-1",text:"Heading Text 2"}]}]}]}},types=[["list",["ordered-item","unordered-item"]],["accordion",["acc-item"]]];
var result = contentData.data.content.reduce((r, o) => {
var type = (types.find(({ 1: values }) => values.indexOf(o.type) > -1)|| {})[0];
if (!type) {
r.push(o);
group = undefined;
return r;
}
if (!group || group.type !== type) {
group = { type, items: [] };
r.push(group);
}
group.items.push(o);
return r;
}, []);
document.body.innerHTML = '<pre>' + JSON.stringify(result, null, ' ') + '</pre>';
You could store the last items array as well as the last embedded array and use them until a column type is found.
var contentData = { data: { content: [{ type: "column", sections: [{ sub: [{ type: "heading-1", text: "Heading Text" }] }] }, { type: "acc-item", sections: [{ sub: [{ type: "heading-1", text: "Heading Text" }, { type: "ordered-item", text: "Item 1" }, { type: "unordered-item", text: "Item 2" }] }] }, { type: "acc-item", sections: [{ sub: [{ type: "heading-1", text: "Heading Text 2" }] }] }] } },
list = ["ordered-item", "unordered-item"],
lastItems, lastEmbedded,
result = contentData.data.content.reduce((r, { type, sections }) => {
if (type === 'column') {
r.push({ type, embedded: sections.reduce((q, { sub }) => q.concat(sub), []) });
lastItems = undefined;
lastEmbedded = undefined;
return r;
}
if (!lastItems) r.push({ type: "acc-group", items: lastItems = [] });
lastItems.push(...sections.map(({ sub }) => ({
type,
embedded: sub.reduce((q, o) => {
if (list.includes(o.type)) {
if (!lastEmbedded) q.push({ type: 'list', items: lastEmbedded = [] });
lastEmbedded.push(o);
} else {
q.push(o);
lastEmbedded = undefined;
}
return q;
}, [])
})));
return r;
}, []);
console.log(result);
The Array.prototype and Object.prototype methods are perfect for this kind of thing.
And you're right, this is a fairly complicated bit of logic.
I would suggest that you definitely need some unit tests for this, and that you try to break it into separate pieces.
Here's how I'm thinking I'd do it.
1. Group by the type to create your groups.
I'm actually creating a more generic solution than you've asked for here. That is, I'm not just grouping the 'acc-item', but everything.
A quick search for 'array group by javascript' suggests using Array.reduce, so let's do that.
const groupedData = contentData.data.content.reduce((acc, cur) => {
//Check if this indexed array already exists, if not create it.
const currentArray = (acc[`${cur.type}-group`] && acc[`${cur.type}-group`].items) || [];
return {
...acc,
[`${cur.type}-group`]: {
type: `${cur.type}-group`,
items: [...currentArray, cur]
}
}
}, {});
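With the sample contentData above, this first pass produces an object keyed by group name, roughly:

{
  "column-group": {
    type: "column-group",
    items: [ /* the original column item */ ]
  },
  "acc-item-group": {
    type: "acc-item-group",
    items: [ /* both original acc-item items */ ]
  }
}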
2. Now for each of those items, we need to look at their subs, and group just the list items.
To do this, we basically want to find all the item -> sections -> sub types and filter them into two arrays, which is the classic partition pattern.
First though, we need to flatten that sections -> sub nesting, so let's just do that.
function flattenSectionsAndSubs(item) {
return {
type: item.type,
subs: item.sections.reduce((acc, cur) => ([...acc, ...cur.sub]), [])
};
}
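Applied to the second content item (the first acc-item), for example, this gives:

{
  type: "acc-item",
  subs: [
    { type: "heading-1", text: "Heading Text" },
    { type: "ordered-item", text: "Item 1" },
    { type: "unordered-item", text: "Item 2" }
  ]
}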
And I'll just copy paste that partition function in:
function partition(array, isValid) {
return array.reduce(([pass, fail], elem) => {
return isValid(elem) ? [[...pass, elem], fail] : [pass, [...fail, elem]];
}, [[], []]);
}
const listTypes = ['ordered-item', 'unordered-item'];
function createEmbeddedFromItem(item) {
const [lists, nonLists] = partition(item.subs, (v) => listTypes.includes(v.type));
return {
type: item.type,
embedded: [
...nonLists,
{
type: "list",
items: lists
}
]
}
}
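Applied to that flattened acc-item, this produces:

{
  type: "acc-item",
  embedded: [
    { type: "heading-1", text: "Heading Text" },
    {
      type: "list",
      items: [
        { type: "ordered-item", text: "Item 1" },
        { type: "unordered-item", text: "Item 2" }
      ]
    }
  ]
}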
Putting this all together, we get:
const contentData = {
data: {
content: [{
type: "column",
sections: [{
sub: [{
type: "heading-1",
text: "Heading Text"
}]
}]
},
{
type: "acc-item",
sections: [{
sub: [{
type: "heading-1",
text: "Heading Text"
},
{
type: "ordered-item",
text: "Item 1"
},
{
type: "unordered-item",
text: "Item 2"
}
]
}]
},
{
type: "acc-item",
sections: [{
sub: [{
type: "heading-1",
text: "Heading Text 2"
}]
}]
}
]
}
}
function partition(array, isValid) {
return array.reduce(([pass, fail], elem) => {
return isValid(elem) ? [
[...pass, elem], fail
] : [pass, [...fail, elem]];
}, [
[],
[]
]);
}
function flattenSectionsAndSubs(item) {
return {
type: item.type,
subs: item.sections.reduce((acc, cur) => ([...acc, ...cur.sub]), [])
};
}
const listTypes = ['ordered-item', 'unordered-item'];
function createEmbeddedFromItem(item) {
const [lists, nonLists] = partition(item.subs, (v) => listTypes.includes(v.type));
return {
type: item.type,
embedded: [
...nonLists,
{
type: "list",
items: lists
}
]
}
}
const groupedData = contentData.data.content.reduce((acc, cur) => {
//Check if this indexed array already exists, if not create it.
const currentArray = (acc[`${cur.type}-group`] && acc[`${cur.type}-group`].items) || [];
const flattenedItem = flattenSectionsAndSubs(cur);
const embeddedItem = createEmbeddedFromItem(flattenedItem);
return {
...acc,
[`${cur.type}-group`]: {
type: `${cur.type}-group`,
items: [...currentArray, embeddedItem]
}
}
}, {});
console.log(groupedData);
Now this doesn't exactly match what you've asked for, but it should probably work.
You can add your own bits to only include the list entry when the array isn't empty, and to stop the column from ending up in its own group (a rough sketch of both tweaks follows below).
The thing is, to be honest it seems like a bit of a red flag that you would create an array of items that don't have matching structures, which is why I've done it this way.
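For instance, a rough sketch of those two tweaks (an outline only, not a drop-in replacement):

// Only emit the "list" wrapper when there actually are list items.
function createEmbeddedFromItem(item) {
  const [lists, nonLists] = partition(item.subs, (v) => listTypes.includes(v.type));
  const embedded = lists.length > 0
    ? [...nonLists, { type: "list", items: lists }]
    : nonLists;
  return { type: item.type, embedded };
}

// Keep columns at the root instead of putting them in their own group:
// split them out before running the grouping reduce.
const [columns, rest] = partition(contentData.data.content, (c) => c.type === "column");
const rootColumns = columns.map((c) => createEmbeddedFromItem(flattenSectionsAndSubs(c)));
// ...then run the grouping reduce over `rest` instead of contentData.data.content,
// and concatenate rootColumns with the resulting groups.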
I have an array of objects, each object is similar to:
{ word: 'intentional',
definition: 'done by intention or design',
type: 'adjective',
Synonyms: [ 'conscious', 'deliberate', 'intended', 'knowing', ] }
I am trying to convert the whole array into the following JSON format:
{
"conscious": {
"data": ["done by intention or design"],
"type": "adjective",
"Synonym For": ["intentional"]
},
"deliberate": {
"data": ["done by intention or design"],
"type": "adjective",
"Synonym For": ["intentional"]
},
...
}
This json format is an input to another program, which I do not control.
I am running it on node.js.
How can I declare an object and then loop through the array to fill it as intended?
var obj = { word: 'intentional',
definition: 'done by intention or design',
type: 'adjective',
Synonyms: [ 'conscious', 'deliberate', 'intended', 'knowing' ] },
res = obj.Synonyms.reduce(function(s,a) {
s[a] = { data: [obj.definition], type: obj.type, "Synonym For": [obj.word] };
return s;
}, {});
console.log(res);
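For the sample object, res comes out as:

{
  conscious: { data: [ 'done by intention or design' ], type: 'adjective', 'Synonym For': [ 'intentional' ] },
  deliberate: { data: [ 'done by intention or design' ], type: 'adjective', 'Synonym For': [ 'intentional' ] },
  intended: { data: [ 'done by intention or design' ], type: 'adjective', 'Synonym For': [ 'intentional' ] },
  knowing: { data: [ 'done by intention or design' ], type: 'adjective', 'Synonym For': [ 'intentional' ] }
}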
var jsonObj = {};
wordArray.forEach((word) => {
word.Synonyms.forEach((synonym) => {
jsonObj[synonym] = {
data: [word.definition],
type: word.type,
'Synonym For': [word.word]
};
})
})
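Note that both snippets overwrite the entry when the same synonym shows up under more than one word. If you need the entries merged instead, a sketch along these lines would do it (assuming wordArray is your full array of word objects):

var jsonObj = {};
wordArray.forEach((word) => {
  word.Synonyms.forEach((synonym) => {
    // Reuse the entry if this synonym was already added for another word.
    var entry = jsonObj[synonym] || { data: [], type: word.type, 'Synonym For': [] };
    entry.data.push(word.definition);
    entry['Synonym For'].push(word.word);
    jsonObj[synonym] = entry;
  });
});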
I have my data as following:
{
meta: {
format: "csv",
info: "desc",
columns: [
{
id: "Name",
type: "Text",
length: 32
},
{
id: "Text",
type: "Text",
length: 128
}]
},
rows: [
["John","xxxx"],
["Alpha","yyyy"],
["Beta","wwww"],
["Gamma","zzzz"]]
}
Now, I am struggling to map the records to a Table control as columns and rows. The columns seem straightforward, a straight map, but since the rows lack a mapping to the columns, I wonder what the simplest way would be.
Approach steps:
Make a keys[] from the id of each columns record.
Traverse the rows[].
On each loop, for every key, create an object entry as {keys[j]: row[k]}.
Push it to an array.
Recreate the original JSON, but replace the rows arrays with the objects.
I am really struggling to translate this into code, especially parsing rows[] and creating the objects. Is there (I am sure there must be) an efficient way to achieve this?
Here is what you could do, using Array.map and forEach.
var input = {
meta: {
format: "csv",
info: "desc",
columns: [{
id: "Name",
type: "Text",
length: 32
}, {
id: "Text",
type: "Text",
length: 128
}]
},
rows: [
["John", "xxxx"],
["Alpha", "yyyy"],
["Beta", "wwww"],
["Gamma", "zzzz"]
]
};
var columns = input.meta.columns.map((column) => {
return column.id
});
var rows = input.rows.map((row) => {
var obj = {};
row.forEach((column, idx) => {
obj[columns[idx]] = column;
});
return obj;
});
input.rows = rows;
console.log(input);
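After the assignment, input.rows looks like this:

[
  { Name: "John", Text: "xxxx" },
  { Name: "Alpha", Text: "yyyy" },
  { Name: "Beta", Text: "wwww" },
  { Name: "Gamma", Text: "zzzz" }
]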
I'm working on a PrestaShop page with the file extension ".tpl". The JavaScript code for the autocomplete looks like this:
var currencies = [
{ value: 'Afghan afghani', data: 'AFN' },
{ value: 'Albanian lek', data: 'ALL' },
{ value: 'Algerian dinar', data: 'DZD' },
{ value: 'European euro', data: 'EUR' },
{ value: 'Angolan kwanza', data: 'AOA' },
{ value: 'East Caribbean dollar', data: 'XCD' },
{ value: 'Vietnamese dong', data: 'VND' },
{ value: 'Yemeni rial', data: 'YER' },
{ value: 'Zambian kwacha', data: 'ZMK' },
{ value: 'Zimbabwean dollar', data: 'ZWD' },];
I also already have a foreach like the example below:
{foreach from=$currencies item=currency}
{$currency.name}
{$currency.code}
{/foreach}
How can I output the currencies with the foreach? I tried this code:
var currencies = [
{foreach from=$currencies item=currency}
{ value: '{$currency.name}', data: '{$currency.code}' },
{/foreach},];
http://i.stack.imgur.com/DhYgL.jpg
You can use json_encode to output a PHP array to JavaScript.
This is the JavaScript code in the TPL:
var currencies = JSON.parse('{$currencies|json_encode}');
{$currencies|json_encode} will output a JSON string along these lines (the exact keys depend on the fields of each entry in $currencies; the foreach above suggests at least name and code):
[{"name":"Afghan afghani","code":"AFN"},
{"name":"Albanian lek","code":"ALL"},
{"name":"Algerian dinar","code":"DZD"}, ...]
This output is passed to JSON.parse, which turns the string into a JavaScript array of objects.
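Since the parsed objects keep the PHP keys (name and code here) rather than the value/data keys your autocomplete array uses, you may still want a small mapping step, for example:

var currencies = JSON.parse('{$currencies|json_encode}').map(function (c) {
  // Rename the PrestaShop fields to the shape the autocomplete expects.
  return { value: c.name, data: c.code };
});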
var newArray = [];
for (var i=0; i < currencies.length; i++) {
newArray.push({value: whatever, data: whateverVar})
}
I am still not sure exactly what you want, but this is the best I can offer based on my understanding.