I'm currently struggling with a Redux reducer.
//backend response
const response = {
data: {
results: {
222: {
items: ['id1', 'id3']
},
333: {
items: ['id2', 'id4', 'id999 (UNKNOWN)']
}
}
}
};
//currently saved in redux state
const stateItems = [
{
id: 'id1',
name: 'item ONE'
}, {
id: 'id2',
name: 'item TWO'
}, {
id: 'id3',
name: 'item THREE'
}, {
id: 'id4',
name: 'item FOUR'
}, {
id: 'id5',
name: 'item FIVE (UNUSED)'
}, {
id: 'id6',
name: 'item SIX (UNUSED)'
}
];
//converting items: ['ids'] => items: [{id: 'id', name: 'itemName'}]
const result = Object.values(response.data.results).map((keys, index, array) => {
keys.items = keys.items.map(itemId => {
return stateItems[stateItems.findIndex(x => x.id === itemId)];
});
return response.data.results;
});
//final result should be:
const expectedFinalResult = {
222: {items: [{id: 'id1', name: 'item ONE'}, {id: 'id3', name: 'item THREE'}]},
333: {items: [{id: 'id2', name: 'item TWO'}, {id: 'id4', name: 'item FOUR'}]}
};
//both should be equal:
console.log(JSON.stringify(expectedFinalResult));
console.log(JSON.stringify(result));
console.log('same result: ' + (JSON.stringify(result) === JSON.stringify(expectedFinalResult)));
I've run out of ideas for how to realize it. The UNUSED and UNKNOWN entries should be filtered out as well, so that the final result in this example looks exactly like expectedFinalResult. Currently result comes back with the wrong shape.
Hopefully someone has a better idea or a better approach.
Thank you
You're close. You can use Object.entries rather than Object.values, destructuring each entry to pick out the key ('222', '333') and the value object's items array, then use that array to filter stateItems and produce the items array for each entry in the result:
const result = {};
for (const [key, {items}] of Object.entries(response.data.results)) {
result[key] = {
items: stateItems.filter(item => items.includes(item.id))
};
}
Live Example:
//backend response
const response = {
data: {
results: {
222: {
items: ['id1', 'id3']
},
333: {
items: ['id2', 'id4', 'id999 (UNKNOWN)']
}
}
}
};
//currently saved in redux state
const stateItems = [
{
id: 'id1',
name: 'item ONE'
}, {
id: 'id2',
name: 'item TWO'
}, {
id: 'id3',
name: 'item THREE'
}, {
id: 'id4',
name: 'item FOUR'
}, {
id: 'id5',
name: 'item FIVE (UNUSED)'
}, {
id: 'id6',
name: 'item SIX (UNUSED)'
}
];
const result = {};
for (const [key, {items}] of Object.entries(response.data.results)) {
result[key] = {
items: stateItems.filter(item => items.includes(item.id))
};
}
//final result should be:
const expectedFinalResult = {
222: {items: [{id: 'id1', name: 'item ONE'}, {id: 'id3', name: 'item THREE'}]},
333: {items: [{id: 'id2', name: 'item TWO'}, {id: 'id4', name: 'item FOUR'}]}
};
//both should be equal:
console.log(JSON.stringify(result, null, 4));
That makes multiple passes through stateItems. If it or response.data.results is really, really big (hundreds of thousands of entries), it may be worthwhile to build a Map of the stateItems by id instead:
// Create map of state items (only once each time stateItems changes):
const stateItemMap = new Map(stateItems.map(item => [item.id, item]));
// Map results (each time you get results):
const result = {};
for (const [key, {items}] of Object.entries(response.data.results)) {
result[key] = {
items: items.map(id => stateItemMap.get(id)).filter(Boolean) // drop ids with no matching state item
};
}
Live Example:
//backend response
const response = {
data: {
results: {
222: {
items: ['id1', 'id3']
},
333: {
items: ['id2', 'id4', 'id999 (UNKNOWN)']
}
}
}
};
//currently saved in redux state
const stateItems = [
{
id: 'id1',
name: 'item ONE'
}, {
id: 'id2',
name: 'item TWO'
}, {
id: 'id3',
name: 'item THREE'
}, {
id: 'id4',
name: 'item FOUR'
}, {
id: 'id5',
name: 'item FIVE (UNUSED)'
}, {
id: 'id6',
name: 'item SIX (UNUSED)'
}
];
// Create map of state items (only once each time stateItems changes):
const stateItemMap = new Map(stateItems.map(item => [item.id, item]));
// Map results (each time you get results):
const result = {};
for (const [key, {items}] of Object.entries(response.data.results)) {
result[key] = {
items: items.map(id => stateItemMap.get(id)).filter(Boolean) // drop ids with no matching state item
};
}
//final result should be:
const expectedFinalResult = {
222: {items: [{id: 'id1', name: 'item ONE'}, {id: 'id3', name: 'item THREE'}]},
333: {items: [{id: 'id2', name: 'item TWO'}, {id: 'id4', name: 'item FOUR'}]}
};
//both should be equal:
console.log(JSON.stringify(result, null, 4));
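As an aside, the comment above says to build that Map only once each time stateItems changes. A minimal sketch of one way to do that in plain JavaScript, with no extra library (getStateItemMap is just an assumed helper name, not something from your code):
// Hypothetical memoized helper: rebuilds the Map only when the stateItems reference changes.
let lastStateItems = null;
let lastStateItemMap = null;
function getStateItemMap(stateItems) {
    if (stateItems !== lastStateItems) {
        lastStateItems = stateItems;
        lastStateItemMap = new Map(stateItems.map(item => [item.id, item]));
    }
    return lastStateItemMap;
}
Because Redux state is replaced immutably, a plain reference check is enough to know when stateItems has changed.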
You could do it with reduce:
const result = Object.entries(response.data.results)
    .reduce((acc, [key, { items }]) => ({
        ...acc,
        [key]: { // use the dynamic key to add this entry to the accumulated object
            items: items
                .map(itemId => stateItems.find(x => x.id === itemId)) // use find directly instead of findIndex plus index access
                .filter(Boolean) // skip entries with no matching state item
        }
    }), {});
Related
First I must say sorry if this question is already answered, but I have not found the answer I am looking for :(
I have an array of objects with unknown nesting depth (it can be 20-30 levels or even more), and I want to filter it by its 'name' property based on an input field value.
public nestedArray = [
{id: 1, name: 'Example_1', children: []},
{id: 2, name: 'Test', children: []},
{id: 3, name: 'Test Name', children: [
{id: 10, name: 'Child name', children: [
{id: 20, name: 'Example_14', children: []},
{id: 30, name: 'Last Child', children: []}
]
}
]
}
];
The result I want to receive is a flat, one-level-deep array of objects whose 'name' field includes the input value.
For example my input value is 'am', so the result would be:
resultsArray = [
{id: 1, name: 'Example_1'},
{id: 3, name: 'Test Name'},
{id: 10, name: 'Child name'},
{id: 20, name: 'Example_14'}
];
There is no problem doing it on the first level, like this:
public filter(array: any[], input_value: string): void {
    array = array.filter(el => {
        return el.name.toLowerCase().includes(input_value.toLowerCase());
    });
}
Thanks in advance!
You could map the array and its children recursively and take a flat result containing only the objects whose name property matches the string.
const
find = value => ({ children, ...o }) => [
...(o.name.includes(value) ? [o] : []),
...children.flatMap(find(value))
],
data = [{ id: 1, name: 'Example_1', children: [] }, { id: 2, name: 'Test', children: [] }, { id: 3, name: 'Test Name', children: [{ id: 10, name: 'Child name', children: [{ id: 20, name: 'Example_14', children: [] }, { id: 30, name: 'Last Child', children: [] }] }] }],
result = data.flatMap(find('am'));
console.log(result);
Another solution, with a single result array and a classic recursive approach:
const
find = (array, value) => {
const
iter = array => {
for (const { children, ...o } of array) {
if (o.name.includes(value)) result.push(o);
iter(children);
}
},
result = [];
iter(array);
return result;
},
data = [{ id: 1, name: 'Example_1', children: [] }, { id: 2, name: 'Test', children: [] }, { id: 3, name: 'Test Name', children: [{ id: 10, name: 'Child name', children: [{ id: 20, name: 'Example_14', children: [] }, { id: 30, name: 'Last Child', children: [] }] }] }],
result = find(data, 'am');
console.log(result);
There is one scenario where I need to replace existing records in cached data with a new incoming data source. I'm looking for a cleaner approach to handle the array operations.
For example:
var userCategory = [
{
id: 'platinum',
name: 'bob',
},
{
id: 'platinum',
name: 'bar',
},
{
id: 'platinum',
name: 'foo',
},
{
id: 'gold',
name: 'tom',
},
{
id: 'silver',
name: 'billy',
},
];
Here are the new users for a particular category:
var newPlatinumUsers = [
{
id: 'platinum',
name: 'bob',
},
{
id: 'platinum',
name: 'mike',
},
];
This is the expected result:
var expected = [
{
id: 'platinum',
name: 'bob',
},
{
id: 'platinum',
name: 'mike',
},
{
id: 'gold',
name: 'tom',
},
{
id: 'silver',
name: 'billy',
},
];
I tried filtering all the platinum users out of the existing records and then adding the new records, but it looks verbose; a sketch of what I mean is below.
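A minimal sketch of that filter-then-add version, assumed since I haven't pasted the exact code (the hard-coded 'platinum' id is the part that feels verbose and brittle):
// Drop every existing platinum user, then prepend the new platinum users.
const withoutPlatinum = userCategory.filter(user => user.id !== 'platinum');
const merged = [...newPlatinumUsers, ...withoutPlatinum];
console.log(merged);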
Is there a cleaner approach, maybe something from lodash?
Thanks for your time!!!
Maybe you are looking for this:
function getUnique(arr){
// removing duplicate
let uniqueArr = [...new Set(arr)];
document.write(uniqueArr);
}
const array = ['acer','HP','Apple','Apple','something'];
// calling the function
getUnique(array);
Please accept my answer if it helps you.
Please find a plain JavaScript implementation of the same below:
var userCategory = [
{ id: 'platinum', name: 'bob', },
{ id: 'platinum', name: 'bar', },
{ id: 'platinum', name: 'foo', },
{ id: 'gold', name: 'tom', },
{ id: 'silver', name: 'billy', },
];
var newPlatinumUsers = [
{ id: 'platinum', name: 'bob', },
{ id: 'platinum', name: 'mike', },
];
const result = [...newPlatinumUsers];
userCategory.forEach((node) => {
if(node.id !== 'platinum') {
result.push(node);
}
});
console.log(result);
With this solution you can change more than one category:
var userCategory = [
{id: 'platinum',name: 'bob'},
{id: 'platinum',name: 'bar'},
{id: 'platinum',name: 'foo'},
{id: 'gold',name: 'tom'},
{id: 'silver',name: 'billy'},
];
var newUsers = [
{id: 'platinum',name: 'bob'},
{id: 'platinum',name: 'mike'},
{id: 'gold',name: 'will'},
{id: 'gold',name: 'jerry'},
];
const idsToReplace = {}
const result = [...newUsers]
result.forEach(u => {
idsToReplace[u.id] = true
})
userCategory.forEach(u => {
if(!idsToReplace[u.id]){
result.push(u)
}
})
console.log(result)
I have an Order which hasMany Items.
Each Item belongs to one Order.
The goal is to have a query that returns previous values for one column of the Item as an array whose length matches a supplied argument.
So if I have:
{
orders: [
{
orderDate: 1,
items: [
{
itemName: 'Item 1',
orderAmount: 2,
itemId: '1a'
},
{
itemName: 'Item 2',
orderAmount: 7,
itemId: '1b'
}
]
},
{
orderDate: '2',
items: [
{
itemName: 'Item 1',
orderAmount: 3,
itemId: '1a'
},
{
itemName: 'Item 2',
orderAmount: 6,
itemId: '1b'
}
]
},
{
orderDate: '3',
items: [
{
itemName: 'Item 1',
orderAmount: 4,
itemId: '1a'
},
{
itemName: 'Item 2',
orderAmount: 5,
itemId: '1b'
}
]
}
]
}
What I would like is a query that, when it includes previousOrders(count: 2), returns...
{
    orderDate: '3',
    items: [
        {
            itemName: 'Item 1',
            orderAmount: 4,
            itemId: '1a',
            previousOrders: [3, 2]
        },
        {
            itemName: 'Item 2',
            orderAmount: 5,
            itemId: '1b',
            previousOrders: [6, 7]
        }
    ]
}
What I currently have working is...
previousOrders: async (item, { count }, { models }) => {
const previous = await models.Item.findAll({
attributes: ['id', 'orderAmount'],
where: {
itemId: item.itemId
},
include: [
{
attributes: [],
model: models.Order
}
],
order: [[models.Order, 'orderDate', 'desc']],
raw: true
})
const index = previous.findIndex(x => x.id === item.id)
const sliced = previous.slice(index + 1, index + count + 1)
const array = sliced.map(a => a.orderAmount)
while (array.length < count) {
array.push(0)
}
    return array
}
This works, but I feel like there is probably a much nicer way to do it at the query level, and I just want to see other possibilities.
I want to get values from one array of objects (with keys and values) into another array of objects that refers to the same keys.
const array1 = [{
key1: 7,
key2: 1,
key3: 37,
}];
const array2 = [
{
title: 'Some Title 1',
key: 'key1',
number: '',
icon: require('../../assets/some2.png')
},
{
title: 'Some Title 2',
key: 'key2',
number: '',
icon: require('../../assets/some1.png')
},
{
title: 'Some Title 3',
key: 'key3',
number: '',
icon: require('../../assets/some3.png')
},
];
I have tried using Object.keys to get all the keys from array1 object.
const keys = Object.keys(array1[0]);
keys.map((key) => {
if (array2[key] === key) {
// console.log('card detail matching');
// add to the array 2 with value
}
})
but after a point it doesn't make sense.
Expected array
const resultArray = [
{
title: 'Some Title 1',
key: 'key1',
number: 7,
icon: require('../../assets/some2.png')
},
{
title: 'Some Title 2',
key: 'key2',
number: 1,
icon: require('../../assets/some1.png')
},
{
title: 'Some Title 3',
key: 'key3',
number: 37,
icon: require('../../assets/some3.png')
}
]
I expect each value from array1 to end up in the 'number' property of the matching object in array2.
You could map a new array, using each object's key property as an accessor into the first array's object.
const
array1 = [{ key1: 7, key2: 1, key3: 37 }],
array2 = [{ title: 'Some Title 1', key: 'key1', number: '', icon: '../../assets/some2.png' }, { title: 'Some Title 2', key: 'key2', number: '', icon: '../../assets/some1.png' }, { title: 'Some Title 3', key: 'key3', number: '', icon:'../../assets/some3.png' }],
result = array2.map(o => Object.assign({}, o, { number: array1[0][o.key] }));
console.log(result);
The code below will help you:
const array1 = {
key1: 7,
key2: 1,
key3: 37,
};
const array2 = [
{
title: 'Some Title 1',
key: 'key1',
number: '',
icon: require('../../assets/some2.png')
},
{
title: 'Some Title 2',
key: 'key2',
number: '',
icon: require('../../assets/some1.png')
},
{
title: 'Some Title 3',
key: 'key3',
number: '',
icon: require('../../assets/some3.png')
},
];
array2.forEach(item=>{
item.number=array1[item.key]
})
You can iterate through each object of array2 and add the number value fetched from array1:
const array1 = [{
key1: 7,
key2: 1,
key3: 37,
}];
const array2 = [
{
title: 'Some Title 1',
key: 'key1',
number: '',
icon: '../../assets/some2.png'
},
{
title: 'Some Title 2',
key: 'key2',
number: '',
icon: '../../assets/some1.png'
},
{
title: 'Some Title 3',
key: 'key3',
number: '',
icon: '../../assets/some3.png'
},
];
array2.forEach(e => e.number = array1[0][e.key]);
console.log(array2)
const array1 = [{
key1: 7,
key2: 1,
}];
const array2 = [
{
title: 'Some Title 1',
key: 'key1',
number: '',
icon: '../../assets/some2.png'
},
{
title: 'Some Title 2',
key: 'key2',
number: '',
icon: '../../assets/some1.png'
},
{
title: 'Some Title 3',
key: 'key3',
number: '',
icon: '../../assets/some3.png'
},
];
const resultArray = array2.filter(item => array1[0][item.key]);
console.log(resultArray);
You can use filter to keep only the objects whose key exists in array1.
This should work for you.
const keys = [
{
key1: 7,
key2: 1,
key3: 37,
},
{
key4: 7,
key5: 1,
key6: 37,
}
];
const array2 = [
{
title: 'Some Title 1',
key: 'key4',
number: ''
},
{
title: 'Some Title 2',
key: 'key2',
number: ''
},
{
title: 'Some Title 3',
key: 'key3',
number: ''
}
];
function populateArrayData (arr, propToCompare, propToReplace, keysObj) {
let populatedArray = [];
if (Array.isArray(arr)) {
populatedArray = arr.map((item) => {
if (checkIfKeyExists(item[propToCompare], keysObj)) {
item[propToReplace] = keysObj[item[propToCompare]];
}
return item;
});
}
return populatedArray;
}
function flattenAllKeys (keys) {
let flattenedKeysObj = {};
if (Array.isArray(keys)) {
flattenedKeysObj = keys.reduce((acc, keysObj) => {
acc = {...acc, ...keysObj};
return acc;
}, {});
}
return flattenedKeysObj;
}
function checkIfKeyExists(key, keysObj) {
return (keysObj[key]!== undefined && keysObj[key]!== null);
}
let flattenedKeys = flattenAllKeys(keys);
console.log(populateArrayData(array2, 'key', 'number', flattenedKeys));
I have two separate arrays of objects that I need to merge based on whether a specific key value matches. It might make more sense after looking at the data:
Array 1
let categories = [
{ id: 5, slug: 'category-5', items: [] },
{ id: 4, slug: 'category-4', items: [] },
{ id: 3, slug: 'category-3', items: [] },
]
Array 2
let items = [
{ id: 5, data: [{ title: 'item title', description: 'item description' }] },
{ id: 5, data: [{ title: 'item title 2', description: 'item description 2' }] },
{ id: 4, data: [{ title: 'item title 4', description: 'item description 4' }] },
]
Expected Output
let mergedOutput = [
{ id: 5, slug: 'category-5',
items: [
{ title: 'item title', description: 'item description' },
{ title: 'item title 2', description: 'item description 2' }
]
},
{ id: 4, slug: 'category-4',
items: [
{ title: 'item title 4', description: 'item description 4' },
]
},
{ id: 3, slug: 'category-3', items: [] },
]
So... I need to merge Array 2 into Array 1 where their ids match.
Array 1 stays the same, but wherever Array 2 matches, the (empty) items property of Array 1 is filled from the data property of Array 2.
I know this is a pretty basic and probably redundant question, but I can't find resources for my use case / object structure.
I was able to easily group arrays with lodash, so if there is a similar solution with that library, that would be good! Or just some direction would suffice.
Thanks in advance!
You can loop over the first array, then use filter to get the objects with the same id as the current element and add their data to the current object.
let categories = [
{ id: 5, slug: 'category-5', items: [] },
{ id: 4, slug: 'category-4', items: [] },
{ id: 3, slug: 'category-3', items: [] },
]
let items = [
{ id: 5, data: [{ title: 'item title', description: 'item description' }] },
{ id: 5, data: [{ title: 'item title 2', description: 'item description 2' }] },
{ id: 4, data: [{ title: 'item title 4', description: 'item description 4' }] },
]
categories.forEach(function(e) {
// flatMap flattens each matching entry's data array, so items becomes a flat array of objects
var i = items.filter(a => a.id == e.id).flatMap(a => a.data);
e.items = i;
})
console.log(categories)
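Since the question mentions lodash: a hedged sketch of the same merge using _.groupBy (assuming lodash is available as _; the variable names here are mine, not from the question):
// Group the incoming items by id once, then map the categories over that lookup.
const grouped = _.groupBy(items, 'id');
const merged = categories.map(category => ({
    ...category,
    items: (grouped[category.id] || []).flatMap(entry => entry.data)
}));
console.log(merged);
This keeps categories untouched and produces new objects, which can be handy if the original array must stay immutable.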
You could reduce the items into categories:
let res = items.reduce((a, b) => {
let it = a.find(e => e.id === b.id);
if (! it) return a;
it.items = it.items.concat(b.data);
return a;
}, categories);
let categories = [{
id: 5,
slug: 'category-5',
items: []
},
{
id: 4,
slug: 'category-4',
items: []
},
{
id: 3,
slug: 'category-3',
items: []
},
];
let items = [{
id: 5,
data: [{
title: 'item title',
description: 'item description'
}]
},
{
id: 5,
data: [{
title: 'item title 2',
description: 'item description 2'
}]
},
{
id: 4,
data: [{
title: 'item title 4',
description: 'item description 4'
}]
},
];
let res = items.reduce((a, b) => {
let it = a.find(e => e.id === b.id);
if (! it) return a;
it.items = it.items.concat(b.data);
return a;
}, categories);
console.log(res);
It might be faster to get the ids in an object first, so we don't have to use find on the same id many times:
function merge(result, toMerge, mergeInto) {
let i = 0, hm = {};
for (let {id} of categories) {
hm[id] = i;
i++;
}
return toMerge.reduce((a,b) => {
let it = a[hm[b.id]];
if (!it) return a;
it[mergeInto] = it[mergeInto].concat(b.data);
return a;
}, result);
}
let categories = [
{ id: 5, slug: 'category-5', items: [] },
{ id: 4, slug: 'category-4', items: [] },
{ id: 3, slug: 'category-3', items: [] },
];
let items = [
{ id: 5, data: [{ title: 'item title', description: 'item description' }] },
{ id: 5, data: [{ title: 'item title 2', description: 'item description 2' }] },
{ id: 4, data: [{ title: 'item title 4', description: 'item description 4' }] },
];
function merge(result, toMerge, mergeInto) {
let i = 0, hm = {};
for (let {id} of result) {
hm[id] = i;
i++;
}
return toMerge.reduce((a,b) => {
let it = result[hm[b.id]];
if (!it) return a;
it[mergeInto] = it[mergeInto].concat(b.data);
return a;
}, result);
}
console.log(merge(categories, items, 'items'));
I would turn the categories into a hash map keyed by id and iterate over the items only; then you get an O(N) solution. A sketch of that idea is below.
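A minimal sketch of that idea, assuming the same categories and items arrays as above (only the Map-based lookup differs from the earlier answers):
// One pass over categories to build the lookup, one pass over items to fill it.
const byId = new Map(categories.map(category => [category.id, category]));
for (const { id, data } of items) {
    const category = byId.get(id);
    if (category) category.items.push(...data);
}
console.log(categories);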