Omit matched object between Arrays - javascript

I'm trying to compare newProp (an array of objects) with prop (another array of objects); when there is a match, the matching object should be removed from the newProp array.
prop = [{id: 1, name: 'John Doe'}, {id: 2, name: 'Jane Doe'}, {id: 3, name: 'Baby Doe'}]
newProp = [{id: 4, name: 'Johnny Doe'}, {id: 1, name: 'John Doe'} ....]
Here is what I have tried; I wonder if there is a better / cleaner way of solving this problem.
let prop = [{ id: 1, name: 'John Doe'}, {id: 2, name: 'Jane Doe'}, {id: 3, name: 'Baby Doe'}]
let newProp = [{id: 4, name: 'Johnny Doe' }, {id: 1, name: 'John Doe'}]
prop.map((i, Iindex) => {
  newProp.map((o, oIndex) => {
    if (i.id == o.id) {
      prop.splice(Iindex, 1);
      console.log(prop);
    }
  })
})

One way to do it is to filter() out the items you don't need.
It's cleaner and more readable (you'll see it all over the place in code written by folks who follow functional programming); however, it produces a new array, which can hurt performance if it is called very frequently on very large input arrays.
newProp = newProp
  .filter(newPropElement =>
    prop.every(propElement => newPropElement.id !== propElement.id)
  )
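If prop can get large, a variation worth considering (a sketch, assuming the same prop and newProp shapes as above) is to collect the ids from prop into a Set once, so each newProp element is checked in constant time instead of rescanning prop with every():
// Build the id lookup once, then filter newProp against it.
const propIds = new Set(prop.map(propElement => propElement.id));
newProp = newProp.filter(newPropElement => !propIds.has(newPropElement.id));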

Related

How to filter array of objects by another array of objects by property using javascript

I have two nested arrays of objects, and I want to compare them by id from arrobj1 and assignId from arrobj2 using javascript.
So, I would like to know how to compare the arrays of objects by id and assignId and return the matching array of objects using javascript.
Tried
const result = arrobj1.filter(arr1 => {
  arrobj2.find(arr2 => arr2.assignId === arr1.id)
});
var arrobj1 =[
{id: 1, name: 'xxx', value:100},
{id: 2, name: 'yyy', value:200},
{id: 3, name: 'zzz', value:400}
]
var arrobj2 =[
{country: 'IN', name: 'lina', assignId:2},
{country: 'MY', name: 'john', assignId:3},
{country: 'SG', name: 'peter', assignId:6}
]
Expected result:
[
{id: 2, name: 'yyy', value:200},
{id: 3, name: 'zzz', value:400}
]
You have it almost correct, but you need to return in your filter, either by explicitly adding the return keyword or by removing the braces to use the arrow function's implicit return:
const result = arrobj1.filter(arr1 =>
  arrobj2.find(arr2 => arr2.assignId === arr1.id)
)
// or
const result = arrobj1.filter(arr1 => {
  return arrobj2.find(arr2 => arr2.assignId === arr1.id)
})
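The distinction matters because an arrow function with a {} body returns undefined unless you return explicitly, so the filter callback treats every element as falsy and drops it. A quick illustration:
// Implicit vs. explicit return in arrow functions.
const implicit = (x) => x + 1;       // concise body: returns x + 1
const explicit = (x) => { x + 1; };  // block body without return: returns undefined
console.log(implicit(1)); // 2
console.log(explicit(1)); // undefined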
We can combine Array.filter() and Array.some() to make it simpler:
let result = arrobj1.filter(a1 => arrobj2.some(a2 => a2.assignId === a1.id) )
console.log(result)
As for your code, the reason it fails is that you are missing a return when invoking find.
You can generally go with the filter and some combination that @flyingfox mentioned in their answer, but if you have thousands of records the time complexity adds up; you can address that by removing the nested some loop.
So, for a bigger data set, more performant code would look like the following.
And yes, either use return with braces or simply remove the braces for one-liner returns!
var arrobj1 = [
{ id: 1, name: 'xxx', value: 100 },
{ id: 2, name: 'yyy', value: 200 },
{ id: 3, name: 'zzz', value: 400 },
]
var arrobj2 = [
{ country: 'IN', name: 'lina', assignId: 2 },
{ country: 'MY', name: 'john', assignId: 3 },
{ country: 'SG', name: 'peter', assignId: 6 },
]
var obj = {}
for (const elem of arrobj2) {
  obj[elem.assignId] = true
}
let result = arrobj1.filter((a1) => obj[a1.id])
console.log(result)
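A Set works just as well as the plain lookup object here; this is a sketch assuming the same arrobj1 and arrobj2 as above:
// Build the lookup once (O(m)), then filter in O(n).
const assignIds = new Set(arrobj2.map((a2) => a2.assignId))
const filtered = arrobj1.filter((a1) => assignIds.has(a1.id))
console.log(filtered) // the objects with id 2 and id 3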

Find maximum id value in a deeply nested array of objects

I have a tree data structure in which each object contains children:
const data = {
  id: 1,
  name: "John",
  parent_id: null,
  children: [{
    id: 2,
    name: "Tess",
    parent_id: 1,
    children: []
  },
  {
    id: 3,
    name: "Tom",
    parent_id: 1,
    children: [{
      id: 4,
      name: "Harry",
      parent_id: 3,
      children: [{
        id: 7,
        name: "Thabo",
        parent_id: 4,
        children: []
      }]
    },
    {
      id: 5,
      name: "Mary",
      parent_id: 3,
      children: []
    },
    {
      id: 6,
      name: "Madge",
      parent_id: 3,
      children: []
    }]
  }]
}
Before I can add a new object to the tree, I need to determine the highest id value currently used, so I can assign the next available number as id for the new user.
To do this I created a new variable with an initial value of 0. Then I iterate over each object in the tree, and if the object's id is higher than the new id, I assign the new id the current object's id (the idea being to take the final value and add 1 to get the new id).
let newUserID = 0;
const newID = ( root, idKey ) => {
  if ( root.id > idKey ) {
    idKey = root.id;
  }
  root.children.forEach( ( obj ) => {
    newID( obj, idKey );
  });
  return idKey;
}
newUserID = newID( data, newUserID );
console.log( newUserID );
I expected this to return the highest id in the tree as the final value, but what actually happens is that, while the new id does increase until it matches the maximum value, it then starts decreasing again, ending on 1.
This can be seen in this JSFiddle which includes some logging to show the value of the new ID at different points in the function.
I've since solved the issue using a different approach (extracting the id values to a new array and using Math.max() to find the highest value), but I'd like to understand why my initial approach didn't work as expected. I can see the idKey value being updated, but then the previous value comes back after the recursive call, and I don't know why that's happening or how to prevent it.
First, as to why your code is broken: You just missed an assignment. Where you have
newID( obj, idKey );
you are ignoring the resulting value. You need to assign it back to idKey:
idKey = newID( obj, idKey );
That will solve your problem. We should also note that the variable name newUserID is a bit of a misnomer, since it's not the new one you will use but the highest one found. Perhaps highestUserID would be less confusing?
However, we should point out that this can be written much more simply, using Math.max to do the heavy lifting and a dollop of recursion. Here's how I might write this:
const maxId = ({id, children = []}) =>
  Math.max(id, ...children.map(maxId))
const data = {id: 1, name: "John", parent_id: null, children: [{id: 2, name: "Tess", parent_id: 1, children: []}, {id: 3, name: "Tom", parent_id: 1, children: [{id: 4, name: "Harry", parent_id: 3, children: [{id: 7, name: "Thabo", parent_id: 4, children: []}]}, {id: 5, name: "Mary", parent_id: 3, children: []}, {id: 6, name: "Madge", parent_id: 3, children: []}]}]}
console.log(maxId(data))
Simply assign the returned value of the recursive call to idKey:
let newUserID = 0;
const newID = ( root, idKey ) => {
  if ( root.id > idKey ) {
    idKey = root.id;
  }
  root.children.forEach( ( obj ) => {
    idKey = newID( obj, idKey ); // <--------
  });
  return idKey;
}
newUserID = newID( data, newUserID );
console.log( newUserID );
Without this assignment, no matter how much you recurse, the value returned will depend only on the result of the if statement at the top. This explains the logs you were getting.
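The underlying reason is that JavaScript passes numbers by value: reassigning the idKey parameter inside a nested call only changes that call's local copy, and the update can only travel back through the return value. A minimal illustration:
const bump = (n) => {
  n = n + 1;      // reassigns the local parameter only
  return n;
};
let x = 1;
bump(x);          // return value ignored: x is still 1
x = bump(x);      // assignment captures the result: x is now 2
console.log(x);   // 2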
You can use recursion to solve this, like below:
const findMax = (value) => {
  let max = -Infinity;
  const _findMax = (data) => {
    if (max < data.id) max = data.id;
    data.children.forEach(_findMax);
  };
  _findMax(value);
  return max;
};
console.log(findMax(data));
You can do:
const data = {id: 1,name: 'John',parent_id: null,children: [{ id: 2, name: 'Tess', parent_id: 1, children: [] },{id: 3,name: 'Tom',parent_id: 1,children: [{id: 4,name: 'Harry',parent_id: 3,children: [{ id: 7, name: 'Thabo', parent_id: 4, children: [] }],},{ id: 5, name: 'Mary', parent_id: 3, children: [] },{ id: 6, name: 'Madge', parent_id: 3, children: [] },],},],}
const arr = [...JSON.stringify(data).matchAll(/"id":(\d+)/g)].map(([, n]) => +n)
const result = Math.max(...arr)
console.log(result)

Merge Object Array into Array matched by property value

I am trying to merge in an array to an existing array by a key (id). Is there any easy way to do this?
For example:
people = [{id: 1, name: 'John Doe'}, {id: 2, name: 'Jane Doe'}];
places = [{id: 1, state: 'CA'}, {id: 2, state: 'AK'}];
// expected output I want is
result = [{id: 1, name: 'John Doe', places: {id: 1, state: 'CA'}}, {id: 2, name: 'Jane Doe', places: {id: 2, state: 'AK'}}];
How can I get the places entries mapped onto the people entries so that the ids match up and the keys are carried in?
Here is the JS way to implement the scenario:
const people = [{id: 1, name: 'John Doe'}, {id: 2, name: 'Jane Doe'}];
const places = [{id: 1, state: 'CA'}, {id: 2, state: 'AK'}];
const result = people.map(ppl => {
  ppl.places = places.find(plc => plc.id === ppl.id)
  return ppl;
})
console.log(result)

// ES6 way
let res = people.map(obj => {
  let data = places.find(item => item.id === obj.id);
  return {...obj, places: data}
});
console.log('ES6 way ......', res)
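If the arrays are large, one sketch (assuming the same people and places sample) is to index places by id first, so each person is matched with a constant-time Map lookup rather than a find scan:
// Build an id -> place Map once, then map over people.
const placesById = new Map(places.map(place => [place.id, place]));
const merged = people.map(person => ({ ...person, places: placesById.get(person.id) }));
console.log(merged);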

How can I optimally group a list of objects by their sub object?

I'm trying to group some JavaScript objects by a shared sub object. I can do this effortlessly in Ruby, but for the life of me I (somewhat embarrassingly) can't figure this out in JS in linear time. JS doesn't seem to allow object literals as keys, at least for the purposes of reducing.
I have data shaped like this, as a result from a GraphQL query:
[
  {
    id: 1,
    name: 'Bob',
    room: {
      id: 5,
      name: 'Kitchen'
    }
  },
  {
    id: 3,
    name: 'Sheila',
    room: {
      id: 5,
      name: 'Kitchen'
    }
  },
  {
    id: 2,
    name: 'Tom',
    room: {
      id: 3,
      name: 'Bathroom'
    }
  }
]
In the UI, we're going to display the objects by the room they're in. We need to keep a reference to the room itself, otherwise we'd just sort by a room property.
What I'm trying to do is reshape the data into something like this:
{
  {id: 5, name: 'Kitchen'}: [{id: 1, name: 'Bob'}, {id: 3, name: 'Sheila'}],
  {id: 3, name: 'Bathroom'}: [{id: 2, name: 'Tom'}]
}
As you can see, the people are grouped together by the room they're in.
It could also be shaped like this...
[
  { room: {id: 5, name: 'Kitchen'}, people: [{id: 1, name: 'Bob', ...}] },
  { room: {id: 3, name: 'Bathroom'}, people: [{id: 2, name: 'Tom'}] }
]
However it comes out, we just need the people grouped by the rooms in linear time.
I've tried lodash's groupBy, using both map and reduce, just doing for loops that put the list together, etc. I'm stumped because without being able to use an object literal (the room) as a hash index, I don't know how to efficiently group the outer objects by the inner objects.
Any help is greatly appreciated.
Update: adding clarity about trying to do it with linear time complexity - the most efficient equivalent of this Ruby code:
h = Hash.new { |h, k| h[k] = [] }
value.each_with_object(h) { |v, m| m[v[:room]] << v }
You can solve this using lodash#groupBy and lodash#map to gather and transform each group. Additionally, we use lodash#omit to remove the room object from each person in the people array.
var data = [{id: 1, name: 'Bob', room: {id: 5, name: 'Kitchen'}}, {id: 3, name: 'Sheila', room: {id: 5, name: 'Kitchen'}}, {id: 2, name: 'Tom', room: {id: 3, name: 'Bathroom'}}];
var result = _(data)
  .groupBy('room.id')
  .map(people => ({
    // make sure to create a new room object reference
    // to avoid mutability
    room: { ...people[0].room },
    people: _.map(people, person => _.omit(person, 'room'))
  })).value();
console.log(result);
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.10/lodash.min.js"></script>
You can use reduce to create an object of people indexed by rooms and then get that object's values, no library needed:
const input=[{id:1,name:'Bob',room:{id:5,name:'Kitchen'}},{id:3,name:'Sheila',room:{id:5,name:'Kitchen'}},{id:2,name:'Tom',room:{id:3,name:'Bathroom'}}]
const output = Object.values(
  input.reduce((a, { id, name, room }) => {
    const roomName = room.name;
    if (!a[roomName]) a[roomName] = { room, people: [] };
    a[roomName].people.push({ id, name });
    return a;
  }, {})
);
console.log(output);
Objects like
{id: 5, name: 'Kitchen'}: [{id: 1, name: 'Bob'}, {id: 3, name: 'Sheila'}],
in your question can't be properties like that unless the structure is a Map. Ordinary JavaScript objects can only have string (or Symbol) keys; numbers are coerced to strings.
One alternative is to use reduce in order to groupBy the rooms.
const input = [{id: 1, name: 'Bob', room: {id: 5, name: 'Kitchen'}}, {id: 3, name: 'Sheila', room: {id: 5, name: 'Kitchen'}}, {id: 2, name: 'Tom', room: {id: 3, name: 'Bathroom'}}];
const res = input
  .map(person => ({
    person: {
      id: person.id,
      name: person.name
    },
    room: person.room
  }))
  .reduce((rooms, person) => {
    // look up the existing entry for this room (by room id), or start a new one
    const room = rooms.find(entry => entry.room.id === person.room.id) ||
      { room: person.room };
    const idx = rooms.indexOf(room);
    room.people = room.people ?
      [...room.people, person.person] :
      [person.person];
    return Object.assign(rooms, {
      [idx === -1 ? rooms.length : idx]: room
    });
  }, []);
console.log(res);
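The rooms.find call above rescans the accumulator for every person, so the work grows quadratically in the worst case. If linear time is the main concern (as in the Ruby snippet from the question), a rough sketch is a single pass over a Map keyed by room.id, assuming input is the array shown above:
// One pass: group people under their room, keyed by room.id.
const byRoom = new Map();
for (const { room, ...person } of input) {
  if (!byRoom.has(room.id)) byRoom.set(room.id, { room, people: [] });
  byRoom.get(room.id).people.push(person);
}
const grouped = [...byRoom.values()];
console.log(grouped); // [{ room: {id: 5, ...}, people: [...] }, { room: {id: 3, ...}, people: [...] }]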

How can I perform an inner join with two object arrays in JavaScript?

I have two object arrays:
var a = [
{id: 4, name: 'Greg'},
{id: 1, name: 'David'},
{id: 2, name: 'John'},
{id: 3, name: 'Matt'},
]
var b = [
{id: 5, name: 'Mathew', position: '1'},
{id: 6, name: 'Gracia', position: '2'},
{id: 2, name: 'John', position: '2'},
{id: 3, name: 'Matt', position: '2'},
]
I want to do an inner join for these two arrays a and b, and create a third array like this (if the position property is not present, then it becomes null):
var result = [
  {id: 4, name: 'Greg', position: null},
  {id: 1, name: 'David', position: null},
  {id: 5, name: 'Mathew', position: '1'},
  {id: 6, name: 'Gracia', position: '2'},
  {id: 2, name: 'John', position: '2'},
  {id: 3, name: 'Matt', position: '2'},
]
My approach:
function innerJoinAB(a, b) {
  a.forEach(function(obj, index) {
    // Search through objects in first loop
    b.forEach(function(obj2, i2) {
      // Find objects in 2nd loop
      // if obj1 is present in obj2 then push to result.
    });
  });
}
But the time complexity is O(N^2). How can I do it in O(N)? My friend told me that we can use reducers and Object.assign.
I'm not able to figure this out. Please help.
I don't know how reduce would help here, but you could use a Map to
accomplish the same task in O(n):
var m = new Map();
// Insert all entries keyed by ID into the Map, filling in placeholder
// 'position' since the Array 'a' lacks 'position' entirely:
a.forEach(function(x) { x.position = null; m.set(x.id, x); });
// For values in 'b', insert them if missing, otherwise, update existing values:
b.forEach(function(x) {
  var existing = m.get(x.id);
  if (existing === undefined)
    m.set(x.id, x);
  else
    Object.assign(existing, x);
});
// Extract resulting combined objects from the Map as an Array
var result = Array.from(m.values());
console.log(JSON.stringify(result));
Because Map accesses and updates are O(1) (on average - because of hash
collisions and rehashing, it can be longer), this makes O(n+m) (where n
and m are the lengths of a and b respectively; the naive solution you
gave would be O(n*m) using the same meaning for n and m).
This is one of the ways to solve it:
const r = a.filter(({ id: idv }) => b.every(({ id: idc }) => idv !== idc));
const newArr = b.concat(r).map((v) => v.position ? v : { ...v, position: null });
console.log(JSON.stringify(newArr));
If you drop the null criteria (many in the community say using null is bad), then there's a very simple solution:
let a = [1, 2, 3];
let b = [2, 3, 4];
a.filter(x => b.includes(x))
// [2, 3]
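That snippet intersects primitive values; applied to the object arrays from the question it would need an id comparison instead. A rough adaptation, with a and b referring to the question's object arrays:
// Keep only the entries of `a` whose id also appears in `b`.
const bIds = new Set(b.map(item => item.id));
const common = a.filter(item => bIds.has(item.id));
console.log(common); // John (id 2) and Matt (id 3) with the sample data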
To reduce the time complexity, it is inevitable to use more memory.
var s = new Set();
var result = [];
b.forEach(function(e) {
  result.push(Object.assign({}, e));
  s.add(e.id);
});
a.forEach(function(e) {
  if (!s.has(e.id)) {
    var temp = Object.assign({}, e);
    temp.position = null;
    result.push(temp);
  }
});
console.log(result);
Update
As @Blindman67 mentioned, "You do not reduce the problems complexity by moving a search into the native code." I've consulted the ECMAScript® 2016 Language Specification about the internal procedure of Set.prototype.has() and Map.prototype.get(), and unfortunately it seems that both are specified as iterating through all the elements they hold.
Set.prototype.has ( value )
The following steps are taken:
Let S be the this value.
If Type(S) is not Object, throw a TypeError exception.
If S does not have a [[SetData]] internal slot, throw a TypeError exception.
Let entries be the List that is the value of S's [[SetData]] internal slot.
Repeat for each e that is an element of entries,
If e is not empty and SameValueZero(e, value) is true, return true.
Return false.
http://www.ecma-international.org/ecma-262/7.0/#sec-set.prototype.has
Map.prototype.get ( key )
The following steps are taken:
Let M be the this value.
If Type(M) is not Object, throw a TypeError exception.
If M does not have a [[MapData]] internal slot, throw a TypeError exception.
Let entries be the List that is the value of M's [[MapData]] internal slot.
Repeat for each Record {[[Key]], [[Value]]} p that is an element of entries,
If p.[[Key]] is not empty and SameValueZero(p.[[Key]], key) is true, return p.[[Value]].
Return undefined.
http://www.ecma-international.org/ecma-262/7.0/#sec-map.prototype.get
Perhaps we can use a plain Object, which can access its properties directly by name, like a hash table or associative array. For example:
var s = {};
var result = [];
b.forEach(function(e) {
  result.push(Object.assign({}, e));
  s[e.id] = true;
});
a.forEach(function(e) {
  if (!s[e.id]) {
    var temp = Object.assign({}, e);
    temp.position = null;
    result.push(temp);
  }
});
console.log(result);
You do not reduce the problem's complexity by moving a search into native code; the search must still be done.
Also, the need to null out an undefined property is one of the many reasons I dislike using null.
So, without the null, the solution would look like:
var a = [
{id: 4, name: 'Greg',position: '7'},
{id: 1, name: 'David'},
{id: 2, name: 'John'},
{id: 3, name: 'Matt'},
]
var b = [
{id: 5, name: 'Mathew', position: '1'},
{id: 6, name: 'Gracia', position: '2'},
{id: 2, name: 'John', position: '2'},
{id: 3, name: 'Matt', position: '2'},
]
function join (indexName, ...arrays) {
  const map = new Map();
  arrays.forEach((array) => {
    array.forEach((item) => {
      map.set(
        item[indexName],
        Object.assign(item, map.get(item[indexName]))
      );
    })
  })
  return [...map.values()];
}
And is called with
const joinedArray = join("id", a, b);
To join with a default is a little more complex but should prove handy as it can join any number of arrays and automatically set missing properties to a provided default.
Testing for the defaults is done after the join to save a little time.
function join (indexName, defaults, ...arrays) {
  const map = new Map();
  arrays.forEach((array) => {
    array.forEach((item) => {
      map.set(
        item[indexName],
        Object.assign(
          item,
          map.get(item[indexName])
        )
      );
    })
  })
  return [...map.values()].map(item => Object.assign({}, defaults, item));
}
To use
const joinedArray = join("id", {position : null}, a, b);
You could add...
arrays.shift().forEach((item) => { // first array is a special case.
map.set(item[indexName], item);
});
...at the start of the function to save a little time, but I feel it's more elegant without the extra code.
Here is an attempt at a more generic version of a join which accepts N objects and merges them based on a primary id key.
If performance is critical, you are better off using a specific version like the one provided by ShadowRanger which doesn't need to dynamically build a list of all property keys.
This implementation assumes that any missing properties should be set to null and that every object in each input array has the same properties (though properties can differ between arrays)
var a = [
{id: 4, name: 'Greg'},
{id: 1, name: 'David'},
{id: 2, name: 'John'},
{id: 3, name: 'Matt'},
];
var b = [
{id: 5, name: 'Mathew', position: '1'},
{id: 600, name: 'Gracia', position: '2'},
{id: 2, name: 'John', position: '2'},
{id: 3, name: 'Matt', position: '2'},
];
console.log(genericJoin(a, b));
function genericJoin(...input) {
  // Get all possible keys
  let template = new Set();
  input.forEach(arr => {
    if (arr.length) {
      Object.keys(arr[0]).forEach(key => {
        template.add(key);
      });
    }
  });
  // Merge arrays
  input = input.reduce((a, b) => a.concat(b));
  // Merge items with duplicate ids
  let result = new Map();
  input.forEach(item => {
    result.set(item.id, Object.assign((result.get(item.id) || {}), item));
  });
  // Convert the map back to an array of objects
  // and set any missing properties to null
  return Array.from(result.values(), item => {
    template.forEach(key => {
      item[key] = item[key] || null;
    });
    return item;
  });
}
Here's a generic O(n*m) solution, where n is the number of records and m is the number of keys. This will only work for valid object keys. You can convert any value to base64 and use that if you need to.
const join = ( keys, ...lists ) =>
  lists.reduce(
    ( res, list ) => {
      list.forEach( ( record ) => {
        let hasNode = keys.reduce(
          ( idx, key ) => idx && idx[ record[ key ] ],
          res[ 0 ].tree
        )
        if( hasNode ) {
          const i = hasNode.i
          Object.assign( res[ i ].value, record )
          res[ i ].found++
        } else {
          let node = keys.reduce( ( idx, key ) => {
            if( idx[ record[ key ] ] )
              return idx[ record[ key ] ]
            else
              idx[ record[ key ] ] = {}
            return idx[ record[ key ] ]
          }, res[ 0 ].tree )
          node.i = res[ 0 ].i++
          res[ node.i ] = {
            found: 1,
            value: record
          }
        }
      } )
      return res
    },
    [ { i: 1, tree: {} } ]
  )
  .slice( 1 )
  .filter( node => node.found === lists.length )
  .map( n => n.value )

join( [ 'id', 'name' ], a, b )
This is essentially the same as Blindman67's answer, except that it adds an index object to identify records to join. The records are stored in an array and the index stores the position of the record for the given key set and the number of lists it's been found in.
Each time the same key set is encountered, the node is found in the tree, the element at its index is updated, and the number of times it has been found is incremented.
Finally, the idx object is removed from the array with the slice, and any elements that weren't found in every list are filtered out. This makes it an inner join; you could remove this filter and have a full outer join.
Then each element is mapped to its value, and you have the merged array.
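For reference, with the a and b from the question (ids 2 and 3 appear in both lists under the same names), the call above should yield roughly:
const result = join( [ 'id', 'name' ], a, b )
console.log( result )
// Expected with the sample data:
// [ { id: 2, name: 'John', position: '2' }, { id: 3, name: 'Matt', position: '2' } ]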
