Replacing objects in array - javascript

I have these two JavaScript arrays of objects:
var arr1 = [{id:'124',name:'qqq'},
{id:'589',name:'www'},
{id:'45',name:'eee'},
{id:'567',name:'rrr'}]
var arr2 = [{id:'124',name:'ttt'},
{id:'45',name:'yyy'}]
I need to replace the objects in arr1 with the items from arr2 that have the same id.
So here is the result I want to get:
var arr1 = [{id:'124',name:'ttt'},
{id:'589',name:'www'},
{id:'45',name:'yyy'},
{id:'567',name:'rrr'}]
How can I implement this using JavaScript?

You can use Array#map with Array#find.
arr1.map(obj => arr2.find(o => o.id === obj.id) || obj);
var arr1 = [{
id: '124',
name: 'qqq'
}, {
id: '589',
name: 'www'
}, {
id: '45',
name: 'eee'
}, {
id: '567',
name: 'rrr'
}];
var arr2 = [{
id: '124',
name: 'ttt'
}, {
id: '45',
name: 'yyy'
}];
var res = arr1.map(obj => arr2.find(o => o.id === obj.id) || obj);
console.log(res);
Here, arr2.find(o => o.id === obj.id) returns the matching object from arr2 if an element with the same id exists there. If not, find returns undefined, and the || operator falls back to the original element obj from arr1.

What's wrong with Object.assign(target, source)?
Arrays are still of type object in JavaScript, so using assign should still reassign any matching keys, right? Keep in mind, though, that for arrays those keys are the indices 0, 1, 2, ..., so Object.assign merges by position rather than by id.
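A minimal sketch of that index-based behavior, using the question's data (the variable names a and b are mine):
var a = [{id:'124',name:'qqq'}, {id:'589',name:'www'}, {id:'45',name:'eee'}];
var b = [{id:'124',name:'ttt'}, {id:'45',name:'yyy'}];
// Object.assign copies b[0] onto a[0] and b[1] onto a[1], purely by index
Object.assign(a, b);
console.log(a);
// [{id:'124',name:'ttt'}, {id:'45',name:'yyy'}, {id:'45',name:'eee'}]
// a[1] lost the object with id '589', so matching by id needs map/find instead.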

There is always going to be a debate on time vs. space, but these days I've found that spending a little space is better in the long run. Mathematics aside, let's look at one practical approach to the problem using a hash map (call it a dictionary or an associative array if you prefer):
var marr2 = new Map(arr2.map(e => [e.id, e]));
arr1.map(obj => marr2.has(obj.id) ? marr2.get(obj.id) : obj);
I like this approach because, although you could argue that with small arrays you are wasting space (an inline approach like @Tushar's performs indistinguishably close to this method at low n), I ran some tests, and the graph shows how both methods perform in ms for n from 0 to 1000. You can decide which method works best for your situation, but in my experience users don't care much about a little extra space, while they do care about speed.
Here is the performance test I ran to produce that data:
var n = 1000;
var graph = new Array();
for( var x = 0; x < n; x++){
var arr1s = [...Array(x).keys()];
var arr2s = arr1s.filter( e => Math.random() > .5);
var arr1 = arr1s.map(e => {return {id: e, name: 'bill'}});
var arr2 = arr2s.map(e => {return {id: e, name: 'larry'}});
// Map 1
performance.mark('p1s');
var marr2 = new Map(arr2.map(e => [e.id, e]));
arr1.map(obj => marr2.has(obj.id) ? marr2.get(obj.id) : obj);
performance.mark('p1e');
// Map 2
performance.mark('p2s');
arr1.map(obj => arr2.find(o => o.id === obj.id) || obj);
performance.mark('p2e');
graph.push({ x: x, r1: performance.measure('HashMap Method', 'p1s', 'p1e').duration, r2: performance.measure('Inner Find', 'p2s','p2e').duration});
}

Since you're using Lodash, you could use _.map and _.find to make sure major browsers are supported.
In the end I would go with something like:
function mergeById(arr) {
return {
with: function(arr2) {
return _.map(arr, item => {
return _.find(arr2, obj => obj.id === item.id) || item
})
}
}
}
var result = mergeById([{id:'124',name:'qqq'},
{id:'589',name:'www'},
{id:'45',name:'eee'},
{id:'567',name:'rrr'}])
.with([{id:'124',name:'ttt'}, {id:'45',name:'yyy'}])
console.log(result);
<script src="https://raw.githubusercontent.com/lodash/lodash/4.13.1/dist/lodash.js"></script>

I'd like to suggest another solution:
const objectToReplace = this.array.find(arrayItem => arrayItem.id === requiredItem.id);
Object.assign(objectToReplace, newObject);
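Note that this mutates the matched object in place rather than building a new array. A small runnable sketch with the question's data (newObject here stands in for the requiredItem of the answer above):
const array = [{id:'124',name:'qqq'}, {id:'589',name:'www'}];
const newObject = {id:'124', name:'ttt'};
const objectToReplace = array.find(arrayItem => arrayItem.id === newObject.id);
Object.assign(objectToReplace, newObject);
console.log(array); // [{id:'124',name:'ttt'}, {id:'589',name:'www'}]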

Thanks to ES6 we can do this in an easy way, for example in a util.js module ;))).
Merge two arrays of entities
export const mergeArrays = (arr1, arr2) =>
arr1 && arr1.map(obj => arr2 && arr2.find(p => p.id === obj.id) || obj);
This takes two arrays and merges them. arr1 is the base array: it defines which ids appear and in what order, while matching items from arr2 replace the corresponding items from arr1.
Merge an array with a single entity of the same type
export const mergeArrayWithObject = (arr, obj) => arr && arr.map(t => t.id === obj.id ? obj : t);
This merges one object into an array of the same entity type. For example, an array of Person:
[{id:1, name:"Bir"},{id:2, name: "Iki"},{id:3, name:"Uc"}]
second param Person {id:3, name: "Name changed"}
result is
[{id:1, name:"Bir"},{id:2, name: "Iki"},{id:3, name:"Name changed"}]
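A quick usage sketch of mergeArrayWithObject with that data:
const people = [{id:1, name:"Bir"}, {id:2, name:"Iki"}, {id:3, name:"Uc"}];
console.log(mergeArrayWithObject(people, {id:3, name:"Name changed"}));
// [{id:1, name:"Bir"}, {id:2, name:"Iki"}, {id:3, name:"Name changed"}]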

I like to go through arr2 with forEach() and use findIndex() to check for an occurrence in arr1:
var arr1 = [{id:'124',name:'qqq'},
{id:'589',name:'www'},
{id:'45',name:'eee'},
{id:'567',name:'rrr'}]
var arr2 = [{id:'124',name:'ttt'},
{id:'45',name:'yyy'}]
arr2.forEach(element => {
const itemIndex = arr1.findIndex(o => o.id === element.id);
if(itemIndex > -1) {
arr1[itemIndex] = element;
} else {
arr1.push(element); // push returns the new length, so don't assign its result back to arr1
}
});
console.log(arr1)

Considering that the accepted answer is probably inefficient for large arrays, O(nm), I usually prefer this approach, O(2n + 2m):
function mergeArrays(arr1 = [], arr2 = []){
//Creates an object map of id to object in arr1
const arr1Map = arr1.reduce((acc, o) => {
acc[o.id] = o;
return acc;
}, {});
//Updates the object with corresponding id in arr1Map from arr2,
//creates a new object if none exists (upsert)
arr2.forEach(o => {
arr1Map[o.id] = o;
});
//Return the merged values in arr1Map as an array
return Object.values(arr1Map);
}
Unit test:
it('Merges two arrays using id as the key', () => {
var arr1 = [{id:'124',name:'qqq'}, {id:'589',name:'www'}, {id:'45',name:'eee'}, {id:'567',name:'rrr'}];
var arr2 = [{id:'124',name:'ttt'}, {id:'45',name:'yyy'}];
const actual = mergeArrays(arr1, arr2);
const expected = [{id:'124',name:'ttt'}, {id:'589',name:'www'}, {id:'45',name:'yyy'}, {id:'567',name:'rrr'}];
expect(actual.sort((a, b) => (a.id < b.id)? -1: 1)).toEqual(expected.sort((a, b) => (a.id < b.id)? -1: 1));
})

// First find all the items in arr1 that have no match in arr2
const temp = arr1.filter(obj1 => !arr2.some(obj2 => obj1.id === obj2.id))
// then just concat it
arr1 = [...temp, ...arr2]

Here is a more transparent approach. I find the one-liners harder to read and harder to debug.
export class List {
static replace = (object, list) => {
let newList = [];
list.forEach(function (item) {
if (item.id === object.id) {
newList.push(object);
} else {
newList.push(item);
}
});
return newList;
}
}
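A usage sketch with the question's data (List is the class above):
var arr1 = [{id:'124',name:'qqq'}, {id:'589',name:'www'}];
const updated = List.replace({id:'124', name:'ttt'}, arr1);
console.log(updated); // [{id:'124',name:'ttt'}, {id:'589',name:'www'}]
console.log(arr1);    // unchanged; replace builds a new list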

If you don't care about the order of the array, then you may want to get the difference between arr1 and arr2 by id using differenceBy() and then simply use concat() to append all the updated objects.
var result = _(arr1).differenceBy(arr2, 'id').concat(arr2).value();
var arr1 = [{
id: '124',
name: 'qqq'
}, {
id: '589',
name: 'www'
}, {
id: '45',
name: 'eee'
}, {
id: '567',
name: 'rrr'
}]
var arr2 = [{
id: '124',
name: 'ttt'
}, {
id: '45',
name: 'yyy'
}];
var result = _(arr1).differenceBy(arr2, 'id').concat(arr2).value();
console.log(result);
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.13.1/lodash.js"></script>

I am only submitting this answer because people expressed concerns over browsers and maintaining the order of objects. I recognize that it is not the most efficient way to accomplish the goal.
Having said this, I broke the problem down into two functions for readability.
// The following function is used for each iteration in the function updateObjectsInArr
const newObjInInitialArr = function(initialArr, newObject) {
let id = newObject.id;
let newArr = [];
for (let i = 0; i < initialArr.length; i++) {
if (id === initialArr[i].id) {
newArr.push(newObject);
} else {
newArr.push(initialArr[i]);
}
}
return newArr;
};
const updateObjectsInArr = function(initialArr, newArr) {
let finalUpdatedArr = initialArr;
for (let i = 0; i < newArr.length; i++) {
finalUpdatedArr = newObjInInitialArr(finalUpdatedArr, newArr[i]);
}
return finalUpdatedArr
}
const revisedArr = updateObjectsInArr(arr1, arr2);
jsfiddle

function getMatch(elem) {
function action(ele, val) {
if(ele === val){
elem = arr2[i]; // i is captured from the enclosing loop via closure
}
}
for (var i = 0; i < arr2.length; i++) {
// Object.values(arr2[i])[0] assumes id is the first property of each arr2 object
action(elem.id, Object.values(arr2[i])[0]);
}
return elem;
}
var modified = arr1.map(getMatch);

I went with this, because it makes sense to me. Comments added for readers!
masterData = [{id: 1, name: "aaaaaaaaaaa"},
{id: 2, name: "Bill"},
{id: 3, name: "ccccccccc"}];
updatedData = [{id: 3, name: "Cat"},
{id: 1, name: "Apple"}];
updatedData.forEach(updatedObj=> {
// For every updatedData object (updatedObj), find the array index in masterData where the IDs match.
let indexInMasterData = masterData.map(masterDataObj => masterDataObj.id).indexOf(updatedObj.id); // First make an array of IDs, to use indexOf().
// If there is a matching ID (indexOf returns -1 when there is none), replace the existing object in masterData with updatedData's object.
if (indexInMasterData !== -1) masterData.splice(indexInMasterData, 1, updatedObj);
});
/* masterData becomes [{id: 1, name: "Apple"},
{id: 2, name: "Bill"},
{id: 3, name: "Cat"}]; as you want. */

The accepted answer using array.map is correct, but you have to remember to assign the result to another variable, since array.map doesn't change the original array; it actually creates a new one.
//newArr contains the mapped array from arr2 to arr1.
//arr1 still contains original value
var newArr = arr1.map(obj => arr2.find(o => o.id === obj.id) || obj);

Array.prototype.update = function(...args) {
return this.map(x=>args.find((c)=>{return c.id===x.id}) || x)
}
const result =
[
{id:'1',name:'test1'},
{id:'2',name:'test2'},
{id:'3',name:'test3'},
{id:'4',name:'test4'}
]
.update({id:'1',name:'test1.1'}, {id:'3',name:'test3.3'})
console.log(result)

This is how I do it in TypeScript:
const index = this.array.indexOf(this.objectToReplace);
this.array[index] = newObject;
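Note that indexOf compares object references, so this only works when objectToReplace is the very object stored in the array. If you only know the id, a findIndex variant avoids that requirement (a sketch, assuming the same this.array and an id field):
const index = this.array.findIndex(item => item.id === newObject.id);
if (index !== -1) {
  this.array[index] = newObject;
}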

Related

Remove duplicates in array of objects based on 1 key [duplicate]

I have an object that contains an array of objects.
obj = {};
obj.arr = new Array();
obj.arr.push({place:"here",name:"stuff"});
obj.arr.push({place:"there",name:"morestuff"});
obj.arr.push({place:"there",name:"morestuff"});
I'm wondering what is the best method to remove duplicate objects from an array. So for example, obj.arr would become...
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"}
How about with some ES6 magic?
obj.arr = obj.arr.filter((value, index, self) =>
index === self.findIndex((t) => (
t.place === value.place && t.name === value.name
))
)
Reference URL
A more generic solution would be:
const uniqueArray = obj.arr.filter((value, index) => {
const _value = JSON.stringify(value);
return index === obj.arr.findIndex(obj => {
return JSON.stringify(obj) === _value;
});
});
Or, using a property-based comparison strategy instead of JSON.stringify:
const isPropValuesEqual = (subject, target, propNames) =>
propNames.every(propName => subject[propName] === target[propName]);
const getUniqueItemsByProperties = (items, propNames) =>
items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNames))
);
You can add a wrapper if you want the propNames parameter to accept either an array or a single value:
const getUniqueItemsByProperties = (items, propNames) => {
const propNamesArray = Array.from(propNames);
return items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNamesArray))
);
};
allowing both getUniqueItemsByProperties('a') and getUniqueItemsByProperties(['a']);
Stackblitz Example
Explanation
Start by understanding the two methods used:
filter, findIndex
Next, take your idea of what makes two of your objects equal and keep that in mind.
We can detect something as a duplicate if it satisfies that criterion but its position is not at the first instance of an object matching the criterion.
Therefore we can use this criterion to determine whether something is a duplicate.
One-liners with filter (preserves order)
Find unique ids in an array.
arr.filter((v,i,a)=>a.findIndex(v2=>(v2.id===v.id))===i)
If the order is not important, map solutions will be faster: Solution with map
Unique by multiple properties (place and name)
arr.filter((v,i,a)=>a.findIndex(v2=>['place','name'].every(k=>v2[k] ===v[k]))===i)
Unique by all properties (This will be slow for large arrays)
arr.filter((v,i,a)=>a.findIndex(v2=>(JSON.stringify(v2) === JSON.stringify(v)))===i)
Keep the last occurrence by replacing findIndex with findLastIndex.
arr.filter((v,i,a)=>a.findLastIndex(v2=>(v2.place === v.place))===i)
Using ES6+ in a single line you can get a unique list of objects by key:
const key = 'place';
const unique = [...new Map(arr.map(item => [item[key], item])).values()]
It can be put into a function:
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
Here is a working example:
const arr = [
{place: "here", name: "x", other: "other stuff1" },
{place: "there", name: "x", other: "other stuff2" },
{place: "here", name: "y", other: "other stuff4" },
{place: "here", name: "z", other: "other stuff5" }
]
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
const arr1 = getUniqueListBy(arr, 'place')
console.log("Unique by place")
console.log(JSON.stringify(arr1))
console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')
console.log(JSON.stringify(arr2))
How does it work
First the array is remapped so that it can be used as an input for a Map.
arr.map(item => [item[key], item]);
This means each item of the array is transformed into another array with 2 elements: the selected key as the first element and the entire initial item as the second. This is called an entry (e.g. array entries, map entries). And here is the official doc with an example showing how to add array entries in the Map constructor.
Example when key is place:
[["here", {place: "here", name: "x", other: "other stuff1" }], ...]
Secondly, we pass this modified array to the Map constructor, and here is the magic happening: Map eliminates duplicate keys, keeping only the last inserted value for each key.
Note: Map keeps the order of insertion. (Check the difference between Map and object.)
new Map(entry array just mapped above)
Third we use the map values to retrieve the original items, but this time without duplicates.
new Map(mappedArr).values()
And last one is to add those values into a fresh new array so that it can look as the initial structure and return that:
return [...new Map(mappedArr).values()]
Simple and performant solution with a better runtime than the 70+ answers that already exist:
const ids = array.map(o => o.id)
const filtered = array.filter(({id}, index) => !ids.includes(id, index + 1))
Example:
const arr = [{id: 1, name: 'one'}, {id: 2, name: 'two'}, {id: 1, name: 'one'}]
const ids = arr.map(o => o.id)
const filtered = arr.filter(({id}, index) => !ids.includes(id, index + 1))
console.log(filtered)
How it works:
Array.filter() removes all duplicate objects by checking whether the previously mapped id array includes the current id ({id} destructures the object into only its id). To filter out only actual duplicates, it uses Array.includes()'s second parameter fromIndex with index + 1, which ignores the current object and all before it.
Since every iteration of the filter callback only searches the array beginning at the current index + 1, this also dramatically reduces the runtime, because only objects not previously filtered get checked.
This obviously also works for any other key that is not called id, multiple or even all keys.
A primitive method would be:
const obj = {};
for (let i = 0, len = things.thing.length; i < len; i++) {
obj[things.thing[i]['place']] = things.thing[i];
}
things.thing = new Array();
for (const key in obj) {
things.thing.push(obj[key]);
}
If you can use JavaScript libraries such as Underscore or Lodash, I recommend having a look at the _.uniq function in their libraries. From Lodash:
_.uniq(array, [isSorted=false], [callback=_.identity], [thisArg])
Basically, you pass in the array (here an array of object literals) and the attribute on which you want to remove duplicates in the original data array, like this:
var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplidated_data = _.uniq(data, 'name');
UPDATE: Lodash has since introduced .uniqBy as well.
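With Lodash 4.x that looks like this (using the data above):
var non_duplicated_data = _.uniqBy(data, 'name');
// [{'name': 'Amir', 'surname': 'Rahnama'}]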
I had this exact same requirement: to remove duplicate objects in an array, based on duplicates in a single field. I found the code here: Javascript: Remove Duplicates from Array of Objects
So in my example, I'm removing any object from the array that has a duplicate licenseNum string value.
var arrayWithDuplicates = [
{"type":"LICENSE", "licenseNum": "12345", state:"NV"},
{"type":"LICENSE", "licenseNum": "A7846", state:"CA"},
{"type":"LICENSE", "licenseNum": "12345", state:"OR"},
{"type":"LICENSE", "licenseNum": "10849", state:"CA"},
{"type":"LICENSE", "licenseNum": "B7037", state:"WA"},
{"type":"LICENSE", "licenseNum": "12345", state:"NM"}
];
function removeDuplicates(originalArray, prop) {
var newArray = [];
var lookupObject = {};
for(var i in originalArray) {
lookupObject[originalArray[i][prop]] = originalArray[i];
}
for(i in lookupObject) {
newArray.push(lookupObject[i]);
}
return newArray;
}
var uniqueArray = removeDuplicates(arrayWithDuplicates, "licenseNum");
console.log("uniqueArray is: " + JSON.stringify(uniqueArray));
The results:
uniqueArray is:
[{"type":"LICENSE","licenseNum":"10849","state":"CA"},
{"type":"LICENSE","licenseNum":"12345","state":"NM"},
{"type":"LICENSE","licenseNum":"A7846","state":"CA"},
{"type":"LICENSE","licenseNum":"B7037","state":"WA"}]
One-liner using Set
var things = new Object();
things.thing = new Array();
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
// assign things.thing to myData for brevity
var myData = things.thing;
things.thing = Array.from(new Set(myData.map(JSON.stringify))).map(JSON.parse);
console.log(things.thing)
Explanation:
new Set(myData.map(JSON.stringify)) creates a Set object using the stringified myData elements.
Set object will ensure that every element is unique.
Then I create an array based on the elements of the created set using Array.from.
Finally, I use JSON.parse to convert stringified element back to an object.
Here is an ES6 one-liner:
let arr = [
{id:1,name:"sravan ganji"},
{id:2,name:"pinky"},
{id:4,name:"mammu"},
{id:3,name:"avy"},
{id:3,name:"rashni"},
];
console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))
To remove all duplicates from an array of objects, the simplest way is to use filter:
var uniq = {};
var arr = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);
One-liners with Map (high performance, does not preserve order)
Find unique ids in array arr.
const arrUniq = [...new Map(arr.map(v => [v.id, v])).values()]
If the order is important check out the solution with filter: Solution with filter
Unique by multiple properties (place and name) in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify([v.place,v.name]), v])).values()]
Unique by all properties in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify(v), v])).values()]
Keep the first occurrence in array arr
const arrUniq = [...new Map(arr.slice().reverse().map(v => [v.id, v])).values()].reverse()
Here's another option to do it using Array iterating methods if you need comparison only by one field of an object:
function uniq(a, param){
return a.filter(function(item, pos, array){
return array.map(function(mapItem){ return mapItem[param]; }).indexOf(item[param]) === pos;
})
}
uniq(things.thing, 'place');
This is a generic way of doing this: you pass in a function that tests whether two elements of an array are considered equal. In this case, it compares the values of the name and place properties of the two objects being compared.
ES5 answer
function removeDuplicates(arr, equals) {
var originalArr = arr.slice(0);
var i, len, val;
arr.length = 0;
for (i = 0, len = originalArr.length; i < len; ++i) {
val = originalArr[i];
if (!arr.some(function(item) { return equals(item, val); })) {
arr.push(val);
}
}
}
function thingsEqual(thing1, thing2) {
return thing1.place === thing2.place
&& thing1.name === thing2.name;
}
var things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
removeDuplicates(things, thingsEqual);
console.log(things);
Original ES3 answer
function arrayContains(arr, val, equals) {
var i = arr.length;
while (i--) {
if ( equals(arr[i], val) ) {
return true;
}
}
return false;
}
function removeDuplicates(arr, equals) {
var originalArr = arr.slice(0);
var i, len, j, val;
arr.length = 0;
for (i = 0, len = originalArr.length; i < len; ++i) {
val = originalArr[i];
if (!arrayContains(arr, val, equals)) {
arr.push(val);
}
}
}
function thingsEqual(thing1, thing2) {
return thing1.place === thing2.place
&& thing1.name === thing2.name;
}
removeDuplicates(things.thing, thingsEqual);
If you can wait to eliminate the duplicates until after all the additions, the typical approach is to first sort the array and then eliminate duplicates. The sorting avoids the N * N approach of scanning the array for each element as you walk through them.
The "eliminate duplicates" function is usually called unique or uniq. Some existing implementations may combine the two steps, e.g., prototype's uniq
This post has few ideas to try (and some to avoid :-) ) if your library doesn't already have one! Personally I find this one the most straight forward:
function unique(a){
a.sort();
for(var i = 1; i < a.length; ){
if(a[i-1] == a[i]){
a.splice(i, 1);
} else {
i++;
}
}
return a;
}
// Provide your own comparison
function unique(a, compareFunc){
a.sort( compareFunc );
for(var i = 1; i < a.length; ){
if( compareFunc(a[i-1], a[i]) === 0){
a.splice(i, 1);
} else {
i++;
}
}
return a;
}
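A usage sketch of the comparator version with the things data from this question (the comparator is mine; any ordering that returns 0 for duplicates works):
var things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];
unique(things, function(a, b){
  return (a.place + '|' + a.name).localeCompare(b.place + '|' + b.name);
});
console.log(things);
// [{place:"here",name:"stuff"}, {place:"there",name:"morestuff"}]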
I think the best approach is using reduce and the Map object. This is a single-line solution.
const data = [
{id: 1, name: 'David'},
{id: 2, name: 'Mark'},
{id: 2, name: 'Lora'},
{id: 4, name: 'Tyler'},
{id: 4, name: 'Donald'},
{id: 5, name: 'Adrian'},
{id: 6, name: 'Michael'}
]
const uniqueData = [...data.reduce((map, obj) => map.set(obj.id, obj), new Map()).values()];
console.log(uniqueData)
/*
in `map.set(obj.id, obj)`
'obj.id' is key. (don't worry. we'll get only values using the .values() method)
'obj' is whole object.
*/
To add one more to the list: using ES6 and Array.reduce with Array.find.
In this example, objects are filtered based on a guid property.
let filtered = array.reduce((accumulator, current) => {
if (! accumulator.find(({guid}) => guid === current.guid)) {
accumulator.push(current);
}
return accumulator;
}, []);
Extending this one to allow selection of a property and compress it into a one liner:
const uniqify = (array, key) => array.reduce((prev, curr) => prev.find(a => a[key] === curr[key]) ? prev : prev.push(curr) && prev, []);
To use it pass an array of objects and the name of the key you wish to de-dupe on as a string value:
const result = uniqify(myArrayOfObjects, 'guid')
Considering lodash.uniqWith
const objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
_.uniqWith(objects, _.isEqual);
// => [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }]
You could also use a Map:
const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());
Full sample:
const things = new Object();
things.thing = new Array();
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());
console.log(JSON.stringify(dedupThings, null, 4));
Result:
[
{
"place": "here",
"name": "stuff"
},
{
"place": "there",
"name": "morestuff"
}
]
Dang, kids, let's crush this thing down, why don't we?
let uniqIds = {}, source = [{id:'a'},{id:'b'},{id:'c'},{id:'b'},{id:'a'},{id:'d'}];
let filtered = source.filter(obj => !uniqIds[obj.id] && (uniqIds[obj.id] = true));
console.log(filtered);
// EXPECTED: [{id:'a'},{id:'b'},{id:'c'},{id:'d'}];
let myData = [{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}];
let q = [...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];
console.log(q)
One-liner using ES6 and new Map().
// assign things.thing to myData
let myData = things.thing;
[...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];
Details:
Doing .map() on the data list converts each individual object into a [key, value] entry array (length = 2); the first element (key) is the stringified version of the object and the second (value) is the object itself.
Adding the above list of entries to new Map() keys each entry by the stringified object, and adding the same key again overrides the already existing key.
Using .values() gives a MapIterator with all values in the Map (the objects in our case).
Finally, the spread ... operator puts those values into a fresh new Array.
A TypeScript solution
This will remove duplicate objects and also preserve the types of the objects.
function removeDuplicateObjects(array: any[]) {
return [...new Set(array.map(s => JSON.stringify(s)))]
.map(s => JSON.parse(s));
}
const things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
const filteredArr = things.reduce((thing, current) => {
const x = thing.find(item => item.place === current.place);
if (!x) {
return thing.concat([current]);
} else {
return thing;
}
}, []);
console.log(filteredArr)
Solution via the Set object, according to the data type
const seen = new Set();
const things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
const filteredArr = things.filter(el => {
const duplicate = seen.has(el.place);
seen.add(el.place);
return !duplicate;
});
console.log(filteredArr)
Set object features
Each value in the Set object has to be unique; value equality is checked.
The purpose of the Set object is to store unique values according to the data type, whether primitive values or object references. It has four very useful instance methods: add, clear, has and delete. (A short sketch of all four follows below.)
Unique & data type features:
add method
Pushes a unique value into the collection and preserves its data type; duplicate items are simply not added, and the data type is checked by default.
has method
Sometimes you need to check whether an item exists in the collection; this is a handy method for checking for a unique id or item, data type included.
delete method
Removes a specific item from the collection, identifying it by value and data type.
clear method
Removes all items from the collection, leaving it empty.
The Set object also has iteration methods and more features.
Better read from here: Set - JavaScript | MDN
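A minimal sketch of those four methods (the values are mine):
const seen = new Set();
seen.add('there');              // add a value
seen.add('there');              // duplicate: ignored, size stays 1
console.log(seen.has('there')); // true
seen.add(1);
console.log(seen.has('1'));     // false: the string '1' and the number 1 differ
seen.delete('there');           // remove a specific value
seen.clear();                   // remove everything
console.log(seen.size);         // 0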
removeDuplicates() takes in an array of objects and returns a new array without any duplicate objects (based on the id property).
const allTests = [
{name: 'Test1', id: '1'},
{name: 'Test3', id: '3'},
{name: 'Test2', id: '2'},
{name: 'Test2', id: '2'},
{name: 'Test3', id: '3'}
];
function removeDuplicates(array) {
let uniq = {};
return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true))
}
removeDuplicates(allTests);
Expected outcome:
[
{name: 'Test1', id: '1'},
{name: 'Test3', id: '3'},
{name: 'Test2', id: '2'}
];
First, we set the value of variable uniq to an empty object.
Next, we filter through the array of objects. Filter creates a new array with all elements that pass the test implemented by the provided function.
return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
Above, we use the short-circuiting functionality of &&. If the left side of the && evaluates to true, then it returns the value on the right of the &&. If the left side is false, it returns what is on the left side of the &&.
For each object (obj) we check uniq for a property whose name is the value of obj.id (on the first iteration, that would be the property '1'). We want the opposite of what it returns (either true or false), which is why we use the ! in !uniq[obj.id]. If uniq already has that id property, it returns true, which the ! turns into false, telling filter NOT to add that obj. However, if it does not find the obj.id property, it returns false, which the ! turns into true, so everything to the right of the && is returned: (uniq[obj.id] = true). This is a truthy value, telling the filter method to add that obj to the returned array, and it also adds the property {1: true} to uniq. This ensures that any other obj instance with that same id will not be added again.
A fast (low-runtime) and type-safe answer for lazy TypeScript developers:
export const uniqueBy = <T>( uniqueKey: keyof T, objects: T[]): T[] => {
const ids = objects.map(object => object[uniqueKey]);
return objects.filter((object, index) => !ids.includes(object[uniqueKey], index + 1));
}
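A usage sketch:
const people = [{id: 1, name: 'one'}, {id: 2, name: 'two'}, {id: 1, name: 'uno'}];
console.log(uniqueBy('id', people));
// [{id: 2, name: 'two'}, {id: 1, name: 'uno'}] (keeps the last occurrence of each id)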
This way works well for me:
function arrayUnique(arr, uniqueKey) {
const flagList = new Set()
return arr.filter(function(item) {
if (!flagList.has(item[uniqueKey])) {
flagList.add(item[uniqueKey])
return true
}
})
}
const data = [
{
name: 'Kyle',
occupation: 'Fashion Designer'
},
{
name: 'Kyle',
occupation: 'Fashion Designer'
},
{
name: 'Emily',
occupation: 'Web Designer'
},
{
name: 'Melissa',
occupation: 'Fashion Designer'
},
{
name: 'Tom',
occupation: 'Web Developer'
},
{
name: 'Tom',
occupation: 'Web Developer'
}
]
console.table(arrayUnique(data, 'name')) // works well
Printout:
┌─────────┬───────────┬────────────────────┐
│ (index) │ name │ occupation │
├─────────┼───────────┼────────────────────┤
│ 0 │ 'Kyle' │ 'Fashion Designer' │
│ 1 │ 'Emily' │ 'Web Designer' │
│ 2 │ 'Melissa' │ 'Fashion Designer' │
│ 3 │ 'Tom' │ 'Web Developer' │
└─────────┴───────────┴────────────────────┘
ES5:
function arrayUnique(arr, uniqueKey) {
const flagList = []
return arr.filter(function(item) {
if (flagList.indexOf(item[uniqueKey]) === -1) {
flagList.push(item[uniqueKey])
return true
}
})
}
These two ways are simpler and more understandable.
Here is a solution for ES6 where you only want to keep the last item. This solution is functional and Airbnb style compliant.
const things = {
thing: [
{ place: 'here', name: 'stuff' },
{ place: 'there', name: 'morestuff1' },
{ place: 'there', name: 'morestuff2' },
],
};
const removeDuplicates = (array, key) => {
return array.reduce((arr, item) => {
const removed = arr.filter(i => i[key] !== item[key]);
return [...removed, item];
}, []);
};
console.log(removeDuplicates(things.thing, 'place'));
// > [{ place: 'here', name: 'stuff' }, { place: 'there', name: 'morestuff2' }]
I know there are a ton of answers to this question already, but bear with me...
Some of the objects in your array may have additional properties that you are not interested in, or you simply want to find the unique objects considering only a subset of the properties.
Consider the array below. Say you want to find the unique objects in this array considering only propOne and propTwo, and ignore any other properties that may be there.
The expected result should include only the first and last objects. So here goes the code:
const array = [{
propOne: 'a',
propTwo: 'b',
propThree: 'I have no part in this...'
},
{
propOne: 'a',
propTwo: 'b',
someOtherProperty: 'no one cares about this...'
},
{
propOne: 'x',
propTwo: 'y',
yetAnotherJunk: 'I am valueless really',
noOneHasThis: 'I have something no one has'
}];
const uniques = [...new Set(
array.map(x => JSON.stringify(((o) => ({
propOne: o.propOne,
propTwo: o.propTwo
}))(x))))
].map(JSON.parse);
console.log(uniques);
Another option would be to create a custom indexOf function, which compares the values of your chosen property for each object, and to wrap this in a reduce:
var uniq = redundant_array.reduce(function(a,b){
function indexOfProperty (a, b){
for (var i=0;i<a.length;i++){
if(a[i].property == b.property){
return i;
}
}
return -1;
}
if (indexOfProperty(a,b) < 0 ) a.push(b);
return a;
},[]);
Here is a simple solution for removing duplicates from an array of objects using the reduce method, filtering elements based on the position key of each object:
const med = [
{name: 'name1', position: 'left'},
{name: 'name2', position: 'right'},
{name: 'name3', position: 'left'},
{name: 'name4', position: 'right'},
{name: 'name5', position: 'left'},
{name: 'name6', position: 'left1'}
]
const arr = [];
med.reduce((acc, curr) => {
if(acc.indexOf(curr.position) === -1) {
acc.push(curr.position);
arr.push(curr);
}
return acc;
}, [])
console.log(arr)
If the array contains objects, you can use this to remove duplicates:
const persons= [
{ id: 1, name: 'John',phone:'23' },
{ id: 2, name: 'Jane',phone:'23'},
{ id: 1, name: 'Johnny',phone:'56' },
{ id: 4, name: 'Alice',phone:'67' },
];
const unique = [...new Map(persons.map((m) => [m.id, m])).values()];
To remove duplicates on the basis of phone instead, just replace m.id with m.phone:
const unique = [...new Map(persons.map((m) => [m.phone, m])).values()];

Can I add or edit object in array by field [duplicate]


How prevent duplicate items in javascript array.push [duplicate]

I have an object that contains an array of objects.
obj = {};
obj.arr = new Array();
obj.arr.push({place:"here",name:"stuff"});
obj.arr.push({place:"there",name:"morestuff"});
obj.arr.push({place:"there",name:"morestuff"});
I'm wondering what is the best method to remove duplicate objects from an array. So for example, obj.arr would become...
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"}
How about with some es6 magic?
obj.arr = obj.arr.filter((value, index, self) =>
index === self.findIndex((t) => (
t.place === value.place && t.name === value.name
))
)
Reference URL
A more generic solution would be:
const uniqueArray = obj.arr.filter((value, index) => {
const _value = JSON.stringify(value);
return index === obj.arr.findIndex(obj => {
return JSON.stringify(obj) === _value;
});
});
Using the above property strategy instead of JSON.stringify:
const isPropValuesEqual = (subject, target, propNames) =>
propNames.every(propName => subject[propName] === target[propName]);
const getUniqueItemsByProperties = (items, propNames) =>
items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNames))
);
You can add a wrapper if you want the propNames property to be either an array or a value:
const getUniqueItemsByProperties = (items, propNames) => {
const propNamesArray = Array.from(propNames);
return items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNamesArray))
);
};
allowing both getUniqueItemsByProperties('a') and getUniqueItemsByProperties(['a']);
Stackblitz Example
Explanation
Start by understanding the two methods used:
filter, findIndex
Next take your idea of what makes your two objects equal and keep that in mind.
We can detect something as a duplicate, if it satisfies the criterion that we have just thought of, but it's position is not at the first instance of an object with the criterion.
Therefore we can use the above criterion to determine if something is a duplicate.
One liners with filter ( Preserves order )
Find unique id's in an array.
arr.filter((v,i,a)=>a.findIndex(v2=>(v2.id===v.id))===i)
If the order is not important, map solutions will be faster: Solution with map
Unique by multiple properties ( place and name )
arr.filter((v,i,a)=>a.findIndex(v2=>['place','name'].every(k=>v2[k] ===v[k]))===i)
Unique by all properties (This will be slow for large arrays)
arr.filter((v,i,a)=>a.findIndex(v2=>(JSON.stringify(v2) === JSON.stringify(v)))===i)
Keep the last occurrence by replacing findIndex with findLastIndex.
arr.filter((v,i,a)=>a.findLastIndex(v2=>(v2.place === v.place))===i)
Using ES6+ in a single line you can get a unique list of objects by key:
const key = 'place';
const unique = [...new Map(arr.map(item => [item[key], item])).values()]
It can be put into a function:
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
Here is a working example:
const arr = [
{place: "here", name: "x", other: "other stuff1" },
{place: "there", name: "x", other: "other stuff2" },
{place: "here", name: "y", other: "other stuff4" },
{place: "here", name: "z", other: "other stuff5" }
]
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
const arr1 = getUniqueListBy(arr, 'place')
console.log("Unique by place")
console.log(JSON.stringify(arr1))
console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')
console.log(JSON.stringify(arr2))
How does it work
First the array is remapped in a way that it can be used as an input for a Map.
arr.map(item => [item[key], item]);
which means each item of the array will be transformed in another array with 2 elements; the selected key as first element and the entire initial item as second element, this is called an entry (ex. array entries, map entries). And here is the official doc with an example showing how to add array entries in Map constructor.
Example when key is place:
[["here", {place: "here", name: "x", other: "other stuff1" }], ...]
Secondly, we pass this modified array to the Map constructor and here is the magic happening. Map will eliminate the duplicate keys values, keeping only last inserted value of the same key.
Note: Map keeps the order of insertion. (check difference between Map and object)
new Map(entry array just mapped above)
Third we use the map values to retrieve the original items, but this time without duplicates.
new Map(mappedArr).values()
And last one is to add those values into a fresh new array so that it can look as the initial structure and return that:
return [...new Map(mappedArr).values()]
Simple and performant solution with a better runtime than the 70+ answers that already exist:
const ids = array.map(o => o.id)
const filtered = array.filter(({id}, index) => !ids.includes(id, index + 1))
Example:
const arr = [{id: 1, name: 'one'}, {id: 2, name: 'two'}, {id: 1, name: 'one'}]
const ids = arr.map(o => o.id)
const filtered = arr.filter(({id}, index) => !ids.includes(id, index + 1))
console.log(filtered)
How it works:
Array.filter() removes all duplicate objects by checking if the previously mapped id-array includes the current id ({id} destructs the object into only its id). To only filter out actual duplicates, it is using Array.includes()'s second parameter fromIndex with index + 1 which will ignore the current object and all previous.
Since every iteration of the filter callback method will only search the array beginning at the current index + 1, this also dramatically reduces the runtime because only objects not previously filtered get checked.
This obviously also works for any other key that is not called id, multiple or even all keys.
A primitive method would be:
const obj = {};
for (let i = 0, len = things.thing.length; i < len; i++) {
obj[things.thing[i]['place']] = things.thing[i];
}
things.thing = new Array();
for (const key in obj) {
things.thing.push(obj[key]);
}
If you can use Javascript libraries such as underscore or lodash, I recommend having a look at _.uniq function in their libraries. From lodash:
_.uniq(array, [isSorted=false], [callback=_.identity], [thisArg])
Basically, you pass in the array that in here is an object literal and you pass in the attribute that you want to remove duplicates with in the original data array, like this:
var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplidated_data = _.uniq(data, 'name');
UPDATE: Lodash now has introduced a .uniqBy as well.
I had this exact same requirement, to remove duplicate objects in a array, based on duplicates on a single field. I found the code here: Javascript: Remove Duplicates from Array of Objects
So in my example, I'm removing any object from the array that has a duplicate licenseNum string value.
var arrayWithDuplicates = [
{"type":"LICENSE", "licenseNum": "12345", state:"NV"},
{"type":"LICENSE", "licenseNum": "A7846", state:"CA"},
{"type":"LICENSE", "licenseNum": "12345", state:"OR"},
{"type":"LICENSE", "licenseNum": "10849", state:"CA"},
{"type":"LICENSE", "licenseNum": "B7037", state:"WA"},
{"type":"LICENSE", "licenseNum": "12345", state:"NM"}
];
function removeDuplicates(originalArray, prop) {
var newArray = [];
var lookupObject = {};
for(var i in originalArray) {
lookupObject[originalArray[i][prop]] = originalArray[i];
}
for(i in lookupObject) {
newArray.push(lookupObject[i]);
}
return newArray;
}
var uniqueArray = removeDuplicates(arrayWithDuplicates, "licenseNum");
console.log("uniqueArray is: " + JSON.stringify(uniqueArray));
The results:
uniqueArray is:
[{"type":"LICENSE","licenseNum":"10849","state":"CA"},
{"type":"LICENSE","licenseNum":"12345","state":"NM"},
{"type":"LICENSE","licenseNum":"A7846","state":"CA"},
{"type":"LICENSE","licenseNum":"B7037","state":"WA"}]
One liner using Set
var things = new Object();
things.thing = new Array();
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
// assign things.thing to myData for brevity
var myData = things.thing;
things.thing = Array.from(new Set(myData.map(JSON.stringify))).map(JSON.parse);
console.log(things.thing)
Explanation:
new Set(myData.map(JSON.stringify)) creates a Set object using the stringified myData elements.
Set object will ensure that every element is unique.
Then I create an array based on the elements of the created set using Array.from.
Finally, I use JSON.parse to convert stringified element back to an object.
ES6 one liner is here
let arr = [
{id:1,name:"sravan ganji"},
{id:2,name:"pinky"},
{id:4,name:"mammu"},
{id:3,name:"avy"},
{id:3,name:"rashni"},
];
console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))
To remove all duplicates from an array of objects, the simplest way is use filter:
var uniq = {};
var arr = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);
One liners with Map ( High performance, Does not preserve order )
Find unique id's in array arr.
const arrUniq = [...new Map(arr.map(v => [v.id, v])).values()]
If the order is important check out the solution with filter: Solution with filter
Unique by multiple properties ( place and name ) in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify([v.place,v.name]), v])).values()]
Unique by all properties in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify(v), v])).values()]
Keep the first occurrence in array arr
const arrUniq = [...new Map(arr.slice().reverse().map(v => [v.id, v])).values()].reverse()
Here's another option to do it using Array iterating methods if you need comparison only by one field of an object:
function uniq(a, param){
return a.filter(function(item, pos, array){
return array.map(function(mapItem){ return mapItem[param]; }).indexOf(item[param]) === pos;
})
}
uniq(things.thing, 'place');
This is a generic way of doing this: you pass in a function that tests whether two elements of an array are considered equal. In this case, it compares the values of the name and place properties of the two objects being compared.
ES5 answer
function removeDuplicates(arr, equals) {
var originalArr = arr.slice(0);
var i, len, val;
arr.length = 0;
for (i = 0, len = originalArr.length; i < len; ++i) {
val = originalArr[i];
if (!arr.some(function(item) { return equals(item, val); })) {
arr.push(val);
}
}
}
function thingsEqual(thing1, thing2) {
return thing1.place === thing2.place
&& thing1.name === thing2.name;
}
var things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
removeDuplicates(things, thingsEqual);
console.log(things);
Original ES3 answer
function arrayContains(arr, val, equals) {
var i = arr.length;
while (i--) {
if ( equals(arr[i], val) ) {
return true;
}
}
return false;
}
function removeDuplicates(arr, equals) {
var originalArr = arr.slice(0);
var i, len, j, val;
arr.length = 0;
for (i = 0, len = originalArr.length; i < len; ++i) {
val = originalArr[i];
if (!arrayContains(arr, val, equals)) {
arr.push(val);
}
}
}
function thingsEqual(thing1, thing2) {
return thing1.place === thing2.place
&& thing1.name === thing2.name;
}
removeDuplicates(things.thing, thingsEqual);
If you can wait to eliminate the duplicates until after all the additions, the typical approach is to first sort the array and then eliminate duplicates. The sorting avoids the N * N approach of scanning the array for each element as you walk through them.
The "eliminate duplicates" function is usually called unique or uniq. Some existing implementations may combine the two steps, e.g., prototype's uniq
This post has few ideas to try (and some to avoid :-) ) if your library doesn't already have one! Personally I find this one the most straight forward:
function unique(a){
a.sort();
for(var i = 1; i < a.length; ){
if(a[i-1] == a[i]){
a.splice(i, 1);
} else {
i++;
}
}
return a;
}
// Provide your own comparison
function unique(a, compareFunc){
a.sort( compareFunc );
for(var i = 1; i < a.length; ){
if( compareFunc(a[i-1], a[i]) === 0){
a.splice(i, 1);
} else {
i++;
}
}
return a;
}
I think the best approach is using reduce and Map object. This is a single line solution.
const data = [
{id: 1, name: 'David'},
{id: 2, name: 'Mark'},
{id: 2, name: 'Lora'},
{id: 4, name: 'Tyler'},
{id: 4, name: 'Donald'},
{id: 5, name: 'Adrian'},
{id: 6, name: 'Michael'}
]
const uniqueData = [...data.reduce((map, obj) => map.set(obj.id, obj), new Map()).values()];
console.log(uniqueData)
/*
in `map.set(obj.id, obj)`
'obj.id' is key. (don't worry. we'll get only values using the .values() method)
'obj' is whole object.
*/
To add one more to the list. Using ES6 and Array.reduce with Array.find.
In this example filtering objects based on a guid property.
let filtered = array.reduce((accumulator, current) => {
if (! accumulator.find(({guid}) => guid === current.guid)) {
accumulator.push(current);
}
return accumulator;
}, []);
Extending this one to allow selection of a property and compress it into a one liner:
const uniqify = (array, key) => array.reduce((prev, curr) => prev.find(a => a[key] === curr[key]) ? prev : prev.push(curr) && prev, []);
To use it pass an array of objects and the name of the key you wish to de-dupe on as a string value:
const result = uniqify(myArrayOfObjects, 'guid')
Considering lodash.uniqWith
const objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
_.uniqWith(objects, _.isEqual);
// => [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }]
You could also use a Map:
const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());
Full sample:
const things = new Object();
things.thing = new Array();
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());
console.log(JSON.stringify(dedupThings, null, 4));
Result:
[
{
"place": "here",
"name": "stuff"
},
{
"place": "there",
"name": "morestuff"
}
]
Dang, kids, let's crush this thing down, why don't we?
let uniqIds = {}, source = [{id:'a'},{id:'b'},{id:'c'},{id:'b'},{id:'a'},{id:'d'}];
let filtered = source.filter(obj => !uniqIds[obj.id] && (uniqIds[obj.id] = true));
console.log(filtered);
// EXPECTED: [{id:'a'},{id:'b'},{id:'c'},{id:'d'}];
let myData = [{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}];
let q = [...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];
console.log(q)
One-liner using ES6 and new Map().
// assign things.thing to myData
let myData = things.thing;
[...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];
Details:
Calling .map() on the data list converts each object into a [key, value] entry (an array of length 2): the first element (key) is the stringified version of the object and the second (value) is the object itself.
Passing the resulting entry list to new Map() keys the map by the stringified object; adding the same key again overrides the already existing entry.
Calling .values() gives a MapIterator over all values in the Map (the objects, in our case).
Finally, the spread ... operator builds a new array from the values of the step above.
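The same one-liner unrolled into named steps, if that helps to see each stage (a sketch over the question's data):
let myData = [{place: "here", name: "stuff"},
    {place: "there", name: "morestuff"},
    {place: "there", name: "morestuff"}];
// Step 1: build [key, value] entries, keyed by the stringified object.
let entries = myData.map(obj => [JSON.stringify(obj), obj]);
// Step 2: Map keeps one value per key; a later duplicate overwrites the earlier one.
let map = new Map(entries);
// Steps 3 and 4: pull the surviving values out and spread them into a new array.
let deduped = [...map.values()];
console.log(deduped);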
A TypeScript solution
This will remove duplicate objects. Note that the JSON round trip only suits plain, JSON-serialisable data, and the any[] signature means static type information is not actually carried through.
function removeDuplicateObjects(array: any[]) {
return [...new Set(array.map(s => JSON.stringify(s)))]
.map(s => JSON.parse(s));
}
const things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
const filteredArr = things.reduce((thing, current) => {
const x = thing.find(item => item.place === current.place);
if (!x) {
return thing.concat([current]);
} else {
return thing;
}
}, []);
console.log(filteredArr)
Solution via the Set object | according to the data type
const seen = new Set();
const things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
const filteredArr = things.filter(el => {
const duplicate = seen.has(el.place);
seen.add(el.place);
return !duplicate;
});
console.log(filteredArr)
Set object features
Every value in a Set has to be unique; value equality is checked.
The purpose of the Set object is to store unique values, whether primitive values or object references, and it has four very useful instance methods: add, clear, has and delete.
add method
Pushes a value into the collection, silently skipping values that are already present. The equality check is type-aware, so the string '1' and the number 1 are different values.
has method
Checks whether a value already exists in the collection; handy for testing whether a unique id or item is already there.
delete method
Removes a specific value from the collection.
clear method
Removes all values from the collection, leaving it empty.
Set objects also have iteration methods and more features.
Better read from here: Set - JavaScript | MDN
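A quick sketch of those four instance methods on a Set of primitive values:
const seen = new Set();
seen.add('here');                // adds the value
seen.add('here');                // ignored: 'here' is already in the set
console.log(seen.has('here'));   // true
console.log(seen.has('there'));  // false
seen.delete('here');             // removes the value, returns true
seen.clear();                    // empties the set
console.log(seen.size);          // 0
Note the type-aware equality: seen.add('1') and seen.add(1) would produce two separate entries, and two object literals with identical contents are also two distinct entries, which is why the answer above tracks el.place (a string) rather than the objects themselves.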
removeDuplicates() takes in an array of objects and returns a new array without any duplicate objects (based on the id property).
const allTests = [
{name: 'Test1', id: '1'},
{name: 'Test3', id: '3'},
{name: 'Test2', id: '2'},
{name: 'Test2', id: '2'},
{name: 'Test3', id: '3'}
];
function removeDuplicates(array) {
let uniq = {};
return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true))
}
removeDuplicates(allTests);
Expected outcome:
[
{name: 'Test1', id: '1'},
{name: 'Test3', id: '3'},
{name: 'Test2', id: '2'}
];
First, we set the value of variable uniq to an empty object.
Next, we filter through the array of objects. Filter creates a new array with all elements that pass the test implemented by the provided function.
return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
Above, we use the short-circuiting behaviour of &&. If the left side of the && is truthy, the expression evaluates to the value on the right of the &&; if the left side is falsy, it evaluates to the left side itself.
For each object (obj) we check uniq for a property named after the value of obj.id (on the first iteration, the property '1'). We want the opposite of that lookup, which is why we use the ! in !uniq[obj.id]. If uniq already has the id as a property, the lookup is true, !true is false, and the filter drops that obj. If the id property is not there yet, the lookup is undefined (falsy), !undefined is true, and the expression falls through to the right side of the &&: (uniq[obj.id] = true). The assignment evaluates to true, a truthy value, so the filter keeps that obj, and uniq gains the property (e.g. {1: true}). This ensures that any other obj instance with the same id will not be added again.
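A short runnable trace of the first two iterations, if that helps (illustrative only):
let uniq = {};
const first = !uniq['1'] && (uniq['1'] = true);
console.log(first, uniq);  // true { '1': true }  -> the first object with id '1' is kept
const second = !uniq['1'] && (uniq['1'] = true);
console.log(second, uniq); // false { '1': true } -> the duplicate is dropped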
A short and type-safe answer for lazy TypeScript developers:
export const uniqueBy = <T>( uniqueKey: keyof T, objects: T[]): T[] => {
const ids = objects.map(object => object[uniqueKey]);
return objects.filter((object, index) => !ids.includes(object[uniqueKey], index + 1));
}
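Note the argument order (key first, array second) and that this keeps the last occurrence of each key, since an object survives only if no later object shares its key. A usage sketch with illustrative data:
const users = [{id: 1, name: 'a'}, {id: 2, name: 'b'}, {id: 1, name: 'c'}];
const result = uniqueBy('id', users);
// result: [{id: 2, name: 'b'}, {id: 1, name: 'c'}]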
This way works well for me:
function arrayUnique(arr, uniqueKey) {
const flagList = new Set()
return arr.filter(function(item) {
if (!flagList.has(item[uniqueKey])) {
flagList.add(item[uniqueKey])
return true
}
})
}
const data = [
{
name: 'Kyle',
occupation: 'Fashion Designer'
},
{
name: 'Kyle',
occupation: 'Fashion Designer'
},
{
name: 'Emily',
occupation: 'Web Designer'
},
{
name: 'Melissa',
occupation: 'Fashion Designer'
},
{
name: 'Tom',
occupation: 'Web Developer'
},
{
name: 'Tom',
occupation: 'Web Developer'
}
]
console.table(arrayUnique(data, 'name')) // works well
Printout:
┌─────────┬───────────┬────────────────────┐
│ (index) │ name │ occupation │
├─────────┼───────────┼────────────────────┤
│ 0 │ 'Kyle' │ 'Fashion Designer' │
│ 1 │ 'Emily' │ 'Web Designer' │
│ 2 │ 'Melissa' │ 'Fashion Designer' │
│ 3 │ 'Tom' │ 'Web Developer' │
└─────────┴───────────┴────────────────────┘
ES5:
function arrayUnique(arr, uniqueKey) {
const flagList = []
return arr.filter(function(item) {
if (flagList.indexOf(item[uniqueKey]) === -1) {
flagList.push(item[uniqueKey])
return true
}
})
}
These two ways are simpler and more understandable.
Here is an ES6 solution for when you only want to keep the last occurrence. This solution is functional and Airbnb-style compliant.
const things = {
thing: [
{ place: 'here', name: 'stuff' },
{ place: 'there', name: 'morestuff1' },
{ place: 'there', name: 'morestuff2' },
],
};
const removeDuplicates = (array, key) => {
return array.reduce((arr, item) => {
const removed = arr.filter(i => i[key] !== item[key]);
return [...removed, item];
}, []);
};
console.log(removeDuplicates(things.thing, 'place'));
// > [{ place: 'here', name: 'stuff' }, { place: 'there', name: 'morestuff2' }]
I know there is a ton of answers in this question already, but bear with me...
Some of the objects in your array may have additional properties that you are not interested in, or you simply want to find the unique objects considering only a subset of the properties.
Consider the array below. Say you want to find the unique objects in this array considering only propOne and propTwo, and ignore any other properties that may be there.
The expected result should include only the first and last objects. So here goes the code:
const array = [{
propOne: 'a',
propTwo: 'b',
propThree: 'I have no part in this...'
},
{
propOne: 'a',
propTwo: 'b',
someOtherProperty: 'no one cares about this...'
},
{
propOne: 'x',
propTwo: 'y',
yetAnotherJunk: 'I am valueless really',
noOneHasThis: 'I have something no one has'
}];
const uniques = [...new Set(
array.map(x => JSON.stringify(((o) => ({
propOne: o.propOne,
propTwo: o.propTwo
}))(x))))
].map(JSON.parse);
console.log(uniques);
Another option would be to create a custom indexOf function, which compares the values of your chosen property for each object and wrap this in a reduce function.
var uniq = redundant_array.reduce(function(a,b){
function indexOfProperty (a, b){
for (var i=0;i<a.length;i++){
if(a[i].property == b.property){
return i;
}
}
return -1;
}
if (indexOfProperty(a,b) < 0 ) a.push(b);
return a;
},[]);
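Note that property above is a literal key name standing in for your own field. The same idea parameterised over the key, using Array.prototype.some instead of the hand-rolled indexOf (a sketch):
function uniqByKey(arr, key) {
    return arr.reduce(function (acc, item) {
        // Keep the item only if no already-accepted item shares the key's value.
        var exists = acc.some(function (other) {
            return other[key] === item[key];
        });
        if (!exists) acc.push(item);
        return acc;
    }, []);
}
console.log(uniqByKey([{id: 1}, {id: 2}, {id: 1}], 'id'));
// [{id: 1}, {id: 2}]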
Here is a simple solution for removing duplicates from an array of objects using the reduce method. I am filtering elements based on the position key of each object.
const med = [
{name: 'name1', position: 'left'},
{name: 'name2', position: 'right'},
{name: 'name3', position: 'left'},
{name: 'name4', position: 'right'},
{name: 'name5', position: 'left'},
{name: 'name6', position: 'left1'}
]
const arr = [];
med.reduce((acc, curr) => {
if(acc.indexOf(curr.position) === -1) {
acc.push(curr.position);
arr.push(curr);
}
return acc;
}, [])
console.log(arr)
If the array contains objects, then you can use this to remove duplicates:
const persons= [
{ id: 1, name: 'John',phone:'23' },
{ id: 2, name: 'Jane',phone:'23'},
{ id: 1, name: 'Johnny',phone:'56' },
{ id: 4, name: 'Alice',phone:'67' },
];
const unique = [...new Map(persons.map((m) => [m.id, m])).values()];
To remove duplicates on the basis of phone instead, just replace m.id with m.phone:
const unique = [...new Map(persons.map((m) => [m.phone, m])).values()];

Removing / Filtering Duplicates Array - Angular [duplicate]

I have an object that contains an array of objects.
obj = {};
obj.arr = new Array();
obj.arr.push({place:"here",name:"stuff"});
obj.arr.push({place:"there",name:"morestuff"});
obj.arr.push({place:"there",name:"morestuff"});
I'm wondering what is the best method to remove duplicate objects from an array. So for example, obj.arr would become...
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"}
How about with some es6 magic?
obj.arr = obj.arr.filter((value, index, self) =>
index === self.findIndex((t) => (
t.place === value.place && t.name === value.name
))
)
A more generic solution would be:
const uniqueArray = obj.arr.filter((value, index) => {
const _value = JSON.stringify(value);
return index === obj.arr.findIndex(obj => {
return JSON.stringify(obj) === _value;
});
});
Using the above property strategy instead of JSON.stringify:
const isPropValuesEqual = (subject, target, propNames) =>
propNames.every(propName => subject[propName] === target[propName]);
const getUniqueItemsByProperties = (items, propNames) =>
items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNames))
);
You can add a wrapper if you want the propNames property to be either an array or a value:
const getUniqueItemsByProperties = (items, propNames) => {
// Wrap a single property name; Array.from would split a string into characters.
const propNamesArray = Array.isArray(propNames) ? propNames : [propNames];
return items.filter((item, index, array) =>
index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNamesArray))
);
};
allowing both getUniqueItemsByProperties('a') and getUniqueItemsByProperties(['a']);
Explanation
Start by understanding the two methods used:
filter, findIndex
Next, take your idea of what makes two of your objects equal and keep that in mind.
We can flag something as a duplicate if it satisfies that criterion but its position is not the first instance of an object satisfying it.
Therefore we can use the above criterion to determine whether something is a duplicate.
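A usage sketch with the functions defined above, over the question's data:
const things = [
    {place: 'here', name: 'stuff'},
    {place: 'there', name: 'morestuff'},
    {place: 'there', name: 'morestuff'}
];
console.log(getUniqueItemsByProperties(things, ['place', 'name']));
// [{place: 'here', name: 'stuff'}, {place: 'there', name: 'morestuff'}]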
One liners with filter ( Preserves order )
Find unique id's in an array.
arr.filter((v,i,a)=>a.findIndex(v2=>(v2.id===v.id))===i)
If the order is not important, map solutions will be faster: Solution with map
Unique by multiple properties ( place and name )
arr.filter((v,i,a)=>a.findIndex(v2=>['place','name'].every(k=>v2[k] ===v[k]))===i)
Unique by all properties (This will be slow for large arrays)
arr.filter((v,i,a)=>a.findIndex(v2=>(JSON.stringify(v2) === JSON.stringify(v)))===i)
Keep the last occurrence by replacing findIndex with findLastIndex.
arr.filter((v,i,a)=>a.findLastIndex(v2=>(v2.place === v.place))===i)
Using ES6+ in a single line you can get a unique list of objects by key:
const key = 'place';
const unique = [...new Map(arr.map(item => [item[key], item])).values()]
It can be put into a function:
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
Here is a working example:
const arr = [
{place: "here", name: "x", other: "other stuff1" },
{place: "there", name: "x", other: "other stuff2" },
{place: "here", name: "y", other: "other stuff4" },
{place: "here", name: "z", other: "other stuff5" }
]
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
const arr1 = getUniqueListBy(arr, 'place')
console.log("Unique by place")
console.log(JSON.stringify(arr1))
console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')
console.log(JSON.stringify(arr2))
How does it work
First the array is remapped in a way that it can be used as an input for a Map.
arr.map(item => [item[key], item]);
which means each item of the array will be transformed into another array with 2 elements: the selected key as the first element and the entire initial item as the second. This is called an entry (e.g. array entries, map entries). And here is the official doc with an example showing how to add array entries in the Map constructor.
Example when key is place:
[["here", {place: "here", name: "x", other: "other stuff1" }], ...]
Secondly, we pass this modified array to the Map constructor, and this is where the magic happens. Map will eliminate duplicate keys, keeping only the last inserted value for each key.
Note: Map keeps the order of insertion. (Check the difference between Map and Object.)
new Map(entry array just mapped above)
Third, we use the map's values to retrieve the original items, but this time without duplicates.
new Map(mappedArr).values()
And the last step is to spread those values into a fresh new array, so that it matches the initial structure, and return that:
return [...new Map(mappedArr).values()]
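Putting those steps together with intermediate variables (a sketch over the example arr above, keyed by place):
const mappedArr = arr.map(item => [item.place, item]);
// [["here", {...}], ["there", {...}], ["here", {...}], ["here", {...}]]
const map = new Map(mappedArr);      // duplicate keys collapse; the last entry wins
const uniqueArr = [...map.values()]; // back to a plain array, now without duplicates
console.log(JSON.stringify(uniqueArr));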
Simple and performant solution with a better runtime than the 70+ answers that already exist:
const ids = array.map(o => o.id)
const filtered = array.filter(({id}, index) => !ids.includes(id, index + 1))
Example:
const arr = [{id: 1, name: 'one'}, {id: 2, name: 'two'}, {id: 1, name: 'one'}]
const ids = arr.map(o => o.id)
const filtered = arr.filter(({id}, index) => !ids.includes(id, index + 1))
console.log(filtered)
How it works:
Array.filter() removes all duplicate objects by checking whether the previously mapped id array includes the current id ({id} destructures the object into only its id). To only filter out actual duplicates, it uses Array.includes()'s second parameter fromIndex with index + 1, which ignores the current object and all previous ones.
Since every iteration of the filter callback method will only search the array beginning at the current index + 1, this also dramatically reduces the runtime because only objects not previously filtered get checked.
This obviously also works for any other key that is not called id, for multiple keys, or even for all keys, as sketched below.
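For instance, one way to cover the multiple-keys case is to build a composite string key first (a sketch; it assumes the key values stringify distinctly, hence the unlikely '\u0000' separator):
const uniqueByKeys = (array, keys) => {
    const composite = array.map(o => keys.map(k => o[k]).join('\u0000'));
    return array.filter((o, i) => !composite.includes(composite[i], i + 1));
};
console.log(uniqueByKeys([{a: 1, b: 2}, {a: 1, b: 3}, {a: 1, b: 2}], ['a', 'b']));
// [{a: 1, b: 3}, {a: 1, b: 2}] -- keeps the last occurrence, like the id version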
A primitive method would be:
const obj = {};
for (let i = 0, len = things.thing.length; i < len; i++) {
obj[things.thing[i]['place']] = things.thing[i];
}
things.thing = new Array();
for (const key in obj) {
things.thing.push(obj[key]);
}
If you can use JavaScript libraries such as Underscore or Lodash, I recommend having a look at the _.uniq function in their libraries. From Lodash:
_.uniq(array, [isSorted=false], [callback=_.identity], [thisArg])
Basically, you pass in the array that in here is an object literal and you pass in the attribute that you want to remove duplicates with in the original data array, like this:
var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplidated_data = _.uniq(data, 'name');
UPDATE: Lodash has since introduced _.uniqBy as well.
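With Lodash 4 the property shorthand lives on _.uniqBy, which keeps the first occurrence:
var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplicated_data = _.uniqBy(data, 'name');
// => [{'name': 'Amir', 'surname': 'Rahnama'}]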
I had this exact same requirement: to remove duplicate objects in an array, based on duplicates in a single field. I found the code here: Javascript: Remove Duplicates from Array of Objects
So in my example, I'm removing any object from the array that has a duplicate licenseNum string value.
var arrayWithDuplicates = [
{"type":"LICENSE", "licenseNum": "12345", state:"NV"},
{"type":"LICENSE", "licenseNum": "A7846", state:"CA"},
{"type":"LICENSE", "licenseNum": "12345", state:"OR"},
{"type":"LICENSE", "licenseNum": "10849", state:"CA"},
{"type":"LICENSE", "licenseNum": "B7037", state:"WA"},
{"type":"LICENSE", "licenseNum": "12345", state:"NM"}
];
function removeDuplicates(originalArray, prop) {
var newArray = [];
var lookupObject = {};
for(var i in originalArray) {
lookupObject[originalArray[i][prop]] = originalArray[i];
}
for(i in lookupObject) {
newArray.push(lookupObject[i]);
}
return newArray;
}
var uniqueArray = removeDuplicates(arrayWithDuplicates, "licenseNum");
console.log("uniqueArray is: " + JSON.stringify(uniqueArray));
The results:
uniqueArray is:
[{"type":"LICENSE","licenseNum":"10849","state":"CA"},
{"type":"LICENSE","licenseNum":"12345","state":"NM"},
{"type":"LICENSE","licenseNum":"A7846","state":"CA"},
{"type":"LICENSE","licenseNum":"B7037","state":"WA"}]
One liner using Set
var things = new Object();
things.thing = new Array();
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
// assign things.thing to myData for brevity
var myData = things.thing;
things.thing = Array.from(new Set(myData.map(JSON.stringify))).map(JSON.parse);
console.log(things.thing)
Explanation:
new Set(myData.map(JSON.stringify)) creates a Set object using the stringified myData elements.
Set object will ensure that every element is unique.
Then I create an array based on the elements of the created set using Array.from.
Finally, I use JSON.parse to convert stringified element back to an object.
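One caveat worth knowing: JSON.stringify is sensitive to property order, so two objects with the same contents but differently ordered keys survive as "different" values:
const a = JSON.stringify({place: "here", name: "stuff"});
const b = JSON.stringify({name: "stuff", place: "here"});
console.log(a === b); // false, so both objects would be kept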
An ES6 one-liner is here:
let arr = [
{id:1,name:"sravan ganji"},
{id:2,name:"pinky"},
{id:4,name:"mammu"},
{id:3,name:"avy"},
{id:3,name:"rashni"},
];
console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))
To remove all duplicates from an array of objects, the simplest way is to use filter:
var uniq = {};
var arr = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);
One liners with Map ( High performance, Does not preserve order )
Find unique id's in array arr.
const arrUniq = [...new Map(arr.map(v => [v.id, v])).values()]
If the order is important check out the solution with filter: Solution with filter
Unique by multiple properties ( place and name ) in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify([v.place,v.name]), v])).values()]
Unique by all properties in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify(v), v])).values()]
Keep the first occurrence in array arr
const arrUniq = [...new Map(arr.slice().reverse().map(v => [v.id, v])).values()].reverse()
Here's another option to do it using Array iterating methods if you need comparison only by one field of an object:
function uniq(a, param) {
    // Build the list of key values once, so each filter step is a cheap indexOf lookup.
    var keys = a.map(function (item) { return item[param]; });
    return a.filter(function (item, pos) {
        return keys.indexOf(item[param]) === pos;
    });
}
uniq(things.thing, 'place');

Turn array of comma delimited strings into distinct array

I would like to take an array of objects:
var objArr = [
{id:1, name:'test', seenby:'user1, user2, user3'},
{id:2, name:'test1', seenby:'user3, user4'},
{id:3, name:'test2', seenby:'user1, user3'},
{id:4, name:'test3', seenby:'user2'}
];
And return a distinct array of all 'seenby' users:
var seenByArr = ['user1', 'user2', 'user3', 'user4']
I can't figure out how to efficiently (fewest lines possible) turn this into an array of distinct values. Please check out my fiddle for an example: https://jsfiddle.net/ksumarine/ss3v7dgj/
I realize this question doesn't mention Underscore, but it's worth mentioning that Underscore.js is a very popular library for manipulating objects and arrays, and well-suited for this kind of thing.
Here's a big unreadable single line for achieving this with Underscore:
_.unique(_.flatten(_.map(_.pluck(objArr, 'seenby'), function(seenby) { return seenby.split(', '); })))
The laconic solution with the String.prototype.split and Array.prototype.forEach functions:
var arr = [];
objArr.forEach(function (obj) {
var str = obj.seenby.split(',');
str.forEach(function (v) {
var user = v.trim();
if (arr.indexOf(user) === -1) arr.push(user);
});
});
console.log(arr);
// the output:
["user1", "user2", "user3", "user4"]
You can try the following (jQuery; list is the source array and arr collects the distinct users):
var arr = [];
$.each(list, function(index, value) {
$.each(value.seenby.split(","), function(i, val) {
if (val !== undefined && val.trim() !== "" && arr.indexOf(val.trim()) == -1) {
arr.push(val.trim());
}
});
});
For reference - https://jsfiddle.net/ss3v7dgj/1/
A solution in plain JavaScript with an IIFE and a temporary object for the items.
var objArr = [{ id: 1, name: 'test', seenby: 'user1, user2, user3' }, { id: 2, name: 'test1', seenby: 'user3, user4' }, { id: 3, name: 'test2', seenby: 'user1, user3' }, { id: 4, name: 'test3', seenby: 'user2' }],
result = function (array) {
var obj = {}, r = [];
array.forEach(function (a) {
a.seenby.split(', ').forEach(function (b) {
obj[b] = obj[b] || r.push(b);
});
});
return r;
}(objArr);
document.write('<pre>' + JSON.stringify(result, 0, 4) + '</pre>');
given = function (x) {
return {
do: function (f) { x = f(x); return this },
return: function () { return x }
}
};
r = given(objArr)
.do(x => x.map(y => y.seenby))
.do(x => [].concat.apply([], x))
.do(x => x.join(', '))
.do(x => x.split(', '))
.do(x => x.filter((e, i) => x.indexOf(e) === i))
.do(x => x.sort())
.return();
I might find use for this someday...
I would use the Array.prototype.map function and de-duplicate the users in an associative array. So here is the interesting part, but you can also try it out here:
var list = [
{id:1, name:'test', seenby:'user1, user2, user3'},
{id:2, name:'test1', seenby:'user3, user4'},
{id:3, name:'test2', seenby:'user1, user3'},
{id:4, name:'test3', seenby:'user2'}
];
var arr = {};
list.map(function(value){
value.seenby.split(",").map(function(val){
arr[val.trim()] = 1;
});
});
var seenby = Object.keys(arr);
How does it work? The outer map iterates over all the list items, and the inner map iterates over all the seenby users of each item. We then collect the users in arr, de-duplicating by using it as an associative array.
At the end we extract all the associative array keys by using the Object.keys function.
A minor caveat is that this solution will only work if the map function exists; if it doesn't, you would need to implement it yourself. But usually you are using a new enough JavaScript version that supports it. This solution also has the nice side effect that you don't really need any external dependency (except, of course, for displaying it).
UPDATE:
I think it's actually better to do it the following way:
var arr = list
    .reduce(function (acc, value) {
        return acc.concat(value.seenby.split(',').map(function (val) {
            return val.trim();
        }));
    }, [])
    .filter(function (item, index, array) {
        return array.indexOf(item) === index;
    });
Explanation:
we take the list and for each item, we reduce it ...
from each item, we extract the seenby field
we split the field
we map each item to trim it
... we take the array of seenby values returned for each item and concatenate it onto our accumulator array arr (which is initialized as an empty array [])
then we filter out the duplicates from the array (by checking whether the first index of an item is the same as the current index)
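With more recent JavaScript the whole pipeline collapses into flatMap plus a Set (a different technique from the answers above, shown for comparison):
const seenByArr = [...new Set(
    objArr.flatMap(o => o.seenby.split(',').map(s => s.trim()))
)];
console.log(seenByArr); // ['user1', 'user2', 'user3', 'user4']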
