A bit of a different use case from the ones suggested above.
I need to loop through and check each file name within an array of files and push the files that have the same name into a new array so that I can upload them later separately.
This is my code so far, and I am sure there is a problem with my conditional check. Can somebody see what I am doing wrong?
filesForStorage = [
{id: 12323, name: 'name', ...},
{id: 3123, name: 'abc', ...},
{id: 3213, name: 'name', ...},
...
]
filesForStorage.map((image, index) => {
  for (let i = 0; i < filesForStorage.length; i++) {
    for (let j = 0; j < filesForStorage.length; j++) {
      if (
        // .split('.', 1) is so the file extension is not taken into consideration
        filesForStorage[i].name.split(".", 1) ===
        filesForStorage[j].name.split(".", 1)
      ) {
        console.log(
          "----FILES HAVE THE SAME NAME " +
            filesForStorage[i] +
            " " +
            filesForStorage[j]
        );
      }
    }
  }
});
Using map without returning anything makes it near on pointless. You could use forEach, but that is equally pointless when you put a double loop inside it: you would be looping once in the forEach (or map, in your case) and then twice more within, making for eye-wateringly bad performance.
What you're really trying to do is group your items by name and then pick any group with more than one element:
const filesForStorage = [
{id: 12323, name: 'name'},
{id: 3123, name: 'abc'},
{id: 3213, name: 'name'}
]
const grouped = Object.values(
filesForStorage.reduce( (a,i) => {
a[i.name] = a[i.name] || [];
a[i.name].push(i);
return a;
},{})
);
console.log(grouped.filter(x => x.length>1).flat());
JavaScript has several functions which perform "hidden" iteration.
Object.values will iterate through an object of key-value pairs and collect all values in an array
Array.prototype.reduce will iterate through an array and perform a computation for each element and finally return a single value
Array.prototype.filter will iterate through an array and collect all elements that return true for a specified test
Array.prototype.flat will iterate through an array, concatenating each element to the next, to create a new flattened array
All of these methods are wasteful here, as you can compute a collection of duplicates in a single pass over the input array. Furthermore, per-element lookups on arrays are O(n) at best, compared to the O(1) lookups of Set or Map, making the choice of arrays for this kind of computation eye-wateringly bad -
function* duplicates (files) {
const seen = new Set()
for (const f of files) {
if (seen.has(f.name))
yield f
else
seen.add(f.name)
}
}
const filesForStorage = [
{id: 12323, name: 'foo'},
{id: 3123, name: 'abc'},
{id: 3213, name: 'foo'},
{id: 4432, name: 'bar'},
{id: 5213, name: 'qux'},
{id: 5512, name: 'bar'},
]
for (const d of duplicates(filesForStorage))
console.log("duplicate name found", d)
duplicate name found {
"id": 3213,
"name": "foo"
}
duplicate name found {
"id": 5512,
"name": "bar"
}
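If you need the duplicates collected into an array (e.g. to upload them separately, as the original question describes), spreading the generator is enough. A minimal usage sketch:
const dupes = [...duplicates(filesForStorage)]
console.log(dupes) // [{ id: 3213, name: "foo" }, { id: 5512, name: "bar" }]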
A nested loop can be very expensive on performance, especially if your array will have a lot of values. Something like this would be much better.
filesForStorage = [
{ id: 12323, name: 'name' },
{ id: 3123, name: 'abc' },
{ id: 3213, name: 'name' },
{ id: 3123, name: 'abc' },
{ id: 3213, name: 'name' },
{ id: 3123, name: 'random' },
{ id: 3213, name: 'nothing' },
]
function sameName() {
let checkerObj = {};
let newArray = [];
filesForStorage.forEach(file => {
checkerObj[file.name] = (checkerObj[file.name] || 0) + 1;
});
Object.entries(checkerObj).forEach(([key, value]) => {
if (value > 1) {
newArray.push(key);
}
});
console.log(newArray);
}
sameName();
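If you need the duplicate file objects themselves rather than just their names (which is what the original question asks for), a small variation on the same counting idea works. A sketch, not part of the original answer:
function sameNameFiles(files) {
  const counts = {};
  files.forEach(file => {
    counts[file.name] = (counts[file.name] || 0) + 1;
  });
  // keep every file whose name occurs more than once
  return files.filter(file => counts[file.name] > 1);
}
console.log(sameNameFiles(filesForStorage));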
Related
I have an array like this:
arr = [ { id: 0, name: 'Mark' }, { id: 1, name: 'Ron' }, { id: 2, name: 'Henry' }, { id: 3, name: 'Rose' } ]
I want to create an object like this:
obj1 = { Mark: false, Ron: false, Henry: false, Rose: false }
I am using map to traverse the array like this:
let obj1 = {};
obj1 = arr.map((item)=> {
obj1[item.name] = false;
})
How can I achieve the result above?
You could map entries and build an object from the pairs.
const
data = [{ id: 0, name: 'Mark' }, { id: 1, name: 'Ron' }, { id: 2, name: 'Henry' }, { id: 3, name: 'Rose' }],
result = Object.fromEntries(data.map(({ name }) => [name, false]));
console.log(result);
Object.fromEntries() is probably the best idea, but you could also use reduce if you have more operations on the array and want to stick to the "pipe" approach.
const arr = [
{ id: 0, name: 'Mark' },
{ id: 1, name: 'Ron' },
{ id: 2, name: 'Henry' },
{ id: 3, name: 'Rose' }
];
const objA = arr
.reduce((previous, { name }) => ({ ...previous, [name]: false }), {});
const objB = arr
.reduce((previous, { name }) => {
previous[name] = false;
return previous;
}, {});
The spread operation {...previous} for objA effectively copies the object on each extension, which might not be desirable, although modern JavaScript engines can optimize such expressions.
objB is the more standard approach for me. One additional benefit, compared to Object.fromEntries(), is that you can have some form of standard or default object / settings, which you can use as the start value of the reduce (the second parameter) and extend in the reducer function.
All three options are valid and depend on your code style.
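To illustrate that last point about a default start object, here is a sketch (the `defaults` object is hypothetical, not part of the original answer):
const defaults = { Everyone: true }; // hypothetical default setting
const objC = arr.reduce((previous, { name }) => {
  previous[name] = false;
  return previous;
}, { ...defaults }); // spread so the defaults object itself is not mutated
// { Everyone: true, Mark: false, Ron: false, Henry: false, Rose: false }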
I am trying to delete all repeated objects across four arrays by preference. All the arrays have unique elements and may not be ordered.
If an object appears in several arrays, it stays in the array with the lowest preference and is removed from the others. For example, an object with id "6" repeated in the arrays with preference 2, 3, and 4 has to be detected and removed from the arrays with preference 3 and 4, because 2 < 3 < 4.
So, if the input data is:
arr_p1 = [{ id: "892d" }, {id: "kla8x" }, {id: "sys32" }]
arr_p2 = [{id: "saa1" }, { id: "892d" }]
arr_p3 = [{ id: "kla8x" }, {id: "saa1" }, {id: "pp182" }]
the output must be:
arr_p1 = [{ id: "892d" }, {id: "kla8x" }, {id: "sys32" }]
arr_p2 = [{id: "saa1" }]
arr_p3 = [{id: "pp182" }]
Any ideas on how to solve this situation in a good complexity order?
All arrays have a limited size of 40 objects.
The only thing I can think of is to sort the objects in each array by identifier, then advance a pointer through each list, from the lowest preference (1) to the highest (4), taking the lowest identifier at each step, and if it appears in one of the higher-preference lists, delete it there... but I need to do it without altering the order of the elements...
PS: I am using JS and ES6.
Combine all items to a single array, and then reduce them to a Map in a reversed order using Array.reduceRight(). The reversed order will cause the 1st items to override the last items.
Now you can filter each array by using the Map, and keeping only items that exist on the Map.
Complexity is O(N1 + N2 + N3) where Nx is the length of that array.
const arr_p1 = [{ id: "892d" }, {id: "kla8x" }, {id: "sys32" }]
const arr_p2 = [{id: "saa1" }, { id: "892d" }]
const arr_p3 = [{ id: "kla8x" }, {id: "saa1" }, {id: "pp182" }]
// create an array of all items and reduce it in a reversed order to a Map
const dupsMap = [...arr_p1, ...arr_p2, ...arr_p3]
// create the Map by using the `id` as the key, and the object as the value
.reduceRight((acc, o) => acc.set(o.id, o), new Map())
const filterArr = arr => arr.filter(o =>
dupsMap.get(o.id) === o // keep the item if it was the object that was used as value
)
const arr_p1f = filterArr(arr_p1)
const arr_p2f = filterArr(arr_p2)
const arr_p3f = filterArr(arr_p3)
console.log({ arr_p1f, arr_p2f, arr_p3f })
You can easily create a generic function that handles any number of arrays, and get the individual arrays from its returned value using destructuring.
const dedupArrays = (...arrs) => {
const dupsMap = arrs.flat() // convert arrays to a single array
// a reduce right to create a Map of [id, object]
.reduceRight((acc, o) => acc.set(o.id, o), new Map())
// map the array of arrays, and filter each sub array
return arrs.map(arr => arr.filter(o => dupsMap.get(o.id) === o))
}
const arr_p1 = [{ id: "892d" }, {id: "kla8x" }, {id: "sys32" }]
const arr_p2 = [{id: "saa1" }, { id: "892d" }]
const arr_p3 = [{ id: "kla8x" }, {id: "saa1" }, {id: "pp182" }]
const [arr_p1f, arr_p2f, arr_p3f] = dedupArrays(arr_p1, arr_p2, arr_p3)
console.log({ arr_p1f, arr_p2f, arr_p3f })
You could generate a preference object (hash map) to map the id to preference. Run it from 3rd array to the first so that lower order overrides the higher one.
Then when you have the preference map, you can filter all arrays by checking if the id's preference matches the current array.
let arr_p1 = [{ id: "892d" }, {id: "kla8x" }, {id: "sys32" }];
let arr_p2 = [{id: "saa1" }, { id: "892d" }];
let arr_p3 = [{ id: "kla8x" }, {id: "saa1" }, {id: "pp182" }];
let pref = {};
arr_p3.forEach(e => pref[e.id] = 3);
arr_p2.forEach(e => pref[e.id] = 2);
arr_p1.forEach(e => pref[e.id] = 1);
arr_p1 = arr_p1.filter(e => pref[e.id] === 1);
arr_p2 = arr_p2.filter(e => pref[e.id] === 2);
arr_p3 = arr_p3.filter(e => pref[e.id] === 3);
console.log(arr_p1);
console.log(arr_p2);
console.log(arr_p3);
I have several tips for you, rather than a full answer, since I assume this is a homework question?
Strategy
Build a set of "items already seen"
Check each new array against that, deleting any duplicate entries (in the new array).
Start with the most preferred array
That way, whenever something is deleted, it is being deleted from the less-preferred array.
For example, in pseudocode
let elementsSeen = new Set( most preferred array of elements )
for array in listOfArraysInDecreasingOrderOfPreference {
for element in array {
if element is in elementsSeen, delete it from array
}
elementsSeen = union of elementsSeen and array
}
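A direct JavaScript translation of that pseudocode might look like the sketch below. It hashes on `id` rather than on the objects themselves (distinct object instances with equal ids would not collide in a Set), and it returns new arrays instead of mutating:
function dedupeByPreference(arraysByPreference) {
  // seed with the ids of the most preferred array
  const seen = new Set(arraysByPreference[0].map(e => e.id));
  return arraysByPreference.map((arr, i) => {
    if (i === 0) return arr;
    const kept = arr.filter(e => !seen.has(e.id));
    // union of elementsSeen and this array
    kept.forEach(e => seen.add(e.id));
    return kept;
  });
}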
Complexity
Every item has to be looked at. It has to be compared with every other item, but the complexity of that need not be enormous, because the `Set` process can make use of hashes, i.e. not have to do an individual comparison of each incoming object with each existing object. Almost all incoming objects will have a hash table value that is different from those of existing objects, which is quick, at the expense of some time spent on hashing and some memory spent on the table.
In the worst case, where hashing is no longer helping you, it is O(N x M) where N is the number of arrays, and M is the size of each.
Your question implies you want to mutate the original arrays.
So if you still want to mutate the arrays, you could:
create a Set of the ids for each level,
loop each level backwards, and if any of its ids appear in an earlier (more preferred) level, remove them from the array.
A couple of optimisations here too: the slice(0, -1) means we don't create a Set for the last level, as levels are only checked against the ones before them, and inside the loop, once an item is known to be deleted, a break moves on to the next one. To be honest, I've no idea what the complexity of this is.. :)
eg.
const arr_p1 =
[{ id: "892d" }, {id: "kla8x" }, {id: "sys32" }];
const arr_p2 =
[{id: "saa1" }, { id: "892d" }];
const arr_p3 =
[{ id: "kla8x" }, {id: "saa1" }, {id: "pp182" }];
function dedupe(alist) {
  // no Set is needed for the last level, since levels are only checked against earlier ones
  const hasList = alist.slice(0, -1).map(
    m => new Set(m.map(i => i.id)));
  for (let l = alist.length - 1; l > 0; l--) {
    for (let i = alist[l].length - 1; i >= 0; i--) {
      for (let h = 0; h < l; h += 1) {
        if (hasList[h].has(alist[l][i].id)) {
          alist[l].splice(i, 1);
          break;
        }
      }
    }
  }
}
dedupe([arr_p1, arr_p2, arr_p3]);
console.log(arr_p1);
console.log(arr_p2);
console.log(arr_p3);
I am using Lodash in my Angular project and I was wondering if there is a better way to write the following code:
$scope.new_arr = _.map(arr1, function(item){
return _.assign(item, {new_id: _.find(arr2, {id: item.id})});
});
$scope.new_arr = _.filter($scope.new_arr, function (item) {
return item.new_id !== undefined;
});
I am trying to combine values from one array with the matching objects in another array, ignoring objects that do not appear in both arrays (something like an inner join in SQL).
Here is a fiddle with an example of this code.
I think it is better to use chaining:
$scope.new_arr = _.chain(arr1)
.map(function(item) {
return _.merge(
{}, // to avoid mutations
item,
{new_id: _.find(arr2, {id: item.id})}
);
})
.filter('new_id')
.value();
https://jsfiddle.net/3xjdqsjs/6/
try this:
$scope.getItemById = (array, id) => {
return array.find(item => item.id == id);
};
$scope.mergeArrays = () => {
let items_with_ids = arr1.filter(item => !_.isNil($scope.getItemById(arr2,item.id)));
return items_with_ids.map(item => _.assign(item, {new_id: $scope.getItemById(arr2,item.id)}));
};
The answers provided here all have a runtime of O(n^2), because they run an outer loop over the first array with an inner loop over the second. You can instead do this in O(n). First, create a hashmap of all the ids in arr2 in a single loop; this gives us O(1) lookups. Then, in a second loop over arr1, check that hashmap to determine whether each item exists in both. Total complexity is n + n = 2n, which is just O(n).
// provision some test arrays
var arr1 = [
{
id: 2
},
{
id: 4
},
{
id: 6
}
]
var arr2 = [
{
id: 3
},
{
id: 4
},
{
id: 5
},
{
id: 6
}
]
// First, we create a map of the ids of arr2 with the items. Complexity: O(n)
var mapIdsToArr2Items = _.reduce(arr2, function(accumulator, item) {
accumulator[item.id] = item;
return accumulator;
}, {});
// Next, we use reduce (instead of a _.map followed by a _.filter for slightly more performance.
// This is because with reduce, we loop once, whereas with map and filter,
// we loop twice). Complexity: O(n)
var combinedArr = _.reduce(arr1, function(accumulator, item) {
// Complexity: O(1)
if (mapIdsToArr2Items[item.id]) {
// There's a match/intersection! Arr1's item matches an item in arr 2. Include it
accumulator.push(item);
}
return accumulator;
}, []);
console.log(combinedArr)
You could first make a Map with arr1 and then map the items of arr2 with the properties of arr1.
var arr1 = [{ id: 1, title: 'z' }, { id: 2, title: 'y' }, { id: 3, title: 'x' }, { id: 4, title: 'w' }, { id: 5, title: 'v' }],
arr2 = [{ id: 2, name: 'b' }, { id: 3, name: 'c' }, { id: 4, name: 'd' }, { id: 5, name: 'e' }],
map = new Map(arr1.map(a => [a.id, a])),
result = arr2.map(a => Object.assign({}, a, map.get(a.id)));
console.log(result);
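Note that this maps every element of arr2, including any with no counterpart in arr1 (with this sample data every id happens to match). If you want the inner-join behaviour the question asks for, you could filter on the Map first. A sketch, not part of the original answer:
var joined = arr2
  .filter(a => map.has(a.id)) // drop items with no counterpart in arr1
  .map(a => Object.assign({}, a, map.get(a.id)));
console.log(joined);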
Given an array of objects like this:
objects = [
{ id: 'aaaa', description: 'foo' },
{ id: 'bbbb', description: 'bar' },
{ id: 'cccc', description: 'baz' }
];
And an array of strings like this:
order = [ 'bbbb', 'aaaa', 'cccc' ];
How would I sort the first array so that the id attribute matches the order of the second array?
Try this:
objects.sort(function(a, b){
return order.indexOf(a.id) - order.indexOf(b.id)
});
Assuming the variables are like you declared them in the question, this should return:
[
{ id: 'bbbb', description: 'bar' },
{ id: 'aaaa', description: 'foo' },
{ id: 'cccc', description: 'baz' }
];
(It actually modifies the objects variable)
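If you'd rather not mutate it, you can sort a copy instead. A minimal sketch, not part of the original answer:
var sorted = objects.slice().sort(function(a, b){
  return order.indexOf(a.id) - order.indexOf(b.id);
});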
You need a way to translate the string into the position in the array, i.e. an index-of function for an array.
There is one in newer browsers, but to be backwards compatible you need to add it if it's not there:
if (!Array.prototype.indexOf) {
Array.prototype.indexOf = function(str) {
var i;
for (i = 0; i < this.length; i++) if (this[i] == str) return i;
return -1;
}
}
Now you can sort the array by turning the string into an index:
objects.sort(function(x,y){ return order.indexOf(x.id) - order.indexOf(y.id); });
Demo: http://jsfiddle.net/Guffa/u3CQW/
Use a mapping object for (almost) constant access time:
const objects = [
  { id: 'aaaa', description: 'foo' },
  { id: 'bbbb', description: 'bar' },
  { id: 'cccc', description: 'baz' }
];
const order = [ 'bbbb', 'aaaa', 'cccc' ];
/* Create a mapping object `orderIndex`:
   {
     "bbbb": 0,
     "aaaa": 1,
     "cccc": 2
   }
*/
const orderIndex = {};
order.forEach((value, index) => orderIndex[value] = index);
// Sort
objects.sort((a, b) => orderIndex[a.id] - orderIndex[b.id]);
// Log
console.log('orderIndex:', orderIndex);
console.log('objects:', objects);
I have an array of objects and I want to get a new array from it that is unique based only on a single property, is there a simple way to achieve this?
Eg.
[ { id: 1, name: 'bob' }, { id: 1, name: 'bill' }, { id: 1, name: 'bill' } ]
This would result in two objects, with the duplicate name = 'bill' entry removed.
Use the uniq function
var destArray = _.uniq(sourceArray, function(x){
return x.name;
});
or single-line version
var destArray = _.uniq(sourceArray, x => x.name);
From the docs:
Produces a duplicate-free version of the array, using === to test object equality. If you know in advance that the array is sorted, passing true for isSorted will run a much faster algorithm. If you want to compute unique items based on a transformation, pass an iterator function.
In the above example, the function uses the objects name in order to determine uniqueness.
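For the question's data that looks like this (assuming Underscore or lodash 3, where _.uniq accepts an iteratee; in lodash 4 the same call is _.uniqBy):
var sourceArray = [ { id: 1, name: 'bob' }, { id: 1, name: 'bill' }, { id: 1, name: 'bill' } ];
var destArray = _.uniq(sourceArray, function(x){ return x.name; });
// [ { id: 1, name: 'bob' }, { id: 1, name: 'bill' } ]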
If you prefer to do things yourself without Lodash, and without getting verbose, try this uniq filter with optional uniq by property:
const uniqFilterAccordingToProp = function (prop) {
  if (prop)
    // keep an element only if the first index of its property value is its own index
    return (ele, i, arr) => arr.map(e => e[prop]).indexOf(ele[prop]) === i
  else
    return (ele, i, arr) => arr.indexOf(ele) === i
}
Then, use it like this:
const obj = [ { id: 1, name: 'bob' }, { id: 1, name: 'bill' }, { id: 1, name: 'bill' } ]
obj.filter(uniqFilterAccordingToProp('name'))
Or for plain arrays, just omit the parameter, while remembering to invoke:
[1,1,2].filter(uniqFilterAccordingToProp())
If you want to check all the properties, lodash 4 comes with _.uniqWith(sourceArray, _.isEqual).
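For example (a sketch using lodash 4):
const arr = [ { id: 1, name: 'bob' }, { id: 1, name: 'bob' }, { id: 1, name: 'bill' } ];
const result = _.uniqWith(arr, _.isEqual); // deep-compares whole objects
// [ { id: 1, name: 'bob' }, { id: 1, name: 'bill' } ]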
A better and quicker approach:
var table = [
{
a:1,
b:2
},
{
a:2,
b:3
},
{
a:1,
b:4
}
];
let result = [...new Set(table.map(item => item.a))];
document.write(JSON.stringify(result));
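Note this collects the unique values of a, not the objects themselves. If you need one full object per distinct value (an extension, not part of the original answer), a Map keyed on the property works similarly:
// keeps the last object seen for each distinct value of `a`
let uniqueObjects = [...new Map(table.map(item => [item.a, item])).values()];
document.write(JSON.stringify(uniqueObjects));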
You can use the _.uniqBy function
var array = [ { id: 1, name: 'bob' }, { id: 2, name: 'bill' }, { id: 1, name: 'bill' },{ id: 2, name: 'bill' } ];
var filteredArray = _.uniqBy(array, function(x){ return x.id + '-' + x.name; }); // combine id and name into a single uniqueness key
console.log(filteredArray)
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.5/lodash.js"></script>
In the above example, filtering is based on the uniqueness of the combination of the properties id and name.
If you have multiple properties on an object and want a unique array of objects based on specific properties, you can combine those properties inside the _.uniqBy() callback as shown.
I was looking for a solution which didn't require a library, and put this together, so I thought I'd add it here. It may not be ideal or work in all situations, but it does what I require, so it could potentially help someone else:
const uniqueBy = (items, reducer, dupeCheck = [], currentResults = []) => {
if (!items || items.length === 0) return currentResults;
const thisValue = reducer(items[0]);
const resultsToPass = dupeCheck.indexOf(thisValue) === -1 ?
[...currentResults, items[0]] : currentResults;
return uniqueBy(
items.slice(1),
reducer,
[...dupeCheck, thisValue],
resultsToPass,
);
}
const testData = [
{text: 'hello', image: 'yes'},
{text: 'he'},
{text: 'hello'},
{text: 'hell'},
{text: 'hello'},
{text: 'hellop'},
];
const results = uniqueBy(
testData,
item => {
return item.text
},
)
console.dir(results)
In case you need pure JavaScript solution:
var uniqueProperties = {};
var notUniqueArray = [ { id: 1, name: 'bob' }, { id: 1, name: 'bill' }, { id: 1, name: 'bill' } ];
for(var object in notUniqueArray){
uniqueProperties[notUniqueArray[object]['name']] = notUniqueArray[object]['id'];
}
var uniqueArray = [];
for(var uniqueName in uniqueProperties){
  uniqueArray.push(
    {id: uniqueProperties[uniqueName], name: uniqueName});
}
//uniqueArray
unique array by id property with ES6:
arr.filter((a, i) => arr.findIndex(b => b.id === a.id) === i); // unique by id
replace b.id === a.id with the relevant comparison for your case
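A quick usage sketch with the question's data, comparing on name:
const arr = [ { id: 1, name: 'bob' }, { id: 1, name: 'bill' }, { id: 1, name: 'bill' } ];
const unique = arr.filter((a, i) => arr.findIndex(b => b.name === a.name) === i);
console.log(unique); // [ { id: 1, name: 'bob' }, { id: 1, name: 'bill' } ]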