Related
I have an array like this
let data = [{x:1,y:2,z:3},{x:1,y:2,z:3},{x:1,y:2,z:4},{x:11,y:2,z:3}]
Now I want to get only those items whose x, y, z values are the same, so the expected output should be
{x:1,y:2,z:3}
Because {x:1,y:2,z:3} is duplicated and the rest are not, I don't want the rest; they have no duplicates. How can I achieve this?
For lodash 4.17.15,
You can first use _.uniqWith and _.isEqual to find the unique values.
_.uniqWith(data, _.isEqual); // [{x:1,y:2,z:3},{x:1,y:2,z:4},{x:11,y:2,z:3}]
Then use _.difference to remove those unique values from the original array, leaving just the duplicates. (This works because _.difference compares objects by reference, so only the exact first occurrences returned by _.uniqWith are removed.)
_.difference(data, _.uniqWith(data, _.isEqual)); // [{x:1,y:2,z:3}]
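If you'd rather not use lodash, here is a plain-JS sketch of the same idea (not part of the original answer): count serialized objects in one pass, then keep one copy of anything seen more than once. It assumes every object lists its keys in the same order.

```javascript
// Sketch: count each serialized object, then keep one copy of
// every object that occurred more than once.
function duplicatesOnly(arr) {
  const counts = new Map();
  for (const obj of arr) {
    const key = JSON.stringify(obj);
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  return [...counts]
    .filter(([, n]) => n > 1)
    .map(([key]) => JSON.parse(key));
}

const data = [{x:1,y:2,z:3},{x:1,y:2,z:3},{x:1,y:2,z:4},{x:11,y:2,z:3}];
console.log(duplicatesOnly(data)); // [{ x: 1, y: 2, z: 3 }]
```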
let data = [{x:1,y:2,z:3},{x:1,y:2,z:3},{x:1,y:2,z:4},{x:11,y:2,z:3},{x:11,y:2,z:3}]
function filterDuplicates(data) {
  let dic = {};
  data.forEach(obj => {
    // Sort the keys so objects with the same values serialize identically
    let strObj = JSON.stringify(obj, Object.keys(obj).sort());
    if (strObj in dic) {
      ++dic[strObj];
      return;
    }
    dic[strObj] = 0;
  });
  return Object.entries(dic)
    .filter(([key, value]) => value > 0)
    .map(([el]) => JSON.parse(el));
}
console.log(filterDuplicates(data));
Build an object to track the duplicates and use Object.values and filter
let data = [
{ x: 1, y: 2, z: 3 },
{ x: 1, y: 2, z: 3 },
{ x: 1, y: 2, z: 4 },
{ x: 11, y: 2, z: 3 },
];
const all = {};
data.forEach(({ x, y, z }) => {
const key = `x${x}y${y}z${z}`;
all[key] = key in all ? { x, y, z } : null;
});
const res = Object.values(all).filter(Boolean);
console.log(res);
How do I take this object and array.
const data = {
type: "hello",
point: 1.8
};
const raw = [
{
x: [1, 2],
y: [-1.1, -1.2]
},
{
x: [14, 24],
y: [-1.14, 1.24]
}
];
Then "append" the items in the data object to each object in the raw array. The desired end result is:
const result = [
{
x: [1, 2],
y: [-1.1, -1.2],
type: "hello",
point: 1.8
},
{
x: [14, 24],
y: [-1.14, 1.24],
type: "hello",
point: 1.8
}
];
I tried using map, but that works with arrays; then I looked at using Object.keys but am having no luck.
Use map with spreading:
const data = {type:"hello",point:1.8};
const raw = [{x:[1,2],y:[-1.1,-1.2]},{x:[14,24],y:[-1.14,1.24]}];
const result = raw.map(e => ({ ...e, ...data }));
console.log(result);
ES5 syntax:
var data = {type:"hello",point:1.8};
var raw = [{x:[1,2],y:[-1.1,-1.2]},{x:[14,24],y:[-1.14,1.24]}];
var result = raw.map(function(e) {
return Object.assign({}, e, data);
});
console.log(result);
map is indeed the tool you want. I'd probably combine it with destructuring in the map callback parameter list and property spread in the result value:
const result = raw.map(({x, y}) => ({x, y, ...data}));
Live Copy:
const data = {
type: "hello",
point: 1.8
};
const raw = [
{
x: [1, 2],
y: [-1.1, -1.2]
},
{
x: [14, 24],
y: [-1.14, 1.24]
}
];
const result = raw.map(({x, y}) => ({x, y, ...data}));
console.log(result);
Note that if data had any properties whose values were objects (the data in your example doesn't), using spread would copy only the references to those objects; it wouldn't make deep copies. So all of your result objects would share them. You could deep copy if that were relevant.
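To make the shared-reference point concrete, here is a small sketch; the nested meta property is hypothetical and not part of the question's data:

```javascript
const data = { type: "hello", meta: { tags: [] } }; // meta is a nested object
const raw = [{ x: 1 }, { x: 2 }];
const result = raw.map(e => ({ ...e, ...data }));

// Spread copies the reference, so every result shares one meta object:
console.log(result[0].meta === result[1].meta); // true
result[0].meta.tags.push("changed");
console.log(result[1].meta.tags); // ["changed"]: the mutation shows up everywhere
```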
I found a solution:
const data = {type:"hello",point:1.8};
const raw = [{x:[1,2],y:[-1.1,-1.2]},{x:[14,24],y:[-1.14,1.24]}];
const result = raw.map(r => Object.assign(r, data));
console.log(result);
Some feedback on this approach would be appreciated. I'm looking at the solutions provided now. Thank you all.
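One caveat worth noting as feedback (a sketch of the behavior, not a change to the answer): Object.assign(r, data) uses r itself as the target, so the objects inside raw are mutated. Passing an empty object as the target copies instead:

```javascript
const data = { type: "hello", point: 1.8 };

const raw = [{ x: [1, 2] }, { x: [14, 24] }];
const mutating = raw.map(r => Object.assign(r, data));
console.log(raw[0].type); // "hello": the originals were changed

const raw2 = [{ x: [1, 2] }, { x: [14, 24] }];
const copying = raw2.map(r => Object.assign({}, r, data));
console.log(raw2[0].type); // undefined: the originals are untouched
```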
I have two arrays of objects as follows:
var arr1 = [
{
name: 1,
value: 10
},
{
name: 2,
value: 15
}
]
var arr2 = [
{
name: 3,
value: 5
},
{
name: 4,
value: 3
}
]
I want to rename the keys and subtract the values at the same index.
output:
var arr1 = [
{
itemLabel: 1,
itemValue: 5
},
{
itemLabel: 2,
itemValue: 12
}
]
This is what I'm doing now:
formatData = arr1.map((row, index) => ({
  itemLabel: row.name,
  itemValue: row.value - arr2[index].value
}))
Is there any better solution of doing this?
One-man army
A simple recursive program that handles everything in a single function. There's a clear mixture of concerns here, which hurts the function's overall readability. We'll see one remedy for this problem below.
const main = ([x, ...xs], [y, ...ys]) =>
x === undefined || y === undefined
? []
: [ { itemLabel: x.name, itemValue: x.value - y.value } ] .concat (main (xs, ys))
const arr1 =
[ { name: 1, value: 10 }, { name: 2, value: 15 } ]
const arr2 =
[ { name: 3, value: 5 }, { name: 4, value: 3 } ]
console.log (main (arr1, arr2))
// [ { itemLabel: 1, itemValue: 5 },
// { itemLabel: 2, itemValue: 12 } ]
Thinking with types
This part of the answer is influenced by type theory from the Monoid category – I won't go too far into it because I think the code should be able to demonstrate itself.
So we have two types in our problem: We'll call them Foo and Bar
Foo – has name, and value fields
Bar – has itemLabel and itemValue fields
We can represent our "types" however we want, but I chose a simple function which constructs an object
const Foo = (name, value) =>
({ name
, value
})
const Bar = (itemLabel, itemValue) =>
({ itemLabel
, itemValue
})
Making values of a type
To construct new values of our type, we just apply our functions to the field values
const arr1 =
[ Foo (1, 10), Foo (2, 15) ]
const arr2 =
[ Foo (3, 5), Foo (4, 3) ]
Let's see the data we have so far
console.log (arr1)
// [ { name: 1, value: 10 },
// { name: 2, value: 15 } ]
console.log (arr2)
// [ { name: 3, value: 5 },
// { name: 4, value: 3 } ]
Some high-level planning
We're off to a great start. We have two arrays of Foo values. Our objective is to work through the two arrays by taking one Foo value from each array, combining them (more on this later), and then moving onto the next pair
const zip = ([ x, ...xs ], [ y, ...ys ]) =>
x === undefined || y === undefined
? []
: [ [ x, y ] ] .concat (zip (xs, ys))
console.log (zip (arr1, arr2))
// [ [ { name: 1, value: 10 },
// { name: 3, value: 5 } ],
// [ { name: 2, value: 15 },
// { name: 4, value: 3 } ] ]
Combining values: concat
With the Foo values properly grouped together, we can now focus more on what that combining process is. Here, I'm going to define a generic concat and then implement it on our Foo type
// generic concat
const concat = (m1, m2) =>
m1.concat (m2)
const Foo = (name, value) =>
({ name
, value
, concat: ({name:_, value:value2}) =>
// keep the name from the first, subtract value2 from value
Foo (name, value - value2)
})
console.log (concat (Foo (1, 10), Foo (3, 5)))
// { name: 1, value: 5, concat: [Function] }
Does concat sound familiar? Array and String are also Monoid types!
concat ([ 1, 2 ], [ 3, 4 ])
// [ 1, 2, 3, 4 ]
concat ('foo', 'bar')
// 'foobar'
Higher-order functions
So now we have a way to combine two Foo values together. The name of the first Foo is kept, and the value properties are subtracted. Now we apply this to each pair in our "zipped" result. Functional programmers love higher-order functions, so you'll appreciate this higher-order harmony
const apply = f => xs =>
f (...xs)
zip (arr1, arr2) .map (apply (concat))
// [ { name: 1, value: 5, concat: [Function] },
// { name: 2, value: 12, concat: [Function] } ]
Transforming types
So now we have the Foo values with the correct name and value values, but we want our final answer to be Bar values. A specialized constructor is all we need
Bar.fromFoo = ({ name, value }) =>
Bar (name, value)
Bar.fromFoo (Foo (1,2))
// { itemLabel: 1, itemValue: 2 }
zip (arr1, arr2)
.map (apply (concat))
.map (Bar.fromFoo)
// [ { itemLabel: 1, itemValue: 5 },
// { itemLabel: 2, itemValue: 12 } ]
Hard work pays off
A beautiful, pure functional expression. Our program reads very nicely; flow and transformation of the data is easy to follow thanks to the declarative style.
// main :: ([Foo], [Foo]) -> [Bar]
const main = (xs, ys) =>
zip (xs, ys)
.map (apply (concat))
.map (Bar.fromFoo)
And a complete code demo, of course
const Foo = (name, value) =>
({ name
, value
, concat: ({name:_, value:value2}) =>
Foo (name, value - value2)
})
const Bar = (itemLabel, itemValue) =>
({ itemLabel
, itemValue
})
Bar.fromFoo = ({ name, value }) =>
Bar (name, value)
const concat = (m1, m2) =>
m1.concat (m2)
const apply = f => xs =>
f (...xs)
const zip = ([ x, ...xs ], [ y, ...ys ]) =>
x === undefined || y === undefined
? []
: [ [ x, y ] ] .concat (zip (xs, ys))
const main = (xs, ys) =>
zip (xs, ys)
.map (apply (concat))
.map (Bar.fromFoo)
const arr1 =
[ Foo (1, 10), Foo (2, 15) ]
const arr2 =
[ Foo (3, 5), Foo (4, 3) ]
console.log (main (arr1, arr2))
// [ { itemLabel: 1, itemValue: 5 },
// { itemLabel: 2, itemValue: 12 } ]
Remarks
Our program above is implemented with a .map-.map chain, which means handling and creating intermediate values multiple times. We also created an intermediate array of [[x1,y1], [x2,y2], ...] in our call to zip. Category theory gives us things like equational reasoning, so we can replace m.map(f).map(g) with m.map(compose(g, f)) (where compose applies right to left) and achieve the same result. So there's room to improve this yet, but I think this is just enough to cut your teeth and start thinking about things in a different way.
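For instance, with a minimal right-to-left compose helper (an assumption here; it isn't defined elsewhere in this answer), the two passes collapse into one:

```javascript
// compose(f, g) applies g first, then f
const compose = (f, g) => x => f(g(x));

const inc = x => x + 1;
const double = x => x * 2;

const xs = [1, 2, 3];
// Two passes over the array:
const a = xs.map(inc).map(double);
// One pass, same result:
const b = xs.map(compose(double, inc));
console.log(a, b); // [4, 6, 8] [4, 6, 8]
```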
Your code is just fine, you could use recursion as well:
var arr1 =[{
name: 1,
value: 10
}, {
name: 2,
value: 15
}];
var arr2= [{
name: 3,
value: 5
}, {
name: 4,
value: 3
}]
const createObject = (arr1, arr2, ret = []) => {
  if (arr1.length !== arr2.length) {
    throw new Error("Arrays should be the same length.");
  }
  // Check the base case before reading arr1[0], or empty input throws
  if (arr1.length === 0) {
    return ret;
  }
  const item = {
    itemLabel: arr1[0].name,
    itemValue: arr1[0].value - arr2[0].value
  };
  return createObject(arr1.slice(1), arr2.slice(1), ret.concat(item));
}
console.log(createObject(arr1,arr2));
Any callback passed to map or reduce here would have to reach for arr1 or arr2 outside its own scope (they aren't passed to it as parameters), so strictly speaking it isn't pure. But you can easily solve that with partial application:
var arr1 =[{
name: 1,
value: 10
}, {
name: 2,
value: 15
}];
var arr2= [{
name: 3,
value: 5
}, {
name: 4,
value: 3
}];
const mapFunction = arr2 => (item,index) => {
return {
itemLabel: item.name,
itemValue: item.value - arr2[index].value
}
}
var createObject = (arr1, arr2) => {
  if (arr1.length !== arr2.length) {
    throw new Error("Arrays should be the same length.");
  }
  const mf = mapFunction(arr2);
  return arr1.map(mf);
}
console.log(createObject(arr1,arr2));
But as CodingIntrigue mentioned in the comment, none of these is really "better" than what you've already done.
To make your solution more functional you need to change your anonymous function to a pure (anonymous) function.
A pure function is a function that, given the same input, will always return the same output
The anonymous function depends on the mutable variable arr1 and arr2. That means that it depends on the system state. So it doesn't fit into the pure function rule.
The following is maybe not the best implementation, but I hope it gives you an idea.
Let's Make it Pure
To make it pure we can pass the variables into the function as arguments
const mapWithObject = (obj2, obj1, index) => ({
itemLabel: obj1.name,
itemValue: obj1.value - obj2[index].value
})
// example call
const result = mapWithObject(arr2, arr1[0], 0)
Ok, but now the function doesn't fit into map anymore because it takes 3 arguments instead of 2.
Let's Curry it
const mapWithObject = obj2 => (obj1, index) => ({
itemLabel: obj1.name,
itemValue: obj1.value - obj2[index].value
})
const mapObject_with_arr2 = mapWithObject(arr2)
// example call
const result = mapObject_with_arr2(arr1[0], 0)
Full Code
const arr1 = [{
name: 1,
value: 10
},
{
name: 2,
value: 15
}
]
const arr2 = [{
name: 3,
value: 5
},
{
name: 4,
value: 3
}
]
const mapWithObject = obj2 => (obj1, index) => ({
itemLabel: obj1.name,
itemValue: obj1.value - obj2[index].value
})
const mapObject_with_arr2 = mapWithObject(arr2)
const mappedObject = arr1.map(mapObject_with_arr2)
console.log(mappedObject)
If you don't care too much about performance but want to separate your concerns a bit further, you could use this approach:
Define a function that does the "pairing" between arr1 and arr2
[a, b, c] + [1, 2, 3] -> [ [ a, 1 ], [ b, 2 ], [ c, 3 ] ]
Define a function that clearly shows the merge strategy of two objects
{ a: 1, b: 10 } + { a: 2, b: 20 } -> { a: 1, b: -10 }
Define simple helpers that compose the two so you can pass your original arrays and be returned the desired output in one function call.
Here's an example:
var arr1 = [{ name: 1, value: 10 }, { name: 2, value: 15 }];
var arr2 = [{ name: 3, value: 5 }, { name: 4, value: 3 }];
// This is a very general method that bundles two
// arrays in an array of pairs. Put it in your utils
// and you can use it everywhere
const pairs = (arr1, arr2) => Array.from(
{ length: Math.max(arr1.length, arr2.length) },
(_, i) => [ arr1[i], arr2[i] ]
);
// This defines our merge strategy for two objects.
// Ideally, you should give it a better name, based
// on what the objects represent
const merge =
(base, ext) => ({
itemLabel: base.name,
itemValue: base.value - ext.value
});
// This is a helper that "applies" our merge method
// to an array of two items.
const mergePair = ([ base, ext ]) => merge(base, ext);
// Another helper that composes `pairs` and `mergePair`
// to allow you to pass your original data.
const mergeArrays = (arr1, arr2) => pairs(arr1, arr2).map(mergePair);
console.log(mergeArrays(arr1, arr2));
tl;dr:
How can I chain onto JavaScript's map() with my own function? Something like
stuff.map(i => i.key).avg()
where avg() is my own function to compute the average of the array returned by map?
In moving away from objects and toward functional programming with pure functions, I've lost the handy
return this;
that allows me to chain.
If I have
let stuff = [
{id: 1, name: 'tuan', country: 'VN', age: 23},
{id: 2, name: 'nhung', country: 'US', age: 25},
...
//my own filter to pass as a param to native filter()
var filt = x => j => j.country === x;
//my own reducer for an array that computes an average
let avg = (arr) => (arr.reduce((acc, i) => acc + i) / arr.length);
then
stuff.filter(filt('VN')).map(i => i.age)
would return something like
[23, 34, 45]
but
stuff.filter(filt('VN')).map(i => i.age).avg()
gives an error like
filter().map().avg() is not a function
How can we write functions that chain onto the native ones?
Method chaining isn't compatible with function composition. But instead of modifying built-in prototypes or falling back on subtyping, you can create a container type that allows you to compose pure functions in the context of method chaining:
function Box(x) {
return new.target ? (this.x = x, this) : new Box(x)
}
Box.prototype.fold = function fold(f) {return f(this.x)};
Box.prototype.map = function map(f) {return new Box(f(this.x))};
Box.prototype.toString = function toString() {return `Box(${this.x})`};
const id = x => x;
const stuff = [
{id: 1, name: 'foo', country: 'VN', age: 23},
{id: 2, name: 'bar', country: 'US', age: 25},
{id: 2, name: 'bat', country: 'VN', age: 34},
{id: 2, name: 'baz', country: 'VN', age: 45}
];
const filt = x => j => j.country === x;
const avg = (arr) => (arr.reduce((acc, i) => acc + i) / arr.length);
console.log(
Box(stuff.filter(filt('VN')).map(i => i.age))
.map(xs => avg(xs))
.fold(id) // yields 34
);
Box is a functor and you can put values of any type into this container. With map you can apply functions to the value inside the functor and get a new functor with the transformed value back. fold behaves identically, except that it returns the bare value.
Maybe you have noticed that my example is a little verbose and I could have spared myself the mapping.
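To see the pattern in isolation, here is the Box chain reduced to bare numbers (re-declaring Box from above):

```javascript
function Box(x) {
  return new.target ? (this.x = x, this) : new Box(x);
}
Box.prototype.map = function (f) { return new Box(f(this.x)); };
Box.prototype.fold = function (f) { return f(this.x); };

const result = Box(3)
  .map(x => x + 1)   // Box(4)
  .map(x => x * 2)   // Box(8)
  .fold(x => x);     // unwrap the bare value: 8
console.log(result);
```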
Create an avg method on Array.prototype:
Array.prototype.avg = function() {
return this.reduce((a,b) => Number(a) + Number(b)) / this.length;
}
var array = [
{ id: 1, key:2 },
{ id: 2, key:3 },
{ id: 3, key:7 },
{ id: 4, key:6 },
{ id: 5, key:4 }
]
var avg = array.map(i => i.key).avg();
console.log(avg);
It should be
avg(stuff.filter(filt('VN')).map(i => i.age))
because you defined avg as a standalone function that expects arr as its argument; you did not extend the Array prototype with an avg method.
Chaining isn't magic — you're just calling a method on the return value of a function. If the function doesn't support that method, you need to add it to the prototype.
This works because map returns an array, and arrays have a join() method:
var a = [1, 2, 3, 4, 5]
a.map((i) => i *2 ).join(",")
But arrays don't have an avg() method unless you add it, so the chaining won't work.
Well, you have some options here for sure. There's no single right way to achieve what you want. I think your best option is to extend JavaScript's Array class.
class PizzaCollection extends Array {
// .. collection specific methods here...
avg() {
// you can iterate on `this`
}
}
.map, .filter, etc. will all return an instance of PizzaCollection.
Try it out!
const j = new PizzaCollection(1, 2, 3)
const k = j.map((num) => num * num)
k instanceof PizzaCollection // returns true
k.avg() // returns the avg
I have an array of objects
list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}]
And I'm looking for an efficient way (if possible O(log(n))) to remove duplicates and to end up with
list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}]
I've tried _.uniq or even _.contains but couldn't find a satisfying solution.
Thanks!
Edit: The question has been identified as a duplicate of another one. I saw that question before posting, but it didn't answer my question since this is an array of objects (and not a 2-dim array, thanks Aaron), or at least the solutions there weren't working in my case.
Plain javascript (ES2015), using Set
const list = [{ x: 1, y: 2 }, { x: 3, y: 4 }, { x: 5, y: 6 }, { x: 1, y: 2 }];
const uniq = new Set(list.map(e => JSON.stringify(e)));
const res = Array.from(uniq).map(e => JSON.parse(e));
document.write(JSON.stringify(res));
Try using the following:
list = list.filter((elem, index, self) => self.findIndex(
(t) => {return (t.x === elem.x && t.y === elem.y)}) === index)
Vanilla JS version:
const list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}];
function dedupe(arr) {
return arr.reduce(function(p, c) {
// create an identifying id from the object values
var id = [c.x, c.y].join('|');
// if the id is not found in the temp array
// add the object to the output array
// and add the key to the temp array
if (p.temp.indexOf(id) === -1) {
p.out.push(c);
p.temp.push(id);
}
return p;
// return the deduped array
}, {
temp: [],
out: []
}).out;
}
console.log(dedupe(list));
I would use a combination of the Array.prototype.reduce and Array.prototype.some methods with the spread operator.
1. Explicit solution. Based on complete knowledge of what the array's objects contain.
list = list.reduce((r, i) =>
!r.some(j => i.x === j.x && i.y === j.y) ? [...r, i] : r
, [])
Here we have strict limitation on compared objects structure: {x: N, y: M}. And [{x:1, y:2}, {x:1, y:2, z:3}] will be filtered to [{x:1, y:2}].
2. Generic solution, JSON.stringify(). The compared objects could have any number of any properties.
list = list.reduce((r, i) =>
!r.some(j => JSON.stringify(i) === JSON.stringify(j)) ? [...r, i] : r
, [])
This approach has a limitation on properties order, so [{x:1, y:2}, {y:2, x:1}] won't be filtered.
3. Generic solution, Object.keys(). The order doesn't matter.
list = list.reduce((r, i) =>
!r.some(j => !Object.keys(i).some(k => i[k] !== j[k])) ? [...r, i] : r
, [])
This approach has another limitation: compared objects must have the same list of keys.
So [{x:1, y:2}, {x:1}] would be filtered despite the obvious difference.
4. Generic solution, Object.keys() + .length.
list = list.reduce((r, i) =>
!r.some(j => Object.keys(i).length === Object.keys(j).length
&& !Object.keys(i).some(k => i[k] !== j[k])) ? [...r, i] : r
, [])
With the last approach, objects are compared by the number of keys, by the keys themselves, and by key values.
I created a Plunker to play with it.
One liners for ES6+
If you want to find uniques by x and y:
arr.filter((v,i,a)=>a.findIndex(t=>(t.x === v.x && t.y===v.y))===i)
If you want to find uniques by all properties:
arr.filter((v,i,a)=>a.findIndex(t=>(JSON.stringify(t) === JSON.stringify(v)))===i)
The following will work:
var a = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}];
var b = _.uniq(a, function(v) {
  // Key on both properties; `v.x && v.y` would key on y alone
  // whenever x is truthy.
  return v.x + '|' + v.y;
})
console.log(b); // [ { x: 1, y: 2 }, { x: 3, y: 4 }, { x: 5, y: 6 } ]
Filter the array by checking whether each item is already in a temporary object, in O(n).
var list = [{ x: 1, y: 2 }, { x: 3, y: 4 }, { x: 5, y: 6 }, { x: 1, y: 2 }],
filtered = function (array) {
var o = {};
return array.filter(function (a) {
var k = a.x + '|' + a.y;
if (!o[k]) {
o[k] = true;
return true;
}
});
}(list);
document.write('<pre>' + JSON.stringify(filtered, 0, 4) + '</pre>');
No libraries, and works with any depth
Limitation:
The hash function you provide must return only string or number parts; otherwise you'll get inconsistent results.
/**
* Implementation, you can convert this function to the prototype pattern to allow
* usage like `myArray.unique(...)`
*/
function unique(array, f) {
return Object.values(
array.reduce((acc, item) => ({ ...acc, [f(item).join(``)]: item }), {})
);
}
const list = [{ x: 1, y: 2}, {x: 3, y: 4}, { x: 5, y: 6}, { x: 1, y: 2}];
// Usage
const result = unique(list, item => [item.x, item.y]);
// Output: [{ x: 1, y: 2}, {x: 3, y: 4}, { x: 5, y: 6}]
console.log(result);
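One more limitation worth noting, sketched below: because the key parts are joined with no separator, distinct pairs such as [1, 23] and [12, 3] collapse to the same key "123". Joining with a separator like '|' in the key function avoids this.

```javascript
function unique(array, f) {
  return Object.values(
    array.reduce((acc, item) => ({ ...acc, [f(item).join(``)]: item }), {})
  );
}

// Two distinct points that both serialize to the key "123":
const tricky = [{ x: 1, y: 23 }, { x: 12, y: 3 }];
console.log(unique(tricky, item => [item.x, item.y])); // only one survives

// A separator keeps the keys distinct: "1|23" vs "12|3"
console.log(unique(tricky, item => [item.x + '|' + item.y]));
```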
Snippet Sample
// Implementation
function unique(array, f) {
return Object.values(
array.reduce((acc, item) => ({ ...acc, [f(item).join(``)]: item }), {})
);
}
// Your object list
const list = [{ x: 1, y: 2}, {x: 3, y: 4}, { x: 5, y: 6}, { x: 1, y: 2}];
// Usage
const result = unique(list, item => [item.x, item.y]);
// Add result to DOM
document.querySelector(`p`).textContent = JSON.stringify(result, null, 2);
<p></p>
With Underscore's _.uniq and the standard JSON.stringify it is a one-liner:
var list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}];
var deduped = _.uniq(list, JSON.stringify);
console.log(deduped);
<script src="https://underscorejs.org/underscore-umd-min.js"></script>
However, this presumes that the keys are always specified in the same order. By making the iteratee more sophisticated, we can make the solution work even if the order of the keys varies. This problem, as well as the solution, also applies to other answers that involve JSON.stringify.
var list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {y:2, x:1}];
// Ensure that objects are always stringified
// with the keys in alphabetical order.
function replacer(key, value) {
if (!_.isObject(value)) return value;
var sortedKeys = _.keys(value).sort();
return _.pick(value, sortedKeys);
}
// Create a modified JSON.stringify that always
// uses the above replacer.
var stringify = _.partial(JSON.stringify, _, replacer, null);
var deduped = _.uniq(list, stringify);
console.log(deduped);
<script src="https://underscorejs.org/underscore-umd-min.js"></script>
For Lodash 4, use _.uniqBy instead of _.uniq.
Using Lodash you can use this one-liner:
_.uniqBy(list, e => `${e.x}|${e.y}`)
(The composite key matters: e.x && e.y would deduplicate by y alone whenever x is truthy.)