Combine map() and concat() JavaScript - cleaner code question

Is it possible to combine map() and concat() in the following code to make it cleaner/shorter?
const firstColumnData = data.map((item: any) => {
  return item.firstColumn;
});
const secondColumnData = data.map((item: any) => {
  return item.secondColumn;
});
const allData = firstColumnData.concat(secondColumnData);
For context, later in the file allData is mapped through to populate data into columns. The specific data depends on which page is calling the component.
Basically, I am wondering if I can skip the declarations of firstColumnData and secondColumnData and assign the value to allData directly. This is an example of how I tried to refactor it, but it did not work (white page, the component could not render):
const allData = data.map((item: any) => {
  return item.firstColumn.concat(item.secondColumn);
});

You can use a single reduce() operation. Arguably, what you save by only having to iterate once, you lose in readability:
const data = [
  {firstColumn: 1, secondColumn: 2},
  {firstColumn: 3, secondColumn: 4}
];

const result = data.reduce((a, {firstColumn, secondColumn}, i, {length}) => {
  a[i] = firstColumn;           // firstColumn values fill the first half
  a[i + length] = secondColumn; // secondColumn values fill the second half
  return a;
}, []);

console.log(result); // [1, 3, 2, 4]

I agree with Robby that readability is probably the most important part in this.
You could one-line this, though, as:
const allData = [...data.map(item => item.firstColumn), ...data.map(item => item.secondColumn)];
but in this case you're still looping twice, so you haven't saved any computation, you've just made it shorter to write.
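If an interleaved order (first1, second1, first2, second2, ...) works for your columns, a single pass is also possible with flatMap. A sketch, assuming Array.prototype.flatMap is available (ES2019+); note the result order differs from concatenating the two mapped arrays:
// sketch: one pass over data, but pairs are interleaved per item
const allData = data.flatMap(item => [item.firstColumn, item.secondColumn]);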

Looping your current logic
My original answer below is, of course, about as performant as the proverbial January molasses, even though I like the shape of it.
A little testing shows that just putting your current logic into a loop offers about the same or better performance than a generalized version of RobbyCornelissen's answer (unless you unroll the loop in the reduce...) and has the benefit of simplicity. It relies on defining an array of column properties to iterate over.
const
  data = [{ firstColumn: 1, secondColumn: 2 }, { firstColumn: 3, secondColumn: 4 }],
  columns = ['firstColumn', 'secondColumn'],
  result = [].concat(...columns.map(col => data.map((item) => item[col])));

console.log(result);
Generalized reduce()
const
  data = [{ firstColumn: 1, secondColumn: 2 }, { firstColumn: 3, secondColumn: 4 }],
  columns = ['firstColumn', 'secondColumn'],
  result = data.reduce((a, item, i, { length }) => {
    for (let j = 0; j < columns.length; j++) {
      a[i + length * j] = item[columns[j]];
    }
    return a;
  }, []);

console.log(result);
Zip (original answer)
If the properties are guaranteed to be in order, you could 'zip' the Object.values. This will handle any number of properties without explicit destructuring.
/**
 * @see https://stackoverflow.com/a/10284006/13762301
 */
const zip = (...rows) => [...rows[0]].map((_, c) => rows.map((row) => row[c]));

const data = [
  { firstColumn: 1, secondColumn: 2 },
  { firstColumn: 3, secondColumn: 4 }
];

const result = zip(...data.map(Object.values)).flat();
console.log(result);
But to avoid relying on property order you can still destructure.
const result = zip(...data.map(({ firstColumn, secondColumn }) => [firstColumn, secondColumn])).flat()
see: Javascript equivalent of Python's zip function for more discussion on 'zip'.

Related

Given an array of objects, count how many (possibly different) properties are defined

In a GeoJSON file, some properties are shared by all "features" (element) of the entire collection (array). But some properties are defined only for a subset of the collection.
I've found this question: [javascript] counting properties of the objects in an array of objects, but it doesn't answer my problem.
Example:
const features =
[ {"properties":{"name":"city1","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4] ...]}},
{"properties":{"name":"city2","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4] ...]}},
{"properties":{"name":"city3"},"geometry":{"type":"multiPolygon","coordinates":[[[1,2],[3,4] ...]]}},
// ... for instance 1000 different cities
{"properties":{"name":"city1000","zip":1234,"updated":"May-2018"}, "geometry":{"type":"polygon","coordinates":[...]}}
];
Expected result: a list of all existing properties and their cardinality, letting us know how (in)complete the data-set is. For instance:
properties: 1000, properties.name: 1000, properties.zip: 890, properties.updated: 412,
geometry: 1000, geometry.type: 1000, geometry.coordinates: 1000
I have a (rather complicated) solution, but I do suspect that some people have already faced the same issue (seems a data science classic), with a better one (performance matters).
Here is my clumsy solution:
// 1: list all properties encountered in the features array, at least two levels deep
const countProps = af => af.reduce((pf,f) =>
Array.from(new Set(pf.concat(Object.keys(f)))), []);
// adding all the properties of each individual feature, then removing duplicates using the array-set-array trick
const countProp2s = af => af.reduce((pf,f) =>
Array.from(new Set(pf.concat(Object.keys(f.properties)))), []);
const countProp2g = af => af.reduce((pf,f) =>
Array.from(new Set(pf.concat(Object.keys(f.geometry)))), []);
// 2: counting the number of defined occurrences of each property of the list 1
const countPerProp = (ff) => pf => ` ${pf}:${ff.reduce((p,f)=> p+(!!f[pf]), 0)}`;
const countPerProp2s = (ff) => pf => ` ${pf}:${ff.reduce((p,f)=> p+(!!f.properties[pf]), 0)}`;
const countPerProp2g = (ff) => pf => ` ${pf}:${ff.reduce((p,f)=> p+(!!f.geometry[pf]), 0)}`;
const cardinalities = countProps(features).map((kk,i) => countPerProp(ff)(kk)) +
countProp2s(features).map(kk => countPerProp2s(ff)(kk)) +
countProp2g(features).map(kk => countPerProp2g(ff)(kk));
Therefore, there are three issues:
- Step 1: this is a lot of work (adding everything before removing most of it) for a rather simple operation. Moreover, it isn't recursive, and the second level is "manually forced".
- Step 2: a recursive solution is probably a better one.
- Could steps 1 and 2 be performed in a single pass (starting to count as soon as a new property is added)?
I would welcome any idea.
The JSON.parse reviver (and the JSON.stringify replacer) is called for every key/value pair during parsing, so it can be used to count them; the isNaN(k) test skips numeric keys such as array indices:
var counts = {}, json = `[{"properties":{"name":"city1","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},{"properties":{"name":"city2","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},{"properties":{"name":"city3"},"geometry":{"type":"multiPolygon","coordinates":[[[1,2],[3,4]]]}},{"properties":{"name":"city1000","zip":1234,"updated":"May-2018"}, "geometry":{"type":"polygon","coordinates":[]}} ]`
var features = JSON.parse(json, (k, v) => (isNaN(k) && (counts[k] = counts[k] + 1 || 1), v))
console.log( counts, features )
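For the sample above, this logs counts along the lines of:
{ name: 4, zip: 3, properties: 4, type: 4, coordinates: 4, geometry: 4, updated: 1 }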
Consider trying the following. It is just one reduce, with a couple of nested forEach's inside. It checks whether the keys for indicating the count exist in the object to be returned, and if not creates them initialized to 0. Then whether those keys existed or not to begin with, their corresponding values get incremented by 1.
Repl is here: https://repl.it/#dexygen/countobjpropoccur2levels , code below:
const features =
[ {"properties":{"name":"city1","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},
{"properties":{"name":"city2","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},
{"properties":{"name":"city3"},"geometry":{"type":"multiPolygon","coordinates":[[[1,2],[3,4]]]}},
{"properties":{"name":"city1000","zip":1234,"updated":"May-2018"}, "geometry":{"type":"polygon","coordinates":[]}}
];
const featuresCount = features.reduce((count, feature) => {
  Object.keys(feature).forEach(key => {
    count[key] = count[key] || 0;
    count[key] += 1;
    Object.keys(feature[key]).forEach(key2 => {
      let count2key = `${key}.${key2}`;
      count[count2key] = count[count2key] || 0;
      count[count2key] += 1;
    });
  });
  return count;
}, {});
console.log(featuresCount);
/*
{ properties: 4,
'properties.name': 4,
'properties.zip': 3,
geometry: 4,
'geometry.type': 4,
'geometry.coordinates': 4,
'properties.updated': 1 }
*/
Use polymorphic serialization of JSON using Jackson. It will look something like the snippet below. Your base interface will have all common properties, and for each variation you create a sub-type. A count on each type will give you what you need.
@JsonTypeInfo(use=JsonTypeInfo.Id.NAME, include=JsonTypeInfo.As.PROPERTY, property="name")
@JsonSubTypes({
    @JsonSubTypes.Type(value=Lion.class, name="lion"),
    @JsonSubTypes.Type(value=Tiger.class, name="tiger"),
})
public interface Animal { }

RxJs - get duplicate items of two observables

I need to get the duplicate items of two streams. I think I almost managed to do it, but it only works if the duplicate items in the second stream arrive in order. For example:
This works:
first = Observable.of(1, 2, 3)
second = Observable.of(2, 3, 1)
But this doesn't:
first = Observable.of(1, 4, 3)
second = Observable.of(1, 2, 3)
When my loop gets to the 4, it breaks:
EmptyError {name: "EmptyError", stack: "EmptyError: no elements in
sequence↵ at new Emp…e
(http://localhost:4200/vendor.bundle.js:161:22)", message: "no
elements in sequence"}
All of my code is in one function, so you can copy/paste and test it:
findDublicates() {
  let match = 0;     // initialised to 0 so another number can be assigned later
  let keys = [];     // list of matching keys
  let elementAt = 0; // index of item of first observable
  let allKeys$;
  let validKeys$;

  // counting the length of both observables; this will be the number of loops
  // that check for duplicates
  let allKeysLength;
  let validKeysLength;
  let allKeysLength$ = Observable.of(2, 1, 4, 5, 7).count();
  allKeysLength$.subscribe(val => allKeysLength = val);
  let validKeysLength$ = Observable.of(1, 2, 3, 8, 5).count();
  validKeysLength$.subscribe(val => validKeysLength = val);
  let cycles = Math.min(allKeysLength, validKeysLength); // length of the shorter observable

  // wrapping it in a function so the variables take new values when it is called
  function defineObs() {
    allKeys$ = Observable.of(2, 1, 4, 5, 7)
      .elementAt(elementAt).take(1);
    validKeys$ = Observable.of(1, 2, 3, 8, 5)
      .filter((x) => (x === match)).first();
  }

  for (var i = 0; i <= cycles; i++) {
    defineObs();
    allKeys$.subscribe(
      function (val) { match = val },
      function (err) { console.log(err) },
      function () { console.log('Done filter') }
    );
    validKeys$.subscribe(
      function (val) { keys.push(val) },
      function (err) { console.log(err) },
      function () { console.log('Done push') }
    );
    elementAt += 1;
    cycles -= 1;
  }
  return console.log(keys);
}
Thanks for any help.
If you don't care about which stream emits the first value of a set of duplicates, you can just merge them and treat it as finding duplicate values on a single stream:
first.merge(second)
.scan(([ dupes, uniques ], next) =>
[ uniques.has(next) ? dupes.add(next) : dupes, uniques.add(next) ],
[ new Set(), new Set() ]
)
.map(([ dupes ]) => dupes)
Note: Set.prototype.add mutates in place, so scan reuses the same Set instances on every iteration here; copy them into new Sets on each step if you need immutable accumulators.
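A usage sketch of that pipeline (my own addition, assuming the RxJS 5 instance operators used elsewhere in this question), taking the last emission to get the final set of duplicates:
const first = Observable.of(1, 4, 3);
const second = Observable.of(1, 2, 3);

first.merge(second)
  .scan(([ dupes, uniques ], next) =>
    [ uniques.has(next) ? dupes.add(next) : dupes, uniques.add(next) ],
    [ new Set(), new Set() ]
  )
  .map(([ dupes ]) => dupes)
  .last()                                       // only the final accumulated Set
  .subscribe(dupes => console.log([...dupes])); // [1, 3]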
I would check up on Observable.combineLatest and the scan method on an observable sequence.
Here's what I'm thinking: combine the two observables using combineLatest and apply the scan operator on that. You can even use a Set to ensure uniqueness, or a map and filter.
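A rough sketch of that idea (my own, assuming RxJS 5 as in the question; here toArray stands in for the scan-based accumulation, and the first/second streams are taken from the question):
const first = Observable.of(1, 4, 3);
const second = Observable.of(1, 2, 3);

Observable.combineLatest(
  first.toArray(),   // wait for the first stream to complete
  second.toArray(),  // wait for the second stream to complete
  (a, b) => a.filter(x => b.includes(x)) // intersection = the duplicates
).subscribe(dupes => console.log(dupes)); // [1, 3]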

Sum values of collection by type and assign to new property on replicated object

I'm trying to sum values from a collection and add them to a new property for each object
var collection = [
{prop:'title',quan:2},
{prop:'body',quan:3},
{prop:'title',quan:2},
{prop:'title',quan:4},
]
/* desired result
[
{prop:'title', quan:2, stock:8},
{prop:'body', quan:3, stock:3},
{prop:'title', quan:2, stock:8},
{prop:'title', quan:4, stock:8},
]
*/
I've tried many different ways with no success. I am trying to do this in a functional way.
This is where I am currently stuck, and I'm quite sure it is the most concise way.
// group the props using key
var result = _.groupBy(collection,'prop');
which outputs
{
title:[
{prop:'title',quan:2},{prop:'title',quan:2},{prop:'title',quan:4}
],
body:[
{prop:'body',quan:3}
]
}
So let's reduce the arrays we've created
var obj = {};
_.forEach(result, function(value, key) {
  obj[key] = _.reduce(value, function(acc, val) {
    return acc.quan + val.quan;
  });
});
This section above isn't working though?
When I have that working, I should be able to map it back to my final collection.
Once we know the totals, we can map them back to the collection:
var final = _.map(collection,function(value){
return {
type:value.prop,
val:value.quan,
stock:obj[value.prop]
}
});
jsbin
First you have to get the sum object and then assign the sum to the corresponding object. Like this:
function stock(arr) {
  // using lodash: var sum = _.reduce(arr, ...)
  var sum = arr.reduce(function(s, o) {    // build the sum object (hashing)
    s[o.prop] = (s[o.prop] || 0) + o.quan; // add this object's quan to the running total for its prop (starting from 0)
    return s;
  }, {});

  // using lodash: _.forEach(arr, ...);
  arr.forEach(function(o) {   // assign the sum to the objects
    o.stock = sum[o.prop];    // the stock for this object is the total stored under its prop in the sum object
  });
}
var collection = [
{prop:'title',quan:2},
{prop:'body',quan:3},
{prop:'title',quan:2},
{prop:'title',quan:4},
]
stock(collection);
console.log(collection);
If you want to return a new array and leave the original intact, use map like you already did instead of forEach like this:
// using lodash: _.map(arr, ...);
return arr.map(function(o) { // create new objects with a stock property
  return {
    prop: o.prop,
    quan: o.quan,
    stock: sum[o.prop]
  };
  // or in ES2015: return Object.assign({}, o, {stock: sum[o.prop]});
});
And then you'll have to use it like this:
var newArray = stock(collection);
I like this solution better because it abstracts away the collation but allows you to control how items are collated using a higher-order function.
Notice how we don't talk about the kind or structure of data at all in the collateBy function – this keeps our function generic and allows for it to work on data of any shape.
collateBy takes a grouping function f, a reducing function g, and a homogeneous array of any type of data.
// collateBy :: (a -> b) -> ((c,a) -> c) -> [a] -> Map(b:c)
const collateBy = f => g => xs => {
  return xs.reduce((m, x) => {
    let v = f(x)
    return m.set(v, g(m.get(v), x))
  }, new Map())
}

const collateByProp = collateBy (x => x.prop)

const assignTotalStock = xs => {
  let stocks = collateByProp ((acc = 0, {quan}) => acc + quan) (xs)
  return xs.map(({prop, quan}) =>
    ({prop, quan, stock: stocks.get(prop)}))
}
var collection = [
{prop:'title',quan:2},
{prop:'body',quan:3},
{prop:'title',quan:2},
{prop:'title',quan:4},
]
console.log(assignTotalStock(collection))
// [ { prop: 'title', quan: 2, stock: 8 },
// { prop: 'body', quan: 3, stock: 3 },
// { prop: 'title', quan: 2, stock: 8 },
// { prop: 'title', quan: 4, stock: 8 } ]
Performing collations is very common when manipulating data, so it doesn't make sense to encode collation behaviour in each function that needs it. Instead, use a generic, higher-order function that captures only the essence of a collation computation, and specialize it using grouping function f and reducing function g
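As an illustration of that genericity (a sketch of my own, not part of the original answer), the same collateBy helper can be specialized with a different reducing function, for example to count how many items share each prop:
// hypothetical reuse of collateBy: count occurrences per prop
const countByProp = collateBy (x => x.prop) ((n = 0) => n + 1)
console.log(countByProp(collection))
// Map { 'title' => 3, 'body' => 1 }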

match an object in an array of objects and remove

After 2 days of fighting this problem I'm hoping someone can help me out.
I have two arrays of objects, like this:
let oldRecords = [
{name: 'john'},
{name: 'ringo'}
];
let newRecords = [
{name: 'paul'},
{name: 'john'},
{name: 'stuart'}
];
I am trying to end up with a function that returns named variables containing a list of data that's been added (exists in newRecords but not in oldRecords) and a list of data that has been removed (exists in oldRecords but not in newRecords).
for example
const analyse = (older, newer) => {
let added, removed;
// ... magic
return {added, removed}
}
const {added, removed} = analyse(oldRecords, newRecords);
I won't post all the code I've tried inside this function as I have tried to map, reduce and loop through both arrays creating temporary arrays for the last 48 hours and I could now fill a book with code I've written and deleted. I have also used underscore.js methods like reject/find/findWhere which all got me close but no cigar.
The main issue I am having is that the arrays contain objects; it's super easy if they contain numbers, e.g.:
var oldRecords = [1, 3];
var newRecords = [1, 2, 4];

function analyse(old, newer) {
  let added = [];
  let removed = [];
  old.reduce((o) => {
    added = added.concat(_.reject(newer, (num) => num === o));
  });
  newer.reduce((n) => {
    removed = _.reject(old, (num) => num === n);
  });
  return {added, removed}
}
const {added, removed} = analyse(oldRecords, newRecords);
How can I achieve the above but with objects not arrays?
n.b. I tried modifying the above and using JSON.stringify but it didn't really work.
EDIT: an important point I forgot to add: the object structure and its keys are dynamic, they come from a database, so any checking of individual keys must also be dynamic, i.e. not hard-coded values.
You could use reject and some to check for inclusion in the appropriate sets. The isEqual function is used to check for equality to handle dynamic keys:
const analyse = (older, newer) => {
  let added = _.reject(newer, n => _.some(older, o => _.isEqual(n, o)));
  let removed = _.reject(older, o => _.some(newer, n => _.isEqual(n, o)));
  return {added, removed}
}
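A quick usage check with the arrays from the question (my own example; underscore/lodash assumed to be loaded as _):
const {added, removed} = analyse(oldRecords, newRecords);
console.log(added);   // [{name: 'paul'}, {name: 'stuart'}]
console.log(removed); // [{name: 'ringo'}]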
You could try this:
const analyse = (older, newer) => {
  let removed = older.filter(oldItem => {
    return newer.filter(newItem => {
      return _.isEqual(newItem, oldItem);
    }).length === 0;
  });
  let added = newer.filter(newItem => {
    return older.filter(oldItem => {
      return _.isEqual(newItem, oldItem);
    }).length === 0;
  });
  return {added, removed};
}
You can first create a function to check whether two objects are equal, and then use filter() and some() to return the result.
let oldRecords = [
{name: 'john'},
{name: 'ringo'}
];
let newRecords = [
{name: 'paul'},
{name: 'john'},
{name: 'stuart'}
];
function isEqual(o1, o2) {
  return Object.keys(o1).length == Object.keys(o2).length &&
    Object.keys(o1).every(function(key) {
      return o2.hasOwnProperty(key) && o1[key] == o2[key];
    });
}
var result = {}
result.removed = oldRecords.filter(e => !newRecords.some(a => isEqual(e, a)))
result.added = newRecords.filter(e => !oldRecords.some(a => isEqual(e, a)))
console.log(result)

Pairwise combinations of entries in a javascript array

I'm given an array of entries in javascript, such as :
var entries = ["cat", "dog", "chicken", "pig"];
I'd now like to iterate over all unique pairwise combinations of them. In this example, I'd like to see:
("cat", "dog"),
("cat", "chicken"),
...
In other languages, like Scala, this is super easy. You just do
entries.combinations(2)
Is there a similar method or function in a library for JavaScript? Or do I just have to write it myself the ugly way with nested loops?
var arr = ["cat","dog","chicken","pig"].map(function(item,i,arr) {
return arr.map(function(_item) { if( item != _item) return [item, _item];});
});
This will return the expected results. There are caveats: it does not work in older browsers without shims. Also, each inner array contains an undefined entry where the item would have paired with itself, so you get 4 arrays of 4 (one slot undefined) instead of 4 arrays of 3. I'm sure there is a more graceful way to handle this.
Array.prototype.map() - MDN
Edit: this will give you the proper pairwise combinations.
var arr = ["cat","dog","chicken","pig"].map(function(item,i,arr) {
var tmp = arr.map(function(_item) { if( item != _item) return [item, _item];});
return tmp.splice(tmp.indexOf(undefined),1), tmp;
});
Array splice method - MDN
and here is a more readable version of the same code.
var myArray = ["cat", "dog", "chicken", "pig"];
var pairwise = myArray.map(function(item, index, originalArray) {
  var tmp = originalArray.map(function(_item) {
    if (item != _item) {
      return [item, _item];
    }
  });
  tmp.splice(tmp.indexOf(undefined), 1); // because there is now one undefined index we must remove it
  return tmp;
});
Not as far as I know. I think you have to stick to nested loops.
A similar question has been asked here: Output each combination of an array of numbers with javascript; maybe you can find an answer there.
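For reference, a minimal nested-loop sketch (my own addition) that produces the unique pairs from the entries array in the question:
const pairs = [];
for (let i = 0; i < entries.length; i++) {
  for (let j = i + 1; j < entries.length; j++) {
    pairs.push([entries[i], entries[j]]); // each unordered pair exactly once
  }
}
console.log(pairs);
// [["cat","dog"], ["cat","chicken"], ["cat","pig"],
//  ["dog","chicken"], ["dog","pig"], ["chicken","pig"]]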
With ES6 syntax, one can use a shorter version of @rlemon's answer:
["cat","dog","chicken","pig"].sort().reduce(
(acc, item, i, arr) => acc.concat(
arr.slice(i + 1).map(_item => [item, _item])
),
[])
This takes care of undefineds, and also outputs only unique combinations, as per OP's question.
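For reference (a quick check of my own, not part of the original answer), the sorted input produces:
[["cat","chicken"], ["cat","dog"], ["cat","pig"], ["chicken","dog"], ["chicken","pig"], ["dog","pig"]]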
After reviewing the question, this answer doesn't correctly solve the question. The question asks for all combinations, but the function below combines all adjacent even and odd indexes of the array.
Here is a pairwise implementation I did using reduce
function pairwise(arr) {
  return arr.reduce(function(acc, current, index) {
    var isFirstPair = (index % 2) === 0;
    if (isFirstPair) {
      acc.push([current]);
    } else {
      var lastElement = acc[acc.length - 1];
      lastElement.push(current);
    }
    return acc;
  }, []);
}
var nums = [1,2,3,4,5,6];
var res = pairwise(nums);
res.forEach(function(elem) {
console.log(elem);
});
Returns:
[
[1, 2]
[3, 4]
[5, 6]
]
Here's a generic TypeScript implementation (you can get the pure JS by removing the types):
// Returns an array of size
const sizedArray = (n: number): null[] => Array(n).fill(null);
// calls the callback n times
const times = (n: number, cb: () => void): void => {
while (0 < n--) {
cb();
}
};
// Fills up the array with the return values of subsequent calls of cb
const fillWithCb = <T>(n: number, cb: () => T): T[] => sizedArray(n).map(cb);
// Generic to produce pairwise, 3 element wise etc..
const nWise = (n: number): (<T>(array: T[]) => T[][]) => <T>(
array: T[]
): T[][] => {
const iterators = fillWithCb(n, () => array[Symbol.iterator]());
iterators.forEach((it, index) => times(index, () => it.next()));
return fillWithCb(array.length - n + 1, () =>
iterators.map(it => it.next().value)
);
};
// curried nWise with 2 -> pairWise
export const pairWise = nWise(2);
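A quick usage check (my own example): with the animals array from the question, pairWise produces sliding windows of two adjacent elements rather than all unique combinations:
console.log(pairWise(["cat", "dog", "chicken", "pig"]));
// [["cat", "dog"], ["dog", "chicken"], ["chicken", "pig"]]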
The most effective and simple solution is probably reduce and slice. However, if you just want to iterate over the values, you can use a generator.
// Util: returns an iterable that yields the array in chunks of two
function pairWise(arr) {
  return {
    [Symbol.iterator]: function* () {
      for (let i = 0; i < arr.length; i = i + 2)
        yield arr.slice(i, i + 2);
    }
  };
}
// How to use it
for (const ent of pairWise([1, 2, 3, 4, 5, 6, 7])) {
  console.log(ent);
}
// Output
/*
[ 1, 2 ]
[ 3, 4 ]
[ 5, 6 ]
[ 7 ]
*/
