Pairwise combinations of entries in a javascript array - javascript

I'm given an array of entries in JavaScript, such as:
var entries = ["cat", "dog", "chicken", "pig"];
I'd now like to iterate over all unique pairwise combinations of them. In this example, I'd like to see:
("cat", "dog"),
("cat", "chicken"),
...
In other languages, like Scala, this is super easy. You just do
entries.combinations(2)
Is there a similar method or function in a library for JavaScript? Or do I just have to write it myself the ugly way with nested loops?

var arr = ["cat","dog","chicken","pig"].map(function(item,i,arr) {
return arr.map(function(_item) { if( item != _item) return [item, _item];});
});
This will return the expected results. There are caveats: it does not work in older browsers without shims, and the duplicate value is undefined instead of there being 4 arrays of 3. I'm sure there is a more graceful way to handle this.
Array.prototype.map() - MDN
Edit: this will give you the proper pairwise combinations.
var arr = ["cat","dog","chicken","pig"].map(function(item,i,arr) {
var tmp = arr.map(function(_item) { if( item != _item) return [item, _item];});
return tmp.splice(tmp.indexOf(undefined),1), tmp;
});
Array splice method - MDN
and here is a more readable version of the same code.
var myArray = ["cat", "dog", "chicken", "pig"];
var pairwise = myArray.map(function(item, index, originalArray) {
var tmp = originalArray.map(function(_item) {
if (item != _item) {
return [item, _item];
}
});
tmp.splice(tmp.indexOf(undefined), 1); // because there is now one undefined index we must remove it.
return tmp;
});

Not as far as I know. I think you have to stick to nested loops.
A similar question has been asked here: Output each combination of an array of numbers with javascript; maybe you can find an answer there.
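For completeness, a plain nested-loop sketch of the "ugly way" is short:
var entries = ["cat", "dog", "chicken", "pig"];
var pairs = [];
for (var i = 0; i < entries.length; i++) {
  for (var j = i + 1; j < entries.length; j++) {
    pairs.push([entries[i], entries[j]]);
  }
}
// pairs -> [["cat","dog"], ["cat","chicken"], ["cat","pig"], ["dog","chicken"], ["dog","pig"], ["chicken","pig"]]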

With ES6 syntax, one can use a shorter version of @rlemon's answer:
["cat","dog","chicken","pig"].sort().reduce(
(acc, item, i, arr) => acc.concat(
arr.slice(i + 1).map(_item => [item, _item])
),
[])
This takes care of undefineds, and also outputs only unique combinations, as per OP's question.
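For reference, running this on the sample array yields the six unique pairs:
// [["cat","chicken"], ["cat","dog"], ["cat","pig"], ["chicken","dog"], ["chicken","pig"], ["dog","pig"]]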

After reviewing the question, this answer doesn't correctly solve the question. The question asks for all combinations, but the function below combines all adjacent even and odd indexes of the array.
Here is a pairwise implementation I did using reduce
function pairwise(arr) {
return arr.reduce(function(acc, current, index) {
var isFirstPair = (index % 2) === 0;
if (isFirstPair) {
acc.push([current]);
} else {
var lastElement = acc[acc.length - 1];
lastElement.push(current);
}
return acc;
}, []);
}
var nums = [1,2,3,4,5,6];
var res = pairwise(nums);
res.forEach(function(elem) {
console.log(elem);
});
Returns:
[
  [1, 2],
  [3, 4],
  [5, 6]
]

Here's a generic TypeScript implementation (you can get the pure JS by removing the types):
// Returns an array of size n filled with null
const sizedArray = (n: number): null[] => Array(n).fill(null);
// calls the callback n times
const times = (n: number, cb: () => void): void => {
while (0 < n--) {
cb();
}
};
// Fills up the array with the return values of subsequent calls of cb
const fillWithCb = <T>(n: number, cb: () => T): T[] => sizedArray(n).map(cb);
// Generic to produce pairwise, 3-element-wise, etc.
const nWise = (n: number): (<T>(array: T[]) => T[][]) => <T>(
array: T[]
): T[][] => {
const iterators = fillWithCb(n, () => array[Symbol.iterator]());
iterators.forEach((it, index) => times(index, () => it.next()));
return fillWithCb(array.length - n + 1, () =>
iterators.map(it => it.next().value)
);
};
// curried nWise with 2 -> pairWise
export const pairWise = nWise(2);
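A quick usage sketch (with the types removed); note that this produces sliding windows of adjacent elements rather than the unique combinations asked for in the original question:
pairWise([1, 2, 3, 4]); // -> [[1, 2], [2, 3], [3, 4]]
nWise(3)([1, 2, 3, 4]); // -> [[1, 2, 3], [2, 3, 4]]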

The most effective and simple solution is reduce and slice. However, if you just want to iterate over the values, you can use a generator.
// Util: exposes the array as an iterable of two-element chunks
function pairWise(arr) {
  return {
    [Symbol.iterator]: function* () {
      for (let i = 0; i < arr.length; i += 2) {
        yield arr.slice(i, i + 2);
      }
    }
  };
}
// How to use it
for (const ent of pairWise([1, 2, 3, 4, 5, 6, 7])) {
  console.log(ent)
}
// Output
/*
[ 1, 2 ]
[ 3, 4 ]
[ 5, 6 ]
[ 7 ]
*/

Related

Combine map() and concat() JavaScript - cleaner code question

Is it possible to combine map() and concat() in the following code to make it cleaner/shorter?
const firstColumnData = data.map((item: any) => {
return item.firstColumn;
});
const secondColumnData = data.map((item: any) => {
return item.secondColumn;
});
const allData = firstColumnData.concat(secondColumnData);
For context, later in the file allData is mapped through to populate data into columns. The specific data depends on which page is calling the component.
Basically, I am wondering if I can skip the declaration of firstColumnData and secondColumn data and assign the value to allData directly. This is an example of how I tried to refactor, but it did not work. (white page, could not render)
const allData = data.map((item: any => {
return item.firstColumn.concat(item.secondColumn)
});
You can use a single reduce() operation. Arguably, what you save by only having to iterate once, you lose in readability:
const data =[
{firstColumn: 1, secondColumn: 2},
{firstColumn: 3, secondColumn: 4}
];
const result = data.reduce((a, {firstColumn, secondColumn}, i, {length}) => {
a[i] = firstColumn;
a[i + length] = secondColumn;
return a;
}, []);
console.log(result);
I agree with Robby that readability is probably the most important part in this.
You could one line this though, as:
const allData = [...data.map(item => item.firstColumn), ...data.map(item => item.secondColumn)];
but in this case you're still looping twice, so you haven't saved any computation, you've just made it shorter to write.
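If the relative order of the values in allData doesn't matter, a single flatMap pass is another option; note this sketch interleaves the values (first1, second1, first2, second2) rather than putting all first-column values before all second-column values:
const allData = data.flatMap((item: any) => [item.firstColumn, item.secondColumn]);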
Looping your current logic
My original answer below is, of course, about as performant as the proverbial January molasses, even though I like the shape of it.
A little testing shows that just putting your current logic into a loop offers about the same or better performance than a generalized version of RobbyCornelissen's answer (unless you unroll the loop in the reduce...) and has the benefit of simplicity. It relies on defining an array of column properties to iterate over.
const
data = [{ firstColumn: 1, secondColumn: 2 }, { firstColumn: 3, secondColumn: 4 }],
columns = ['firstColumn', 'secondColumn'],
result = [].concat(...columns.map(col => data.map((item) => item[col])));
console.log(result);
Generalized reduce()
const
data = [{ firstColumn: 1, secondColumn: 2 }, { firstColumn: 3, secondColumn: 4 }],
columns = ['firstColumn', 'secondColumn'],
result = data.reduce((a, item, i, { length }) => {
for (let j = 0; j < columns.length; j++) {
a[i + length * j] = item[columns[j]]
}
return a
}, []);
console.log(result);
Zip (original answer)
If the properties are guaranteed to be in order you could 'zip' the Object.values. This will handle any number of properties without explicit destructuring.
const data = [
{ firstColumn: 1, secondColumn: 2 },
{ firstColumn: 3, secondColumn: 4 }
];
const result = zip(...data.map(Object.values)).flat()
console.log(result)
/**
 * @see https://stackoverflow.com/a/10284006/13762301
 */
const zip = (...rows) => [...rows[0]].map((_, c) => rows.map((row) => row[c]));
But to avoid relying on property order you can still destructure.
const result = zip(...data.map(({ firstColumn, secondColumn }) => [firstColumn, secondColumn])).flat()
see: Javascript equivalent of Python's zip function for more discussion on 'zip'.

Sorting array of objects into an array of paired objects with Javascript

I have an array of objects and I want to be able to sort them by their "site" value into pairs. There can't be more than 2 objects in each child array, so if there are 3 matches I get 1 child array with 2 objects and 1 child array with 1 object.
I have:
[{site:'A'}, {site:'A'}, {site:'B'}, {site:'B'}, {site:'B'}];
I want:
[[{site:'A'}, {site:'A'}],[{site:'B'}, {site:'B'}], [{site:'B'}]]
What's the best way to do this? Any help is appreciated.
This should work for you
function sortArray(arr){
arr.sort((a,b)=>a.site > b.site ? 1 : -1) // Sorting the array to have consecutive values
let chunks = [];
for(let i = 0;i<arr.length;i+=2){
if(arr[i]?.site == arr[i+1]?.site) chunks.push(arr.slice(i,i+2));
else {
chunks.push([arr[i]]);
i--;
}
}
return chunks;
}
let arr = [{site:'A'}, {site:'A'}, {site:'B'}, {site:'B'}, {site:'B'}];
console.log(sortArray(arr))
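For reference, the call above logs:
// [ [ { site: 'A' }, { site: 'A' } ],
//   [ { site: 'B' }, { site: 'B' } ],
//   [ { site: 'B' } ] ]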
Using reduce ;) :
const a = [{
site: 'A'
}, {
site: 'A'
}, {
site: 'B'
}, {
site: 'B'
}, {
site: 'B'
}];
var r = a.reduce((ac, x) => ({
...ac,
[x.site]: [...(ac[x.site] || []), x]
}), {})
var r2 = Object.values(r).flatMap(x =>
x.reduce((ac, z, i) => {
if (i % 2) {
ac[i - 1].push(z)
return ac
}
return [...ac, [z]]
}, []))
console.log(r2)
PS: Since this is hard to read, I'd suggest using lodash (specifically the groupBy and chunk methods).
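A minimal sketch of that lodash approach (assuming lodash is available as _):
const arr = [{site:'A'}, {site:'A'}, {site:'B'}, {site:'B'}, {site:'B'}];
// group by site, then split each group into chunks of at most two
const pairs = _.flatMap(_.groupBy(arr, 'site'), group => _.chunk(group, 2));
console.log(pairs); // [[{site:'A'},{site:'A'}], [{site:'B'},{site:'B'}], [{site:'B'}]]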
It's kind of a 'groupBy' operation (as seen in underscore or lodash). Those produce an object keyed by the values being grouped. Consider writing it that way for general use. To get the shape the OP is looking for, strip out the values of that result...
function groupBy(array, key) {
return array.reduce((acc, el) => {
let value = el[key];
if (!acc[value]) acc[value] = [];
acc[value].push(el);
return acc;
}, {});
}
let array = [{site:'A'}, {site:'A'}, {site:'B'}, {site:'B'}, {site:'B'}];
let grouped = groupBy(array, 'site'); // produces { A: [{site:'A'} ...], B: ... }
let groups = Object.values(grouped)
console.log(groups)
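If the two-per-array limit from the question is also needed, the grouped values can then be chunked; a minimal follow-up sketch:
let paired = groups.flatMap(group => {
  let chunks = [];
  for (let i = 0; i < group.length; i += 2) chunks.push(group.slice(i, i + 2));
  return chunks;
});
console.log(paired); // [[{site:'A'},{site:'A'}], [{site:'B'},{site:'B'}], [{site:'B'}]]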

How to print elements of an array in alternate order with the help of function in javascript?

I want to know how we can print the elements of an array in alternate order with the help of a function in JavaScript.
array = [6,20,99,10,60,31,05,08];
result = [6,99,60,05]
You could use .filter - a higher-order function in JavaScript. By using filter you can filter your elements via their index. If the index i modulo 2 is zero (i.e. !(i % 2)) then you keep the element, and thus return true:
const arr = [6,20,99,10,60,31,05,08],
res = arr.filter((_, i) => !(i % 2));
console.log(res);
We want to traverse the array only once and keep decent time complexity, so let's make our function do everything in a single pass using an array helper. We also only want it to do something for every other index, so we take the index modulo 2 and only call the function when that is 0. To give this the same footprint as other common array methods, it passes along the same arguments it receives from forEach to the function that gets passed in.
function everyOther (array, fn) {
array.forEach(function (value, index, array) {
if (!(index % 2)) fn(value, index, array);
})
}
Now we just call it and do what we want!
everyOther([1,2,3], function (value) { console.log(value) }); // prints 1, 3
We could even make this strictly a print only of every other by wrapping it in a closure
function printEveryOther (array) {
everyOther(array, function (value) { console.log(value) });
}
printEveryOther([1,2,3]) // 1, 3
Now you have an adaptable function that has a familiar footprint and even can be made to do other things!
Things that were made use of in this example:
Function declarations
Anonymous Functions
Array.prototype.forEach()
Modulo operator
Okay now for some spice thanks to some more recent additions to JavaScript:
const everyOther = (a, f) => a.forEach((v, i) => !(i%2) && f(v,i,a));
const mapEveryOther = (a, f) => {
const r = [];
everyOther(a, (v, i) => r.push(f(v,i,a)));
return r;
}
And now we've made another function entirely that instead returns an array
const doubleEveryOther = a => mapEveryOther(a, a => a + a);
doubleEveryOther([1,2,3]) // [2,6]
Concepts used:
Arrow Functions
Closures
There's still lots more to have fun with. Happy learning!
Use a for loop and increment i by 2:
let array = [2, 20, 5, 66, 5, 98, 4, 6];
function alternate(array) {
for (let i = 0; i < array.length; i += 2) {
console.log((i + 1) + ": " + array[i]);
}
}
alternate(array);

match an object in an array of objects and remove

After 2 days of fighting this problem I'm hoping someone can help me out.
I have two arrays of objects, like this:
let oldRecords = [
{name: 'john'},
{name: 'ringo'}
];
let newRecords = [
{name: 'paul'},
{name: 'john'},
{name: 'stuart'}
];
I am trying to end up with a function that returns named variables containing a list of data thats been added (exist in newRecords but not in oldRecords) and a list of data that has been removed (exists in oldRecords but not in newRecords).
for example
const analyse = (older, newer) => {
let added, removed;
// ... magic
return {added, removed}
}
const {added, removed} = analyse(oldRecords, newRecords);
I won't post all the code I've tried inside this function as I have tried to map, reduce and loop through both arrays creating temporary arrays for the last 48 hours and I could now fill a book with code I've written and deleted. I have also used underscore.js methods like reject/find/findWhere which all got me close but no cigar.
The main issue I am having is that the arrays contain objects; it's super easy if they contain numbers, e.g.
var oldRecords = [1, 3];
var newRecords = [1, 2, 4]
function analyse (old, newer) {
let added = [];
let removed = [];
old.reduce((o) => {
added = added.concat(_.reject(newer, (num) => num === o ));
});
newer.reduce((n) => {
removed = _.reject(old, (num) => num === n );
});
return {added, removed}
}
const {added, removed} = analyse(oldRecords, newRecords);
How can I achieve the above but with objects not arrays?
n.b. I tried modifying the above and using JSON.stringify but it didn't really work.
EDIT: an important point I forgot to add: the object structure and its keys are dynamic (they come from a database), so any checking of individual keys must also be dynamic, i.e. not hard-coded values.
You could use reject and some to check for inclusion in the appropriate sets. The isEqual function is used to check for equality to handle dynamic keys:
const analyse = (older, newer) => {
let added = _.reject(newer, n => _.some(older, o =>_.isEqual(n, o)));
let removed = _.reject(older, o => _.some(newer, n => _.isEqual(n, o)));
return {added, removed}
}
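With the question's sample data:
const {added, removed} = analyse(oldRecords, newRecords);
// added   -> [{name: 'paul'}, {name: 'stuart'}]
// removed -> [{name: 'ringo'}]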
You could try this:
const analyse = (older, newer) => {
let removed = older.filter(newItem => {
return newer.filter(oldItem => {
return _.isEqual(newItem, oldItem);
}).length === 0
});
let added = newer.filter(oldItem => {
return older.filter(newItem => {
return _.isEqual(newItem, oldItem);
}).length === 0
});
return {added, removed};
}
You can first create a function to check if two objects are equal, and then use filter() and some() to return the result.
let oldRecords = [
{name: 'john'},
{name: 'ringo'}
];
let newRecords = [
{name: 'paul'},
{name: 'john'},
{name: 'stuart'}
];
function isEqual(o1, o2) {
return Object.keys(o1).length == Object.keys(o2).length &&
Object.keys(o1).every(function(key) {
return o2.hasOwnProperty(key) && o1[key] == o2[key]
})
}
var result = {}
result.removed = oldRecords.filter(e => !newRecords.some(a => isEqual(e, a)))
result.added = newRecords.filter(e => !oldRecords.some(a => isEqual(e, a)))
console.log(result)

How to skip over an element in .map()?

How can I skip an array element in .map?
My code:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else{
return img.src;
}
});
This will return:
["img.png", null, "img.png"]
Just .filter() it first:
var sources = images.filter(function(img) {
if (img.src.split('.').pop() === "json") {
return false; // skip
}
return true;
}).map(function(img) { return img.src; });
If you don't want to do that, which is not unreasonable since it has some cost, you can use the more general .reduce(). You can generally express .map() in terms of .reduce():
someArray.map(function(element) {
return transform(element);
});
can be written as
someArray.reduce(function(result, element) {
result.push(transform(element));
return result;
}, []);
So if you need to skip elements, you can do that easily with .reduce():
var sources = images.reduce(function(result, img) {
if (img.src.split('.').pop() !== "json") {
result.push(img.src);
}
return result;
}, []);
In that version, the code in the .filter() from the first sample is part of the .reduce() callback. The image source is only pushed onto the result array in the case where the filter operation would have kept it.
update — This question gets a lot of attention, and I'd like to add the following clarifying remark. The purpose of .map(), as a concept, is to do exactly what "map" means: transform a list of values into another list of values according to certain rules. Just as a paper map of some country would seem weird if a couple of cities were completely missing, a mapping from one list to another only really makes sense when there's a 1 to 1 set of result values.
I'm not saying that it doesn't make sense to create a new list from an old list with some values excluded. I'm just trying to make clear that .map() has a single simple intention, which is to create a new array of the same length as an old array, only with values formed by a transformation of the old values.
Since 2019, Array.prototype.flatMap is a good option.
images.flatMap(({src}) => src.endsWith('.json') ? [] : src);
From MDN:
flatMap can be used as a way to add and remove items (modify the number of items) during a map. In other words, it allows you to map many items to many items (by handling each input item separately), rather than always one-to-one. In this sense, it works like the opposite of filter. Simply return a 1-element array to keep the item, a multiple-element array to add items, or a 0-element array to remove the item.
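For instance, with a hypothetical images array:
const images = [{src: 'img.png'}, {src: 'data.json'}, {src: 'photo.png'}];
images.flatMap(({src}) => src.endsWith('.json') ? [] : src);
// -> ['img.png', 'photo.png']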
I think the simplest way to skip some elements from an array is by using the filter() method.
By using this method (ES5) and ES6 syntax you can write your code in one line, and it will return what you want:
let images = [{src: 'img.png'}, {src: 'j1.json'}, {src: 'img.png'}, {src: 'j2.json'}];
let sources = images.filter(img => img.src.slice(-4) != 'json').map(img => img.src);
console.log(sources);
TLDR: You can first filter your array and then perform your map but this would require two passes on the array (filter returns an array to map). Since this array is small, it is a very small performance cost. You can also do a simple reduce. However if you want to re-imagine how this can be done with a single pass over the array (or any datatype), you can use an idea called "transducers" made popular by Rich Hickey.
Answer:
We should not need ever-increasing dot chaining on the array ([].map(fn1).filter(f2)...), since this approach creates an intermediate array in memory for every chained call.
The best approach operates on the actual reducing function so there is only one pass of data and no extra arrays.
The reducing function is the function passed into reduce and takes an accumulator and input from the source and returns something that looks like the accumulator
// 1. create a concat reducing function that can be passed into `reduce`
const concat = (acc, input) => acc.concat([input])
// note that [1,2,3].reduce(concat, []) would return [1,2,3]
// transforming your reducing function by mapping
// 2. create a generic mapping function that can take a reducing function and return another reducing function
const mapping = (changeInput) => (reducing) => (acc, input) => reducing(acc, changeInput(input))
// 3. create your map function that operates on an input
const getSrc = (x) => x.src
const mappingSrc = mapping(getSrc)
// 4. now we can use our `mapSrc` function to transform our original function `concat` to get another reducing function
const inputSources = [{src:'one.html'}, {src:'two.txt'}, {src:'three.json'}]
inputSources.reduce(mappingSrc(concat), [])
// -> ['one.html', 'two.txt', 'three.json']
// remember this is really essentially just
// inputSources.reduce((acc, x) => acc.concat([x.src]), [])
// transforming your reducing function by filtering
// 5. create a generic filtering function that can take a reducing function and return another reducing function
const filtering = (predicate) => (reducing) => (acc, input) => (predicate(input) ? reducing(acc, input): acc)
// 6. create your filter function that operate on an input
const filterJsonAndLoad = (img) => {
console.log(img)
if(img.src.split('.').pop() === 'json') {
// game.loadSprite(...);
return false;
} else {
return true;
}
}
const filteringJson = filtering(filterJsonAndLoad)
// 7. notice the type of input and output of these functions
// concat is a reducing function,
// mapSrc transforms and returns a reducing function
// filterJsonAndLoad transforms and returns a reducing function
// these functions that transform reducing functions are "transducers", termed by Rich Hickey
// source: http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
// we can pass this all into reduce! and without any intermediate arrays
const sources = inputSources.reduce(filteringJson(mappingSrc(concat)), []);
// [ 'one.html', 'two.txt' ]
// ==================================
// 8. BONUS: compose all the functions
// You can decide to create a composing function which takes an infinite number of transducers to
// operate on your reducing function to compose a computed accumulator without ever creating that
// intermediate array
const composeAll = (...args) => (x) => {
const fns = args
var i = fns.length
while (i--) {
x = fns[i].call(this, x);
}
return x
}
const doABunchOfStuff = composeAll(
filtering((x) => x.src.split('.').pop() !== 'json'),
mapping((x) => x.src),
mapping((x) => x.toUpperCase()),
mapping((x) => x + '!!!')
)
const sources2 = inputSources.reduce(doABunchOfStuff(concat), [])
// ['ONE.HTML!!!', 'TWO.TXT!!!']
Resources: rich hickey transducers post
Here's a fun solution:
/**
* Filter-map. Like map, but skips undefined values.
*
* @param callback
*/
function fmap(callback) {
return this.reduce((accum, ...args) => {
const x = callback(...args);
if(x !== undefined) {
accum.push(x);
}
return accum;
}, []);
}
Use with the bind operator:
[1,2,-1,3]::fmap(x => x > 0 ? x * 2 : undefined); // [2,4,6]
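The bind operator is still only a proposal; without it, the same helper can be invoked with Function.prototype.call:
fmap.call([1, 2, -1, 3], x => x > 0 ? x * 2 : undefined); // [2, 4, 6]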
Why not just use a forEach loop?
let arr = ['a', 'b', 'c', 'd', 'e'];
let filtered = [];
arr.forEach(x => {
if (!x.includes('b')) filtered.push(x);
});
console.log(filtered) // filtered === ['a','c','d','e'];
Or even simpler use filter:
const arr = ['a', 'b', 'c', 'd', 'e'];
const filtered = arr.filter(x => !x.includes('b')); // ['a','c','d','e'];
Answer sans superfluous edge cases:
const thingsWithoutNulls = things.reduce((acc, thing) => {
if (thing !== null) {
acc.push(thing);
}
return acc;
}, [])
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else{
return img.src;
}
}).filter(Boolean);
The .filter(Boolean) will filter out any falsey values in a given array, which in your case is the null.
To extrapolate on Felix Kling's comment, you can use .filter() like this:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json") { // if extension is .json
return null; // skip
} else {
return img.src;
}
}).filter(Boolean);
That will remove falsey values from the array that is returned by .map()
You could simplify it further like this:
var sources = images.map(function (img) {
if(img.src.split('.').pop() !== "json") { // if extension is not .json
return img.src;
}
}).filter(Boolean);
Or even as a one-liner using an arrow function, object destructuring and the && operator:
var sources = images.map(({ src }) => src.split('.').pop() !== "json" && src).filter(Boolean);
Here's a utility method (ES5 compatible) which only maps non null values (hides the call to reduce):
function mapNonNull(arr, cb) {
return arr.reduce(function (accumulator, value, index, arr) {
var result = cb.call(null, value, index, arr);
if (result != null) {
accumulator.push(result);
}
return accumulator;
}, []);
}
var result = mapNonNull(["a", "b", "c"], function (value) {
return value === "b" ? null : value; // exclude "b"
});
console.log(result); // ["a", "c"]
If it is null or undefined, in one line (ES5/ES6):
// will return an array of src values (p = each image object)
images.filter(p => p.src).map(p => p.src);
// with your condition
images.filter(p => p.src.split('.').pop() !== "json").map(p => p.src);
const arr = [0, 1, '', undefined, false, 2, undefined, null, , 3, NaN];
const filtered = arr.filter(Boolean);
console.log(filtered);
/*
Output: [ 1, 2, 3 ]
*/
I use .forEach to iterate over the array and push each result to a results array, then use that; with this solution I do not loop over the array twice.
You can use the filter() method after your map() call. For example, in your case:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else {
return img.src;
}
});
Then the filter() step:
const sourceFiltered = sources.filter(item => item)
Then, only the existing items are in the new array sourceFiltered.
Here is an updated version of the code provided by @theprtk. It is cleaned up a little to show the generalized version along with an example.
Note: I'd add this as a comment to his post but I don't have enough reputation yet
/**
* @see http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
* #description functions that transform reducing functions
*/
const transduce = {
/** a generic map() that can take a reducing() & return another reducing() */
map: changeInput => reducing => (acc, input) =>
reducing(acc, changeInput(input)),
/** a generic filter() that can take a reducing() & return another reducing() */
filter: predicate => reducing => (acc, input) =>
predicate(input) ? reducing(acc, input) : acc,
/**
* a composing() that can take an infinite # transducers to operate on
* reducing functions to compose a computed accumulator without ever creating
* that intermediate array
*/
compose: (...args) => x => {
const fns = args;
var i = fns.length;
while (i--) x = fns[i].call(this, x);
return x;
},
};
const example = {
data: [{ src: 'file.html' }, { src: 'file.txt' }, { src: 'file.json' }],
/** note: `[1,2,3].reduce(concat, [])` -> `[1,2,3]` */
concat: (acc, input) => acc.concat([input]),
getSrc: x => x.src,
filterJson: x => x.src.split('.').pop() !== 'json',
};
/** step 1: create a reducing() that can be passed into `reduce` */
const reduceFn = example.concat;
/** step 2: transforming your reducing function by mapping */
const mapFn = transduce.map(example.getSrc);
/** step 3: create your filter() that operates on an input */
const filterFn = transduce.filter(example.filterJson);
/** step 4: aggregate your transformations */
const composeFn = transduce.compose(
filterFn,
mapFn,
transduce.map(x => x.toUpperCase() + '!'), // new mapping()
);
/**
* Expected example output
* Note: each is wrapped in `example.data.reduce(x, [])`
* 1: ['file.html', 'file.txt', 'file.json']
* 2: ['file.html', 'file.txt']
* 3: ['FILE.HTML!', 'FILE.TXT!']
*/
const exampleFns = {
transducers: [
mapFn(reduceFn),
filterFn(mapFn(reduceFn)),
composeFn(reduceFn),
],
raw: [
(acc, x) => acc.concat([x.src]),
(acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src] : []),
(acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src.toUpperCase() + '!'] : []),
],
};
const execExample = (currentValue, index) =>
console.log('Example ' + index, example.data.reduce(currentValue, []));
exampleFns.raw.forEach(execExample);
exampleFns.transducers.forEach(execExample);
You can do this
var sources = [];
images.map(function (img) {
if(img.src.split('.').pop() !== "json"){ // if extension is not .json
sources.push(img.src); // just push valid value
}
});
I use forEach():
var sources = [];
images.forEach(function (img) {
if(img.src.split('.').pop() !== "json"){ // if extension is not .json
sources.push(img);
}
});
NOTE: I negated your logic.
You can use map + filter like this:
var sources = images.map(function (img) {
  if (img.src.split('.').pop() === "json") { // if extension is .json
    return null; // skip
  } else {
    return img.src;
  }
}).filter(x => x !== null);
