JavaScript Array.some() and Array.every() equivalent for Set?

In JavaScript, is there an equivalent of Array.some() and Array.every() for the Set built-in object?

No, the only built-in methods on Set.prototype are:
Set.prototype.add()
Set.prototype.clear()
Set.prototype.delete()
Set.prototype.entries()
Set.prototype.forEach()
Set.prototype.has()
Set.prototype.keys()
Set.prototype.values()
Set.prototype[@@iterator]()
It'd probably be easiest to just convert the set to an array, and then use the array methods.
const set1 = new Set([1, 2]);
const set2 = new Set([-1, 2]);
const allPositive = set => [...set].every(num => num > 0);
console.log(
allPositive(set1),
allPositive(set2)
);

It's not natively available on the Set prototype, but if you find yourself needing this frequently, you can easily extend Set to add it.
class ExtendedSet extends Set {
  every(f) {
    return [...this].every(f);
  }
  some(f) {
    return [...this].some(f);
  }
}
let aSet = new ExtendedSet([1, 2, 3, 4]);
console.log(aSet.every(n => n < 2)); // false
console.log(aSet.some(n => n < 2)); // true
// still works as a Set
console.log(aSet.has(4)); // true

The list of available methods for Set is documented at https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set#Methods
Methods:
Set.prototype.add()
Set.prototype.clear()
Set.prototype.delete()
Set.prototype.entries()
Set.prototype.forEach()
Set.prototype.has()
Set.prototype.values()
Set.prototype[@@iterator]()
In your context you could do something like below:
Array.from(set).some(callback) or Array.from(set).every(callback)
For more info regarding Array vs Set

Other answers suggest first converting the set to an array and then using some array method. This is completely unnecessary and even sub-optimal.
You can use a for...of loop on the set instance itself: iterate over it, find an element that matches the given condition, then break out of the loop:
function some(set, predicate) {
for (const item of set)
if (predicate(item))
return true;
return false;
}
function every(set, predicate) {
for (const item of set)
if (!predicate(item))
return false;
return true;
}
const set = new Set([ 42, 17, -1, 8.3 ]);
every(set, (item) => typeof item === "number"); // true
every(set, (item) => item > 0); // false
some(set, (item) => item < 0); // true
some(set, (item) => item === 0); // false

Related

JavaScript - get range of Map entries

Since a JavaScript Map remembers the original insertion order, I was wondering if there is a (clean) way to get a range of entries? Specifically, I want to get all entries after a certain entry. I thought I could just use the map or forEach function, but you cannot use map with Maps, and forEach does not pass the index (Map docs)
You don't really have any option but to loop through the entries, either explicitly or using loops in function calls.
Specifically, I want to get all entries after a certain entry.
I'm going to assume you mean a certain value (rather than key), but the below is easily adjusted for keys instead.
Unless theMap has hundreds of thousands or even millions of entries, you can use fairly concise code by using spread to get an array of the entries, indexOf to find the target value, and slice to get only the ones after it:
const values = [...theMap.values()];
const index = values.indexOf(theDesiredStartingValue);
const valuesAfter = index === -1 ? [] : values.slice(index + 1);
If you're concerned that that's two or three passes through the data (though again, that's really unlikely to matter), you could use a single loop:
const valuesAfter = [];
let seen = false;
for (const value of theMap.values()) {
if (seen) {
valuesAfter.push(value);
} else if (value === theDesiredStartingValue) {
seen = true;
}
}
If you meant by key, I'm thinking you probably want a Map as a result:
const entries = [...theMap.entries()];
const index = entries.findIndex(([key]) => key === theDesiredStartingKey);
const entriesAfter = new Map(index === -1 ? [] : entries.slice(index + 1));
If you're concerned that that's two or three passes through the data (though again, that's really unlikely to matter), you could use a single loop:
const entriesAfter = new Map();
let seen = false;
for (const [key, value] of theMap.entries()) {
if (seen) {
entriesAfter.set(key, value);
} else if (key === theDesiredStartingKey) {
seen = true;
}
}
I would be tempted to make this some sort of generator function, as below:
function* valuesAfter(map, value) {
  let found = false;
  for (const v of map.values()) {
    if (found)
      yield v;
    else if (v === value)
      found = true;
  }
}
var input = new Map([['foo', 1], ['bar', 2], ['baz', 3], ['bing', 4]]);
for (const item of valuesAfter(input, 2))
  console.log(item);

Find whether a value exists in an array of arrays in JavaScript

I have an array of arrays; how do I find whether a value exists inside it in JavaScript? Here is a code example:
let multival = [['Individual'], ['Non-Individual'], null]
Now I have to find whether the string 'Non-Individual' is present in this array of arrays or not. Does anyone have a solution for this?
You could use Array.prototype.some() to check whether the string is present in the array of arrays.
const multival = [['Individual'], ['Non-Individual'], null];
const searchItem = 'Non-Individual';
const ret = multival.some(
(x) => Array.isArray(x) && x.some((y) => y === searchItem)
);
console.log(ret);
Out of interest, a recursive solution for arrays with any level of nesting (and, I believe, quicker than a solution using flat() due to short-circuiting):
function nestedIncludes(arr, val) {
let check = i => Array.isArray(i) ? i.some(check) : i === val;
return arr.some(check);
}
let myArr = [[[['foo'],1],2],3,5];
let test = nestedIncludes(myArr, 'foo');
console.log(test); // true
Try the flat and includes functions:
const multival = [['Individual'], ['Non-Individual'],null]
const exists = multival.flat().includes('Non-Individual');
console.log(exists ? 'exists' : 'non-exists')
Note: I didn't notice it at first, but User863's comment has the same answer.
Method 1: using findIndex, optional chaining, and includes
Method 2: using some, optional chaining, and includes
// Method 1, Using `findIndex` and `includes`
const find = (arr, item) => arr.findIndex((arr2) => arr2?.includes(item)) > -1;
// Method 2, Using `some` and `includes`
const find2 = (arr, item) => arr.some((arr2) => arr2?.includes(item));
let multival = [["Individual"], ["Non-Individual"], null];
console.log(find(multival, "Individual"));
console.log(find2(multival, "Individual"));
console.log(find(multival, ""));
console.log(find2(multival, ""));

How to remove an item with map function index? [duplicate]

How can I skip an array element in .map?
My code:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else{
return img.src;
}
});
This will return:
["img.png", null, "img.png"]
Just .filter() it first:
var sources = images.filter(function(img) {
if (img.src.split('.').pop() === "json") {
return false; // skip
}
return true;
}).map(function(img) { return img.src; });
If you don't want to do that, which is not unreasonable since it has some cost, you can use the more general .reduce(). You can generally express .map() in terms of .reduce:
someArray.map(function(element) {
return transform(element);
});
can be written as
someArray.reduce(function(result, element) {
result.push(transform(element));
return result;
}, []);
So if you need to skip elements, you can do that easily with .reduce():
var sources = images.reduce(function(result, img) {
if (img.src.split('.').pop() !== "json") {
result.push(img.src);
}
return result;
}, []);
In that version, the code in the .filter() from the first sample is part of the .reduce() callback. The image source is only pushed onto the result array in the case where the filter operation would have kept it.
update — This question gets a lot of attention, and I'd like to add the following clarifying remark. The purpose of .map(), as a concept, is to do exactly what "map" means: transform a list of values into another list of values according to certain rules. Just as a paper map of some country would seem weird if a couple of cities were completely missing, a mapping from one list to another only really makes sense when there's a 1 to 1 set of result values.
I'm not saying that it doesn't make sense to create a new list from an old list with some values excluded. I'm just trying to make clear that .map() has a single simple intention, which is to create a new array of the same length as an old array, only with values formed by a transformation of the old values.
Since 2019, Array.prototype.flatMap is a good option.
images.flatMap(({src}) => src.endsWith('.json') ? [] : src);
From MDN:
flatMap can be used as a way to add and remove items (modify the
number of items) during a map. In other words, it allows you to map
many items to many items (by handling each input item separately),
rather than always one-to-one. In this sense, it works like the
opposite of filter. Simply return a 1-element array to keep the item,
a multiple-element array to add items, or a 0-element array to remove
the item.
I think the simplest way to skip some elements of an array is by using the filter() method.
Using this method (ES5) with ES6 syntax, you can write your code in one line, and it will return what you want:
let images = [{src: 'img.png'}, {src: 'j1.json'}, {src: 'img.png'}, {src: 'j2.json'}];
let sources = images.filter(img => img.src.slice(-4) != 'json').map(img => img.src);
console.log(sources);
TLDR: You can first filter your array and then perform your map but this would require two passes on the array (filter returns an array to map). Since this array is small, it is a very small performance cost. You can also do a simple reduce. However if you want to re-imagine how this can be done with a single pass over the array (or any datatype), you can use an idea called "transducers" made popular by Rich Hickey.
Answer:
We should avoid ever-longer dot chains over the array, [].map(fn1).filter(fn2)..., since that approach creates an intermediate array in memory at every step.
The best approach operates on the actual reducing function, so there is only one pass over the data and no extra arrays.
The reducing function is the function passed into reduce; it takes an accumulator and an input from the source and returns something that looks like the accumulator.
// 1. create a concat reducing function that can be passed into `reduce`
const concat = (acc, input) => acc.concat([input])
// note that [1,2,3].reduce(concat, []) would return [1,2,3]
// transforming your reducing function by mapping
// 2. create a generic mapping function that can take a reducing function and return another reducing function
const mapping = (changeInput) => (reducing) => (acc, input) => reducing(acc, changeInput(input))
// 3. create your map function that operates on an input
const getSrc = (x) => x.src
const mappingSrc = mapping(getSrc)
// 4. now we can use our `mapSrc` function to transform our original function `concat` to get another reducing function
const inputSources = [{src:'one.html'}, {src:'two.txt'}, {src:'three.json'}]
inputSources.reduce(mappingSrc(concat), [])
// -> ['one.html', 'two.txt', 'three.json']
// remember this is really essentially just
// inputSources.reduce((acc, x) => acc.concat([x.src]), [])
// transforming your reducing function by filtering
// 5. create a generic filtering function that can take a reducing function and return another reducing function
const filtering = (predicate) => (reducing) => (acc, input) => (predicate(input) ? reducing(acc, input): acc)
// 6. create your filter function that operate on an input
const filterJsonAndLoad = (img) => {
console.log(img)
if(img.src.split('.').pop() === 'json') {
// game.loadSprite(...);
return false;
} else {
return true;
}
}
const filteringJson = filtering(filterJsonAndLoad)
// 7. notice the type of input and output of these functions
// concat is a reducing function,
// mapSrc transforms and returns a reducing function
// filterJsonAndLoad transforms and returns a reducing function
// these functions that transform reducing functions are "transducers", termed by Rich Hickey
// source: http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
// we can pass this all into reduce! and without any intermediate arrays
const sources = inputSources.reduce(filteringJson(mappingSrc(concat)), []);
// [ 'one.html', 'two.txt' ]
// ==================================
// 8. BONUS: compose all the functions
// You can decide to create a composing function which takes an infinite number of transducers to
// operate on your reducing function to compose a computed accumulator without ever creating that
// intermediate array
const composeAll = (...args) => (x) => {
const fns = args
var i = fns.length
while (i--) {
x = fns[i].call(this, x);
}
return x
}
const doABunchOfStuff = composeAll(
filtering((x) => x.src.split('.').pop() !== 'json'),
mapping((x) => x.src),
mapping((x) => x.toUpperCase()),
mapping((x) => x + '!!!')
)
const sources2 = inputSources.reduce(doABunchOfStuff(concat), [])
// ['ONE.HTML!!!', 'TWO.TXT!!!']
Resources: rich hickey transducers post
Here's a fun solution:
/**
 * Filter-map. Like map, but skips undefined values.
 *
 * @param callback
 */
function fmap(callback) {
return this.reduce((accum, ...args) => {
const x = callback(...args);
if(x !== undefined) {
accum.push(x);
}
return accum;
}, []);
}
Use it with the bind operator (a non-standard proposal; it requires a compiler such as Babel):
[1,2,-1,3]::fmap(x => x > 0 ? x * 2 : undefined); // [2,4,6]
Why not just use a forEach loop?
let arr = ['a', 'b', 'c', 'd', 'e'];
let filtered = [];
arr.forEach(x => {
if (!x.includes('b')) filtered.push(x);
});
console.log(filtered) // filtered === ['a','c','d','e'];
Or even simpler use filter:
const arr = ['a', 'b', 'c', 'd', 'e'];
const filtered = arr.filter(x => !x.includes('b')); // ['a','c','d','e'];
Answer sans superfluous edge cases:
const thingsWithoutNulls = things.reduce((acc, thing) => {
if (thing !== null) {
acc.push(thing);
}
return acc;
}, [])
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else{
return img.src;
}
}).filter(Boolean);
The .filter(Boolean) will filter out any falsey values in a given array, which in your case is the null.
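One caveat worth noting: Boolean drops every falsey value, not just null, so this only works when no legitimate value is falsey. A sketch with made-up data:

```javascript
const values = ["img.png", null, 0, "", "b.png"];

// 0 and "" are removed along with null.
console.log(values.filter(Boolean)); // ["img.png", "b.png"]
```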
To expand on Felix Kling's comment, you can use .filter() like this:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json") { // if extension is .json
return null; // skip
} else {
return img.src;
}
}).filter(Boolean);
That will remove falsey values from the array that is returned by .map()
You could simplify it further like this:
var sources = images.map(function (img) {
if(img.src.split('.').pop() !== "json") { // if extension is not .json
return img.src;
}
}).filter(Boolean);
Or even as a one-liner using an arrow function, object destructuring and the && operator:
var sources = images.map(({ src }) => src.split('.').pop() !== "json" && src).filter(Boolean);
Here's a utility method (ES5 compatible) which only maps non null values (hides the call to reduce):
function mapNonNull(arr, cb) {
return arr.reduce(function (accumulator, value, index, arr) {
var result = cb.call(null, value, index, arr);
if (result != null) {
accumulator.push(result);
}
return accumulator;
}, []);
}
var result = mapNonNull(["a", "b", "c"], function (value) {
return value === "b" ? null : value; // exclude "b"
});
console.log(result); // ["a", "c"]
To drop entries whose src is null or undefined, in one line (ES5/ES6):
// returns an array of src values
images.filter(p => p.src).map(p => p.src); // p = each element
// with your condition:
images.filter(p => p.src.split('.').pop() !== "json").map(p => p.src);
const arr = [0, 1, '', undefined, false, 2, undefined, null, , 3, NaN];
const filtered = arr.filter(Boolean);
console.log(filtered);
/*
Output: [ 1, 2, 3 ]
*/
I use .forEach to iterate over the array and push each result to a results array, then use that; with this solution I do not loop over the array twice.
You can use the filter() method after your map(). For example, in your case:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else {
return img.src;
}
});
The method filter:
const sourceFiltered = sources.filter(item => item)
Then, only the existing items are in the new array sourceFiltered.
Here is an updated version of the code provided by @theprtk, cleaned up a little to show the generalized version alongside an example.
Note: I'd add this as a comment to his post, but I don't have enough reputation yet.
/**
 * @see http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
 * @description functions that transform reducing functions
 */
const transduce = {
/** a generic map() that can take a reducing() & return another reducing() */
map: changeInput => reducing => (acc, input) =>
reducing(acc, changeInput(input)),
/** a generic filter() that can take a reducing() & return */
filter: predicate => reducing => (acc, input) =>
predicate(input) ? reducing(acc, input) : acc,
/**
 * a composing() that can take any number of transducers to operate on
 * reducing functions to compose a computed accumulator without ever creating
 * that intermediate array
 */
compose: (...args) => x => {
const fns = args;
var i = fns.length;
while (i--) x = fns[i].call(this, x);
return x;
},
};
const example = {
data: [{ src: 'file.html' }, { src: 'file.txt' }, { src: 'file.json' }],
/** note: `[1,2,3].reduce(concat, [])` -> `[1,2,3]` */
concat: (acc, input) => acc.concat([input]),
getSrc: x => x.src,
filterJson: x => x.src.split('.').pop() !== 'json',
};
/** step 1: create a reducing() that can be passed into `reduce` */
const reduceFn = example.concat;
/** step 2: transforming your reducing function by mapping */
const mapFn = transduce.map(example.getSrc);
/** step 3: create your filter() that operates on an input */
const filterFn = transduce.filter(example.filterJson);
/** step 4: aggregate your transformations */
const composeFn = transduce.compose(
filterFn,
mapFn,
transduce.map(x => x.toUpperCase() + '!'), // new mapping()
);
/**
* Expected example output
* Note: each is wrapped in `example.data.reduce(x, [])`
* 1: ['file.html', 'file.txt', 'file.json']
* 2: ['file.html', 'file.txt']
* 3: ['FILE.HTML!', 'FILE.TXT!']
*/
const exampleFns = {
transducers: [
mapFn(reduceFn),
filterFn(mapFn(reduceFn)),
composeFn(reduceFn),
],
raw: [
(acc, x) => acc.concat([x.src]),
(acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src] : []),
(acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src.toUpperCase() + '!'] : []),
],
};
const execExample = (currentValue, index) =>
console.log('Example ' + index, example.data.reduce(currentValue, []));
exampleFns.raw.forEach(execExample);
exampleFns.transducers.forEach(execExample);
You can do this
var sources = [];
images.map(function (img) {
if(img.src.split('.').pop() !== "json"){ // if extension is not .json
sources.push(img.src); // just push valid value
}
});
I use forEach():
var sources = [];
images.forEach(function (img) {
  if (img.src.split('.').pop() !== "json") { // if extension is not .json
    sources.push(img.src);
  }
});
NOTE: I negated your logic.
you can use map + filter like this:
var sources = images.map(function (img) {
  if (img.src.split('.').pop() === "json") { // if extension is .json
    return null; // skip
  } else {
    return img.src;
  }
}).filter(x => x !== null);

Changing one item in a list In immutable.js

I'm using immutable.js, my data structure is like:
class ItemList extends Record({
items: new List()
})
I want to write a function that changes one item in this list and keeps the others the same. For example, given a list of [1, 2, 3, 4], I need a function that, if an item equals 2, changes it to 5.
I'm using something like
updateInfo(updatedInfo) {
return this.withMutations(itemList => {
itemList.set('items', list);
});
}
My question is: within this function, how can I update just one item? Where should I put the if check?
Thanks!
NB: As mentioned by another answer, there is also the indexOf method, which may be easier to use in some cases, taking only the value to find as a parameter.
Using findIndex to find the index of the value you need to change and set with the index to change:
list = Immutable.List.of(1, 2, 3, 4);
list = list.set(list.findIndex(function(item) {
return item === 2;
}), 5);
ES6:
list = list.set(list.findIndex((item) => item === 2), 5);
If you need the old value to change it, you can use update instead of set like:
list = list.update(list.findIndex(function(item) {
return item === 2;
}), function(oldValue) {
return 5;
});
ES6:
list = list.update(list.findIndex((item) => item === 2), (oldValue) => 5);
It is easy.
list = Immutable.List.of(1, 2, 3, 4);
list = list.set(list.indexOf(2), 5);
console.log(list.get(1)); //5
A version based on forEach. Note that an Immutable.List never mutates in place: set() returns a new list, so you must capture the result. Immutable's forEach does let you exit early by returning false:
var list = Immutable.List.of(1, 2, 3, 4);
var updated = list;
list.forEach((value, index) => {
  // You can check either the value or the index.
  if (value === 2) {
    updated = updated.set(index, 5); // set() returns a new list
    return false; // found and changed the value of interest; stop iterating
  }
});
// updated is List [1, 5, 3, 4]; list itself is unchanged.
And yes, there's a version for Map too.

