How to skip over an element in .map()? - javascript

How can I skip an array element in .map?
My code:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else{
return img.src;
}
});
This will return:
["img.png", null, "img.png"]

Just .filter() it first:
var sources = images.filter(function(img) {
if (img.src.split('.').pop() === "json") {
return false; // skip
}
return true;
}).map(function(img) { return img.src; });
If you don't want to do that, which is not unreasonable since it has some cost, you can use the more general .reduce(). You can generally express .map() in terms of .reduce():
someArray.map(function(element) {
return transform(element);
});
can be written as
someArray.reduce(function(result, element) {
result.push(transform(element));
return result;
}, []);
So if you need to skip elements, you can do that easily with .reduce():
var sources = images.reduce(function(result, img) {
if (img.src.split('.').pop() !== "json") {
result.push(img.src);
}
return result;
}, []);
In that version, the code in the .filter() from the first sample is part of the .reduce() callback. The image source is only pushed onto the result array in the case where the filter operation would have kept it.
update — This question gets a lot of attention, and I'd like to add the following clarifying remark. The purpose of .map(), as a concept, is to do exactly what "map" means: transform a list of values into another list of values according to certain rules. Just as a paper map of some country would seem weird if a couple of cities were completely missing, a mapping from one list to another only really makes sense when there's a 1 to 1 set of result values.
I'm not saying that it doesn't make sense to create a new list from an old list with some values excluded. I'm just trying to make clear that .map() has a single simple intention, which is to create a new array of the same length as an old array, only with values formed by a transformation of the old values.

Since 2019, Array.prototype.flatMap is a good option.
images.flatMap(({src}) => src.endsWith('.json') ? [] : src);
From MDN:
flatMap can be used as a way to add and remove items (modify the
number of items) during a map. In other words, it allows you to map
many items to many items (by handling each input item separately),
rather than always one-to-one. In this sense, it works like the
opposite of filter. Simply return a 1-element array to keep the item,
a multiple-element array to add items, or a 0-element array to remove
the item.
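A runnable sketch of the flatMap approach, using a hypothetical images array standing in for the question's data:

```javascript
// Hypothetical sample data; the real `images` array comes from the question's context.
const images = [{ src: 'img1.png' }, { src: 'data.json' }, { src: 'img2.png' }];

// Return [] to drop an element, or the mapped value to keep it.
const sources = images.flatMap(({ src }) => src.endsWith('.json') ? [] : src);

console.log(sources); // [ 'img1.png', 'img2.png' ]
```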

I think the simplest way to skip some elements of an array is by using the filter() method.
By using this method (ES5) and the ES6 syntax you can write your code in one line, and this will return what you want:
let images = [{src: 'img.png'}, {src: 'j1.json'}, {src: 'img.png'}, {src: 'j2.json'}];
let sources = images.filter(img => img.src.slice(-4) != 'json').map(img => img.src);
console.log(sources);

TL;DR: You can filter your array first and then map, but that requires two passes over the array (filter returns an array that map then traverses). Since this array is small, the performance cost is very small. You can also do a simple reduce. However, if you want to re-imagine how this can be done with a single pass over the array (or any data type), you can use an idea called "transducers", made popular by Rich Hickey.
Answer:
We should not need ever-longer dot chaining over the array, [].map(fn1).filter(f2)..., since that approach creates an intermediate array in memory on every step.
The best approach operates on the actual reducing function, so there is only one pass over the data and no extra arrays.
The reducing function is the function passed into reduce; it takes an accumulator and an input from the source and returns something that looks like the accumulator.
// 1. create a concat reducing function that can be passed into `reduce`
const concat = (acc, input) => acc.concat([input])
// note that [1,2,3].reduce(concat, []) would return [1,2,3]
// transforming your reducing function by mapping
// 2. create a generic mapping function that can take a reducing function and return another reducing function
const mapping = (changeInput) => (reducing) => (acc, input) => reducing(acc, changeInput(input))
// 3. create your map function that operates on an input
const getSrc = (x) => x.src
const mappingSrc = mapping(getSrc)
// 4. now we can use our `mappingSrc` function to transform our original function `concat` to get another reducing function
const inputSources = [{src:'one.html'}, {src:'two.txt'}, {src:'three.json'}]
inputSources.reduce(mappingSrc(concat), [])
// -> ['one.html', 'two.txt', 'three.json']
// remember this is essentially just
// inputSources.reduce((acc, x) => acc.concat([x.src]), [])
// transforming your reducing function by filtering
// 5. create a generic filtering function that can take a reducing function and return another reducing function
const filtering = (predicate) => (reducing) => (acc, input) => (predicate(input) ? reducing(acc, input): acc)
// 6. create your filter function that operates on an input
const filterJsonAndLoad = (img) => {
console.log(img)
if(img.src.split('.').pop() === 'json') {
// game.loadSprite(...);
return false;
} else {
return true;
}
}
const filteringJson = filtering(filterJsonAndLoad)
// 7. notice the types of input and output of these functions:
// concat is a reducing function,
// mappingSrc transforms and returns a reducing function,
// filteringJson transforms and returns a reducing function
// these functions that transform reducing functions are "transducers", a term coined by Rich Hickey
// source: http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
// we can pass this all into reduce, without any intermediate arrays
const sources = inputSources.reduce(filteringJson(mappingSrc(concat)), []);
// [ 'one.html', 'two.txt' ]
// ==================================
// 8. BONUS: compose all the functions
// You can create a composing function which takes any number of transducers to
// operate on your reducing function, composing a computed accumulator without ever creating an
// intermediate array
const composeAll = (...args) => (x) => {
const fns = args
var i = fns.length
while (i--) {
x = fns[i].call(this, x);
}
return x
}
const doABunchOfStuff = composeAll(
filtering((x) => x.src.split('.').pop() !== 'json'),
mapping((x) => x.src),
mapping((x) => x.toUpperCase()),
mapping((x) => x + '!!!')
)
const sources2 = inputSources.reduce(doABunchOfStuff(concat), [])
// ['ONE.HTML!!!', 'TWO.TXT!!!']
Resources: Rich Hickey's transducers post

Here's a fun solution:
/**
* Filter-map. Like map, but skips undefined values.
*
* @param callback
*/
function fmap(callback) {
return this.reduce((accum, ...args) => {
const x = callback(...args);
if(x !== undefined) {
accum.push(x);
}
return accum;
}, []);
}
Use with the bind operator (a stage-1 proposal, so it requires transpilation):
[1,2,-1,3]::fmap(x => x > 0 ? x * 2 : undefined); // [2,4,6]
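Since the bind operator is only a proposal and won't run in standard engines, the same helper can be invoked with Function.prototype.call instead. The fmap definition is repeated so the snippet is self-contained:

```javascript
// Same fmap helper as above; `this` is the array it is called on.
function fmap(callback) {
  return this.reduce((accum, ...args) => {
    const x = callback(...args);
    if (x !== undefined) {
      accum.push(x);
    }
    return accum;
  }, []);
}

// Standard-JS invocation, no bind operator required.
const doubledPositives = fmap.call([1, 2, -1, 3], x => (x > 0 ? x * 2 : undefined));
console.log(doubledPositives); // [2, 4, 6]
```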

Why not just use a forEach loop?
let arr = ['a', 'b', 'c', 'd', 'e'];
let filtered = [];
arr.forEach(x => {
if (!x.includes('b')) filtered.push(x);
});
console.log(filtered) // filtered === ['a','c','d','e'];
Or even simpler use filter:
const arr = ['a', 'b', 'c', 'd', 'e'];
const filtered = arr.filter(x => !x.includes('b')); // ['a','c','d','e'];

Answer sans superfluous edge cases:
const thingsWithoutNulls = things.reduce((acc, thing) => {
if (thing !== null) {
acc.push(thing);
}
return acc;
}, [])

var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else{
return img.src;
}
}).filter(Boolean);
The .filter(Boolean) will filter out any falsy values in the given array, which in your case is the null.
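One caveat worth noting: .filter(Boolean) drops every falsy value, not just null, so it would also discard entries such as an empty string. A quick illustration with made-up values:

```javascript
// filter(Boolean) removes null, but also '', 0, NaN, undefined and false.
const mapped = ['img.png', null, '', 'img2.png'];
const kept = mapped.filter(Boolean);
console.log(kept); // [ 'img.png', 'img2.png' ]
```

If empty strings are legitimate values in your data, filter on `x => x !== null` instead.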

To extrapolate on Felix Kling's comment, you can use .filter() like this:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json") { // if extension is .json
return null; // skip
} else {
return img.src;
}
}).filter(Boolean);
That will remove falsy values from the array that is returned by .map().
You could simplify it further like this:
var sources = images.map(function (img) {
if(img.src.split('.').pop() !== "json") { // if extension is not .json
return img.src;
}
}).filter(Boolean);
Or even as a one-liner using an arrow function, object destructuring and the && operator:
var sources = images.map(({ src }) => src.split('.').pop() !== "json" && src).filter(Boolean);

Here's a utility method (ES5 compatible) which only maps non null values (hides the call to reduce):
function mapNonNull(arr, cb) {
return arr.reduce(function (accumulator, value, index, arr) {
var result = cb.call(null, value, index, arr);
if (result != null) {
accumulator.push(result);
}
return accumulator;
}, []);
}
var result = mapNonNull(["a", "b", "c"], function (value) {
return value === "b" ? null : value; // exclude "b"
});
console.log(result); // ["a", "c"]

To skip elements whose src is null or undefined, in one line (ES5/ES6):
//will return array of src values
images.filter(p => p.src).map(p => p.src); // p = each element
//in your condition
images.filter(p => p.src.split('.').pop() !== "json").map(p => p.src);

const arr = [0, 1, '', undefined, false, 2, undefined, null, , 3, NaN];
const filtered = arr.filter(Boolean);
console.log(filtered);
/*
Output: [ 1, 2, 3 ]
*/

I use .forEach to iterate, push matching results to a results array, and then use that; with this solution I do not loop over the array twice.
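A sketch of that single-pass .forEach approach, applied to sample data standing in for the question's images array:

```javascript
// Hypothetical sample data standing in for the question's `images` array.
const images = [{ src: 'img.png' }, { src: 'config.json' }, { src: 'photo.png' }];

const sources = [];
images.forEach(function (img) {
  // Keep everything that is not a .json file; one pass, no intermediate array.
  if (img.src.split('.').pop() !== 'json') {
    sources.push(img.src);
  }
});

console.log(sources); // [ 'img.png', 'photo.png' ]
```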

You can apply the filter() method after your map() call. For example, in your case:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else {
return img.src;
}
});
The method filter:
const sourceFiltered = sources.filter(item => item)
Then only the truthy items remain in the new array sourceFiltered.

Here is an updated version of the code provided by @theprtk. It is cleaned up a little to show the generalized version alongside an example.
Note: I'd add this as a comment to his post, but I don't have enough reputation yet.
/**
* @see http://clojure.com/blog/2012/05/15/anatomy-of-reducer.html
* @description functions that transform reducing functions
*/
const transduce = {
/** a generic map() that can take a reducing() & return another reducing() */
map: changeInput => reducing => (acc, input) =>
reducing(acc, changeInput(input)),
/** a generic filter() that can take a reducing() & return */
filter: predicate => reducing => (acc, input) =>
predicate(input) ? reducing(acc, input) : acc,
/**
* a composing() that can take any number of transducers to operate on
* reducing functions to compose a computed accumulator without ever creating
* that intermediate array
*/
compose: (...args) => x => {
const fns = args;
var i = fns.length;
while (i--) x = fns[i].call(this, x);
return x;
},
};
const example = {
data: [{ src: 'file.html' }, { src: 'file.txt' }, { src: 'file.json' }],
/** note: `[1,2,3].reduce(concat, [])` -> `[1,2,3]` */
concat: (acc, input) => acc.concat([input]),
getSrc: x => x.src,
filterJson: x => x.src.split('.').pop() !== 'json',
};
/** step 1: create a reducing() that can be passed into `reduce` */
const reduceFn = example.concat;
/** step 2: transforming your reducing function by mapping */
const mapFn = transduce.map(example.getSrc);
/** step 3: create your filter() that operates on an input */
const filterFn = transduce.filter(example.filterJson);
/** step 4: aggregate your transformations */
const composeFn = transduce.compose(
filterFn,
mapFn,
transduce.map(x => x.toUpperCase() + '!'), // new mapping()
);
/**
* Expected example output
* Note: each is wrapped in `example.data.reduce(x, [])`
* 1: ['file.html', 'file.txt', 'file.json']
* 2: ['file.html', 'file.txt']
* 3: ['FILE.HTML!', 'FILE.TXT!']
*/
const exampleFns = {
transducers: [
mapFn(reduceFn),
filterFn(mapFn(reduceFn)),
composeFn(reduceFn),
],
raw: [
(acc, x) => acc.concat([x.src]),
(acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src] : []),
(acc, x) => acc.concat(x.src.split('.').pop() !== 'json' ? [x.src.toUpperCase() + '!'] : []),
],
};
const execExample = (currentValue, index) =>
console.log('Example ' + index, example.data.reduce(currentValue, []));
exampleFns.raw.forEach(execExample);
exampleFns.transducers.forEach(execExample);

You can do this, although .forEach() is more idiomatic here, since the return value of .map() is discarded:
var sources = [];
images.map(function (img) {
if(img.src.split('.').pop() !== "json"){ // if extension is not .json
sources.push(img.src); // just push valid value
}
});

I use forEach():
var sources = [];
images.forEach(function (img) {
if(img.src.split('.').pop() !== "json"){ // if extension is not .json
sources.push(img.src);
}
});
NOTE: I negated your logic.

you can use map + filter like this:
var sources = images.map(function (img) {
if(img.src.split('.').pop() === "json"){ // if extension is .json
return null; // skip
}
else{
return img.src;
}}).filter(x => x !== null);

Related

In javascript, filter and map an array in one function?

In JavaScript, Array.filter() takes an array and filters it down based on a certain criterion.
const a = [1,2,3,4,5].filter(el => el > 3);
console.log(a);
Result: [4,5]
Array.map() takes an array and returns a new array of equal length, with each element transformed; the original array is not mutated.
const a = [1,2,3,4,5].map(el => el + 10);
console.log(a);
Result: [11,12,13,14,15]
My question is, besides combining the two functions like this:
let a = [1,2,3,4,5].filter(el => el > 3).map(el => el + 10);
console.log(a);
is there an efficient way to filter and map an array that doesn't involve multiple lines of code like most Array.forEach, for, and for..in routines? I know that Array.filter().map() is pretty efficient; I'm just wondering if it can be further streamlined.
Use Array.prototype.reduce to take care of filtering and mapping all at once.
let a = [1,2,3,4,5].reduce((arr, el) => el > 3 ? arr.concat(el + 10) : arr, []);
console.log(a);
You could also make your own mapIf polyfill function.
// Reduce Only
if (Array.prototype.mapIf === undefined) {
Array.prototype.mapIf = function(predicateFn, applyFn) {
return this.reduce((ref, el) => predicateFn(el) ? ref.concat(applyFn(el)) : ref, []);
};
}
// Filter + Map
if (Array.prototype.mapIf === undefined) {
Array.prototype.mapIf = function(predicateFn, applyFn) {
return this.filter(predicateFn).map(applyFn);
};
}
let a = [1,2,3,4,5].mapIf(el => el > 3, el => el + 10);
console.log(a);


JSON.stringify(localStorage) - filtering by key

I use a small code snippet to save the localStorage of my application as a string:
var saveStr = JSON.stringify(localStorage);
It works great at first glance, but it dumps the entire localStorage object, which I don't want. I'd like to stringify localStorage, but only the keys that contain a certain string.
For instance:
var saveStr = JSON.stringify(filteredLS("example"));
filteredLS should return the localStorage data, but only the keys that contain the string passed as an argument.
Does anyone know an easy snippet to achieve this?
Thanks!
Try this:
function filteredLS(term) {
var filteredObj = {};
Object.keys(localStorage)
.filter(function (key) {
return key.indexOf(term) >= 0;
})
.forEach(function (key) { // forEach, since we only want the side effect
filteredObj[key] = localStorage.getItem(key);
});
return JSON.stringify(filteredObj);
}
You should use the methods localStorage.getItem and localStorage.setItem. With those, you can write your own get & set functions to easily use JSON objects:
function get(item) {
return JSON.parse(localStorage.getItem(item))
}
function set(item, value) {
return localStorage.setItem(item, JSON.stringify(value))
}
// use like this:
set('foo', { bar: 1 })
var result = get('foo')
// result: { bar: 1 }
Depending on your target browser you may want to transpile this, but for brevity I'm going with a (mostly) ES6 style; this should run in modern browsers.
Filtering an object by keys:
const filterByKeys = obj => keys => Object.entries(obj)
// keep only the keys we care about
.filter( ([key, val]) => keys.includes(key) )
// make a new object with just the filtered keys
.reduce( (accum, [key, val]) => Object.assign(accum, {[key]:val} ), {} )
Usage:
// create a function for getting select keys
const localStore = filterByKeys(localStorage)
// call that function with a list of keys you want
const myValues = localStore(['foo', 'bar'])
// and JSON for completeness
const localStoreJson = keys => JSON.stringify(localStore(keys))
Alternate option if you're transpiling or reading this in the future - using spread operator and compacting filter+reduce into one step - for your purposes this is likely unnecessary:
const filterByKeys = obj => keys => Object.entries(obj)
// filter and reduce in one step
.reduce( (accum, [key, val]) => keys.includes(key) ? {...accum, [key]:val } : accum, {} )

How can I get a key in a JavaScript 'Map' by its value?

I have a JavaScript 'Map' like this one
let people = new Map();
people.set('1', 'jhon');
people.set('2', 'jasmein');
people.set('3', 'abdo');
I want some method to return a key by its value.
let jhonKey = people.getKey('jhon'); // jhonKey should be '1'
You can use a for..of loop to loop directly over the map.entries and get the keys.
function getByValue(map, searchValue) {
for (let [key, value] of map.entries()) {
if (value === searchValue)
return key;
}
}
let people = new Map();
people.set('1', 'jhon');
people.set('2', 'jasmein');
people.set('3', 'abdo');
console.log(getByValue(people, 'jhon'))
console.log(getByValue(people, 'abdo'))
You could convert it to an array of entries (using [...people.entries()]) and search for it within that array.
let people = new Map();
people.set('1', 'jhon');
people.set('2', 'jasmein');
people.set('3', 'abdo');
let jhonKeys = [...people.entries()]
.filter(({ 1: v }) => v === 'jhon')
.map(([k]) => k);
console.log(jhonKeys); // if empty, no key found otherwise all found keys.
Though this is late and other great answers already exist, you can still give the spread syntax "..." and Array.find a try:
let people = new Map();
people.set('1', 'jhon');
people.set('2', 'jasmein');
people.set('3', 'abdo');
function getKey(value) {
return [...people].find(([key, val]) => val == value)?.[0] // optional chaining guards against a missing value
}
console.log('Jasmein - ', getKey('jasmein'))
console.log('Jhon - ', getKey('jhon'))
JavaScript Map and Object
Given a JavaScript Map, I like Nitish's answer:
const map = new Map([
[1, 'one'],
[2, 'two'],
[3, 'three'],
]);
function getKey(val) {
return [...map].find(([key, value]) => val === value)?.[0]; // optional chaining guards against a missing value
}
console.log(getKey('one')); // 1
console.log(getKey('two')); // 2
console.log(getKey('three')); // 3
For a JavaScript object, you could do something like so:
const map = {
1: 'one',
2: 'two',
3: 'three',
};
function getKey(val) {
return Object.keys(map).find(key => map[key] === val);
}
console.log(getKey('one')); // 1
console.log(getKey('two')); // 2
console.log(getKey('three')); // 3
There isn't any direct method for looking up information in this direction, so if all you have is the map, you need to loop through the entries as suggested by others.
If the map/array/other structure is large enough that such a loop would be a performance issue, and the need for a reverse lookup is common within the project, you could implement your own structure using a pair of maps/arrays/other: one as per the current object and the other with the keys and values reversed.
That way, the reverse lookup is as efficient as the normal one. Of course, you then have more work to do, as each method you need must be implemented as a pass-through to one or both of the underlying objects. If the map is small, and/or the reverse lookup is not needed often, the scan-via-loop option is likely preferable: it is simpler to maintain and possibly simpler for the JIT compiler to optimise.
In any case, one thing to be wary of is the possibility that multiple keys hold the same value. If that is possible, then when looping through your map you need to decide whether to return one of the matching keys arbitrarily (probably the first) or to return an array of all matching keys; and if you implement a reverse index over data that can contain duplicate values, the same issue needs to be accounted for there too.
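To sketch the multiple-keys case mentioned above, a loop can collect every matching key into an array (the data and the helper name here are mine, mirroring the question's example):

```javascript
// Collect every key whose value matches, since values need not be unique.
function getAllKeysByValue(map, searchValue) {
  const keys = [];
  for (const [key, value] of map.entries()) {
    if (value === searchValue) keys.push(key);
  }
  return keys;
}

const people = new Map();
people.set('1', 'jhon');
people.set('2', 'jasmein');
people.set('3', 'jhon');

console.log(getAllKeysByValue(people, 'jhon')); // ['1', '3']
console.log(getAllKeysByValue(people, 'nobody')); // []
```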
One could invert the Map so that the keys are the values and the values are the keys and then lookup the original value as a key. Here's an example:
let myMap = new Map([
[1, 'one'],
[2, 'two'],
[3, 'three'],
]);
let invertedMap = new Map([...myMap.entries()].map(
([key, value]) => ([value, key]))
);
console.log(invertedMap.get('one'))
// => 1
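One caveat worth adding to this inversion approach: if two keys share a value, the inverted map collapses them and the later entry wins. A small demonstration (the sample data is mine):

```javascript
// With duplicate values, inversion keeps only the last key seen for each value.
const dup = new Map([
  [1, 'one'],
  [2, 'one'],
]);

const inverted = new Map([...dup.entries()].map(
  ([key, value]) => [value, key])
);

console.log(inverted.get('one')); // 2 — the entry for key 1 was overwritten
console.log(inverted.size);       // 1
```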
Here is a properly typed TypeScript solution that doesn't unnecessarily create an intermediate array.
function find_map_value<K, V>(m: Map<K, V>, predicate: (v: V) => boolean): [K, V] | undefined {
for (const [k, v] of m) {
if (predicate(v)) {
return [k, v];
}
}
return undefined;
}
If you want all values you can use a generator:
function* find_all_map_values<K, V>(m: Map<K, V>, predicate: (v: V) => boolean): Generator<[K, V]> {
for (const [k, v] of m) {
if (predicate(v)) {
yield [k, v];
}
}
}
Why not simply make use of the Map's built-in iterator to look for the target value? Injecting the method into the prototype chain, polyfill-style, makes it available on every Map in your code:
Map.prototype.getKey = function(targetValue){
let iterator = this[Symbol.iterator]()
for (const [key, value] of iterator) {
if(value === targetValue)
return key;
}
}
const people = new Map();
people.set('1', 'jhon');
people.set('2', 'jasmein');
people.set('3', 'abdo');
const jhonKey = people.getKey('jhon');
console.log(`The key for 'jhon' is: ${jhonKey}`);
For anyone curious why I added yet another answer: most of these answers (Rajesh's being an exception I like, though here I add to the prototype chain) duplicate a lot of data in the name of finding a value, by using the spread operator or even crafting whole arrays. Object.keys(), mind you, is also terribly non-performant.
Note that I use for..of, which iterates over iterables. One could shorten this to simply for (const [key, value] of this) {...} if desired, since a Map is itself iterable.
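For completeness, that shorthand version of the same prototype method would look like this:

```javascript
// Same lookup, iterating the Map instance directly — a Map is itself iterable.
Map.prototype.getKey = function (targetValue) {
  for (const [key, value] of this) {
    if (value === targetValue) return key;
  }
};

const people = new Map([['1', 'jhon'], ['2', 'jasmein']]);
console.log(people.getKey('jasmein')); // '2'
```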
Building on what Maciej Krawczyk suggested, here is a general bidirectional map implementation for that.
class ReferenceMap {
#left;
#right;
constructor(iterable = []) {
this.#left = new Map(iterable);
this.#right = new Map(ReferenceMap.swapKeyValues(iterable));
}
has(key) {
return this.#left.has(key) || this.#right.has(key);
}
get(key) {
return this.#left.has(key) ? this.#left.get(key) : this.#right.get(key);
}
set(key, value) {
this.#left.set(key, value);
this.#right.set(value, key);
}
delete(key) {
if (this.#left.has(key)) {
let ref = this.#left.get(key);
this.#left.delete(key);
this.#right.delete(ref);
} else if (this.#right.has(key)) {
let ref = this.#right.get(key);
this.#right.delete(key);
this.#left.delete(ref);
}
}
entries() {
return this.#left.entries();
}
keys() {
return this.#left.keys();
}
values() {
return this.#left.values();
}
[Symbol.iterator]() {
return this.entries();
}
get size() {
return this.#left.size;
}
static * swapKeyValues(entries) {
for (let [key, value] of entries) yield [value, key];
}
}
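The core idea behind the class above can be shown in miniature with a self-contained sketch (names and data are mine): keep a forward and a reverse Map in sync, so each direction of lookup hits its own map.

```javascript
// The dual-map idea in miniature: a forward and a reverse Map kept in sync.
const forward = new Map();
const reverse = new Map();

function set(key, value) {
  forward.set(key, value);
  reverse.set(value, key);
}

set('1', 'jhon');
set('2', 'jasmein');

console.log(forward.get('1'));    // 'jhon' — normal lookup
console.log(reverse.get('jhon')); // '1'    — reverse lookup, same O(1) cost
```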
My TypeScript version:
const getByValue = <A, B>(m: Map<A,B>, searchValue: B):[A, B] | undefined => {
const l:IterableIterator<[A, B]> = m.entries();
const a:[A, B][] = Array.from(l);
return a.find(([_k,v]) => v === searchValue);
}
Cache
The question is slightly ill-posed, because one value can be assigned to many keys; the result for a given value should therefore be an array of keys, not a single key. If you often need to perform such searches, you can use the following cache generator for a reverse map:
let genRevMapCache = map => [...map.entries()].reduce((a,[k,v]) => {
if(!a.get(v)) a.set(v,[]);
a.get(v).push(k);
return a;
}, new Map() );
// TEST
let people = new Map();
people.set('1', 'jhon');
people.set('2', 'jasmein');
people.set('3', 'abdo');
people.set('4', 'jhon');
let cache = genRevMapCache(people);
console.log('jasmein', cache.get('jasmein'));
console.log('jhon', cache.get('jhon'));
JS:
// Returns keys for all instances
function findAll(obj) {
return Array.from(items.keys()).map(k => items.get(k) === obj ? k : undefined).filter(k => k !== undefined);
}
// Returns keys for the first instances
function findFirst(obj) {
return Array.from(items.keys()).find(k => items.get(k) === obj);
}
Typescript:
protected items = new Map<TKey, TObject>();
public findAll(obj: TObject): Array<TKey> {
return Array.from(this.items.keys()).map(k => this.items.get(k) === obj ? k : undefined).filter(k => k !== undefined);
}
public findFirst(obj: TObject): TKey | undefined {
return Array.from(this.items.keys()).find(k => this.items.get(k) === obj);
}
Explanation:
// Gets the keys as an array
Array.from(this.items.keys())
// Map the keys whose object matches the instance of `obj` to the key itself, undefined otherwise
.map(k => this.items.get(k) === obj ? k : undefined)
// Filter out array elements that are undefined
// (comparing against undefined rather than truthiness keeps valid falsy keys such as 0 or '')
.filter(k => k !== undefined)
// Finds the first occurrence of the key for the given object, undefined if not found
.find(k => this.items.get(k) === obj)

Array to Object es6 javascript

I am trying to see if there is a smaller way of converting an array to an object in es6. ( I do not have to worry about cross-browser )
I currently have:
function (values) // values being an array.
{
let [videos, video_count, page] = values;
let data = { videos, video_count, page };
someFunctions(data);
moreFunctions(data);
}
But I was wondering if it's possible to cut out the first line of the function, let [videos....] part. And somehow inline do the conversion.
I have read through the Mozilla documentation on destructuring assignment, but I could not see it there (though I may have understood it wrong), and I am really not clever enough to work through the ECMAScript ES6 spec.
I suspect it is not possible and the above is already the simplest I can make it.
But if I can get away with not creating the videos, video_count & page tmp variables I would be happier.
You can destructure right in the function parameters
function myFunc([ videos, video_count, page ])
{
let data = { videos, video_count, page };
someFunctions(data);
moreFunctions(data);
}
myFunc(values);
I do a lot of data abstraction using this technique
// basic abstraction
const ratio = (n, d) => ({n, d});
const numer = ({n}) => n;
const denom = ({d}) => d;
// compound abstraction using selectors
const ratioAdd = (x,y) => ratio(
numer(x) * denom(y) + numer(y) * denom(x),
denom(x) * denom(y)
);
// or skip selectors if you're feeling lazy
const printRatio = ({n,d}) => `${n}/${d}`;
console.log(printRatio(ratioAdd(ratio(1,3), ratio(1,4)))); //= 7/12
You seem hell-bent on somehow making the code shorter, so here you go. In this case, "making it shorter" means making it longer first.
// Goal:
obuild(keys,values) //=> ourObject
Generic procedures zip, assign, and obuild should give us what we need. This is vastly superior to #CodingIntigue's answer as it's not one big function that tries to do all of the tasks. Keeping them separate means reducing complexity, and increasing readability and reusability.
// zip :: [a] -> [b] -> [[a,b]]
const zip = ([x,...xs], [y,...ys]) => {
if (x === undefined || y === undefined)
return [];
else
return [[x,y], ...zip(xs,ys)];
}
// assign :: (Object{k:v}, [k,v]) -> Object{k:v}
const assign = (o, [k,v]) => Object.assign(o, {[k]: v});
// obuild :: ([k], [v]) -> Object{k:v}
const obuild = (keys, values) => zip(keys, values).reduce(assign, {});
let keys = ['a', 'b', 'c'];
let values = [1, 2, 3];
console.log(obuild(keys,values));
// { 'a': 1, 'b': 2, 'c': 3 }
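For what it's worth, on engines newer than this answer, obuild can also be sketched without the reduce/assign pair by handing the zipped pairs to Object.fromEntries (ES2019):

```javascript
// zip :: [a] -> [b] -> [[a,b]] — same recursive zip as above.
const zip = ([x, ...xs], [y, ...ys]) =>
  x === undefined || y === undefined ? [] : [[x, y], ...zip(xs, ys)];

// Object.fromEntries turns [key, value] pairs into an object directly.
const obuild = (keys, values) => Object.fromEntries(zip(keys, values));

console.log(obuild(['a', 'b', 'c'], [1, 2, 3]));
// { a: 1, b: 2, c: 3 }
```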
With the given properties and names you have, that's probably the shortest way to achieve your desired result.
However, if you had more fields, you could use reduce to avoid repetition. It's not as readable as the destructuring though:
let data = values.reduce((prev, val, index) => Object.assign(prev, {[["videos", "video_count", "page"][index]]: val} ), {});
You could then abstract that out into a generic function:
const values = ["test1","test2","test3"];
const mapArrayToObject = (array, fields) =>
array.reduce(
(prev, val, index) => Object.assign(prev, { [fields[index]]: val } ),
{}
);
const data = mapArrayToObject(values, ["videos", "video_count", "page"]);
console.log(data);
