I need help creating a function that returns the elements that are present in only one of three arrays. For example:
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
In the three arrays above, 'd' and 'f' are each found in only one of the arrays (arr2 and arr3 respectively), so I need to return them:
['d','f']
The arrays can be of different sizes and the returned elements must not be duplicated.
I tried to find a better alternative but failed and went with the brute-force approach: looping through each array and checking whether the element exists in the other two. Obviously, it's really slow and hard to read.
function elementsInOnlyOneArr(a1, a2, a3) {
let myArr = [];
for(let el of a1){
if(a2.includes(el) == false && a3.includes(el) == false && myArr.includes(el) == false){
myArr.push(el);
}
}
for(let el of a2){
if(a1.includes(el) == false && a3.includes(el) == false && myArr.includes(el) == false){
myArr.push(el);
}
}
for(let el of a3){
if(a2.includes(el) == false && a1.includes(el) == false && myArr.includes(el) == false){
myArr.push(el);
}
}
return myArr;
}
Assuming there are fewer than 32 arrays, you can do this efficiently with bitmaps. Basically, build an index from key to number, where the number has the Nth bit set if the key is in the Nth array. Finally, return the keys whose numbers have only a single bit set (= are powers of two):
function difference(...arrays) {
let items = {}
for (let [n, a] of arrays.entries())
for (let x of a) {
items[x] = (items[x] ?? 0) | (1 << n)
}
return Object.keys(items).filter(x =>
Number.isInteger(Math.log2(items[x])))
}
let arr1 = ['a', 'b', 'c', 'a', 'b', 'z', 'z', 'z']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
console.log(difference(arr1, arr2, arr3))
(As noted in the comments, (x & (x - 1)) === 0 would be more idiomatic to check whether x is a power of two. See How does the formula x & (x - 1) works? for explanations.)
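With that check, the filter in the snippet above would change as follows (a minimal sketch of the same approach; only the last line differs, and differenceBitwise is just a name chosen here):

function differenceBitwise(...arrays) {
  let items = {}
  for (let [n, a] of arrays.entries())
    for (let x of a) {
      items[x] = (items[x] ?? 0) | (1 << n)
    }
  // a single set bit means the key appeared in exactly one array
  return Object.keys(items).filter(x => (items[x] & (items[x] - 1)) === 0)
}

console.log(differenceBitwise(['a', 'b', 'c', 'a', 'b'], ['a', 'd', 'b', 'c'], ['f', 'c', 'a'])) // ['d', 'f']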
Here's a more general approach that doesn't limit the number of arrays and doesn't require keys to be strings:
function difference(...arrays) {
let items = new Map
for (let [n, a] of arrays.entries())
for (let x of a) {
if (!items.has(x))
items.set(x, new Set)
items.get(x).add(n)
}
let result = []
for (let [x, ns] of items)
if (ns.size === 1)
result.push(x)
return result
}
let arr1 = ['a', 'b', 'c', 'a', 'b', 'z', 'z', 'z']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
console.log(difference(arr1, arr2, arr3))
EDIT: I misunderstood the OP; this is not an intersection, but rather extracting the values that are unique to a single array (i.e. NOT the intersection). For that, this might work:
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
const thereCanOnlyBeOne = function(...arrs) {
return Array.from(
arrs.reduce((map, arr) => {
new Set(arr).forEach((v) => map.set(v, map.has(v) ? map.get(v)+1 : 1));
return map;
}, new Map())
)
.filter(([value, count]) => count === 1)
.map(([value, count]) => value);
};
console.log(thereCanOnlyBeOne(arr1, arr2, arr3));
I would think #gog's answer is way more sophisticated and probably much faster, but I had a slightly hard time wrapping my head around it (call me stupid, I take it =D; EDIT: had to do some research and read/learn something about bitsets here and here), so here's the breakdown of the slightly convoluted way of doing this with a Map and array methods:
Pass all arrays to be analyzed into the function; order doesn't matter.
Loop (I chose reduce, but any loop structure works) through all input arrays and their values, counting up occurrences in the Map. At the end the Map will look as follows:
0: {"a" => 4}
1: {"b" => 3}
2: {"c" => 3}
3: {"d" => 1}
4: {"f" => 1}
Once done with that, we convert the Map back into an array via Array.from() creating an array of tuples:
[
["a", 3],
["b", 2],
["c", 3],
["d", 1],
["f", 1],
]
Filter that resulting array of tuples (now in the form of [<value>, <count>]) to only be left with values that occurred exactly once, leaving us with:
[
["d", 1],
["f", 1],
]
Map over the filtered array to "dumb" it down into a one-dimensional array again and return the result:
["d", "f"]
WARNING: Internally this code still does a ****load of loops, so call it a brute-force approach as well; it just looks "shorter" due to the "sexy" ES6 array syntax sugar.
For completeness, a slightly modified version: the Array.filter() step can be omitted (although it seems to be faster) by iterating the counter Map once it's finalized and simply deleting Map entries whose value is not 1.
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
const thereCanOnlyBeOne = function(...arrs) {
let result;
arrs
.reduce((map, arr) => {
new Set(arr).forEach((v) => map.set(v, map.has(v) ? map.get(v)+1 : 1));
return map;
}, new Map())
// the result of .reduce will be a Map!
.forEach((value, key, map) => { value !== 1 && map.delete(key); result = map; });
return Array.from(result).map(([value, count]) => value);
};
console.log(thereCanOnlyBeOne(arr1, arr2, arr3));
UPDATE: As #Nick Parsons pointed out, the previous version of the code would not output elements that were present in only one array but occurred in it multiple times:
"This will produce an incorrect output if one array contains the same value multiple times and that element isn't present in any other arrays. E.g., if you remove b from arr2, then only arr1 has b in it but no others do, so b should be included in the final result."
This can easily be solved by turning the array that is checked into a Set() (thereby reducing the array's values to unique ones).
If anyone (besides me) wonders, here's a benchmark between gog's options and mine; his bitset approach is clearly the fastest, so if you are comparing fewer than 32 arrays, that's the most performant solution by far: https://jsben.ch/YkKSu
And if anyone prefers an ES6-ified version of gog's bitset implementation (improved by #ralphmerridew's suggestion), here you go:
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
function onlyone(...arrays) {
return Object.entries(
arrays.reduce((map, arr, n) => {
arr.forEach((v) => map[v] = (map[v] ?? 0) | (1 << n));
return map;
}, {})
)
.filter(([value, bitmap]) => (bitmap & (bitmap-1)) == 0)
.map(([value, bitmap]) => value);
};
console.log(onlyone(arr1, arr2, arr3));
I updated the benchmark with this as well. Interestingly (or unexpectedly), this "slower"-looking ES6 implementation somehow beats gog's for-loop implementation by a tad; tested in Chrome and Firefox multiple times, as I couldn't believe it myself and thought those syntax-sugar methods would slow things down slightly compared to for loops. Well... good to know =)
I also tried implementing the bitset approach with BigInt() to eliminate the 32-array limit (depending on the engine, BigInt should make it possible to deal with roughly 1 million to 1 billion arrays). Unfortunately, that seems to make it the slowest of all solutions (benchmark updated):
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
function onlyoneBigInt(...arrays) {
return Object.entries(
arrays.reduce((map, arr, n) => {
arr.forEach((v) => map[v] = (map[v] ?? 0n) | (1n << BigInt(n)));
return map;
}, {})
)
.filter(([value, bitmap]) => (bitmap & (bitmap-1n)) == 0)
.map(([value, bitmap]) => value);
};
console.log(onlyoneBigInt(arr1, arr2, arr3));
Maybe someone sees something that can be improved to make this faster?
This is really just Set operations. The singles function below finds any entries in a test array that do not appear in the other arrays in the collection. I'm deliberately implementing it this way so you can test individual arrays, since it's not clear from the question whether you need to return the letters or the arrays.
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
// The set of arrays
let arrays = [ arr1, arr2, arr3 ]
// Finds any entries in the test array that don't appear in the other arrays
let singles = (test) => {
  // others is the Set of all values in the other arrays
  let others = arrays.reduce( ( accum, elem ) => {
    if (elem != test) { elem.forEach(accum.add, accum) }
    return accum
  }, new Set())
  // find anything in the test array that the others do not have
  return [...new Set(test.filter( value => ! others.has(value) ))]
}
// collect results from testing all arrays
let result = []
for (const array of arrays) {
  result.push(...singles(array))
}
console.log(result)
Borrowing the parameter construction from #gog's excellent answer, you could also define it so that it takes a test array and an arbitrary collection of arrays to test against:
let singles = (test, ...arrays) => {
  // others is the Set of all values in the other arrays
  let others = arrays.reduce( ( accum, elem ) => {
    if (elem != test) { elem.forEach(accum.add, accum) }
    return accum
  }, new Set())
  // find anything in the test array that the others do not have
  return [...new Set(test.filter( value => ! others.has(value) ))]
}
console.log(singles(arr2, arr1, arr2, arr3))
The advantage here is that this should work with any number of arrays, while gog's answer is probably faster for a collection of fewer than 32 arrays (or technically any number, if you were willing to extend it using BigInt, but that may lose some of the speed).
A fairly simple approach:
const inOnlyOne = (
xss,
keys = [... new Set (xss .flat ())],
uniques = xss .map (xs => new Set (xs))
) => keys .filter (k => uniques .filter (f => f .has (k)) .length == 1)
console .log (inOnlyOne ([['a', 'b', 'c', 'a', 'b'], ['a', 'd', 'b', 'c'], ['f', 'c', 'a']]))
We find the list of unique keys by flattening our array of arrays and turning that into a Set and then back into an array, convert each input array into a Set, and then filter the keys to keep only those that appear in exactly one of those Sets.
There is a little inefficiency here in that we check all the Sets when seeing whether a key is in there. It would be easy enough to modify it to stop checking once we find a second Set (see the sketch below), but the code would be more complex. I would only bother to do so if I found that this simple version was not performant enough for my needs.
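A minimal sketch of that early-exit variant (inOnlyOneEager is a made-up name; it assumes the same wrapped-array calling convention as above):

const inOnlyOneEager = (xss) => {
  const keys = [...new Set(xss.flat())]
  const sets = xss.map(xs => new Set(xs))
  return keys.filter(k => {
    let count = 0
    for (const s of sets) {
      // bail out as soon as a second Set contains the key
      if (s.has(k) && ++count > 1) return false
    }
    return count === 1
  })
}

console.log(inOnlyOneEager([['a', 'b', 'c', 'a', 'b'], ['a', 'd', 'b', 'c'], ['f', 'c', 'a']])) //=> ['d', 'f']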
One advantage of this approach is that it works for other data types than strings and numbers:
const a = {a: 1}, b = {b: 3}, c = {c: 3}, d = {d: 4}, e = {e: 5}, f = {f: 6}
inOnlyOne ([[a, b, c, a, b], [a, d, b, c], [f, c, a]])
//=> [{d: 4}, {f: 6}]
Of course that only helps if your items are shared references. If you wanted to use value equality rather than reference equality, it would be significantly more complex.
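For what it's worth, one rough way to approximate value equality is to key the counting structure on a serialized form of each item. This is only a sketch under the assumption that the items serialize deterministically with JSON.stringify (inOnlyOneByValue is a made-up name):

const inOnlyOneByValue = (xss) => {
  // serialized key -> { value, sources: Set of indices of the arrays the value appeared in }
  const seen = new Map()
  xss.forEach((xs, n) =>
    xs.forEach(x => {
      const k = JSON.stringify(x)
      if (!seen.has(k)) seen.set(k, { value: x, sources: new Set() })
      seen.get(k).sources.add(n)
    })
  )
  return [...seen.values()].filter(e => e.sources.size === 1).map(e => e.value)
}

console.log(inOnlyOneByValue([[{a: 1}, {b: 3}], [{a: 1}, {d: 4}], [{c: 3}]])) //=> [{b: 3}, {d: 4}, {c: 3}]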
If we wanted to pass the arrays individually, rather than wrap them in a common array, this variant should work:
const inOnlyOne = (...xss) => ((
keys = [... new Set (xss .flat ())],
uniques = xss .map (xs => new Set (xs))
) => keys .filter (k => uniques .filter (f => f .has (k)) .length == 1)
) ()
The Array.prototype.includes() method seems like the way to go here.
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a', 'f']
var arrays = [arr1,arr2,arr3];
const items = arr1.concat(arr2, arr3);
let results = [];
items.forEach(isInOneArray);
function isInOneArray(item){
let found = 0;
for (const arr of arrays){
if (arr.includes(item)){
found ++;
}
}
if (found===1 && !results.includes(item)){
results.push(item);
}
}
console.log(results);
This is a brute-force iterator much like your own, but it reduces the number of repeated checks by removing items from the arrays as it goes:
function elementsInOnlyOneArr(...arrays) {
// de-dup and sort so we process the longest array first
let sortedArrays = arrays.map(ar => [...new Set(ar)]).sort((a,b) => b.length - a.length);
for (let ai1 = 0 ; ai1 < sortedArrays.length; ai1 ++) {
for(let i = sortedArrays[ai1].length - 1; i >= 0; i --){
let exists = false;
let val = sortedArrays[ai1][i];
for(let ai2 = ai1 + 1 ; ai2 < sortedArrays.length ; ai2 ++) {
let foundIndex = sortedArrays[ai2].indexOf(val);
if (foundIndex >= 0) {
exists = true;
sortedArrays[ai2].splice(foundIndex,1);
// do not break, check for match in the other arrays
}
}
// if there was a match in any of the other arrays, remove it from the first one too!
if (exists)
sortedArrays[ai1].splice(i,1);
}
}
// concat the remaining elements, they are all unique
let output = sortedArrays[0];
for(let i = 1; i < sortedArrays.length; i ++)
output = output.concat(sortedArrays[i]);
return output;
}
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
console.log(elementsInOnlyOneArr(arr1,arr2,arr3));
See this fiddle: https://jsfiddle.net/4deq7xwm/
Updated - Use splice() instead of pop()
Create a collection of pairs (x, y) where x is an element (in your case, a string) and y identifies the array it comes from. Sort this in O(n log n) time by x first (where n is the total number of items over all arrays). It is then easy to iterate over the result and detect the desired items.
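Since this answer doesn't include code, here is a hedged sketch of that idea (my own interpretation, with made-up names):

function onlyInOne(...arrays) {
  // build (element, arrayIndex) pairs
  const pairs = arrays.flatMap((arr, n) => arr.map(x => [x, n]))
  // sort by element so equal elements end up adjacent
  pairs.sort((a, b) => (a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0))
  const result = []
  let i = 0
  while (i < pairs.length) {
    // scan one run of equal elements and record which arrays they came from
    const sources = new Set()
    let j = i
    while (j < pairs.length && pairs[j][0] === pairs[i][0]) {
      sources.add(pairs[j][1])
      j++
    }
    if (sources.size === 1) result.push(pairs[i][0])
    i = j
  }
  return result
}

console.log(onlyInOne(['a', 'b', 'c', 'a', 'b'], ['a', 'd', 'b', 'c'], ['f', 'c', 'a'])) // ['d', 'f']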
This is easily solved with the built-in .lastIndexOf() Array method:
const arr1 = ['a', 'b', 'c', 'a', 'b'];
const arr2 = ['a', 'd', 'b', 'c'];
const arr3 = ['f', 'c', 'a'];
function getValuesInOneArray(...arrays) {
const combinedArr = arrays.flat();
const result = [];
for (const value of combinedArr) {
if (combinedArr.indexOf(value) === combinedArr.lastIndexOf(value)) {
result.push(value);
}
}
return result;
}
getValuesInOneArray(arr1, arr2, arr3); // ['d', 'f']
I generally try to avoid "ninja code" for the benefit of maintainability and readability, but I couldn't resist rewriting the above getValuesInOneArray() function as a slicker arrow function.
const getValuesInOneArray = (...arrays) =>
arrays
.flat()
.filter(
(value, index, array) => array.indexOf(value) === array.lastIndexOf(value)
);
You can read more about "ninja code" (and why you should avoid it) here, on JavaScript.info, but I recommend avoiding practices like this in production codebases.
Hope this helps.
function elementsInOnlyOneArr(arr1, arr2, arr3){
let arr = arr1.concat(arr2).concat(arr3);
return removeDuplicate(arr);
}
function removeDuplicate(arr){
  for (const each of arr) {
    let count = 0;
    for (const ch of arr) {
      if (each === ch) {
        count++;
        if (count > 1) {
          // remove elements that exist more than once
          arr = arr.filter(item => item !== each);
          return removeDuplicate(arr);
        }
      }
    }
  }
  return arr;
}
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
console.log(elementsInOnlyOneArr(arr1, arr2, arr3));
Do a diff of each of the arrays and concat those to get the values that are unique to any one of the arrays.
const arr1 = ['a', 'b', 'c', 'a', 'b'];
const arr2 = ['a', 'd', 'b', 'c'];
const arr3 = ['f', 'c', 'a'];
function diff(a1, a2, a3) {
let u1 = a1.filter(el => { return !a2.includes(el) })
.filter(el => { return !a3.includes(el) });
let u2 = a2.filter(el => { return !a1.includes(el) })
.filter(el => { return !a3.includes(el) });
let u3 = a3.filter(el => { return !a2.includes(el) })
.filter(el => { return !a1.includes(el) });
return u1.concat(u2).concat(u3);
}
/* diff them */
const adiff = diff(arr1, arr2, arr3);
console.log(adiff);
I have 2 arrays. I am trying to return the similar values between the 2 but in the order of the second. For example, take a look at the two arrays:
array1 = ['a', 'b', 'c']
array2 = ['b', 'c', 'a', 'd']
What I would like to return is this:
sim = ['b', 'c', 'a']
Here is a link to what I am trying to accomplish. Currently the script is faulty and not catching the corner case.
You could use a Set for array1 and filter array2 with Array#filter by checking the set.
var array1 = ['a', 'b', 'c'],
array2 = ['b', 'c', 'a', 'd'],
theSet = new Set(array1),
result = array2.filter(v => theSet.has(v));
console.log(result);
Some annotations to your code:
function arr_sim (a1, a2) {
var //a = {}, // take an object as hash table, better
a = Object.create(null), // a really empty object without prototypes
sim = [],
i; // use single declaration at top
for (i = 0; i < a1.length; i++) { // iterate all item of array 1
a[a1[i]] = true;
}
for (var i = 0; i < a2.length; i++) {
if (a[a2[i]]) {
sim.push(a2[i]); // just push the value
}
}
return sim;
}
console.log(arr_sim(['a', 'b', 'c'], ['b', 'c', 'a', 'd']));
You can iterate array2 with a filter, and check if the value is contained in array1:
let array1 = ['a', 'b', 'c'];
let array2 = ['b', 'c', 'a', 'd'];
let sim = array2.filter((entry) => {
return array1.includes(entry);
});
console.log(sim);
I think this is what you are looking for?
function arr_sim (a1, a2) {
a1 = Array.isArray(a1)?a1:typeof a1 == "string"?a1.split(""):false;
a2 = Array.isArray(a2)?a2:typeof a2 == "string"?a2.split(""):false;
if(!a1 || !a2){
alert("Not valid values");
return;
}
var filterArray = a1.filter(function(val){
return a2.indexOf(val) !== -1;
})
return filterArray;
}
console.log(arr_sim(['a', 'b'], ['b', 'a', 'c', 'd']));
console.log(arr_sim("abcd", "abcde"));
console.log(arr_sim("cxz", "zcx"));
Try this
const arr_sim = (a1, a2) => a2.filter(a => a1.includes(a))
console.log(arr_sim(['a', 'b', 'c'], ['b', 'c', 'a', 'd']));
Try this example here: similar values between two arrays
var a1 = ['a' ,'b'];
var a2 = ['a' ,'b' ,'c'];
var result = arr_sim(a1,a2);// call method arr_sim
console.log(result);
function arr_sim (a1, a2) {
var similar = [];
for( var i = 0 ; i <a1.length ; i++ ){ // loop a1 array
for( var j = 0 ; j <a2.length ; j++ ){ // loop a2 array
if( a1[i] == a2[j] ){ // check if is similar
similar.push(a1[i]); // add to similar array
break; // break second loop find that is similar
} // end if
} // end second lopp
} // end first loop
return similar; // return result
} // end function
I have an Array with duplicate values.
I want to create a Set to get the distinct values of that array, and then produce a new Array that has the same data MINUS the elements required to create the Set.
This is not just a matter of removing the duplicates, but of removing a SINGLE entry of each distinct value from the original array.
Something like this works, but I wonder if there is a more direct approach:
let originalValues = [
'a',
'a',
'a',
'b',
'b',
'c',
'c',
'd'
];
let distinct = new Set(originalValues);
/*
distinct -> { 'a', 'b', 'c', 'd' }
*/
// Perhaps originalValues.extract(distinct) ??
for (let val of distinct.values()) {
const index = originalValues.indexOf(val);
originalValues.splice(index, 1);
}
/*
originalValues -> [
'a',
'a',
'b',
'c'
];
*/
Use Array#filter in combination with the Set:
const originalValues = ['a', 'a', 'a', 'b', 'b', 'c', 'c', 'd'];
const remainingValues = originalValues.filter(function(val) {
if (this.has(val)) { // if the Set has the value
this.delete(val); // remove it from the Set
return false; // filter it out
}
return true;
}, new Set(originalValues));
console.log(remainingValues);
You could use a closure over a Set and check for existence.
let originalValues = ['a', 'a', 'a', 'b', 'b', 'c', 'c', 'd'],
result = originalValues.filter((s => a => s.has(a) || !s.add(a))(new Set));
console.log(result);
You should not use indexOf inside a loop, because it has linear cost, and the total cost becomes quadratic. What I would do is use a map to count the occurrences of each item in your array, and then convert back to an array subtracting one occurrence.
let originalValues = ['a', 'a', 'a', 'b', 'b', 'c', 'c', 'd'];
let freq = new Map(); // frequency table
for (let item of originalValues)
if (freq.has(item)) freq.set(item, freq.get(item)+1);
else freq.set(item, 1);
var arr = [];
for (let [item,count] of freq)
for (let i=1; i<count; ++i)
arr.push(item);
console.log(arr);
If all items are strings, you can use a plain object instead of a map.
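A minimal sketch of that string-only variant (same counting idea as above, just with a plain object instead of a Map):

let originalValues = ['a', 'a', 'a', 'b', 'b', 'c', 'c', 'd'];
let freq = {}; // frequency table keyed by the string itself
for (let item of originalValues)
  freq[item] = (freq[item] || 0) + 1;
let arr = [];
for (let item in freq)
  for (let i = 1; i < freq[item]; ++i)
    arr.push(item); // push one fewer than the observed count
console.log(arr); // ['a', 'a', 'b', 'c']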
You can create a simple Array.prototype.reduce loop with a hash table to count the number of occurrences and populate the result only if it occurs more than once.
See demo below:
var originalValues=['a','a','a','a','b','b','b','c','c','d'];
var result = originalValues.reduce(function(hash) {
return function(p,c) {
hash[c] = (hash[c] || 0) + 1;
if(hash[c] > 1)
p.push(c);
return p;
};
}(Object.create(null)), []);
console.log(result);
Instead of using a Set for this, you could just use reduce() to create a new array with the unique values and also update the original array with splice().
let oV = ["a", "a", "a", "a", "b", "b", "c", "c", "d"]
var o = {}
var distinct = oV.reduce(function(r, e) {
if (!o[e]) o[e] = 1 && r.push(e) && oV.splice(oV.indexOf(e), 1)
return r;
}, [])
console.log(distinct)
console.log(oV)
As an alternative approach, you can use the following algorithm, which removes only the first entry of each duplicated element. If an element is not duplicated, it does not remove anything.
const originalValues = ['a', 'a', 'a', 'b', 'b', 'c', 'c', 'd'];
var r = originalValues.reduce(function(p, c, i, a) {
var lIndex = a.lastIndexOf(c);
var index = a.indexOf(c)
if (lIndex === index || index !== i)
p.push(c);
return p
}, [])
console.log(r)
If the non-duplicated values are not a concern, then you can directly remove the first occurrence of each value:
const originalValues = ['a', 'a', 'a', 'b', 'b', 'c', 'c', 'd'];
var r = originalValues.filter(function(el, i) {
return originalValues.indexOf(el) !== i
})
console.log(r)
Consider the following scenario:
var defaultArr = ['a', 'b', 'c', 'd'];
var availArr = [];
var selectedArr = [];
If I pass some index values as parameters, I need to split up my arrays.
Example:
If Array Index : 0,2
Expected result:
availArr = ['b', 'd'];
selectedArr = ['a', 'c'];
Is there any default method to achieve this?
Fairly easy with Array.reduce:
var defaultArr = ['a', 'b', 'c', 'd'];
var indexes = [0,2];
var result = defaultArr.reduce(function(p, c, i){
if(indexes.indexOf(i)>-1)
p.selectedArr.push(c);
else
p.availArr.push(c);
return p;
}, {availArr: [], selectedArr:[]});
console.log('availArr',result.availArr);
console.log('selectedArr',result.selectedArr);
This works because reduce takes a callback argument which is passed 3 arguments - in my example above
p the seed object passed in
c the current array element
i the index of the current element
And uses that information along with indexOf to determine which result array to push to.
You could use Array#reduceRight and iterate the indices array.
var defaultArr = ['a', 'b', 'c', 'd'],
availArr = defaultArr.slice(),
selectedArr = [],
indices = [0, 2];
indices.reduceRight(function (_, a) {
selectedArr.unshift(availArr.splice(a, 1)[0]);
}, 0);
console.log(availArr);
console.log(selectedArr);
var defaultArr = ['a', 'b', 'c', 'd'];
var availArr = [];
var selectedArr = [];
function splitArray(indexes) {
  availArr = defaultArr.slice(); // copy, so defaultArr itself isn't mutated
  // splice from the highest index down so earlier removals don't shift later ones
  indexes.slice().sort(function(a, b) { return b - a; }).forEach(function(idx) {
    let item = availArr.splice(idx, 1)[0];
    selectedArr.unshift(item); // unshift keeps the selected items in ascending index order
  });
}
splitArray([0, 2]);
console.log(availArr);
console.log(selectedArr);
You can use Array methods like forEach and includes
var given = ['a', 'b', 'c', 'd'];
var indexes = [0, 2];
var available = [];
var selected = [];
given.forEach(function (v, i) {
if (indexes.includes(i)) {
selected.push(v);
} else {
available.push(v);
}
});
document.write(JSON.stringify({
given: given,
available: available,
selected: selected
}));
In JS, Array.prototype.reduceRight() is ideal for iterating over an array and morphing it by removing items. Accordingly, I would approach this job as follows:
var defaultArr = ['a', 'b', 'c', 'd'],
    indices = [0, 2],
    result = defaultArr.reduceRight((p, c, i, a) => indices.includes(i) ? p.concat(a.splice(i, 1)) : p, []);
console.log(defaultArr,result);
You can use array.splice + array.concat to achieve this
var defaultArr = ['a', 'b', 'c', 'd'];
var availArr = [];
var selectedArr = [];
function parseIndexes(indexArr){
var deleteCount = 0;
availArr = defaultArr.map(x=>x);
indexArr.forEach(function(i){
selectedArr = selectedArr.concat(availArr.splice(i-deleteCount,1))
deleteCount++;
});
console.log(availArr, selectedArr)
}
parseIndexes([0,2])
With only Array.filter
var array = ['a', 'b', 'c', 'd'];
var indexes = [0, 2]
array.filter(function(el, i) {
return indexes.indexOf(i) !== -1
});
// ["a", "c"]
With array being the array of your elements (objects, strings, ...) and indexes the array containing the indexes of the elements you want to keep, you just filter out of array all the elements whose index isn't in the indexes array.
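For completeness, the remaining (non-selected) entries are just the negated filter (a small sketch reusing the same sample data):

var array = ['a', 'b', 'c', 'd'];
var indexes = [0, 2];
var available = array.filter(function(el, i) {
  return indexes.indexOf(i) === -1
});
// ["b", "d"]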
The array of all selected entries can be obtained in one line via the Array.map:
var defaultArr = ['a', 'b', 'c', 'd']
var index = [0,2]
var selectedArr = index.map(i => defaultArr[i]) //=> ['a', 'c']
Then the array of the remaining entries can be retrieved, e.g. with Ramda's difference function:
var availArr = R.difference(defaultArr, selectedArr) //=> ['b', 'd']
I have a multidimensional array such as:
MultArrary = [
['a','b'],
['c','d'],
['f','g']
]
What I need is to get the key and value of each array inside the array and push them into separate arrays.
Expected array1 = ['a','c','f'];
Expected array2 = ['b','d','g'];
Any ideas on how to achieve this with JavaScript or RxJS would be great.
The easiest thing to do in your case (while not syntactically correct; see the comments below your question) is to use a Map (provided you are within an ECMAScript 6 capable environment):
var MultiArray = [
['a', 'b'],
['c', 'd'],
['f', 'g']
];
var m = new Map(MultiArray);
var index0 = Array.from(m.keys());
var index1 = Array.from(m.values());
console.log(index0, index1);
Using Underscore.js you can do it with unzip:
var MultArrary = [
['a', 'b'],
['c', 'd'],
['f', 'g']
];
var rst = _.unzip(MultArrary);
//[['a','c','f'],['b','d','g']]
console.log(rst);
<script data-require="underscore.js#1.8.3" data-semver="1.8.3" src="//cdnjs.cloudflare.com/ajax/libs/underscore.js/1.8.3/underscore-min.js"></script>
You could use a single loop and build the result in one pass.
The results you want are in pivot[0] and pivot[1].
var multArrary = [['a', 'b'], ['c', 'd'], ['f', 'g']],
pivot = multArrary.reduce(function (r, a) {
a.forEach(function (b, i) {
r[i] = r[i] || [];
r[i].push(b);
});
return r;
}, []);
console.log(pivot);