JavaScript Concat Values of Two Arrays

I have two arrays like this:
const a = ['a', 'b', 'c', 'd'];
const b = ['1', '2', '3', '4'];
I'm trying to make a new array like this:
const c = ['a1', 'b2', 'c3', 'd4'];
I tried it this way:
const c = [];
c.push([`${a[0]}${b[0]}`, `${a[1]}${b[1]}`, `${a[2]}${b[2]}`, `${a[3]}${b[3]}`]);
Actually looping through my data and doing this took 17400ms.
When I took out the c.push([........]); call, it dropped to 1250ms.
Why does this take so long to do?
And what is the best way to do this?

You can use .map to achieve that: map over a, then use the index on each iteration to get the corresponding element of b.
const a = ['a', 'b', 'c', 'd'];
const b = ['1', '2', '3', '4'];
var c = a.map(function (d, i) {
  return d + String(b[i]);
});
console.log(c)
// ["a1", "b2", "c3", "d4"]
Cleaner code using ES6:
var c = a.map((d, i) => `${d}${b[i]}`)

A simple loop.
const a = ['a', 'b', 'c', 'd', 'e'];
const b = ['1', '2', '3'];
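// note: a is longer than b here, so the last results will be 'dundefined' and 'eundefined'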
var result = [];
for (var i = 0; i < a.length; i++) {
  result[i] = a[i] + b[i];
}
alert(result);

As I suspected and you confirmed, the real problem is that you do too many pushes.
push modifies the length of the array. The specification does not enforce any particular data structure for arrays, but for non-sparse ones, implementations usually store the values consecutively in memory. That becomes a problem when you change the length of the array: the additional data may not fit in the place in memory where the data currently sits, so all the data must be moved. push thus ends up being amortized constant time instead of strictly constant.
However, if you know the length of the resulting array beforehand, it's usually better to preallocate.
Then, instead of
var array = [];
for (var i = 0; i < 2e4; ++i)
  array.push([a[0]+b[0], a[1]+b[1], a[2]+b[2], a[3]+b[3]]);
I would use
var array = new Array(2e4);
for (var i = 0; i < 2e4; ++i)
  array[i] = [a[0]+b[0], a[1]+b[1], a[2]+b[2], a[3]+b[3]];


What is the difference between spread operator and array.concat()
let parts = ['four', 'five'];
let numbers = ['one', 'two', 'three'];
console.log([...numbers, ...parts]);
Array.concat() function
let parts = ['four', 'five'];
let numbers = ['one', 'two', 'three'];
console.log(numbers.concat(parts));
Both results are the same. So in what scenarios should we use each one? And which is better for performance?
concat and spread are very different when the argument is not an array.
When the argument is not an array, concat adds it as a whole, while ... tries to iterate it and fails if it can't. Consider:
a = [1, 2, 3]
x = 'hello';
console.log(a.concat(x)); // [ 1, 2, 3, 'hello' ]
console.log([...a, ...x]); // [ 1, 2, 3, 'h', 'e', 'l', 'l', 'o' ]
Here, concat treats the string atomically, while ... uses its default iterator, char-by-char.
Another example:
x = 99;
console.log(a.concat(x)); // [1, 2, 3, 99]
console.log([...a, ...x]); // TypeError: x is not iterable
Again, for concat the number is an atom, ... tries to iterate it and fails.
Finally:
function* gen() { yield *'abc' }
console.log(a.concat(gen())); // [ 1, 2, 3, Object [Generator] {} ]
console.log([...a, ...gen()]); // [ 1, 2, 3, 'a', 'b', 'c' ]
concat makes no attempt to iterate the generator and appends it as a whole, while ... nicely fetches all values from it.
To sum it up, when your arguments are possibly non-arrays, the choice between concat and ... depends on whether you want them to be iterated.
The above describes the default behaviour of concat; however, ES6 provides a way to override it with Symbol.isConcatSpreadable. By default, this symbol is true for arrays and false for everything else. Setting it to true tells concat to iterate the argument, just like ... does:
str = 'hello'
console.log([1,2,3].concat(str)) // [1,2,3, 'hello']
str = new String('hello');
str[Symbol.isConcatSpreadable] = true;
console.log([1,2,3].concat(str)) // [ 1, 2, 3, 'h', 'e', 'l', 'l', 'o' ]
Performance-wise concat is faster, probably because it can benefit from array-specific optimizations, while ... has to conform to the common iteration protocol. Timings:
let big = (new Array(1e5)).fill(99);
let i, x;
console.time('concat-big');
for(i = 0; i < 1e2; i++) x = [].concat(big)
console.timeEnd('concat-big');
console.time('spread-big');
for(i = 0; i < 1e2; i++) x = [...big]
console.timeEnd('spread-big');
let a = (new Array(1e3)).fill(99);
let b = (new Array(1e3)).fill(99);
let c = (new Array(1e3)).fill(99);
let d = (new Array(1e3)).fill(99);
console.time('concat-many');
for(i = 0; i < 1e2; i++) x = [1,2,3].concat(a, b, c, d)
console.timeEnd('concat-many');
console.time('spread-many');
for(i = 0; i < 1e2; i++) x = [1,2,3, ...a, ...b, ...c, ...d]
console.timeEnd('spread-many');
Well console.log(['one', 'two', 'three', 'four', 'five']) has the same result as well, so why use either here? :P
In general, you would use concat when you have two (or more) arrays from arbitrary sources, and you would use spread syntax in an array literal when the additional elements that are always part of the array are known beforehand. So if you would otherwise have an array literal with concat in your code, just go for spread syntax, and use concat otherwise:
[...a, ...b] // bad :-(
a.concat(b) // good :-)
[x, y].concat(a) // bad :-(
[x, y, ...a] // good :-)
Also the two alternatives behave quite differently when dealing with non-array values.
I am replying just to the performance question, since there are already good answers regarding the scenarios. I wrote a test and executed it on the most recent browsers. Below are the results and the code.
/*
* Performance results.
* Browser Spread syntax concat method
* --------------------------------------------------
* Chrome 75 626.43ms 235.13ms
* Firefox 68 928.40ms 821.30ms
* Safari 12 165.44ms 152.04ms
* Edge 18 1784.72ms 703.41ms
* Opera 62 590.10ms 213.45ms
* --------------------------------------------------
*/
Below is the code I wrote and used.
const array1 = [];
const array2 = [];
const mergeCount = 50;
let spreadTime = 0;
let concatTime = 0;
// Populate the arrays to merge with 10,000,000 elements.
for (let i = 0; i < 10000000; ++i) {
  array1.push(i);
  array2.push(i);
}
// The spread syntax performance test.
for (let i = 0; i < mergeCount; ++i) {
  const startTime = performance.now();
  const array3 = [...array1, ...array2];
  spreadTime += performance.now() - startTime;
}
// The concat performance test.
for (let i = 0; i < mergeCount; ++i) {
  const startTime = performance.now();
  const array3 = array1.concat(array2);
  concatTime += performance.now() - startTime;
}
console.log(spreadTime / mergeCount);
console.log(concatTime / mergeCount);
One difference I think is valid: using spread to pass a large array as function arguments (as in push(...someArray) below) will give you a "Maximum call stack size exceeded" error, which you can avoid by using concat:
var someArray = new Array(600000);
var newArray = [];
var tempArray = [];
someArray.fill("foo");
try {
  newArray.push(...someArray);
} catch (e) {
  console.log("Using spread operator:", e.message);
}
tempArray = newArray.concat(someArray);
console.log("Using concat function:", tempArray.length)
There is one very important difference between concat and push: the former does not mutate the underlying array, requiring you to assign the result to the same or a different variable:
let things = ['a', 'b', 'c'];
let moreThings = ['d', 'e'];
things.concat(moreThings);
console.log(things); // [ 'a', 'b', 'c' ]
things.push(...moreThings);
console.log(things); // [ 'a', 'b', 'c', 'd', 'e' ]
I've seen bugs caused by the assumption that concat changes the array (talking for a friend ;).
Update:
Concat is now consistently faster than spread in this benchmark, which joins both small and large arrays: https://jsbench.me/nyla6xchf4/1
// preparation
const a = Array.from({length: 1000}).map((_, i)=>`${i}`);
const b = Array.from({length: 2000}).map((_, i)=>`${i}`);
const aSmall = ['a', 'b', 'c', 'd'];
const bSmall = ['e', 'f', 'g', 'h', 'i'];
const c = [...a, ...b];
// vs
const c = a.concat(b);
const c = [...aSmall, ...bSmall];
// vs
const c = aSmall.concat(bSmall)
Previous:
Although some of the replies are correct when it comes to performance on big arrays, the performance is quite different when you are dealing with small arrays.
You can check the results for yourself at https://jsperf.com/spread-vs-concat-size-agnostic.
As you can see, spread is 50% faster for smaller arrays, while concat is multiple times faster on large arrays.
The answer by @georg was helpful for seeing the comparison. I was also curious how .flat() would compare, and it was by far the worst. Don't use .flat() if speed is a priority. (Something I wasn't aware of until now.)
let big = new Array(1e5).fill(99);
let i, x;
console.time("concat-big");
for (i = 0; i < 1e2; i++) x = [].concat(big);
console.timeEnd("concat-big");
console.time("spread-big");
for (i = 0; i < 1e2; i++) x = [...big];
console.timeEnd("spread-big");
console.time("flat-big");
for (i = 0; i < 1e2; i++) x = [[], big].flat();
console.timeEnd("flat-big");
let a = new Array(1e3).fill(99);
let b = new Array(1e3).fill(99);
let c = new Array(1e3).fill(99);
let d = new Array(1e3).fill(99);
console.time("concat-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3].concat(a, b, c, d);
console.timeEnd("concat-many");
console.time("spread-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3, ...a, ...b, ...c, ...d];
console.timeEnd("spread-many");
console.time("flat-many");
for (i = 0; i < 1e2; i++) x = [1, 2, 3, a, b, c, d].flat();
console.timeEnd("flat-many");

Better way to check if an element only exists in one array

I need help with creating a function to return the elements that are only present in one of 3 arrays, for example
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
In the three arrays above, 'd' and 'f' are each found in only one of the arrays ('d' in arr2, 'f' in arr3); I need to return them:
['d','f']
The arrays can be of different sizes and the returned elements must not be duplicated.
I tried to find better alternatives, but I failed and just went with the brute-force approach: looping through each array and checking whether the element exists in the other two arrays. But obviously it's really slow and hard to read.
function elementsInOnlyOneArr(a1, a2, a3) {
  let myArr = [];
  for (let el of a1) {
    if (a2.includes(el) == false && a3.includes(el) == false && myArr.includes(el) == false) {
      myArr.push(el);
    }
  }
  for (let el of a2) {
    if (a1.includes(el) == false && a3.includes(el) == false && myArr.includes(el) == false) {
      myArr.push(el);
    }
  }
  for (let el of a3) {
    if (a2.includes(el) == false && a1.includes(el) == false && myArr.includes(el) == false) {
      myArr.push(el);
    }
  }
  return myArr;
}
Assuming there are fewer than 32 arrays, you can do this efficiently with bitmaps. Basically, build an index key -> number where the number has the Nth bit set if the key is in the Nth array. Finally, return the keys whose numbers have only a single bit set (= are powers of two):
function difference(...arrays) {
  let items = {}
  for (let [n, a] of arrays.entries())
    for (let x of a) {
      items[x] = (items[x] ?? 0) | (1 << n)
    }
  return Object.keys(items).filter(x =>
    Number.isInteger(Math.log2(items[x])))
}
let arr1 = ['a', 'b', 'c', 'a', 'b', 'z', 'z', 'z']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
console.log(difference(arr1, arr2, arr3))
(As noted in the comments, (x & (x - 1)) === 0 would be more idiomatic to check whether x is a power of two. See How does the formula x & (x - 1) work? for explanations.)
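For illustration, here is the same filter with that bit trick swapped in (my edit, not part of the original answer). The extra parentheses matter because === binds tighter than & in JavaScript:
function difference(...arrays) {
  let items = {}
  for (let [n, a] of arrays.entries())
    for (let x of a) {
      items[x] = (items[x] ?? 0) | (1 << n)
    }
  // a nonzero bitmap is a power of two iff clearing its lowest set bit leaves 0
  return Object.keys(items).filter(x => (items[x] & (items[x] - 1)) === 0)
}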
Here's a more general approach that doesn't limit the number of arrays and doesn't require keys to be strings:
function difference(...arrays) {
  let items = new Map
  for (let [n, a] of arrays.entries())
    for (let x of a) {
      if (!items.has(x))
        items.set(x, new Set)
      items.get(x).add(n)
    }
  let result = []
  for (let [x, ns] of items)
    if (ns.size === 1)
      result.push(x)
  return result
}
let arr1 = ['a', 'b', 'c', 'a', 'b', 'z', 'z', 'z']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
console.log(difference(arr1, arr2, arr3))
EDIT: I misunderstood the OP; it's not an intersect but extracting the values that are unique to an individual array (i.e. NOT the intersection). For that, this might work:
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
const thereCanOnlyBeOne = function(...arrs) {
  return Array.from(
    arrs.reduce((map, arr) => {
      new Set(arr).forEach((v) => map.set(v, map.has(v) ? map.get(v) + 1 : 1));
      return map;
    }, new Map())
  )
    .filter(([value, count]) => count === 1)
    .map(([value, count]) => value);
};
console.log(thereCanOnlyBeOne(arr1, arr2, arr3));
I would think @gog's answer is more sophisticated and probably much faster, but I had a slightly hard time wrapping my head around it (EDIT: I had to do some research and read up on bitsets), so here's the breakdown of the slightly convoluted way of doing this with a Map and array methods:
Pass all arrays to be analyzed into the function; order doesn't matter.
Loop (I chose reduce, but any loop structure works) through all input arrays and their values, counting occurrences in the Map. At the end, the Map will look as follows:
0: {"a" => 4}
1: {"b" => 3}
2: {"c" => 3}
3: {"d" => 1}
4: {"f" => 1}
Once done with that, we convert the Map back into an array via Array.from() creating an array of tuples:
[
["a", 4],
["b", 3],
["c", 3],
["d", 1],
["f", 1],
]
Filter that resulting array of tuples (now in the form of [<value>, <count>]) to keep only values that occurred exactly once, leaving us with:
[
["d", 1],
["f", 1],
]
Map over the filtered array to "dumb" it down into a one-dimensional array again and return the result:
["d", "f"]
WARNING: Internally this code does a ****load of loops, so call it a brute-force loop as well, it just looks "shorter" due to "sexy" ES6 array-syntax-sugar.
For completeness, a slightly modified version: the Array.filter() step can be omitted (although filter seems to be faster) by iterating the counter Map once it's finalized and simply deleting entries whose value is not 1.
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
const thereCanOnlyBeOne = function(...arrs) {
  let result;
  arrs
    .reduce((map, arr) => {
      new Set(arr).forEach((v) => map.set(v, map.has(v) ? map.get(v) + 1 : 1));
      return map;
    }, new Map())
    // the result of .reduce will be a Map!
    .forEach((value, key, map) => { value !== 1 && map.delete(key); result = map; });
  return Array.from(result).map(([value, count]) => value);
};
console.log(thereCanOnlyBeOne(arr1, arr2, arr3));
UPDATE: as @Nick Parsons pointed out, the previous version of the code would not output elements that were present in only one array but appeared there multiple times:
This will produce an incorrect output if one array contains the same value multiple times and that element isn't present in any other array. E.g., if you remove b from arr2, then only arr1 has b in it, so b should be included in the final result.
This can easily be solved by turning each array being checked into a Set() (thereby reducing its values to unique ones).
If anyone (besides me) wonders, here's a benchmark comparing @gog's options and mine. The bitset approach is clearly the fastest, so if you are comparing fewer than 32 arrays, that's by far the most performant solution: https://jsben.ch/YkKSu
And if anyone prefers an ES6-ified version of @gog's bitset implementation (improved by @ralphmerridew's suggestion), here you go:
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
function onlyone(...arrays) {
  return Object.entries(
    arrays.reduce((map, arr, n) => {
      arr.forEach((v) => map[v] = (map[v] ?? 0) | (1 << n));
      return map;
    }, {})
  )
    .filter(([value, bitmap]) => (bitmap & (bitmap - 1)) == 0)
    .map(([value, bitmap]) => value);
}
console.log(onlyone(arr1, arr2, arr3));
I updated the benchmark with this as well. Interestingly (or unexpectedly), this "slower"-looking ES6 implementation beats @gog's for-loop implementation by a tad. I tested in Chrome and Firefox multiple times, as I couldn't believe it myself; I'd have thought those syntax-sugar methods would be slightly slower than for loops. Well... good to know =)
I also tried implementing the bitset approach with BigInt() to remove the 32-array limit (depending on the engine, BigInt should allow somewhere between 1 million and 1 billion arrays), but unfortunately that seems to make it the slowest of all the solutions (benchmark updated):
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
function onlyoneBigInt(...arrays) {
  return Object.entries(
    arrays.reduce((map, arr, n) => {
      arr.forEach((v) => map[v] = (map[v] ?? 0n) | (1n << BigInt(n)));
      return map;
    }, {})
  )
    .filter(([value, bitmap]) => (bitmap & (bitmap - 1n)) == 0)
    .map(([value, bitmap]) => value);
}
console.log(onlyoneBigInt(arr1, arr2, arr3));
Maybe someone sees something that can be improved to make this faster?
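One possibility (a sketch of my own, not benchmarked): use a Map as the accumulator so each BigInt bitmap is stored against the value directly, instead of going through object property keys:
function onlyoneBigIntMap(...arrays) {
  const map = new Map();
  arrays.forEach((arr, n) => {
    // OR the n-th bit into the BigInt bitmap for each value
    for (const v of arr) map.set(v, (map.get(v) ?? 0n) | (1n << BigInt(n)));
  });
  return [...map]
    .filter(([, bits]) => (bits & (bits - 1n)) === 0n)
    .map(([v]) => v);
}
console.log(onlyoneBigIntMap(arr1, arr2, arr3)); // [ 'd', 'f' ]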
This is really just Set operations. The singles method below finds any entry in a test array that does not appear in the other arrays of the collection. I'm deliberately implementing it so you can test individual arrays, since it's not clear from the question whether you need to return the letters or the arrays.
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
// The set of arrays
let arrays = [ arr1, arr2, arr3 ]
// Finds any entries in the test array that don't appear in the other arrays
let singles = (test) => {
  // others is the Set of all values in the other arrays
  let others = arrays.reduce((accum, elem) => {
    if (elem != test) { elem.forEach(accum.add, accum) }
    return accum
  }, new Set())
  // find anything in the test array that the others do not have
  return [...new Set(test.filter(value => !others.has(value)))]
}
// collect results from testing all arrays
let result = []
for (const array of arrays) {
  result.push(...singles(array))
}
console.log(result)
Borrowing the parameter construction from @gog's excellent answer, you could also define it so that it takes a test array and an arbitrary collection of arrays to test against:
let singles = (test, ...arrays) => {
  // others is the Set of all values in the other arrays
  let others = arrays.reduce((accum, elem) => {
    if (elem != test) { elem.forEach(accum.add, accum) }
    return accum
  }, new Set())
  // find anything in the test array that the others do not have
  return [...new Set(test.filter(value => !others.has(value)))]
}
console.log(singles(arr2, arr1, arr2, arr3))
The advantage here is that this should work with any number of arrays, while @gog's answer is probably faster for a collection of fewer than 32 arrays (or technically any number, if you were willing to extend it using BigInt, though that may lose some of the speed).
A fairly simple approach:
const inOnlyOne = (
xss,
keys = [... new Set (xss .flat ())],
uniques = xss .map (xs => new Set (xs))
) => keys .filter (k => uniques .filter (f => f .has (k)) .length == 1)
console .log (inOnlyOne ([['a', 'b', 'c', 'a', 'b'], ['a', 'd', 'b', 'c'], ['f', 'c', 'a']]))
We find the list of unique keys by flattening our array of arrays and converting it into a Set and then back into an array. We also convert each input array into a Set, then filter the keys to keep only those that appear in exactly one of the Sets.
There is a little inefficiency here in that we check all the Sets when testing each key. It would be easy enough to modify this to stop as soon as a second matching Set is found, but the code would be more complex; I would only bother if this simple version proved not performant enough for my needs.
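For what it's worth, here is a sketch of that early-exit variant (my own naming and code): it stops scanning a key's Sets as soon as a second match is found:
const inOnlyOneEarlyExit = (xss) => {
  const uniques = xss.map(xs => new Set(xs));
  const keys = [...new Set(xss.flat())];
  return keys.filter(k => {
    let count = 0;
    for (const s of uniques) {
      if (s.has(k) && ++count > 1) return false; // bail out at the second Set
    }
    return count === 1;
  });
};
inOnlyOneEarlyExit([['a', 'b', 'c', 'a', 'b'], ['a', 'd', 'b', 'c'], ['f', 'c', 'a']]) //=> ['d', 'f']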
One advantage of this approach is that it works for other data types than strings and numbers:
const a = {a: 1}, b = {b: 3}, c = {c: 3}, d = {d: 4}, e = {e: 5}, f = {f: 6}
inOnlyOne ([[a, b, c, a, b], [a, d, b, c], [f, c, a]])
//=> [{d: 4}, {f: 6}]
Of course that only helps if your items are shared references. If you wanted to use value equality rather than reference equality, it would be significantly more complex.
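As a rough illustration only, one common workaround is to key on a serialized form. This hypothetical variant assumes the items are JSON-serializable and that structurally equal items stringify identically (i.e. stable key order):
const inOnlyOneByValue = (xss) => {
  const counts = new Map();
  for (const xs of xss) {
    // serialize each item so structurally equal objects collide on the same key
    const seen = new Set(xs.map(x => JSON.stringify(x)));
    for (const k of seen) counts.set(k, (counts.get(k) ?? 0) + 1);
  }
  return [...counts].filter(([, n]) => n === 1).map(([k]) => JSON.parse(k));
};
inOnlyOneByValue([[{a: 1}, {b: 3}], [{a: 1}, {d: 4}]]) //=> [{b: 3}, {d: 4}]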
If we wanted to pass the arrays individually, rather than wrap them in a common array, this variant should work:
const inOnlyOne = (...xss) => ((
keys = [... new Set (xss .flat ())],
uniques = xss .map (xs => new Set (xs))
) => keys .filter (k => uniques .filter (f => f .has (k)) .length == 1)
) ()
The Array.prototype.includes() method seems like the way to go here.
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a', 'f']
var arrays = [arr1,arr2,arr3];
const items = arr1.concat(arr2, arr3);
let results = [];
items.forEach(isInOneArray);
function isInOneArray(item) {
  let found = 0;
  for (const arr of arrays) {
    if (arr.includes(item)) {
      found++;
    }
  }
  if (found === 1 && !results.includes(item)) {
    results.push(item);
  }
}
console.log(results);
This is a brute-force iterator much like your own, but it reduces the number of re-checks by removing items from the arrays as matches are found:
function elementsInOnlyOneArr(...arrays) {
  // de-dup and sort so we process the longest array first
  let sortedArrays = arrays.map(ar => [...new Set(ar)]).sort((a, b) => b.length - a.length);
  for (let ai1 = 0; ai1 < sortedArrays.length; ai1++) {
    for (let i = sortedArrays[ai1].length - 1; i >= 0; i--) {
      let exists = false;
      let val = sortedArrays[ai1][i];
      for (let ai2 = ai1 + 1; ai2 < sortedArrays.length; ai2++) {
        let foundIndex = sortedArrays[ai2].indexOf(val);
        if (foundIndex >= 0) {
          exists = true;
          sortedArrays[ai2].splice(foundIndex, 1);
          // do not break; check for a match in the other arrays
        }
      }
      // if there was a match in any of the other arrays, remove it from the first one too!
      if (exists)
        sortedArrays[ai1].splice(i, 1);
    }
  }
  // concat the remaining elements; they are all unique
  let output = sortedArrays[0];
  for (let i = 1; i < sortedArrays.length; i++)
    output = output.concat(sortedArrays[i]);
  return output;
}
let arr1 = ['a', 'b', 'c', 'a', 'b']
let arr2 = ['a', 'd', 'b', 'c']
let arr3 = ['f', 'c', 'a']
console.log(elementsInOnlyOneArr(arr1,arr2,arr3));
See this fiddle: https://jsfiddle.net/4deq7xwm/
Updated - Use splice() instead of pop()
Create a collection of pairs (x, y), where x is an element (in your case, a string) and y identifies the array it comes from. Sort this collection by x, which takes O(n log n) time (where n is the total number of items over all arrays). It is then easy to iterate over the result and detect the desired items: an element belongs to exactly one array precisely when all pairs in its run of equal x values share the same y.
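A minimal sketch of this idea (the names are mine; the answer itself gives no code):
function inExactlyOneArray(...arrays) {
  // pairs of [element, index of the array it came from]
  const pairs = arrays.flatMap((arr, y) => arr.map(x => [x, y]));
  pairs.sort((p, q) => (p[0] < q[0] ? -1 : p[0] > q[0] ? 1 : 0));
  const result = [];
  let i = 0;
  while (i < pairs.length) {
    // scan the run of pairs sharing the same element
    let j = i, sameArray = true;
    while (j < pairs.length && pairs[j][0] === pairs[i][0]) {
      if (pairs[j][1] !== pairs[i][1]) sameArray = false;
      j++;
    }
    if (sameArray) result.push(pairs[i][0]); // every occurrence came from one array
    i = j;
  }
  return result;
}
console.log(inExactlyOneArray(['a', 'b', 'c', 'a', 'b'], ['a', 'd', 'b', 'c'], ['f', 'c', 'a'])); // [ 'd', 'f' ]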
This is easily solved with the built-in .lastIndexOf() Array method (note that a value appearing multiple times within a single array, and nowhere else, is excluded too, since occurrences are counted across the flattened whole):
const arr1 = ['a', 'b', 'c', 'a', 'b'];
const arr2 = ['a', 'd', 'b', 'c'];
const arr3 = ['f', 'c', 'a'];
function getValuesInOneArray(...arrays) {
  const combinedArr = arrays.flat();
  const result = [];
  for (const value of combinedArr) {
    if (combinedArr.indexOf(value) === combinedArr.lastIndexOf(value)) {
      result.push(value);
    }
  }
  return result;
}
getValuesInOneArray(arr1, arr2, arr3); // ['d', 'f']
I generally try to avoid "ninja code" for the benefit of maintainability and readability, but I couldn't resist rewriting the above getValuesInOneArray() function as a slicker arrow function.
const getValuesInOneArray = (...arrays) =>
arrays
.flat()
.filter(
(value, index, array) => array.indexOf(value) === array.lastIndexOf(value)
);
You can read more about "ninja code" (and why you should avoid it) on JavaScript.info, but I recommend avoiding practices like this in production codebases.
Hope this helps.
function elementsInOnlyOneArr(arr1, arr2, arr3) {
  let arr = arr1.concat(arr2).concat(arr3);
  return removeDuplicate(arr);
}
function removeDuplicate(arr) {
  for (const each of arr) {
    let count = 0;
    for (const ch of arr) {
      if (each === ch) {
        count++;
        if (count > 1) {
          // remove every element that exists more than once, then recurse
          arr = arr.filter(item => item !== each);
          return removeDuplicate(arr);
        }
      }
    }
  }
  return arr;
}
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];
console.log(elementsInOnlyOneArr(arr1, arr2, arr3));
Do a diff of each of the arrays against the others and concat the results to get the values unique to any one of the arrays:
const arr1 = ['a', 'b', 'c', 'a', 'b'];
const arr2 = ['a', 'd', 'b', 'c'];
const arr3 = ['f', 'c', 'a'];
function diff(a1, a2, a3) {
  let u1 = a1.filter(el => !a2.includes(el))
             .filter(el => !a3.includes(el));
  let u2 = a2.filter(el => !a1.includes(el))
             .filter(el => !a3.includes(el));
  let u3 = a3.filter(el => !a2.includes(el))
             .filter(el => !a1.includes(el));
  return u1.concat(u2).concat(u3);
}
/* diff them */
const adiff = diff(arr1, arr2, arr3);
console.log(adiff);

Count items in one array and return the item if its number of instances matches the length of a different array

I have two arrays. I'd like to return the items from array1 if they appear as many times as the length of array2.
I understand how to do this in Python, but I can't figure out how to do it in JavaScript.
arr1 = ['a', 'b', 'c', 'c']
arr2 = ['one', 'two']
arr3 = []
for i in arr1:
    if arr1.count(i) == len(arr2):
        arr3.append(i)
The desired result would be ['c'].
Can someone please help me write this in JavaScript?
You can use reduce and filter.
The idea is:
First, use the elements of arr1 as keys on obj: if a key is already there, increment its value; otherwise initialize it and then increment.
Then take the keys of obj and filter for those whose value equals the length of arr2.
let arr1 = ['a', 'b', 'c', 'c']
let arr2 = ['one', 'two']
let obj = arr1.reduce((op, inp) => {
  let key = inp.toLowerCase()
  op[key] = op[key] || 0
  op[key]++
  return op
}, {})
let final = Object.keys(obj).filter(key => {
  return obj[key] === arr2.length
})
console.log(final)
Note: here I ignored case. If you want different cases to be treated as distinct, you can remove this line:
let key = inp.toLowerCase()
Nested loops can do that: one for iterating the unique values, and one for counting:
var arr1 = ['a', 'b', 'c', 'c'];
var arr2 = ['one', 'two'];
var arr3 = [];
var unique = new Set(arr1);
var len2 = arr2.length;
for (var i of unique) {
  var count = 0;
  for (var j of arr1)
    if (j === i)
      count++;
  if (count == len2)
    arr3.push(i);
}
console.log(arr3);
Side remark: I think that Python code would result in ['c','c'], though I have not actually tried.
First use reduce() to create an object whose keys are the elements of arr1 and whose values are their counts. Then filter() its keys according to the given condition.
let arr1 = ['a', 'b', 'c', 'c']
let arr2 = ['one', 'two']
const obj = arr1.reduce((ac,a) => (ac[a] = ac[a] + 1 || 1, ac),{});
let res = Object.keys(obj).filter(k => obj[k] === arr2.length);
console.log(res)

Merge two multidimensional arrays without duplicates by datetime

I wish to merge two arrays into one, but the datetime must not be duplicated: if both arrays contain a row with the same datetime, only one of them may make it into the result.
In each array, the datetime is never repeated.
Both arrays have the same structure, with the exact same positions.
Each array has 60+ sub-arrays.
example:
Array1 = [['a','b','0000-00-00 00:00'], ['c','d','0000-00-00 00:59'], ['e','f','0000-00-00 00:10']];
Array2 = [['z','x','0000-00-00 00:00'], ['h','s','0000-00-00 00:49'], ['e','f','0000-00-00 00:20']];
Array12 = [['a','b','0000-00-00 00:00'], ['c','d','0000-00-00 00:59'], ['e','f','0000-00-00 00:10'], ['h','s','0000-00-00 00:49'], ['e','f','0000-00-00 00:20']];
How can I make this work? I tried a lot of functions, but I can't get this working.
Thanks.
If I'm correct, you are trying to merge the arrays based on timestamps. Try out this fiddle:
var Array1 = [
['a', 'b', '0000-00-00 00:00'],
['c', 'd', '0000-00-00 00:59'],
['e', 'f', '0000-00-00 00:10']
];
var Array2 = [
['z', 'x', '0000-00-00 00:00'],
['h', 's', '0000-00-00 00:49'],
['e', 'f', '0000-00-00 00:20']
];
function mergeArrays(arr1, arr2) {
  var merger = {};
  for (var i = 0; i < arr1.length; i++) {
    merger[arr1[i][2]] = [arr1[i][0], arr1[i][1], arr1[i][2]];
  }
  for (var i = 0; i < arr2.length; i++) {
    if (!(arr2[i][2] in merger)) {
      merger[arr2[i][2]] = [arr2[i][0], arr2[i][1], arr2[i][2]];
    }
  }
  var output = [];
  for (var key in merger) {
    output.push(merger[key]);
  }
  return output;
}
var result = mergeArrays(Array1, Array2);
console.log(result);
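For what it's worth, the same first-writer-wins logic can be written more compactly with a Map keyed on the datetime; a sketch (mine, not part of the original answer):
function mergeArraysByDatetime(arr1, arr2) {
  const byTime = new Map();
  // rows from arr1 are inserted first, so they win on datetime collisions
  for (const row of [...arr1, ...arr2]) {
    if (!byTime.has(row[2])) byTime.set(row[2], row);
  }
  return [...byTime.values()];
}
console.log(mergeArraysByDatetime(Array1, Array2));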

Merge Two Arrays so that the Values Alternate

I'm looking for a jQuery method to merge two arrays so that their values alternate:
var array1 = [1,2,3,4,5];
var array2 = ['a', 'b', 'c', 'd', 'e'];
The result I want is:
var arrayCombined = [1, 'a', 2, 'b', 3, 'c', 4, 'd', 5, 'e'];
Please note that I know it is trivial to do this in plain JS; however, I am after a jQuery method that will do this.
You can use the map method:
var array1 = [1, 2, 3, 4, 5];
var array2 = ['a', 'b', 'c', 'd', 'e'];
var arrayCombined = $.map(array1, function(v, i) {
  return [v, array2[i]];
});
console.log(arrayCombined);
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
Demo: http://jsfiddle.net/Guffa/hmUy6/
If you must use jQuery, you can take advantage of their broken $.map implementation.
var result = $.map(array1, function(v, i) {
  return [v, array2[i]];
});
jQuery's $.map flattens the returned array, giving you the result you want.
DEMO: http://jsfiddle.net/8rn2w/
Pure JS solution:
var result = array1.reduce(function(arr, v, i) {
  return arr.concat(v, array2[i]);
}, []);
DEMO: http://jsfiddle.net/8rn2w/1/
Just another solution using Array.prototype.flat() and Array.prototype.map().
var array1 = [1, 2, 3, 4, 5];
var array2 = ['a', 'b', 'c', 'd', 'e'];
var result = array1.map(
  (element, index) => [element, array2[index]]
).flat();
console.log(result);
For those who arrive here by search engine and want a Lodash option:
_.compact(_.flatten(_.zip(array1, array2)))
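One caveat: _.compact removes every falsy value (false, 0, '', null, undefined, NaN), not just the undefined padding that _.zip adds for uneven lengths. A sketch of a variant that only drops the padding, assuming Lodash 4 is loaded as _ (it would still drop genuine undefined entries):
const nums = [0, 1, 2]; // contains a falsy value we want to keep
const chars = ['a', 'b', 'c', 'd'];
// drop only the undefined slots _.zip added for the length mismatch
const interleaved = _.flatten(_.zip(nums, chars)).filter(v => v !== undefined);
console.log(interleaved); // [0, 'a', 1, 'b', 2, 'c', 'd']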
Try something like this:
function merge(array1, array2) {
  if (array1.length == array2.length) {
    var c = [];
    for (var i = 0; i < array1.length; i++) {
      // push both values flat (not as a nested pair) so the result alternates
      c.push(array1[i], array2[i]);
    }
    return c;
  }
  return null;
}
I came here curious to see if there was a new Array.prototype method that was helpful for this, and flatMap is that... kinda. But I'm going to leave with the simplest, most obvious solution (and a few tears that in 10 years no one else used a for loop to build a single array).
Plain, old JavaScript
If you were merging bits of something like stereo channels, or template strings, etc, this is all you need:
// Supports the two most common cases:
// - interleaving left and right audio channels
// - interleaving template string and values
// (right is assumed to be the same length as left, or one shorter)
function interleave(left, right) {
  var both = [];
  var i;
  for (i = 0; i < left.length; i += 1) {
    both.push(left[i]);
    // because a template string will have one fewer value
    // than it will have string parts
    if (i < right.length) {
      both.push(right[i]);
    }
  }
  return both;
}
}
// a template string will always have one extra item
var abc = ['a', 'b', 'c', 'd' ];
var one = [1, 2, 3 ];
var both = interleave(abc, one);
console.log(both);
// ['a', 1, 'b', 2, 'c', 3, 'd']
flatMap
As mentioned in a comment, this is probably suitable:
function cleverInterleave(abc, one) {
  return abc.flatMap(function (val, i) {
    // check the index, not truthiness, so falsy values like 0 survive
    return i < one.length ? [val, one[i]] : [val];
  });
}
var abc = ['a', 'b', 'c', 'd'];
var one = [1, 2, 3];
console.log(cleverInterleave(abc, one));
// ['a', 1, 'b', 2, 'c', 3, 'd']
However, this seems a little too "clever" and not as intuitive. Also not as efficient... but readability counts far more than efficiency for most code, so... two strikes against.
mo' channels, mo' problems
You could also create a more generic form of this to handle arbitrary numbers of arrays, or zip up arrays of uneven lengths (sketched below).
However, I'd recommend using a similar pattern, but tuned to the actual number of arrays that you need - such as 3 for RGB, or 4 for RGBA, etc.
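As a sketch of that more generic form (my own code, shown for illustration): round-robin over any number of arrays, skipping the ones that have run out:
function interleaveAll(...arrays) {
  var result = [];
  var longest = Math.max(...arrays.map(function (a) { return a.length; }));
  for (var i = 0; i < longest; i += 1) {
    for (var j = 0; j < arrays.length; j += 1) {
      // skip arrays that have already run out of values
      if (i < arrays[j].length) {
        result.push(arrays[j][i]);
      }
    }
  }
  return result;
}
console.log(interleaveAll(['a', 'b', 'c', 'd'], [1, 2, 3], ['x']));
// ['a', 1, 'x', 'b', 2, 'c', 3, 'd']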
