Appropriate JS data structure for linking three arrays - javascript

I have three arrays that I need to link to each other in this way:
arr1 = ['A', 'A', 'B', 'B', 'C', 'C', 'A', 'C']
arr2 = ['a', 'aa', 'b', 'bb', 'c', 'cc', 'aaa', 'ccc']
arr3 = [1, 2, 3, 4, 5, 6, 7, 8]
I want these arrays to be linked like this: [['A', ['a', 1], ['aa', 2], ['aaa', 7]], ['B', ['b', 3], ['bb', 4]], ['C', ['c', 5], ['cc', 6], ['ccc', 8]]]
I could create a 2D array like this, but I feel it won't be an efficient data structure. Moreover, I will be using HTML to display content from this data structure, and the 2D array could get complicated. I was thinking of using a Map, but I'm not sure whether it supports arrays as values for its keys. Any ideas/suggestions on how I can achieve this?
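For what it's worth, a Map does support arrays (or any value) as values for its keys; a minimal hand-built sketch of what that could look like for this data:

// Minimal sketch only; a Map happily stores arrays as values.
const grouped = new Map([
  ['A', [['a', 1], ['aa', 2], ['aaa', 7]]],
  ['B', [['b', 3], ['bb', 4]]],
  ['C', [['c', 5], ['cc', 6], ['ccc', 8]]],
])
console.log(grouped.get('B'))  //=> [['b', 3], ['bb', 4]]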

Update 2
With this version requiring only shared indices in the three arrays, plus some grouping of the results, it's a bit easier:
const link = (a1, a2, a3) =>
  [... new Set (a1)] .map ((a) => [
    a,
    ...Object.keys(a1) .filter (k => a1[k] == a) .map (k => [a2 [k], a3 [k]])
  ])
const arr1 = ['A', 'A', 'B', 'B', 'C', 'C', 'A', 'C']
const arr2 = ['a', 'aa', 'b', 'bb', 'c', 'cc', 'aaa', 'ccc']
const arr3 = [1, 2, 3, 4, 5, 6, 7, 8]
console .log (link (arr1, arr2, arr3))
I would still strongly recommend that you rethink your data structures. Shared array indices are a pretty nasty way to deal with data.
I would find this much, much cleaner:
[
  {foo: 'A', bars: [{baz: 'a', qux: 1}, {baz: 'aa', qux: 2}, {baz: 'aaa', qux: 7}]},
  {foo: 'B', bars: [{baz: 'b', qux: 3}, {baz: 'bb', qux: 4}]},
  {foo: 'C', bars: [{baz: 'c', qux: 5}, {baz: 'cc', qux: 6}, {baz: 'ccc', qux: 8}]}
]
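If you do end up wanting that object shape, here is a minimal sketch that reshapes the output of link above into it (toObjects is a hypothetical helper, not part of the answer; foo/bars/baz/qux are the same placeholder names as above):

// Hypothetical helper: reshape [['A', ['a', 1], ...], ...] into the object form shown above
const toObjects = (linked) =>
  linked.map(([foo, ...pairs]) => ({
    foo,
    bars: pairs.map(([baz, qux]) => ({ baz, qux }))
  }))

console.log(toObjects(link(arr1, arr2, arr3)))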
Explanation
A comment asked for an explanation of this code. Here's an attempt.
const link = (a1, a2, a3) =>
  [... new Set (a1)] .map ((a) => [
    a,
    ...Object.keys(a1) .filter (k => a1[k] == a) .map (k => [a2 [k], a3 [k]])
  ])
We start by defining link as a function of three parameters, a1, a2, and a3. With realistic data you would likely give these more meaningful names. Our goal is to pair up the values of the three arrays by their shared indices and then group the results based on the first value, which may appear multiple times in the first array.
We could have chosen to use a cross-product implementation to create an intermediate result like [['A', 'a', 1], ['A', 'aa', 2], ['B', 'b', 3], ...] and then do the grouping on those values.
Instead we choose to go a little more directly at this. Our arrow function contains only one expression, so there is no need for { } or a return statement. We first create an array of the unique members of a1, using what's probably the most common implementation of uniqueness: [... new Set (a1)]. Here new Set (a1) creates a Set, a container without duplicates. Spreading it with ... yields its elements, and wrapping that in [ ] makes them into an array. That will give us ['A', 'B', 'C'].
We want an output array containing one inner array for each of these, so we can use map, which, by applying a function to each element, turns one array into another of the same size. The function we pass to map takes a value from ['A', 'B', 'C'] and returns an array. The first element is that value, and the others are found by calling Object.keys on our array. (Logically, the array's own keys method makes more sense here, as it would give us numeric keys. But that would take one additional step, as we'd have to turn its iterator result into an array; and in this case the string keys will work just as well.) We filter these keys/indices to find those where the values in our original array match the current value. For 'A', this would be ['0', '1', '6'].
We map again, converting these indices into a pair of same-index values from the other two arrays. This is .map (k => [a2 [k], a3 [k]]). Finally, we apply a ... to spread the results into our array.
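To make those intermediate values concrete, here is what each step produces for the sample data, shown for the first group only:

console.log([... new Set (arr1)])                             //=> ['A', 'B', 'C']
console.log(Object.keys(arr1) .filter (k => arr1[k] == 'A'))  //=> ['0', '1', '6']
console.log(['0', '1', '6'] .map (k => [arr2[k], arr3[k]]))   //=> [['a', 1], ['aa', 2], ['aaa', 7]]
// ...and spreading that after 'A' gives ['A', ['a', 1], ['aa', 2], ['aaa', 7]]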
I hope that helps.
Updated Version
We have more information now. If I understand correctly, we will have some way to match a given element from the second array with one from the first array. Let's encode that in a function, which we can then pass to our generic solution.
Here I demonstrate with the sample data assuming a match is simply a letter match.
const link = (match) => (a1, a2, a3) =>
  a1 .map (x => [
    x,
    ...Object.keys(a2) .filter (k => match (x, a2[k]))
                       .map (k => [a2[k], a3[k]])
  ])
const arr1 = ['A', 'B', 'C']
const arr2 = ['a', 'aa', 'b', 'bb', 'c', 'cc', 'aaa', 'ccc']
const arr3 = [1, 2, 3, 4, 5, 6, 7, 8]
const matchingLetter = (a1Val, a2Val) =>
  a2Val .startsWith (a1Val .toLowerCase ())
console .log (link (matchingLetter) (arr1, arr2, arr3))
Original Version
As discussed in the comments, this strikes me as a bad idea, if your data sources don't force it upon you.
But if all you need to do is take one element from the first array and pair it with a linked set of two pairs from the next two arrays, then you could write something like this:
const link = (a1, a2, a3) =>
  a1 .map ((a, i) => [a, [a2[2 * i], a3[2 * i]], [a2[2 * i + 1], a3[2 * i + 1]]])
const arr1 = ['A', 'B', 'C']
const arr2 = ['a', 'aa', 'b', 'bb', 'c', 'cc']
const arr3 = [1, 2, 3, 4, 5, 6]
console .log (link (arr1, arr2, arr3))
If there is some other relationship between the linked sets than this little two-for-one mapping, then I think we need more information.

Related

How to add items of arrays to end of array?

There is a sourceArray and some additionalArray. I need to add the items from additionalArray to the end of sourceArray, so that as a result sourceArray contains all items (without creating a new array). The problem is that additionalArray may contain thousands of items.
// example
push([], [1, 2, 3], [10, 20, 30]) // [1, 2, 3, 10, 20, 30]
push(['a', 'b'], 'x', ['z', '0']) // ['a', 'b', 'x', 'z', '0']
// my solution
function push(sourceArray, ...additionalArray) {
  additionalArray.forEach((array) => {
    array = Array.isArray(array) ? array : [array];
    if (array.length < 1000) {
      sourceArray.push.apply(sourceArray, array);
    } else {
      array.forEach((item) => sourceArray.push(item));
    }
  });
  return sourceArray;
}
My question is: is there a more elegant solution for this task?
You might find using .flat() can help you here. If you use that on additionalArray, you can then spread ... those elements into a call to .push() as arguments:
const push = (source, ...rest) => {
  source.push(...rest.flat());
  return source;
}
console.log(push([], [1, 2, 3], [10, 20, 30])) // [1, 2, 3, 10, 20, 30]
console.log(push(['a', 'b'], 'x', ['z', '0'])) // ['a', 'b', 'x', 'z', '0']
This does have a limitation, though, in that .push() can only accept a certain number of arguments. Since your additionalArray can be large, you might hit the maximum argument limit and the call can throw. Using a for..of loop would help with that:
const push = (source, ...rest) => {
  for (const item of rest.flat())
    source.push(item)
  return source;
}
console.log(push([], [1, 2, 3], [10, 20, 30])) // [1, 2, 3, 10, 20, 30]
console.log(push(['a', 'b'], 'x', ['z', '0'])) // ['a', 'b', 'x', 'z', '0']
As per MDN, you can use the array spread syntax:
let vegetables = ['parsnip', 'potato']
let moreVegs = ['celery', 'beetroot']
// Merge the second array into the first one
vegetables.push(...moreVegs);
console.log(vegetables) // ['parsnip', 'potato', 'celery', 'beetroot']

Returning the array in an array of arrays with a value from another array

I have an array of arrays aa = [['a'], ['b'], ['c']] and I have an array a = ['a', 'b', 'c'].
I need to get the item in aa for each element in a, i.e. I want to list the elements in a with their respective arrays in aa. The result should be like:
a: ['a'] b: ['b'] c: ['c']
I tried this code, but it returns the first element in aa for each element in a.
I wonder what's wrong here.
const aa = [
  ['a'],
  ['b'],
  ['c']
]
const a = ['a', 'b', 'c']
let b = []
a.forEach((el) => {
  b.push(
    aa.filter((element) => {
      return element.includes(el)
    })
  )
})
console.log(b)
Try this
const aa = [
  ['a'],
  ['b'],
  ['c']
];
const a = ['a', 'b', 'c'];
let b = {};
a.forEach( // loop "a"
  aEl => b[aEl] = aa.filter( // filter "aa"
    aaEl => aaEl.includes(aEl) // on arrays that include the item from 'a'
  ).flat() // we need to flatten the resulting array before returning it
);
console.log(JSON.stringify(b)); // using stringify to make it readable
Since you want your output to be a key-value list (a: ['a']), variable b should be a map. Let's also rename b to out for readability.
out = {}
To get a better view of whether our code is working, let's use some unique test data, and let's rename a to keys and aa to values.
const keys = ['A', 'B', 'C']
const values = [
  ['A', 'A2', 'a3'],
  ['B1', 'B', 'b3'],
  ['C1', 'C2', 'C']
]
For every key in keys, we want to search for all arrays in values that contain the key. To set the search result on out we use bracket notation, like so:
keys.forEach((key) => {
  out[key] = values.filter(valueArr => valueArr.includes(key))
})
This outputs:
{
  "A": [["A", "A2", "a3"]],
  "B": [["B1", "B", "b3"]],
  "C": [["C1", "C2", "C"]]
}
Now there are two arrays around each value. This is because values.filter can return multiple arrays. To combine these into a single array you can use the flat() function. The whole code looks like:
const keys = ['A', 'B', 'C']
const values = [
  ['A', 'A2', 'a3'],
  ['B1', 'B', 'b3'],
  ['C1', 'C2', 'C']
]
out = {}
keys.forEach((key) => {
  out[key] = values.filter(valueArr => valueArr.includes(key)).flat()
})
console.log(out)
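A roughly equivalent sketch (not part of the answer above) that builds the same object in a single expression, assuming Object.fromEntries is available (ES2019+); grouped is just a new name used for this sketch:

// Same result as `out` above, built from [key, value] pairs.
const grouped = Object.fromEntries(
  keys.map(key => [key, values.filter(valueArr => valueArr.includes(key)).flat()])
)
console.log(grouped)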

Ramda, counting value frequency in array

I have the following simple array
['a', 'b', 'a', 'c', 'a', 'c', 'd', 'a']
How can Ramda help me in achieving the following
{a: 4, b: 1, c: 2, d: 1}
a:4 represents that value a exists 4 times in the main array
b:1 represents that value b exists 1 time in the main array
c:2 represents that value c exists 2 times in the main array
d:1 represents that value d exists 1 time in the main array
Use R.countBy with R.identity as the function that generates the keys:
const data = ['a', 'b', 'a', 'c', 'a', 'c', 'd', 'a']
const result = R.countBy(R.identity, data)
console.log(result)
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.26.1/ramda.js"></script>
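For comparison, and only as a sketch, the same count can be produced without Ramda using a plain reduce:

const data = ['a', 'b', 'a', 'c', 'a', 'c', 'd', 'a']
const result = data.reduce((acc, x) => {
  acc[x] = (acc[x] || 0) + 1   // increment the count for this value
  return acc
}, {})
console.log(result)  //=> { a: 4, b: 1, c: 2, d: 1 }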

Merge arrays and keep ordering

NOTE
The question has been edited following the good advice from @Kaddath to highlight the fact that the ordering doesn't have to be alphabetical but depends on the position of items inside the arrays.
I have an array of arrays where each of the arrays are based on a given ordering but they can differ a bit.
For example, the base ordering is X -> D -> H -> B and here is my array of arrays:
const arrays = [
  ['X', 'D', 'H', 'B'],
  ['X', 'D', 'K', 'Z', 'H', 'B', 'A'],
  ['X', 'M', 'D', 'H', 'B'],
  ['X', 'H', 'T'],
  ['X', 'D', 'H', 'B']
]
I would like to merge all arrays into a single one and remove duplicates but by keeping the ordering. In my example the result would be ['X', 'M', 'D', 'K', 'Z', 'H', 'T', 'B', 'A'].
In the example we can see that M is between X and D inside the third array and it is so placed between X and D in the final output.
I know conflicts may arise but here are the following rules:
Every item should appear in the final output.
If an item is appearing in multiple arrays at different positions, the first appearance is the right one (skip others).
What I've done so far is merging all of these arrays into a single one by using
const merged = [].concat.apply([], arrays);
(cf. https://stackoverflow.com/a/10865042/3520621).
And then getting unique values by using this code snippet from https://stackoverflow.com/a/1584377/3520621 :
Array.prototype.unique = function() {
  var a = this.concat();
  for (var i = 0; i < a.length; ++i) {
    for (var j = i + 1; j < a.length; ++j) {
      if (a[i] === a[j])
        a.splice(j--, 1);
    }
  }
  return a;
};
const finalArray = merged.unique();
But my result is this:
[
  "X",
  "D",
  "H",
  "B",
  "K",
  "Z",
  "A",
  "M",
  "T"
]
Any help is welcome!
Thanks.
const arrays = [
  ['X', 'D', 'H', 'B'],
  ['X', 'D', 'K', 'Z', 'H', 'B', 'A'],
  ['X', 'M', 'D', 'H', 'B'],
  ['X', 'H', 'T'],
  ['X', 'D', 'H', 'B']
];
const result = [];
arrays.forEach(array => {
  array.forEach((item, idx) => {
    // check if the item has already been added, if not, try to add
    if (!~result.indexOf(item)) {
      // if the item is not the first item, find the position of its left sibling in the result array
      if (idx) {
        const result_idx = result.indexOf(array[idx - 1]);
        // add item after left sibling position
        result.splice(result_idx + 1, 0, item);
        return;
      }
      result.push(item);
    }
  });
});
console.log('expected result', ['X', 'M', 'D', 'K', 'Z', 'H', 'T', 'B', 'A'].join(','));
console.log(' current result', result.join(','));
Each array is in fact a set of rules that tells us the relative order of its elements. The final list should contain all elements while respecting the relative order defined by all the rules.
Some solutions solved the initial request, and some didn't even solve that (all those that suggest using sort rather missed the point of the question). Nevertheless, none proposed a generic solution.
The problem
If we look at the problem asked in the OP, this is how the rules define the relative order between elements:
     M           K --> Z          T
    ^  \        ^       \        ^
   /    v      /         v      /
  X --> D -----------------> H --> B --> A
So, it is easy to see that our array starts with X. The next element can be either D or M. But D requires M to already be in the array. That is why we put M as our next element, and then D. Next, D points to both K and H. But since H has other predecessors that have not been collected yet, and K has none (actually it has D, but that is already collected in the list), we put K and Z, and only then H.
H points to both T and B. It actually doesn't matter which one we put first. So, the last three elements can be in any of the following three orders:
T, B, A
B, A, T
B, T, A
Let us also take into account a slightly more complex case. Here are the rules:
['10', '11', '12', '1', '2'],
['11', '12', '13', '2'],
['9', '13'],
['9', '10'],
If we draw the graph using those rules, we get the following:
    ----------------> 13 -------
   /                  ^         \
  /                  /           v
 9 --> 10 --> 11 --> 12 --> 1 --> 2
What is specific about this case? Two things:
Only in the last rule do we "find out" that the number 9 is the beginning of the array
There are two non-direct paths from 12 to 2 (one over the number 1, the second over the number 13).
Solution
My idea is to create a node for each element, and then use that node to keep track of all immediate successors and immediate predecessors. After that, we find all elements that don't have predecessors and start "collecting" results from there. If we come to a node that has multiple predecessors, some of which are not yet collected, we stop the recursion there. It can also happen that one of the successors has already been collected along some other path; we skip that successor.
function mergeAndMaintainRelativeOrder(arrays/*: string[][]*/)/*: string[]*/ {
  /*
  interface NodeElement {
    value: string;
    predecessor: Set<NodeElement>;
    successor: Set<NodeElement>;
    collected: boolean;
  }
  */
  const elements/*: { [key: string]: NodeElement }*/ = {};
  // For every element in all rules create a NodeElement that will
  // be used to keep track of immediate predecessors and successors
  arrays.flat().forEach(
    (value) =>
      (elements[value] = {
        value,
        predecessor: new Set/*<NodeElement>*/(),
        successor: new Set/*<NodeElement>*/(),
        // Used when we form the final array of results to indicate
        // that this node has already been collected in the final array
        collected: false,
      }),
  );
  arrays.forEach((list) => {
    for (let i = 0; i < list.length - 1; i += 1) {
      const node = elements[list[i]];
      const nextNode = elements[list[i + 1]];
      node.successor.add(nextNode);
      nextNode.predecessor.add(node);
    }
  });

  function addElementsInArray(head/*: NodeElement*/, array/*: string[]*/) {
    let areAllPredecessorsCollected = true;
    head.predecessor.forEach((element) => {
      if (!element.collected) {
        areAllPredecessorsCollected = false;
      }
    });
    if (!areAllPredecessorsCollected) {
      return;
    }
    array.push(head.value);
    head.collected = true;
    head.successor.forEach((element) => {
      if (!element.collected) {
        addElementsInArray(element, array);
      }
    });
  }

  const results/*: string[]*/ = [];
  Object.values(elements)
    .filter((element) => element.predecessor.size === 0)
    .forEach((head) => {
      addElementsInArray(head, results);
    });
  return results;
}

console.log(mergeAndMaintainRelativeOrder([
  ['X', 'D', 'H', 'B'],
  ['X', 'D', 'K', 'Z', 'H', 'B', 'A'],
  ['X', 'M', 'D', 'H', 'B'],
  ['X', 'H', 'T'],
  ['X', 'D', 'H', 'B'],
]));
console.log(mergeAndMaintainRelativeOrder([
  ['10', '11', '12', '1', '2'],
  ['11', '12', '13', '2'],
  ['9', '13'],
  ['9', '10'],
]));
Big O
If we say that n is the number of rules and m is the number of elements in each rule, the complexity of this algorithm is O(n*m). This takes into account that the JS Set implementation is near O(1).
Flatten, remove duplicates and sort could be simpler:
const arrays = [
  ['A', 'B', 'C', 'D'],
  ['A', 'B', 'B-bis', 'B-ter', 'C', 'D', 'D-bis'],
  ['A', 'A-bis', 'B', 'C', 'D'],
  ['A', 'C', 'E'],
  ['A', 'B', 'C', 'D'],
];
console.log(
  arrays
    .flat()
    .filter((u, i, all) => all.indexOf(u) === i)
    .sort((a, b) => a.localeCompare(b)),
);
Or even simpler, according to Mohammad Usman's now-deleted post:
const arrays = [
  ['A', 'B', 'C', 'D'],
  ['A', 'B', 'B-bis', 'B-ter', 'C', 'D', 'D-bis'],
  ['A', 'A-bis', 'B', 'C', 'D'],
  ['A', 'C', 'E'],
  ['A', 'B', 'C', 'D'],
];
console.log(
  [...new Set([].concat(...arrays))].sort((a, b) =>
    a.localeCompare(b),
  ),
);
You can use .concat() with Set to get the resultant array of unique values:
const data = [
  ['A', 'B', 'C', 'D'],
  ['A', 'B', 'B-bis', 'B-ter', 'C', 'D', 'D-bis'],
  ['A', 'A-bis', 'B', 'C', 'D'],
  ['A', 'C', 'E'],
  ['A', 'B', 'C', 'D']
];
const result = [...new Set([].concat(...data))].sort((a, b) => a.localeCompare(b));
console.log(result);
Create a single array using Array#concat, then use Set to get the unique values from it, and then sort the result.
const arrays = [ ['A', 'B', 'C', 'D'], ['A', 'B', 'B-bis', 'B-ter', 'C', 'D', 'D-bis'], ['A', 'A-bis', 'B', 'C', 'D'], ['A', 'C', 'E'], ['A', 'B', 'C', 'D'] ],
result = [...new Set([].concat(...arrays))].sort();
console.log(result);
merge: [].concat.apply([], arrays)
find uniq: [...new Set(merged)]
sort: .sort()
const arrays = [
  ['A', 'B', 'C', 'D'],
  ['A', 'B', 'B-bis', 'B-ter', 'C', 'D', 'D-bis'],
  ['A', 'A-bis', 'B', 'C', 'D'],
  ['A', 'C', 'E'],
  ['A', 'B', 'C', 'D']
];
let merged = [].concat.apply([], arrays); // merge array
let sort = [...new Set(merged)].sort(); // find uniq then sort
console.log(sort);
Fun problem to solve; I think I only partly succeeded.
I ignored the "underspecified" example of B -> A -> T vs T -> B -> A
It's very inefficient
Still posting because I think it might help you get things right. Here's my approach:
Step 1: create a naive index
We're creating an object that, for each unique element in the nested arrays, tracks which elements it has succeeded or preceded:
{
  "X": { prev: Set({}), next: Set({ "D", "H", "B", "K", "Z", "A", "M", "T" }) },
  "M": { prev: Set({ "X" }), next: Set({ "D", "H", "B" }) },
  // etc.
}
I named it "naive" because these Sets only contain information one level deep,
i.e. they only report relations between elements that were in the same array. They cannot see that M comes before K because those two were never in the same array.
Step 2: join the indexes recursively
This is where I ignored all big-O concerns one might have 😉. I merge the index recursively: the next of M is a join of the next of D, H, and B. Recurse until you find an element that has no next, i.e. T or A.
Step 3: create a sorter that respects the sort index:
const indexSorter = idx => (a, b) =>
  idx[a].next.has(b) || idx[b].prev.has(a) ? -1 :
  idx[a].prev.has(b) || idx[b].next.has(a) ?  1 :
  0 ;
This function creates a sort method that uses the generated index to look up the sort order between any two elements.
Bringing it all together:
(function() {
  const naiveSortIndex = xss => xss
    .map(xs =>
      // [ prev, cur, next ]
      xs.map((x, i, xs) => [
        xs.slice(0, i), x, xs.slice(i + 1)
      ])
    )
    // flatten
    .reduce((xs, ys) => xs.concat(ys), [])
    // add to index
    .reduce(
      (idx, [prev, cur, next]) => {
        if (!idx[cur])
          idx[cur] = {
            prev: new Set(),
            next: new Set()
          };
        prev.forEach(p => {
          idx[cur].prev.add(p);
        });
        next.forEach(n => {
          idx[cur].next.add(n);
        });
        return idx;
      }, {}
    );

  const expensiveSortIndex = xss => {
    const naive = naiveSortIndex(xss);
    return Object
      .keys(naive)
      .reduce(
        (idx, k) => Object.assign(idx, {
          [k]: {
            prev: mergeDir("prev", naive, k),
            next: mergeDir("next", naive, k)
          }
        }), {}
      )
  }

  const mergeDir = (dir, idx, k, s = new Set()) =>
    idx[k][dir].size === 0
      ? s
      : Array.from(idx[k][dir])
          .reduce(
            (s, k2) => mergeDir(dir, idx, k2, s),
            new Set([...s, ...idx[k][dir]])
          );

  // Generate a recursive sort method based on an index of { key: { prev, next } }
  const indexSorter = idx => (a, b) =>
    idx[a].next.has(b) || idx[b].prev.has(a) ? -1 :
    idx[a].prev.has(b) || idx[b].next.has(a) ?  1 :
    0;

  const uniques = xs => Array.from(new Set(xs));

  // App:
  const arrays = [
    ['X', 'D', 'H', 'B'],
    ['X', 'D', 'K', 'Z', 'H', 'B', 'A'],
    ['X', 'M', 'D', 'H', 'B'],
    ['X', 'H', 'T'],
    ['X', 'D', 'H', 'B']
  ];

  const sortIndex = expensiveSortIndex(arrays);
  const sorter = indexSorter(sortIndex);

  console.log(JSON.stringify(
    uniques(arrays.flat()).sort(sorter)
  ))
}())
Recommendations
I suppose the elegant solution to the problem might be able to skip all the merging of Sets by using a linked list / tree-like structure and injecting elements at the right indexes by traversing until an element of its prev/next is found.
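For what that could look like, here is a rough, untested sketch of the injection idea, reusing the { prev, next } index built above (insertSorted is a hypothetical helper, not part of the code above):

// Insert x just before the first already-placed element that must come after it.
const insertSorted = (idx, sorted, x) => {
  const i = sorted.findIndex(y => idx[x].next.has(y))
  if (i === -1) sorted.push(x)   // nothing placed yet constrains x; append it
  else sorted.splice(i, 0, x)
  return sorted
}

// Usage sketch: uniques(arrays.flat()).reduce((acc, x) => insertSorted(sortIndex, acc, x), [])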
I would just flatten the arrays, map them as keys to an object (thus removing the doubles), and then sort the final result:
const arrays = [
  ['A', 'B', 'C', 'D'],
  ['A', 'B', 'B-bis', 'B-ter', 'C', 'D', 'D-bis'],
  ['A', 'A-bis', 'B', 'C', 'D'],
  ['A', 'C', 'E'],
  ['A', 'B', 'C', 'D']
];
const final = Object.keys( arrays.flat().reduce( (aggregate, entry) => {
  aggregate[entry] = '';
  return aggregate;
}, {} ) ).sort( (x1, x2) => x1.localeCompare(x2) );
console.log( final );
In your code, after the merge you need to remove the duplicates, so you will get a unique array.
Then use Array#sort to sort the array.
I hope this will solve the issue.
const arrays = [
  ['A', 'B', 'C', 'D'],
  ['A', 'B', 'B-bis', 'B-ter', 'C', 'D', 'D-bis'],
  ['A', 'A-bis', 'B', 'C', 'D'],
  ['A', 'C', 'E'],
  ['A', 'B', 'C', 'D']
]
const merged = [].concat.apply([], arrays);
const unique = Array.from(new Set(merged))
const sorted = unique.sort()
console.log("sorted Array", sorted)
// Single line
const result = [...new Set([].concat(...arrays))].sort();
console.log("sorted Array single line", result)
Use a BST for this. Add all elements to the BST and then traverse it in order.
function BST() {
  this.key = null;
  this.val = null;
  this.left = null;
  this.right = null;
  this.add = function(key) {
    const val = key;
    // someOrderFunction is a placeholder for whatever ordering you need
    key = someOrderFunction(key.replace(/\s/, ''));
    if (this.key == null) {
      this.key = key;
      this.val = val;
    } else if (key < this.key) {
      if (this.left) {
        this.left.add(val);
      } else {
        this.left = new BST();
        this.left.key = key;
        this.left.val = val;
      }
    } else if (key > this.key) {
      if (this.right) {
        this.right.add(val);
      } else {
        this.right = new BST();
        this.right.key = key;
        this.right.val = val;
      }
    }
  }
  this.inOrder = function() {
    const leftNodeOrder = this.left ? this.left.inOrder() : [],
          rightNodeOrder = this.right ? this.right.inOrder() : [];
    return leftNodeOrder.concat(this.val).concat(rightNodeOrder);
  }
}
// MergeArrays uses a BST to insert all elements of all arrays
// and then fetches them sorted in order
function mergeArrays(arrays) {
  const bst = new BST();
  arrays.forEach(array => array.forEach(e => bst.add(e)));
  return bst.inOrder();
}
My solution focuses nothing on efficiency, so I wouldn't try this for large arrays. But it works fine for me.
The idea is to walk through all elements multiple times and only insert an element into the sorted array in one of three cases:
The current element is first in its array, and one of its successors is first in the sorted array.
The current element is last in its array, and one of its predecessors is last in the sorted array.
The preceding element is in the sorted array and one of the current element's successors directly succeeds this preceding element in the sorted array.
For the current problem, as stated above, the order between T and B, A isn't uniquely determined. To handle this I use a flag force, which takes any legal option when no new inserts can be made during an iteration.
The following rule from the problem is not implemented in my solution: "If an item is appearing in multiple arrays at different positions, the first appearance is the right one (skip others)." There is no hierarchy between the arrays. It should, however, be easy to implement the desired check and continue if it isn't satisfied.
let merge = (arrays) => {
  let sorted = [...arrays[0]];
  const unused_rules = arrays.slice(1);
  let not_inserted = unused_rules.flat().filter((v) => !sorted.includes(v));
  let last_length = -1;
  let force = false;

  // avoids lint warning
  const sortedIndex = (sorted) => (v) => sorted.indexOf(v);

  // loop until all elements are inserted, or until not even force works
  while (not_inserted.length !== 0 && !force) {
    // if the last iteration didn't add elements, our arrays lack complete
    // information and we must add something using what little we know
    force = not_inserted.length === last_length;
    last_length = not_inserted.length;
    for (let j = 0; j < unused_rules.length; j += 1) {
      const array = unused_rules[j];
      for (let i = 0; i < array.length; i += 1) {
        // check if element is already inserted
        if (sorted.indexOf(array[i]) === -1) {
          if (i === 0) {
            // if element is first in its array, check if it can be prepended to sorted array
            const index = array.indexOf(sorted[0]);
            if (index !== -1 || force) {
              const insert = array.slice(0, force ? 1 : index);
              sorted = [...insert, ...sorted];
              not_inserted = not_inserted.filter((v) => !insert.includes(v));
              force = false;
            }
          } else if (i === array.length - 1) {
            // if element is last in its array, check if it can be appended to sorted array
            const index = array.indexOf(sorted[sorted.length - 1]);
            if (index !== -1 || force) {
              const insert = array.slice(force ? array.length - 1 : index + 1);
              sorted = [...sorted, ...insert];
              not_inserted = not_inserted.filter((v) => !insert.includes(v));
              force = false;
            }
          } else {
            const indices = array.map(sortedIndex(sorted)); // map all elements to their index in sorted
            const predecessorIndexSorted = indices[i - 1]; // index in the sorted array of the element preceding the current element
            let successorIndexArray;
            if (force) {
              successorIndexArray = i + 1;
            } else {
              // index in the current array of the element succeeding the current element's predecessor in the sorted array
              successorIndexArray = indices.indexOf(predecessorIndexSorted + 1);
            }
            if (predecessorIndexSorted !== -1 && successorIndexArray !== -1) {
              // insert all elements between predecessor and successor
              const insert = array.slice(i, successorIndexArray);
              sorted.splice(i, 0, ...insert);
              not_inserted = not_inserted.filter((v) => !insert.includes(v));
              force = false;
            }
          }
        }
      }
    }
  }
  return sorted;
};
In fact, the rule "If an item is appearing in multiple arrays at different positions, the first appearance is the right one (skip others)" is a bit vague. For example, using the arrays below, is it okay to end up with arrays[3] as the sorted array, since it doesn't violate the first appearance of any element, or should arrays[2] take precedence?
const arrays = [['a', 'b', 'd'],
                ['a', 'c', 'd'],
                ['a', 'b', 'c', 'd'],
                ['a', 'c', 'b', 'd']]

Ramda: Extract flat array of uniq values

I have working code like the following, but I am wondering if there is a way with Ramda to turn this whole expression into a curried function where I can specify the input data argument, or perhaps compose the whole thing differently.
const data = [
  { val: ['A', 'B'] },
  { val: ['C', 'D'] },
  { val: ['A', 'C', 'E'] },
]
R.uniq(R.flatten(R.map(R.prop('val'), data)))
I tried using R.__, but that probably works differently and not for nested calls like this.
Here's a simple transformation of your function, using compose:
const {compose, uniq, flatten, map, prop} = R;
const data = [
  { val: ['A', 'B'] },
  { val: ['C', 'D'] },
  { val: ['A', 'C', 'E'] },
]
const extract = compose(uniq, flatten, map(prop('val')))
console.log(extract(data))
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.25.0/ramda.js"></script>
This could also be written with Ramda's order-reversed twin of compose, pipe:
const {pipe, uniq, flatten, map, prop} = R;
const data = [
  { val: ['A', 'B'] },
  { val: ['C', 'D'] },
  { val: ['A', 'C', 'E'] },
]
const extract = pipe(
  map(prop('val')),
  flatten,
  uniq
)
console.log(extract(data))
<script src="//cdnjs.cloudflare.com/ajax/libs/ramda/0.25.0/ramda.js"></script>
I personally choose compose for one-liners, pipe for anything longer.
The notion of function composition expressed in these two functions is quite central to Ramda. (Disclaimer: I'm a Ramda author.)
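A tiny illustration of how the two relate (inc and double are made-up helpers, not from the question):

const inc = x => x + 1
const double = x => x * 2

R.compose(inc, double)(3)  //=> inc(double(3)) === 7, applied right to left
R.pipe(double, inc)(3)     //=> 7 as well, applied left to right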
