I'm having a tough time figuring out how to loop through an array and, if certain items exist within it, perform a .slice(0, 16) to filter an already existing array (let's call that existing array "routes").
For example, a previous process will yield the following array:
points = ['=00ECY20WA200_RECV_P1SEL',
'=00ECY20WA200_RECV_P2SEL',
'=00RECV_C1A_EINCSCMPP1',
'=00RECV_C1A_EINCSCMPP2',
'=00BYPS_C1A_EINCSCMP',
'=00ECY20WA200_BYPS_SPSL1',
'=00ECC92AG184YB01',
'=00ECC92AG185YB01',
'=00ECC92AG186YB01',
'=00ECC92AG187YB01',
]
So if any of the above items exist in the "points" array (in this case they all do, but in some cases only 1 of the 10 items might be present), I'm trying to perform routes.slice(0, 16) on the other, already existing array.
I've tried lots of different approaches (for loops with if statements), and at this point I'm not sure if it's my syntax or what, but I'm back at square one and don't even have a working piece of code to show for it. Any direction would be greatly appreciated.
You could use a hash table for checking and filtering.
var points = ['=00ECY20WA200_RECV_P1SEL', '=00ECY20WA200_RECV_P2SEL', '=00RECV_C1A_EINCSCMPP1', '=00RECV_C1A_EINCSCMPP2', '=00BYPS_C1A_EINCSCMP', '=00ECY20WA200_BYPS_SPSL1', '=00ECC92AG184YB01', '=00ECC92AG185YB01', '=00ECC92AG186YB01', '=00ECC92AG187YB01'],
    hash = Object.create(null),
    filtered = points.filter(function (a) {
        if (!hash[a.slice(0, 16)]) {
            hash[a.slice(0, 16)] = true;
            return true;
        }
    });
console.log(filtered);
ES6 with Set
var points = ['=00ECY20WA200_RECV_P1SEL', '=00ECY20WA200_RECV_P2SEL', '=00RECV_C1A_EINCSCMPP1', '=00RECV_C1A_EINCSCMPP2', '=00BYPS_C1A_EINCSCMP', '=00ECY20WA200_BYPS_SPSL1', '=00ECC92AG184YB01', '=00ECC92AG185YB01', '=00ECC92AG186YB01', '=00ECC92AG187YB01'],
    pSet = new Set,
    filtered = points.filter(a => !pSet.has(a.slice(0, 16)) && pSet.add(a.slice(0, 16)));
console.log(filtered);
EDIT: So it seems like you want to remove an element from an array called routes for each element in the points array. This is how you could do this:
function removeBrokenRoutes(brokenPoints, routes){
    for(let pt of brokenPoints){
        let index = routes.indexOf(pt);
        if(index !== -1) routes.splice(index, 1);
    }
    return routes;
}
Keep in mind that the larger the arrays, the more time this is going to take to complete.
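If the arrays get large, a rough sketch of a faster variant (the name removeBrokenRoutesFast is made up for this example) is to put the broken points in a Set once and rebuild routes with a single filter pass; note that, unlike the version above, it returns a new array instead of mutating routes in place:
// Sketch only: assumes routes and brokenPoints are arrays of strings.
function removeBrokenRoutesFast(brokenPoints, routes) {
    const broken = new Set(brokenPoints);               // O(1) membership checks
    return routes.filter(route => !broken.has(route));  // single pass over routes
}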
You could use the filter and indexOf methods in combination:
var arr = [/* all the data you're checking against */];
var points = [/* the data you're checking for */];
var filteredArr = arr.filter(function(x) {
    // will return -1 if the point is not found
    return points.indexOf(x) !== -1;
});
filteredArr will contain all the points that appear in both arrays. The filter method works by taking a function with one argument x, which represents each item in the array; if the function returns true, the item is added to the new array (filteredArr), and if it returns false the function moves on to the next item. indexOf checks whether the item is found in the other array. It is also important to note that you will need a more performant approach (such as a hash table or Set) if the data set is very large, as this is not the fastest method. But it's a good place to start because it is easy to understand.
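As a rough sketch of that more scalable approach (assuming the items are plain strings, as in the question; pointSet is a name made up for this example), you could build a Set from points once and test membership in constant time:
// Sketch only: same result as the filter/indexOf version, but with O(1) lookups.
var pointSet = new Set(points);
var filteredArr = arr.filter(function (x) {
    return pointSet.has(x); // true if x appears in both arrays
});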
Related
I want to filter an array while keeping the same array, without creating a new one.
With Array.filter():
getFiltersConfig() {
    return this.config.filter((topLevelConfig) => topLevelConfig.name !== 'origin')
}
What is the best way to get the same result by filtering by value without returning a new array?
For completeness, I thought it might make sense to show a mutated array variant.
Below is a snippet with a simple function, mutationFilter, which filters the array directly. Notice that the loop runs in reverse; this is a common technique when deleting items from the array you are iterating over.
There are also a couple of tests to show that Array.filter creates a new array while mutationFilter does not.
Although creating a new array with Array.filter is normally what you want, one advantage of mutating the array is that you can pass it around by reference; otherwise you would need to wrap the array inside another object. Another advantage is memory: if your array is huge, filtering it in place takes less memory.
let arr = ['a','b','a'];
let ref = arr; //keep reference of original arr
function mutationFilter(arr, cb) {
    for (let l = arr.length - 1; l >= 0; l -= 1) {
        if (!cb(arr[l])) arr.splice(l, 1);
    }
}
const cond = x => x !== 'a';
const filtered = arr.filter(cond);
mutationFilter(arr, cond);
console.log(`ref === array -> ${ref === arr}`);
console.log(arr);
console.log(`ref === filtered -> ${ref === filtered}`);
console.log(filtered);
I want to filter an array while keeping the same array, without creating a new one.
What is the best way to get the same result by filtering by value without returning a new array?
I have an answer for the second criterion, but it violates the first. I suspect that you may want to "not create a new one" specifically because you only want to preserve the reference to the array, not necessarily because you want to avoid creating a new array at all (e.g. for memory concerns).
What you could do is create a temp array of what you want
var temp = this.config.filter((topLevelConfig) => topLevelConfig.name !== 'origin')
Then set the length of the original array to 0 and push.apply() the values "in-place"
this.config.length = 0; //clears the array
this.config.push.apply(this.config, temp); //adds what you want to the array of the same reference
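In modern engines the same in-place refill can also be written with spread syntax instead of push.apply(); this is just an equivalent sketch of the two lines above:
this.config.length = 0;       // clears the array, keeping the reference
this.config.push(...temp);    // refills it in place with the filtered values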
You could define your own custom method like so:
if (!Array.prototype.filterThis) {
    Array.prototype.filterThis = function (callBack) {
        if (typeof callBack !== 'function')
            throw new TypeError('Argument must be of type <function>');
        let t = [...this];
        this.length = 0;
        for (let e of t) if (callBack(e)) this.push(e);
        return this;
    }
}
let a = [1,2,3,4,5,5,1,5];
a.filterThis(x=>x!=5);
console.log(a);
Warning: Be very cautious about altering built-in prototypes. I would even say: unless you're writing a polyfill, don't touch them. The errors this can cause can be very subtle and very hard to debug.
Not sure why you would want to mutate the array, but if you really want to, maybe assign the result back to the same variable?
let arr = ['a','b','a'];
arr = arr.filter(x => x !== 'a');
console.log(arr)
I am looking for an algorithm to merge multiple sorted sequences, let's say X sorted sequences with n elements each, into one sorted sequence in JavaScript. Can you provide some examples?
Note: I do not want to use any library.
Trying to solve https://icpc.kattis.com/problems/stacking
What will be the minimal number of operations needed to merge the sorted arrays, under these conditions:
Split: a single stack can be split into two stacks by lifting any top portion of the stack and putting it aside to form a new stack.
Join: two stacks can be joined by putting one on top of the other. This is allowed only if the bottom plate of the top stack is no larger than the top plate of the bottom stack, that is, the joined stack has to be properly ordered.
History
This problem has been solved for more than a century, going back to Hermann Hollerith and punch cards. Huge sets of punch cards, such as those resulting from a census, were sorted by dividing them into batches, sorting each batch, and then merging the sorted batches: the so-called "merge sort". Those tape drives you see spinning in 1950s sci-fi movies were most likely merging multiple sorted tapes onto one.
Algorithm
All the algorithms you need can be found at https://en.wikipedia.org/wiki/Merge_algorithm. Writing this in JS is straightforward. More information is available in the question Algorithm for N-way merge. See also this question, which is an almost exact duplicate, although I'm not sure any of the answers are very good.
The naive concat-and-resort approach does not even qualify as an answer to the problem. The somewhat naive take-the-next-minimum-value-from-any-input approach is much better, but not optimal, because it takes more time than necessary to find the next input to take a value from. That is why the best solution uses something called a "min-heap" or a "priority queue".
Simple JS solution
Here's a really simple version, which I make no claim is optimized, other than in the sense that you can see what it is doing:
const data = [[1, 3, 5], [2, 4]];
// Merge an array of pre-sorted arrays, based on the given sort criteria.
function merge(arrays, sortFunc) {
    let result = [], next;
    // Add an 'index' property to each array to keep track of where we are in it.
    arrays.forEach(array => array.index = 0);
    // Find the next array to pull from.
    // Just sort the list of arrays by their current value and take the first one.
    function findNext() {
        return arrays.filter(array => array.index < array.length)
            .sort((a, b) => sortFunc(a[a.index], b[b.index]))[0];
    }
    // This is the heart of the algorithm.
    while (next = findNext()) result.push(next[next.index++]);
    return result;
}
function arithAscending(a, b) { return a - b; }
console.log(merge(data, arithAscending));
The above code maintains an index property on each input array to remember where we are. The simplistic alternative would be to shift the element from the front of each array when it is its turn to be merged, but that would be rather inefficient.
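For illustration only, here is a rough sketch of that shift-based alternative (the name mergeByShift is made up for this example); it works, but every shift() re-indexes the remaining elements and the input arrays are destroyed:
// Sketch only: re-scans the array heads on every step and consumes the inputs.
function mergeByShift(arrays, sortFunc) {
    const result = [];
    let next;
    while ((next = arrays.filter(a => a.length)
                         .sort((a, b) => sortFunc(a[0], b[0]))[0])) {
        result.push(next.shift()); // shift() is O(n) on the remaining elements
    }
    return result;
}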
Optimizing finding the next array to pull from
This naive implementation of findNext, to find the array to pull the next value from, simply sorts the list of inputs by the first element, and takes the first array in the result. You can optimize this by using a "min-heap" to manage the arrays in sorted order, which removes the need to resort them each time. A min-heap is a tree, consisting of nodes, where each node contains a value which is the minimum of all values below, with left and right nodes giving additional (greater) values, and so on. You can find information on a JS implementation of a min-heap here.
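As an illustrative sketch only (not a tuned library implementation; the class name MinHeap and its methods are invented for this example), a minimal binary min-heap in JavaScript could look like this:
// Minimal binary min-heap sketch; items are ordered by the supplied compare function.
class MinHeap {
    constructor(compare) { this.compare = compare; this.items = []; }
    get size() { return this.items.length; }
    push(item) {
        const a = this.items;
        a.push(item);
        // Bubble the new item up until its parent is not larger.
        for (let i = a.length - 1; i > 0;) {
            const parent = (i - 1) >> 1;
            if (this.compare(a[i], a[parent]) >= 0) break;
            [a[i], a[parent]] = [a[parent], a[i]];
            i = parent;
        }
    }
    pop() {
        const a = this.items;
        const top = a[0];
        const last = a.pop();
        if (a.length) {
            a[0] = last;
            // Sift the moved item down until neither child is smaller.
            for (let i = 0; ;) {
                const left = 2 * i + 1, right = left + 1;
                let smallest = i;
                if (left < a.length && this.compare(a[left], a[smallest]) < 0) smallest = left;
                if (right < a.length && this.compare(a[right], a[smallest]) < 0) smallest = right;
                if (smallest === i) break;
                [a[i], a[smallest]] = [a[smallest], a[i]];
                i = smallest;
            }
        }
        return top;
    }
}
With the k current array heads stored in such a heap, findNext becomes a pop followed by a push of the next element from the array the popped value came from, so each of the k·n merge steps costs O(log k) instead of the O(k log k) of re-sorting.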
A generator solution
It might be slightly cleaner to write this as a generator which takes a list of iterables as inputs, which includes arrays.
// Test data.
const data = [[1, 3, 5], [2, 4]];
// Merge an array of pre-sorted arrays, based on the given sort criteria.
function* merge(iterables, sortFunc) {
    let next;
    // Create iterators, with a "result" property to hold the most recent result.
    const iterators = iterables.map(iterable => {
        const iterator = iterable[Symbol.iterator]();
        iterator.result = iterator.next();
        return iterator;
    });
    // Find the next iterator whose value to use.
    function findNext() {
        return iterators
            .filter(iterator => !iterator.result.done)
            .reduce((ret, cur) =>
                !ret || sortFunc(cur.result.value, ret.result.value) < 0 ? cur : ret,
                null);
    }
    // This is the heart of the algorithm.
    while (next = findNext()) {
        yield next.result.value;
        next.result = next.next();
    }
}
function arithAscending(a, b) { return a - b; }
console.log(Array.from(merge(data, arithAscending)));
The naive approach is to concatenate all k sequences and sort the result. But if each sequence has n elements, the cost will be O(k*n*log(k*n)). Too much!
Instead, you can use a priority queue or heap. Like this:
// Assumes a MinPriorityQueue implementation with insert, empty and
// findAndDeleteMin methods is available (it is not a built-in).
var sorted = [];
var pq = new MinPriorityQueue(function(a, b) {
    return a.number < b.number;
});
var indices = new Array(k).fill(0);
// Seed the queue with the first element of every non-empty sequence.
for (var i = 0; i < k; ++i) if (sequences[i].length > 0) {
    pq.insert({number: sequences[i][0], sequence: i});
}
while (!pq.empty()) {
    var min = pq.findAndDeleteMin();
    sorted.push(min.number);
    ++indices[min.sequence];
    // Push the next element of the sequence the minimum came from, if any.
    if (indices[min.sequence] < sequences[min.sequence].length) pq.insert({
        number: sequences[min.sequence][indices[min.sequence]],
        sequence: min.sequence
    });
}
The priority queue only contains at most k elements simultaneously, one for each sequence. You keep extracting the minimum one, and inserting the following element in that sequence.
With this, the cost will be:
k*n insertions to a heap of k elements: O(k*n)
k*n deletions in a heap of k elements: O(k*n*log(k))
Various constant operations for each number: O(k*n)
So only O(k*n*log(k))
Just add them into one big array and sort it.
You could use a heap, add the first element of each sequence to it, pop the lowest one (that's your first merged element), add the next element from the sequence of the popped element and continue until all sequences are over.
It's much easier to just add them into one big array and sort it, though.
This is a simple JavaScript algorithm I came up with. Hope it helps. It will take any number of sorted arrays and merge them. I maintain an array of position indexes, one per input array. The loop iterates over the current position of each array and checks which value is the minimum; based on that it picks the minimum, pushes it into the merged array, and then increments the position index for that particular array. I feel the time complexity can be improved. Will post back if I come up with a better algorithm, possibly using a min heap.
function merge() {
    var mergedArr = [], pos = [], finished = 0;
    for (var i = 0; i < arguments.length; i++) {
        pos[i] = 0;
    }
    while (finished != arguments.length) {
        var min = null, selected;
        for (var i = 0; i < arguments.length; i++) {
            if (pos[i] != arguments[i].length) {
                if (min == null || min > arguments[i][pos[i]]) {
                    min = arguments[i][pos[i]];
                    selected = i;
                }
            }
        }
        mergedArr.push(arguments[selected][pos[selected]]);
        pos[selected]++;
        if (pos[selected] == arguments[selected].length) {
            finished++;
        }
    }
    return mergedArr;
}
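A quick usage sketch (the input arrays below are made up for illustration); note that the sorted arrays are passed as separate arguments rather than as one array of arrays:
// Hypothetical inputs, just to show the calling convention.
console.log(merge([1, 4, 9], [2, 3, 8], [5, 6, 7]));
// -> [1, 2, 3, 4, 5, 6, 7, 8, 9]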
This is a beautiful question. Instead of concatenating the arrays and applying a .sort(), a simple pairwise merge with .reduce() takes advantage of the inputs already being sorted: each pass of the reduce is a linear merge of two sorted arrays. With m arrays of average length n this is O(m²·n) in the worst case, since pass i merges roughly i·n elements.
We will handle the arrays one by one. First we will merge the first two arrays and then we will merge the result with the third array and so on.
function mergeSortedArrays(a){
    return a.reduce(function(p,c){
        var pc = 0,
            cc = 0,
            len = p.length < c.length ? p.length : c.length,
            res = [];
        while (p[pc] !== undefined && c[cc] !== undefined) p[pc] < c[cc] ? res.push(p[pc++])
                                                                         : res.push(c[cc++]);
        return p[pc] === undefined ? res.concat(c.slice(cc))
                                   : res.concat(p.slice(pc));
    });
}
var sortedArrays = Array(5).fill().map(_ => Array(~~(Math.random()*5)+5).fill().map(_ => ~~(Math.random()*20)).sort((a,b) => a-b));
sortedComposite = mergeSortedArrays(sortedArrays);
sortedArrays.forEach(a => console.log(JSON.stringify(a)));
console.log(JSON.stringify(sortedComposite));
OK, as per @Mirko Vukušić's comparison of this algorithm with .concat() and .sort(), this algorithm is still the fastest solution with FF, but not with Chrome. The Chrome .sort() is actually very fast and I cannot be sure about its time complexity. I just needed to tune the code up a little for JS performance without touching the essence of the algorithm at all. So now it seems to be faster than FF's concat and sort.
function mergeSortedArrays(a){
    return a.reduce(function(p,c){
        var pc = 0,
            pl = p.length,
            cc = 0,
            cl = c.length,
            res = [];
        while (pc < pl && cc < cl) p[pc] < c[cc] ? res.push(p[pc++])
                                                 : res.push(c[cc++]);
        if (cc < cl) while (cc < cl) res.push(c[cc++]);
        else while (pc < pl) res.push(p[pc++]);
        return res;
    });
}
function concatAndSort(a){
    return a.reduce((p,c) => p.concat(c))
            .sort((a,b) => a-b);
}
var sortedArrays = Array(5000).fill().map(_ => Array(~~(Math.random()*5)+5).fill().map(_ => ~~(Math.random()*20)).sort((a,b) => a-b));
console.time("merge");
mergeSorted = mergeSortedArrays(sortedArrays);
console.timeEnd("merge");
console.time("concat");
concatSorted = concatAndSort(sortedArrays);
console.timeEnd("concat");
5000 random sorted arrays of random lengths between 5-10.
ES6 syntax:
function mergeAndSort(arrays) {
return [].concat(...arrays).sort()
}
The function receives an array of arrays to merge and sort.
EDIT: As caught by @Redu, the above code is incorrect. The default sort(), when a sorting function is not provided, compares elements as Unicode strings. The fixed (and slower) code is:
function mergeAndSort(arrays) {
return [].concat(...arrays).sort((a,b)=>a-b)
}
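A quick usage sketch (the input below is made up for illustration):
// Hypothetical input: an array of already-sorted numeric arrays.
console.log(mergeAndSort([[1, 3, 5], [2, 4], [0, 6]]));
// -> [0, 1, 2, 3, 4, 5, 6]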
I'm using a JavaScript library which returns arrays that do not start from zero, e.g. starting from 26 or 1500. What I want is a method to get the first element in such an array regardless of whether its index starts at 0 or any other number.
Is there any method to do this in JavaScript?
I suggest using Array#some. You get the first non-sparse element and its index. The iteration stops immediately if you return true in the callback:
var a = [, , 22, 33],
    value,
    index;
a.some(function (v, i) {
    value = v;
    index = i;
    return true;
});
console.log(index, value);
The information below is generally useful, but for the problem the OP listed, Nina's answer is by far a better solution.
Those are called sparse arrays and they're one of the few situations where you may want to use for-in on an array.
Remember that arrays are objects in JavaScript, and array entries are properties keyed by names (array indexes) that meet certain criteria. So we can use the features that let us discover the properties on an object to find the indexes on your sparse array.
for-in example:
for (var n in theArray) {
    if (theArray.hasOwnProperty(n) && isArrayIndex(n)) {
        // Use theArray[n]
    }
}
This answer shows how you can determine that n is an array index as opposed to being some other property. A very technical definition would be
function isArrayIndex(n) {
    return /^0$|^[1-9]\d*$/.test(n) &&
           n <= 4294967294;
}
...but a definition that's good enough for most of us would be
function isArrayIndex(n) {
    return !isNaN(parseInt(n, 10));
}
Similarly, you can use Object.keys; since it only looks at own enumerable properties, you don't need the hasOwnProperty check:
Object.keys(theArray).forEach(function(n) {
    if (isArrayIndex(n)) {
        // ...
    }
});
Note that officially, neither of those is in any particular order, not even in ES2015 ("ES6"). So in theory, you could see the indexes out of numeric order. In the real world, I've never seen an even vaguely-modern JavaScript engine that returned array indexes out of order. They're not required to, but every one I've tried does.
So officially, you would need to get a full list and then find the minimum value in it:
var min = Object.keys(theArray).reduce(function(min, n) {
    var i = parseInt(n, 10);
    return isNaN(i) || (min !== undefined && min > i) ? min : i;
}, undefined);
That'll give you undefined if the array is empty, or the min index if it isn't. But if you want to make the assumption that you'll get the keys in numeric order:
// Makes an assumption that may not be true
var min = +Object.keys(theArray).filter(isArrayIndex)[0];
If you're using a JavaScript engine that's entirely up-to-date, you can rely on the order returned by Object.getOwnPropertyNames, which is required to list the array indexes in order.
var min = +Object.getOwnPropertyNames(theArray).filter(isArrayIndex)[0];
It may be useful to use a filter function on the array to get back a normalised array.
var fullArray = array.filter(function(n){
    return n != undefined;
});
fullArray[0]
The answers here may help you decide Remove empty elements from an array in Javascript
I guess one alternative to Array.prototype.some() is the Array.prototype.findIndex() method. These are much faster than filter alone and will keep your array and indices untouched.
var arr = new Array(1000),
fi = -1;
arr[777] = 1453; // now we have a nice sparse array
fi = arr.findIndex(f => f !== void 0); // void 0 is the perfect undefined
console.log(fi);
console.log(arr[fi]);
With this piece of code you can find the index of the first assigned value, and then get the value from your array:
var a = [, , 22, 33];
var value = a.find((v, i) => i in a);
console.log(value);
/* ---------------------------------------------- */
var i = 0;
while (!(i in a) && i < a.length) i++; // If i === a.length then the array is empty
console.info(i, a[i]);
The first implementation uses Array.prototype.find, which uses fewer variables, so it is cleaner, but to find the index you would have to call indexOf on the array afterwards.
The second one is a little old-fashioned, but it gives you the index without extra effort.
BTW, Nina's answer seems better. (Can it be made shorter?)
const arr = [0, 1, 2]
// using destructuring to get the first element
let [first] = arr
// plus: using destructuring to get the last element
let [last] = [...arr].reverse()
I have these two integer arrays:
I am working on my AngularJS tutorial project.
In the controller I have these two arrays:
var arrA=[12,54,76,98,45];
var arrB=[12,98];
I need to delete from arrA all the numbers that are inside arrB.
arrA has to look like this after the implementation:
arrA=[54,76,45]
What is the best and most elegant way to implement this in AngularJS?
You can use Array.prototype.filter() in conjunction with Array.prototype.indexOf()
The filter() method creates a new array with all elements that pass the test implemented by the provided function.
The indexOf() method returns the first index at which a given element can be found in the array, or -1 if it is not present.
var arrA=[12,54,76,98,45];
var arrB=[12,98];
arrA = arrA.filter(function(o){
    return arrB.indexOf(o) == -1;
});
document.write(JSON.stringify(arrA));
Off the top of my head.
// Run a loop to go through all elements in arrB
for (var i = 0; i < arrB.length; i++) {
    // Keep the position of element i in arrA;
    // if it's not there, index will be equal to -1
    var index = arrA.indexOf(arrB[i]);
    // If it is there
    if (index != -1) {
        // Remove 1 element at position index from arrA
        arrA.splice(index, 1);
    }
}
Good luck.
This has nothing to do with angular btw, it's basic javascript.
Here's a fiddle:
https://jsfiddle.net/MichaelSel/t2dfg31c/
How about the following:
var result = arrA.filter(function(elem){
    return arrB.indexOf(elem) === -1;
});
To delete items from any array you need to use splice:
$scope.items.splice(index, 1);
Now what you can do is run a for loop to identify the duplicate elements. Once identified, you can remove them using the splice function.
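A rough sketch of that idea (reusing arrA and arrB from the question), with a while loop so that repeated occurrences in arrA are removed as well:
// Sketch only: removes every occurrence in arrA of each value found in arrB.
for (var i = 0; i < arrB.length; i++) {
    var idx;
    while ((idx = arrA.indexOf(arrB[i])) !== -1) {
        arrA.splice(idx, 1); // remove one element at position idx
    }
}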
Angular doesn't concern itself with things like array manipulation. JavaScript provides facilities for that though:
var diff = arrA.filter(function(item) {
    return arrB.indexOf(item) < 0;
});
Fiddle
If arrB is very large, you might want to bring this down to O(N) (for smallish inputs) or O(N log N), instead of O(N^2):
var lookup = arrB.reduce(function(lookup, item) {
    lookup[item] = true;
    return lookup;
}, {});
diff = arrA.filter(function(item) {
    return !Object.prototype.hasOwnProperty.call(lookup, item);
});
However, this only works if the string representation of the item is what you are looking at. It would work for integers.
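If relying on string keys is a concern, here is a rough sketch of the same idea with a Set, which compares the values themselves rather than their string representations (lookupSet and diffSet are names made up for this example):
// Sketch only: Set membership avoids converting items to string keys.
var lookupSet = new Set(arrB);
var diffSet = arrA.filter(function(item) {
    return !lookupSet.has(item);
});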
Title is pretty much self explanatory...
I want to be able to find duplicated values from JavaScript array.
The array keys can be duplicated so I need to validate only the array values.
Here is an example :
var arr=[
Ibanez: 'JoeSatriani',
Ibanez: 'SteveVai',
Fender: 'YngwieMalmsteen',
Fender: 'EricJohnson',
Gibson: 'EricJohnson',
Takamine: 'SteveVai'
];
In that example:
the key is the guitar brand
the value is the guitar player name.
So:
If there are duplicated keys (like Ibanez or Fender), as in the current example, that is OK :-)
But
if there are duplicated values (like EricJohnson or SteveVai), I'm expecting to get (return) this error:
EricJohnson,SteveVai
You can't have associative arrays in Javascript. You can create an array of objects, like:
var arr=[
{Ibanez: 'JoeSatriani'},
{Ibanez: 'SteveVai'},
{Fender: 'YngwieMalmsteen'},
{Fender: 'EricJohnson'},
{Gibson: 'EricJohnson'},
{Takamine: 'SteveVai'}
];
Then you'll need a for...in loop to go over every object in the array, create a new array of values, and check that for duplicates, which is also not very straightforward: basically you'll want to sort the array and make sure no value is the same as the one after it.
var arrayOfValues = [];
arr.forEach(function(obj){
    for(var prop in obj)
        arrayOfValues.push(obj[prop]);
});
arrayOfValues.sort(); // by default it will sort them alphabetically
arrayOfValues.forEach(function(element, index, array){
    if(array[index+1] && element == array[index+1])
        alert("Duplicate value found!");
});
First of all, object keys can not be repeated.
This means that:
({
    "Fender": "Jim",
    "Fender": "Bob"
})["Fender"]
Would simply return: "Bob".
However, I did write some code that allows you to find duplicate values, but as I said, the keys will have to be unique:
var arr = {
    Ibanez: 'EricJohnson',
    Fender: 'YngwieMalmsteen',
    Gibson: 'EricJohnson',
    Takamine: 'SteveVai',
    "Takamine2": 'SteveVai'
};

function contains(a, obj) {
    for (var i = 0; i < a.length; i++) {
        if (a[i] === obj) {
            return true;
        }
    }
    return false;
}

var track = [];
var exists = [];
for (var val in arr) {
    if (contains(track, arr[val])) {
        exists.push(arr[val]);
    } else {
        track.push(arr[val]);
    }
}
alert(exists);
You can see it working here: http://jsfiddle.net/dr09sga6/2/
As others have commented, the example array you provided isn't a valid JavaScript array. You could, however, keep a list for each guitar type:
var mapping = {
    Ibanez: ['JoeSatriani', 'SteveVai'],
    Fender: ['YngwieMalmsteen', 'EricJohnson'],
    Gibson: ['EricJohnson'],
    Takamine: ['SteveVai']
};
Or a list of each guitar/musician pair:
var pairs = [
    ['Ibanez', 'JoeSatriani'],
    ['Ibanez', 'SteveVai'],
    ['Fender', 'YngwieMalmsteen'],
    ['Fender', 'EricJohnson'],
    ['Gibson', 'EricJohnson'],
    ['Takamine', 'SteveVai']
];
Your solution is going to depend on which pattern you go with. However, in the second case it can be done in one chained functional call:
pairs.map(function(e) { return e[1]; }) // Discard the brand names
     .sort()                            // Sort by artist
     .reduce(function(p, c, i, a){
         if (i > 0 && a[i] == a[i-1] && !p.some(function(v) { return v == c; })) p.push(c);
         return p;
     }, []); // Return the artist names that are duplicated
http://jsfiddle.net/mkurqmqd/1/
To break that reduce call down a bit, here's the callback again:
function(p, c, i, a){
    if (i > 0
        && a[i] == a[i-1]
        && !p.some(function(v) {
            return v == c;
        }))
        p.push(c);
    return p;
}
reduce is going to call our callback for each element in the array, and it's going to pass the returned value for each call into the next call as the first parameter (p). It's useful for accumulating a list as you move across an array.
Because we're looking back at the previous item, we need to make sure we don't go out of bounds on item 0.
Then we're checking to see if this item matches the previous one in the (sorted) list.
Then we're checking (with Array.prototype.some()) whether the value we've found is ALREADY in our list of duplicates...to avoid having duplicate duplicates!
If all of those checks pass, we add the name to our list of duplicate values.
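For comparison, here is a rough sketch of the same check using a Set to track what has already been seen; it assumes the pairs array from above and produces the same list of duplicated artists:
// Sketch only: record a value the second time it is seen.
const seen = new Set();
const duplicates = new Set();
pairs.forEach(function(pair) {
    const artist = pair[1];
    if (seen.has(artist)) duplicates.add(artist);
    else seen.add(artist);
});
console.log([...duplicates]); // -> ["EricJohnson", "SteveVai"] for the pairs above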