I have a sorted array:
let mySortedArray = [1,1,5,6,8,8,9,25,25]
I want to remove any duplicates from this array in O(1) time and space complexity. First of all, is this even possible?
My solution is the following:
I convert the array to a set and any duplicates are just removed.
let mySet = new Set(mySortedArray);
What would the time and space complexity of that be?
And if I were to convert the set back to an Array:
let myNewArr = Array.from(mySet);
What would the time and space complexity of the whole method then be?
Would this be the most optimal way of removing any duplicates of an array or would there be a better one?
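For context: any correct dedup has to read all n elements, so O(1) time is impossible; the Set round-trip is O(n) time and O(n) extra space. For a *sorted* array you can also dedup in place with two pointers, giving O(n) time and O(1) extra space if overwriting the input is acceptable. A sketch of both (function names are mine):

```javascript
// Set round-trip: O(n) time, O(n) extra space.
function dedupWithSet(arr) {
  return Array.from(new Set(arr));
}

// In-place two-pointer dedup for a *sorted* array:
// O(n) time, O(1) extra space (the input is overwritten).
function dedupSortedInPlace(arr) {
  if (arr.length === 0) return arr;
  let write = 1; // next slot for a unique value
  for (let read = 1; read < arr.length; read++) {
    if (arr[read] !== arr[write - 1]) {
      arr[write++] = arr[read];
    }
  }
  arr.length = write; // truncate the duplicate tail
  return arr;
}

console.log(dedupWithSet([1, 1, 5, 6, 8, 8, 9, 25, 25]));       // [1, 5, 6, 8, 9, 25]
console.log(dedupSortedInPlace([1, 1, 5, 6, 8, 8, 9, 25, 25])); // [1, 5, 6, 8, 9, 25]
```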
splice has O(n) time complexity, so I tried the following version instead, which has added space complexity but, I think, lower time complexity.
let arr = [0,1];
arr[5] = 5;
// I need to push numbers into this index only, not onto the whole array
// push won't work on a 1d array but will work on a 2d array
arr[5] = [[], 5];
// now it is possible to push into this index
arr[5].push(15,125,1035);
// remove the unnecessary empty cell []
arr[5].shift();
console.log(arr) // result [0,1,[5,15,125,1035]]
So is this worse than splice, or better, in terms of time complexity?
EDIT:
This was a poor reading of the answer given; my problem was that I didn't understand why you can't push into an index of an array.
When you try:
arr = [1,2,3,4]
and then arr[1].push(2.5);
you get an error, since you are trying to push into a primitive (a number, not an object/array).
My mistake was thinking that JS simply doesn't allow it.
If you want the result [5,15,125,1035], you can get it simply this way:
let arr = [5];
arr.push(15,125,1035);
console.log(arr)
JavaScript leaves the underlying implementation of arrays up to the runtime, so the answer depends on the implementation used: How are JavaScript arrays implemented?
But I will assume this array is entirely a flat array.
Pushing to the end of a flat array is only O(1) if the array has space for more elements. Otherwise the array needs to be reallocated, which is O(length_of_array). However, it should still be faster than splicing on every insertion.
If you want O(1) insertion speed then you could append to the end of a doubly linked list instead, however this is at the cost of space and lookup/iteration speed.
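A minimal sketch of the linked-list alternative mentioned above (the class and method names are illustrative):

```javascript
// Minimal doubly linked list with O(1) append at the tail.
// Trade-off: extra space per node, and O(n) lookup/iteration by index.
class DoublyLinkedList {
  constructor() {
    this.head = null;
    this.tail = null;
  }
  push(value) { // O(1): only the tail pointers change
    const node = { value, prev: this.tail, next: null };
    if (this.tail) this.tail.next = node;
    else this.head = node;
    this.tail = node;
    return this;
  }
  toArray() { // O(n): walk the chain from the head
    const out = [];
    for (let n = this.head; n !== null; n = n.next) out.push(n.value);
    return out;
  }
}

const list = new DoublyLinkedList();
list.push(5).push(15).push(125).push(1035);
console.log(list.toArray()); // [5, 15, 125, 1035]
```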
The simpler answer I was looking for is by @Gerardo Furtado:
let arr = [0,1,2,3,4,5];
arr[5] = [5];
arr[5].push(15,125,1035);
console.log(arr) // result [0,1,[5,15,125,1035]]
Is the time complexity of inserting an element at a certain index in an empty array O(1)?
For example:
let array = [];
array[5] = 5;
Since I want to insert elements from the end of the array toward the front, using the unshift method would cost O(n) per operation. I wonder if the above operation would be better than unshift.
Thanks!
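One common way to avoid repeated unshift calls (O(n) each, O(n²) total) is to push everything (amortized O(1) each) and reverse once at the end, keeping the total at O(n). Note also that sparse writes like `array[5] = 5` on an empty array can switch the array to a slower dictionary-like representation in some engines. A sketch (function name is mine):

```javascript
// Builds the output front-to-back even though items arrive back-to-front:
// push each item (amortized O(1)), then reverse once (O(n)).
function buildBackwards(items) {
  const out = [];
  for (const item of items) out.push(item); // amortized O(1) per push
  out.reverse();                            // single O(n) pass
  return out;
}

console.log(buildBackwards([3, 2, 1])); // [1, 2, 3]
```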
Arrays in JS are objects, yet deleting an array element is O(n), whereas deleting an object property is O(1). Why?
Please let me know where I am going wrong!
Thanks
This is not an obvious question. For objects it is clearer: the delete operator has constant time complexity, because you delete a specific property or method without iterating over the object.
An array is an object with ordered indexes, and for deletion we use a method that iterates over the array to remove the item, such as Array.prototype.splice():
let arr = [1,6,10,99,44]; // to delete 10, the elements after it have to be shifted down
arr.splice(2,1); // arr = [1,6,99,44]
The above has linear time complexity, O(n), but we can achieve constant time when deleting the last item of an array:
let arr = [1,6,10,99,44]; // to delete the last item
arr.length = 4; // arr = [1,6,10,99]
Finally, one tip: never use the delete operator on an array (such as delete arr[2]), because the length of the array does not change and you end up with [1, 6, empty, 99, 44].
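To illustrate the difference between delete and splice on an array:

```javascript
// delete leaves a hole: the length stays the same and index 2 becomes an empty slot.
const a = [1, 6, 10, 99, 44];
delete a[2];
console.log(a.length); // 5
console.log(2 in a);   // false — an empty slot, not an undefined value

// splice actually shifts the tail down and shrinks the array.
const b = [1, 6, 10, 99, 44];
b.splice(2, 1);
console.log(b);        // [1, 6, 99, 44]
```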
deleting an obj element has BigO O(1)
That is because objects in JS behave like a hash map.
deleting an array element has BigO O(n)
That is because an array is a special object that keeps its elements in one contiguous chunk of memory, with no free space between elements. After deleting the element at index i, every element with an index greater than i has to be moved down to fill the released space.
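A short demonstration of the hash-map behaviour (the object contents here are made up):

```javascript
// Object properties behave like hash-map entries: delete removes
// one entry in O(1); no other entries have to move.
const scores = { alice: 10, bob: 6, carol: 99 };
delete scores.bob;
console.log(scores); // { alice: 10, carol: 99 }
```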
I have an array which has empty fields.
In particular, it looks like the following:
array() {[4] => 3,[8] => 6,[17] => 24}
I am using it this way because I am drawing a graph with it. So, for example, at 4cm the graph has a height of 3cm, and so on.
Now the problem is that the formula only calculates half of the values, since the second part of the graph is just mirrored. So I need to take the values before the last one and append them in reverse, with each one keeping its distance from that last value of the original array.
I need an algorithm that can calculate this for me. The mirrored part would be:
array() {[26] =>6, [30] => 3}
I think an array is not a great datatype for this, since it might contain potentially thousands of undefined values, but filling the array with the missing values is rather easy: you just traverse the array backwards and push the values you encounter onto the array.
var ar = [/*your input*/];
/* ar.length-1 is the last position of your array,
   so you want to start from position ar.length-2 */
for (var i = ar.length - 2; i >= 0; i--) { ar.push(ar[i]); }
It might be better to use an object instead, so you can work with key value pairs representing your graph nodes.
It might be better to use an object instead, so you can work with key value pairs representing your graph nodes.
I'm not sure if it works perfectly, but follow the logic. Let's say your original array is a:
let mirrored = a.slice().reverse().slice(1); // capture this before overwriting a
a = a.fill(0).concat(mirrored);
First reverse a copy of the original array and remove its first element (which is the last element of the original array). Then concatenate the zero-filled original array (all values cleared; you could fill with undefined or another value) with the reversed one. Boom, you get your new array.
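A sketch that instead works directly on a sparse JS array, mirroring the defined values around the last defined index so each keeps its distance to it (the function name is mine):

```javascript
// Mirrors a sparse array's values around its last defined index,
// preserving each index's distance to that axis.
function mirrorSparse(arr) {
  // Object.keys yields only the defined indexes, in ascending order.
  const indexes = Object.keys(arr).map(Number);
  const axis = indexes[indexes.length - 1]; // last defined index
  for (const i of indexes) {
    if (i !== axis) arr[axis + (axis - i)] = arr[i];
  }
  return arr;
}

const graph = [];
graph[4] = 3;
graph[8] = 6;
graph[17] = 24;
mirrorSparse(graph);
console.log(graph[26], graph[30]); // 6 3
```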
I have an array like this:
const array = ['Monthly', 'Annually', 'Quarterly', 'Annually', 'Quarterly', 'Monthly'];
After removed the duplicates I should get an array like this:
const array = ['Monthly', 'Annually', 'Quarterly'];
Now I should get the shortest period. I have thought of associating every string with a number, transforming the array in this way:
const array = [{name:'Monthly', order:1}, {name:'Annually',order:3}, {name:'Quarterly',order:2}];
and then computing the min according to the order.
Do you have another proposal to improve the algorithm? Can it be improved?
One small improvement:
Removing duplicates is redundant, as it requires O(n) space or O(n log n) time (it is a variant of the element distinctness problem), while finding the minimal value can be done in O(n) time and O(1) space.
Also, I'm not sure what your "order" exactly is; if it is something calculated once at compile time / pre-processing and is constant, that's fine, as long as you don't sort the list for each query.
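One way to apply this: a single O(n) pass over the raw array with a fixed lookup table, no dedup or sort needed. The table contents below are assumptions about which period counts as shortest (function name is mine):

```javascript
// Fixed precedence table: lower number means shorter period. (Assumed ordering.)
const ORDER = { Monthly: 1, Quarterly: 2, Annually: 3 };

// Single O(n) pass, O(1) extra space; duplicates are harmless.
function shortestPeriod(periods) {
  let best = null;
  for (const p of periods) {
    if (best === null || ORDER[p] < ORDER[best]) best = p;
  }
  return best;
}

const array = ['Monthly', 'Annually', 'Quarterly', 'Annually', 'Quarterly', 'Monthly'];
console.log(shortestPeriod(array)); // 'Monthly'
```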