Getting the time complexity - javascript

I wrote this algorithm. Can you help me calculate the time complexity?
I don't have a nested function, but I do have .includes inside map.
function isPrime(num) {
  for (var i = 2; i < num; i++) if (num % i === 0) return false;
  return num > 1;
}

const getResult = (dataA, dataB) => {
  let temp = {};
  let tempArray = [];
  dataB.forEach(function (x) {
    temp[x] = (temp[x] || 0) + 1;
  });
  dataA.map(item => {
    if (dataB.includes(item) && !isPrime(temp[item])) {
      tempArray.push(item);
    } else if (!dataB.includes(item)) {
      tempArray.push(item);
    }
  });
  return tempArray;
};

// A and B are the input arrays (not shown in the question)
console.log('Input A:', A);
console.log('Input B:', B);
console.log('Output:', getResult(A, B));

Some observations:
dataB.includes has a time complexity of O(B.length).
isPrime has a time complexity of O(num). Since its argument is the frequency of a value in dataB, it is bounded by dataB's size, so its worst case is O(B.length). As isPrime is only ever called alongside a dataB.includes call and costs at most the same order, it is irrelevant for the overall time complexity.
As dataB.includes is called once for each value in dataA, the overall time complexity is O(A.length * B.length).
The complexity can be reduced by replacing dataB.includes(item) with a lookup in the frequency object (temp[item]); then isPrime becomes the determining factor. If isPrime is extended with memoization, the total cost of all isPrime calls together is O(A.length + B.length), so that is then also the overall time complexity (a sketch follows below).
This cannot be reduced further: even without any call to isPrime, iterating over both input arrays is necessary and already accounts for that time complexity.
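A minimal sketch of that faster version, keeping the question's isPrime and using a frequency object (called count here) together with a small memo cache. solveFast, primeCache and cachedIsPrime are illustrative names, not from the question:

function isPrime(num) {
  for (var i = 2; i < num; i++) if (num % i === 0) return false;
  return num > 1;
}

const solveFast = (dataA, dataB) => {
  const count = {};
  dataB.forEach(x => { count[x] = (count[x] || 0) + 1; });

  const primeCache = {}; // memoise prime checks per frequency value
  const cachedIsPrime = n =>
    n in primeCache ? primeCache[n] : (primeCache[n] = isPrime(n));

  const result = [];
  dataA.forEach(item => {
    const c = count[item] || 0; // O(1) lookup instead of dataB.includes
    if (c === 0 || !cachedIsPrime(c)) result.push(item);
  });
  return result;
};

console.log(solveFast([1, 2, 3, 4], [2, 2, 3])); // [1, 3, 4] (illustrative input)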

Your isPrime function has a worst-case time complexity of O(n).
The forEach over dataB is a separate O(n) pass.
The map over dataA evaluates each item and does O(n) work in each of the includes and isPrime calls.
So the total time complexity is O(n^2).

Related

Recursion time complexity

I am looking for help in understanding the time/space complexity of my solution for finding the permutations of an array. I know that because I am using an Array.forEach method, my time complexity includes O(n); however, since I am also using recursion, I don't know how the time complexity changes. The recursion seems to me to be O(n) as well. Does that make the overall time complexity of the algorithm O(n^2)? And is the space complexity O(n) as well, since each recursion call returns a bigger array? Thanks in advance.
function getPermutations(array) {
  if (array.length <= 1) {
    return array.length === 0 ? array : [array];
  }
  let lastNum = array[array.length - 1];
  let arrayWithoutLastNum = array.slice(0, array.length - 1);
  let permutations = getPermutations(arrayWithoutLastNum);
  let memory = [];
  permutations.forEach(element => {
    for (let i = 0; i <= element.length; i++) {
      let elementCopy = element.slice(0);
      elementCopy.splice(i, 0, lastNum);
      memory.push(elementCopy);
    }
  });
  return memory;
}
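As an aside (not an answer from the original thread), one way to get a feel for the growth is to measure the output empirically: the result contains n! permutations, each of length n, so the time and space are at least proportional to n * n! rather than n^2. A small check that assumes the getPermutations function above is in scope:

// illustrative only: print how fast the output grows
for (let n = 1; n <= 8; n++) {
  const input = Array.from({ length: n }, (_, i) => i + 1);
  const perms = getPermutations(input);
  console.log(`n=${n}: ${perms.length} permutations`); // 1, 2, 6, 24, ... = n!
}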

How to reduce the time complexity of this problem?

After a long, difficult coding challenge, there was a problem that kept bugging me. I thought about it for quite a while but couldn't find a way to solve it. I am providing the problem and an example below.
Input
v : an array of numbers.
q : a 2-dimensional array whose nested arrays each have 3 elements.
Description
v is an array and q is a list of commands that do different things depending on the nested elements:
if the first element of a nested array is 1 => the second and third elements become indices, and it returns the sum of v[second..third] (as you can see, it is inclusive)
if the first element of a nested array is 2 => the element at the second index becomes the third, the same as v[second] = third
Input example
v : [1,2,3,4,5]
q : [[1,2,4], [2,3,8], [1,2,4]]
Example
With the provided example, it goes like this:
the command is [1,2,4] => the first element is 1, so it should return the sum from v[2] to v[4] (inclusive) => 12.
the command is [2,3,8] => the first element is 2, so it sets v[3] to 8 (now v is [1,2,3,8,5]).
the command is [1,2,4] => the first element is 1, so it should return the sum from v[2] to v[4] (inclusive) => 16, as index 3 was changed by the previous command.
So the final answer is [12, 16].
Question
The code below is how I solved it; however, it has O(n**2) complexity. I wonder how I can reduce the time complexity in this case.
I tried making a hash object, but it didn't work. I can't think of a good way to build a cache in this case.
function solution(v, q) {
  let answer = [];
  for (let i = 0; i < q.length; i++) {
    let [a, b, c] = q[i];
    if (a === 1) {
      let sum = 0;
      for (let j = b; j <= c; j++) {
        sum += v[j];
      }
      answer.push(sum);
    } else if (a === 2) {
      v[b] = c;
    }
  }
  return answer;
}
This type of problem can typically be solved more efficiently with a Fenwick tree (also known as a binary indexed tree).
Here is an implementation:
class BinaryIndexedTree extends Array {
  constructor(length) {
    super(length + 1);
    this.fill(0);
  }
  add(i, delta) {
    i++; // make index 1-based
    while (i < this.length) {
      this[i] += delta;
      i += i & -i; // add least significant bit
    }
  }
  sumUntil(i) {
    i++; // make index 1-based
    let sum = 0;
    while (i) {
      sum += this[i];
      i -= i & -i;
    }
    return sum;
  }
}
function solution(values, queries) {
  const tree = new BinaryIndexedTree(values.length);
  values.forEach((value, i) => tree.add(i, value));

  const answer = [];
  for (const [a, b, c] of queries) {
    if (a === 1) {
      answer.push(tree.sumUntil(c) - tree.sumUntil(b - 1));
    } else {
      tree.add(b, c - values[b]);
      values[b] = c;
    }
  }
  return answer;
}

let answer = solution([1,2,3,4,5], [[1,2,4], [2,3,8], [1,2,4]]);
console.log(answer);
Time Complexity
The time complexity of running tree.add or tree.sumUntil once is O(log n), where n is the size of the input values (values.length). So this is also the time complexity of running one query.
The creation of the tree costs O(n), as this is the size of the tree.
The initialisation of the tree with values costs O(n log n), as each value in the input effectively acts as a query that updates a value from 0 to its actual value (a linear-time alternative is sketched below).
Executing the queries costs O(m log n), where m is the number of queries (queries.length).
So in total, we have a time complexity of O(n + n log n + m log n) = O((m + n) log n).
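As an aside (not part of the original answer), the initialisation cost can be brought down to O(n) by building the tree in place: copy the values into the array, then push each node's partial sum to its direct parent once. A minimal sketch, assuming the same 1-based layout as the class above (LinearBuildBIT is an illustrative name):

class LinearBuildBIT extends Array {
  constructor(values) {
    super(values.length + 1);
    this.fill(0);
    // copy the raw values into the 1-based slots
    for (let i = 0; i < values.length; i++) this[i + 1] = values[i];
    // push each partial sum up to its parent exactly once
    for (let i = 1; i < this.length; i++) {
      const parent = i + (i & -i); // the same parent link that add() follows
      if (parent < this.length) this[parent] += this[i];
    }
  }
}

With that build, the total becomes O(n + m log n); the add and sumUntil methods would stay exactly as above.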
Further reading
For more information on Fenwick trees, see BIT: What is the intuition behind a binary indexed tree and how was it thought about?

Smallest Common Multiple of a range of numbers - works on Codepen.io and not on freecodecamp.org

I'm doing the Freecodecamp certifications, and there's one problem I cannot find a solution to: the task is to calculate the Least Common Multiple (LCM) for an array of integers (which also means a RANGE of integers between a min and a max value).
The snippet below gives the correct answer here on SO, on Codepen.io, and in my local environment, but not on freecodecamp.org.
function smallestCommons(arr) {
  // sorting and cloning the array
  const fullArr = createArray(arr.sort((a, b) => a - b));
  // calculating the theoretical limit of the size of the LCM
  const limit = fullArr.reduce((a, c) => a * c, 1);
  // setting the number to start the iteration with
  let i = fullArr[0];
  // iteration to get the LCM
  for (i; i <= limit; i++) {
    // if every number in the fullArr divides
    // the number being tested (i), then it's the LCM
    if (fullArr.every(e => !(i % e))) {
      // stop the for loop
      break;
    }
  }
  // return the LCM
  return i;
}

// utility function to create the fullArr const in the main function
function createArray([a, b]) {
  const r = [];
  for (let i = b; i >= a; i--) {
    r.push(i);
  }
  return r;
}

// displaying the results
console.log(smallestCommons([23, 18]));
The error I see:
the code works correctly with 4 other arrays on freecodecamp.org
the code gives false results - or no results at all - for the array [23, 18]. If I do get a result, it isn't consistent (something like 1,000,000 once, then 3,654,236 - I made these numbers up, but the behavior is like that). The result for the [23, 18] input should be 6,056,820 (and it is, here on SO, but not on freecodecamp.org)
As this code is far from optimal, I have a feeling that the execution simply runs out of resources at some point, but I get no error saying so.
I read the hints on the page (yes, I tried the solutions, and they do work), but I'd like to submit my own code: I know my algorithm is (theoretically) good, although not optimal, and I'd like to make it work in practice too.
I also see that this question has caused problems for others (it's been asked on SO before), but I don't feel this is a duplicate.
Does anyone have any ideas?
As it turned out, it was a resource problem - the algorithm in my question was theoretically correct, but it wasn't optimal or efficient.
Here's one that is more efficient at solving this problem:
// main function
function smallestCommons(arr) {
  const fullArr = createArray(arr.sort((a, b) => a - b));
  return findLcm(fullArr, fullArr.length);
}

// creating the range of numbers based on a min and a max value
function createArray([a, b]) {
  const r = [];
  for (let i = b; i >= a; i--) {
    r.push(i);
  }
  return r;
}

// smallest common multiple of n numbers
function findLcm(arr, n) {
  let ans = arr[0];
  for (let i = 1; i < n; i++) {
    ans = (arr[i] * ans) / gcd(arr[i], ans);
  }
  return ans;
}

// greatest common divisor
function gcd(a, b) {
  if (b == 0) return a;
  return gcd(b, a % b);
}

console.log(smallestCommons([1, 5]));
console.log(smallestCommons([5, 1]));
console.log(smallestCommons([2, 10]));
console.log(smallestCommons([23, 18]));
This method was OK on the testing sandbox environment.
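For reference, the loop in findLcm relies on the identity lcm(a, b) = a * b / gcd(a, b), applied left to right across the range, since lcm(a, b, c) = lcm(lcm(a, b), c). A small standalone check with illustrative values (not part of the original answer):

const gcd = (a, b) => (b === 0 ? a : gcd(b, a % b));
const lcm = (a, b) => (a * b) / gcd(a, b);

console.log(lcm(4, 6));          // 12  (4 * 6 = 24, divided by gcd 2)
console.log(lcm(lcm(4, 6), 10)); // 60  -> folding the pairwise lcm across a range

This is why the GCD-based version never has to iterate up to the product of the whole range, unlike the brute-force loop in the question.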

Can't detect the cause of infinite loop in a 'while loop' in JS

I've got an infinite loop inside my while loop, and I can't find the cause.
It's a simple function that returns the sum of the argument's digits. I use a while loop because it needs to keep adding up the digits until it arrives at a one-digit number.
I made sure to add a statement that should make the loop break at a certain point, but it obviously doesn't.
function digital_root(n) {
  num = n;
  sum = 0;
  while (num.toString().length > 1) {
    for (i = 0; i < num.toString().length; i++) {
      sum += parseInt(num.toString()[i])
    }
    num = sum;
  }
  return sum;
}
digital_root(456)
I get a warning that I have an infinite loop on line 4 (the while loop).
I hoped that num = sum would reassign the new integer (with a reduced number of digits) to the num variable and thus at some point break out of the loop. Is this wrong?
What further confuses me is that most of the JS editors I've used to debug the problem do return an output, but it takes ages. So is it an infinite loop or is it not?
You have a nested loop structure in which the outer condition ends up always being true.
To get down to a single-digit number (below 10), you could call the function again recursively.
function digital_root(n) {
  var num = n.toString(), // declare and use the string value
      sum = 0,
      i;

  for (i = 0; i < num.length; i++) {
    sum += parseInt(num[i], 10);
  }
  return sum > 9
    ? digital_root(sum)
    : sum;
}

console.log(digital_root(456));
After re-reading the question I noticed that you're trying to reduce an integer down to a single digit. The issue with your code is that sum is set to 0 only once, before the while loop, meaning it doesn't reset for the second, third, ... pass.
Moving sum = 0 into the while loop resolves this issue. I've also added variable declarations at the top to avoid setting global variables.
function digital_root(n) {
  var num, sum, i;
  num = n;
  while (num.toString().length > 1) {
    sum = 0;
    for (i = 0; i < num.toString().length; i++) {
      sum += parseInt(num.toString()[i]);
    }
    num = sum;
  }
  return sum;
}

console.log(digital_root(456));
Here it is written in a recursive manner, a style that I personally prefer:
function digital_root(integer) {
  // guard against things that might result in an infinite loop, like floats
  if (!Number.isInteger(integer) || integer < 0) return;

  const chars = integer.toString().split("");
  if (chars.length == 1) return integer;

  return digital_root(
    chars.map(char => parseInt(char))
      .reduce((sum, digit) => sum + digit)
  );
}

console.log(digital_root(456));
Since you already got the answer, here is another way to get the result:
function digital_root(n) {
  // convert the number to a string
  // use split to create an array of digit characters, e.g. ['4', '5', '6']
  // use reduce to sum the digits after converting each one back to a number
  // 0 is the initial value
  const sum = n.toString().split('').reduce((a, c) => a + parseInt(c, 10), 0);
  // repeat until only a single digit remains
  return sum > 9 ? digital_root(sum) : sum;
}

console.log(digital_root(456));
To avoid all the nested loops that lead to a situation such as the one you're facing, I'd rather do it in a more readable way.
function digital_root(n) {
  let sum = n;
  while (sum.toString().length > 1) {
    sum = sum.toString()
      .split('')
      .map(digit => parseInt(digit, 10))
      .reduce((acc, cur) => acc + cur);
  }
  return sum;
}

console.log(digital_root(456));
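As a final aside (not from any of the answers above), the digital root also has a closed form based on the congruence n ≡ digital_root(n) (mod 9), which avoids looping over digits entirely. A minimal sketch, assuming n is a non-negative integer (digitalRootClosedForm is an illustrative name):

function digitalRootClosedForm(n) {
  // 0 maps to 0; every other n maps to 1..9 according to n mod 9
  return n === 0 ? 0 : 1 + (n - 1) % 9;
}

console.log(digitalRootClosedForm(456)); // 6, matching the loop-based versions above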

Find possible numbers in array that can sum to a target value

Given an array of numbers, for example [14,6,10], how can I find the possible combinations/pairs that add up to a given target value?
For example, with [14,6,10] I am looking for a target value of 40.
My expected output would be
10 + 10 + 6 + 14
14 + 14 + 6 + 6
10 + 10 + 10 + 10
*Order is not important
With that being said, this is what I have tried so far:
function Sum(numbers, target, partial) {
  var s, n, remaining;
  partial = partial || [];
  s = partial.reduce(function (a, b) {
    return a + b;
  }, 0);
  if (s === target) {
    console.log("%s", partial.join("+"));
  }
  for (var i = 0; i < numbers.length; i++) {
    n = numbers[i];
    remaining = numbers.slice(i + 1);
    Sum(remaining, target, partial.concat([n]));
  }
}

>>> Sum([14,6,10],40);
// returns nothing
>>> Sum([14,6,10],24);
// returns 14+10
It is actually of no use here, since it only finds combinations in which each number is used at most once.
So how can I do this?
You could keep adding the value at the current index for as long as the sum is smaller than the wanted sum, or move on to the next index.
function getSum(array, sum) {
  function iter(index, temp) {
    var s = temp.reduce((a, b) => a + b, 0);
    if (s === sum) result.push(temp);
    if (s >= sum || index >= array.length) return;
    iter(index, temp.concat(array[index]));
    iter(index + 1, temp);
  }

  var result = [];
  iter(0, []);
  return result;
}

console.log(getSum([14, 6, 10], 40));
For getting a limited result set, you could specify the length and check it in the exit condition.
function getSum(array, sum, limit) {
  function iter(index, temp) {
    var s = temp.reduce((a, b) => a + b, 0);
    if (s === sum) result.push(temp);
    if (s >= sum || index >= array.length || temp.length >= limit) return;
    iter(index, temp.concat(array[index]));
    iter(index + 1, temp);
  }

  var result = [];
  iter(0, []);
  return result;
}

console.log(getSum([14, 6, 10], 40, 5));
TL;DR: Skip to Part II for the real thing.
Part I
Nina Scholz's answer to this fundamental problem shows us a beautiful manifestation of an algorithm. Honestly, it confused me a lot, for two reasons:
When I try [14,6,10,7,3] with a target of 500, it makes 36,783,575 calls to the iter function without blowing the call stack, yet the memory shows no significant usage at all.
My dynamic programming solution goes a little faster (or maybe not), but there is no way it can handle the above case without exhausting the 16GB of memory.
So I shelved my solution and instead started investigating her code a little further in the dev tools, and discovered both its beauty and a little bit of its shortcomings.
First, I believe this algorithmic approach, which includes a very clever use of recursion, might possibly deserve a name of its own. It is very memory efficient and only uses up memory for the constructed result set. The stack dynamically grows and shrinks continuously, staying nowhere close to its limit.
The problem is that, while being very efficient, it still makes a huge number of redundant calls. Looking into that, with a slight modification the 36,783,575 calls to iter can be cut down to 20,254,744... roughly 45% fewer, which yields much faster code. The catch is that the input array must be sorted in ascending order.
So here comes a modified version of Nina's algorithm. (Be patient... it will take around 25 seconds to finish.)
function getSum(array, sum) {
  function iter(index, temp) {
    cnt++; // counting iter calls -- remove in production code
    var s = temp.reduce((a, b) => a + b, 0);
    sum - s >= array[index] && iter(index, temp.concat(array[index]));
    sum - s >= array[index + 1] && iter(index + 1, temp);
    s === sum && result.push(temp);
    return;
  }

  var result = [];
  array.sort((x, y) => x - y); // a very cheap operation, considering the input array should be small for reasonable output
  iter(0, []);
  return result;
}

var cnt = 0,
    arr = [14, 6, 10, 7, 3],
    tgt = 500,
    res;

console.time("combos");
res = getSum(arr, tgt);
console.timeEnd("combos");
console.log(`source numbers are ${arr}
found ${res.length} unique ways to sum up to ${tgt}
iter function has been called ${cnt} times`);
Part II
Even though I was impressed by the performance, I wasn't comfortable with the above solution, for no solid reason that I can name. The way it works through side effects, and the very hard to understand double recursion, disturbed me.
So here comes my approach to this question. It is many times more efficient than the accepted solution, even though I am going functional in JS. We still have room to make it a little faster with ugly imperative ways.
The difference is:
Given numbers: [14,6,10,7,3]
Target sum: 500
Accepted answer:
Number of possible answers: 172686
Resolves in: 26357ms
Recursive call count: 36783575
The answer below:
Number of possible answers: 172686
Resolves in: 1000ms
Recursive call count: 542657
function items2T([n, ...ns], t) {
  cnt++; // remove cnt in production code
  var c = ~~(t / n);
  return ns.length
    ? Array(c + 1).fill()
        .reduce((r, _, i) => r.concat(items2T(ns, t - n * i).map(s => Array(i).fill(n).concat(s))), [])
    : t % n
      ? []
      : [Array(c).fill(n)];
}

var cnt = 0, result;

console.time("combos");
result = items2T([14, 6, 10, 7, 3], 500);
console.timeEnd("combos");
console.log(`${result.length} many unique ways to sum up to 500
and ${cnt} recursive calls are performed`);
Another important point is that if the given array is sorted in descending order, then the number of recursive iterations will be reduced (sometimes greatly), allowing us to squeeze even more juice out of this lemon. Compare the above with the version below, where the input array is sorted in descending order.
function items2T([n, ...ns], t) {
  cnt++; // remove cnt in production code
  var c = ~~(t / n);
  return ns.length
    ? Array(c + 1).fill()
        .reduce((r, _, i) => r.concat(items2T(ns, t - n * i).map(s => Array(i).fill(n).concat(s))), [])
    : t % n
      ? []
      : [Array(c).fill(n)];
}

var cnt = 0, result;

console.time("combos");
result = items2T([14, 10, 7, 6, 3], 500);
console.timeEnd("combos");
console.log(`${result.length} many unique ways to sum up to 500
and ${cnt} recursive calls are performed`);
