Why does this function return NaN - JavaScript

If I use <= instead of <, I'll get NaN, why?
function addArgs() {
  var sum = 0, count = 0;
  while (count <= arguments.length) {
    sum += arguments[count];
    count++;
  }
  return sum;
}

In the last iteration of your loop, count equals arguments.length, so arguments[count] === arguments[arguments.length] === undefined, and sum += undefined makes sum NaN.
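A quick console session illustrates the coercion (the values here are just examples):

// adding undefined to a number coerces it to NaN
var sum = 3;
sum += undefined;     // 3 + undefined -> NaN
console.log(sum);     // NaN

// and NaN propagates through every later addition
console.log(sum + 5); // NaN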

Suppose you pass 3 arguments:
arguments = [0, 1, 2]
Your count will iterate 0 => 1 => 2 => 3, and at 3 you are out of bounds of the array, since it has 3 elements but is indexed starting from 0.
That's the basics of iterating through a loop.

When you iterate through a list and use the index to access its items (like you're doing), you always iterate up to length - 1, i.e. while index < length. The reason is that list indices start from zero, not one. For instance, a list of 3 items has its length equal to 3, and the indices of its items are 0, 1, and 2. There is no item with index 3, so if you iterate up to length or <= length, the counter reaches 3 in the last iteration, and the attempt to retrieve the item at index 3 fails and returns undefined.
Finally, adding that undefined to the sum results in NaN, because undefined is not a number.

In the last iteration, arguments[count] is undefined, which is not a number. In JavaScript, when one operand of an arithmetic expression cannot be converted to a number, the whole result is NaN, so sum leaves the function as NaN instead of a normal number.
http://www.w3schools.com/js/js_datatypes.asp

Array indices start from 0 (and count = 0 in your code), so the maximum valid index is arguments.length - 1.
addArgs(2, 5, 8); -> arguments[0] = 2; arguments[1] = 5; arguments[2] = 8;
That said, you can use <= if count starts from 1:
function addArgs() {
  var sum = 0, count = 1;
  while (count <= arguments.length) {
    sum += arguments[count - 1];
    count++;
  }
  return sum;
}
addArgs(2, 3, 4); // 9
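For what it's worth, modern JavaScript can sidestep the index bookkeeping entirely; a minimal sketch using rest parameters and reduce, equivalent to the corrected loop:

// the rest parameter gathers all arguments into a real array,
// and reduce sums it without any manual indexing
function addArgs(...nums) {
  return nums.reduce((sum, n) => sum + n, 0);
}
console.log(addArgs(2, 3, 4)); // 9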

Related

Why is my algorithm failing at this number?

I am trying to write an algorithm to calculate the sum of prime numbers less than or equal to a given number argument. Here is my code:
function sumPrimes(num) {
  // populates the array with numbers up to the given num
  let prime = [], i = 2;
  while (prime.indexOf(num) < 0) {
    prime.push(i);
    i++;
  }
  // filters the array, leaving only the prime numbers, and sums them
  let result = prime
    .filter(function (a) {
      if (a === 2 || a === 3 || a === 5 || a === 7) {
        return a;
      } else {
        return a % 2 > 0 && a % 3 > 0 && a % 5 > 0 && a % 7 > 0;
      }
    })
    .reduce((a, b) => a + b);
  console.log(result);
  return result;
}
sumPrimes(977); // outputs 108789 instead of 73156
My filter function checks whether a given number leaves a remainder greater than zero when divided by 2, 3, 5, and 7; if so, I treat that number as prime. However, when the supplied argument is 977, it falls apart and outputs the wrong sum. Can anyone figure out what is going on here?
As per my comment: your check for primality is faulty. Just because a number is not divisible by 2, 3, 5 or 7 does not mean it is prime; for example, 121 = 11 × 11 passes your filter but is not prime. You can simply recycle the prime-checking logic from this answer: Number prime test in JavaScript, and optimize your loop so that you only perform the iteration once.
You start a for loop from 2, the smallest prime, and increment it until it reaches your target number. Inside the loop, you check whether the number is prime: if it is, you add it to the sum.
The advantages of this approach are:
You don't store an arbitrarily large array of prime numbers in the prime array, which is eventually reduced by summing it up. Of course, this is under the assumption that you don't need the primes for anything else.
You don't need to perform several passes, compared to your old code, where you iterate repeatedly in: (1) the while loop to generate all numbers between 2 and the target number, (2) the filter check for primality, and (3) the reduce call.
// Adapted from: https://stackoverflow.com/questions/40200089/number-prime-test-in-javascript
function isPrime(num) {
  for (let i = 2, s = Math.sqrt(num); i <= s; i++)
    if (num % i === 0) return false;
  return num > 1;
}

function sumPrimes(num) {
  let sum = 0;
  for (let n = 2; n <= num; n++) {
    if (isPrime(n)) {
      sum += n;
    }
  }
  console.log(sum);
  return sum;
}

sumPrimes(977); // Outputs 73156 as expected
To check whether a number is prime, it suffices to verify that it is not divisible by any smaller prime. So while your logic is on the right track, you need to keep expanding your array of primes as you go. That could be achieved like this:
const primeArray = (n) => {
  var p = [2];
  if (n < 2) return [];
  if (n === 2) return [2];
  for (let i = 3; i <= n; i += 2) if (p.every(x => i % x)) p.push(i);
  return p;
}
We start with the first prime and then check every second number starting from 3, since all even numbers are divisible by 2. While stepping through with i += 2, we simply check whether i is divisible by any already-identified prime. If it is not (p.every(x => i % x)), we append i to the array of primes.
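For the original question, summing the result of primeArray then gives the expected total:

// sum the primes up to 977
console.log(primeArray(977).reduce((a, b) => a + b)); // 73156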
Here is another scalable, minimalist and performance-optimized variation on the sieve of Eratosthenes, with a cap so that each candidate is only tested against primes up to its square root:

function primeArr(n) {
  // primes[0..j-1] are the primes whose square is <= i;
  // cap is the square of the next prime to admit into that test set
  const primes = [];
  for (let i = 2, j = 0, cap = 4; i <= n; i++) {
    if (i === cap) cap = primes[++j] * primes[j]; // i reached a prime's square: widen the test set
    if (primes.slice(0, j).every(p => i % p)) primes.push(i);
  }
  return primes;
}
let pa = primeArr(977);
console.log("sum:", pa.reduce((a, c) => a + c));
console.log("all primes:", pa.join(" "));
Complete and fast solution via prime-lib:
import {generatePrimes, stopOnValue} from 'prime-lib';
const i = stopOnValue(generatePrimes(), 977);
let sum = 0;
for (let a of i) {
  sum += a;
}
console.log(sum); //=> 73156

Javascript: Check array of numbers for number of missing numbers needed to make the array consecutive

Working on some Javascript challenges on Code Signal and I'm having an issue solving this:
Ratiorg got statues of different sizes as a present from CodeMaster for his birthday, each statue having a non-negative integer size. Since he likes to make things perfect, he wants to arrange them from smallest to largest so that each statue is bigger than the previous one by exactly 1. He may need some additional statues to accomplish that. Help him figure out the minimum number of additional statues needed.
Example
For statues = [6, 2, 3, 8], the output should be
makeArrayConsecutive2(statues) = 3.
Ratiorg needs statues of sizes 4, 5 and 7.
My approach:
Sort the array smallest to largest
Create counter variable to store number of missing numbers
Iterate through array
Subtract element [i] from element [i + 1]
If the difference equals 1, the numbers are consecutive; if not, they are not consecutive (increment counter variable)
Return counter variable
Here is my code:
function makeArrayConsecutive2(statues) {
  // Sorts array numerically smallest to largest
  statues.sort((a, b) => a - b);
  let counter = 0;
  // If array only contains one number return 0
  if (statues.length === 1) {
    return 0;
  }
  /* Iterate through array, subtract the current element from the next element; if it
     equals 1 the numbers are consecutive, if it doesn't equal one increment the counter
     variable */
  for (let i = 0; i <= statues.length - 1; i++) {
    if (statues[i] !== statues.length - 1 && statues[i + 1] - statues[i] != 1) {
      counter++;
    }
    console.log(statues[i]);
    console.log('counter : ' + counter);
  }
  return counter;
}
When statues contains [5, 4, 6] the output is this:
4
counter : 0
5
counter : 0
6
counter : 1
I think the problem is that when the loop is on the last element, in this case 6, it attempts to look at statues[i + 1] when that element doesn't exist. I added statues[i] !== statues.length - 1 to my if statement to address that, but it doesn't appear to be working. What's wrong with my code, and why is the final element incrementing the counter variable?
The guard fails because it compares the value statues[i] to statues.length - 1, when what you meant to compare is the index: i !== statues.length - 1. That aside, I'd approach it by building the target array, which runs from min + 1 to max - 1 of the input by ones, excluding members of the input:
function missingConseq(input) {
  let min = Math.min.apply(null, input)
  let max = Math.max.apply(null, input)
  let result = []
  for (let i = min + 1; i < max; i++) {
    if (!input.includes(i)) result.push(i)
  }
  return result
}
let array = [6, 2, 3, 8]
console.log(missingConseq(array))
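Since the challenge asks for the number of missing statues rather than the list, the count is just the length of that result. Equivalently, here is an alternative sketch that skips building the list, relying on the problem's guarantee that all sizes are distinct (a consecutive run from min to max has max - min + 1 members):

// missing count = size of the full consecutive span minus the statues you already have
function makeArrayConsecutive2(statues) {
  return Math.max(...statues) - Math.min(...statues) + 1 - statues.length;
}
console.log(makeArrayConsecutive2([6, 2, 3, 8])); // 3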

How to increment the value of a byte array

I know it is a noob question, but I still thought of asking out of curiosity.
I have a byte array of 3 bytes, for example [00, 00, 00]. Every time a method runs, I increment the last byte by 0x02. Once the last byte reaches the threshold value FF, I need to increment the last-but-one byte by 1 while still incrementing the last byte. For example, for the byte array [00, 00, FF], incrementing it by 0x02 should give [00, 01, 02], and eventually it should reach [FF, FF, FF]. What would be the ideal way of doing this, rather than using plain if conditions?
Did you try a for loop with an if condition?
You cycle through your elements in reverse. If the element is smaller than 0xFF, you increment it and return. Otherwise you set it to 0x00 and let the for loop carry on to the next element.
Pseudo-code:
for element in array (in reverse order):
    if (element == 0xFF):
        element = 0
    else:
        element += 0x02
        return array
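A rough JavaScript rendering of that pseudo-code (a sketch under the pseudo-code's own carry semantics; it mutates the array in place, treating the last element as the least significant byte):

// increment a byte array in place, carrying from the last byte backwards
function incrementBytes(bytes) {
  for (let i = bytes.length - 1; i >= 0; i--) {
    if (bytes[i] === 0xFF) {
      bytes[i] = 0;       // overflow: reset and carry into the next byte
    } else {
      bytes[i] += 0x02;   // room left: add the increment and stop
      return bytes;
    }
  }
  return bytes;           // every byte was 0xFF and has been reset
}

console.log(incrementBytes([0x00, 0x00, 0xFF])); // [0, 2, 0]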
You could work with only one variable:
var i = 0; // counter, from 0 to 0xFFFFFF
// every time, just increment: i += 2;
Wherever you need the result, convert i to an array, like:
function convert_array(val) {
  var val2 = (val >> 16) & 0xFF; // high byte
  var val1 = (val >> 8) & 0xFF;  // middle byte
  var val0 = val & 0xFF;         // low byte
  return [val2, val1, val0];
}
var result = convert_array(i);
The advantages:
incrementing the number is no problem, just i += 2;
you have no loops (for) :)
EDIT (dumped the nested loops)
For a more generic approach (obviously not with colors), you should try a cascading increment like this:
// non-destructive increment of arr, least significant byte first
// [0xFF, 0x00] -> [0x00, 0x02]
function incremented(arr) {
  var value,
      increment = 0x02,
      threshold = 0xFF,
      re = [],
      flip = true;
  for (var i = 0; i < arr.length; i++) {
    value = arr[i];
    if (flip === false) {
      // the carry has been absorbed; just copy the value
      re.push(value);
    } else if (value + increment > threshold) {
      // overflow: push a zero, but keep on flipping (carry on)
      re.push(0);
    } else {
      // add the incremented value and stop flipping
      re.push(value + increment);
      flip = false;
    }
  }
  if (flip) {
    // even the last byte did not stop flipping: append another
    re.push(increment);
  }
  return re;
}
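A few example calls illustrating the behaviour (outputs assume the fixed version above):

console.log(incremented([0x00, 0x00, 0x00])); // [2, 0, 0]
console.log(incremented([0xFF, 0x00, 0x00])); // [0, 2, 0]
console.log(incremented([0xFF, 0xFF, 0xFF])); // [0, 0, 0, 2]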

Why does my factorial function always return one?

I am trying to write a piece of code to solve a Coderbyte challenge, to calculate a number's factorial. Every time I run it, the factorial generated is one. What am I doing wrong?
var num;
var array1 = new Array();

function FirstFactorial(num) {
  for (var i = num; i > 0; i--) { // 8, 7, 6, 5 ...
    for (var y = 0; y < num; y++) { // 0, 1, 2, 3, 4 ...
      array1[y] = i; // we have an array that looks like [8,7,6,5,4,3,2,1]
    }
  }
  var sum = 1;
  for (var x = 0; x < array1.length; x++) { // now I want to run up that array, reading the #s
    sum = sum * array1[x];
    return sum;
  }
  return sum;
}
A few issues.
1/ This is minor but, when you multiply two numbers, you get a product, not a sum.
2/ You return the value from within the loop, which means that, even if you fixed the other problems, the function would return prematurely without having multiplied all the numbers.
3/ Your nested loop does not fill your array the way you describe; you should inspect it after population. Think about your loops expressed as pseudo-code:
for i = num downto 1 inclusive:
    for y = 0 to num-1 inclusive:
        array1[y] = i
You can see that the inner loop populates the entire array with the value of the current i. So the last iteration of the outer loop, where i is one, sets the entire array to ones.
4/ In any case, you don't need an array to store all the numbers from 1 to n, just use the numbers 1 to n directly. Something like (again, pseudo-code):
def fact(n):
    prod = 1
    for i = 2 to n inclusive:
        prod = prod * i
    return prod
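In JavaScript, that pseudo-code might look like this (a direct, minimal translation):

// multiply prod by every integer from 2 up to n
function fact(n) {
  var prod = 1;
  for (var i = 2; i <= n; i++) {
    prod = prod * i;
  }
  return prod;
}
console.log(fact(8)); // 40320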
This is a much easier way to calculate the factorial of a number (the base case is num <= 1 so that factorial(0) also terminates):
function factorial(num) {
  if (num <= 1) {
    return 1;
  }
  return num * factorial(num - 1);
}
However, to fix your code you need to correct the initial loop that loads the numbers into the array, as well as remove the return statement from the bottom loop. Like so:
function FirstFactorial(num) {
  for (var i = num; i > 0; i--) {
    array1[num - i] = i;
  }
  var sum = 1;
  for (var x = 0; x < array1.length; x++) { // now I want to run up that array, reading the #s
    sum = sum * array1[x];
  }
  return sum;
}

Why is my while loop not repeating?

var num = [1, 1];
var total = 0;
var i = num.length;
do {
  i++;
  num[i] = num[num.length - 1] + num[num.length - 2];
  total += num[i];
  console.log(total);
}
while (num[num.length] < 4000000);
I've been working through the Project Euler questions for a day or two now, hoping to expand my knowledge and usefulness. On the second question I've been figuring out a (bad) way to generate the Fibonacci sequence. However, my code prints "2" to the console as it SHOULD, but then stops. Another issue is that a plain "while (CONDITION) { DO STUFF }" loop just won't work either, and I have no clue why.
I'm probably just making dumb mistakes, but somebody please enlighten me :)
num.length will always be 1 bigger than the last index of num, i.e. if num.length is 5, num has the indices 0 through 4, and num[5] doesn't exist.
The highest available index is num.length - 1, so try num[num.length - 1] in your while condition.
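You can see this directly in the console (the values here are just for illustration):

var num = [1, 1];
console.log(num.length);          // 2
console.log(num[num.length]);     // undefined (there is no index 2)
console.log(num[num.length - 1]); // 1, the last element
console.log(undefined < 4000000); // false, which is why the loop exits immediately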
Your num array has 2 elements, so num.length (and also i) is 2. The 1st statement in your do block is i++. Now i is 3.
You're setting num[3], which means num is now [1, 1, undefined, 2].
Also, in your while you are checking num[num.length]. Since arrays are zero-indexed, this will never work: num.length is now 4, and num[4] doesn't exist.
What I suggest is: increment i after setting the element. So you append a new element, then bump the index.
var num = [1, 1],
    total = 0,
    i = 2; // we already know the length, no need to get it
do {
  // we don't need the i++ here
  num[i] = num[i - 1] + num[i - 2]; // append the sum of the last 2 elements
  total += num[i];
  console.log(total);
}
while (num[i++] < 4000000); // "i++" increments i and returns its old value
