JavaScript Averages - javascript

I am trying to learn JavaScript. I've built the following code to find the average of an array of numbers. It works, except the last returned value is always NaN, and I cannot figure out why. If I move this piece outside of the block, it seems to forget altogether what the variable sum is supposed to be equal to. Is there some kind of global-variable equivalent I'm supposed to be using in JS?
var average = function(myarray) {
  sum = 0;
  for (counter = 0; counter <= myarray.length; counter++) {
    sum = sum + myarray[counter];
    average = sum / myarray.length;
    console.log(average);
  };
}
average([1, 2, 3])

Change
counter <= myarray.length
to
counter < myarray.length
because array indexes start at 0, so the last valid index is myarray.length - 1.
Full example:
var average = function(myarray) {
  var sum = 0;
  for (var counter = 0; counter < myarray.length; counter++) {
    sum += myarray[counter];
  }
  return sum / myarray.length;
}
console.log(average([1,2,3]));
JSBin Demo: http://jsbin.com/siyugi/1/edit

myarray[myarray.length] is undefined, which poisons your computation with NaN (Not a Number).
Just change it to
for (counter = 0; counter < myarray.length; counter++) {
  // ...
}

Since you are just learning, you should know it is good practice not to read .length in the loop condition like that: it forces the code to check the length of your array on every iteration. And remember that .length returns the number of elements in the array, but array indexes start at 0.
for(var counter = 0, length = myarray.length; counter < length; counter++){
}
Would be the proper way to do it.

Don't use variables without declaring them with the var keyword, otherwise they become properties of the global object.
JavaScript arrays are zero-index based. So, if the size of the array is 3, the first element is accessed with index 0 and the last with index 2. JavaScript is very forgiving, so when you access an element at an invalid index, it simply returns undefined.
Inside the loop you are assigning to average, which replaces the function object itself with the computed value. So subsequent calls to average will fail, since average is no longer a function.
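For example, a second call to the function from the question would fail (a minimal illustration, assuming the code above has just been run):
average([1, 2, 3]); // logs the running averages (ending in NaN) and overwrites average with that number
average([1, 2, 3]); // TypeError: average is not a function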
It is good practice to have a function return the computed value instead of printing it, so that it does not violate the Single Responsibility Principle.
In your case,
for (counter = 0; counter <= myarray.length; counter++) {
The counter runs up to the last index of the array + 1. Since myarray[myarray.length] is undefined, the arithmetic in that last iteration produces NaN.
console.log(1 + undefined);
// NaN
So, you need to change the code, like this
function Average(myarray) {
  var sum = 0, counter;
  for (counter = 0; counter < myarray.length; counter++) {
    sum = sum + myarray[counter];
  }
  return sum / myarray.length;
}
If you are interested, you can compute the sum with Array.prototype.forEach, like this
function Average(myarray) {
  var sum = 0;
  myarray.forEach(function(currentNumber) {
    sum += currentNumber;
  });
  return sum / myarray.length;
}
Even better, you can calculate the sum with Array.prototype.reduce, like this
function Average(myarray) {
  return myarray.reduce(function(sum, currentNumber) {
    return sum + currentNumber;
  }, 0) / myarray.length;
}

You can calculate the average of an array of numbers as follows:
var avg = c => c.reduce((a, b) => a + b) / c.length;
avg([1,2,3])
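Note that reduce without an initial value throws on an empty array, and dividing by a zero length yields NaN. A slightly more defensive sketch (the empty-array guard is an addition, not part of the one-liner above):
var avg = c => c.length ? c.reduce((a, b) => a + b, 0) / c.length : 0;
avg([]);        // 0 instead of NaN or a TypeError
avg([1, 2, 3]); // 2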

Related

javascript while loop correctly iterating but for loop with same logic is not, on array with integer values and some null values in there

Iterating through a JavaScript array that has some data in it, plus some null or undefined values, gives odd behavior with a for loop but not with a while loop. It does not return when it should and gets stuck in an infinite loop.
I have investigated the outputs extensively: the condition checking whether the number already exists in the array never evaluates to true, only ever false, yet the code sometimes enters the if block as though it were true. It seems arbitrary.
//function called within this code
function randomArrayOfIndexes() {
  var randNumbArray = new Array(4);
  var indexToAssign = Math.floor(Math.random() * Math.floor(4));
  randNumbArray[0] = indexToAssign;
  for (i = 1; i < randNumbArray.length; i++) {
    indexToAssign = Math.floor(Math.random() * Math.floor(4));
    while (arrayContains(randNumbArray, indexToAssign)) {
      indexToAssign = Math.floor(Math.random() * Math.floor(4));
    }
    randNumbArray[i] = indexToAssign;
  }
  return randNumbArray;
}
//this works
function arrayContains(arrayin, numberIn) {
  var i = arrayin.length;
  while (i--) { //takes one from i so highest index is accurate on first iteration
    if (arrayin[i] === numberIn) {
      return true;
    }
  }
  return false;
}
//this doesn't... not even backwards like the above iteration
function arrayIncludes(arrayin, numberIn) {
  for (i = 0; i < arrayin.length; i++) {
    if (arrayin[i] === numberIn) {
      return true;
    }
  }
  return false;
}
At first each function above is passed in an array with [int value, null, null, null], and a random number; when the function returns, the next null value is filled with the random number that doesn't exist in it already, so [int value, int value, null, null]... until all values are filled... the final array is filled with unique random numbers from 0 to 3, to provide an index for a piece of data in another array... to make sure that it is only used once in the program I am writing.
I would expect it to return true if the number passed in is already in there, another random number then generated outside of the broken function, and the process repeated until a unique random number is found. When it is found, the array being passed back in will be populated at the next available index, and the process repeated. This is not happening. It is getting stuck in an infinite loop, and never returning
You are just missing a var before i:
function arrayIncludes(arrayin, numberIn) {
  for (var i = 0; i < arrayin.length; i++) {
    // in ^ here
    if (arrayin[i] === numberIn) {
      return true;
    }
  }
  return false;
}
You may also declare it before the loop, like
var i;
for (i = 0; i < arrayin.length; i++) {
...
By the way, this way of generating random numbers without duplicates is very inefficient. I suggest something like having an array of 0-3 (in your current example) or 0-n and then randomly taking items out of it. That way you don't have to loop through the whole array each time you find a new number; every time you just pick a random index between 0 and the number of remaining items.
Imagine that the array length is 1000 and the last remaining number is, say, 100: how many random numbers would you have to generate, and how many times would you loop through the whole array, before your random number happens to be 100?
var n = 5;
var a = new Array(n);
// fill a with the values 0..n-1
for (var i = 0; i < n; i++) a[i] = i;
var result = new Array(n);
var i = n;
while (i) {
  // pick a random index among the items still left in a
  var index = Math.floor(Math.random() * i);
  result[--i] = a[index];
  // remove the chosen item so it cannot be picked again
  a.splice(index, 1);
}
document.getElementById('a').innerHTML = result;
<div id="a"></div>
You need to declare the variables in your loops (for (var i = 0; ...)). If you don't, the variable is global, and when you use the same loop variable in nested loops one loop can change the other.
You are using i in both loops, so when you run the for loop with:
function arrayIncludes(arrayin, numberIn) {
  for (i = 0; i < arrayin.length; i++) {
    // etc
  }
}
You set i back to 0 and iterate it again. This is the same i you are using in randomArrayOfIndexes, so it interferes with that loop. This is a common cause of hard-to-find bugs and is why you should always declare loop variables.
Here's the bug in its simplest form. Notice that the outer loop only runs once, because i is incremented in the inner loop, causing the outer loop to exit early:
for (i = 0; i < 4; i++) {
  console.log("out loop number: ", i)
  for (i = 0; i < 4; i++) {
    console.log("inner_loop: ", i)
  }
}
If you declare the variables with let (for (let i = 0; ...)), each loop gets its own version of i and both loops run independently:
for (let i = 0; i < 4; i++) {
  console.log("out loop number: ", i)
  for (let i = 0; i < 4; i++) {
    console.log("inner_loop: ", i)
  }
}

Multiplication Loop not multiplying

Trying to loop over the arguments entered and return the result of multiplying them all together:
let lightCode = { //Creates Object.
  Multiply: function() { //Multiplys all arguments.
    const total = 0;
    for (const i = 0; i < arguments.length; i++) {
      console.log(arguments[i]);
      total *= arguments[i];
    }
    return total;
  }
}
lightCode.Multiply(12, 16)
You have two problems. One is that you cannot reassign a value to a constant. The second is that you're initializing total = 0; by doing that you'll be multiplying everything by 0.
So to solve your problem you need an if conditional to check whether total is 0: if it is 0, you assign the current argument to total; if not, you multiply.
let lightCode = { //Creates Object.
  Multiply: function() { //Multiplys all arguments.
    let total = 0;
    for (let i = 0; i < arguments.length; i++) {
      if (total === 0) total = parseFloat(arguments[i]);
      else total *= parseFloat(arguments[i]);
    }
    return total;
  }
}
console.log(lightCode.Multiply(12, 16));
I recommend reading about const, let, and var and when to use each. There are (at least) two mistakes in your code:
const total = 0 => since total is declared with the const keyword, its value is meant to stay constant for the rest of the program. But the line total *= arguments[i]; tries to change it, resulting in an error. Also, initializing total with 0 makes the final result 0 (remember that the identity element for multiplication is 1).
const i = 0 => same thing; i++ wants to increment the value of i, but you declared it as const.
Running your code and opening the console, you can clearly see the error message: "Uncaught TypeError: Assignment to constant variable.".
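A two-line snippet reproduces the same error:
const total = 0;
total = 10; // Uncaught TypeError: Assignment to constant variable.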
Cheers!
let lightCode = {
  Multiply: function() {
    var total = 1;
    for (var i = 0; i < arguments.length; i++) {
      console.log(arguments[i]);
      total *= arguments[i];
    }
    return total;
  }
}
lightCode.Multiply(12, 16)
As pointed out in the comments, there is more than one error in the code. First, you are declaring the variables as const, which is not correct in this case; the rule of thumb is to use let every time you need to reassign a variable (meaning, whenever you need to use the = symbol again), otherwise use const. Also, as pointed out in the comments, you should not initialize the total with zero, otherwise the loop will always return zero.
Here is a working snippet:
const lightCode = { //Creates Object.
  Multiply: function() { //Multiplys all arguments.
    let total = 1; // can not be zero, otherwise the loop will always return zero
    for (let i = 0; i < arguments.length; i++) {
      console.log(arguments[i]);
      total *= arguments[i];
    }
    return total;
  }
}
lightCode.Multiply(12, 16)
Notice how I am using const for the lightCode variable, because this object is never reassigned (meaning, you won't use = to assign a new value to it again), while for total I'm using let, because it is reassigned on every loop iteration.

How to find MAX number in unseen array using for loop?

This is the solution to the problem. What I don't understand is why it is not if (i > currentMax). I also don't understand the nature of numbers[i]. I understand we can reference indexes in arrays by doing numbers[0], but numbers[i] is confusing me.
function max(numbers) {
  let currentMax = numbers[0];
  for (let i = 0; i < numbers.length; i++) {
    if (numbers[i] > currentMax) {
      currentMax = numbers[i];
    }
  }
  return currentMax;
}
numbers[i] refers to the value stored at position i. If you were to use if (i > currentMax) you would be comparing an index to a value, which makes no sense here, and you would generally end up returning some later element rather than the maximum.
Don't reinvent the wheel, use Math.max(...numbers).
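For example, spreading an array into Math.max returns its largest element directly:
console.log(Math.max(...[1, 2, 4, 2])); // 4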
Say you have an array like:
[1, 2, 4, 2]
This starts by setting currentMax to numbers[0], which is 1. Then it loops through the array one element at a time. If it finds a larger number during that loop, in other words if (numbers[i] > currentMax), then it sets currentMax to that number instead. For example, this will happen the second and third times through the loop (i = 1 and i = 2), when the elements are 2 and 4. But it won't happen the last time through the loop. An easy way to watch this happen is to print some values to the console as it runs:
function max(numbers) {
  let currentMax = numbers[0];
  for (let i = 0; i < numbers.length; i++) {
    console.log("i:", i, "element:", numbers[i], "max:", currentMax)
    if (numbers[i] > currentMax) {
      currentMax = numbers[i];
      console.log("new currentMax:", currentMax)
    }
  }
  return currentMax;
}
max([1, 2, 4, 2])
Here i is an "index", which allows us to iterate over all the positions in the array (and access their values): i = 0, i = 1, ..., i = numbers.length - 1.
if (numbers[i] > currentMax) asks whether the number stored at position i of the array is greater than the current currentMax value. This guarantees that the maximum number in the provided array is returned.
If you ask if (i > currentMax) you compare the value of the index (i) with the currentMax value, which is incorrect if you want to return the greatest value from an array of numbers.
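To see this concretely, here is a counter-example (badMax is a hypothetical copy of the function with the comparison changed to i > currentMax):
function badMax(numbers) {
  let currentMax = numbers[0];
  for (let i = 0; i < numbers.length; i++) {
    if (i > currentMax) { // comparing an index with a value
      currentMax = numbers[i];
    }
  }
  return currentMax;
}
console.log(badMax([1, 5, 2])); // 2, but the actual maximum is 5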
Like you said, you can reference indexes in arrays by doing numbers[0]. Instead of hard-coding the 0 you can use a variable that holds a number as its value.
function max(numbers) {
  // get the value in the first place in the array
  let currentMax = numbers[0];
  // create a variable called i
  // set it to 0
  // loop through, increasing i each time, for as long as i is less than the length of the array
  // the first time through i = 0
  // the second time through i = 1
  // then i = 2
  // ... repeat until the end
  for (let i = 0; i < numbers.length; i++) {
    // get the value from the array at the i place
    // if it is greater than the current max
    if (numbers[i] > currentMax) {
      // then set current max to it
      currentMax = numbers[i];
    }
  }
  // return current max
  return currentMax;
}

Javascript How To Concatenate Separate Characters Into One String In Array?

I wrote the code for a "Heads or Tails" game below:
var userInput = prompt("Enter maximum number output: ");
function coinFlip() {
  return (Math.floor(Math.random() * 2) === 0) ? 'Heads' : 'Tails';
}
for (var i = 0; i < 6; i++)
{
  var result = [];
  result["randomNum"] = Math.floor(Math.random() * userInput);
  result["coin"] = coinFlip();
}
I'm trying to count the sum of total heads and sum of total tails each with the code:
var headsCount = 0;
var tailsCount = 0;
for (var j = 0; j < result["coin"].length; j++)
{
  if (result["coin"] == 'Heads')
    headsCount++;
  else
    tailsCount++;
}
The only problem is that it's counting each character of 'Heads' and 'Tails' stored in result["coin"] separately (such as 'H'-'e'-'a'-'d'-'s') instead of as one full string (like "Heads"). Thus, instead of incrementing by 1 each time the loop above runs, it increments by 5.
I want it to increment by +1 only.
How do I make it so that the code reads the full string stored in result["coin"] and not character-by-character?
EDITED -- changed the <2 to *2
var result = []; is inside the for loop, so it is being overwritten with an empty array each time. So when you try to loop over the results, there's only one item in it: the last one. Pull the result array out of the loop so that you can add to it in each iteration.
It seems userInput should be the number of times to loop; it's not clear why you're putting a random number in result["randomNum"]. result is an array, which is meant to be indexed with integers rather than named keys.
Instead of adding the result of the coin toss to result["coin"] I think you mean to add it to the array, so after tossing it six times it might look like this: ["Heads", "Heads", "Tails", "Heads", "Tails", "Tails"]. You can do this by calling result.push with the coin toss output.
To get one of two results randomly, compare the output of Math.random() against 0.5, which is half way between the limits. Numbers less than 0.5 can be considered heads, while numbers greater than or equal to 0.5 can be considered tails.
Putting it all together, this is what I think you were going for:
function coinFlip() {
  return Math.random() < 0.5 ? 'Heads' : 'Tails';
}
var result = [];
var userInput = parseInt(prompt("Enter maximum number output: "), 10);
for (var i = 0; i < userInput; i++) {
  result.push(coinFlip());
}
var headsCount = 0;
var tailsCount = 0;
for (var j = 0; j < result.length; j++) {
  if (result[j] == 'Heads')
    headsCount++;
  else
    tailsCount++;
}
console.log(headsCount, "heads and", tailsCount, "tails");
All that being said, there are definitely areas for improvement. You don't need to loop once to build the results, then loop a second time to read the results.
You can count the number of heads/tails as the coins are flipped. For example:
function isCoinFlipHeads() {
  return Math.random() < 0.5;
}
var numFlips = parseInt(prompt("How many flips?"), 10);
var heads = 0;
var tails = 0;
for (var i = 0; i < numFlips; i++) {
  isCoinFlipHeads() ? heads++ : tails++;
}
console.log(heads, "heads and", tails, "tails");

Javascript loop infinitely up till a certain condition

// Contains a list of items in each set.
var sets[0] = [1,2,3,4,5,6,7,8,9],
sets[1] = [10,11,12,13,14,15,16,17,18],
sets[2] = [19,20,21,22,23,25,26,27]
// Contains the mins associated to each set item.
var setTimes[0] = [15,15,15,15,15,15,15,15,15],
setTimes[1] = [16,12,11,15,13,15,15,15,14],
setTimes[2] = [16,12,11,15,13,12,11,15,13]
I've got a set of arrays as given above. The sets array has a data set of values. This array can have n number of items in it. Ex, sets[n].
Each sets array has an equivalent setTimes array that has minutes stored in it. setTimes[0][0] is 15min and is the number of minutes for sets[0][0].
Given a set item (e.g. 12), I'd like to:
Find out which sets array the given number belongs to. In our case, since 12 was the item, it belongs to sets[1].
Once I have this, I'd like to get the sum of all mins from the setTimes array for the current sets index and also the next index. In our case, that would be the sum of setTimes[1] and setTimes[2].
In the event we reach the end of sets array, I'd like to get the sum of the first set array.
For ex,
- if I pass 12, I'll need to get the sum of setTimes[1] and setTimes[2]
- If I pass 23, I'll need to get the sum of setTimes[2] and setTimes[0]
Here is the loop I've been thinking of; I would like to know if there is a better way of doing this.
function computeTotalMin(givenItem)
{
  // represents how many sets to loop thorough. I need 2.
  for (x = 0; x <= 1; x++)
  {
    for (i = 0; i < sets.length; i++)
    {
      // checking to see if i value is on the last index of the sets array.
      if (i === sets.length - 1)
      {
        i = 0;
        var item = sets[i].indexOf(givenItem);
        if (item !== -1)
        {
          // Loops through all mins from setTimes[i] array
          for (j = 0; j < setTimes[i].length; j++)
          {
            var total = total + setTimes[j];
          }
        }
      }
    }
  }
}
You don't need two nested loops to wrap around at the end. You should have a single loop that iterates over the number of sets you're interested in (2) and keeps an index (starting at the set you've found). Inside that loop, you apply a modulo operation to the index to get back to the start when you've reached the end. By looping over the count rather than the (resettable) index, you won't get into an infinite loop.
You should also divide your program into exactly the tasks you described in words (find this, then do that), instead of cramming everything into one huge nested control structure.
function computeTotalMin(givenItem) {
  // find the set that contains givenItem
  var setIndex = 0;
  for (; setIndex < sets.length; setIndex++)
    if (sets[setIndex].indexOf(givenItem) > -1)
      break;
  if (setIndex == sets.length)
    return null; // givenItem found in none of the sets
  var sum = 0;
  for (var count = 0; count < 2; count++) {
    for (var i = 0; i < setTimes[setIndex].length; i++)
      sum += setTimes[setIndex][i];
    setIndex++; // go to next set
    setIndex %= sets.length; // which might be 0
    // alternatively: if (setIndex == sets.length) setIndex = 0;
  }
  return sum;
}
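A quick usage sketch, assuming the data from the question is declared as ordinary nested array literals (the var sets[0] = ... syntax in the question is not valid JavaScript):
var sets = [
  [1, 2, 3, 4, 5, 6, 7, 8, 9],
  [10, 11, 12, 13, 14, 15, 16, 17, 18],
  [19, 20, 21, 22, 23, 25, 26, 27]
];
var setTimes = [
  [15, 15, 15, 15, 15, 15, 15, 15, 15],
  [16, 12, 11, 15, 13, 15, 15, 15, 14],
  [16, 12, 11, 15, 13, 12, 11, 15, 13]
];
console.log(computeTotalMin(12)); // 244: sum of setTimes[1] (126) and setTimes[2] (118)
console.log(computeTotalMin(23)); // 253: sum of setTimes[2] (118) and setTimes[0] (135)
console.log(computeTotalMin(99)); // null: 99 is not in any set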
