Let's say we have an array of 200,000 elements, for example.
Now we want to iterate over it in different ways and check which is fastest. I've heard that saving array.length in a variable before the loop reduces execution time, so I tried the code below
let sum = 0
for (let i = 0; i < arr.length; ++i) sum += arr[i]
against
let sum = 0
for (let i = 0, l = arr.length; i < l; ++i) sum += arr[i]
But I got the same result, as if in both cases JS reads the length value just once at the very beginning.
Then I decided to check what happens if we change the array during the loop by removing the last element.
let sum = 0
for (let i = 0; i < arr.length; ++i) {
sum += arr[i]
if (i === 100) arr.pop()
}
against
let sum = 0
for (let i = 0, l = arr.length; i < l; ++i) {
sum += arr[i]
if (i === 100) arr.pop()
}
So I expected the second case to now be faster, because in the first case JS inevitably has to check array.length each time, and I was quite surprised that it is not faster but even slower, by 10 to 15%. For me this is inexplicable. Any ideas?
Tests: https://jsbench.me/tfkefwjuw2
The problem is that
let sum = 0
for (let i = 0, l = arr.length; i < l; ++i) {
sum += arr[i]
if (i === 100) arr.pop()
}
is now incorrect, as it loops beyond the end of the array. The sum will be NaN in the end. The correct solution would have l = arr.length - 1 so that this doesn't happen.
Now why does it become so slow? Because array accesses in JavaScript are only fast (compiled to a fast path with pointer addressing) when the element exists. When you miss, the code gets de-optimised. And since JSBench runs the same code multiple times, even if the deoptimisation happens only on the last of the 200,000 iterations, the subsequent runs will be much slower.
See this talk for details, which even explicitly spells out "Avoid out-of-bounds reads" on one slide. In general, don't use non-idiomatic code to improve performance. i < arr.length is idiomatic, and JS engines will optimise it well.
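For completeness, if you really wanted to cache the length in this benchmark, a version that stays in bounds could look like the following minimal sketch (the - 1 accounts for the single pop() at i === 100, as noted above):
let sum = 0
// Cache the length, but shrink it by one up front, because exactly one
// element is popped during the loop; this way arr[i] never reads past the end.
for (let i = 0, l = arr.length - 1; i < l; ++i) {
  sum += arr[i]
  if (i === 100) arr.pop()
}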
Related
How would I get a standard for loop to output in pairs or other groups (like threes or fours), with the output shifting up by one after the last digit of the group?
for (var i = 0; i < 8; i++) {
  console.log(i)
}
so instead of the output being; 0,1,2,3,4,5,6,7
In pairs it would be; 0,1,1,2,2,3,3,4
or if it went up in groups of four; 0,1,2,3,1,2,3,4
I did try doing something like this, but instead of going up in twos every time, I need the loop to output the first two digits, move up one, then output the next two, etc.
for (var i = 0; i < 8; i += 2) {
  console.log(i)
}
Hope that makes sense
For each case you would need to come up with the right formula based on i:
so instead of the output being; 0,1,2,3,4,5,6,7 In pairs it would be; 0,1,1,2,2,3,3,4
for (let i = 1; i < 9; i++) {
  console.log(i >> 1); // this bit shift is integer division by 2
}
or if it went up in groups of four; 0,1,2,3,1,2,3,4
for (let i = 0; i < 8; i++) {
  // Perform division by 4 and add remainder to that integer quotient
  console.log((i >> 2) + (i % 4));
}
You could work with a variable inside the loop to determine the index. This way you can specify how many times you want the loop to run:
for (let index = 0; index < 8; index++) {
  const currentIndex = index - (index >> 1);
  console.log(currentIndex);
}
It also makes it easy to implement it as immutable:
const array = new Array(8).fill(0).map((entry, index) => index - (index >> 1));
console.log(array);
I think a function like the one below, where we specify the total n and the chunk size after which we want to increase by a single step, might work:
function getByChunkSteps(n, chunkSize) {
  let step = -1;
  let output = [];
  for (let index = 0; index < n; index++) {
    if (index % chunkSize === 0) {
      step += 1;
    }
    output.push((index % chunkSize) + step);
  }
  return output;
}
console.log(getByChunkSteps(10,2));
console.log(getByChunkSteps(8,4));
console.log(getByChunkSteps(9,3));
Building a sparsely-connected neural network from scratch in Javascript.
I have a 2d array. Each object in the second dimension (array[x][y]) holds 2 arrays, starts:[] and ends:[]. Each object in these arrays holds value and weight. Ends are incoming connections. Starts are outgoing connections. So, when a "synapse" is created, it is stored in the starts array of the pre-synaptic neuron and the ends array of the post-synaptic neuron. (Neurons aren't actual objects, just cells in the array).
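Before showing the code, here is a minimal sketch (the connect helper is hypothetical, just to illustrate the scheme) of how a single synapse ends up shared between the two cells:
// Illustrative only: pre and post are cells like array[x][y],
// each holding starts:[] and ends:[] arrays.
function connect(pre, post, weight) {
  var synapse = { value: 0, weight: weight };
  pre.starts.push(synapse);  // outgoing connection of the pre-synaptic cell
  post.ends.push(synapse);   // incoming connection of the post-synaptic cell
  return synapse;            // the same object sits in both arrays
}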
My current predict function:
predict(array, values) {
  var result = [];
  if (values.length <= array[0].length) {
    // Load the input values onto the outgoing connections of the first layer.
    for (var a = 0; a < values.length; a++) {
      for (var b = 0; b < array[0][a].starts.length; b++) {
        array[0][a].starts[b].value = values[a];
      }
    }
    // Propagate: for each cell, sum its weighted incoming values and
    // write sigmoid(sum) onto its outgoing connections.
    for (var c = 0; c < array.length; c++) {
      for (var d = 0; d < array[c].length; d++) {
        var sum = 0;
        for (var e = 0; e < array[c][d].ends.length; e++) {
          sum += array[c][d].ends[e].weight * array[c][d].ends[e].value;
        }
        for (var f = 0; f < array[c][d].starts.length; f++) {
          array[c][d].starts[f].value = util.sigmoid(sum);
        }
      }
    }
    // Collect the outputs from the last layer.
    for (var g = 0; g < array[array.length - 1].length; g++) {
      var sum = 0;
      for (var h = 0; h < array[array.length - 1][g].ends.length; h++) {
        sum += util.sigmoid(array[array.length - 1][g].ends[h].weight * array[array.length - 1][g].ends[h].value);
      }
      result.push(util.sigmoid(sum));
    }
  } else {
    log.add("### PREDICT ### Not enough input neurons. Afferent growth signal.");
  }
  return result;
}
This works fine if I'm looking for a boolean output (0 or 1). I, however, have multiple outputs and I only want to back-propagate through the chain of weights that is relevant to each output. I'm aware that the same weight will appear in multiple chains - this is something that I want to test. I've read somewhere that actual back-propagation is inefficient, but I think that it's required here.
I would like to avoid looping through the entire network again to find the chain if possible. Also, if you see anything wrong with my predict function, please let me know!
For some context, I've made a little game. The game has a player box and a target box. I would like to input playerX, playerY, targetX, and targetY. Outputs are Down, Left, Up, and Right (directions the player can move). So, if input is 1, 1, 5, 5 ... and the "Up" output node is "heaviest", I want to correct the weights in the chain that led to the Up output and reinforce the weights in the chain that led to the Down output.
I need help with the following problems on determining what the Big O is of each function.
For problem one, I've tried O(log(n)) and O(n). I figured the function was linear, or in other words, that for N elements we require N iterations.
For problem two, I've tried O(n^2). I figured for this kind of order, the worst case time (iterations) is the square of the number of inputs. The time grows quadratically with the number of inputs.
For problem three, I've tried O(n^2) and O(1).
Problem One:
function foo(array){
  let sum = 0;
  let product = 1;
  for (let i = 0; i < array.length; i++){
    sum += array[i];
  }
  for (let i = 0; i < array.length; i++){
    product *= array[i];
  }
  console.log(sum + ", " + product);
}
Problem Two:
function printUnorderedPairs(array){
  for(let i = 0; i < array.length; i++){
    for(let j = i + j; j < array.length; j++){
      console.log(array[i] + ", " + array[j]);
    }
  }
}
Problem Three:
function printUnorderedPairs(arrayA, arrayB){
  for(let i = 0; i < arrayA.length; i++){
    for(let j = 0; i < arrayB.length; j++){
      for(let k = 0; k < 100000; k++){
        console.log(arrayA[i] + ", " + arrayB[j]);
      }
    }
  }
}
I expected my initial thoughts to be right, but maybe I'm having a hard time grasping Big O.
You're correct that it's O(n). You have two loops, they each perform array.length iterations. You could even combine them into a single loop to make it more obvious.
for (let i = 0; i < array.length; i++) {
  sum += array[i];
  product *= array[i];
}
You're correct, it's O(n^2). The nested loops perform array.length * array.length iterations.
EDIT -- see my comment above asking whether this problem is copied correctly.
This is also O(n^2). The third level of nested loop doesn't change the complexity, because it performs a fixed number of iterations. Since this doesn't depend on the size of the input, it's treated as a constant. So as far as Big-O is concerned, this is equivalent to Problem 2.
Well, you kind of answered your questions, but here we go:
In the first problem, you have two for loops, each of them iterating over the entire array. For a general array of size n, you'll have O(2n), or simply O(n), since we can drop constants. There isn't any reason this would be O(log(n)).
For the second one, I think there is a mistake. The statement let j = i + j is not valid; it throws a ReferenceError because j is referenced in its own initializer, before it has been initialized. However, let's say the statement is actually let j = i. Then, we have:
i, iterating over the entire array, starting from the first element and going all the way to the last one
j, starting from i and going all the way to the last element
With this information, we know that for i = 0, j will iterate from 0 to n (n being the array's length), so n steps. For i = 1, j will go from 1 to n, so n - 1 steps. Generalizing, we are going to have a sum: n + (n - 1) + (n - 2) + ... + 1 + 0 = n * (n + 1) / 2 = 1/2 * (n^2 + n). So, the complexity is O(1/2 * (n^2 + n)) = O(n^2 + n) = O(n^2). So you were correct.
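If you want to convince yourself of that closed form, here is a small sketch (assuming the inner loop was indeed meant to start at let j = i) that simply counts iterations and compares against n * (n + 1) / 2:
// Count how many times the inner body would run for an array of length n.
function countPairIterations(n) {
  let iterations = 0;
  for (let i = 0; i < n; i++) {
    for (let j = i; j < n; j++) {
      iterations++;
    }
  }
  return iterations;
}
console.log(countPairIterations(10)); // 55
console.log(10 * (10 + 1) / 2);       // 55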
For the third problem, the answer is O(n^2) and not O(1). The reasoning is very close to the one I made for the second one. Basically, the inner k loop will be executed 100000 times for every j iteration, but that number of iterations does not depend on n (the size of the array).
It's easy to see that:
for i = 0, j will go from 0 to n (the last value for which the j body is executed being j = n - 1).
for j = 0, we will do 100k iterations
for j = 1, another 100k iterations
...
for j = n - 1, another 100k iterations
The entire j loop will make n * 100000 = 100000n iterations.
For i = 1, the same behaviour:
for j = 0, we will do 100k iterations
for j = 1, another 100k iterations
...
for j = n - 1, another 100k iterations,
getting another 100000n iterations.
In the end, we end up with 100000n + 100000n + ... + 100000n (n times) = n * 100000n = 100000 * n^2. So, the big O is O(100000 * n^2) = O(n^2).
Cheers!
I have a simple running game where platforms scroll from right to left. These platforms are stored in an array, and when they are off screen I use array.splice(index, 1) to remove them. This, however, causes a slight lag at the exact moment splice is called. I have also used Array.shift(), but I can still see the slight dip in frame rate when it is called. Is there a better way to remove/destroy something?
for (var x = 0; x < platforms.length; x++) {
  var platform = platforms[x];
  platform.x -= 10;
  if (platform.x + platform.width < 0) {
    platforms.shift();
  }
}
You could just not remove items from the array.
Use a fixed size array instead (large enough to store all the items you need) and flag as canceled the items you don't want to render anymore; you can reuse those flagged slots later.
Or you could directly overwrite the elements if the domain of your problem allows it.
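A minimal sketch of that flag-and-reuse idea, using the platform fields from the question (the pool size and the active flag are assumptions, not part of any library):
// Pre-allocate a pool of platform objects once, outside the game loop.
var platforms = new Array(32);
for (var i = 0; i < platforms.length; i++) {
  platforms[i] = { x: 0, width: 0, active: false };
}

function updatePlatforms() {
  for (var x = 0; x < platforms.length; x++) {
    var platform = platforms[x];
    if (!platform.active) continue;  // skip free slots; nothing is ever removed
    platform.x -= 10;
    if (platform.x + platform.width < 0) {
      platform.active = false;       // mark the slot reusable instead of splicing
    }
  }
}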
[EDIT]
Some more considerations in response to the comments:
Evaluating an additional flag is a computation that is stable in time, meaning you can foresee how long it will take and check whether it fits into the frame budget for your target frame rate. Array.splice, on the other hand, could trigger garbage collection, and that can be orders of magnitude slower than other language control flow constructs.
Garbage collection is an intensive task and should be avoided at all costs in the main loop of a game to achieve a fluid framerate. There are some resources and other questions here on SO which elaborate on this point, for example: http://buildnewgames.com/garbage-collector-friendly-code/
shift and splice are "slow" functions. They may rebuild your whole array.
Imagine an array with 1000 items. A shift can amount to 'create a new array with all items except the first'. If your first 100 items each trigger a shift, you rebuild 100 arrays of 900-1000 items, which results in about 100,000 inserts and 100 new array allocations.
for (var i = 0; i < array.length; i++)
{
  if (array[i] == ....)
  {
    var newArray = new Array(array.length - 1);
    for (var o = 1; o < array.length; o++)
      newArray[o - 1] = array[o];
    array = newArray;
  }
}
worst case scenario, with a length of 1000, this will result in:
for ( i = 0 to 1000 )
  for ( o = i to 1000 )
    recreate the array
so that would be 0.5 million iterations and recreations of the array. This could be fixed with either one pass (dynamically sized array) or two passes (fixed size array):
// dynamic size
var newArray = new Array();
for (var i = 0; i < array.length; i++)
{
  if (array[i] != ....)
    newArray.push(array[i]);
}
array = newArray;
// fixed size
var cnt = 0;
for (var i = 0; i < array.length; i++)
  if (array[i] != ....)
    cnt++;
var newArray = new Array(cnt);
for (var i = 0, o = 0; i < array.length; i++)
  if (array[i] != ....)
    newArray[o++] = array[i];
array = newArray;
And another simple optimization for your for loops (which obviously won't work if you modify the array inside the loop):
for(var i = 0, l = array.length; i < l; i++)
(Yes, I am aware that some numbers may be off, but it gets the point across.)
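As a side note, the single-pass rebuild can also be written with the built-in Array.prototype.filter, which allocates one new array per cleanup instead of one per removed element (isOnScreen here is a hypothetical predicate standing in for whatever your removal condition is):
// Keep only the platforms that are still visible; one allocation, one pass.
function isOnScreen(platform) {
  return platform.x + platform.width >= 0;
}
platforms = platforms.filter(isOnScreen);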
I am trying to learn Javascript. I've built the following code to find the average from an array of numbers. It works except the last returned value is always NaN. I cannot figure out why. If I move this piece outside of the block it seems to forget altogether what the variable sum is supposed to be equal to. Is there some kind of global-variable type equivalent I'm supposed to be using for JS?
var average = function(myarray) {
  sum = 0;
  for (counter = 0; counter <= myarray.length; counter++) {
    sum = sum + myarray[counter];
    average = sum / myarray.length;
    console.log(average);
  };
}
average([1, 2, 3])
Change
counter <= myarray.length
to
counter < myarray.length
because indexes start at 0.
Full example:
var average = function(myarray) {
  var sum = 0;
  for (var counter = 0; counter < myarray.length; counter++) {
    sum += myarray[counter];
  }
  return sum / myarray.length;
}
console.log(average([1,2,3]));
JSBin Demo: http://jsbin.com/siyugi/1/edit
myarray[myarray.length] is undefined, which poisons your computation with NaN (Not a Number).
Just change it to
for (counter = 0; counter < myarray.length; counter++) {
  // ...
}
Since you are just learning, you should know that it is often considered good practice not to read .length in the for condition like that, because the loop then has to check the length of your array on every iteration. And remember that .length returns the number of elements in the array, but array indexes start at 0.
for(var counter = 0, length = myarray.length; counter < length; counter++){
}
Would be the proper way to do it.
Don't use variables without declaring them with the var keyword; otherwise they become global properties.
JavaScript arrays are zero-indexed. So, if the size of the array is 3, the first element is accessed with index 0 and the last with index 2. JavaScript is very forgiving, so when you access an element at an invalid index in the array, it simply returns undefined.
In the iteration, you are replacing the current function object with the average value. So, subsequent calls to average will fail, since average is no longer a function object (see the sketch after these points).
It is a good practice to have a function return the computed value, instead of printing the value, so that it will not violate the Single Responsibility Principle.
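To illustrate the point about average being overwritten, here is a quick sketch of what happens if you call your original function twice (the second call is commented out because it would throw):
var average = function(myarray) {
  sum = 0;
  for (counter = 0; counter <= myarray.length; counter++) {
    sum = sum + myarray[counter];
    average = sum / myarray.length; // overwrites the outer `average` variable
    console.log(average);
  }
}

average([1, 2, 3]);    // logs the running averages, ending with NaN
// average([4, 5, 6]); // would now throw: TypeError: average is not a function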
In your case,
for (counter = 0; counter <= myarray.length; counter++) {
The counter runs up to the last index of the array + 1. Since myarray[counter] is undefined in that last iteration, the addition produces NaN.
console.log(1 + undefined); // NaN
So, you need to change the code, like this
function Average(myarray) {
  var sum = 0, counter;
  for (counter = 0; counter < myarray.length; counter++) {
    sum = sum + myarray[counter];
  }
  return sum / myarray.length;
}
If you are interested, you can compute the sum with Array.prototype.forEach, like this
function Average(myarray) {
  var sum = 0;
  myarray.forEach(function(currentNumber) {
    sum += currentNumber;
  });
  return sum / myarray.length;
}
Even better, you can calculate the sum with Array.prototype.reduce, like this
function Average(myarray) {
  return myarray.reduce(function(sum, currentNumber) {
    return sum + currentNumber;
  }, 0) / myarray.length;
}
You can calculate the average of an array of numbers as follows:
var avg = c => c.reduce((a, b) => a + b) / c.length;
avg([1, 2, 3])