Sizzle push apply - javascript

Why would the Sizzle selector engine use push.apply( results, ... ) over results.push( ... )? It seems unnecessary to me. Can someone explain the motivation?
To elaborate, I've become interested in writing a lighter-weight selector engine, borrowing bits from Sizzle. I figure I don't need some things, like :contains(text), which would reduce the weight even further. So reading through the source I see
var arr = [],
    push = arr.push;
results = results || [];
....
push.apply( results, context.getElementsByTagName( selector ) );
The code makes sense, except wouldn't it be simpler to use
results.push( context.getElementsByTagName( selector ) );
I don't intend to nag about such a minor convention; I just want to know if I'm missing something, like a context issue.

It is used instead of:
results.concat(array)
because concat creates an extra array, while push.apply won't:
push.apply(results, array)
The results array is cached and no extra arrays are created.
But you could also do:
results.push.apply(results, array)
I'm not sure why the extra arr is needed.
Edit:
I'm thinking the need for the extra arr might be to convert the pseudo-array that getElementsByTagName returns into a real array.
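To make the difference concrete, here is a small sketch of my own (not Sizzle's code): push.apply appends the elements of an array-like in place, while concat returns a new array.
var results = [1, 2];
var extra = [3, 4];               // stands in for an array-like NodeList
var push = [].push;               // same idea as caching arr.push
push.apply(results, extra);       // appends in place -> results is now [1, 2, 3, 4]
var copy = results.concat(extra); // returns a NEW array; results itself is unchanged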

Looking over the code again (after taking a break): around line 205, Sizzle checks whether the selector pattern is an ID and uses results.push:
elem = context.getElementById( m );
results.push( elem );
return results;
Line 237 onwards handles elements or classes and uses getElementsByTagName or getElementsByClassName along with push.apply( results, ... ).
I assume it's a shorthand version of something like
var nodes = context.getElementsByClassName( m );
for ( var i = 0; i < nodes.length; i++ ) {
    results.push( nodes[ i ] );
}
as in the Mozilla docs example: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Function/apply
// shorthand
var max = Math.max.apply(null, numbers);
var min = Math.min.apply(null, numbers);
/* vs. simple loop-based algorithm */
max = -Infinity, min = +Infinity;
for (var i = 0; i < numbers.length; i++) {
    if (numbers[i] > max)
        max = numbers[i];
    if (numbers[i] < min)
        min = numbers[i];
}
EDIT:
From my original question, results.push( context.getElementsByTagName( selector ) ) would result in an unwanted nested object: it pushes its one argument, the NodeList itself, into results as a single element.
Example:
var a = [1, 2, 3], b = [], c = [];
b.push( a );           // b.length = 1, we now have a multidimensional array
[].push.apply( c, a ); // c.length = 3, the elements were copied individually into a flat array
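For what it's worth, the same flattening of an array-like can also be done with the classic slice.call conversion idiom; this is a sketch of standard idioms, not code from Sizzle:
var nodes = document.getElementsByTagName('div');
var viaApply = [];
[].push.apply(viaApply, nodes);                   // appends each node individually
var viaSlice = Array.prototype.slice.call(nodes); // converts the NodeList into a real array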

Related

Javascript Set vs. Array performance

It may be because Sets are relatively new to Javascript but I haven't been able to find an article, on StackO or anywhere else, that talks about the performance difference between the two in Javascript. So, what is the difference, in terms of performance, between the two? Specifically, when it comes to removing, adding and iterating.
Ok, I have tested adding, iterating and removing elements from both an array and a set. I ran a "small" test, using 10 000 elements and a "big" test, using 100 000 elements. Here are the results.
Adding elements to a collection
It would seem that the .push array method is about 4 times faster than the .add set method, no matter the number of elements being added.
Iterating over and modifying elements in a collection
For this part of the test I used a for loop to iterate over the array and a for...of loop to iterate over the set. Again, iterating over the array was faster, and this time the gap grew with size: iterating the set took twice as long during the "small" tests and almost four times longer during the "big" tests.
Removing elements from a collection
Now this is where it gets interesting. I used a combination of a for loop and .splice to remove some elements from the array and I used for of and .delete to remove some elements from the set. For the "small" tests, it was about three times faster to remove items from the set (2.6 ms vs 7.1 ms) but things changed drastically for the "big" test where it took 1955.1 ms to remove items from the array while it only took 83.6 ms to remove them from the set, 23 times faster.
Conclusions
At 10k elements, both tests ran in comparable times (array: 16.6 ms, set: 20.7 ms), but when dealing with 100k elements the set was the clear winner (array: 1974.8 ms, set: 83.6 ms), though only because of the removal operation; otherwise the array was faster. I couldn't say exactly why that is.
I played around with some hybrid scenarios where an array was created and populated, then converted into a set where some elements were removed, and the set was then converted back into an array. Although doing this gives much better performance than removing the elements from the array directly, the additional processing time needed to transfer to and from a set outweighs the gains of populating an array instead of a set. In the end, it is faster to deal only with a set. Still, it is an interesting idea: if you use an array as the data collection for some big, duplicate-free data and ever need to remove many elements in one operation, it could be advantageous performance-wise to convert the array to a set, perform the removal, and convert the set back to an array.
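For reference, a minimal sketch of that hybrid idea, reusing personArray from the Array test code below (my own code, not part of the original test):
var hybridSet = new Set(personArray);       // convert the populated array into a set
for (var person of hybridSet) {
    if (person.sex === "Male") hybridSet.delete(person); // bulk removal on the set
}
personArray = Array.from(hybridSet);        // convert back (note: this creates a new array reference)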
Array code:
var timer = function(name) {
    var start = new Date();
    return {
        stop: function() {
            var end = new Date();
            var time = end.getTime() - start.getTime();
            console.log('Timer:', name, 'finished in', time, 'ms');
        }
    };
};
var getRandom = function(min, max) {
    return Math.random() * (max - min) + min;
};
var lastNames = ['SMITH', 'JOHNSON', 'WILLIAMS', 'JONES', 'BROWN', 'DAVIS', 'MILLER', 'WILSON', 'MOORE', 'TAYLOR', 'ANDERSON', 'THOMAS'];
var genLastName = function() {
    var index = Math.round(getRandom(0, lastNames.length - 1));
    return lastNames[index];
};
var sex = ["Male", "Female"];
var genSex = function() {
    var index = Math.round(getRandom(0, sex.length - 1));
    return sex[index];
};
var Person = function() {
    this.name = genLastName();
    this.age = Math.round(getRandom(0, 100));
    this.sex = "Male";
};
var genPersons = function() {
    for (var i = 0; i < 100000; i++)
        personArray.push(new Person());
};
var changeSex = function() {
    for (var i = 0; i < personArray.length; i++) {
        personArray[i].sex = genSex();
    }
};
var deleteMale = function() {
    for (var i = 0; i < personArray.length; i++) {
        if (personArray[i].sex === "Male") {
            personArray.splice(i, 1);
            i--;
        }
    }
};
var t = timer("Array");
var personArray = [];
genPersons();
changeSex();
deleteMale();
t.stop();
console.log("Done! There are " + personArray.length + " persons.");
Set code:
var timer = function(name) {
    var start = new Date();
    return {
        stop: function() {
            var end = new Date();
            var time = end.getTime() - start.getTime();
            console.log('Timer:', name, 'finished in', time, 'ms');
        }
    };
};
var getRandom = function(min, max) {
    return Math.random() * (max - min) + min;
};
var lastNames = ['SMITH', 'JOHNSON', 'WILLIAMS', 'JONES', 'BROWN', 'DAVIS', 'MILLER', 'WILSON', 'MOORE', 'TAYLOR', 'ANDERSON', 'THOMAS'];
var genLastName = function() {
    var index = Math.round(getRandom(0, lastNames.length - 1));
    return lastNames[index];
};
var sex = ["Male", "Female"];
var genSex = function() {
    var index = Math.round(getRandom(0, sex.length - 1));
    return sex[index];
};
var Person = function() {
    this.name = genLastName();
    this.age = Math.round(getRandom(0, 100));
    this.sex = "Male";
};
var genPersons = function() {
    for (var i = 0; i < 100000; i++)
        personSet.add(new Person());
};
var changeSex = function() {
    for (var key of personSet) {
        key.sex = genSex();
    }
};
var deleteMale = function() {
    for (var key of personSet) {
        if (key.sex === "Male") {
            personSet.delete(key);
        }
    }
};
var t = timer("Set");
var personSet = new Set();
genPersons();
changeSex();
deleteMale();
t.stop();
console.log("Done! There are " + personSet.size + " persons.");
OBSERVATIONS:
Set operations can be understood as snapshots within the execution stream.
We are not looking at a definitive substitute for Array.
The elements of a Set have no accessible indexes.
Set is a complement to Array, useful in scenarios where we need to store a collection on which to apply basic addition, deletion, membership-checking and iteration operations.
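To make that last observation concrete, here is a quick sketch of those basic operations on a Set (standard API, my own example):
var ids = new Set();
ids.add(1);               // addition
ids.add(1);               // duplicates are ignored, size stays 1
console.log(ids.has(1));  // checking -> true
ids.delete(1);            // deletion
for (var id of ids) {     // iteration, in insertion order
    console.log(id);
}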
I'm sharing some performance tests. Open your console and copy and paste the code below.
Creating an array of 125,000 elements
var n = 125000;
var arr = Array.apply( null, Array( n ) ).map( ( x, i ) => i );
console.info( arr.length ); // 125000
1. Locating an Index
We compared the has method of Set with Array indexOf:
Array/indexOf (0.281ms) | Set/has (0.053ms)
// Helpers
var checkArr = ( arr, item ) => arr.indexOf( item ) !== -1;
var checkSet = ( set, item ) => set.has( item );
// Vars
var set, result;
console.time( 'timeTest' );
result = checkArr( arr, 123123 );
console.timeEnd( 'timeTest' );
set = new Set( arr );
console.time( 'timeTest' );
checkSet( set, 123123 );
console.timeEnd( 'timeTest' );
2. Adding a new element
We compare the add and push methods of the Set and Array objects respectively:
Array/push (1.612ms) | Set/add (0.006ms)
console.time( 'timeTest' );
arr.push( n + 1 );
console.timeEnd( 'timeTest' );
set = new Set( arr );
console.time( 'timeTest' );
set.add( n + 1 );
console.timeEnd( 'timeTest' );
console.info( arr.length ); // 125001
console.info( set.size ); // 125001
3. Deleting an element
When deleting elements, we have to keep in mind that Array and Set do not start under equal conditions. Array does not have a native delete-by-value method, so an external helper is necessary.
Array/deleteFromArr (0.356ms) | Set/remove (0.019ms)
var deleteFromArr = ( arr, item ) => {
var i = arr.indexOf( item );
i !== -1 && arr.splice( i, 1 );
};
console.time( 'timeTest' );
deleteFromArr( arr, 123123 );
console.timeEnd( 'timeTest' );
set = new Set( arr );
console.time( 'timeTest' );
set.delete( 123123 );
console.timeEnd( 'timeTest' );
Read the full article here
Just property lookup, with little or zero writes
If property lookup is your main concern, here are some numbers.
JSBench tests https://jsbench.me/3pkjlwzhbr/1
// https://jsbench.me/3pkjlwzhbr/1
// https://docs.google.com/spreadsheets/d/1WucECh5uHlKGCCGYvEKn6ORrQ_9RS6BubO208nXkozk/edit?usp=sharing
// JSBench forked from https://jsbench.me/irkhdxnoqa/2
var theArr = Array.from({ length: 10000 }, (_, el) => el)
var theSet = new Set(theArr)
var theObject = Object.assign({}, ...theArr.map(num => ({ [num]: true })))
var theMap = new Map(theArr.map(num => [num, true]))
var theTarget = 9000
// Array
function isTargetThereFor(arr, target) {
    const len = arr.length
    for (let i = 0; i < len; i++) {
        if (arr[i] === target) {
            return true
        }
    }
    return false
}
function isTargetThereForReverse(arr, target) {
    const len = arr.length
    // start at the last valid index (len - 1) and include index 0
    for (let i = len - 1; i >= 0; i--) {
        if (arr[i] === target) {
            return true
        }
    }
    return false
}
function isTargetThereIncludes(arr, target) {
    return arr.includes(target)
}
// Set
function isTargetThereSet(numberSet, target) {
    return numberSet.has(target)
}
// Object
function isTargetThereHasOwnProperty(obj, target) {
    return obj.hasOwnProperty(target)
}
function isTargetThereIn(obj, target) {
    return target in obj
}
function isTargetThereSelectKey(obj, target) {
    return obj[target]
}
// Map
function isTargetThereMap(numberMap, target) {
    return numberMap.has(target)
}
Array
for loop
for loop (reversed)
array.includes(target)
Set
set.has(target)
Object
obj.hasOwnProperty(target)
target in obj <- 1.29% slower
obj[target] <- fastest
Map
map.has(target) <- 2.94% slower
Results from January 2021, Chrome 87
Results from other browsers are most welcome, please update this answer.
You can use this spreadsheet to make a nice screenshot.
JSBench test forked from Zargold's answer.
For the iteration part of your question, I recently ran this test and found that Set much outperformed an Array of 10,000 items (around 10x as many operations could happen in the same timeframe), and depending on the browser it either beat or lost to Object.hasOwnProperty in a like-for-like test. Another interesting point is that Objects do not have officially guaranteed order, whereas Set in JavaScript is implemented as an OrderedSet and does maintain the order of insertion.
Both Set and Object have their "has" method performing in what seems to be amortized O(1) time, but depending on the browser's implementation a single operation could be faster or slower. It seems that most browsers implement key in Object faster than Set.has(). Even Object.hasOwnProperty, which includes an additional check on the key, is about 5% faster than Set.has(), at least for me on Chrome v86.
https://jsperf.com/set-has-vs-object-hasownproperty-vs-array-includes/1
Update: 11/11/2020: https://jsbench.me/irkhdxnoqa/2
In case you want to run your own tests with different browsers/environments.
At the time this test ran, Chrome's V8 had clearly only optimized for Objects. The following is a snapshot for Chrome v86 in November 2020.
For loop: 104167.14 ops/s ± 0.22% (slowest)
Array.includes: 111524.8 ops/s ± 0.24%, 1.07x more ops/s than the for loop (9k iterations for both)
For loop reversed: 218074.48 ops/s ± 0.59%, 1.96x more ops/s than non-reversed Array.includes (9k iterations)
Set.has: 154744804.61 ops/s ± 1.88%, 709.6x more ops/s than the reversed for loop (only 1k iterations since the target is on the right side)
hasOwnProperty: 161399953.02 ops/s ± 1.81%, 1.043x more ops/s than Set.has
key in myObject: 883055194.54 ops/s ± 2.08%, about 5x more ops/s than myObject.hasOwnProperty
Update 11/10/2022: I re-ran the same tests (two years after my original snapshot) on Safari and Chrome today and had some interesting results. TL;DR: Set is equally fast, if not faster, than using key in Object, and way faster than using Object.hasOwnProperty, in both browsers. Chrome has also somehow dramatically optimized Array.includes, to the extent that it is in the same realm of speed as Object/Set lookup (whereas for loops take 1000+x longer to complete).
In Safari, Set is significantly faster than key in Object, and Object.hasOwnProperty is barely in the same realm of speed. All array variants (for loops/includes) are, as expected, dramatically slower than set/object lookups.
Snapshot 11/10/2022, tested on Safari v16.1. Operations per second (higher = faster):
mySet.has(key): 1,550,924,292.31
key in myObject: 942,192,599.63 (39.25% slower; i.e. with Set you can perform around 1.6x more operations per second)
myObject.hasOwnProperty(key): 21,363,224.51 (98.62% slower; i.e. you can perform about 72.6x more Set.has operations than hasOwnProperty checks in one second)
Reverse for loop: 619,876.17 ops/s (the target is 9,000 out of 10,000, so the reverse for loop iterates only about 1,000 times instead of 9,000), meaning you can do about 2,502x more Set lookups than for-loop checks even when the item's position favours the loop.
For loop: 137,434 ops/s; as expected this is even slower, but surprisingly not by much: the reverse for loop, which involves 1/9th of the iterations, is only about 4.5x faster.
Array.includes(target): 111,076 ops/s, a bit slower still than the for loop that manually checks for the target; you can perform about 1.23x manual checks for each includes check.
On Chrome v107.0.5304.87 (11/10/2022): it is no longer true that Set significantly underperforms Object; they now nearly tie. (The expected behavior is that Set would outperform Object, given the smaller range of possibilities in a Set versus an Object, and that is indeed the behavior in Safari.) Notably, Array.includes has apparently been significantly optimized in Chrome (V8), at least for this type of test:
Object in finished 792894327.81 ops/s ± 2.51% (fastest)
Set.prototype.has finished 790523864.11 ops/s ± 2.22% (fastest)
Array.prototype.includes finished 679373215.29 ops/s ± 1.82% (14.32% slower)
Object.hasOwnProperty finished 154217006.71 ops/s ± 1.31% (80.55% slower)
for loop finished 103015.26 ops/s ± 0.98% (99.99% slower)
My observation is that a Set is always better, with two pitfalls for large arrays in mind:
a) Creating a Set from an Array should be done in a for loop with a precached length.
Slow (e.g. 18 ms): new Set(largeArray)
Fast (e.g. 6 ms):
const SET = new Set();
const L = largeArray.length;
for(var i = 0; i<L; i++) { SET.add(largeArray[i]) }
b) Iterating can be done in the same way, because it is also faster than a for...of loop.
See https://jsfiddle.net/0j2gkae7/5/ for a real-life comparison to difference(), intersection(), union() and uniq() (plus their iteratee companions, etc.) with 40,000 elements.
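As a rough illustration of the kind of helpers being compared there, here is my own sketch of Set-based versions (not the code from the linked fiddle):
function difference(a, b) {      // items of a that are not in b
    var bSet = new Set(b);
    return a.filter(function (x) { return !bSet.has(x); });
}
function intersection(a, b) {    // items present in both a and b
    var bSet = new Set(b);
    return a.filter(function (x) { return bSet.has(x); });
}
function uniq(a) {               // de-duplicate while keeping insertion order
    return Array.from(new Set(a));
}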
Let's consider the case where you want to maintain a set of unique values. Using a Set:
set.add(value);
and with an array:
if (arr.indexOf(value) === -1)
arr.push(value);
While Set has better algorithmic complexity (O(1) or O(log n), depending on the implementation), it likely has a bit more overhead in maintaining its internal table or tree. At what size does the overhead of the Set become worth it? Here is the data I gathered from benchmarking an average use case (see benchmarking code):
Below roughly 60 unique elements the array is faster; above roughly 60 elements the Set becomes faster.
console.time("set")
var s = new Set()
for(var i = 0; i < 10000; i++)
s.add(Math.random())
s.forEach(function(e){
s.delete(e)
})
console.timeEnd("set")
console.time("array")
var s = new Array()
for(var i = 0; i < 10000; i++)
s.push(Math.random())
s.forEach(function(e,i){
s.splice(i)
})
console.timeEnd("array")
Those three operations on 10K items gave me:
set: 7.787ms
array: 2.388ms

Get first element in array with index not starting from 0

I'm using a JavaScript library which returns arrays that don't start from index zero; they might start from 26 or 1500, for example. What I want is a method to get the first element in such an array regardless of whether its first index is 0 or any other number.
Is there any method to do this in JavaScript?
I suggest using Array#some. You get the first non-sparse element and its index. The iteration stops immediately if you return true in the callback:
var a = [, , 22, 33],
value,
index;
a.some(function (v, i) {
value = v;
index = i;
return true;
});
console.log(index, value);
The information below is generally useful, but for the problem the OP listed, Nina's answer is by far a better solution.
Those are called sparse arrays and they're one of the few situations where you may want to use for-in on an array.
Remember that arrays are objects in JavaScript, and array entries are properties keyed by names (array indexes) that meet certain criteria. So we can use the features that let us discover the properties on an object to find the indexes on your sparse array.
for-in example:
for (var n in theArray) {
if (theArray.hasOwnProperty(n) && isArrayIndex(n)) {
// Use theArray[n]
}
}
This answer shows how you can determine that n is an array index as opposed to being some other property. A very technical definition would be
function isArrayIndex(n) {
return /^0$|^[1-9]\d*$/.test(n) &&
n <= 4294967294;
}
...but a definition that's good enough for most of us would be
function isArrayIndex(n) {
return !isNaN(parseInt(n, 10));
}
Similarly, you can use Object.keys; since it only looks at own enumerable properties, you don't need the hasOwnProperty check:
Object.keys(theArray).forEach(function(n) {
if (isArrayIndex(n)) {
// ...
}
});
Note that officially, neither of those is in any particular order, not even in ES2015 ("ES6"). So in theory, you could see the indexes out of numeric order. In the real world, I've never seen an even vaguely-modern JavaScript engine that returned array indexes out of order. They're not required to, but every one I've tried does.
So officially, you would need to get a full list and then find the minimum value in it:
var min = Object.keys(theArray).reduce(function(min, n) {
var i = parseInt(n, 10);
return isNaN(i) || (min !== undefined && min > i) ? min : i;
}, undefined);
That'll give you undefined if the array is empty, or the min index if it isn't. But if you want to make the assumption that you'll get the keys in numeric order:
// Makes an assumption that may not be true
var min = +Object.keys(theArray).filter(isArrayIndex)[0];
If you're using a JavaScript engine that's entirely up-to-date, you can rely on the order returned by Object.getOwnPropertyNames, which is required to list the array indexes in order.
var min = +Object.getOwnPropertyNames(theArray).filter(isArrayIndex)[0];
It may be useful to use a filter function on the array to get back a normalised array.
var fullArray = array.filter(function(n){
return n != undefined;
});
fullArray[0]
The answers here may help you decide: Remove empty elements from an array in JavaScript.
I guess one alternative to Array.prototype.some() is the Array.prototype.findIndex() method. These are much faster than filter alone and will keep your array and indices untouched.
var arr = new Array(1000),
fi = -1;
arr[777] = 1453; // now we have a nice sparse array
fi = arr.findIndex(f => f !== void 0); // void 0 is the perfect undefined
console.log(fi);
console.log(arr[fi]);
With this piece of code you can find the first assigned index and then get its value from your array:
var a = [, , 22, 33];
var value = a.find((v, i) => i in a);
console.log(value);
/* ---------------------------------------------- */
var i = 0;
while (!(i in a) && i < a.length) i++; // if i === a.length then the array is empty
console.info(i, a[i]);
The first implementation uses Array.prototype.find, which needs fewer variables, so it is cleaner; but to get the index you would have to call indexOf on the array.
The second one is a little old-fashioned, but it gives you the index without extra effort.
BTW, Nina's seems better. (Can it be made shorter?)
const arr = [0, 1, 2]
// using destructuring to get the first element (note: this reads index 0, so it assumes a dense array)
let [first] = arr
// plus: using destructuring to get the last element
let [last] = [...arr].reverse()

Javascript for-loop returning "null" instead of my value

I'm trying to get the function below to return the average of all elements in array1, but I keep getting null as the result. I can't seem to figure out why.
var array1 = [46,73,-18,0,-442,779,5,1400];
var arrayAverage = function(arrayavg) {
for (var average = 0,answer=0, arrayavg = arrayavg.length;array1 > answer;answer++)
average +=parseInt(arrayavg[answer]);
var calc = average/arrayavg.length;
return calc
};
There are a number of errors; I don't have time to point them all out, but hopefully the following is sufficient:
var array1 = [46,73,-18,0,-442,779,5,1400];
var arrayAverage = function(arrayavg) {
I don't know why you're using a function expression rather than a function declaration. It doesn't affect the issue, but it is more code to write. It's also good to give variables names that express what they are for, so given that the function expects an array:
function arrayAverage(array) {
then:
for (var average = 0,answer=0, arrayavg = arrayavg.length;array1 > answer;answer++)
It's not a good idea to pile all those variable declarations into the for condition, far better to separate concerns and only create variables that you need:
var total = 0;
Now iterate over the array to get the total value. The '{' brackets can be omitted, but it's clearer to include them:
for (var i=0, iLen=array.length; i<iLen; i++) {
total += array[i];
}
Now calculate the average and return it in one statement:
return total/iLen;
}
console.log(arrayAverage(array1)); // 230.375
You need to put brackets after your for loop.
Edit: I was too quick to answer. The real problem is that you are re-assigning the passed array to its own length:
arrayavg = arrayavg.length
This breaks everything.
In the for loop you have assigned arrayavg = arrayavg.length, yet in the body you access arrayavg[answer]. arrayavg is now a primitive number, so indexing into it returns undefined.
Also, your loop condition is array1 > answer, but array1 is an array; you can't compare it to a number like that, so the condition evaluates to false and the loop never runs.
Modified code:
var array1 = [46,73,-18,0,-442,779,5,1400];
var arrayAverage = function(arrayavg) {
var sum=0;
for (var i=0;i<arrayavg.length;i++)
sum +=parseInt(arrayavg[i]);
return sum/arrayavg.length;
};
You are comparing a number to your array in your for loop. You want to stop the for when answer is the same as array1 length.
Also, don't change your parameter array to its length if you want to get its values in the loop.
var array1 = [46,73,-18,0,-442,779,5,1400];
var arrayAverage = function(arrayavg) {
for (var average = 0,answer=0, len = arrayavg.length;len > answer;answer++)
average +=parseInt(arrayavg[answer]);
var calc = average/len;
return calc
};
And to call it:
arrayAverage(array1);
Your code has two problems in the for loop.
for (var average = 0,answer=0, arrayavg = arrayavg.length;array1 > answer;answer++)
^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^
The first problem is that you set arrayavg to arrayavg's length, but in the next line you try to index into the array. You have overwritten the array with a number, so that's not going to work.
The second issue is that you are comparing the array array1 to the number answer. That check doesn't do what you think it does. You want to be checking the length, and presumably the length of the passed-in array, not the hardcoded one.
I think the other answers (particularly RobG) have covered most of it. It might help to follow a couple of standard rules (that I use) for your loops:
1) Always have the index as the first declared element, the length of the array (for caching purposes) as the second, and any other variables after them.
2) Always use brackets to separate your loop code from the code in the rest of the function. That way you know when to return your averaged product (ie after the }).
So this is my slightly rewritten code of your problem:
for (var index = 0, len = arrayavg.length, avg = 0; index < len; index++) {
avg += parseInt(arrayavg[index], 10) / len;
}
return avg;
Note also that parseInt should contain a radix (in this case 10). You can leave it out but it's good practice to always include it.
By the way, here's an alternative to your function you might find useful that uses a functional approach using reduce:
var arrayAverage = function (arr) {
return arr.reduce(function (a, b) { return a + b; }) / arr.length;
}

Eloquent Javascript: The sum of an array

I want a sum function that takes an array of numbers and returns the sum of these numbers.
I wrote the following, but it always returns undefined:
function sum(array){
var sumVar;
for(i = 0; i <= array[array.length]; i++){
sumVar += array[i];
}
return sumVar;
}
console.log(sum([1,2,3]));
// → undefined
Could anyone help explain why this is happening? I'm not so much concerned with the solution as I am with what I'm doing wrong.
While there are multiple areas of your code which seem incorrect (as addressed by @meagar), the reason why you still get undefined after changing your loop condition to stop at array.length is that you didn't initialize sumVar to 0.
function sum (array) {
var sumVar = 0;
for(var i = 0; i < array.length; i += 1) {
sumVar += array[i];
}
return sumVar;
}
sum( [1, 2] ); // results in 3
Your condition, i <= array[array.length], makes no sense.
You're looping while i is less than or equal to array[array.length], which reads one past the last element of the array. That will typically be undefined, and i <= undefined is always false, which obviously isn't what you want.
If you want to iterate over each element of the array, i is the index and you need to loop from 0 to one less than array.length. Your condition should be i < array.length.
As #zzzzBov hinted in a comment, there is a more eloquent solution than the imperative loop, a solution involving some functional techniques:
var add = function(a, b) {return a + b;};
var sum = function(nums) {return nums.reduce(add, 0);};
In a library like Ramda, where functions like reduce are curried and the list parameter comes last, this would be even easier (although redundant, since Ramda already includes sum):
var sum = R.reduce(add, 0);
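A quick usage check of the reduce-based version above (my own example):
sum([1, 2, 3]); // 6
sum([]);        // 0, thanks to the initial value passed to reduce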
You can try this:
function sumOfArray(array) {
return eval(array.join('+'));
}
console.log(sumOfArray([4,4])); // result 4+4 = 8

JavaScript: Is there a better way to retain your array but efficiently concat or replace items?

I am looking for the best way to replace or add to elements of an array without deleting the original reference. Here is the set up:
var a = [], b = [], c, i, obj = {};
for ( i = 0; i < 100000; i++ ) { a[ i ] = i; b[ i ] = 10000 - i; }
obj.data_list = a;
Now we want to concatenate b INTO a without changing the reference to a, since it is used in obj.data_list. Here is one method:
for ( i = 0; i < b.length; i++ ) { a.push( b[ i ] ); }
This seems to be a somewhat terser and 8x (on V8) faster method:
a.splice.apply( a, [ a.length, 0 ].concat( b ) );
I have found this useful when iterating over an "in-place" array and don't want to touch the elements as I go (a good practice). I start a new array (let's call it keep_list) with the initial arguments and then add the elements I wish to retain. Finally I use this apply method to quickly replace the truncated array:
var keep_list = [ 0, 0 ];
for ( i = 0; i < a.length; i++ ) {
    if ( some_condition ) { keep_list.push( a[ i ] ); }
}
// truncate array
a.length = 0;
// and replace contents
a.splice.apply( a, keep_list );
There are a few problems with this solution:
there is a max call stack size limit of around 50k on V8
I have not tested on other JS engines yet.
This solution is a bit cryptic
Has anyone found a better way?
Probably the most comprehensible yet still efficient way would be to use push.
var source = [],
newItems = [1, 2, 3];
source.push.apply(source, newItems);
If you ever reach the maximum call stack size, you could separate the operation into multiple batches.
var source = [],
newItems = new Array(500000),
i = 0,
len = newItems.length,
batch;
for (; i < len; i++) newItems[i] = i;
//You need to find a real cross-browser stack size, here I just used 50k
while((batch = newItems.splice(0, 50000)).length) {
source.push.apply(source, batch);
}
console.log(source[499999]);
Also keep in mind that expensive operations might hang the browser, especially in old browsers with slow JS engines. To circumvent the issue, you could further split the process into smaller batches and let the browser breathe by using setTimeout.
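A rough sketch of that setTimeout batching idea; the function name and batch size are my own choices, not from any library:
function pushInChunks(source, newItems, chunkSize, done) {
    var index = 0;
    (function next() {
        var batch = newItems.slice(index, index + chunkSize); // copy one chunk without touching newItems
        source.push.apply(source, batch);
        index += chunkSize;
        if (index < newItems.length) {
            setTimeout(next, 0); // yield to the browser before processing the next chunk
        } else if (done) {
            done(source);
        }
    })();
}
var target = [];
pushInChunks(target, [1, 2, 3, 4, 5], 2, function (result) {
    console.log(result.length); // 5
});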
Finally, another approach I thought of would be to use a wrapper object around your array, which would allow you to replace the array directly, since your references would be retained through the object.
var arrWrapper = { list: [] },
obj1 = { items: arrWrapper },
obj2 = { items: arrWrapper };
//update the array
obj2.items.list = [1, 2, 3, 4];
//access the array
obj1.items.list;
The only restriction would be to avoid keeping a reference to arrWrapper.list directly.
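A small self-contained illustration of why (the names here are my own): a direct reference keeps pointing at the old array after a replacement.
var wrapper = { list: [] },
    holder = { items: wrapper },
    direct = wrapper.list;       // direct reference: this is the pitfall
holder.items.list = [1, 2, 3];   // replace the array through the wrapper
console.log(holder.items.list);  // [1, 2, 3] - seen through the wrapper
console.log(direct);             // [] - the stale, old array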
Note: If you are targeting modern browsers only, you could probably make use of Web Workers. However, as far as I know, you can only pass serialized data, which means that the worker wouldn't be able to modify the source array directly.
