I'm looking for any alternatives to the below for creating a JavaScript array containing 1 through to N where N is only known at runtime.
var foo = [];
for (var i = 1; i <= N; i++) {
foo.push(i);
}
To me it feels like there should be a way of doing this without the loop.
In ES6, using the Array.from() and keys() methods:
Array.from(Array(10).keys())
//=> [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
A shorter version using the spread operator:
[...Array(10).keys()]
//=> [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
To start from 1, pass a map function to Array.from(), along with an object with a length property:
Array.from({length: 10}, (_, i) => i + 1)
//=> [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
You can do it like this:
var N = 10;
Array.apply(null, {length: N}).map(Number.call, Number)
result: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
or with random values:
Array.apply(null, {length: N}).map(Function.call, Math.random)
result: [0.7082694901619107, 0.9572225909214467, 0.8586748542729765,
0.8653848143294454, 0.008339877473190427, 0.9911756622605026, 0.8133423360995948, 0.8377588465809822, 0.5577575915958732, 0.16363654541783035]
Explanation
First, note that Number.call(undefined, N) is equivalent to Number(N), which just returns N. We'll use that fact later.
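For instance, a quick sanity check (the value 42 is arbitrary):
Number.call(undefined, 42); //=> 42
Number(42); //=> 42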
Array.apply(null, [undefined, undefined, undefined]) is equivalent to Array(undefined, undefined, undefined), which produces a three-element array and assigns undefined to each element.
How can you generalize that to N elements? Consider how Array() works, which goes something like this:
function Array() {
if ( arguments.length == 1 &&
'number' === typeof arguments[0] &&
arguments[0] >= 0 &&
arguments[0] < 4294967296 /* 2^32 */ ) {
return [ … ]; // array of length arguments[0], generated by native code
}
var a = [];
for (var i = 0; i < arguments.length; i++) {
a.push(arguments[i]);
}
return a;
}
Since ECMAScript 5, Function.prototype.apply(thisArg, argsArray) also accepts a duck-typed array-like object as its second parameter. If we invoke Array.apply(null, { length: N }), then it will execute
function Array() {
var a = [];
for (var i = 0; i < /* arguments.length = */ N; i++) {
a.push(/* arguments[i] = */ undefined);
}
return a;
}
Now we have an N-element array, with each element set to undefined. When we call .map(callback, thisArg) on it, each element will be set to the result of callback.call(thisArg, element, index, array). Therefore, [undefined, undefined, …, undefined].map(Number.call, Number) would map each element to (Number.call).call(Number, undefined, index, array), which is the same as Number.call(undefined, index, array), which, as we observed earlier, evaluates to index. That completes the array whose elements are the same as their index.
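To make that chain of calls concrete, here is a small console check (N = 3 and index 2 are arbitrary choices):
var arr = Array.apply(null, { length: 3 }); // [undefined, undefined, undefined]
(Number.call).call(Number, undefined, 2, arr); //=> 2
Number.call(undefined, 2, arr); //=> 2
arr.map(Number.call, Number); //=> [0, 1, 2]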
Why go through the trouble of Array.apply(null, {length: N}) instead of just Array(N)? After all, both expressions would result in an N-element array of undefined elements. The difference is that in the former expression, each element is explicitly set to undefined, whereas in the latter, each element is never set. According to the documentation of .map():
callback is invoked only for indexes of the array which have assigned values; it is not invoked for indexes which have been deleted or which have never been assigned values.
Therefore, Array(N) is insufficient; Array(N).map(Number.call, Number) would result in an uninitialized array of length N.
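A quick way to see the sparse/dense difference in a console (N = 3 chosen arbitrarily):
0 in Array(3); //=> false (the index was never assigned)
0 in Array.apply(null, { length: 3 }); //=> true (the index was explicitly set to undefined)
Array(3).map(Number.call, Number); //=> a sparse array of length 3; the callback never runs
Array.apply(null, { length: 3 }).map(Number.call, Number); //=> [0, 1, 2]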
Compatibility
Since this technique relies on behaviour of Function.prototype.apply() specified in ECMAScript 5, it will not work in pre-ECMAScript 5 browsers such as Chrome 14 and Internet Explorer 9.
Multiple ways using ES6
Using spread operator (...) and keys method
[ ...Array(N).keys() ].map( i => i+1);
Fill/Map
Array(N).fill().map((_, i) => i+1);
Array.from
Array.from(Array(N), (_, i) => i+1)
Array.from and { length: N } hack
Array.from({ length: N }, (_, i) => i+1)
Note about generalised form
All the forms above can produce arrays initialised to pretty much any desired values by changing i+1 to the required expression (e.g. i*2, -i, 1+i*2, i%2, etc.). If the expression can be expressed by some function f, then the first form becomes simply
[ ...Array(N).keys() ].map(f)
Examples:
Array.from({length: 5}, (v, k) => k+1);
// [1,2,3,4,5]
Since the array is initialized with undefined in each position, the value of v will be undefined.
Example showcasing all the forms
let demo = (N) => {
console.log(
[ ...Array(N).keys() ].map((i) => i+1),
Array(N).fill().map((_, i) => i+1),
Array.from(Array(N), (_, i) => i+1),
Array.from({ length: N }, (_, i) => i+1)
)
}
demo(5)
A more generic example with a custom initialiser function f, i.e.
[ ...Array(N).keys() ].map((i) => f(i))
or even simpler
[ ...Array(N).keys() ].map(f)
let demo = (N, f) => {
console.log(
[ ...Array(N).keys() ].map(f),
Array(N).fill().map((_, i) => f(i)),
Array.from(Array(N), (_, i) => f(i)),
Array.from({ length: N }, (_, i) => f(i))
)
}
demo(5, i=>2*i+1)
If I get what you are after, you want an array of numbers 1..n that you can later loop through.
If this is all you need, can you do this instead?
var foo = new Array(45); // create an empty array with length 45
then when you want to use it... (un-optimized, just for example)
for(var i = 0; i < foo.length; i++){
document.write('Item: ' + (i + 1) + ' of ' + foo.length + '<br/>');
}
E.g. if you don't need to store anything in the array and just need a container of the right length that you can iterate over, this might be easier.
See it in action here: http://jsfiddle.net/3kcvm/
Arrays innately manage their lengths. As they are traversed, their indexes can be held in memory and referenced at that point. If a random index needs to be known, the indexOf method can be used.
This said, for your needs you may just want to declare an array of a certain size:
var foo = new Array(N); // where N is a positive integer
/* this will create an array of size, N, primarily for memory allocation,
but does not create any defined values
foo.length // size of Array
foo[ Math.floor(foo.length/2) ] = 'value' // places value in the middle of the array
*/
ES6
Spread
Making use of the spread operator (...) and the keys method enables you to create a temporary array of size N to produce the indexes, and then a new array that can be assigned to your variable:
var foo = [ ...Array(N).keys() ];
Fill/Map
You can first create an array of the size you need, fill it with undefined, and then create a new array using map, which sets each element to its index.
var foo = Array(N).fill().map((v,i)=>i);
Array.from
This initializes an array of length N and populates it in a single pass.
Array.from({ length: N }, (v, i) => i)
In light of the comments and confusion, if you really wanted to capture the values from 1..N in the above examples, there are a couple of options:
if the index is available, you can simply increment it by one (e.g., ++i).
in cases where the index is not used, a possibly more efficient way is to create your array with N+1 elements and then shift off the front.
So if you desire 100 numbers:
let arr; (arr=[ ...Array(101).keys() ]).shift()
In ES6 you can do:
Array(N).fill().map((e,i)=>i+1);
http://jsbin.com/molabiluwa/edit?js,console
Edit:
Changed Array(45) to Array(N) since you've updated the question.
console.log(
Array(45).fill(0).map((e,i)=>i+1)
);
Use the very popular Underscore _.range method
// _.range([start], stop, [step])
_.range(10); // => [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
_.range(1, 11); // => [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
_.range(0, 30, 5); // => [0, 5, 10, 15, 20, 25]
_.range(0, -10, -1); // => [0, -1, -2, -3, -4, -5, -6, -7, -8, -9]
_.range(0); // => []
function range(start, end) {
var foo = [];
for (var i = start; i <= end; i++) {
foo.push(i);
}
return foo;
}
Then called by
var foo = range(1, 5);
There is no built-in way to do this in JavaScript, but it's a perfectly valid utility function to create if you need to do it more than once.
Edit: In my opinion, the following is a better range function. Maybe just because I'm biased by LINQ, but I think it's more useful in more cases. Your mileage may vary.
function range(start, count) {
if(arguments.length == 1) {
count = start;
start = 0;
}
var foo = [];
for (var i = 0; i < count; i++) {
foo.push(start + i);
}
return foo;
}
The fastest way to fill an Array in V8 is:
[...Array(5)].map((_,i) => i);
result will be: [0, 1, 2, 3, 4]
Performance
Today, 2020.12.11, I performed tests on macOS High Sierra 10.13.6 in Chrome v87, Safari v13.1.2 and Firefox v83 for the chosen solutions.
Results
For all browsers
solution O (based on while) is the fastest (except Firefox for big N - but it's fast there)
solution T is fastest on Firefox for big N
solutions M and P are fast for small N
solution V (lodash) is fast for big N
solutions W and X are slow for small N
solution F is slow
Details
I performed 2 test cases:
for small N = 10 - you can run it HERE
for big N = 1000000 - you can run it HERE
The snippet below presents all tested solutions A-X.
function A(N) {
return Array.from({length: N}, (_, i) => i + 1)
}
function B(N) {
return Array(N).fill().map((_, i) => i+1);
}
function C(N) {
return Array(N).join().split(',').map((_, i) => i+1 );
}
function D(N) {
return Array.from(Array(N), (_, i) => i+1)
}
function E(N) {
return Array.from({ length: N }, (_, i) => i+1)
}
function F(N) {
return Array.from({length:N}, Number.call, i => i + 1)
}
function G(N) {
return (Array(N)+'').split(',').map((_,i)=> i+1)
}
function H(N) {
return [ ...Array(N).keys() ].map( i => i+1);
}
function I(N) {
return [...Array(N).keys()].map(x => x + 1);
}
function J(N) {
return [...Array(N+1).keys()].slice(1)
}
function K(N) {
return [...Array(N).keys()].map(x => ++x);
}
function L(N) {
let arr; (arr=[ ...Array(N+1).keys() ]).shift();
return arr;
}
function M(N) {
var arr = [];
var i = 0;
while (N--) arr.push(++i);
return arr;
}
function N(N) {
var a=[],b=N;while(b--)a[b]=b+1;
return a;
}
function O(N) {
var a=Array(N),b=0;
while(b<N) a[b++]=b;
return a;
}
function P(N) {
var foo = [];
for (var i = 1; i <= N; i++) foo.push(i);
return foo;
}
function Q(N) {
for(var a=[],b=N;b--;a[b]=b+1);
return a;
}
function R(N) {
for(var i,a=[i=0];i<N;a[i++]=i);
return a;
}
function S(N) {
let foo,x;
for(foo=[x=N]; x; foo[x-1]=x--);
return foo;
}
function T(N) {
return new Uint8Array(N).map((item, i) => i + 1);
}
function U(N) {
return '_'.repeat(N).split('').map((_, i) => i + 1);
}
function V(N) {
return _.range(1, N+1);
}
function W(N) {
return [...(function*(){let i=0;while(i<N)yield ++i})()]
}
function X(N) {
function sequence(max, step = 1) {
return {
[Symbol.iterator]: function* () {
for (let i = 1; i <= max; i += step) yield i
}
}
}
return [...sequence(N)];
}
[A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X].forEach(f=> {
console.log(`${f.name} ${f(5)}`);
})
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.20/lodash.min.js" integrity="sha512-90vH1Z83AJY9DmlWa8WkjkV79yfS2n2Oxhsi2dZbIv0nC4E6m5AbH8Nh156kkM7JePmqD6tcZsfad1ueoaovww==" crossorigin="anonymous"> </script>
This snippet only presents functions used in performance tests - it does not perform tests itself!
And here are example results for Chrome:
This question has a lot of complicated answers, but a simple one-liner:
[...Array(255).keys()].map(x => x + 1)
Also, although the above is short (and neat) to write, I think the following is a bit faster, for a max length of:
127 (Int8),
255 (Uint8),
32,767 (Int16),
65,535 (Uint16),
2,147,483,647 (Int32),
4,294,967,295 (Uint32),
based on the max integer values (also, here's more on Typed Arrays):
(new Uint8Array(255)).map(($,i) => i + 1);
This solution is also not ideal, though, because it creates two arrays and uses the extra variable declaration "$" (not sure of any way to get around that using this method). I think the following solution is the absolute fastest possible way to do this:
for(var i = 0, arr = new Uint8Array(255); i < arr.length; i++) arr[i] = i + 1;
Any time after this statement runs, you can simply use the variable "arr" in the current scope.
If you want to make a simple function out of it (with some basic verification):
function range(min, max) {
min = min && min.constructor == Number ? min : 0;
!(max && max.constructor == Number && max > min) && // boolean statements can also be used with void return types, like a one-line if statement.
((max = min) & (min = 0)); // if there is a "max" argument specified, first check if it's a number and if it's greater than min: if so, leave it; if not, treat it as if there were no "max" in the first place, so "max" becomes "min" (and min becomes 0 by default)
for(var i = 0, arr = new (
max < 128 ? Int8Array :
max < 256 ? Uint8Array :
max < 32768 ? Int16Array :
max < 65536 ? Uint16Array :
max < 2147483648 ? Int32Array :
max < 4294967296 ? Uint32Array :
Array
)(max - min); i < arr.length; i++) arr[i] = i + min;
return arr;
}
//and you can loop through it easily using array methods if you want
range(1,11).forEach(x => console.log(x));
//or if you're used to pythons `for...in` you can do a similar thing with `for...of` if you want the individual values:
for(i of range(2020,2025)) console.log(i);
//or if you really want to use `for..in`, you can, but then you will only be accessing the keys:
for(k in range(25,30)) console.log(k);
console.log(
range(1,128).constructor.name,
range(200).constructor.name,
range(400,900).constructor.name,
range(33333).constructor.name,
range(823, 100000).constructor.name,
range(10,4) // when the "min" argument is greater than the "max", then it just considers it as if there is no "max", and the new max becomes "min", and "min" becomes 0, as if "max" was never even written
);
So, with the above function, the super-slow "simple one-liner" above becomes the super-fast, even shorter:
range(1,14000);
Using ES2015/ES6 spread operator
[...Array(10)].map((_, i) => i + 1)
console.log([...Array(10)].map((_, i) => i + 1))
You can use this:
new Array(/*any number which you want*/)
.join().split(',')
.map(function(item, index){ return ++index;})
For example,
new Array(10)
.join().split(',')
.map(function(item, index){ return ++index;})
will create the following array:
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
If you happen to be using d3.js in your app as I am, D3 provides a helper function that does this for you.
So to get an array from 0 to 4, it's as easy as:
d3.range(5)
[0, 1, 2, 3, 4]
and to get an array from 1 to 5, as you were requesting:
d3.range(1, 5+1)
[1, 2, 3, 4, 5]
Check out this tutorial for more info.
This is probably the fastest way to generate an array of numbers
Shortest
var a=[],b=N;while(b--)a[b]=b+1;
Inline
var arr=(function(a,b){while(a--)b[a]=a;return b})(10,[]);
//arr=[0,1,2,3,4,5,6,7,8,9]
If you want to start from 1
var arr=(function(a,b){while(a--)b[a]=a+1;return b})(10,[]);
//arr=[1,2,3,4,5,6,7,8,9,10]
Want a function?
function range(a,b,c){c=[];while(a--)c[a]=a+b;return c}; //length,start,placeholder
var arr=range(10,5);
//arr=[5,6,7,8,9,10,11,12,13,14]
WHY?
while is the fastest loop
Direct setting is faster than push
[] is faster than new Array(10)
it's short... look at the first code snippet, then look at all the other functions in here.
If you just can't live without for
for(var a=[],b=7;b>0;a[--b]=b+1); //a=[1,2,3,4,5,6,7]
or
for(var a=[],b=7;b--;a[b]=b+1); //a=[1,2,3,4,5,6,7]
If you are using lodash, you can use _.range:
_.range([start=0], end, [step=1])
Creates an array of numbers
(positive and/or negative) progressing from start up to, but not
including, end. A step of -1 is used if a negative start is specified
without an end or step. If end is not specified, it's set to start
with start then set to 0.
Examples:
_.range(4);
// ➜ [0, 1, 2, 3]
_.range(-4);
// ➜ [0, -1, -2, -3]
_.range(1, 5);
// ➜ [1, 2, 3, 4]
_.range(0, 20, 5);
// ➜ [0, 5, 10, 15]
_.range(0, -4, -1);
// ➜ [0, -1, -2, -3]
_.range(1, 4, 0);
// ➜ [1, 1, 1]
_.range(0);
// ➜ []
The new way of filling an Array is:
const array = [...Array(5).keys()]
console.log(array)
result will be: [0, 1, 2, 3, 4]
with ES6 you can do:
// `n` is the size you want to initialize your array
// `null` is what the array will be filled with (can be any other value)
Array(n).fill(null)
A very simple and easy way to generate exactly 1 - N:
const [, ...result] = Array(11).keys();
console.log('Result:', result);
Final Summary report .. Drrruummm Rolll -
This is the shortest code to generate an Array of size N (here 10) without using ES6. Cocco's version above is close but not the shortest.
(function(n){for(a=[];n--;a[n]=n+1);return a})(10)
But the undisputed winner of this code golf (a competition to solve a particular problem in the fewest bytes of source code) is Niko Ruotsalainen, using the Array constructor and the ES6 spread operator. (Most ES6 syntax is valid TypeScript, but the following is not, so be judicious while using it.)
[...Array(10).keys()]
https://stackoverflow.com/a/49577331/8784402
With Delta
For JavaScript
Smallest one-liner
[...Array(N)].map((v, i) => from + i * step);
Examples and other alternatives
Array.from(Array(10).keys()).map(i => 4 + i * 2);
//=> [4, 6, 8, 10, 12, 14, 16, 18, 20, 22]
[...Array(10).keys()].map(i => 4 + i * -2);
//=> [4, 2, 0, -2, -4, -6, -8, -10, -12, -14]
Array(10).fill(0).map((v, i) => 4 + i * 2);
//=> [4, 6, 8, 10, 12, 14, 16, 18, 20, 22]
Array(10).fill().map((v, i) => 4 + i * -2);
//=> [4, 2, 0, -2, -4, -6, -8, -10, -12, -14]
[...Array(10)].map((v, i) => 4 + i * 2);
//=> [4, 6, 8, 10, 12, 14, 16, 18, 20, 22]
Range Function
const range = (from, to, step) =>
[...Array(Math.floor((to - from) / step) + 1)].map((_, i) => from + i * step);
range(0, 9, 2);
//=> [0, 2, 4, 6, 8]
// you can also assign the range function as a static method on Array (but this is not recommended)
Array.range = (from, to, step) =>
[...Array(Math.floor((to - from) / step) + 1)].map((_, i) => from + i * step);
Array.range(2, 10, 2);
//=> [2, 4, 6, 8, 10]
Array.range(0, 10, 1);
//=> [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
Array.range(2, 10, -1);
//=> throws RangeError: Invalid array length (the computed length is negative)
Array.range(3, 0, -1);
//=> [3, 2, 1, 0]
As Iterators
class Range {
constructor(total = 0, step = 1, from = 0) {
this[Symbol.iterator] = function* () {
for (let i = 0; i < total; yield from + i++ * step) {}
};
}
}
[...new Range(5)]; // Five Elements
//=> [0, 1, 2, 3, 4]
[...new Range(5, 2)]; // Five Elements With Step 2
//=> [0, 2, 4, 6, 8]
[...new Range(5, -2, 10)]; // Five Elements With Step -2 From 10
//=>[10, 8, 6, 4, 2]
[...new Range(5, -2, -10)]; // Five Elements With Step -2 From -10
//=> [-10, -12, -14, -16, -18]
// Also works with for..of loop
for (i of new Range(5, -2, 10)) console.log(i);
// 10 8 6 4 2
As Generators Only
const Range = function* (total = 0, step = 1, from = 0) {
for (let i = 0; i < total; yield from + i++ * step) {}
};
Array.from(Range(5, -2, -10));
//=> [-10, -12, -14, -16, -18]
[...Range(5, -2, -10)]; // Five Elements With Step -2 From -10
//=> [-10, -12, -14, -16, -18]
// Also works with for..of loop
for (i of Range(5, -2, 10)) console.log(i);
// 10 8 6 4 2
// Lazy loaded way
const number0toInf = Range(Infinity);
number0toInf.next().value;
//=> 0
number0toInf.next().value;
//=> 1
// ...
From-To with steps/delta
using iterators
class Range2 {
constructor(to = 0, step = 1, from = 0) {
this[Symbol.iterator] = function* () {
let i = 0,
length = Math.floor((to - from) / step) + 1;
while (i < length) yield from + i++ * step;
};
}
}
[...new Range2(5)]; // From 0 to 5 with step 1
//=> [0, 1, 2, 3, 4, 5]
[...new Range2(5, 2)]; // From 0 to 5 with step 2
//=> [0, 2, 4]
[...new Range2(5, -2, 10)]; // From 10 to 5 with step -2
//=> [10, 8, 6]
using Generators
const Range2 = function* (to = 0, step = 1, from = 0) {
let i = 0,
length = Math.floor((to - from) / step) + 1;
while (i < length) yield from + i++ * step;
};
[...Range2(5, -2, 10)]; // From 10 to 5 with step -2
//=> [10, 8, 6]
let even4to10 = Range2(10, 2, 4);
even4to10.next().value;
//=> 4
even4to10.next().value;
//=> 6
even4to10.next().value;
//=> 8
even4to10.next().value;
//=> 10
even4to10.next().value;
//=> undefined
For Typescript
class _Array<T> extends Array<T> {
static range(from: number, to: number, step: number): number[] {
return Array.from(Array(Math.floor((to - from) / step) + 1)).map(
(v, k) => from + k * step
);
}
}
_Array.range(0, 9, 1);
Solutions for an empty array and for arrays containing just numbers:
const arrayOne = new Array(10);
console.log(arrayOne);
const arrayTwo = [...Array(10).keys()];
console.log(arrayTwo);
var arrayThree = Array.from(Array(10).keys());
console.log(arrayThree);
const arrayStartWithOne = Array.from(Array(10).keys(), item => item + 1);
console.log(arrayStartWithOne)
✅ Simply, this worked for me:
[...Array(5)].map(...)
There is another way in ES6, using Array.from, which takes 2 arguments: the first is an array-like (in this case an object with a length property), and the second is a mapping function (in this case we map each item to its index).
Array.from({length:10}, (v,i) => i)
This is shorter and can be used for other sequences, like generating even numbers:
Array.from({length:10}, (v,i) => i*2)
Also, this has better performance than most other ways because it only loops through the array once.
Check the snippet for some comparisons:
// open the dev console to see results
let range
const count = 100000
console.time("from object")
for (let i = 0; i<count; i++) {
range = Array.from({length:10}, (v,i) => i )
}
console.timeEnd("from object")
console.time("from keys")
for (let i =0; i<count; i++) {
range = Array.from(Array(10).keys())
}
console.timeEnd("from keys")
console.time("apply")
for (let i = 0; i<count; i++) {
range = Array.apply(null, { length: 10 }).map(function(element, index) { return index; })
}
console.timeEnd("apply")
Fast
This solution is probably the fastest; it is inspired by the lodash _.range function (but mine is simpler and faster).
let N=10, i=0, a=Array(N);
while(i<N) a[i++]=i;
console.log(a);
Performance advantages over the existing (2020.12.11) answers based on while/for:
memory is allocated once at the beginning by a=Array(N)
an increasing index i++ is used, which appears to be about 30% faster than a decreasing index i-- (probably because CPU cache memory is faster in the forward direction)
Speed tests with more than 20 other solutions were conducted in this answer.
Using new Array methods and => function syntax from ES6 standard (only Firefox at the time of writing).
By filling holes with undefined:
Array(N).fill().map((_, i) => i + 1);
Array.from turns "holes" into undefined so Array.map works as expected:
Array.from(Array(5)).map((_, i) => i + 1)
In ES6:
Array.from({length: 1000}, (_, i) => i).slice(1);
or better yet (without the extra variable _ and without the extra slice call):
Array.from({length:1000}, Number.call, i => i + 1)
Or for slightly faster results, you can use Uint8Array, if your list is shorter than 256 results (or you can use the other Uint lists depending on how short the list is, like Uint16 for a max number of 65535, or Uint32 for a max of 4294967295 etc. Officially, these typed arrays were only added in ES6 though). For example:
Uint8Array.from({length:10}, Number.call, i => i + 1)
ES5:
Array.apply(0, {length: 1000}).map(function(){return arguments[1]+1});
Alternatively, in ES5, for the map function (like second parameter to the Array.from function in ES6 above), you can use Number.call
Array.apply(0,{length:1000}).map(Number.call,Number).slice(1)
Or, if you're against the .slice here also, you can do the ES5 equivalent of the above (from ES6), like:
Array.apply(0,{length:1000}).map(Number.call, Function("i","return i+1"))
Array(...Array(9)).map((_, i) => i);
console.log(Array(...Array(9)).map((_, i) => i))
for(var i,a=[i=0];i<10;a[i++]=i);
a = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
It seems the only flavor not currently in this rather complete list of answers is one featuring a generator; so to remedy that:
const gen = N => [...(function*(){let i=0;while(i<N)yield i++})()]
which can be used thus:
gen(4) // [0,1,2,3]
The nice thing about this is you don't just have to increment... To take inspiration from the answer #igor-shubin gave, you could create an array of randoms very easily:
const gen = N => [...(function*(){let i=0;
while(i++<N) yield Math.random()
})()]
And rather than something lengthy and operationally expensive like:
const slow = N => new Array(N).join().split(',').map((e,i)=>i*5)
// [0,5,10,15,...]
you could instead do:
const fast = N => [...(function*(){let i=0;while(i++<N)yield i*5})()]
Let A and B be two sets. I'm looking for really fast or elegant ways to compute the set difference (A - B or A \ B, depending on your preference) between them. The two sets are stored and manipulated as JavaScript arrays, as the title says.
Notes:
Gecko-specific tricks are okay
I'd prefer sticking to native functions (but I am open to a lightweight library if it's way faster)
I've seen, but not tested, JS.Set (see previous point)
Edit: I noticed a comment about sets containing duplicate elements. When I say "set" I'm referring to the mathematical definition, which means (among other things) that they do not contain duplicate elements.
I don't know if this is the most efficient, but it's perhaps the shortest:
var A = [1, 2, 3, 4];
var B = [1, 3, 4, 7];
var diff = A.filter(function(x) {
return B.indexOf(x) < 0;
});
console.log(diff); // [2]
Updated to ES6:
const A = [1, 2, 3, 4];
const B = [1, 3, 4, 7];
const diff = A.filter(x => !B.includes(x));
console.log(diff); // [2]
Well, 7 years later, with ES6's Set object it's quite easy (but still not as compact as Python's A - B), and reportedly faster than indexOf for large arrays:
console.clear();
let a = new Set([1, 2, 3, 4]);
let b = new Set([5, 4, 3, 2]);
let a_minus_b = new Set([...a].filter(x => !b.has(x)));
let b_minus_a = new Set([...b].filter(x => !a.has(x)));
let a_intersect_b = new Set([...a].filter(x => b.has(x)));
let a_union_b = new Set([...a, ...b]);
console.log(...a_minus_b); // {1}
console.log(...b_minus_a); // {5}
console.log(...a_intersect_b); // {2,3,4}
console.log(...a_union_b); // {1,2,3,4,5}
Looking at a lot of these solutions, they do fine for small cases. But when you blow them up to a million items, the time complexity starts getting silly.
A.filter(v => B.includes(v))
That starts looking like an O(N^2) solution. Since there is an O(N) solution, let's use it; you can easily modify it not to be a generator if your JS runtime doesn't support them.
function *setMinus(A, B) {
const setA = new Set(A);
const setB = new Set(B);
for (const v of setB.values()) {
if (!setA.delete(v)) {
yield v;
}
}
for (const v of setA.values()) {
yield v;
}
}
a = [1,2,3];
b = [2,3,4];
console.log(Array.from(setMinus(a, b)));
While this is a bit more complex than many of the other solutions, when you have large lists this will be far faster.
Let's take a quick look at the performance difference, running it on a set of 1,000,000 random integers between 0...10,000 we see the following performance results.
setMinus time = 181 ms
diff time = 19099 ms
function buildList(count, range) {
const result = [];
for (let i = 0; i < count; i++) {
result.push(Math.floor(Math.random() * range))
}
return result;
}
function *setMinus(A, B) {
const setA = new Set(A);
const setB = new Set(B);
for (const v of setB.values()) {
if (!setA.delete(v)) {
yield v;
}
}
for (const v of setA.values()) {
yield v;
}
}
function doDiff(A, B) {
return A.filter(function(x) { return B.indexOf(x) < 0 })
}
const listA = buildList(100_000, 100_000_000);
const listB = buildList(100_000, 100_000_000);
let t0 = process.hrtime.bigint()
const _x = Array.from(setMinus(listA, listB))
let t1 = process.hrtime.bigint()
const _y = doDiff(listA, listB)
let t2 = process.hrtime.bigint()
console.log("setMinus time = ", (t1 - t0) / 1_000_000n, "ms");
console.log("diff time = ", (t2 - t1) / 1_000_000n, "ms");
You can use an object as a map to avoid linearly scanning B for each element of A as in user187291's answer:
function setMinus(A, B) {
var map = {}, C = [];
for(var i = B.length; i--; )
map[B[i].toSource()] = null; // any other value would do
for(var i = A.length; i--; ) {
if(!map.hasOwnProperty(A[i].toSource()))
C.push(A[i]);
}
return C;
}
The non-standard toSource() method is used to get unique property names; if all elements already have unique string representations (as is the case with numbers), you can speed up the code by dropping the toSource() invocations.
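For example, a sketch of the same approach specialised to numbers (the function name here is mine), with the toSource() calls dropped:
function setMinusNumbers(A, B) {
var map = {}, C = [];
for (var i = B.length; i--; )
map[B[i]] = null; // numbers stringify uniquely, so toSource() is not needed
for (var i = A.length; i--; ) {
if (!map.hasOwnProperty(A[i]))
C.push(A[i]);
}
return C;
}
setMinusNumbers([1, 2, 3, 4], [1, 3, 4, 7]); //=> [2]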
If you're using Sets, it can be quite simple and performant:
function setDifference(a, b) {
return new Set(Array.from(a).filter(item => !b.has(item)));
}
Since Sets use hash functions under the hood, the has function is much faster than indexOf (this matters if you have, say, more than 100 items).
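Usage might look like this, converting the question's arrays to Sets first (a small sketch, not from the original answer):
const a = new Set([1, 2, 3, 4]);
const b = new Set([1, 3, 4, 7]);
setDifference(a, b); //=> Set { 2 }
[...setDifference(a, b)]; //=> [2], if you need a plain array back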
The shortest, using jQuery, is:
var A = [1, 2, 3, 4];
var B = [1, 3, 4, 7];
var diff = $(A).not(B);
console.log(diff.toArray());
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
I would hash the array B, then keep values from the array A not present in B:
function getHash(array){
// Hash an array into a set of properties
//
// params:
// array - (array) (!nil) the array to hash
//
// return: (object)
// hash object with one property set to true for each value in the array
var hash = {};
for (var i=0; i<array.length; i++){
hash[ array[i] ] = true;
}
return hash;
}
function getDifference(a, b){
// compute the difference a\b
//
// params:
// a - (array) (!nil) first array as a set of values (no duplicates)
// b - (array) (!nil) second array as a set of values (no duplicates)
//
// return: (array)
// the set of values (no duplicates) in array a and not in b,
// listed in the same order as in array a.
var hash = getHash(b);
var diff = [];
for (var i=0; i<a.length; i++){
var value = a[i];
if ( !hash[value]){
diff.push(value);
}
}
return diff;
}
Using Underscore.js (Library for functional JS)
>>> var foo = [1,2,3]
>>> var bar = [1,2,4]
>>> _.difference(foo, bar);
[3]
Some simple functions, borrowing from #milan's answer:
const setDifference = (a, b) => new Set([...a].filter(x => !b.has(x)));
const setIntersection = (a, b) => new Set([...a].filter(x => b.has(x)));
const setUnion = (a, b) => new Set([...a, ...b]);
Usage:
const a = new Set([1, 2]);
const b = new Set([2, 3]);
setDifference(a, b); // Set { 1 }
setIntersection(a, b); // Set { 2 }
setUnion(a, b); // Set { 1, 2, 3 }
Incorporating the idea from Christoph and assuming a couple of non-standard iteration methods on arrays and objects/hashes (each and friends), we can get set difference, union and intersection in linear time in about 20 lines total:
var setOPs = {
minusAB : function (a, b) {
var h = {};
b.each(function (v) { h[v] = true; });
return a.filter(function (v) { return !h.hasOwnProperty(v); });
},
unionAB : function (a, b) {
var h = {}, f = function (v) { h[v] = true; };
a.each(f);
b.each(f);
return myUtils.keys(h);
},
intersectAB : function (a, b) {
var h = {};
a.each(function (v) { h[v] = 1; });
b.each(function (v) { h[v] = (h[v] || 0) + 1; });
var fnSel = function (v, count) { return count > 1; };
var fnVal = function (v, c) { return v; };
return myUtils.select(h, fnSel, fnVal);
}
};
This assumes that each and filter are defined for arrays, and that we have two utility methods:
myUtils.keys(hash): returns an array with the keys of the hash
myUtils.select(hash, fnSelector, fnEvaluator): returns an array with the results of calling fnEvaluator on the key/value pairs for which fnSelector returns true.
The select() is loosely inspired by Common Lisp, and is merely filter() and map() rolled into one. (It would be better to have them defined on Object.prototype, but doing so wreaks havoc with jQuery, so I settled for static utility methods.)
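For reference, a minimal sketch of what those two assumed helpers could look like (names and signatures taken from the description above; the implementations are mine, and the array each used earlier can be read as forEach):
var myUtils = {
keys: function (hash) {
var ks = [];
for (var k in hash)
if (hash.hasOwnProperty(k)) ks.push(k);
return ks;
},
select: function (hash, fnSelector, fnEvaluator) {
var out = [];
for (var k in hash)
if (hash.hasOwnProperty(k) && fnSelector(k, hash[k]))
out.push(fnEvaluator(k, hash[k]));
return out;
}
};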
Performance: Testing with
var a = [], b = [];
for (var i = 100000; i--; ) {
if (i % 2 !== 0) a.push(i);
if (i % 3 !== 0) b.push(i);
}
gives two sets with 50,000 and 66,666 elements. With these values A-B takes about 75ms, while union and intersection are about 150ms each. (Mac Safari 4.0, using Javascript Date for timing.)
I think that's decent payoff for 20 lines of code.
As for the fastest way, this isn't so elegant, but I've run some tests to be sure. Loading one array as an object is far faster to process in large quantities:
var t, a, b, c, objA;
// Fill some arrays to compare
a = Array(30000).fill(0).map(function(v,i) {
return i.toFixed();
});
b = Array(20000).fill(0).map(function(v,i) {
return (i*2).toFixed();
});
// Simple indexOf inside filter
t = Date.now();
c = b.filter(function(v) { return a.indexOf(v) < 0; });
console.log('completed indexOf in %j ms with result %j length', Date.now() - t, c.length);
// Load `a` as Object `A` first to avoid indexOf in filter
t = Date.now();
objA = {};
a.forEach(function(v) { objA[v] = true; });
c = b.filter(function(v) { return !objA[v]; });
console.log('completed Object in %j ms with result %j length', Date.now() - t, c.length);
Results:
completed indexOf in 1219 ms with result 5000 length
completed Object in 8 ms with result 5000 length
However, this works with strings only. If you plan to compare sets of numbers, you'll want to map the results with parseFloat.
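For example, a one-line sketch reusing c from the snippet above:
var numericDiff = c.map(parseFloat); // parseFloat ignores map's extra index argument
// note: c.map(parseInt) would misbehave, because parseInt treats the index as a radix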
The functions below are ports of the methods found in Python's set class and follow the TC39 Set methods proposal.
const
union = (a, b) => new Set([...a, ...b]),
intersection = (a, b) => new Set([...a].filter(x => b.has(x))),
difference = (a, b) => new Set([...a].filter(x => !b.has(x))),
symmetricDifference = (a, b) => union(difference(a, b), difference(b, a)),
isSubsetOf = (a, b) => [...b].every(x => a.has(x)),
isSupersetOf = (a, b) => [...a].every(x => b.has(x)),
isDisjointFrom = (a, b) => !intersection(a, b).size;
const
a = new Set([1, 2, 3, 4]),
b = new Set([5, 4, 3, 2]);
console.log(...union(a, b)); // [1, 2, 3, 4, 5]
console.log(...intersection(a, b)); // [2, 3, 4]
console.log(...difference(a, b)); // [1]
console.log(...difference(b, a)); // [5]
console.log(...symmetricDifference(a, b)); // [1, 5]
const
c = new Set(['A', 'B', 'C', 'D', 'E']),
d = new Set(['B', 'D']);
console.log(isSubsetOf(c, d)); // true
console.log(isSupersetOf(d, c)); // true
const
e = new Set(['A', 'B', 'C']),
f = new Set(['X', 'Y', 'Z']);
console.log(isDisjointFrom(e, f)); // true
.as-console-wrapper { top: 0; max-height: 100% !important; }
This works, but I think another one is much shorter, and elegant too:
A = [1, 'a', 'b', 12];
B = ['a', 3, 4, 'b'];
diff_set = {
ar : {},
diff : Array(),
remove_set : function(a) { this.ar = a; return this; },
remove: function (el) {
if (this.ar.indexOf(el) < 0) this.diff.push(el);
}
}
A.forEach(diff_set.remove_set(B).remove,diff_set);
C = diff_set.diff;
Using core-js to polyfill the New Set methods proposal:
import "core-js"
new Set(A).difference(new Set(B))
In theory, the time complexity should be Θ(n), where n is the number of elements in B.
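A usage sketch (not from the original answer) with the arrays from the question, assuming the core-js polyfill is loaded as above:
import "core-js";

const A = [1, 2, 3, 4];
const B = [1, 3, 4, 7];
const diff = new Set(A).difference(new Set(B));
console.log([...diff]); // [2]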
Let's say that I have a JavaScript array looking as follows:
["Element 1","Element 2","Element 3",...]; // with close to a hundred elements.
What approach would be appropriate to chunk (split) the array into many smaller arrays with, let's say, at most 10 elements each?
The array.slice() method can extract a slice from the beginning, middle, or end of an array for whatever purposes you require, without changing the original array.
const chunkSize = 10;
for (let i = 0; i < array.length; i += chunkSize) {
const chunk = array.slice(i, i + chunkSize);
// do whatever
}
The last chunk may be smaller than chunkSize. For example, when given an array of 12 elements, the first chunk will have 10 elements and the second chunk only 2.
Note that a chunkSize of 0 will cause an infinite loop.
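If you prefer this wrapped as a reusable function with a guard for that case (the function name here is just a suggestion):
function toChunks(array, chunkSize) {
if (chunkSize < 1) throw new Error('chunkSize must be a positive integer');
const chunks = [];
for (let i = 0; i < array.length; i += chunkSize) {
chunks.push(array.slice(i, i + chunkSize));
}
return chunks;
}
toChunks(['a', 'b', 'c', 'd', 'e'], 2); //=> [['a', 'b'], ['c', 'd'], ['e']]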
Here's an ES6 version using reduce:
const perChunk = 2 // items per chunk
const inputArray = ['a','b','c','d','e']
const result = inputArray.reduce((resultArray, item, index) => {
const chunkIndex = Math.floor(index/perChunk)
if(!resultArray[chunkIndex]) {
resultArray[chunkIndex] = [] // start a new chunk
}
resultArray[chunkIndex].push(item)
return resultArray
}, [])
console.log(result); // result: [['a','b'], ['c','d'], ['e']]
And you're ready to chain further map/reduce transformations.
Your input array is left intact
If you prefer a shorter but less readable version, you can sprinkle some concat into the mix for the same end result:
inputArray.reduce((all,one,i) => {
const ch = Math.floor(i/perChunk);
all[ch] = [].concat((all[ch]||[]),one);
return all
}, [])
You can use the remainder operator to put consecutive items into different chunks:
const ch = (i % perChunk);
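For comparison, swapping in that remainder-based index deals the items out round-robin instead of keeping them consecutive (same sample data as above):
const dealt = ['a', 'b', 'c', 'd', 'e'].reduce((resultArray, item, index) => {
const chunkIndex = index % 2; // remainder instead of Math.floor(index / perChunk)
(resultArray[chunkIndex] = resultArray[chunkIndex] || []).push(item);
return resultArray;
}, []);
console.log(dealt); // [['a', 'c', 'e'], ['b', 'd']]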
Modified from an answer by dbaseman: https://stackoverflow.com/a/10456344/711085
Object.defineProperty(Array.prototype, 'chunk_inefficient', {
value: function(chunkSize) {
var array = this;
return [].concat.apply([],
array.map(function(elem, i) {
return i % chunkSize ? [] : [array.slice(i, i + chunkSize)];
})
);
}
});
console.log(
[1, 2, 3, 4, 5, 6, 7].chunk_inefficient(3)
)
// [[1, 2, 3], [4, 5, 6], [7]]
minor addendum:
I should point out that the above is a not-that-elegant (in my mind) workaround to use Array.map. It basically does the following, where ~ is concatenation:
[[1,2,3]]~[]~[]~[] ~ [[4,5,6]]~[]~[]~[] ~ [[7]]
It has the same asymptotic running time as the method below, but perhaps a worse constant factor due to building empty lists. One could rewrite this as follows (mostly the same as Blazemonger's method, which is why I did not originally submit this answer):
More efficient method:
// refresh page if experimenting and you already defined Array.prototype.chunk
Object.defineProperty(Array.prototype, 'chunk', {
value: function(chunkSize) {
var R = [];
for (var i = 0; i < this.length; i += chunkSize)
R.push(this.slice(i, i + chunkSize));
return R;
}
});
console.log(
[1, 2, 3, 4, 5, 6, 7].chunk(3)
)
My preferred way nowadays is the above, or one of the following:
Array.range = function(n) {
// Array.range(5) --> [0,1,2,3,4]
return Array.apply(null,Array(n)).map((x,i) => i)
};
Object.defineProperty(Array.prototype, 'chunk', {
value: function(n) {
// ACTUAL CODE FOR CHUNKING ARRAY:
return Array.range(Math.ceil(this.length/n)).map((x,i) => this.slice(i*n,i*n+n));
}
});
Demo:
> JSON.stringify( Array.range(10).chunk(3) );
[[0,1,2],[3,4,5],[6,7,8],[9]]
Or if you don't want an Array.range function, it's actually just a one-liner (excluding the fluff):
var ceil = Math.ceil;
Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
return Array(ceil(this.length/n)).fill().map((_,i) => this.slice(i*n,i*n+n));
}});
or
Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
return Array.from(Array(ceil(this.length/n)), (_,i)=>this.slice(i*n,i*n+n));
}});
Try to avoid mucking with native prototypes, including Array.prototype, if you don't know who will be consuming your code (3rd parties, coworkers, yourself at a later date, etc.).
There are ways to safely extend prototypes (but not in all browsers) and there are ways to safely consume objects created from extended prototypes, but a better rule of thumb is to follow the Principle of Least Surprise and avoid these practices altogether.
If you have some time, watch Andrew Dupont's JSConf 2011 talk, "Everything is Permitted: Extending Built-ins", for a good discussion about this topic.
But back to the question: while the solutions above will work, they are overly complex and require unnecessary computational overhead. Here is my solution:
function chunk (arr, len) {
var chunks = [],
i = 0,
n = arr.length;
while (i < n) {
chunks.push(arr.slice(i, i += len));
}
return chunks;
}
// Optionally, you can do the following to avoid cluttering the global namespace:
Array.chunk = chunk;
Using generators
function* chunks(arr, n) {
for (let i = 0; i < arr.length; i += n) {
yield arr.slice(i, i + n);
}
}
let someArray = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
console.log([...chunks(someArray, 2)]) // [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]]
It can be typed with TypeScript like so:
function* chunks<T>(arr: T[], n: number): Generator<T[], void> {
for (let i = 0; i < arr.length; i += n) {
yield arr.slice(i, i + n);
}
}
I tested the different answers on jsperf.com. The results are available here: https://web.archive.org/web/20150909134228/https://jsperf.com/chunk-mtds
And the fastest function (and that works from IE8) is this one:
function chunk(arr, chunkSize) {
if (chunkSize <= 0) throw "Invalid chunk size";
var R = [];
for (var i=0,len=arr.length; i<len; i+=chunkSize)
R.push(arr.slice(i,i+chunkSize));
return R;
}
Splice version using ES6
let [list,chunkSize] = [[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15], 6];
list = [...Array(Math.ceil(list.length / chunkSize))].map(_ => list.splice(0,chunkSize))
console.log(list);
I'd prefer to use the splice method:
var chunks = function(array, size) {
var results = [];
while (array.length) {
results.push(array.splice(0, size));
}
return results;
};
Nowadays you can use lodash's chunk function to split the array into smaller arrays: https://lodash.com/docs#chunk. No need to fiddle with loops anymore!
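For example:
_.chunk(['a', 'b', 'c', 'd', 'e'], 2);
// => [['a', 'b'], ['c', 'd'], ['e']]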
Old question: New answer! I actually was working with an answer from this question and had a friend improve on it! So here it is:
Array.prototype.chunk = function ( n ) {
if ( !this.length ) {
return [];
}
return [ this.slice( 0, n ) ].concat( this.slice(n).chunk(n) );
};
[1,2,3,4,5,6,7,8,9,0].chunk(3);
> [[1,2,3],[4,5,6],[7,8,9],[0]]
One more solution using Array.prototype.reduce():
const chunk = (array, size) =>
array.reduce((acc, _, i) => {
if (i % size === 0) acc.push(array.slice(i, i + size))
return acc
}, [])
// Usage:
const numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
const chunked = chunk(numbers, 3)
console.log(chunked)
This solution is very similar to the solution by Steve Holgado. However, because this solution doesn't utilize array spreading and doesn't create new arrays in the reducer function, it's faster (see jsPerf test) and subjectively more readable (simpler syntax) than the other solution.
At every nth iteration (where n = size; starting at the first iteration), the accumulator array (acc) is appended with a chunk of the array (array.slice(i, i + size)) and then returned. At other iterations, the accumulator array is returned as-is.
If size is zero, the method returns an empty array. If size is negative, the method returns broken results. So, if needed in your case, you may want to do something about negative or non-positive size values.
If speed is important in your case, a simple for loop would be faster than using reduce() (see the jsPerf test), and some may find this style more readable as well:
function chunk(array, size) {
// This prevents infinite loops
if (size < 1) throw new Error('Size must be positive')
const result = []
for (let i = 0; i < array.length; i += size) {
result.push(array.slice(i, i + size))
}
return result
}
There have been many answers but this is what I use:
const chunk = (arr, size) =>
arr
.reduce((acc, _, i) =>
(i % size)
? acc
: [...acc, arr.slice(i, i + size)]
, [])
// USAGE
const numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
chunk(numbers, 3)
// [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]
First, check for a remainder when dividing the index by the chunk size.
If there is a remainder then just return the accumulator array.
If there is no remainder then the index is divisible by the chunk size, so take a slice from the original array (starting at the current index) and add it to the accumulator array.
So, the returned accumulator array for each iteration of reduce looks something like this:
// 0: [[1, 2, 3]]
// 1: [[1, 2, 3]]
// 2: [[1, 2, 3]]
// 3: [[1, 2, 3], [4, 5, 6]]
// 4: [[1, 2, 3], [4, 5, 6]]
// 5: [[1, 2, 3], [4, 5, 6]]
// 6: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
// 7: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
// 8: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
// 9: [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]
I think this is a nice recursive solution with ES6 syntax:
const chunk = function(array, size) {
if (!array.length) {
return [];
}
const head = array.slice(0, size);
const tail = array.slice(size);
return [head, ...chunk(tail, size)];
};
console.log(chunk([1,2,3], 2));
ONE-LINER
const chunk = (a,n)=>[...Array(Math.ceil(a.length/n))].map((_,i)=>a.slice(n*i,n+n*i));
For TypeScript
const chunk = <T>(arr: T[], size: number): T[][] =>
[...Array(Math.ceil(arr.length / size))].map((_, i) =>
arr.slice(size * i, size + size * i)
);
DEMO
const chunk = (a,n)=>[...Array(Math.ceil(a.length/n))].map((_,i)=>a.slice(n*i,n+n*i));
document.write(JSON.stringify(chunk([1, 2, 3, 4], 2)));
Chunk By Number Of Groups
const part=(a,n)=>[...Array(n)].map((_,i)=>a.slice(i*Math.ceil(a.length/n),(i+1)*Math.ceil(a.length/n)));
For TypeScript
const part = <T>(a: T[], n: number): T[][] => {
const b = Math.ceil(a.length / n);
return [...Array(n)].map((_, i) => a.slice(i * b, (i + 1) * b));
};
DEMO
const part = (a, n) => {
const b = Math.ceil(a.length / n);
return [...Array(n)].map((_, i) => a.slice(i * b, (i + 1) * b));
};
document.write(JSON.stringify(part([1, 2, 3, 4, 5, 6], 2))+'<br/>');
document.write(JSON.stringify(part([1, 2, 3, 4, 5, 6, 7], 2)));
Ok, let's start with a fairly tight one:
function chunk(arr, n) {
return arr.slice(0,(arr.length+n-1)/n|0).
map(function(c,i) { return arr.slice(n*i,n*i+n); });
}
Which is used like this:
chunk([1,2,3,4,5,6,7], 2);
Then we have this tight reducer function:
function chunker(p, c, i) {
(p[i/this|0] = p[i/this|0] || []).push(c);
return p;
}
Which is used like this:
[1,2,3,4,5,6,7].reduce(chunker.bind(3),[]);
Since a kitten dies when we bind this to a number, we can do manual currying like this instead:
// Fluent alternative API without prototype hacks.
function chunker(n) {
return function(p, c, i) {
(p[i/n|0] = p[i/n|0] || []).push(c);
return p;
};
}
Which is used like this:
[1,2,3,4,5,6,7].reduce(chunker(3),[]);
Then the still pretty tight function which does it all in one go:
function chunk(arr, n) {
return arr.reduce(function(p, cur, i) {
(p[i/n|0] = p[i/n|0] || []).push(cur);
return p;
},[]);
}
chunk([1,2,3,4,5,6,7], 3);
I aimed at creating a simple non-mutating solution in pure ES6. Peculiarities in JavaScript make it necessary to fill the empty array before mapping :-(
function chunk(a, l) {
return new Array(Math.ceil(a.length / l)).fill(0)
.map((_, n) => a.slice(n*l, n*l + l));
}
This version with recursion seems simpler and more compelling:
function chunk(a, l) {
if (a.length == 0) return [];
else return [a.slice(0, l)].concat(chunk(a.slice(l), l));
}
The ridiculously weak array functions of ES6 make for good puzzles :-)
I created an npm package for this: https://www.npmjs.com/package/array.chunk
function chunk(arr, size) {
var result = [];
for (var i = 0; i < arr.length; i += size) {
result.push(arr.slice(i, size + i));
}
return result;
}
When using a TypedArray
function chunk(arr, size) {
var result = [];
for (var i = 0; i < arr.length; i += size) {
result.push(arr.subarray(i, size + i));
}
return result;
}
Using Array.prototype.splice(), splice the array until it has no elements left.
Array.prototype.chunk = function(size) {
let result = [];
while(this.length) {
result.push(this.splice(0, size));
}
return result;
}
const arr = [1, 2, 3, 4, 5, 6, 7, 8, 9];
console.log(arr.chunk(2));
Update
Array.prototype.splice() mutates the original array, and after performing chunk() the original array (arr) becomes [].
So if you want to keep the original array untouched, copy the arr data into another array and do the same thing.
Array.prototype.chunk = function(size) {
let data = [...this];
let result = [];
while(data.length) {
result.push(data.splice(0, size));
}
return result;
}
const arr = [1, 2, 3, 4, 5, 6, 7, 8, 9];
console.log('chunked:', arr.chunk(2));
console.log('original', arr);
P.S: Thanks to #mts-knn for mentioning the matter.
I recommend using lodash. Chunking is one of many useful functions there.
Instructions:
npm i --save lodash
Include in your project:
import * as _ from 'lodash';
Usage:
const arrayOfElements = ["Element 1","Element 2","Element 3", "Element 4", "Element 5","Element 6","Element 7","Element 8","Element 9","Element 10","Element 11","Element 12"]
const chunkedElements = _.chunk(arrayOfElements, 10)
You can find my sample here:
https://playcode.io/659171/
The following ES2015 approach works without having to define a function and directly on anonymous arrays (example with chunk size 2):
[11,22,33,44,55].map((_, i, all) => all.slice(2*i, 2*i+2)).filter(x=>x.length)
If you want to define a function for this, you could do it as follows (improving on K._'s comment on Blazemonger's answer):
const array_chunks = (array, chunk_size) => array
.map((_, i, all) => all.slice(i*chunk_size, (i+1)*chunk_size))
.filter(x => x.length)
If you use ECMAScript version >= 5.1, you can implement a functional version of chunk() using array.reduce() that has O(N) complexity:
function chunk(chunkSize, array) {
return array.reduce(function(previous, current) {
var chunk;
if (previous.length === 0 ||
previous[previous.length -1].length === chunkSize) {
chunk = []; // 1
previous.push(chunk); // 2
}
else {
chunk = previous[previous.length -1]; // 3
}
chunk.push(current); // 4
return previous; // 5
}, []); // 6
}
console.log(chunk(2, ['a', 'b', 'c', 'd', 'e']));
// prints [ [ 'a', 'b' ], [ 'c', 'd' ], [ 'e' ] ]
Explanation of each numbered // comment above:
Create a new chunk if the previous value, i.e. the previously returned array of chunks, is empty or if the last previous chunk has chunkSize items
Add the new chunk to the array of existing chunks
Otherwise, the current chunk is the last chunk in the array of chunks
Add the current value to the chunk
Return the modified array of chunks
Initialize the reduction by passing an empty array
Currying based on chunkSize:
var chunk3 = function(array) {
return chunk(3, array);
};
console.log(chunk3(['a', 'b', 'c', 'd', 'e']));
// prints [ [ 'a', 'b', 'c' ], [ 'd', 'e' ] ]
You can add the chunk() function to the global Array object:
Object.defineProperty(Array.prototype, 'chunk', {
value: function(chunkSize) {
return this.reduce(function(previous, current) {
var chunk;
if (previous.length === 0 ||
previous[previous.length -1].length === chunkSize) {
chunk = [];
previous.push(chunk);
}
else {
chunk = previous[previous.length -1];
}
chunk.push(current);
return previous;
}, []);
}
});
console.log(['a', 'b', 'c', 'd', 'e'].chunk(4));
// prints [ [ 'a', 'b', 'c' 'd' ], [ 'e' ] ]
Use chunk from lodash
lodash.chunk(arr,<size>).forEach(chunk=>{
console.log(chunk);
})
js
function splitToBulks(arr, bulkSize = 20) {
const bulks = [];
for (let i = 0; i < Math.ceil(arr.length / bulkSize); i++) {
bulks.push(arr.slice(i * bulkSize, (i + 1) * bulkSize));
}
return bulks;
}
console.log(splitToBulks([1, 2, 3, 4, 5, 6, 7], 3));
typescript
function splitToBulks<T>(arr: T[], bulkSize: number = 20): T[][] {
const bulks: T[][] = [];
for (let i = 0; i < Math.ceil(arr.length / bulkSize); i++) {
bulks.push(arr.slice(i * bulkSize, (i + 1) * bulkSize));
}
return bulks;
}
const results = []
const chunk_size = 10
while (array.length > 0) {
results.push(array.splice(0, chunk_size))
}
The one-liner in pure JavaScript:
function chunks(array, size) {
return Array.apply(0,{length: Math.ceil(array.length / size)}).map((_, index) => array.slice(index*size, (index+1)*size))
}
// The following will group letters of the alphabet by 4
console.log(chunks([...Array(26)].map((x,i)=>String.fromCharCode(i + 97)), 4))
Here is an example where I split an array into chunks of 2 elements, simply by splicing chunks out of the array until the original array is empty.
const array = [86,133,87,133,88,133,89,133,90,133];
const new_array = [];
const chunksize = 2;
while (array.length) {
const chunk = array.splice(0,chunksize);
new_array.push(chunk);
}
console.log(new_array)
You can use the Array.prototype.reduce function to do this in one line.
let arr = [1,2,3,4];
function chunk(arr, size)
{
let result = arr.reduce((rows, key, index) => (index % size == 0 ? rows.push([key]) : rows[rows.length-1].push(key)) && rows, []);
return result;
}
console.log(chunk(arr,2));
const array = ['a', 'b', 'c', 'd', 'e'];
const size = 2;
const chunks = [];
while (array.length) {
chunks.push(array.splice(0, size));
}
console.log(chunks);
In CoffeeScript:
b = (a.splice(0, len) while a.length)
demo
a = [1, 2, 3, 4, 5, 6, 7]
b = (a.splice(0, 2) while a.length)
[ [ 1, 2 ],
[ 3, 4 ],
[ 5, 6 ],
[ 7 ] ]
And this would be my contribution to this topic. I guess .reduce() is the best way.
var segment = (arr, n) => arr.reduce((r,e,i) => i%n ? (r[r.length-1].push(e), r)
: (r.push([e]), r), []),
arr = Array.from({length: 31}).map((_,i) => i+1);
res = segment(arr,7);
console.log(JSON.stringify(res));
But the above implementation is not very efficient, since .reduce() runs through the whole arr array. A more efficient approach (very close to the fastest imperative solution) would be iterating over the to-be-chunked array, since we can calculate its size in advance with Math.ceil(arr.length/n). Once we have the empty result array, like Array(Math.ceil(arr.length/n)).fill(), the rest is to map slices of the arr array into it.
function chunk(arr,n){
var r = Array(Math.ceil(arr.length/n)).fill();
return r.map((e,i) => arr.slice(i*n, i*n+n));
}
arr = Array.from({length: 31},(_,i) => i+1);
res = chunk(arr,7);
console.log(JSON.stringify(res));
So far so good, but we can still simplify the above snippet further.
var chunk = (a,n) => Array.from({length: Math.ceil(a.length/n)}, (_,i) => a.slice(i*n, i*n+n)),
arr = Array.from({length: 31},(_,i) => i+1),
res = chunk(arr,7);
console.log(JSON.stringify(res));
Let's say that I have an Javascript array looking as following:
["Element 1","Element 2","Element 3",...]; // with close to a hundred elements.
What approach would be appropriate to chunk (split) the array into many smaller arrays with, lets say, 10 elements at its most?
The array.slice() method can extract a slice from the beginning, middle, or end of an array for whatever purposes you require, without changing the original array.
const chunkSize = 10;
for (let i = 0; i < array.length; i += chunkSize) {
const chunk = array.slice(i, i + chunkSize);
// do whatever
}
The last chunk may be smaller than chunkSize. For example when given an array of 12 elements the first chunk will have 10 elements, the second chunk only has 2.
Note that a chunkSize of 0 will cause an infinite loop.
Here's a ES6 version using reduce
const perChunk = 2 // items per chunk
const inputArray = ['a','b','c','d','e']
const result = inputArray.reduce((resultArray, item, index) => {
const chunkIndex = Math.floor(index/perChunk)
if(!resultArray[chunkIndex]) {
resultArray[chunkIndex] = [] // start a new chunk
}
resultArray[chunkIndex].push(item)
return resultArray
}, [])
console.log(result); // result: [['a','b'], ['c','d'], ['e']]
And you're ready to chain further map/reduce transformations.
Your input array is left intact
If you prefer a shorter but less readable version, you can sprinkle some concat into the mix for the same end result:
inputArray.reduce((all,one,i) => {
const ch = Math.floor(i/perChunk);
all[ch] = [].concat((all[ch]||[]),one);
return all
}, [])
You can use remainder operator to put consecutive items into different chunks:
const ch = (i % perChunk);
Modified from an answer by dbaseman: https://stackoverflow.com/a/10456344/711085
Object.defineProperty(Array.prototype, 'chunk_inefficient', {
value: function(chunkSize) {
var array = this;
return [].concat.apply([],
array.map(function(elem, i) {
return i % chunkSize ? [] : [array.slice(i, i + chunkSize)];
})
);
}
});
console.log(
[1, 2, 3, 4, 5, 6, 7].chunk_inefficient(3)
)
// [[1, 2, 3], [4, 5, 6], [7]]
minor addendum:
I should point out that the above is a not-that-elegant (in my mind) workaround to use Array.map. It basically does the following, where ~ is concatenation:
[[1,2,3]]~[]~[]~[] ~ [[4,5,6]]~[]~[]~[] ~ [[7]]
It has the same asymptotic running time as the method below, but perhaps a worse constant factor due to building empty lists. One could rewrite this as follows (mostly the same as Blazemonger's method, which is why I did not originally submit this answer):
More efficient method:
// refresh page if experimenting and you already defined Array.prototype.chunk
Object.defineProperty(Array.prototype, 'chunk', {
value: function(chunkSize) {
var R = [];
for (var i = 0; i < this.length; i += chunkSize)
R.push(this.slice(i, i + chunkSize));
return R;
}
});
console.log(
[1, 2, 3, 4, 5, 6, 7].chunk(3)
)
My preferred way nowadays is the above, or one of the following:
Array.range = function(n) {
// Array.range(5) --> [0,1,2,3,4]
return Array.apply(null,Array(n)).map((x,i) => i)
};
Object.defineProperty(Array.prototype, 'chunk', {
value: function(n) {
// ACTUAL CODE FOR CHUNKING ARRAY:
return Array.range(Math.ceil(this.length/n)).map((x,i) => this.slice(i*n,i*n+n));
}
});
Demo:
> JSON.stringify( Array.range(10).chunk(3) );
[[1,2,3],[4,5,6],[7,8,9],[10]]
Or if you don't want an Array.range function, it's actually just a one-liner (excluding the fluff):
var ceil = Math.ceil;
Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
return Array(ceil(this.length/n)).fill().map((_,i) => this.slice(i*n,i*n+n));
}});
or
Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
return Array.from(Array(ceil(this.length/n)), (_,i)=>this.slice(i*n,i*n+n));
}});
Try to avoid mucking with native prototypes, including Array.prototype, if you don't know who will be consuming your code (3rd parties, coworkers, yourself at a later date, etc.).
There are ways to safely extend prototypes (but not in all browsers) and there are ways to safely consume objects created from extended prototypes, but a better rule of thumb is to follow the Principle of Least Surprise and avoid these practices altogether.
If you have some time, watch Andrew Dupont's JSConf 2011 talk, "Everything is Permitted: Extending Built-ins", for a good discussion about this topic.
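If you do extend the prototype anyway, one reason the answers above reach for Object.defineProperty rather than plain assignment is that defineProperty creates a non-enumerable property by default, so the method does not leak into for...in loops over arrays. A small sketch (badChunk and goodChunk are placeholder names):
// Plain assignment creates an enumerable property on every array...
Array.prototype.badChunk = function () {};
for (var k in [1, 2]) console.log(k); // "0", "1", "badChunk"
delete Array.prototype.badChunk;

// ...whereas defineProperty is non-enumerable by default, so nothing leaks:
Object.defineProperty(Array.prototype, 'goodChunk', { value: function () {} });
for (var k in [1, 2]) console.log(k); // "0", "1"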
But back to the question: while the solutions above will work, they are overly complex and require unnecessary computational overhead. Here is my solution:
function chunk (arr, len) {
var chunks = [],
i = 0,
n = arr.length;
while (i < n) {
chunks.push(arr.slice(i, i += len));
}
return chunks;
}
// Optionally, you can do the following to avoid cluttering the global namespace:
Array.chunk = chunk;
Using generators
function* chunks(arr, n) {
for (let i = 0; i < arr.length; i += n) {
yield arr.slice(i, i + n);
}
}
let someArray = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
console.log([...chunks(someArray, 2)]) // [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]]
It can be typed with TypeScript like so:
function* chunks<T>(arr: T[], n: number): Generator<T[], void> {
for (let i = 0; i < arr.length; i += n) {
yield arr.slice(i, i + n);
}
}
I tested the different answers on jsperf.com. The results are available here: https://web.archive.org/web/20150909134228/https://jsperf.com/chunk-mtds
And the fastest function (which also works in IE8) is this one:
function chunk(arr, chunkSize) {
if (chunkSize <= 0) throw "Invalid chunk size";
var R = [];
for (var i=0,len=arr.length; i<len; i+=chunkSize)
R.push(arr.slice(i,i+chunkSize));
return R;
}
Splice version using ES6
let [list,chunkSize] = [[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15], 6];
list = [...Array(Math.ceil(list.length / chunkSize))].map(_ => list.splice(0,chunkSize))
console.log(list);
I'd prefer to use the splice method:
var chunks = function(array, size) {
var results = [];
while (array.length) {
results.push(array.splice(0, size));
}
return results;
};
Nowadays you can use lodash's chunk function to split the array into smaller arrays: https://lodash.com/docs#chunk. No need to fiddle with loops anymore!
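For example, a quick sketch of its usage (assuming lodash has been installed and imported as _):
const _ = require('lodash');
console.log(_.chunk([1, 2, 3, 4, 5], 2)); // [[1, 2], [3, 4], [5]]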
Old question, new answer! I was actually working with an answer from this question and had a friend improve on it! So here it is:
Array.prototype.chunk = function ( n ) {
if ( !this.length ) {
return [];
}
return [ this.slice( 0, n ) ].concat( this.slice(n).chunk(n) );
};
[1,2,3,4,5,6,7,8,9,0].chunk(3);
> [[1,2,3],[4,5,6],[7,8,9],[0]]
One more solution using Array.prototype.reduce():
const chunk = (array, size) =>
array.reduce((acc, _, i) => {
if (i % size === 0) acc.push(array.slice(i, i + size))
return acc
}, [])
// Usage:
const numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
const chunked = chunk(numbers, 3)
console.log(chunked)
This solution is very similar to the solution by Steve Holgado. However, because this solution doesn't utilize array spreading and doesn't create new arrays in the reducer function, it's faster (see jsPerf test) and subjectively more readable (simpler syntax) than the other solution.
At every nth iteration (where n = size; starting at the first iteration), the accumulator array (acc) is appended with a chunk of the array (array.slice(i, i + size)) and then returned. At other iterations, the accumulator array is returned as-is.
If size is zero, the method returns an empty array. If size is negative, the method returns broken results. So, if needed in your case, you may want to do something about negative or non-positive size values.
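For instance, a minimal sketch of one way to guard the reduce version against non-positive sizes (chunkSafe and the error message are just illustrative names):
const chunkSafe = (array, size) => {
  if (size < 1) throw new Error('Size must be a positive integer'); // guard against 0 and negatives
  return array.reduce((acc, _, i) => {
    if (i % size === 0) acc.push(array.slice(i, i + size));
    return acc;
  }, []);
};
console.log(chunkSafe([1, 2, 3, 4, 5], 2)); // [[1, 2], [3, 4], [5]]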
If speed is important in your case, a simple for loop would be faster than using reduce() (see the jsPerf test), and some may find this style more readable as well:
function chunk(array, size) {
// This prevents infinite loops
if (size < 1) throw new Error('Size must be positive')
const result = []
for (let i = 0; i < array.length; i += size) {
result.push(array.slice(i, i + size))
}
return result
}
There have been many answers, but this is what I use:
const chunk = (arr, size) =>
arr
.reduce((acc, _, i) =>
(i % size)
? acc
: [...acc, arr.slice(i, i + size)]
, [])
// USAGE
const numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
chunk(numbers, 3)
// [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]
First, check for a remainder when dividing the index by the chunk size.
If there is a remainder then just return the accumulator array.
If there is no remainder then the index is divisible by the chunk size, so take a slice from the original array (starting at the current index) and add it to the accumulator array.
So, the returned accumulator array for each iteration of reduce looks something like this:
// 0: [[1, 2, 3]]
// 1: [[1, 2, 3]]
// 2: [[1, 2, 3]]
// 3: [[1, 2, 3], [4, 5, 6]]
// 4: [[1, 2, 3], [4, 5, 6]]
// 5: [[1, 2, 3], [4, 5, 6]]
// 6: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
// 7: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
// 8: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
// 9: [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]
I think this is a nice recursive solution with ES6 syntax:
const chunk = function(array, size) {
if (!array.length) {
return [];
}
const head = array.slice(0, size);
const tail = array.slice(size);
return [head, ...chunk(tail, size)];
};
console.log(chunk([1,2,3], 2));
ONE-LINER
const chunk = (a,n)=>[...Array(Math.ceil(a.length/n))].map((_,i)=>a.slice(n*i,n+n*i));
For TypeScript
const chunk = <T>(arr: T[], size: number): T[][] =>
[...Array(Math.ceil(arr.length / size))].map((_, i) =>
arr.slice(size * i, size + size * i)
);
DEMO
const chunk = (a,n)=>[...Array(Math.ceil(a.length/n))].map((_,i)=>a.slice(n*i,n+n*i));
document.write(JSON.stringify(chunk([1, 2, 3, 4], 2)));
Chunk By Number Of Groups
const part=(a,n)=>[...Array(n)].map((_,i)=>a.slice(i*Math.ceil(a.length/n),(i+1)*Math.ceil(a.length/n)));
For TypeScript
const part = <T>(a: T[], n: number): T[][] => {
const b = Math.ceil(a.length / n);
return [...Array(n)].map((_, i) => a.slice(i * b, (i + 1) * b));
};
DEMO
const part = (a, n) => {
const b = Math.ceil(a.length / n);
return [...Array(n)].map((_, i) => a.slice(i * b, (i + 1) * b));
};
document.write(JSON.stringify(part([1, 2, 3, 4, 5, 6], 2))+'<br/>');
document.write(JSON.stringify(part([1, 2, 3, 4, 5, 6, 7], 2)));
Ok, let's start with a fairly tight one:
function chunk(arr, n) {
return arr.slice(0,(arr.length+n-1)/n|0).
map(function(c,i) { return arr.slice(n*i,n*i+n); });
}
Which is used like this:
chunk([1,2,3,4,5,6,7], 2);
Then we have this tight reducer function:
function chunker(p, c, i) {
(p[i/this|0] = p[i/this|0] || []).push(c);
return p;
}
Which is used like this:
[1,2,3,4,5,6,7].reduce(chunker.bind(3),[]);
Since a kitten dies when we bind this to a number, we can do manual currying like this instead:
// Fluent alternative API without prototype hacks.
function chunker(n) {
return function(p, c, i) {
(p[i/n|0] = p[i/n|0] || []).push(c);
return p;
};
}
Which is used like this:
[1,2,3,4,5,6,7].reduce(chunker(3),[]);
Then the still pretty tight function which does it all in one go:
function chunk(arr, n) {
return arr.reduce(function(p, cur, i) {
(p[i/n|0] = p[i/n|0] || []).push(cur);
return p;
},[]);
}
chunk([1,2,3,4,5,6,7], 3);
I aimed at creating a simple non-mutating solution in pure ES6. Peculiarities in JavaScript make it necessary to fill the empty array before mapping :-(
function chunk(a, l) {
return new Array(Math.ceil(a.length / l)).fill(0)
.map((_, n) => a.slice(n*l, n*l + l));
}
This version with recursion seems simpler and more compelling:
function chunk(a, l) {
if (a.length == 0) return [];
else return [a.slice(0, l)].concat(chunk(a.slice(l), l));
}
The ridiculously weak array functions of ES6 make for good puzzles :-)
I created an npm package for this: https://www.npmjs.com/package/array.chunk
function chunk(arr, size) {
  var result = [];
  for (var i = 0; i < arr.length; i += size) {
    result.push(arr.slice(i, size + i));
  }
  return result;
}
When using a TypedArray
function chunk(arr, size) {
  var result = [];
  for (var i = 0; i < arr.length; i += size) {
    // subarray instead of slice for TypedArrays
    result.push(arr.subarray(i, size + i));
  }
  return result;
}
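Note that subarray() returns views onto the same underlying buffer rather than copies, so writing to a chunk also writes to the original typed array. A small sketch to illustrate:
const bytes = new Uint8Array([1, 2, 3, 4, 5]);
const view = bytes.subarray(0, 2); // a view, not a copy
view[0] = 99;
console.log(bytes[0]); // 99, the original is affected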
Use Array.prototype.splice() and splice the array until it has no elements left.
Array.prototype.chunk = function(size) {
let result = [];
while(this.length) {
result.push(this.splice(0, size));
}
return result;
}
const arr = [1, 2, 3, 4, 5, 6, 7, 8, 9];
console.log(arr.chunk(2));
Update
Array.prototype.splice() mutates the original array, so after performing chunk() the original array (arr) becomes [].
So if you want to keep the original array untouched, copy the arr data into another array and do the same thing.
Array.prototype.chunk = function(size) {
let data = [...this];
let result = [];
while(data.length) {
result.push(data.splice(0, size));
}
return result;
}
const arr = [1, 2, 3, 4, 5, 6, 7, 8, 9];
console.log('chunked:', arr.chunk(2));
console.log('original', arr);
P.S: Thanks to #mts-knn for mentioning the matter.
I recommend using lodash. Chunking is one of many useful functions there.
Instructions:
npm i --save lodash
Include in your project:
import * as _ from 'lodash';
Usage:
const arrayOfElements = ["Element 1","Element 2","Element 3", "Element 4", "Element 5","Element 6","Element 7","Element 8","Element 9","Element 10","Element 11","Element 12"]
const chunkedElements = _.chunk(arrayOfElements, 10)
You can find my sample here:
https://playcode.io/659171/
The following ES2015 approach works without having to define a function and directly on anonymous arrays (example with chunk size 2):
[11,22,33,44,55].map((_, i, all) => all.slice(2*i, 2*i+2)).filter(x=>x.length)
If you want to define a function for this, you could do it as follows (improving on K._'s comment on Blazemonger's answer):
const array_chunks = (array, chunk_size) => array
.map((_, i, all) => all.slice(i*chunk_size, (i+1)*chunk_size))
.filter(x => x.length)
If you use ECMAScript version >= 5.1, you can implement a functional version of chunk() using array.reduce() that has O(N) complexity:
function chunk(chunkSize, array) {
return array.reduce(function(previous, current) {
var chunk;
if (previous.length === 0 ||
previous[previous.length -1].length === chunkSize) {
chunk = []; // 1
previous.push(chunk); // 2
}
else {
chunk = previous[previous.length -1]; // 3
}
chunk.push(current); // 4
return previous; // 5
}, []); // 6
}
console.log(chunk(2, ['a', 'b', 'c', 'd', 'e']));
// prints [ [ 'a', 'b' ], [ 'c', 'd' ], [ 'e' ] ]
Explanation of each numbered comment above:
1. Create a new chunk if the previous value, i.e. the previously returned array of chunks, is empty or if the last previous chunk has chunkSize items
2. Add the new chunk to the array of existing chunks
3. Otherwise, the current chunk is the last chunk in the array of chunks
4. Add the current value to the chunk
5. Return the modified array of chunks
6. Initialize the reduction by passing an empty array
Currying based on chunkSize:
var chunk3 = function(array) {
return chunk(3, array);
};
console.log(chunk3(['a', 'b', 'c', 'd', 'e']));
// prints [ [ 'a', 'b', 'c' ], [ 'd', 'e' ] ]
You can add the chunk() function to the global Array object:
Object.defineProperty(Array.prototype, 'chunk', {
value: function(chunkSize) {
return this.reduce(function(previous, current) {
var chunk;
if (previous.length === 0 ||
previous[previous.length -1].length === chunkSize) {
chunk = [];
previous.push(chunk);
}
else {
chunk = previous[previous.length -1];
}
chunk.push(current);
return previous;
}, []);
}
});
console.log(['a', 'b', 'c', 'd', 'e'].chunk(4));
// prints [ [ 'a', 'b', 'c', 'd' ], [ 'e' ] ]
Use chunk from lodash
lodash.chunk(arr,<size>).forEach(chunk=>{
console.log(chunk);
})
JavaScript:
function splitToBulks(arr, bulkSize = 20) {
const bulks = [];
for (let i = 0; i < Math.ceil(arr.length / bulkSize); i++) {
bulks.push(arr.slice(i * bulkSize, (i + 1) * bulkSize));
}
return bulks;
}
console.log(splitToBulks([1, 2, 3, 4, 5, 6, 7], 3));
TypeScript:
function splitToBulks<T>(arr: T[], bulkSize: number = 20): T[][] {
const bulks: T[][] = [];
for (let i = 0; i < Math.ceil(arr.length / bulkSize); i++) {
bulks.push(arr.slice(i * bulkSize, (i + 1) * bulkSize));
}
return bulks;
}
const results = [];
const chunk_size = 10;
while (array.length > 0) {
  results.push(array.splice(0, chunk_size));
}
The one-liner in pure JavaScript:
function chunks(array, size) {
return Array.apply(0,{length: Math.ceil(array.length / size)}).map((_, index) => array.slice(index*size, (index+1)*size))
}
// The following will group letters of the alphabet by 4
console.log(chunks([...Array(26)].map((x,i)=>String.fromCharCode(i + 97)), 4))
Here is an example where I split an array into chunks of 2 elements, simply by splicing chunks out of the array until the original array is empty.
const array = [86,133,87,133,88,133,89,133,90,133];
const new_array = [];
const chunksize = 2;
while (array.length) {
const chunk = array.splice(0,chunksize);
new_array.push(chunk);
}
console.log(new_array)
You can use the Array.prototype.reduce function to do this in one line.
let arr = [1,2,3,4];
function chunk(arr, size)
{
let result = arr.reduce((rows, key, index) => (index % size == 0 ? rows.push([key]) : rows[rows.length-1].push(key)) && rows, []);
return result;
}
console.log(chunk(arr,2));
const array = ['a', 'b', 'c', 'd', 'e'];
const size = 2;
const chunks = [];
while (array.length) {
chunks.push(array.splice(0, size));
}
console.log(chunks);
In CoffeeScript:
b = (a.splice(0, len) while a.length)
demo
a = [1, 2, 3, 4, 5, 6, 7]
b = (a.splice(0, 2) while a.length)
[ [ 1, 2 ],
[ 3, 4 ],
[ 5, 6 ],
[ 7 ] ]
And this would be my contribution to this topic. I guess .reduce() is the best way.
var segment = (arr, n) => arr.reduce((r,e,i) => i%n ? (r[r.length-1].push(e), r)
: (r.push([e]), r), []),
arr = Array.from({length: 31}).map((_,i) => i+1);
res = segment(arr,7);
console.log(JSON.stringify(res));
But the above implementation is not very efficient, since .reduce() runs through the whole arr array. A more efficient approach (very close to the fastest imperative solution) is to iterate over the array to be chunked, since we can calculate its size in advance with Math.ceil(arr.length/n). Once we have an empty result array such as Array(Math.ceil(arr.length/n)).fill(), the rest is to map slices of the arr array into it.
function chunk(arr,n){
var r = Array(Math.ceil(arr.length/n)).fill();
return r.map((e,i) => arr.slice(i*n, i*n+n));
}
arr = Array.from({length: 31},(_,i) => i+1);
res = chunk(arr,7);
console.log(JSON.stringify(res));
So far so good, but we can still simplify the above snippet further.
var chunk = (a,n) => Array.from({length: Math.ceil(a.length/n)}, (_,i) => a.slice(i*n, i*n+n)),
arr = Array.from({length: 31},(_,i) => i+1),
res = chunk(arr,7);
console.log(JSON.stringify(res));
Is there a JavaScript equivalent to Clojure's "reductions" function or Python's itertools.accumulate? In other words, given an array [x_0, x_1, x_2 ... x_n-1] and a function f(prev, next), it would return an array of length n with values:
[x_0, f(x_0, x_1), f(f(x_0, x_1), x_2), ..., f(f(...), x_n-1)]
I'm simulating the desired behavior below:
function accumsum(prev, next) {
  var last = prev[prev.length - 1] || 0;
prev.push(last + next);
return prev;
}
var x = [1, 1, 1, 1];
var y = x.reduce(accumsum, []);
var z = y.reduce(accumsum, []);
console.log(x);
console.log(y);
console.log(z);
which displays:
[ 1, 1, 1, 1 ]
[ 1, 2, 3, 4 ]
[ 1, 3, 6, 10 ]
But I'm wondering if there is a way to write something simpler like
[1, 1, 1, 1].reductions(function(prev, next) {return prev + next;});
If not, is there a more idiomatic way to do this in JavaScript than what I wrote?
var a = [1, 1, 1, 1];
var c = 0;
a.map(function(x) { return c += x; })
// => [1, 2, 3, 4]
a.reduce(function(c, a) {
c.push(c[c.length - 1] + a);
return c;
}, [0]).slice(1);
// => [1, 2, 3, 4]
I'd use the first one, personally.
EDIT:
Is there a way of doing your first suggestion that doesn't require me to have a random global variable (c in this case) floating around? If I forgot to re-initialize c back to 0, the second time I wrote a.map(...) it would give the wrong answer.
Sure - you can encapsulate it.
function cumulativeReduce(fn, start, array) {
var c = start;
return array.map(function(x) {
return (c = fn(c, x));
});
}
cumulativeReduce(function(c, a) { return c + a; }, 0, [1, 1, 1, 1]);
// => [1, 2, 3, 4]
c
// => ReferenceError - no dangling global variables
I wrote a stateless version (it relies on Underscore's _.drop and _.first):
function reductions(coll, reducer, init) {
if (!coll.length) {
return [init]
}
if (init === undefined) {
return reductions(_.drop(coll, 1), reducer, _.first(coll))
}
return [init].concat(reductions(_.drop(coll, 1), reducer, reducer(init, _.first(coll))))
}
For posterity: if you're using an older version of JavaScript or don't have access to Underscore, it's not difficult to implement reduce from scratch, and it has some educational value.
Here's one way to do it:
function reduce(a, fn, memo) {
var i;
for (i = 0; i < a.length; ++i) {
if ( typeof memo === 'undefined' && i === 0 ) memo = a[i];
else memo = fn(memo, a[i]);
}
return memo;
}
Also, other higher-order functions can be written in terms of reduce, e.g. map, shown here:
function map(a, fn) {
return reduce(a, function(memo, x) {
    return memo.concat(fn(x)); // apply fn to the current element
}, []);
}
For reference, the equivalent imperative (and faster) version of map would be:
function map2(a, fn) {
var newA = [], i;
for (i = 0; i < a.length; ++i) {
newA.push(fn(a[i]));
}
return newA;
}
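For a quick sanity check of the definitions above (the sample inputs are arbitrary):
console.log(reduce([1, 2, 3, 4], function (a, b) { return a + b; })); // 10
console.log(map([1, 2, 3], function (x) { return x * 2; }));          // [2, 4, 6]
console.log(map2([1, 2, 3], function (x) { return x * 2; }));         // [2, 4, 6]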