Using map() on an iterator - javascript

Say we have a Map: let m = new Map();. Calling m.values() returns a map iterator.
But I can't use forEach() or map() on that iterator, and writing a while loop over it seems like an anti-pattern since ES6 offers functions like map().
So is there a way to use map() on an iterator?

The simplest and least performant way to do this is:
Array.from(m).map(([key,value]) => /* whatever */)
Better yet
Array.from(m, ([key, value]) => /* whatever */)
Array.from takes any iterable or array-like thing and converts it into an array! As Daniel points out in the comments, we can pass a mapping function to the conversion to skip an extra iteration and the intermediate array.
Using Array.from will move your memory usage from O(1) to O(n), as hraban points out in the comments. But since m is a Map, and Maps can't be infinite, we don't have to worry about an infinite sequence. For most cases, this will suffice.
There are a couple of other ways to loop through a map.
Using forEach
m.forEach((value,key) => /* stuff */ )
Using for..of
var myMap = new Map();
myMap.set(0, 'zero');
myMap.set(1, 'one');
for (var [key, value] of myMap) {
  console.log(key + ' = ' + value);
}
// 0 = zero
// 1 = one
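The same kind of loop works over the Map's key and value iterators:
for (var key of myMap.keys()) {
  console.log(key); // 0, 1
}
for (var value of myMap.values()) {
  console.log(value); // zero, one
}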

You could define another iterator function to loop over this:
function* generator() {
  for (let i = 0; i < 10; i++) {
    console.log(i);
    yield i;
  }
}

function* mapIterator(iterator, mapping) {
  for (let i of iterator) {
    yield mapping(i);
  }
}

let values = generator();
let mapped = mapIterator(values, (i) => {
  let result = i * 2;
  console.log(`x2 = ${result}`);
  return result;
});
console.log('The values will be generated right now.');
console.log(Array.from(mapped).join(','));
Now you might ask: why not just use Array.from instead? Because this will run through the entire iterator, save it to a (temporary) array, iterate it again and then do the mapping. If the list is huge (or even potentially infinite) this will lead to unnecessary memory usage.
Of course, if the list of items is fairly small, using Array.from should be more than sufficient.
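To make the difference concrete, here is a small sketch reusing the mapIterator helper above with a hypothetical infinite generator; only the values actually requested are ever computed:
function* naturals() {
  let n = 0;
  while (true) yield n++; // never terminates on its own
}
const doubled = mapIterator(naturals(), (i) => i * 2);
// Only three values are ever computed, despite the infinite source:
console.log(doubled.next().value); // 0
console.log(doubled.next().value); // 2
console.log(doubled.next().value); // 4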

Other answers here are... Weird. They seem to be re-implementing parts of the iteration protocol. You can just do this:
function* mapIter(iterable, callback) {
  for (let x of iterable) {
    yield callback(x);
  }
}
and if you want a concrete result, just use the spread operator (...):
[...mapIter([1, 2, 3], x => x**2)]

The simplest and most performant way is to use the second argument to Array.from to achieve this:
const map = new Map()
map.set('a', 1)
map.set('b', 2)
Array.from(map, ([key, value]) => `${key}:${value}`)
// ['a:1', 'b:2']
This approach works for any non-infinite iterable, and it avoids the separate Array.from(map).map(...) call, which would iterate through the iterable twice and hurt performance.

There is a proposal that brings multiple helper functions to Iterator: https://github.com/tc39/proposal-iterator-helpers (rendered)
You can use it today by utilizing core-js-pure:
import { from as iterFrom } from "core-js-pure/features/iterator";
// or if it's working for you (it should work according to the docs,
// but hasn't for me for some reason):
// import iterFrom from "core-js-pure/features/iterator/from";
let m = new Map();
m.set("13", 37);
m.set("42", 42);
const arr = iterFrom(m.values())
.map((val) => val * 2)
.toArray();
// prints "[74, 84]"
console.log(arr);
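Note that the proposal has since been implemented natively in some runtimes; where Iterator helpers are available (e.g. recent V8-based engines), the same chain should work without core-js:
let m2 = new Map();
m2.set("13", 37);
m2.set("42", 42);
// assuming native Iterator helper support:
const doubled = m2.values().map((val) => val * 2).toArray();
console.log(doubled); // [74, 84]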

You could retrieve an iterator over the iterable, then return another iterator that calls the mapping callback function on each iterated element.
const map = (iterable, callback) => {
  return {
    [Symbol.iterator]() {
      const iterator = iterable[Symbol.iterator]();
      return {
        next() {
          const r = iterator.next();
          if (r.done) {
            return r;
          } else {
            return {
              value: callback(r.value),
              done: false,
            };
          }
        }
      };
    }
  };
};
// Arrays are iterable
console.log(...map([0, 1, 2, 3, 4], (num) => 2 * num)); // 0 2 4 6 8
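Since the question started from a Map, note that the same helper applies to a Map's value iterator as well:
const m = new Map([['a', 1], ['b', 2]]);
console.log(...map(m.values(), (num) => 10 * num)); // 10 20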

Take a look at https://www.npmjs.com/package/fluent-iterable
It works with all iterables (Map, generator functions, arrays) and async iterables.
const map = new Map();
...
console.log(fluent(map).filter(..).map(..));

You could use itiriri, which implements array-like methods for iterables:
import { query } from 'itiriri';
let m = new Map();
// set map ...
query(m).filter(([k, v]) => k < 10).forEach(([k, v]) => console.log(v));
let arr = query(m.values()).map(v => v * 10).toArray();

In case someone needs the TypeScript version:
function* mapIter<T1, T2>(iterable: IterableIterator<T1>, callback: (value: T1) => T2) {
  for (let x of iterable) {
    yield callback(x);
  }
}

Based on the answer from MartyO256 (https://stackoverflow.com/a/53159921/7895659), a refactored TypeScript approach could be the following:
function mapIterator<TIn, TOut>(
  iterator: Iterator<TIn>,
  callback: (input: TIn) => TOut,
): Iterator<TOut> {
  return {
    next() {
      const result: IteratorResult<TIn> = iterator.next();
      if (result.done === true) {
        return result;
      } else {
        return {
          done: false,
          value: callback(result.value),
        };
      }
    },
  };
}

export function mapIterable<TIn, TOut>(
  iterable: Iterable<TIn>,
  callback: (input: TIn) => TOut,
): Iterable<TOut> {
  const iterator: Iterator<TIn> = iterable[Symbol.iterator]();
  const mappedIterator: Iterator<TOut> = mapIterator(iterator, callback);
  return {
    [Symbol.iterator]: () => mappedIterator,
  };
}

Related

Is my understanding of transducers correct?

Let's start with a definition: a transducer is a function that takes a reducer function and returns a reducer function.
A reducer is a binary function that takes an accumulator and a value and returns an accumulator. A reducer can be executed with a reduce function (note: all functions are curried, but I've left that out, along with the definitions of pipe and compose, for the sake of readability - sketches follow below, and you can see the real ones in the live demo):
const reduce = (reducer, init, data) => {
  let result = init;
  for (const item of data) {
    result = reducer(result, item);
  }
  return result;
}
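For example, summing a list with this reduce:
reduce((acc, n) => acc + n, 0, [1, 2, 3]); // 6
And, for reference, minimal non-curried sketches of the pipe and compose helpers mentioned above (assumptions sufficient to follow the examples; the live demo uses curried variants):
const pipe = (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);
const compose = (...fns) => x => fns.reduceRight((acc, fn) => fn(acc), x);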
With reduce we can implement map and filter functions:
const mapReducer = xf => (acc, item) => [...acc, xf(item)];
const map = (xf, arr) => reduce(mapReducer(xf), [], arr);

const filterReducer = predicate => (acc, item) => predicate(item) ?
  [...acc, item] :
  acc;
const filter = (predicate, arr) => reduce(filterReducer(predicate), [], arr);
As we can see, there are a few similarities between map and filter, and both of these functions work only with arrays. Another disadvantage is that when we compose the two, each step creates a temporary array that gets passed to the next function.
const even = n => n % 2 === 0;
const double = n => n * 2;
const doubleEven = pipe(filter(even), map(double));
doubleEven([1,2,3,4,5]);
// first we get [2, 4] from filter
// then final result: [4, 8]
Transducers help us solve those concerns: when we use a transducer, no temporary arrays are created, and we can generalize our functions to work with more than just arrays. Transducers are generally executed by passing them to a transduce function:
const transduce = (xform, reducer, init, data) =>
  reduce(xform(reducer), init, data);

const mapping = (xf, reducer) => (acc, item) => reducer(acc, xf(item));

const filtering = (predicate, reducer) => (acc, item) => predicate(item) ?
  reducer(acc, item) :
  acc;

const arrReducer = (acc, item) => [...acc, item];

const transformer = compose(filtering(even), mapping(double));
const performantDoubleEven = transduce(transformer, arrReducer, []);

performantDoubleEven([1, 2, 3, 4, 5]); // -> [4, 8] with no temporary arrays created
We can even define the array map and filter in terms of transducers, because they're so composable:
const map = (xf, data) => transduce(mapping(xf), arrReducer, [], data);
const filter = (predicate, data) => transduce(filtering(predicate), arrReducer, [], data);
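For instance, relying on the curried definitions:
map(double, [1, 2, 3]); // [2, 4, 6]
filter(even, [1, 2, 3, 4]); // [2, 4]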
Here's a live version if you'd like to run the code: https://runkit.com/marzelin/transducers
Does my reasoning make sense?
Your understanding is correct but incomplete.
In addition to the concepts you've described, transducers can do the following:
Support an early-exit semantic
Support a completion semantic
Be stateful
Support an init value for the step function.
So for instance, an implementation in JavaScript would need to do this:
// Ensure reduce preserves early termination
let called = 0;
let updatesCalled = map(a => { called += 1; return a; });
let hasTwo = reduce(compose(take(2), updatesCalled)(append), [1,2,3]).toString();
console.assert(hasTwo === '1,2', hasTwo);
console.assert(called === 2, called);
Here, because of the call to take, the reducing operation bails out early.
It needs to be able to (optionally) call the step function with no arguments for an initial value:
// handles lack of initial value
let mapDouble = map(n => n * 2);
console.assert(reduce(mapDouble(sum), [1,2]) === 6);
Here a call to sum with no arguments returns the additive identity (zero) to seed the reduction.
In order to accomplish this, here's a helper function:
const addArities = (defaultValue, reducer) => (...args) => {
switch (args.length) {
case 0: return typeof defaultValue === 'function' ? defaultValue() : defaultValue;
case 1: return args[0];
default: return reducer(...args);
}
};
This takes an initial value (or a function that can provide one) and a reducer, and returns a reducer seeded with that initial value:
const sum = addArities(0, (a, b) => a + b);
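append itself isn't shown here, but given that description it can presumably be defined with the same helper (an assumed definition, not verbatim from the library):
const append = addArities(() => [], (acc, item) => [...acc, item]); // assumption: empty array as the identity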
Now sum has the proper semantics, and append from the first example can be defined the same way (see the sketch above). For a stateful transducer, look at take (including its helper functions):
// Helpers assumed by this snippet (not shown in the original answer):
const DONE = Symbol('done'); // key used to mark a wrapped (early-completed) value
const exists = x => x !== undefined && x !== null;
const NON_ITER = new TypeError('Collection must be iterable');

// Denotes early completion
class _Wrapped {
  constructor (val) { this[DONE] = val }
}
const isReduced = a => a instanceof _Wrapped;
// ensures reduced for bubbling
const reduced = a => a instanceof _Wrapped ? a : new _Wrapped(a);
const unWrap = a => isReduced(a) ? a[DONE] : a;

const enforceArgumentContract = f => (xform, reducer, accum, input, state) => {
  // initialization
  if (!exists(input)) return reducer();
  // Early termination, bubble
  if (isReduced(accum)) return accum;
  return f(xform, reducer, accum, input, state);
};

/*
 * factory
 *
 * Helper for creating transducers.
 *
 * Takes a step process and an initial state, and returns a function that takes
 * a transforming function, which returns a transducer that takes a reducing
 * function, an optional collection, and an optional initial value. If no
 * collection is passed, it returns a modified reducing function; otherwise it
 * reduces the collection.
 */
const factory = (process, initState) => xform => (reducer, coll, initValue) => {
  let state = {};
  state.value = typeof initState === 'function' ? initState() : initState;
  let step = enforceArgumentContract(process);
  let trans = (accum, input) => step(xform, reducer, accum, input, state);
  if (coll === undefined) {
    return trans; // return transducer
  } else if (typeof coll[Symbol.iterator] === 'function') {
    return unWrap(reduce(...[trans, coll, initValue].filter(exists)));
  } else {
    throw NON_ITER;
  }
};

const take = factory((n, reducer, accum, input, state) => {
  if (state.value >= n) {
    return reduced(accum);
  } else {
    state.value += 1;
  }
  return reducer(accum, input);
}, () => 0);
If you want to see all of this in action, I made a little library a while back. Although I ignored the interop protocol from Cognitect (I just wanted to get the concepts across), I did try to implement the semantics as accurately as possible, based on Rich Hickey's talks from Strange Loop and Conj.

How to clone an Iterator in javascript?

In ES6, is it possible to clone an iterator's state?
var ma=[1,2,3,4];
var it=ma[Symbol.iterator]();
it.next();
If I want to remember the state of it at this point, how should I do that in JavaScript? And what exactly is remembered in it? It can't be inspected, since
JSON.stringify(it) // just returns {}
You can’t clone an arbitrary iterator, but you can create many distinct iterators from one by holding onto some state:
function tee(iterable) {
  const source = iterable[Symbol.iterator]();
  const buffers = [[], []]; // substitute in queue type for efficiency
  const DONE = Object.create(null);

  const next = i => {
    if (buffers[i].length !== 0) {
      return buffers[i].shift();
    }
    const x = source.next();
    if (x.done) {
      return DONE;
    }
    buffers[1 - i].push(x.value);
    return x.value;
  };

  return buffers.map(function* (_, i) {
    for (;;) {
      const x = next(i);
      if (x === DONE) {
        break;
      }
      yield x;
    }
  });
}
Usage:
const [a, b] = tee(iterator);
assert(a.next().value === b.next().value);
It's not possible to clone an iterator. Iterator state is basically completely arbitrary and any given iterator may require or produce side effects (e.g. reading from or writing to a network stream) which are not repeatable on demand.
I built a library that allows you to fork an iterator here: https://github.com/tjenkinson/forkable-iterator
Means you can do something like:
import { buildForkableIterator, fork } from 'forkable-iterator';
function* Source() {
  yield 1;
  yield 2;
  return 'return';
}
const forkableIterator = buildForkableIterator(Source());
console.log(forkableIterator.next()); // { value: 1, done: false }
const child1 = fork(forkableIterator);
// { value: 2, done: false }
console.log(child1.next());
// { value: 2, done: false }
console.log(forkableIterator.next());
// { value: 'return', done: true }
console.log(child1.next());
// { value: 'return', done: true }
console.log(forkableIterator.next());
If you no longer need to keep consuming from a fork, then provided you lose your references to it, there also shouldn't be a memory leak.
It's not official yet, but I think there might be a solution in a stage 2 proposal for Iterator Helpers. If these methods don't affect the original iterator, then doing something like iter.take(Infinity) or iter.drop(0) would have the same effect as cloning.

understanding the iterator protocol

In the notes it states:
The iterable protocol allows JavaScript objects to define or customize
their iteration behavior, such as what values are looped over in a
for..of construct.
I don't see what benefit this has when I can already use Object.defineProperty to make something enumerable.
function withValue(value) {
  var d = withValue.d || (
    withValue.d = {
      enumerable: false,
      writable: false,
      configurable: false,
      value: null
    }
  );
  // other code;
}
What benefit do these protocols have? If this is just some new syntax to appease the new for…of loop, what benefit does it have other than simply checking the length and seeing whether it has run out of items in the "list"?
Think of Iterable as an interface. You can be assured implementations contain a Symbol.iterator property, which returns an iterator with a next() method. If you implement it yourself, you can produce the values you want to iterate over at runtime. As a simple example, produce a list and decide later how many of its items (or which ones, or by whatever criteria) you would like to iterate over:
function List (...args) {
  this.getOnly = function (limit) {
    const effectiveLimit = Math.min(args.length, limit);
    const iterable = {
      [Symbol.iterator]() {
        let count = 0;
        const iterator = {
          next() {
            if (count < effectiveLimit) {
              return { value: args[count++] };
            } else {
              return { done: true };
            }
          }
        };
        return iterator;
      }
    };
    return iterable;
  };
}
const list = new List(0, 1, 2, 3, 4);
for (const x of list.getOnly(3)) {
  console.log(x);
}
// logs 0, 1, 2
If you use a Generator function, which implements the Iterable interface, the same gets really simple:
function List (...args) {
  this.getOnly = function* (limit) {
    const effectiveLimit = Math.min(args.length, limit);
    for (let count = 0; count < effectiveLimit; count++) {
      yield args[count];
    }
  };
}
More examples of what you can do with Iterables are listed here.

Object literal (hash) with Promise.all

I have a situation where it would be quite convenient to use Promise.all like so: Promise.all({}), instead of the more standard Promise.all([]).
But this doesn't seem to work:
Promise.all({a: 1, b: 2}).then(function(val){
  console.log('val:', val);
});
whilst this does of course
Promise.all([1, 2, 3]).then(function(val){
  console.log('val:', val);
});
(what I would expect would be for Promise.all to map the values of the Object literal, but leave the keys intact.)
But the MDN docs for Promise seem to indicate that Promise.all will work for any iterable. To my knowledge, an object literal {} is iterable. So what am I missing?
Here is another async / await solution:
async function allOf(hash = {}) {
  const promises = Object.keys(hash).map(async key => ({ [key]: await hash[key] }));
  const resolved = await Promise.all(promises);
  return resolved.reduce((hash, part) => ({ ...hash, ...part }), {});
}
This converts each key into a promise that produces a single-element hash; then at the end we merge all those hashes into a single one. You could compact this into a one-liner, at the cost of readability.
async function allOfOneLiner(hash = {}) {
  return (await Promise.all(Object.keys(hash).map(async k => ({ [k]: await hash[k] })))).reduce((h, p) => ({ ...h, ...p }), {});
}
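A quick usage sketch for the allOf helper above:
allOf({ a: Promise.resolve(1), b: Promise.resolve(2) })
  .then(result => console.log(result)); // { a: 1, b: 2 }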
Plain objects do not have a Symbol.iterator, as you can see in the MDN documentation.
What you can do is use a helper function to create an iterable from the object and then consume it.
This is based on the objectEntries source; however, Node.js did not implement Reflect at the time, so for the purpose of using it with Node I changed it to use Object.keys():
function objectEntries(obj) {
  let index = 0;
  // In ES6, you can use strings or symbols as property keys;
  // Reflect.ownKeys() retrieves both
  let propKeys = Object.keys(obj);
  return {
    [Symbol.iterator]() {
      return this;
    },
    next() {
      if (index < propKeys.length) {
        let key = propKeys[index];
        index++;
        return { value: [key, obj[key]] };
      } else {
        return { done: true };
      }
    }
  };
}
Use Object.values. Works in Firefox Nightly:
Promise.all(Object.values({a: 1, b: 2}))
  .then(vals => console.log('vals: ' + vals)) // vals: 1,2
  .catch(e => console.log(e));
Then, to put the results back in an object, we can make a Promise.allParams function:
Promise.allParams = o =>
  Promise.all(Object.values(o)).then(results =>
    Object.keys(o).reduce((o2, key, i) => (o2[key] = results[i], o2), {}));

// Demo:
Promise.allParams({a: 1, b: 2}).then(function(val){
  console.log('val: ' + JSON.stringify(val)); // val: {"a":1,"b":2}
});
Syntax
Promise.all(iterable);
Parameters
iterable
An iterable object, such as an Array. See iterable.
This function does the trick:
Promise.allAssoc = function(object){
  var values = [], keys = [];
  for (var key in object) {
    values.push(object[key]);
    keys.push(key);
  }
  return Promise.all(values).then(function(results){
    var out = {};
    for (var i = 0; i < results.length; i++) out[keys[i]] = results[i];
    return out;
  });
};
Not all objects are iterable by default. You can make an object iterable by defining a @@iterator method. @@iterator is a well-known symbol, available as Symbol.iterator:
Specification Name: @@iterator
[[Description]]: "Symbol.iterator"
Value and Purpose: A method that returns the default iterator for an object. Called by the semantics of the for-of statement.
For example, this will make all object iterable (probably not a good idea):
Object.prototype[Symbol.iterator] = function*() {
  for (let key of Object.keys(this))
    yield this[key];
};
Then you will be able to use
Promise.all({a: 1, b: 2}).then(function(val){
  console.log('val:', val); // [ 1, 2 ]
});
With Babel/ES2015 you can use Object.keys and map to get the values like this:
const obj = {a: 1, b: 2};
const vals = Object.keys(obj).map(k => obj[k]);
Promise.all(vals).then(vals => { console.log('vals', vals) });
Async / await way
Promise.hashProperties = async function(object) {
  const keys = [];
  const values = [];
  for (const key in object) {
    keys.push(key);
    values.push(object[key]);
  }
  const results = await Promise.all(values);
  for (var i = 0; i < results.length; i++)
    object[keys[i]] = results[i];
  return object;
};
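Usage looks like this (non-promise values pass through unchanged, since Promise.all resolves them as-is):
Promise.hashProperties({ a: Promise.resolve(1), b: 2 })
  .then(result => console.log(result)); // { a: 1, b: 2 }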

How to clone ES6 generator?

I'm trying to create a List monad in ES6 using generators. To make it work I need to create a copy of an iterator that has already consumed several states. How do I clone an iterator in ES6?
function* test() {
  yield 1;
  yield 2;
  yield 3;
}
var x = test();
console.log(x.next().value); // 1
var y = clone(x);
console.log(x.next().value); // 2
console.log(y.next().value); // 2 (sic)
I've tried clone and cloneDeep from lodash, but they were of no use. Iterators that are returned in this way are native functions and keep their state internally, so it seems there's no way to do it with my own JS code.
Iterators […] keep their state internally, so it seems there's no way
Yes, and for good reason: you cannot clone the state, because otherwise you could tamper too much with the generator.
It might be possible however to create a second iterator that runs alongside of the first one, by memorizing its sequence and yielding it later again. However, there should be only one iterator that really drives the generator - otherwise, which of your clones would be allowed to send next() arguments?
I wrote a do-notation library for JavaScript, burrido. To get around the mutable generator problem I made immutagen, which emulates an immutable generator by maintaining a history of input values and replaying them to clone the generator at any particular state.
You can't clone a generator; it's just a function with no state. What could have state, and therefore what could be cloned, is the iterator resulting from invoking the generator function.
This approach caches intermediate results, so that cloned iterators can access them if necessary until they "catch up". It returns an object which is both an iterator and an iterable, so you can either call next on it or for...of over it. Any iterator may be passed in, so you could in theory have cloned iterators over an array by passing in array.values(). Whichever clone calls next first at a given point in the iteration will have the argument passed to next, if any, reflected in the value of the yield in the underlying generator.
function clonableIterator(it) {
  var vals = [];
  return function make(n) {
    return {
      next(arg) {
        const len = vals.length;
        if (n >= len) vals[len] = it.next(arg);
        return vals[n++];
      },
      clone() { return make(n); },
      throw(e) { if (it.throw) it.throw(e); },
      return(v) { if (it.return) it.return(v); },
      [Symbol.iterator]() { return this; }
    };
  }(0);
}

function *gen() {
  yield 1;
  yield 2;
  yield 3;
}

var it = clonableIterator(gen());
console.log(it.next());
var clone = it.clone();
console.log(clone.next());
console.log(it.next());
Obviously this approach has the problem that it keeps the entire history of the iterator. One optimization would be to keep a WeakMap of all the cloned iterators and how far they have progressed, and then clean up the history to eliminate all the past values that have already been consumed by all clones.
Thanks for the comments on my previous question. Inspired by those and some of the answers here I've made a cloneable_generator_factory to solve the problem:
function cloneable_generator_factory (args, generator_factory, next_calls = [])
{
  let generator = generator_factory(args)

  const cloneable_generator = {
    next: (...args) =>
    {
      next_calls.push(args)
      return generator.next(...args)
    },
    throw: e => generator.throw(e),
    return: e => generator.return(e),
    [Symbol.iterator]: () => cloneable_generator,
    clone: () =>
    {
      // todo, use structuredClone when supported
      const partial_deep_cloned_next_args = [...next_calls].map(args => [...args])
      return cloneable_generator_factory(args, generator_factory, partial_deep_cloned_next_args)
    },
  }

  // Call `generator`, not `cloneable_generator`, to avoid args for `next` being multiplied indefinitely
  next_calls.forEach(args => generator.next(...args))

  return cloneable_generator
}
// Demo
function* jumpable_sequence (args) {
  let i = args.start
  while (true)
  {
    let jump = yield ++i
    if (jump !== undefined) i += jump
  }
}
let iter = cloneable_generator_factory({ start: 10 }, jumpable_sequence)
console.log(iter.next().value) // 11
console.log(iter.next(3).value) // 15 (from 11 + 1 + 3)
let saved = iter.clone()
console.log("Saved. Continuing...")
console.log(iter.next().value) // 16
console.log(iter.next(10).value) // 27 (from 16 + 1 + 10)
console.log("Restored")
iter = saved
console.log(iter.next().value) // 16
console.log(iter.next().value) // 17
console.log(iter.next().value) // 18
For those using TypeScript, here's a link to the playground of the following code:
interface CloneableGenerator <A, B, C> extends Generator<A, B, C>
{
  clone: () => CloneableGenerator<A, B, C>
}

function cloneable_generator_factory <R, A, B, C> (args: R, generator_factory: (args: R) => Generator<A, B, C>, next_calls: ([] | [C])[] = []): CloneableGenerator<A, B, C>
{
  let generator = generator_factory(args)

  const cloneable_generator: CloneableGenerator<A, B, C> = {
    next: (...args: [] | [C]) =>
    {
      next_calls.push(args)
      return generator.next(...args)
    },
    throw: e => generator.throw(e),
    return: e => generator.return(e),
    [Symbol.iterator]: () => cloneable_generator,
    clone: () =>
    {
      // todo, use structuredClone when supported
      const partial_deep_cloned_next_args: ([] | [C])[] = [...next_calls].map(args => [...args])
      return cloneable_generator_factory(args, generator_factory, partial_deep_cloned_next_args)
    },
  }

  // Call `generator`, not `cloneable_generator`, to avoid args for `next` being multiplied indefinitely
  next_calls.forEach(args => generator.next(...args))

  return cloneable_generator
}
// Demo
function* jumpable_sequence (args: {start: number}): Generator<number, number, number | undefined> {
  let i = args.start
  while (true)
  {
    let jump = yield ++i
    if (jump !== undefined) i += jump
  }
}
let iter = cloneable_generator_factory({ start: 10 }, jumpable_sequence)
console.log(iter.next().value) // 11
console.log(iter.next(3).value) // 15 (from 11 + 1 + 3)
let saved = iter.clone()
console.log("Saved. Continuing...")
console.log(iter.next().value) // 16
console.log(iter.next(10).value) // 27 (from 16 + 1 + 10)
console.log("Restored")
iter = saved
console.log(iter.next().value) // 16
console.log(iter.next().value) // 17
console.log(iter.next().value) // 18
You could do something like what Python's itertools.tee provides, i.e. a function that returns multiple iterators that take off from where the given iterator currently is.
Once you call tee, you should no longer touch the original iterator, since tee is now managing it. But you can continue with the 2 or more "copies" you got back from it, which will have their independent iterations.
Here is how that function tee can be defined, with a simple example use of it:
function tee(iter, length = 2) {
  const buffers = Array.from({length}, () => []);
  return buffers.map(function* makeIter(buffer) {
    while (true) {
      if (buffer.length == 0) {
        let result = iter.next();
        for (let buffer of buffers) {
          buffer.push(result);
        }
      }
      if (buffer[0].done) return;
      yield buffer.shift().value;
    }
  });
}
// Demo
function* naturalNumbers() {
  let i = 0;
  while (true) yield ++i;
}
let iter = naturalNumbers();
console.log(iter.next().value); // 1
console.log(iter.next().value); // 2
let saved;
[iter, saved] = tee(iter);
console.log("Saved. Continuing...");
console.log(iter.next().value); // 3
console.log(iter.next().value); // 4
console.log("Restored");
iter = saved;
console.log(iter.next().value); // 3
console.log(iter.next().value); // 4
console.log(iter.next().value); // 5
