Recursive function to return overall results - JavaScript

I need a recursive method that rolls up all results from a series of paginated calls and returns the complete list of results. Something feels off in the way I am doing it, and I feel there is a better way to do this, possibly with Array.reduce.
Any recommendations appreciated.
interface Result {
  users: Widget[];
  start?: number;
}

interface Widget {
  id: number;
}

// create 3 widgets for test
const widgets = Array(3).fill(null).map((_, index: number) => {
  return {
    id: index + 1,
  } as Widget;
});

const getFromAPI = (start: number = 0): Result => {
  // return 1 at a time from a specified position
  const current = widgets.slice(start, start + 1);
  let nextStart: number | undefined;
  if (start < widgets.length - 1) {
    nextStart = start + 1;
  }
  return {
    users: current,
    start: nextStart,
  }
}
// I don't like that em is outside the scope here
let em: Widget[] = [];
const getWidgets = (start?: number): Widget[] => {
  const result = getFromAPI(start);
  em = [...em, ...result.users];
  if (result.start) {
    getWidgets(result.start);
  }
  return em;
}
const all = getWidgets();

You don't like that em is outside the scope of getWidgets() and that you are reassigning it inside the function body, thus relying on side effects. It seems you want something more purely functional, without relying on state changes.
If so, one approach you can take is to make em another argument to the function, have it default to an empty array, and then return the recursive result of getWidgets() called with the next version of em:
const getWidgets = (em: Widget[] = [], start?: number): Widget[] => {
  const result = getFromAPI(start);
  const nextEm = [...em, ...result.users];
  return result.start ? getWidgets(nextEm, result.start) : nextEm;
}
I've changed the reassignment of em to a new variable nextEm just in case you want to avoid side effects even within the body of getWidgets as well. It makes the algorithm a little clearer anyway.
You can verify that a call to getWidgets() yields the same result as in your example:
const all = getWidgets();
console.log(all); // [{ "id": 1}, { "id": 2}, { "id": 3}]
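As an aside, Array.reduce (mentioned in the question) doesn't fit naturally here, because the number of pages isn't known until the API stops returning a next start. If you'd rather avoid recursion entirely, a plain loop keeps the accumulator local to the function as well. A minimal sketch against the same getFromAPI, using the hypothetical name getWidgetsLoop:
const getWidgetsLoop = (): Widget[] => {
  let acc: Widget[] = [];
  // undefined requests the first page; afterwards it holds the next index from the API
  let start: number | undefined = undefined;
  do {
    const result = getFromAPI(start);
    acc = [...acc, ...result.users];
    start = result.start;
  } while (start !== undefined);
  return acc;
};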

Related

Using a function of an object after grabbing it from an array

When I try to grab the object from the array, it comes back undefined, so I cannot call a method on it. I am relatively new to JavaScript and have come straight from Java, so this way of retrieving objects is new to me. This is what I currently have.
var fleetAmount = 0;
var fleets = [];

function Fleet(number) {
  this.number = number;
  this.activities = [];
  this.addActivity = function (activity) {
    this.activities.push(activity);
  };
  fleets.push(this);
}

var getFleet = function(fleetNumber) {
  return fleets[fleetAmount - fleetNumber];
}
This is where I try to grab the object and perform the function:
const Fl = require('fleet.js');
const fleet = Fl.getFleet(fleetNumber);
fleet.addActivity(activity);
I am also working in Node.js, which is how I am using the require method.
In combination with the answer from @audzzy I changed the getFleet() function so that it would be more efficient. I tested it out and it worked. This is what I used:
function getFleet(fleetNumber) {
  let result = fleets.filter(function (e) {
    return e.number === fleetNumber;
  })
  return result[0];
}
Thanks for the help! I appreciate it.
You want to create a new fleet object and add that, not "this". Adding "this" would cause a circular reference, where this.fleets[i] = this (and all fleets would have the same value).
When calling getFleet, I would check that a fleet was actually returned, in case the amount is less than the number you send to getFleet (where, according to what you posted, 1 returns the last, 2 returns the second to last, etc.).
I hope this explanation makes sense. Anyway, this should work:
var fleets = [];

doStuff();

function doStuff() {
  addFleet(1);
  addFleet(2);
  addFleet(7);
  addFleet(3);

  // should return undefined, since no fleet has number 5
  let fleet1 = getFleetByNumber(5);
  // should return the fleet with number 7, and not change the fleet with number 1
  let fleet2 = getFleetByNumber(7);
  if (fleet2) {
    fleet2.addActivity("activity");
  }

  console.log(`fleets: ${JSON.stringify(fleets)} \nfleet1: ${JSON.stringify(fleet1)} \nfleet2: ${JSON.stringify(fleet2)}`);
}

function addFleet(number) {
  let fleet = {
    number: number,
    activities: []
  };
  fleet.addActivity = function (activity) {
    this.activities.push(activity);
  };
  fleets.push(fleet);
}

function getFleetByNumber(fleetNumber) {
  return fleets.find(function (e) {
    return e.number == fleetNumber;
  });
}

function getFleet(fleetNumber) {
  let result = null;
  if (fleets.length - fleetNumber >= 0) {
    result = fleets[fleets.length - fleetNumber];
  }
  return result;
}

How do I manage context when exposing object methods in JS modules?

Okay, I realize this can be considered subjective, but I'm trying to better understand how to consider scope when writing modules that only expose what's needed publicly. I have a string utility that I've written as an object literal below:
const substrings = {
  query: {},
  text: "",
  results: [],
  exists: function (index) {
    const exists = index >= 0
    return exists
  },
  check: function () {
    const q = this.query
    const start = q.openIndex
    const stop = q.closeIndex
    if (q.hasOpen && !q.hasClose) {
      console.log("Missing closing delimiter.")
    }
    if (!q.hasOpen && q.hasClose) {
      console.log("Missing opening delimiter.")
    }
    if (q.hasOpen && q.hasClose && start > stop) {
      console.log("Closing delimiter found before opening.")
    }
    if (!q.hasOpen && !q.hasClose && this.results.length == 0) {
      console.log("No results found.")
    }
    const order = start < stop
    const check = q.hasOpen && q.hasClose && order
    return check
  },
  update: function () {
    const q = this.query
    const text = this.text
    q.before = this.text.indexOf(q.open)
    q.start = q.before + q.open.length
    this.text = text.slice(q.start, text.length)
    q.stop = this.text.indexOf(q.close)
    q.after = q.stop + q.close.length
    q.openIndex = q.before
    q.closeIndex = q.before + q.stop
    q.hasOpen = this.exists(q.openIndex)
    q.hasClose = this.exists(q.stop)
    const newPosition = q.start + q.after
    q.position = q.position + newPosition
    this.query = q
  },
  substrings: function () {
    const q = this.query
    const current = this.text.slice(0, q.stop)
    const fullLength = this.text.length
    this.text = this.text.slice(q.after, fullLength)
    this.results.push(current)
    this.update()
    if (this.check()) {
      this.substrings()
    }
  },
  init: function (open, close, text) {
    this.results = []
    this.query = {
      open,
      close,
      position: 0,
    }
    this.text = text
    this.update()
  },
  getSubstrings: function (open, close, text) {
    this.init(open, close, text)
    if (this.check()) {
      this.substrings()
      return this.results
    }
  },
  getSubstring: function (open, close, text) {
    this.init(open, close, text)
    if (this.check()) {
      return this.text.slice(0, this.query.stop)
    }
  }
}
I want to use it as a Node module and expose the getSubstring and getSubstrings methods, but if I were to do:
module.exports = {
  all: substrings.getSubstrings,
  one: substrings.getSubstring
}
I would get an error due to the usage of this. I realize that if I replace this with the object variable name substrings to reference it directly, it works. I could also refactor it into one big function, or into smaller functions, and just export the 2 I need.
I am trying to go about learning things the right way and am struggling with how I should be thinking about context. I understand how this changes here, but I feel like I'm not fully wrapping my head around how I should consider context when structuring my code.
Is there a more elegant solution to expose methods with code like this that wasn't written to separate private and public methods?
A simple solution would be to bind the exported functions to the proper calling context inside the exports object:
module.exports = {
  all: substrings.getSubstrings.bind(substrings),
  one: substrings.getSubstring.bind(substrings)
}
Personally, I prefer using the revealing module pattern over object literals for situations like this. With the revealing module pattern, you create an IIFE that returns the desired functions, referring to local variables instead of properties on this. For example:
const { getSubstrings, getSubstring } = (() => {
  let query = {}
  let text = ''
  let results = []

  function exists(index) {
    return index >= 0
  }

  function check() {
    const q = query;
    // ...
  }

  // ...

  function getSubstrings(open, close, text) {
  }

  // ...

  return { getSubstrings, getSubstring };
})();

module.exports = {
  all: getSubstrings,
  one: getSubstring
}
This is somewhat opinion-based, but code can be easier to read when there aren't any this references to worry about.

How to clone an Iterator in javascript?

In ES6, is it possible to clone an iterator's state?
var ma=[1,2,3,4];
var it=ma[Symbol.iterator]();
it.next();
If I want to remember the state of it at this point, how should I do that in JavaScript? What is actually remembered in it? After all,
JSON.stringify(it) // would just return {}
You can’t clone an arbitrary iterator, but you can create many distinct iterators from one by holding onto some state:
function tee(iterable) {
  const source = iterable[Symbol.iterator]();
  const buffers = [[], []]; // substitute in queue type for efficiency
  const DONE = Object.create(null);
  const next = i => {
    if (buffers[i].length !== 0) {
      return buffers[i].shift();
    }
    const x = source.next();
    if (x.done) {
      return DONE;
    }
    buffers[1 - i].push(x.value);
    return x.value;
  };
  return buffers.map(function* (_, i) {
    for (;;) {
      const x = next(i);
      if (x === DONE) {
        break;
      }
      yield x;
    }
  });
}
Usage:
const [a, b] = tee(iterator);
assert(a.next().value === b.next().value);
It's not possible to clone an iterator. Iterator state is basically completely arbitrary and any given iterator may require or produce side effects (e.g. reading from or writing to a network stream) which are not repeatable on demand.
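For instance (a contrived sketch), an iterator whose next() consumes a shared, non-repeatable source has no state that a clone could copy:
// Each next() permanently removes an item from a shared queue.
const queue = [1, 2, 3];
function* drain() {
  while (queue.length > 0) yield queue.shift();
}

const it = drain();
it.next(); // { value: 1, done: false } -- and 1 is gone from `queue` for good
// A "clone" made at this point could not replay 1 without undoing that side effect,
// and two clones advancing independently would compete for the remaining items.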
I built a library that allows you to fork an iterator here: https://github.com/tjenkinson/forkable-iterator
That means you can do something like:
import { buildForkableIterator, fork } from 'forkable-iterator';

function* Source() {
  yield 1;
  yield 2;
  return 'return';
}

const forkableIterator = buildForkableIterator(Source());

console.log(forkableIterator.next()); // { value: 1, done: false }

const child1 = fork(forkableIterator);

// { value: 2, done: false }
console.log(child1.next());
// { value: 2, done: false }
console.log(forkableIterator.next());

// { value: 'return', done: true }
console.log(child1.next());
// { value: 'return', done: true }
console.log(forkableIterator.next());
If you no longer need to keep consuming from a fork, then as long as you lose your references to it there also shouldn't be a memory leak.
It's not official yet, but I think there might be a solution in a stage 2 proposal for Iterator Helpers. If these methods don't affect the original iterator, then doing something like iter.take(Infinity) or iter.drop(0) would have the same effect as cloning.

How to create a memoize function

I am stumped with this memoize problem. I need to create a function that will check to see if a value has already been calculated for a given argument, return the previous result, or run the calculation and return that value.
I have spent hours on this and, while I am new to JS, I cannot get my head around how to do this. I cannot use any built-in functions and would really like to understand what I need to do.
Here is what I have so far, which is so wrong it feels like pseudo-code at this point. I have searched existing memoize questions out here but I cannot seem to make any solution work yet. Any help is much appreciated.
myMemoizeFunc = function(passedFunc) {
  var firstRun = passedFunc;
  function check(passedFunc) {
    if (firstRun === undefined) {
      return passedFunc;
    } else {
      return firstRun;
    }
  }
};
Sorry, I should have been more clear. Here are my specific requirements:
myMemoizeFunc must return a function that will check if the calculation has already been calculated for the given arg and return that val if possible. The passedFunc is a function that holds the result of a calculation.
I understand this may seem like a duplicate, but I am marking as not so, as I am having some serious difficulty understanding what I should do here, and need further help than is given in other posts.
This is what my thought process is bringing me towards but again, I am way off.
myMemoizeFunc = function(passedFunc) {
  var allValues = [];
  return function() {
    for (var i = 0; i < myValues.length; i++) {
      if (myValues[i] === passedFunc) {
        return i;
      } else {
        myValues.push(passedFunc);
        return passedFunc;
      }
    }
  }
};
I should not be returning i or passedFunc here, but what else could I do within the if/else while checking for a value? I have been looking at this problem for so long, I am starting to implement code that is ridiculous and need some fresh advice.
I think the main trick for this is to make an object that stores arguments that have been passed in before as keys with the result of the function as the value.
For memoizing functions of a single argument, I would implement it like so:
var myMemoizeFunc = function (passedFunc) {
  var cache = {};
  return function (x) {
    if (x in cache) return cache[x];
    return cache[x] = passedFunc(x);
  };
};
Then you could use this to memoize any function that takes a single argument, say for example, a recursive function for calculating factorials:
var factorial = myMemoizeFunc(function(n) {
  if (n < 2) return 1;
  return n * factorial(n - 1);
});
Consider this an extension on the answer of Peter Olson.
For a variable number of arguments you could use something like this.
Note: This example is not optimal if you intend to pass complex arguments (arrays, objects, functions). Be sure to read further and not copy/paste blindly.
function memo(fn) {
  const cache = {};
  function get(args) {
    let node = cache;
    for (const arg of args) {
      if (!("next" in node)) node.next = new Map();
      if (!node.next.has(arg)) node.next.set(arg, {});
      node = node.next.get(arg);
    }
    return node;
  }
  return function (...args) {
    const cache = get(args);
    if ("item" in cache) return cache.item;
    cache.item = fn(...args);
    return cache.item;
  }
}
This builds the following cache tree structure:
const memoizedFn = memo(fn);
memoizedFn();
memoizedFn(1);
memoizedFn(1, 2);
memoizedFn(2, 1);
cache = {
  item: fn(),
  next: Map{ // <- Map contents depicted as object
    1: {
      item: fn(1),
      next: Map{
        2: { item: fn(1, 2) }
      }
    },
    2: {
      next: Map{
        1: { item: fn(2, 1) }
      }
    }
  }
}
This solution leaks memory when passing complex arguments (arrays, objects, functions) that are no longer referenced afterwards.
memoizedFn({ a: 1 })
Because { a: 1 } is not referenced after the memoizedFn call it would normally be garbage collected. However now it can't be because cache still holds a reference. It can only be garbage collected once memoizedFn itself is garbage collected.
I showed the above first because it shows the base concept and demonstrates the cache structure in a somewhat simple form. To clean up cache entries whose keys would normally be garbage collected, we should use a WeakMap instead of a Map for complex objects.
For those unfamiliar with WeakMap, the keys are a "weak" reference. This means that the keys do not count towards active references towards an object. Once an object is no longer referenced (not counting weak references) it will be garbage collected. This will in turn remove the key/value pair from the WeakMap instance.
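A minimal sketch of that behaviour (garbage collection itself isn't observable synchronously, so this only illustrates the API side):
const wm = new WeakMap();
let key = { id: 1 };
wm.set(key, "cached result");

console.log(wm.has(key)); // true
key = null; // drop the only strong reference to the key object
// The entry is now eligible for garbage collection and will silently vanish
// from `wm`; a regular Map would have kept the key object alive instead.
With that in mind, the version below swaps in a WeakMap for complex keys: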
const memo = (function () {
  const primitives = new Set([
    "undefined",
    "boolean",
    "number",
    "bigint",
    "string",
    "symbol"
  ]);

  function typeOf(item) {
    const type = typeof item;
    if (primitives.has(type)) return "primitive";
    return item === null ? "primitive" : "complex";
  }

  const map = {
    "primitive": Map,
    "complex": WeakMap
  };

  return function (fn) {
    const cache = {};
    function get(args) {
      let node = cache;
      for (const arg of args) {
        const type = typeOf(arg);
        if (!(type in node)) node[type] = new map[type];
        if (!node[type].has(arg)) node[type].set(arg, {});
        node = node[type].get(arg);
      }
      return node;
    }
    return function (...args) {
      const cache = get(args);
      if ("item" in cache) return cache.item;
      cache.item = fn(...args);
      return cache.item;
    }
  }
})();
const fib = memo((n) => {
  console.log("fib called with", n);
  if (n == 0) return 0;
  if (n == 1) return 1;
  return fib(n - 1) + fib(n - 2);
});

// heavy operation with complex object
const heavyFn = memo((obj) => {
  console.log("heavyFn called with", obj);
  // heavy operation
  return obj.value * 2;
});

// multiple complex arguments
const map = memo((iterable, mapFn) => {
  console.log("map called with", iterable, mapFn);
  const result = [];
  for (const item of iterable) result.push(mapFn(item));
  return result;
});
console.log("### simple argument demonstration ###");
console.log("fib(3)", "//=>", fib(3));
console.log("fib(6)", "//=>", fib(6));
console.log("fib(5)", "//=>", fib(5));
console.log("### exlanation of when cache is garbage collected ###");
(function () {
const item = { value: 7 };
// item stays in memo cache until it is garbade collected
console.log("heavyFn(item)", "//=>", heavyFn(item));
console.log("heavyFn(item)", "//=>", heavyFn(item));
// Does not use the cached item. Although the object has the same contents
// it is a different instance, so not considdered the same.
console.log("heavyFn({ value: 7 })", "//=>", heavyFn({ value: 7 }));
// { value: 7 } is garbade collected (and removed from the memo cache)
})();
// item is garbade collected (and removed from memo cache) it is no longer in scope
console.log("### multiple complex arguments demonstration ###");
console.log("map([1], n => n * 2)", "//=>", map([1], n => n * 2));
// Does not use cache. Although the array and function have the same contents
// they are new instances, so not considdered the same.
console.log("map([1], n => n * 2)", "//=>", map([1], n => n * 2));
const ns = [1, 2];
const double = n => n * 2;
console.log("map(ns, double)", "//=>", map(ns, double));
// Does use cache, same instances are passed.
console.log("map(ns, double)", "//=>", map(ns, double));
// Does use cache, same instances are passed.
ns.push(3);
console.log("mutated ns", ns);
console.log("map(ns, double)", "//=>", map(ns, double));
The structure stays essentially the same, but depending on the type of the argument it will look in either the primitive: Map{} or complex: WeakMap{} object.
const memoizedFn = memo(fn);
memoizedFn();
memoizedFn(1);
memoizedFn(1, 2);
memoizedFn({ value: 2 }, 1);
cache = {
  item: fn(),
  primitive: Map{
    1: {
      item: fn(1),
      primitive: Map{
        2: { item: fn(1, 2) }
      }
    }
  },
  complex: WeakMap{
    { value: 2 }: { // <- cleared if { value: 2 } is garbage collected
      primitive: Map{
        1: { item: fn({ value: 2 }, 1) }
      }
    }
  }
}
This solution does not memoize any errors thrown. Arguments are considered equal based on Map key equality. If you also need to memoize any errors thrown I hope that this answer gave you the building blocks to do so.
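For example, one way to also memoize thrown errors (an illustrative sketch, separate from the memo implementation above, and limited to a single primitive argument for brevity) is to store the outcome as a tagged entry and re-throw it on a cache hit:
function memoWithErrors(fn) {
  const cache = new Map();
  return function (x) {
    if (cache.has(x)) {
      const entry = cache.get(x);
      if (entry.ok) return entry.value;
      throw entry.error; // replay the memoized failure
    }
    try {
      const value = fn(x);
      cache.set(x, { ok: true, value });
      return value;
    } catch (error) {
      cache.set(x, { ok: false, error });
      throw error;
    }
  };
}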
There are a number of memoization libraries available. Doing memoization efficiently is not as straightforward as it seems, so I suggest using a library. Two of the fastest are:
https://github.com/anywhichway/iMemoized
https://github.com/planttheidea/moize
See here for a comprehensive(-ish) list of memoization libraries: https://stackoverflow.com/a/61402805/2441655

How to clone ES6 generator?

I'm trying to create a List monad in ES6 using generators. To make it work I need to create a copy of an iterator that has already consumed several states. How do I clone an iterator in ES6?
function* test() {
  yield 1;
  yield 2;
  yield 3;
}
var x = test();
console.log(x.next().value); // 1
var y = clone(x);
console.log(x.next().value); // 2
console.log(y.next().value); // 2 (sic)
I've tried clone and cloneDeep from lodash, but they were of no use. Iterators that are returned in this way are native functions and keep their state internally, so it seems there's no way to do it with your own JS code.
Iterators […] keep their state internally, so it seems there's no way
Yes, and for good reason: you cannot clone the state, or otherwise you could tamper too much with the generator.
It might be possible, however, to create a second iterator that runs alongside the first one by memorizing its sequence and yielding it again later. Still, there should be only one iterator that really drives the generator; otherwise, which of your clones would be allowed to send next() arguments?
I wrote a do-notation library for JavaScript, burrido. To get around the mutable generator problem I made immutagen, which emulates an immutable generator by maintaining a history of input values and replaying them to clone the generator at any particular state.
You can't clone a generator--it's just a function with no state. What could have state, and therefore what could be cloned, is the iterator resulting from invoking the generator function.
This approach caches intermediate results, so that cloned iterators can access them if necessary until they "catch up". It returns an object which is both an iterator and an iterable, so you can either call next on it or for...of over it. Any iterator may be passed in, so you could in theory have cloned iterators over an array by passing in array.values(). Whichever clone calls next first at a given point in the iteration will have the argument passed to next, if any, reflected in the value of the yield in the underlying generator.
function clonableIterator(it) {
  var vals = [];
  return function make(n) {
    return {
      next(arg) {
        const len = vals.length;
        if (n >= len) vals[len] = it.next(arg);
        return vals[n++];
      },
      clone() { return make(n); },
      throw(e) { if (it.throw) it.throw(e); },
      return(v) { if (it.return) it.return(v); },
      [Symbol.iterator]() { return this; }
    };
  }(0);
}

function *gen() {
  yield 1;
  yield 2;
  yield 3;
}

var it = clonableIterator(gen());
console.log(it.next());
var clone = it.clone();
console.log(clone.next());
console.log(it.next());
Obviously this approach has the problem that it keeps the entire history of the iterator. One optimization would be to keep a WeakMap of all the cloned iterators and how far they have progressed, and then clean up the history to eliminate all the past values that have already been consumed by all clones.
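A rough sketch of that clean-up idea, here as a hypothetical clonableTrimmingIterator: it uses a Set of live cursors instead of a WeakMap (a WeakMap can't be enumerated to find the slowest clone), so history is only reclaimed while every clone either keeps iterating or is explicitly closed with return():
function clonableTrimmingIterator(it) {
  const vals = [];           // shared history not yet consumed by every clone
  let offset = 0;            // how many results have been trimmed from the front
  const cursors = new Set(); // absolute position of each live clone

  function trim() {
    let min = Infinity;
    for (const c of cursors) min = Math.min(min, c.pos);
    while (offset < min && vals.length > 0) {
      vals.shift();
      offset++;
    }
  }

  function make(start) {
    const cursor = { pos: start };
    cursors.add(cursor);
    return {
      next(arg) {
        if (cursor.pos - offset >= vals.length) vals.push(it.next(arg));
        const result = vals[cursor.pos - offset];
        cursor.pos++;
        trim();
        return result;
      },
      clone() { return make(cursor.pos); },
      return(value) {
        cursors.delete(cursor); // this clone no longer pins the history
        trim();
        return { value, done: true };
      },
      [Symbol.iterator]() { return this; }
    };
  }

  return make(0);
}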
Thanks for the comments on my previous question. Inspired by those and some of the answers here I've made a cloneable_generator_factory to solve the problem:
function cloneable_generator_factory (args, generator_factory, next_calls = [])
{
  let generator = generator_factory(args)

  const cloneable_generator = {
    next: (...args) =>
    {
      next_calls.push(args)
      return generator.next(...args)
    },
    throw: e => generator.throw(e),
    return: e => generator.return(e),
    [Symbol.iterator]: () => cloneable_generator,
    clone: () =>
    {
      // todo, use structuredClone when supported
      const partial_deep_cloned_next_args = [...next_calls].map(args => [...args])
      return cloneable_generator_factory(args, generator_factory, partial_deep_cloned_next_args)
    },
  }

  // Call `generator` not `cloneable_generator`
  next_calls.forEach(args => generator.next(...args))

  return cloneable_generator
}

// Demo
function* jumpable_sequence (args) {
  let i = args.start
  while (true)
  {
    let jump = yield ++i
    if (jump !== undefined) i += jump
  }
}

let iter = cloneable_generator_factory({ start: 10 }, jumpable_sequence)

console.log(iter.next().value)   // 11
console.log(iter.next(3).value)  // 15 (from 11 + 1 + 3)

let saved = iter.clone()
console.log("Saved. Continuing...")

console.log(iter.next().value)   // 16
console.log(iter.next(10).value) // 27 (from 16 + 1 + 10)

console.log("Restored")
iter = saved

console.log(iter.next().value) // 16
console.log(iter.next().value) // 17
console.log(iter.next().value) // 18
For those using TypeScript, here is the same approach with types:
interface CloneableGenerator <A, B, C> extends Generator<A, B, C>
{
  clone: () => CloneableGenerator <A, B, C>
}

function cloneable_generator_factory <R, A, B, C> (args: R, generator_factory: (args: R) => Generator<A, B, C>, next_calls: ([] | [C])[] = []): CloneableGenerator<A, B, C>
{
  let generator = generator_factory(args)

  const cloneable_generator: CloneableGenerator<A, B, C> = {
    next: (...args: [] | [C]) =>
    {
      next_calls.push(args)
      return generator.next(...args)
    },
    throw: e => generator.throw(e),
    return: e => generator.return(e),
    [Symbol.iterator]: () => cloneable_generator,
    clone: () =>
    {
      // todo, use structuredClone when supported
      const partial_deep_cloned_next_args: ([] | [C])[] = [...next_calls].map(args => [...args])
      return cloneable_generator_factory(args, generator_factory, partial_deep_cloned_next_args)
    },
  }

  // Call `generator` not `cloneable_generator` to avoid args for `next` being multiplied indefinitely
  next_calls.forEach(args => generator.next(...args))

  return cloneable_generator
}

// Demo
function* jumpable_sequence (args: {start: number}): Generator<number, number, number | undefined> {
  let i = args.start
  while (true)
  {
    let jump = yield ++i
    if (jump !== undefined) i += jump
  }
}

let iter = cloneable_generator_factory({ start: 10 }, jumpable_sequence)

console.log(iter.next().value)   // 11
console.log(iter.next(3).value)  // 15 (from 11 + 1 + 3)

let saved = iter.clone()
console.log("Saved. Continuing...")

console.log(iter.next().value)   // 16
console.log(iter.next(10).value) // 27 (from 16 + 1 + 10)

console.log("Restored")
iter = saved

console.log(iter.next().value) // 16
console.log(iter.next().value) // 17
console.log(iter.next().value) // 18
You could do something like what Python's itertools.tee provides, i.e. a function that returns multiple iterators that take off from where the given iterator is at.
Once you call tee, you should no longer touch the original iterator, since tee is now managing it. But you can continue with the 2 or more "copies" you got back from it, which will have their own independent iterations.
Here is how that function tee can be defined, with a simple example use of it:
function tee(iter, length = 2) {
  const buffers = Array.from({length}, () => []);
  return buffers.map(function* makeIter(buffer) {
    while (true) {
      if (buffer.length == 0) {
        let result = iter.next();
        for (let buffer of buffers) {
          buffer.push(result);
        }
      }
      if (buffer[0].done) return;
      yield buffer.shift().value;
    }
  });
}

// Demo
function* naturalNumbers() {
  let i = 0;
  while (true) yield ++i;
}

let iter = naturalNumbers();
console.log(iter.next().value); // 1
console.log(iter.next().value); // 2

let saved;
[iter, saved] = tee(iter);
console.log("Saved. Continuing...");
console.log(iter.next().value); // 3
console.log(iter.next().value); // 4

console.log("Restored");
iter = saved;
console.log(iter.next().value); // 3
console.log(iter.next().value); // 4
console.log(iter.next().value); // 5
