Intercept destructured function parameters in JavaScript?

Is there any way or pattern that would allow me to intercept destructured function arguments and do something with them before the function is executed, without knowing the argument signature beforehand? I.e. something like:
const add = ({a, b}) => a + b
const multiplyArgs = (fun) => (...args) => fun(...args.map(x => x * 10)) // Won't work
const res = multiplyArgs(add)({a: 5, b: 10, c: 20}) // desired: 150, without performing the multiplication on c
The only option I found so far is using a regex to get the arguments from the string representation of the function, but that's very messy.
EDIT:
The actual use case is this:
I have an object with RxJS observables/subjects, and would like to be able to call a function/method that would take in another function, pick the required observables from the object, combine them, and then pipe the function to the new combined observable.
const observablePool = {a: new Rx.BehaviorSubject(5), b: new Rx.BehaviorSubject(10)}
updatePool( ({a, b}) => ({c: a + b}) )
// In the background:
// const picked = {a: observablePool.a, b: observablePool.b}
// observablePool.c = Rx.combineLatest(picked)
// .pipe(Rx.map(({a, b}) => a + b))
The idea is to hide the implementation of accessing the observables object and creating new combined observables. The user should be able to chain simple functions whose results would get lifted into observables automatically.
I can do it by adding a pick() function, i.e. updatePool(pick("a", "b"), ({a, b}) => ({c: a + b}) ) but that duplicates and decouples the argument names. I was wondering if there was a more elegant solution.
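For context, here is a minimal sketch of that pick-based variant (pick and updatePool are hypothetical helpers of my own, and the result key c is hard-coded purely to mirror the background comment above):
const pick = (...keys) => pool =>
  Object.fromEntries(keys.map(k => [k, pool[k]]));
const updatePool = (picker, fn) => {
  const picked = picker(observablePool);       // {a: Subject, b: Subject}
  observablePool.c = Rx.combineLatest(picked)  // emits {a: 5, b: 10}
    .pipe(Rx.map(values => fn(values).c));     // emits 15
};
updatePool(pick("a", "b"), ({a, b}) => ({c: a + b}));
The duplication between pick("a", "b") and ({a, b}) is exactly what I'd like to avoid.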

For just one object argument, you could map over Object.entries.
const add = ({a, b}) => a + b;
const multiplyObj0 = fun => obj =>
  fun(Object.fromEntries(Object.entries(obj).map(([k, v]) => [k, v * 10])));
const res = multiplyObj0(add)({a: 5, b: 10});
console.log(res);
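As a quick check against the call from the question: the extra key c gets multiplied as well, but since add only reads a and b the result is still the desired one.
console.log(multiplyObj0(add)({a: 5, b: 10, c: 20})); // 150 (c becomes 200 but add ignores it)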

Related

How to write a composed function using Ramda compose?

How to modify the zComposedFn function so that the output of both z and zComposedOutput is the same?
const R = require('ramda');
let f1 = R.curry((p1, p2, p3) => {
  let result = {
    n1: p1,
    n2: p2,
    n3: p3
  };
  return result;
});
let x = f1(1, 2, 3);
let y = f1(1, 2, x);
let z = f1(1, x , y);
console.log(z);
let zComposedFn = R.compose(f1);
let input = [1,2,3];
let zComposedOutput = zComposedFn(...input);
console.log(zComposedOutput);
The goal is to create some metric calculation functions having the same signature and output type but different implementation.
const MetricFn = (m, f, a) => { /* to be implemented using a switch case on m */ return b }
m : name of metric as string
f : Array of functions utilizing input data objects
a : input data object
example:
There is a financial dashboard, which receives the input data as (1,2,3). The dashboard displays the metric1, metric2 and metric3 calculated as below:
metric1 = MetricFn('metric1',[f1])(1,2,3);
metric2 = MetricFn('metric2',[f1, f2])(1,2,3);
metric3 = MetricFn('metric3',[f1, f2, f3])(1,2,3);
I am wondering on how to create the structure of MetricFn.
I can make very little sense of your function, and I don't see anything that Ramda offers to help. While I'm a Ramda author, I don't always recall every function, but it seems unlikely.
Your requirements look vaguely like the use of chain with functions, where chain(f, g) ~~> x => f(g(x))(x). But it's only a vague connection, and I can't see how to use chain to do what you are trying to do.
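For reference, this is roughly what that signature means in practice (a small illustration of chain on functions, not code from the question):
// chain(f, g)(x) === f(g(x))(x) when g is a function
const g = x => x + 1;
const f = y => x => [y, x]; // f returns a function of the original input
R.chain(f, g)(10); // => [11, 10], i.e. f(g(10))(10)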
Is there an underlying problem you're trying to solve that we might be able to help with?
Here is a trivial implementation, but it's mostly just restating your code without the intermediate variables:
const foo = R.curry((f, a, b, c) => f(a, f(a, b, c), f(a, b, f(a, b, c))))
foo(f1)(1, 2, 3)
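A quick sanity check, assuming f1 and z from the question are in scope:
console.log(R.equals(foo(f1)(1, 2, 3), z)); // true -- same object as z above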

Clean way to keep original variable and destructure at the same time

Is there a cleaner way to do this (with anything that is at least an ES draft and has a babel plugin, i.e., ES6, ES7, etc.):
const { a, b } = result = doSomething();
Where I want to keep the overall result as one singular object, but also destructure it at the same time. It technically works, but result is implicitly declared (as an implicit global), while I'd really like it to also be a const.
I'm currently doing this:
const result = doSomething();
const { a, b } = result;
Which again works, but it's slightly on the verbose side, since I need to repeat this pattern dozens of times.
I'd ideally want something along the lines of:
const { a, b } = const result = doSomething();
But that is obviously an invalid syntax.
One possible way:
const result = doSomething(),
{ a, b } = result;
You still have to duplicate the name, though. The const token just isn't quite right-handy. :)
Idea 1
Create this helper function:
function use(input, callback) {
  callback(input, input);
}
and use it like:
use(doSomething(), (result, {a, b}) => {
  // Do something with result as a whole, or a and b as destructured properties.
});
For example:
use({a: "Hello", b: "World", c: "!"}, (result, {a, b}) => {
  console.log(result);
  console.log(a);
  console.log(b);
});
// generates
// {a: "Hello", b: "World", c: "!"}
// Hello
// World
They're not const, but they're scoped, for better or worse!
Idea 2
Combine array and object deconstruction. Create this helper function:
const dup = input => [input, input];
And then deconstruct away like so:
const [result, {a, b}] = dup(doSomething());
Now, your result, a, and b are all consts.
In #raina77ow's answer they lament that the const token isn't quite right-handy; but if you use a semicolon (and repeat the keyword) instead of a comma, there's your answer.
But you already mentioned const result = doSomething(); const {a, b} = result; in your question; I don't see how it's any worse, and it works.
But from that, one thing you can see is that let something = x; let another = y; is the same as let [something, another] = [x, y];.
Thus a really elegant solution is actually simply:
const [result, {a, b}] = [,,].fill(doSomething());
You need the extra comma because the last one is treated as trailing and doesn't create a slot.
In addition to this (to make it its own answer instead of only comment-worthy), this duplicating can also be done inside the destructuring syntax (which is why I came across this question).
Say b within result itself had a c; you want to destructure that, but also keep the reference to b.
//The above might lead you to believe you need to do this:
const result = doSomething(); const {a, b} = result; const {c} = b;
//or this
const [result, {a, b}, {b:{c}}] = [,,,].fill(doSomething());
But you can actually just
const [result, {a, b, b:{c}}] = [,,].fill(doSomething());
Now you have result, a, b, & c, even though a & b were in result, and c was in b.
This is especially handy if you don't actually need result; fill() is only required for the root object:
const {a, b, b:{c}} = doSomething();
This mightn't seem to work for arrays, since the position in the syntax is the key
const [result, [a, b, /*oops, I'm referencing index 2 now*/]] = [,,].fill(doArrayThing());
However, arrays are objects, so you can just use indices as keys and dupe an index reference:
const [result, {0:a, 1:b, 1:{c}}] = [,,].fill(doArrayThing());
This also means you can destructure array-likes (where array destructuring would complain about the object not being iterable), and you can skip indices by just using a higher key instead of writing the empty commas the array syntax requires.
And perhaps the best of all, {0:a, 1:b, ...c} still works as [a, b, ...c] would, since Object.keys() for an array pulls its indices (but the resulting c will not have a .length).
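A quick illustration of that last point (object destructuring applied to a plain array):
const arr = [10, 20, 30, 40];
const {0: a, 1: b, ...c} = arr;
console.log(a, b);     // 10 20
console.log(c);        // {2: 30, 3: 40} -- keyed by the original indices
console.log(c.length); // undefined -- length is own but non-enumerable, so the rest skips it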
But I'm not content with that, and I really liked where #Arash was going with idea #2, but it wasn't generic enough to help with de-duping the b in the example above, and it dupes the const lines.
So...I wrote my own :| (ctrl+F for goodluck)
You use the same normal syntax, with some exceptions:
your destructure is written in a template literal, with the input object appearing as an interpolation
eg [,,] = input becomes `[,,] = ${input}`
The equals is actually optional
you never rename the outputs within the destruction
eg [a, b, ...c] = input becomes `[, , ...] ${input}`
the output of this template run against μ (you can name it whatever) is an array of the elements you specified in order
eg const {a:A, b:B} = input; becomes const [A,B] = μ`{a, b} ${input}`;
NB how the rename comes at the output. And even if the input is an object, the output is always a flat array.
you can skip elements in an iterator using a number instead of repeated commas
eg const [a, , , d] = input; is const [a,d] = μ`[ , 2, ] ${input}`;
and finally, the whole point of this: when going into an object, preceding it with a colon saves it to the output
eg
const [result, {a, b, b:{c}}] = [,,].fill(doSomething());
becomes
const [result, a, b] = μ`:{a, b::{c}} ${doSomething()}`;
So, other than the above, the pros:
I'm not running eval at all, but actually parsing and applying logic to your input; as such I can give you way better error messages at runtime.
Eg ES6 doesn't even bother with this one:
_ = {a:7, get b() {throw 'hi'}};
console.warn('ES6');
out(() => {
const {a, b} = _;
return [a, b];
});
console.warn('hashbrown');
out(() => {
const {a,b} = μ`{a,...} ${_}`;
return [a, b];
});
Eg2 Here ES6 says _ was the culprit. Not only do I correctly say it was 1 at fault, but I tell you where in the destructure it happened:
_ = [1];
console.warn('ES6');
out(() => {
const [[a]] = _;
return [a];
});
console.warn('hashbrown');
out(() => {
const [a] = μ`[[]] ${_}`;
return [a];
});
super handy if you need to skip large arrays or are keeping a lot of inner variables
Eg
const [[a,,,,,,,,,j], [[aa, ab], [ba]]] = [,,].fill(_);
const [a, aa, ab, ba, j] = μ`[:[ , ], [ ], 7, ] ${_}`;
Okay, what's the catch? The cons:
well, even given that last pro, the destructuring syntax with all the names missing can be hard to read. Really we need this syntax in the language, so the names are inside it instead of the const [ happening outside of it.
compilers don't know what to do with this: syntax errors surface at runtime (whereas you'd be told earlier natively with ES6), and IDEs probably won't be able to tell what's getting spat out (and I'm refusing to write a properly done templated .d.ts for it) if you're using some sort of typechecking
and as alluded to before, you get slightly worse compile-time errors for your syntax: I just tell you that something wasn't right, not what.
But, to be fair, I still tell you where you went wrong; if you have multiple rest operators, I don't think ES6 is much help
Eg
_ = [1, 2, 3, 4];
console.warn('ES6');
out(() => {
eval(`const [a, ...betwixt, b] = _`);
return [a, betwixt, b];
});
console.warn('hashbrown');
out(() => {
const [a, betwixt, b] = μ`[, ..., ] ${_}`;
return [a, betwixt, b];
});
it's really only worth it if you're dealing with arrays, or you're renaming all the outputs anyway, because otherwise you're left specifying the name twice. This would be fixed along with point 1 if :{ :[ and [2 were adopted into the language; you wouldn't need to respecify the names outside in your const [
as I wrote it, it honestly probably only runs in Chrome, since Firefox still doesn't have named capture groups. I was diligent in writing the [regex] parser to make all unused groups non-capturing, so if you're keen, it wouldn't be hard to make it FF-compatible
So where's the code?
You're keen.
Goodluck.
window.μ = (() => {
//build regexes without worrying about
// - double-backslashing
// - adding whitespace for readability
// - adding in comments
let clean = (piece) => (piece
.replace(/(?<=^|\n)(?<line>(?:[^\/\\]|\/[^*\/]|\\.)*)\/\*(?:[^*]|\*[^\/])*(\*\/|)/g, '$<line>')
.replace(/(?<=^|\n)(?<line>(?:[^\/\\]|\/[^\/]|\\.)*)\/\/[^\n]*/g, '$<line>')
.replace(/\n\s*/g, '')
);
let regex = ({raw}, ...interpolations) => (
new RegExp(interpolations.reduce(
(regex, insert, index) => (regex + insert + clean(raw[index + 1])),
clean(raw[0])
))
);
let start = {
parse : regex`^\s*(?:
//the end of the string
//I permit the equal sign or just declaring the input after the destructure definition without one
(?<done>=?\s*)
|
//save self to output?
(?<read>(?<save>:\s*|))
//opening either object or array
(?<next>(?<open>[{[]).*)
)$`
};
let object = {
parse : regex`^\s*
(?<read>
//closing the object
(?<close>\})|
//starting from open or comma you can...
(?:[,{]\s*)(?:
//have a rest operator
(?<rest>\.\.\.)
|
//have a property key
(?<key>
//a non-negative integer
\b\d+\b
|
//any unencapsulated string of the following
\b[A-Za-z$_][\w$]*\b
|
//a quoted string
(?<quoted>"|')(?:
//that contains any non-escape, non-quote character
(?!\k<quoted>|\\).
|
//or any escape sequence
(?:\\.)
//finished by the quote
)*\k<quoted>
)
//after a property key, we can go inside
\s*(?<inside>:|)
)
)
(?<next>(?:
//after closing we expect either
// - the parent's comma/close,
// - or the end of the string
(?<=\})\s*(?:[,}\]=]|$)
|
//after the rest operator we expect the close
(?<=\.)\s*\}
|
//after diving into a key we expect that object to open
(?<=:)\s*[{[:]
|
//otherwise we saw only a key, we now expect a comma or close
(?<=[^:\.}])\s*[,}]
).*)
$`,
//for object, pull all keys we haven't used
rest : (obj, keys) => (
Object.keys(obj)
.filter((key) => (!keys[key]))
.reduce((output, key) => {
output[key] = obj[key];
return output;
}, {})
)
};
let array = {
parse : regex`^\s*
(?<read>
//closing the array
(?<close>\])
|
//starting from open or comma you can...
(?:[,[]\s*)(?:
//have a rest operator
(?<rest>\.\.\.)
|
//skip some items using a positive integer
(?<skip>\b[1-9]\d*\b)
|
//or just consume an item
(?=[^.\d])
)
)
(?<next>(?:
//after closing we expect either
// - the parent's comma/close,
// - or the end of the string
(?<=\])\s*(?:[,}\]=]|$)
|
//after the rest operator we expect the close
(?<=\.)\s*\]
|
//after a skip we expect a comma
(?<=\d)\s*,
|
//going into an object
(?<=[,[])\s*(?<inside>[:{[])
|
//if we just opened we expect to consume or consume one and close
(?<=\[)\s*[,\]]
|
//otherwise we're just consuming an item, we expect a comma or close
(?<=[,[])\s*[,\]]
).*)
$`,
//for 'array', juice the iterator
rest : (obj, keys) => (Array.from(keys))
};
let destructure = ({next, input, used}) => {
//for exception handling
let phrase = '';
let debugging = () => {
let tmp = type;
switch (tmp) {
case object: tmp = 'object'; break;
case array : tmp = 'array'; break;
case start : tmp = 'start'; break;
}
console.warn(
`${tmp}\t%c${phrase}%c\u2771%c${next}`,
'font-family:"Lucida Console";',
'font-family:"Lucida Console";background:yellow;color:black;',
'font-family:"Lucida Console";',
//input, used
);
};
debugging = null;
//this algorithm used to be recursive and beautiful, I swear,
//but I unwrapped it into the following monstrous (but efficient) loop.
//
//Lots of array destructuring and it was really easy to follow the different parse paths,
//now it's using much more efficient `[].pop()`ing.
//
//One thing that did get much nicer with this change was the error handling.
//having the catch() rethrow and add snippets to the string as it bubbled back out was...gross, really
let read, quoted, key, save, open, inside, close, done, rest, type, keys, parents, stack, obj, skip;
try {
let output = [];
while (
//this is the input object and any in the stack prior
[obj, ...parents] = input,
//this is the map of used keys used for the rest operator
[keys, ...stack] = used,
//assess the type from how we are storing the used 'keys'
type = (!keys) ? start : (typeof keys.next == 'function') ? array : object,
phrase += (read || ''),
read = '',
debugging && debugging(),
//parse the phrase, deliberately don't check if it doesn't match; this way it will throw
{read, quoted, next, key, save, open, inside, close, done, rest, skip} = next.match(type.parse).groups,
done == null
) {
if (open) {
//THIS IS THE EXTRA FUNCTIONALITY
if (save)
output.push(obj);
switch (open) {
case '{':
used = [{}, ...stack];
break;
case '[':
used = [obj[Symbol.iterator](), ...stack];
input = [null, ...parents];
break;
default:
throw open;
}
continue;
}
if (close) {
used = stack;
input = parents;
continue;
}
//THIS IS THE EXTRA FUNCTIONALITY
if (skip) {
for (skip = parseInt(skip); skip-- > 0; keys.next());
continue;
}
//rest operator
if (rest) {
obj = type.rest(obj, keys);
//anticipate an immediate close
input = [null, ...parents];
}
//fetch the named item
else if (key) {
if (quoted) {
key = JSON.parse(key);
}
keys[key] = true;
obj = obj[key];
}
//fetch the next item
else
obj = keys.next().value;
//dive into the named object or append it to the output
if (inside) {
input = [obj, ...input];
used = [null, ...used];
}
else
output.push(obj);
}
return output;
}
catch (e) {
console.error('%c\u26A0 %cError destructuring', 'color:yellow;', '', ...input);
console.error(
`%c\u26A0 %c${phrase}%c${read || '\u2771'}%c${next || ''}`,
'color:yellow;',
'font-family:"Lucida Console";',
'font-family:"Lucida Console";background:red;color:white;',
'font-family:"Lucida Console";'
);
throw e;
}
return null;
};
//just to rearrange the inputs from template literal tags to what destructure() expects.
//I used to have the function exposed directly but once I started supporting
//iterators and spread I had multiple stacks to maintain and it got messy.
//Now that it's wrapped it runs iteratively instead of recursively.
return ({raw:[next]}, ...input) => (destructure({next, input, used:[]}));
})();
The demo's tests:
let out = (func) => {
try {
console.log(...func().map((arg) => (JSON.stringify(arg))));
}
catch (e) {
console.error(e);
}
};
let _;
//THE FOLLOWING WORK (AND ARE MEANT TO)
_ = {a:{aa:7}, b:8};
out(() => {
const [input,{a,a:{aa},b}] = [,,].fill(_);
return [input, a, b, aa];
});
out(() => {
const [input,a,aa,b] = μ`:{a::{aa},b}=${_}`;
return [input, a, b, aa];
});
_ = [[65, -4], 100, [3, 5]];
out(() => {
//const [[aa, ab], , c] = input; const [ca, cb] = c;
const {0:{0:aa, 1:ab}, 2:c, 2:{0:ca, 1:cb}} = _;
return [aa, ab, c, ca, cb];
});
out(() => {
const [aa,ab,c,ca,cb] = μ`{0:{0,1}, 2::{0,1}}=${_}`;
return [aa, ab, c, ca, cb];
});
_ = {a:{aa:7, ab:[7.5, 7.6, 7.7], 'a c"\'':7.8}, b:8};
out(() => {
const [input,{a,a:{aa,ab,ab:{0:aba, ...abb},"a c\"'":ac},b,def='hi'}] = [,,].fill(_);
return [input, a, aa, ab, aba, abb, ac, b, def];
});
out(() => {
const [input,a,aa,ab,aba,abb,ac,b,def='hi'] = μ`:{a::{aa,ab::{0, ...},"a c\"'"},b}=${_}`;
return [input, a, aa, ab, aba, abb, ac, b, def];
});
_ = [{aa:7, ab:[7.5, {abba:7.6}, 7.7], 'a c"\'':7.8}, 8];
out(() => {
const [input,[{aa,ab,ab:[aba,{abba},...abc],"a c\"'":ac}],[a,b,def='hi']] = [,,,].fill(_);
return [input, a, aa, ab, aba, abba, abc, ac, b, def];
});
out(() => {
const [input,a,aa,ab,aba,abba,abc,ac,b,def='hi'] = μ`:[:{aa,ab::[,{abba},...],"a c\"'"},]=${_}`;
return [input, a, aa, ab, aba, abba, abc, ac, b, def];
});
_ = [[-1,-2],[-3,-4],4,5,6,7,8,9,0,10];
out(() => {
const [[a,,,,,,,,,j], [[aa, ab], [ba]]] = [,,].fill(_);
return [a, aa, ab, ba, j];
});
out(() => {
const [a, aa, ab, ba, j] = μ`[:[ , ], [ ], 7, ] ${_}`;
return [a, aa, ab, ba, j];
});
//THE FOLLOWING FAIL (AND ARE MEANT TO)
_ = [1];
console.warn('ES6');
out(() => {
const [[a]] = _;
return [a];
});
console.warn('hashbrown');
out(() => {
const [a] = μ`[[]] ${_}`;
return [a];
});
_ = [1, 2, 3, 4];
console.warn('ES6');
out(() => {
eval(`const [a, ...betwixt, b] = _`);
return [a, betwixt, b];
});
console.warn('hashbrown');
out(() => {
const [a, betwixt, b] = μ`[, ..., ] ${_}`;
return [a, betwixt, b];
});
_ = {a:7, get b() {throw 'hi'}};
console.warn('ES6');
out(() => {
const {a, b} = _;
return [a, b];
});
console.warn('hashbrown');
out(() => {
const {a,b} = μ`{a,...} ${_}`;
return [a, b];
});
And the output if your browser couldn't run it but you're curious (the errors are testing error outputs for native vs this thing)
Destructuring an object using a key twice is allowed. Do that with a wrapper object and it works well for me. The key can be data, _, or whatever you like.
This works in JavaScript and TypeScript.
const getMealData = () => ({ breakfast: 'fried egg 🍳', lunch: 'sushi 🍱' })
const {
data: mealData,
data: { breakfast },
} = { data: getMealData() }
console.log(mealData) // { breakfast: 'fried egg 🍳', lunch: 'sushi 🍱' }
console.log(breakfast) // "fried egg 🍳"
I went looking for a good solution and ended up working it out, so I thought I would throw this in with the MANY solutions in this old thread...
This returns an object with an additional field being the raw return.
The same thing works for destructuring hooks into an array.
const { a, b, func } = (x => ({ ...x, func: x }))(doSomething());
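And a sketch of the array variant mentioned above (doArrayThing is just a stand-in for whatever returns the array, e.g. a hook):
const doArrayThing = () => [1, 2, 3];
const [whole, a, b] = (x => [x, ...x])(doArrayThing());
// whole is the original array, a and b are its first two elements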

What would be a good example of an endofunctor that is not the identity functor?

In Professor Frisby Introduces Composable Functional JavaScript the identity functor was introduced:
const Box = x =>
({
map: f => Box(f(x)),
fold: f => f(x) // for testing
})
I spent the better part of the day understanding functors and why the above JavaScript code is actually the identity functor. So I thought I would alter it to get a "real" functor that is not the identity functor. I came up with this:
const Endo = x =>
({
map: f => Endo(f(x).split('')),
fold: f => f(x).split('') // for testing
})
My reasoning is that with Box, Id_Box: Box -> Box and Id_Box f = f. Endo would also map to itself but Endo(f): Endo(x) -> Endo(y) (if f: x -> y).
Am I on the right track?
EDIT:
Replaced string with the more generic x as it was in the original examples.
As pointed out in this answer, for our purposes as programmers we can treat all functors as endofunctors so don't get too caught up on the differences.
As for what a functor is, in brief it is
a data structure (Box in your example)
that can support a mapping operation (think Array.prototype.map)
and that mapping operation respects identity: xs === xs.map(x => x)
...and composition: xs.map(f).map(g) === xs.map(f . g) where . is function composition.
That's it. No more, no less. Looking at your Box, it's a data structure that has a map function (check 1 & 2) and that map function looks like it should respect identity and composition (check 3 & 4). So it's a functor. But it doesn't do anything, which is why it's the identity functor. The fold function isn't strictly necessary, it just provides a way to 'unwrap' the box.
For a useful functor, let's look at JavaScript arrays. Arrays actually do something: namely, they contain multiple values rather than just a single one. If an array could only have one element, it'd be your Box. For our purposes we'll pretend that they can only hold values of the same type to simplify things. So an array is a data structure that has a map function that respects identity and composition.
let plus = x => y => x + y;
let mult = x => y => x * y;
let plus2 = plus(2);
let times3 = mult(3);
let id = x => x;
let compose = (...fs) => arg => fs.reduceRight((x, f) => f(x), arg);
// Here we need to stringify the arrays as JS will compare on
// ref rather than value. I'm omitting it after the first for
// brevity, but know that it's necessary.
[1,2,3].map(plus2).toString() === [3,4,5].toString(); // true
[1,2,3].map(id) === [1,2,3]; // true
[1,2,3].map(plus2).map(times3) === [1,2,3].map(compose(times3, plus2)); // true
So when we map a function over a functor (array) we get back another instance of the same functor (a new Array) with the function applied to whatever the functor (array) was holding.
So now let's look at another ubiquitous JavaScript data structure, the object. There's no built-in map function for objects. Can we make them a functor? Assume again that the object is homogeneous (only has keys to one type of value, in this example Number):
let mapOverObj = obj => f => {
  return Object.entries(obj).reduce((newObj, [key, value]) => {
    newObj[key] = f(value);
    return newObj;
  }, {});
};
let foo = { 'bar': 2 };
let fooPrime = mapOverObj(foo)(plus2); // { 'bar': 4 }
And you can continue on to test that the function accurately (as far as is possible in JavaScript) supports identity and composition to satisfy the functor laws.
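For example, here is a quick check of both laws for mapOverObj, reusing id, plus2, times3 and compose from the array example above (compared via JSON.stringify, since objects compare by reference):
JSON.stringify(mapOverObj(foo)(id)) === JSON.stringify(foo); // true -- identity
JSON.stringify(mapOverObj(mapOverObj(foo)(plus2))(times3)) ===
  JSON.stringify(mapOverObj(foo)(compose(times3, plus2))); // true -- composition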

Can we set persistent default parameters which remain set until explicitly changed?

Below is a function fn where the expected result is for a, b, c to be defined at every call of fn, whether an object parameter is passed or not. If an object is passed which sets a property, that property should be set accordingly only for that call.
const fn = (opts = {a:1, b:2, c:3}) => console.log(opts);
when called without parameters the result is
fn() // {a: 1, b: 2, c: 3}
when called with parameter, for example {b:7}, the expected result is
fn({b:7}) // {a: 1, b: 7, c: 3}
however, the actual result is
fn({b:7}) // {b: 7}
I was able to get the expected result by defining an object outside of the function and using Object.assign() within the function body:
const settings = {a: 1, b: 2, c: 3};
const fn = opts => {opts = Object.assign({}, settings, opts); console.log(opts)}
fn({b: 7}) // {a: 1, b: 7, c: 3}
fn(); // {a: 1, b: 2, c: 3}
/*
// does not log error; does not return expected result
const fn = (opts = Object.assign({}, settings, opts)) => console.log(opts)
*/
Can the above result be achieved solely utilizing default parameters, without defining an object to reference outside of function parameters or within function body?
Maybe I misunderstood the question, but you seem to be looking for default initialisers for each separate property. For that, you have to use destructuring:
const fn = ({a = 1, b = 2, c = 3} = {}) => console.log({a, b, c});
If you want to keep arbitrary properties, not just those that you know of up front, you might be interested in the object rest/spread properties proposal that allows you to write
const fn = ({a = 1, b = 2, c = 3, ...opts} = {}) => console.log({a, b, c, ...opts});
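For reference, here is how those two versions behave with the calls from the question (a quick check, not part of the original answer):
fn();                 // {a: 1, b: 2, c: 3}
fn({b: 7});           // {a: 1, b: 7, c: 3}
fn({b: 7, extra: 9}); // first version drops extra; the rest/spread version logs {a: 1, b: 7, c: 3, extra: 9}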
Can an opts variable as the single object reference be achieved solely utilizing default parameters, without defining an object to reference outside of function parameters or within function body?
No. Parameter declarations are only able to initialise variables with (parts of) the arguments, and possibly (as syntactic sugar) with default values when no or undefined argument (parts) are passed. They are not able to carry out unconditional computations and create variables initialised from the results - which is what you attempt to achieve here.
You are supposed to use the function body for that.
No
The best that can be done is either your own answer or this:
const fn = (default_parameters) => {
  default_parameters = Object.assign({}, {a: 1, b: 2, c: 3}, default_parameters);
  console.log('These are the parameters:');
  console.log(default_parameters);
}
fn();
fn({b: 7});
fn({g: 9, x: 10});
The default parameter block is only executed if the value is not set, so your own answer is the best that is on offer, i.e. use two parameters.
You can convince yourself of this by creating a code block that will fail if executed and testing that passing a parameter works (to show that the code block is not executed) and testing that not passing a parameter fails (showing that the code block is only executed when no parameter is passed).
This should demonstrate clearly that any parameter passed will prevent the default parameter from being evaluated at all.
const fn = (default_parameters = (default_parameters = Object.assign({}, {a: 1, b: 2, c: 3}, default_parameters))) => {
  console.log('These are the parameters:');
  console.log(default_parameters);
}
fn({b: 7});
fn();
fn({g: 9, x: 10});
We can define fn as an arrow function that gathers its arguments with a rest parameter; when called, it spreads the last argument into a new object that already holds the defaults a, b, c, and returns that object.
const fn = ((...opts) => ({a:1,b:2,c:3, ...opts.pop()}));
let opts = fn();
console.log(opts);
opts = fn({b: 7});
console.log(opts);
opts = fn({g: 9, x: 10});
console.log(opts);
Using a rest parameter, Object.assign(), spread syntax and Array.prototype.map(), any element that is not a plain object is set as the value of a property whose key is that element's index in the array.
const fn = (...opts) => Object.assign({a: 1, b: 2, c: 3}, ...opts.map((prop, index) =>
  prop && typeof prop === "object" && !Array.isArray(prop)
    ? prop
    : {[index]: prop}
));
let opts = fn([2,3], ...[44, "a", {b:7}, {g:8, z: 9}, null, void 0]);
console.log(opts);
Though the code at the OP uses a single default parameter, until we locate or develop a procedure for using only a single parameter, we can set two default parameters to achieve the expected result.
The first parameter defaults to a plain object; at the second default parameter we pass the first parameter's identifier to Object.assign(), following the pattern at the Question.
We reference the second parameter identifier inside fn to get the default parameters when called without arguments; when called with a first argument, the resulting object has the passed object's properties overwriting the defaults.
const fn = (__ = {}, opts = Object.assign({}, {a: 1, b: 2, c: 3}, __)) =>
console.log(opts);
fn();
fn({b: 7});
fn({g: 9, x: 10});

Using the same first argument in a Ramda pipe

Is it possible to compose the following function in Ramda?
(a, b) => R.pipe(func1(a), func2(a))(b)
The aim is passing argument a to all functions in the pipe.
My initial thought was the following:
R.pipe(R.juxt([func1, func2]), R.apply(R.pipe))
But this gives TypeError: Cannot read property 'length' of undefined.
This is easier to express using S.pipe as it takes an array of functions to compose.
One option is to use R.ap:
> S.pipe(R.ap([S.add, S.mult, S.add], [3]), 10)
42
Another option is to use R.map with S.T, the thrush combinator:
> S.pipe(R.map(S.T(3), [S.add, S.mult, S.add]), 10)
42
You could then wrap one of these in a lambda to create a binary function:
const f = (a, b) => S.pipe(R.ap([S.add, S.mult, S.add], [a]), b);
f(3, 10); // add(3)(mult(3)(add(3)(10)))
// => 42
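For comparison, a plain-Ramda sketch of the same shape, using R.add and R.multiply as stand-ins for func1 and func2: apply each function to a first, then spread the resulting unary functions into R.pipe.
const f2 = (a, b) => R.pipe(...[R.add, R.multiply, R.add].map(fn => fn(a)))(b);
f2(3, 10); // => 42, same as the Sanctuary examples above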
You have to create a curried version of all the functions, then call the curried version of each with a and finally use b in either R.pipe or R.reduce
Let's say that the functions that need to have a as the first argument are f,g,h (in that specific order) then we want to achieve the following expression
h(a, g(a, f(a, b)))
First of all, let's create a curried function which receives two arguments, a single value v and a function fn; when it has received all its required arguments it will simply return fn(v):
const rev = R.curry((v, fn) => fn(v))
Next we can create the curried versions of the functions with R.map and R.curry
// note the reversed order of the functions
// I'll use R.compose instead of R.pipe
let curriedVersion = R.map(R.curry, [h,g,f])
However, we also need to use a as the first argument of the curried functions. We could call each curried function with a using R.map, but instead we will use our special function rev:
const curriedVersion = R.map(R.compose(rev(a), R.curry), [h,g,f])
Finally let's use this array of curried functions with R.compose and b
const result = R.compose.apply(undefined, curriedVersion)(b)
One liner (for the functions f,g,h in that specific order):
const solver = (a, b) => R.compose.apply(undefined, R.map(R.compose(rev(a), R.curry), [h,g,f]))(b)
const add = (a, b) => a + b
const sub = (a, b) => a - b
const mul = (a, b) => a * b
const rev = R.curry((v, fn) => fn(v))
const solver = (a, b) => R.compose.apply(undefined, R.map(R.compose(rev(a), R.curry), [mul,sub,add]))(b)
// mul(2, sub(2, add(2, 3)))
// mul(2, sub(2, 5))
// mul(2, -3)
// -6
console.log(solver(2, 3))
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.21.0/ramda.min.js"></script>
