How to write a composed function using Ramda compose? - javascript

How can the zComposedFn function be modified so that the output of both z and zComposedOutput is the same?
const R = require('ramda');
let f1 = R.curry((p1, p2, p3) => {
  let result = {
    n1: p1,
    n2: p2,
    n3: p3
  };
  return result;
});
let x = f1(1, 2, 3);
let y = f1(1, 2, x);
let z = f1(1, x , y);
console.log(z);
let zComposedFn = R.compose(f1);
let input = [1,2,3];
let zComposedOutput = zComposedFn(...input);
console.log(zComposedOutput);
The goal is to create some metric calculation functions having the same signature and output type but different implementations.
const MetricFn = (m,f,a) => {<to be implemented using switch case on m> return b}
m : name of metric as string
f : Array of functions utilizing input data objects
a : input data object
example:
There is a financial dashboard, which receives the input data as (1,2,3). The dashboard displays metric1, metric2 and metric3, calculated as below:
metric1 = MetricFn('metric1',[f1])(1,2,3);
metric2 = MetricFn('metric2',[f1, f2])(1,2,3);
metric3 = MetricFn('metric3',[f1, f2, f3])(1,2,3);
I am wondering how to create the structure of MetricFn.

I can make very little sense of your function, and I don't see anything that Ramda offers to help. While I'm a Ramda author, I don't always recall every function; still, it seems unlikely.
Your requirements look vaguely like the use of chain with functions, where chain(f, g) ~~> x => f(g(x))(x). But it's only a vague connection, and I can't see how to use chain to do what you are trying to do.
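For reference, this is the shape of chain over the function monad, using the example from Ramda's own documentation (just an illustration, not a solution to your problem):
// chain(f, g) over functions behaves like x => f(g(x))(x)
R.chain(R.append, R.head)([1, 2, 3]); //=> [1, 2, 3, 1]
// i.e. R.append(R.head([1, 2, 3]))([1, 2, 3])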
Is there an underlying problem you're trying to solve that we might be able to help with?
Here is a trivial implementation, but it's mostly just restating your code without the intermediate variables:
const foo = R.curry((f, a, b, c) => f(a, f(a, b, c), f(a, b, f(a, b, c))))
foo(f1)(1, 2, 3)
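If I've read your snippets correctly, this sketch (assuming the question's f1 and z, plus the foo above, are in scope) reproduces the same nested shape:
// sketch only: compare foo's result against the z built step by step in the question
const viaFoo = foo(f1)(1, 2, 3);
console.log(JSON.stringify(viaFoo) === JSON.stringify(z)); //=> true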

Related

Better way to compose nested functions

Suppose I have two functions with the following types.
f :: (a, b) -> c
g :: (a, b, c) -> d
I can compose them as follows.
function h(a, b) {
  const c = f(a, b);
  const d = g(a, b, c);
  return d;
}
Here, h is a composition of g and f. However, this looks a lot like imperative code with the constant declarations and the return statement. How can I compose any two such functions in a functional style?
You can define h in a single line as follows.
const h = (a, b) => g(a, b, f(a, b));
Then you can generalize this composition for all such functions as follows.
const ss = (g, f) => (a, b) => g(a, b, f(a, b));
const h = ss(g, f);
This is actually the same as the S combinator but extended to two inputs. Hence, the name ss.
Alternatively, we could generalize the S combinator to any number of inputs using the spread operator as Scott Sauyet suggested.
const s = (g, f) => (...args) => g(...args, f(...args));
const h = s(g, f);
Personally, I'd stay away from polyvariadic functions.
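Either way, here is a quick check of ss, with hypothetical f and g of my own (any functions with the stated signatures would do):
const f = (a, b) => a + b;        // (a, b) -> c
const g = (a, b, c) => [a, b, c]; // (a, b, c) -> d
const ss = (g, f) => (a, b) => g(a, b, f(a, b));
const h = ss(g, f);
h(1, 2); //=> [1, 2, 3]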
By the way, your original example is not imperative even though it uses constant declarations.
This looks like a problem of how to combine two functions in a somewhat different way than by simple composition. We can write combinators that do this for us.
In this case, we essentially want to use our initial a and b arguments in f to generate d and use a, b, and d in g. We can write a function that combines arbitrary f and g to do this, keeping our original arguments and adding an additional one in the call to g. Maybe we'd call it addArg:
const addArg = (g, f) => (...args) =>
g (...args, f (...args))
This version attempts to be a bit more generic; we don't handle only two parameters. We can have any number of arguments to f, and we add one more to the call to g.
Here is a quick demo:
const addArg = (g, f) => (...args) =>
g (...args, f (...args))
const f = (a, b) => `f (${a}, ${b})`
const g = (a, b, d) => `g (${a}, ${b}, ${d})`
const h = addArg (g, f)
console .log (h ('a', 'b'))
I don't think there is anything wrong with the previous answers but I wanted to add an alternative and also to warn you about combinators.
I feel uncomfortable with the use of custom combinators because the reader must look up their definition, which reduces the value of using them.
I am familiar with lift2 and my first instinct would be to bend it to apply it to your pattern (your functions must be curried for it to work)
const lift2 = f => g => h => x => f (g (x)) (h (x));
const id = x => x;
const h = (a, b) => lift2 (g (a)) (id) (f (a)) (b);
So we're partially applying f and g with a and using id to store b until it's fed to g.
If you have trouble parsing the above, we can express lift2 like so, to make it more clear what is happening:
const lift2 = ga => id => fa => b => ga (id (b)) (fa (b));
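To see it run, here is a sketch with hypothetical curried f and g (my own stand-ins; the currying requirement is what makes this work):
const lift2 = f => g => h => x => f (g (x)) (h (x));
const id = x => x;
const f = a => b => `f(${a}, ${b})`;            // curried (a, b) -> c
const g = a => b => c => `g(${a}, ${b}, ${c})`; // curried (a, b, c) -> d
const h = (a, b) => lift2 (g (a)) (id) (f (a)) (b);
h ('a', 'b'); //=> "g(a, b, f(a, b))"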
Honestly I would go for the first one-liner suggested by Aadit M Shah, or for the second one, but named h, to sort of convey "it's just the h you would expect, but with dependencies injected"... or your implementation. Yours even provides explanatory variable names. I mean why not!
Even known combinators sometimes only obfuscate intent. It all depends with whom you work. And that includes your future self ;)

Clean way to keep original variable and destructure at the same time

Is there a cleaner way to do this (with anything that is at least an ES draft and has a babel plugin, i.e., ES6, ES7, etc.):
const { a, b } = result = doSomething();
Where I want to keep the overall result as one singular object, but also destructure it at the same time. It technically works, but result is implicitly declared (with an implicit var), while I'd really like it to also be a const.
I'm currently doing this:
const result = doSomething();
const { a, b } = result;
Which again works, but it's slightly on the verbose side, since I need to repeat this pattern dozens of times.
I'd ideally want something along the lines of:
const { a, b } = const result = doSomething();
But that is obviously an invalid syntax.
One possible way:
const result = doSomething(),
{ a, b } = result;
You still have to duplicate the name, though; the const token can't be used on the right-hand side. ;)
Idea 1
Create this helper function:
function use(input, callback) {
callback(input, input);
}
and use it like:
use(doSomething(), (result, {a, b}) => {
// Do something with result as a whole, or a and b as destructured properties.
});
For example:
use ({a: "Hello", b: "World", c: "!"}, (result, {a, b}) => {
console.log(result);
console.log(a);
console.log(b);
});
// generates
// {a: "Hello", b: "World", c: "!"}
// Hello
// World
They're not const, but they're scoped, for better or worse!
Idea 2
Combine array and object deconstruction. Create this helper function:
const dup = input => [input, input];
And then deconstruct away like so:
const [result, {a, b}] = dup(doSomething());
Now, your result, a, and b are all consts.
In #raina77ow's answer they lament that the const token isn't quite right-handy; but if you use a semicolon (and repeat the keyword) instead of a comma, there's your answer.
But you already mentioned const result = doSomething(); const {a, b} = result; in your question; I don't see how it's any worse, and it works.
But from that, one thing you can see is that let something = x; let another = y; is the same as let [something, another] = [x, y];.
Thus a really elegant solution is actually simply:
const [result, {a, b}] = [,,].fill(doSomething());
You need the extra , because the final comma is treated as trailing.
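If the comma counting looks odd, here's what I believe is going on:
[,,].length; //=> 2 (two holes; the final comma is trailing and adds nothing)
const filled = [,,].fill(doSomething());
filled[0] === filled[1]; //=> true (both slots hold the same reference)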
In addition to this (to make it its own answer instead of only comment-worthy), this duplicating can also be done inside the destructuring syntax (which is why I came across this question).
Say b within result itself had a c; you want to destructure that, but also keep the reference to b.
//The above might lead you to believe you need to do this:
const result = doSomething(); const {a, b} = result; const {c} = b;
//or this
const [result, {a, b}, {b:{c}}] = [,,,].fill(doSomething());
But you can actually just
const [result, {a, b, b:{c}}] = [,,].fill(doSomething());
Now you have result, a, b, & c, even though a & b were in result, and c was in b.
This is especially handy if you don't actually need result; it looks like fill() is only required for the root object:
const {a, b, b:{c}} = doSomething();
This mightn't seem to work for arrays, since the position in the syntax is the key
const [result, [a, b, /*oops, I'm referencing index 2 now*/]] = [,,].fill(doArrayThing());
However, arrays are objects, so you can just use indices as keys and dupe an index reference:
const [result, {0:a, 1:b, 1:{c}}] = [,,].fill(doArrayThing());
This also means you can destructure array-likes (where normally destructuring complains that the object is not iterable), and you can skip indices just by using a higher key, instead of the array syntax where you have to write empty commas.
And perhaps the best of all, {0:a, 1:b, ...c} still works as [a, b, ...c] would, since Object.keys() for an array pulls its indices (but the resulting c will not have a .length).
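A quick check of that last point, with a throwaway array:
const { 0: a, 1: b, ...c } = ['w', 'x', 'y', 'z'];
a; //=> 'w'
b; //=> 'x'
c; //=> { 2: 'y', 3: 'z' } (remaining indices as keys, and no .length)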
But I'm not content with that, and I really liked where #Arash was going with idea #2, but it wasn't generic enough to help with de-duping the b in the example above, and it dupes the const lines.
So...I wrote my own :| (ctrl+F for goodluck)
You use the same normal syntax, with some exceptions:
your destructure is written in a template literal, with the input object appearing as an interpolation
eg [,,] = input becomes `[,,] = ${input}`
The equals is actually optional
you never rename the outputs within the destruction
eg [a, b, ...c] = input becomes `[, , ...] ${input}`
the output of this template run against μ (you can name it whatever) is an array of the elements you specified, in order
eg const {a:A, b:B} = input; becomes const [A,B] = μ`{a, b} ${input}`;
NB how the rename comes at the output. And even if the input is an object, the output is always a flat array.
you can skip elements in an iterator using a number instead of repeated commas
eg const [a, , , d] = input; is const [a,d] = μ`[ , 2, ]`;
and finally, the whole point of this; when going into an object, preceding it with a colon saves it to the output
eg
const [result, {a, b, b:{c}}] = [,,].fill(doSomething());
becomes
const [result, a, b] = μ`:{a, b::{c}} ${doSomething()}`;
So, other than the above, the pros:
I'm not running eval at all, but actually parsing and applying logic to your input; as such I can give you way better error messages at runtime.
E.g. ES6 doesn't even bother with this one:
_ = {a:7, get b() {throw 'hi'}};
console.warn('ES6');
out(() => {
const {a, b} = _;
return [a, b];
});
console.warn('hashbrown');
out(() => {
const {a,b} = μ`{a,...} ${_}`;
return [a, b];
});
E.g. 2: here ES6 says _ was the culprit. Not only do I correctly say it was the 1 at fault, but I tell you where in the destructure it happened:
_ = [1];
console.warn('ES6');
out(() => {
const [[a]] = _;
return [a];
});
console.warn('hashbrown');
out(() => {
const [a] = μ`[[]] ${_}`;
return [a];
});
super handy if you need to skip large arrays or are keeping a lot of inner variables
Eg
const [[a,,,,,,,,,j], [[aa, ab], [ba]]] = [,,].fill(_);
const [a, aa, ab, ba, j] = μ`[:[ , ], [ ], 7, ] ${_}`;
Okay, what's the catch? The cons:
well, even that last pro: the destruction syntax with all the names missing can be hard to read. Really we need this syntax in the language, so the names are inside it instead of the const [ happening outside of it.
compilers don't know what to do with this; syntax errors are runtime (whereas you'd be told earlier natively with ES6), and IDEs probably won't be able to tell what's getting spat out (and I'm refusing to write a properly done templated .d.ts for it) if you're using some sort of typechecking
and as alluded to before, you get slightly worse compile-time errors for your syntax. I just tell you that something wasn't right, not what.
But, to be fair, I still tell you where you went wrong, and if you have multiple rest operators, I don't think ES6 is much help
Eg
_ = [1, 2, 3, 4];
console.warn('ES6');
out(() => {
eval(`const [a, ...betwixt, b] = _`);
return [a, betwixt, b];
});
console.warn('hashbrown');
out(() => {
const [a, betwixt, b] = μ`[, ..., ] ${_}`;
return [a, betwixt, b];
});
it's really only worth it if you're dealing with arrays, or you're renaming all the outputs anyway, because otherwise you're left specifying the name twice. This would be fixed along with point 1 if :{, :[ and [2 were adopted into the language; you wouldn't need to respecify the names outside in your const [
as I wrote it, it honestly probably only runs in Chrome, since Firefox still doesn't have named capture groups. I was diligent in writing the [regex] parser to make all unused groups non-capturing, so if you're keen, it wouldn't be hard to make it FF-compatible
So where's the code?
You're keen.
Goodluck.
window.μ = (() => {
//build regexes without worrying about
// - double-backslashing
// - adding whitespace for readability
// - adding in comments
let clean = (piece) => (piece
.replace(/(?<=^|\n)(?<line>(?:[^\/\\]|\/[^*\/]|\\.)*)\/\*(?:[^*]|\*[^\/])*(\*\/|)/g, '$<line>')
.replace(/(?<=^|\n)(?<line>(?:[^\/\\]|\/[^\/]|\\.)*)\/\/[^\n]*/g, '$<line>')
.replace(/\n\s*/g, '')
);
let regex = ({raw}, ...interpolations) => (
new RegExp(interpolations.reduce(
(regex, insert, index) => (regex + insert + clean(raw[index + 1])),
clean(raw[0])
))
);
let start = {
parse : regex`^\s*(?:
//the end of the string
//I permit the equal sign or just declaring the input after the destructure definition without one
(?<done>=?\s*)
|
//save self to output?
(?<read>(?<save>:\s*|))
//opening either object or array
(?<next>(?<open>[{[]).*)
)$`
};
let object = {
parse : regex`^\s*
(?<read>
//closing the object
(?<close>\})|
//starting from open or comma you can...
(?:[,{]\s*)(?:
//have a rest operator
(?<rest>\.\.\.)
|
//have a property key
(?<key>
//a non-negative integer
\b\d+\b
|
//any unencapsulated string of the following
\b[A-Za-z$_][\w$]*\b
|
//a quoted string
(?<quoted>"|')(?:
//that contains any non-escape, non-quote character
(?!\k<quoted>|\\).
|
//or any escape sequence
(?:\\.)
//finished by the quote
)*\k<quoted>
)
//after a property key, we can go inside
\s*(?<inside>:|)
)
)
(?<next>(?:
//after closing we expect either
// - the parent's comma/close,
// - or the end of the string
(?<=\})\s*(?:[,}\]=]|$)
|
//after the rest operator we expect the close
(?<=\.)\s*\}
|
//after diving into a key we expect that object to open
(?<=:)\s*[{[:]
|
//otherwise we saw only a key, we now expect a comma or close
(?<=[^:\.}])\s*[,}]
).*)
$`,
//for object, pull all keys we havent used
rest : (obj, keys) => (
Object.keys(obj)
.filter((key) => (!keys[key]))
.reduce((output, key) => {
output[key] = obj[key];
return output;
}, {})
)
};
let array = {
parse : regex`^\s*
(?<read>
//closing the array
(?<close>\])
|
//starting from open or comma you can...
(?:[,[]\s*)(?:
//have a rest operator
(?<rest>\.\.\.)
|
//skip some items using a positive integer
(?<skip>\b[1-9]\d*\b)
|
//or just consume an item
(?=[^.\d])
)
)
(?<next>(?:
//after closing we expect either
// - the parent's comma/close,
// - or the end of the string
(?<=\])\s*(?:[,}\]=]|$)
|
//after the rest operator we expect the close
(?<=\.)\s*\]
|
//after a skip we expect a comma
(?<=\d)\s*,
|
//going into an object
(?<=[,[])\s*(?<inside>[:{[])
|
//if we just opened we expect to consume or consume one and close
(?<=\[)\s*[,\]]
|
//otherwise we're just consuming an item, we expect a comma or close
(?<=[,[])\s*[,\]]
).*)
$`,
//for 'array', juice the iterator
rest : (obj, keys) => (Array.from(keys))
};
let destructure = ({next, input, used}) => {
//for exception handling
let phrase = '';
let debugging = () => {
let tmp = type;
switch (tmp) {
case object: tmp = 'object'; break;
case array : tmp = 'array'; break;
case start : tmp = 'start'; break;
}
console.warn(
`${tmp}\t%c${phrase}%c\u2771%c${next}`,
'font-family:"Lucida Console";',
'font-family:"Lucida Console";background:yellow;color:black;',
'font-family:"Lucida Console";',
//input, used
);
};
debugging = null;
//this algorithm used to be recursive and beautiful, I swear,
//but I unwrapped it into the following monsterous (but efficient) loop.
//
//Lots of array destructuring and it was really easy to follow the different parse paths,
//now it's using much more efficient `[].pop()`ing.
//
//One thing that did get much nicer with this change was the error handling.
//having the catch() rethrow and add snippets to the string as it bubbled back out was...gross, really
let read, quoted, key, save, open, inside, close, done, rest, type, keys, parents, stack, obj, skip;
try {
let output = [];
while (
//this is the input object and any in the stack prior
[obj, ...parents] = input,
//this is the map of used keys used for the rest operator
[keys, ...stack] = used,
//assess the type from how we are storing the used 'keys'
type = (!keys) ? start : (typeof keys.next == 'function') ? array : object,
phrase += (read || ''),
read = '',
debugging && debugging(),
//parse the phrase, deliberately dont check if it doesnt match; this way it will throw
{read, quoted, next, key, save, open, inside, close, done, rest, skip} = next.match(type.parse).groups,
done == null
) {
if (open) {
//THIS IS THE EXTRA FUNCTIONALITY
if (save)
output.push(obj);
switch (open) {
case '{':
used = [{}, ...stack];
break;
case '[':
used = [obj[Symbol.iterator](), ...stack];
input = [null, ...parents];
break;
default:
throw open;
}
continue;
}
if (close) {
used = stack;
input = parents;
continue;
}
//THIS IS THE EXTRA FUNCTIONALITY
if (skip) {
for (skip = parseInt(skip); skip-- > 0; keys.next());
continue;
}
//rest operator
if (rest) {
obj = type.rest(obj, keys);
//anticipate an immediate close
input = [null, ...parents];
}
//fetch the named item
else if (key) {
if (quoted) {
key = JSON.parse(key);
}
keys[key] = true;
obj = obj[key];
}
//fetch the next item
else
obj = keys.next().value;
//dive into the named object or append it to the output
if (inside) {
input = [obj, ...input];
used = [null, ...used];
}
else
output.push(obj);
}
return output;
}
catch (e) {
console.error('%c\u26A0 %cError destructuring', 'color:yellow;', '', ...input);
console.error(
`%c\u26A0 %c${phrase}%c${read || '\u2771'}%c${next || ''}`,
'color:yellow;',
'font-family:"Lucida Console";',
'font-family:"Lucida Console";background:red;color:white;',
'font-family:"Lucida Console";'
);
throw e;
}
return null;
};
//just to rearrange the inputs from template literal tags to what destructure() expects.
//I used to have the function exposed directly but once I started supporting
//iterators and spread I had multiple stacks to maintain and it got messy.
//Now that it's wrapped it runs iteratively instead of recursively.
return ({raw:[next]}, ...input) => (destructure({next, input, used:[]}));
})();
The demo's tests:
let out = (func) => {
try {
console.log(...func().map((arg) => (JSON.stringify(arg))));
}
catch (e) {
console.error(e);
}
};
let _;
//THE FOLLOWING WORK (AND ARE MEANT TO)
_ = {a:{aa:7}, b:8};
out(() => {
const [input,{a,a:{aa},b}] = [,,].fill(_);
return [input, a, b, aa];
});
out(() => {
const [input,a,aa,b] = μ`:{a::{aa},b}=${_}`;
return [input, a, b, aa];
});
_ = [[65, -4], 100, [3, 5]];
out(() => {
//const [[aa, ab], , c] = input; const [ca, cb] = c;
const {0:{0:aa, 1:ab}, 2:c, 2:{0:ca, 1:cb}} = _;
return [aa, ab, c, ca, cb];
});
out(() => {
const [aa,ab,c,ca,cb] = μ`{0:{0,1}, 2::{0,1}}=${_}`;
return [aa, ab, c, ca, cb];
});
_ = {a:{aa:7, ab:[7.5, 7.6, 7.7], 'a c"\'':7.8}, b:8};
out(() => {
const [input,{a,a:{aa,ab,ab:{0:aba, ...abb},"a c\"'":ac},b,def='hi'}] = [,,].fill(_);
return [input, a, aa, ab, aba, abb, ac, b, def];
});
out(() => {
const [input,a,aa,ab,aba,abb,ac,b,def='hi'] = μ`:{a::{aa,ab::{0, ...},"a c\"'"},b}=${_}`;
return [input, a, aa, ab, aba, abb, ac, b, def];
});
_ = [{aa:7, ab:[7.5, {abba:7.6}, 7.7], 'a c"\'':7.8}, 8];
out(() => {
const [input,[{aa,ab,ab:[aba,{abba},...abc],"a c\"'":ac}],[a,b,def='hi']] = [,,,].fill(_);
return [input, a, aa, ab, aba, abba, abc, ac, b, def];
});
out(() => {
const [input,a,aa,ab,aba,abba,abc,ac,b,def='hi'] = μ`:[:{aa,ab::[,{abba},...],"a c\"'"},]=${_}`;
return [input, a, aa, ab, aba, abba, abc, ac, b, def];
});
_ = [[-1,-2],[-3,-4],4,5,6,7,8,9,0,10];
out(() => {
const [[a,,,,,,,,,j], [[aa, ab], [ba]]] = [,,].fill(_);
return [a, aa, ab, ba, j];
});
out(() => {
const [a, aa, ab, ba, j] = μ`[:[ , ], [ ], 7, ] ${_}`;
return [a, aa, ab, ba, j];
});
//THE FOLLOWING FAIL (AND ARE MEANT TO)
_ = [1];
console.warn('ES6');
out(() => {
const [[a]] = _;
return [a];
});
console.warn('hashbrown');
out(() => {
const [a] = μ`[[]] ${_}`;
return [a];
});
_ = [1, 2, 3, 4];
console.warn('ES6');
out(() => {
eval(`const [a, ...betwixt, b] = _`);
return [a, betwixt, b];
});
console.warn('hashbrown');
out(() => {
const [a, betwixt, b] = μ`[, ..., ] ${_}`;
return [a, betwixt, b];
});
_ = {a:7, get b() {throw 'hi'}};
console.warn('ES6');
out(() => {
const {a, b} = _;
return [a, b];
});
console.warn('hashbrown');
out(() => {
const {a,b} = μ`{a,...} ${_}`;
return [a, b];
});
And the output if your browser couldn't run it but you're curious (the errors are testing error outputs for native vs this thing)
Destructuring an object using a key twice is allowed. Do that with a wrapper object and it works well for me. The key can be data, _, or whatever you like.
This works in JavaScript and TypeScript.
const getMealData = () => ({ breakfast: 'fried egg 🍳', lunch: 'sushi 🍱' })
const {
data: mealData,
data: { breakfast },
} = { data: getMealData() }
console.log(mealData) // { breakfast: 'fried egg 🍳', lunch: 'sushi 🍱' }
console.log(breakfast) // "fried egg 🍳"
I went looking for a good solution and ended up working it out, so I thought I would throw this in with the MANY solutions in this old thread...
This returns an object with an additional field holding the raw return value.
The same thing works for destructuring hooks into an array.
const { a, b, func } = (x => ({ ...x, func: x }))(doSomething());
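For example, with a hypothetical doSomething:
const doSomething = () => ({ a: 1, b: 2, c: 3 }); // hypothetical
const { a, b, func } = (x => ({ ...x, func: x }))(doSomething());
a;    //=> 1
b;    //=> 2
func; //=> { a: 1, b: 2, c: 3 } (the whole return value)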

ES6 double arrow parameters (i.e. const update = x => y => { } ) [duplicate]

This question already has answers here:
javascript es6 double arrow functions
(2 answers)
Closed 5 years ago.
What does double arrow parameters mean in the following code?
const update = x => y => {
// Do something with x and y
}
How is it different compared to the following?
const update = (x, y) => {
// Do something with x and y
}
Thanks!
Let's rewrite them "old style". The first one is:
const update = function (x) {
return function(y) {
// Do something with x and y
};
};
While the second one is:
const update = function (x, y) {
// Do something with x and y
};
So as you can see they are quite different: the first returns an "intermediate" function, while the second is a single function with two parameters.
There's nothing special about "double arrow parameters"; this is just one arrow function returning another, and it can be extended to as many arguments as you'd like. It's a technique called "currying".
From Wikipedia:
In mathematics and computer science, currying is the technique of translating the evaluation of a function that takes multiple arguments (or a tuple of arguments) into evaluating a sequence of functions, each with a single argument.
The benefit of this is that it makes it easier to partially apply and compose functions, which is useful for some styles of functional programming.
Example
Let's say you have a function add which takes two numbers and adds them together, which you might traditionally write like this:
const add = (a, b) => a + b;
Now let's say you have an array of numbers and want to add 2 to all of them. Using map and the function above, you can do it like this:
[1, 2, 3].map(x => add(2, x));
However, if the function had been in curried form, you wouldn't need to wrap the call to add in another arrow function just to adapt the function to what map expects. Instead you could just do this:
const add = a => b => a + b;
[1, 2, 3].map(add(2));
This is of course a trivial and rather contrived example, but it shows the essence of it. Making it easier to partially apply functions also makes it more practical to write small and flexible functions that can be composed together, which then enables a much more "functional" style of programming.
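As a rough sketch of that composability (the names and the compose helper are mine, just for illustration):
const add = a => b => a + b;
const multiply = a => b => a * b;
const compose = (f, g) => x => f(g(x)); // right-to-left composition
const addThenScale = compose(multiply(10), add(2));
[1, 2, 3].map(addThenScale); //=> [30, 40, 50]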
Those are called arrow functions; it's the new function syntax introduced by ES6. In the first example,
const update = x => y => {
// Do something with x and y
}
can be translated to
var update = function (x){
return function (y){
// Do something with x and y..
}
}
in ES5, and is a function that returns a function. That is totally different from
const update = function (x, y) {
// Do something with x and y
};
The syntax PARAM => EXPR represents a function that takes a parameter PARAM and whose body is { return EXPR; }. It is itself an expression, so it can be used as the EXPR of other functions:
x => y => { ... }
parses as
x => (y => { ... })
which is the same as
x => { return y => { ... }; }
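A concrete version of that, with a hypothetical body filled in:
const update = x => y => x + y; // parses as x => (y => x + y)
const addFive = update(5);      // the inner function, with x fixed to 5
addFive(3);                     //=> 8
update(5)(3);                   //=> 8, the same call in one go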

What would be a good example of an endofunctor that is not the identity functor?

In Professor Frisby Introduces Composable Functional JavaScript the identity functor was introduced:
const Box = x =>
  ({
    map: f => Box(f(x)),
    fold: f => f(x) // for testing
  })
I spent the better part of the day understanding functors and why the above JavaScript code is actually the identity functor. So I thought I would alter it to get a "real" functor that is not the identity functor. I came up with this:
const Endo = x =>
  ({
    map: f => Endo(f(x).split('')),
    fold: f => f(x).split('') // for testing
  })
My reasoning is that with Box, Id_Box: Box -> Box and Id_Box f = f. Endo would also map to itself but Endo(f): Endo(x) -> Endo(y) (if f: x -> y).
Am I on the right track?
EDIT:
Replaced string with the more generic x as it was in the original examples.
As pointed out in this answer, for our purposes as programmers we can treat all functors as endofunctors so don't get too caught up on the differences.
As for what a functor is, in brief it is
a data structure (Box in your example)
that can support a mapping operation (think Array.prototype.map)
and that mapping operation respects identity: xs === xs.map(x => x)
...and composition: xs.map(f).map(g) === xs.map(f . g) where . is function composition.
That's it. No more, no less. Looking at your Box, it's a data structure that has a map function (check 1 & 2) and that map function looks like it should respect identity and composition (check 3 & 4). So it's a functor. But it doesn't do anything, which is why it's the identity functor. The fold function isn't strictly necessary, it just provides a way to 'unwrap' the box.
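You can spot-check laws 3 and 4 against the Box from the question (using fold only to unwrap the values for comparison):
const Box = x => ({ map: f => Box(f(x)), fold: f => f(x) });
const id = x => x;
const plus1 = x => x + 1;
const times3 = x => x * 3;
// identity law
Box(2).map(id).fold(id) === Box(2).fold(id); //=> true
// composition law
Box(2).map(plus1).map(times3).fold(id) === Box(2).map(x => times3(plus1(x))).fold(id); //=> true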
For a useful functor, let's look at JavaScript arrays. Arrays actually do something: namely, they contain multiple values rather than just a single one. If an array could only have one element, it'd be your Box. For our purposes we'll pretend that they can only hold values of the same type, to simplify things. So an array is a data structure that has a map function and that respects identity and composition.
let plus = x => y => x + y;
let mult = x => y => x * y;
let plus2 = plus(2);
let times3 = mult(3);
let id = x => x;
let compose = (...fs) => arg => fs.reverse().reduce((x, f) => { return f(x) }, arg);
// Here we need to stringify the arrays as JS will compare on
// ref rather than value. I'm omitting it after the first for
// brevity, but know that it's necessary.
[1,2,3].map(plus2).toString() === [3,4,5].toString(); // true
[1,2,3].map(id) === [1,2,3]; // true
[1,2,3].map(plus2).map(times3) === [1,2,3].map(compose(times3, plus2)); // true
So when we map a function over a functor (array) we get back another instance of the same functor (a new Array) with the function applied to whatever the functor (array) was holding.
So now let's look at another ubiquitous JavaScript data structure: the object. There's no built-in map function for objects. Can we make them a functor? Assume again that the object is homogeneous (only has keys to one type of value, in this example Number):
let mapOverObj = obj => f => {
  return Object.entries(obj).reduce((newObj, [key, value]) => {
    newObj[key] = f(value);
    return newObj;
  }, {});
};
let foo = { 'bar': 2 };
let fooPrime = mapOverObj(foo)(plus2); // { 'bar': 4 }
And you can continue on to test that the function accurately (as far as is possible in JavaScript) supports identity and composition to satisfy the functor laws.
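Using the helpers already defined above (id, plus2, times3, compose, foo), a sketch of those checks might look like this (stringifying again since objects compare by reference):
// identity: mapping the identity function changes nothing
JSON.stringify(mapOverObj(foo)(id)) === JSON.stringify(foo); //=> true
// composition: mapping plus2 then times3 equals mapping their composition
JSON.stringify(mapOverObj(mapOverObj(foo)(plus2))(times3)) === JSON.stringify(mapOverObj(foo)(compose(times3, plus2))); //=> true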

Array Spread Operator in Pure Function

In the Redux tutorial, they have used the array spread operator a lot for writing reducers (which have to be pure functions). Go through the following script.
let a = {
  b: "ddff",
  c: "adxas"
};
let c = {
  b: "ssdds",
  c: "asdxasdsad"
};
let d = [];
d.push(a);
d.push(c);
console.log(d);
const pureFunc = (arr, b, c) => {
  return [...arr, { b, c }];
};
let n = pureFunc(d,"daaaa","sdadad");
console.log(n);
d[0].b = "Hello";
console.log(n)
Is the function pureFunc a proper pure function? Mutations on array d are getting reflected in n.
Yes, pureFunc is pure. The mutation does not occur within pureFunc.
One of the most common and basic pure functions is the identity function:
let identity = x => x;
So, if we pass that an object, we'll get the same object back. We can modify it after the fact, but that doesn't make identity impure, because identity isn't doing the mutation.
Basically, pure functions only need to satisfy two requirements (see the sketch after this list):
They always produce the same output given the same input
They do not cause side effects
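To make the contrast concrete, here is a hypothetical impure counterpart to the question's pureFunc; it breaks both requirements because it mutates its argument:
// impure: pushes onto the array that was passed in (a side effect),
// so repeated calls with the "same" d keep changing it
const impureFunc = (arr, b, c) => {
  arr.push({ b, c });
  return arr;
};
// pure: builds a new array and leaves the input untouched
const pureFunc = (arr, b, c) => [...arr, { b, c }];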
