Array reduce Unexpected use of comma operator no-sequences - javascript

I am getting an "Unexpected use of comma operator no-sequences" warning -- on the .reduce - but I am not sure how to resolve this.
const getQueryParams = () =>
  this.props.location.search
    .replace('?', '')
    .split('&')
    .reduce((r, e) => (r[e.split('=')[0]] = decodeURIComponent(e.split('=')[1]), r), {});

The code quoted uses (some would say abuses) the comma operator in order to avoid using the function body form of an arrow function. The minimal change to remove the comma operator is to put {} around the function body and do an explicit return:
const getQueryParams = () =>
  this.props.location.search
    .replace('?', '')
    .split('&')
    .reduce((r, e) => {
      r[e.split('=')[0]] = decodeURIComponent(e.split('=')[1]);
      return r;
    }, {});
As a matter of style, though, I'd suggest not using reduce there at all. (I have a fair bit of company in disliking reduce outside of functional programming with predefined, reusable reducers.)
In that code, the reduce is just a loop; the accumulator never changes, it's always the same object. So I'd just use a loop:
const getQueryParams = () => {
  const result = {};
  for (const e of this.props.location.search.replace("?", "").split("&")) {
    result[e.split("=")[0]] = decodeURIComponent(e.split("=")[1]);
  }
  return result;
};
I'd probably also remove the redundant call to split:
const getQueryParams = () => {
  const result = {};
  for (const e of this.props.location.search.replace("?", "").split("&")) {
    const [key, value] = e.split("=");
    result[key] = decodeURIComponent(value);
  }
  return result;
};
Finally, both keys and values in query strings are URI-encoded, so decodeURIComponent should be used on both:
const getQueryParams = () => {
  const result = {};
  for (const e of this.props.location.search.replace("?", "").split("&")) {
    const [key, value] = e.split("=");
    result[decodeURIComponent(key)] = decodeURIComponent(value);
  }
  return result;
};
It'll work without that if the keys are just alphanumerics and the like, but it's not strictly correct.
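For instance, a small illustration (the parameter name here is hypothetical) of a key that only comes out right when it is decoded:
// A key containing brackets, which arrive percent-encoded in the query string
const pair = "items%5B0%5D=first";
const [rawKey, rawValue] = pair.split("=");
console.log(rawKey);                     // "items%5B0%5D" (left encoded)
console.log(decodeURIComponent(rawKey)); // "items[0]"     (decoded correctly)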
Stepping back from the syntax, though, you don't need to invent your own function for parsing query string parameters. Browsers already have one:
const getQueryParams = () => Object.fromEntries(
  new URLSearchParams(this.props.location.search)
    .entries()
);
Live Example:
const search = "?bar=Testing%201%202%203&baz=2";
console.log(
Object.fromEntries(
new URLSearchParams(search)
.entries()
)
);

You can rewrite the reduce call to avoid the assignment expression (and the comma operator) by turning the concise arrow function body into a block body (see arrow function expressions):
.reduce((r, e) => {
  r[e.split('=')[0]] = decodeURIComponent(e.split('=')[1]);
  return r;
}, {});

Another approach would be to use Object.assign:
let search = ["item=test","name=code%28","foo=%20bar"]
let result = search.reduce((r,e) =>
Object.assign(r,{[e.split('=')[0]] : decodeURIComponent(e.split('=')[1])}), {});
console.log(result)

Related

Iterate over object and change values

I have a map of icons
const CategoryIconMap: Partial<
  Record<Category, string>
> = {
  [Category.BLUETOOTH]: mdiBluetooth,
  [Category.BATTERY_MANAGEMENT_SYSTEM]: mdiBattery,
  [Category.BATTERY_LOCK]: mdiLock,
  [Category.DISPLAY]: mdiTablet,
};
but I want to make a new map that has the structure:
const CategoryIconMapWithTemplateResult: Partial<
  Record<Category, TemplateResult>
> = {
  [Category.BLUETOOTH]: html`<my-icon .path=${mdiBluetooth}></my-icon>`,
  [Category.BATTERY_MANAGEMENT_SYSTEM]: html`<my-icon .path=${mdiBattery}></my-icon>`,
  [Category.BATTERY_LOCK]: html`<my-icon .path=${mdiLock}></my-icon>`,
  [Category.DISPLAY]: html`<my-icon .path=${mdiTablet}></my-icon>`,
};
I'd prefer to use a for..of or map over a forEach.
const CategoryIconMapToTemplateResult = Object.entries(CategoryIconMap).map(
  ([key, value]) => CategoryIconMap[key] = html`<my-icon .path=${value}></my-icon>`
);
but I'm getting an error: "Arrow function should not return assignment". Also, although it seems to work, I'm only getting the TemplateResults back, not the whole object.
I also tried using for..of
const CategoryIconMapToTemplateResult = () => {
  for (const [key, value] of Object.entries(CategoryIconMap)) {
    CategoryIconMap[key] = html`<my-icon .path=${value}></my-icon>`;
  }
};
But then everywhere else in my code complains: "Placing a void expression inside another expression is forbidden. Move it to its own statement instead."
I also figured it out with reduce, but my company prefers Object.fromEntries over reduce:
Object.entries(CategoryIconMap).reduce((accumulator, [key, value]) => {
  accumulator[key] = html`<my-icon .path=${value}></my-icon>`;
  return accumulator;
}, {});
Am I approaching this the right way?
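For what it's worth, here is a minimal sketch of the Object.fromEntries approach mentioned above, assuming the same Category, html, and mdi imports are in scope (a type assertion may be needed depending on your TypeScript strictness):
// Build a new map without mutating CategoryIconMap:
// Object.entries yields [key, iconPath] pairs, map wraps each path in a template,
// and Object.fromEntries reassembles the pairs into an object.
const CategoryIconMapWithTemplateResult = Object.fromEntries(
  Object.entries(CategoryIconMap).map(
    ([key, value]) => [key, html`<my-icon .path=${value}></my-icon>`]
  )
);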

ESLint warns about the function keyword in an anonymous function and doesn't allow assignment to arguments within the function

I have code that renames the keys of an object using lodash's transform:
const replaceKeysDeep = (obj, keysMap) => {
  return transform(obj, function(result, value, key) {
    const currentKey = keysMap[key] || key;
    result[currentKey] = isObject(value) ? replaceKeysDeep(value, keysMap) : value;
  });
};
I changed the above implementation to:
const replaceKeysDeep = (obj, keysMap) => {
  return transform(obj, (result, value, key) => {
    const currentKey = keysMap[key] || key;
    result[currentKey] = isObject(value) ? replaceKeysDeep(value, keysMap) : value;
  });
};
const newKeys = {
  abs: 'myname',
  tyu: 'yourname'
};
const someObjectContainingKeys = {
  abs: 'something',
  tyu: 'somethingelse'
};
const finalTimePointCalc = replaceKeysDeep(someObjectContainingKeys, newKeys);
The implementation was changed because ESLint was complaining about the function keyword, but now it warns about "Assignment to property of function parameter".
I don't want to suppress the ESLint warnings here; I want to get the code right.
Please suggest a fix.
After looking into the source code (transform() calls baseForOwn(), which is really baseFor(), which is generated by createBaseFor()), I believe there is no way to satisfy both requirements (using lodash's transform() for readability and satisfying ESLint's no-param-reassign).
Unlike Array.prototype.reduce, transform() does not let you return the accumulator explicitly (you can see this in the createBaseFor() code).
So there are only a few options: either write your own replacement for transform(), or relax the ESLint rule on a per-line, per-file, or global basis so that modifying function parameters' properties is allowed via props: false (see the no-param-reassign docs for details).
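For reference, a sketch of what that relaxation could look like in an .eslintrc.js (the rule option comes from the no-param-reassign docs; where you put it is up to your config):
// .eslintrc.js (sketch): keep the rule, but allow mutating properties of parameters.
// A one-off exception is also possible with an inline comment:
//   // eslint-disable-next-line no-param-reassign
module.exports = {
  rules: {
    'no-param-reassign': ['error', { props: false }],
  },
};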

Is my understanding of transducers correct?

Let's start with a definition: A transducer is a function that takes a reducer function and returns a reducer function.
A reducer is a binary function that takes an accumulator and a value and returns an accumulator. A reducer can be executed with a reduce function (note: all functions are curried, but I've left that out, along with the definitions of pipe and compose, for the sake of readability; you can see them in the live demo):
const reduce = (reducer, init, data) => {
  let result = init;
  for (const item of data) {
    result = reducer(result, item);
  }
  return result;
}
With reduce we can implement map and filter functions:
const mapReducer = xf => (acc, item) => [...acc, xf(item)];
const map = (xf, arr) => reduce(mapReducer(xf), [], arr);

const filterReducer = predicate => (acc, item) =>
  predicate(item) ? [...acc, item] : acc;
const filter = (predicate, arr) => reduce(filterReducer(predicate), [], arr);
As we can see, there are a few similarities between map and filter, and both of those functions work only with arrays. Another disadvantage is that when we compose these two functions, each step creates a temporary array that gets passed to the next function.
const even = n => n % 2 === 0;
const double = n => n * 2;
const doubleEven = pipe(filter(even), map(double));
doubleEven([1,2,3,4,5]);
// first we get [2, 4] from filter
// then final result: [4, 8]
Transducers help us solve those concerns: when we use a transducer, no temporary arrays are created, and we can generalize our functions to work with more than just arrays. Transducers are generally executed by passing them to a transduce function:
const transduce = (xform, iterator, init, data) =>
  reduce(xform(iterator), init, data);

const mapping = (xf, reducer) => (acc, item) => reducer(acc, xf(item));
const filtering = (predicate, reducer) => (acc, item) =>
  predicate(item) ? reducer(acc, item) : acc;

const arrReducer = (acc, item) => [...acc, item];

const transformer = compose(filtering(even), mapping(double));
const performantDoubleEven = transduce(transformer, arrReducer, []);
performantDoubleEven([1, 2, 3, 4, 5]); // -> [4, 8] with no temporary arrays created
We can even define array map and filter using transducer because it's so composable:
const map = (xf, data) => transduce(mapping(xf), arrReducer, [], data);
const filter = (predicate, data) => transduce(filtering(predicate), arrReducer, [], data);
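A quick usage check of those redefinitions (relying on the curried helpers from the demo, and the even/double functions above):
map(double, [1, 2, 3]);      // -> [2, 4, 6]
filter(even, [1, 2, 3, 4]);  // -> [2, 4]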
live version if you'd like to run the code -> https://runkit.com/marzelin/transducers
Does my reasoning make sense?
Your understanding is correct but incomplete.
In addition to the concepts you've described, transducers can do the following:
Support an early-exit semantic
Support a completion semantic
Be stateful
Support an init value for the step function.
So for instance, an implementation in JavaScript would need to do this:
// Ensure reduce preserves early termination
let called = 0;
let updatesCalled = map(a => { called += 1; return a; });
let hasTwo = reduce(compose(take(2), updatesCalled)(append), [1,2,3]).toString();
console.assert(hasTwo === '1,2', hasTwo);
console.assert(called === 2, called);
Here, because of the call to take, the reducing operation bails early.
It needs to be able to (optionally) call the step function with no arguments for an initial value:
// handles lack of initial value
let mapDouble = map(n => n * 2);
console.assert(reduce(mapDouble(sum), [1,2]) === 6);
Here a call to sum with no arguments returns the additive identity (zero) to seed the reduction.
In order to accomplish this, here's a helper function:
const addArities = (defaultValue, reducer) => (...args) => {
  switch (args.length) {
    case 0: return typeof defaultValue === 'function' ? defaultValue() : defaultValue;
    case 1: return args[0];
    default: return reducer(...args);
  }
};
This takes an initial value (or a function that can provide one) and a reducer, and returns a reducer that falls back to that initial value when called with no arguments:
const sum = addArities(0, (a, b) => a + b);
Now sum has the proper semantics, and it's also how append in the first example is defined. For a stateful transducer, look at take (including helper functions):
// Denotes early completion
class _Wrapped {
  constructor (val) { this[DONE] = val }
};
const isReduced = a => a instanceof _Wrapped;
// ensures reduced for bubbling
const reduced = a => a instanceof _Wrapped ? a : new _Wrapped(a);
const unWrap = a => isReduced(a) ? a[DONE] : a;

const enforceArgumentContract = f => (xform, reducer, accum, input, state) => {
  // initialization
  if (!exists(input)) return reducer();
  // Early termination, bubble
  if (isReduced(accum)) return accum;
  return f(xform, reducer, accum, input, state);
};

/*
 * factory
 *
 * Helper for creating transducers.
 *
 * Takes a step process, initial state and returns a function that takes a
 * transforming function which returns a transducer that takes a reducing function,
 * optional collection, optional initial value. If collection is not passed
 * returns a modified reducing function, otherwise reduces the collection.
 */
const factory = (process, initState) => xform => (reducer, coll, initValue) => {
  let state = {};
  state.value = typeof initState === 'function' ? initState() : initState;
  let step = enforceArgumentContract(process);
  let trans = (accum, input) => step(xform, reducer, accum, input, state);
  if (coll === undefined) {
    return trans; // return transducer
  } else if (typeof coll[Symbol.iterator] === 'function') {
    return unWrap(reduce(...[trans, coll, initValue].filter(exists)));
  } else {
    throw NON_ITER;
  }
};

const take = factory((n, reducer, accum, input, state) => {
  if (state.value >= n) {
    return reduced(accum);
  } else {
    state.value += 1;
  }
  return reducer(accum, input);
}, () => 0);
If you want to see all of this in action, I made a little library a while back. Although I ignored the interop protocol from Cognitect (I just wanted to get the concepts), I did try to implement the semantics as accurately as possible based on Rich Hickey's talks from Strange Loop and Conj.

Transform all keys in array from underscore to camel case in js

So, I need to transform all keys in an array from underscore to camel case in JS. That is what I need to do before sending a form to the server. I'm using Angular.js and I want to implement it as a filter (but I think that's not really important in this case). Anyway, here is the function I've created.
.filter('underscoreToCamelKeys', function () {
  return function (data) {
    var tmp = [];
    function keyReverse(array) {
      angular.forEach(array, function (value, key) {
        tmp[value] = underscoreToCamelcase(key);
      });
      return tmp;
    }
    var new_obj = {};
    for (var prop in keyReverse(data)) {
      if (tmp.hasOwnProperty(prop)) {
        new_obj[tmp[prop]] = prop;
      }
    }
    return new_obj;
  };
  function underscoreToCamelcase (string) {
    return string.replace(/(\_\w)/g, function(m) {
      return m[1].toUpperCase();
    });
  }
})
Here I will try to explain how it works, because it looks terrible at first.
The underscoreToCamelcase function just converts any underscored string to camel case, apart from the first character (like this: some_string => someString).
So, as I said earlier, I need to convert all keys to camel case, but as you understand we can't simply write
data[key] = underscoreToCamelcase(key)
so the keyReverse function returns an inverted array; here is an example:
some_key => value
will be
value => someKey
and at the end I simply swap the keys and values back, to get this:
someKey => value
But, as you may already understand, I have a problem: if the array contains identical values, that data will disappear.
array
some_key1 => value,
some_key2 => value
returns as
someKey2 => value
So how can I fix that? One idea is to check whether the value already exists and, if it does, add some special substring, like this:
some_key1 => value,
some_key2 => value
value => someKey1,
zx99value => someKey2
and afterwards parse out the zx99, but I think I'm going crazy...
Maybe someone has a better solution for this case?
Important! The main problem is not just transforming a string to camel case, but doing it with the array keys!
If you use an existing library to do the camelCase transform, you can then reduce an object like so
import {camelCase} from 'lodash/string'

const camelCaseKeys = (obj) =>
  Object.keys(obj).reduce((ccObj, field) => ({
    ...ccObj,
    [camelCase(field)]: obj[field]
  }), {})
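A quick check of that helper (assuming lodash is installed; the keys here are just placeholders):
// Keys are camel-cased, values are left untouched
camelCaseKeys({ first_name: 'Ada', last_name: 'Lovelace' });
// -> { firstName: 'Ada', lastName: 'Lovelace' }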
.filter('underscoreToCamelKeys', function () {
  return function (data) {
    var tmp = {};
    angular.forEach(data, function (value, key) {
      var tmpvalue = underscoreToCamelcase(key);
      tmp[tmpvalue] = value;
    });
    return tmp;
  };
  function underscoreToCamelcase (string) {
    return string.replace(/(\_\w)/g, function(m) {
      return m[1].toUpperCase();
    });
  }
})
thanks to ryanlutgen
As an alternative solution, you could use the optional replacer parameter of the JSON.stringify method.
var result = JSON.stringify(myVal, function (key, value) {
  if (value && typeof value === 'object') {
    var replacement = {};
    for (var k in value) {
      if (Object.hasOwnProperty.call(value, k)) {
        replacement[underscoreToCamelcase(k)] = value[k];
      }
    }
    return replacement;
  }
  return value;
});
Of course you'll end up with a string and have to call JSON.parse to get the object.
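For example, a small sketch of that last step (assuming result came from a myVal like { some_key: 'value' }):
// { some_key: 'value' } stringifies to '{"someKey":"value"}' with the replacer above,
// and JSON.parse turns that string back into an object with camel-cased keys.
var camelCased = JSON.parse(result); // e.g. { someKey: 'value' }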

Object literal (hash) with Promise.all

I have a situation where it would be quite convenient to use Promise.all like so: Promise.all({}), instead of the more standard Promise.all([]).
But this doesn't seem to work:
Promise.all({a: 1, b: 2}).then(function(val) {
  console.log('val:', val);
});
whilst this does of course
Promise.all([1, 2, 3]).then(function(val) {
  console.log('val:', val);
});
(What I would expect is for Promise.all to map over the values of the object literal but leave the keys intact.)
But the MDN docs for Promise seem to indicate that Promise.all will work for any iterable. To my knowledge, an object literal {} is an iterable. So what am I missing?
Here is another async / await ES6 solution:
async function allOf(hash = {}) {
  const promises = Object.keys(hash).map(async key => ({[key]: await hash[key]}));
  const resolved = await Promise.all(promises);
  return resolved.reduce((hash, part) => ({...hash, ...part}), {});
}
This converts each key into a promise that produces a single-element hash. Then at the end we combine all the hashes in the array into a single hash. You could even compact this into a one-liner, at the cost of readability.
async function allOfOneLiner(hash = {}) {
  return (await Promise.all(Object.keys(hash).map(async k => ({[k]: await hash[k]})))).reduce((h, p) => ({...h, ...p}), {});
}
Object does not have an iterator symbol (Symbol.iterator) if you look at the MDN documentation for it.
What you can do is use a helper function to create an iterable over the object and then consume it.
This is based on the objectEntries reference source; however, Node.js did not implement Reflect at the time, so for the purpose of using it with Node I changed it to use Object.keys():
function objectEntries(obj) {
  let index = 0;

  // In ES6, you can use strings or symbols as property keys,
  // Reflect.ownKeys() retrieves both
  let propKeys = Object.keys(obj);

  return {
    [Symbol.iterator]() {
      return this;
    },
    next() {
      if (index < propKeys.length) {
        let key = propKeys[index];
        index++;
        return { value: [key, obj[key]] };
      } else {
        return { done: true };
      }
    }
  };
}
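A brief sketch of consuming that iterable with Promise.all (the property names and promises here are just placeholders):
// objectEntries yields [key, value] pairs, so we can await every value
// while remembering which key each result belongs to.
const input = { a: Promise.resolve(1), b: Promise.resolve(2) };
const entries = [...objectEntries(input)];
Promise.all(entries.map(([key, value]) => Promise.resolve(value).then(v => [key, v])))
  .then(pairs => {
    const out = {};
    for (const [key, v] of pairs) out[key] = v;
    console.log(out); // { a: 1, b: 2 }
  });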
Use Object.values. Works in Firefox Nightly:
Promise.all(Object.values({a: 1, b: 2}))
  .then(vals => console.log('vals: ' + vals)) // vals: 1,2
  .catch(e => console.log(e));
Then, to put the results back in an object, we can make a Promise.allParams function:
Promise.allParams = o =>
  Promise.all(Object.values(o)).then(promises =>
    Object.keys(o).reduce((o2, key, i) => (o2[key] = promises[i], o2), {}));

// Demo:
Promise.allParams({a: 1, b: 2}).then(function(val) {
  console.log('val: ' + JSON.stringify(val)); // val: {"a":1,"b":2}
});
Syntax
Promise.all(iterable);
Parameters
iterable
An iterable object, such as an Array. See iterable.
This function does the trick:
Promise.allAssoc = function(object) {
  var values = [], keys = [];
  for (var key in object) {
    values.push(object[key]);
    keys.push(key);
  }
  return Promise.all(values).then(function(results) {
    var out = {};
    for (var i = 0; i < results.length; i++) out[keys[i]] = results[i];
    return out;
  });
};
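For example (the keys and values here are placeholders; non-promise values are fine because Promise.all wraps them):
Promise.allAssoc({ a: Promise.resolve(1), b: 2 }).then(function(result) {
  console.log(result); // { a: 1, b: 2 }
});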
Not all objects are iterable by default. You can make an object iterable by defining an @@iterator method. @@iterator is a well-known symbol available as Symbol.iterator:
Specification Name: @@iterator
[[Description]]: "Symbol.iterator"
Value and Purpose: A method that returns the default Iterator for an object. Called by the semantics of the for-of statement.
For example, this will make all objects iterable (probably not a good idea):
Object.prototype[Symbol.iterator] = function*() {
  for (let key of Object.keys(this))
    yield this[key];
};
Then you will be able to use
Promise.all({a: 1, b: 2}).then(function(val) {
  console.log('val:', val); // [ 1, 2 ]
});
With Babel/ES2015 you can use Object.keys and map to get the values like this:
const obj = {a: 1, b: 2};
const vals = Object.keys(obj).map(k => obj[k]);
Promise.all(vals).then(vals => { console.log('vals', vals) });
ES6 way
Promise.hashProperties = async function(object) {
  const keys = [];
  const values = [];
  for (const key in object) {
    keys.push(key);
    values.push(object[key]);
  }
  const results = await Promise.all(values);
  for (var i = 0; i < results.length; i++)
    object[keys[i]] = results[i];
  return object;
};
