Array to object in ES6 JavaScript

I am trying to see if there is a smaller way of converting an array to an object in ES6. (I do not have to worry about cross-browser support.)
I currently have:
function (values) // values being an array
{
  let [videos, video_count, page] = values;
  let data = { videos, video_count, page };
  someFunctions(data);
  moreFunctions(data);
}
But I was wondering if it's possible to cut out the first line of the function, the let [videos, ...] part, and somehow do the conversion inline.
I have read through the Mozilla documentation on destructuring assignment, but I could not see it there (though I may have misunderstood it), and I am really not clever enough to follow the ECMAScript ES6 spec.
I suspect it is not possible and the above is already the simplest I can make it.
But if I can get away with not creating the videos, video_count and page temporary variables, I would be happier.

You can destructure right in the function parameters:
function myFunc([videos, video_count, page])
{
  let data = { videos, video_count, page };
  someFunctions(data);
  moreFunctions(data);
}
myFunc(values);
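For completeness, here is a self-contained sketch of that answer, with hypothetical sample data standing in for the question's values array:

```javascript
// Hypothetical data in the shape the question describes
const values = [["intro.mp4", "outro.mp4"], 2, 1];

// Destructure the array directly in the parameter list
function toData([videos, video_count, page]) {
  return { videos, video_count, page };
}

console.log(toData(values));
// { videos: ["intro.mp4", "outro.mp4"], video_count: 2, page: 1 }
```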
I do a lot of data abstraction using this technique:
// basic abstraction
const ratio = (n, d) => ({ n, d });
const numer = ({ n }) => n;
const denom = ({ d }) => d;

// compound abstraction using selectors
const ratioAdd = (x, y) => ratio(
  numer(x) * denom(y) + numer(y) * denom(x),
  denom(x) * denom(y)
);

// or skip selectors if you're feeling lazy
const printRatio = ({ n, d }) => `${n}/${d}`;

console.log(printRatio(ratioAdd(ratio(1, 3), ratio(1, 4)))); //=> 7/12
You seem hell-bent on somehow making the code shorter, so here you go. In this case, "making it shorter" means making it longer first.
// Goal:
obuild(keys,values) //=> ourObject
Generic procedures zip, assign, and obuild should give us what we need. This is vastly superior to @CodingIntigue's answer, as it's not one big function that tries to do all of the tasks. Keeping them separate reduces complexity while increasing readability and reusability.
// zip :: [a] -> [b] -> [[a,b]]
const zip = ([x, ...xs], [y, ...ys]) => {
  if (x === undefined || y === undefined)
    return [];
  else
    return [[x, y], ...zip(xs, ys)];
};

// assign :: (Object{k:v}, [k,v]) -> Object{k:v}
const assign = (o, [k, v]) => Object.assign(o, { [k]: v });

// obuild :: ([k], [v]) -> Object{k:v}
const obuild = (keys, values) => zip(keys, values).reduce(assign, {});

let keys = ['a', 'b', 'c'];
let values = [1, 2, 3];

console.log(obuild(keys, values));
// { 'a': 1, 'b': 2, 'c': 3 }
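A side note for newer engines: ES2019's Object.fromEntries (which postdates this ES6-era question) can replace the reduce/assign step entirely. A sketch:

```javascript
// zip as above; Object.fromEntries does the pairs-to-object step
const zip = ([x, ...xs], [y, ...ys]) =>
  x === undefined || y === undefined ? [] : [[x, y], ...zip(xs, ys)];

const obuild = (keys, values) => Object.fromEntries(zip(keys, values));

console.log(obuild(['a', 'b', 'c'], [1, 2, 3]));
// { a: 1, b: 2, c: 3 }
```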

With the given properties and names you have, that's probably the shortest way to achieve your desired result.
However, if you had more fields, you could use reduce to avoid repetition. It's not as readable as the destructuring though:
let data = values.reduce((prev, val, index) =>
  Object.assign(prev, { [["videos", "video_count", "page"][index]]: val }), {});
You could then abstract that out into a generic function:
const values = ["test1", "test2", "test3"];

const mapArrayToObject = (array, fields) =>
  array.reduce(
    (prev, val, index) => Object.assign(prev, { [fields[index]]: val }),
    {}
  );

const data = mapArrayToObject(values, ["videos", "video_count", "page"]);
console.log(data);

Related

Javascript fluent style - way to access entire array within call chain like "tap" in RXJS

After using RxJS, I have loved using the tap operator, which essentially allows you to peek into a chain without modifying it. I would love to do the same with a fluent array chain. This can be done by adding a function to the prototype as shown below, but is there a way to do the same or similar using standard array methods? Also, are there any developments on adding this to the specification (or a rationale for not having it)?
I know I could just break up the chain into multiple variables (i.e., const oddNumbers...) but that sometimes doesn't fit the style I am going for.
Array.prototype.tap = function(fn) {
  const arr = Object(this);
  fn(arr);
  return arr;
};
const numbers = [0, 1, 2, 3, 4, 5];
const sumOfOdds = numbers.filter(x => x % 2)
  .tap(oddNumbers => console.log(oddNumbers))
  .reduce((acc, cur) => acc + cur, 0);
console.log(sumOfOdds);
One way to do this using map would be the following, but the callback is called n times and it is not super pretty.
const numbers = [0, 1, 2, 3, 4, 5];
const sumOfOdds = numbers.filter(x => x % 2)
  .map((val, i, arr) => {
    if (i === 0) // run once
      console.log(arr);
    return val;
  })
  .reduce((acc, cur) => acc + cur, 0);
console.log(sumOfOdds);
A .map which calls the other function inside the callback can do the same thing.
const numbers = [0, 1, 2, 3, 4, 5];
const fn = oddNumber => console.log(oddNumber);
const sumOfOdds = numbers.filter(x => x % 2)
  .map(num => (fn(num), num))
  .reduce((acc, cur) => acc + cur, 0);
console.log(sumOfOdds);
or if you don't like the comma operator:
const numbers = [0, 1, 2, 3, 4, 5];
const fn = oddNumber => console.log(oddNumber);
const sumOfOdds = numbers.filter(x => x % 2)
  .map(num => { fn(num); return num; })
  .reduce((acc, cur) => acc + cur, 0);
console.log(sumOfOdds);
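If patching Array.prototype is a concern (it usually is in shared codebases), a plain standalone helper gives the same peek without touching built-ins. A sketch, not a standard API:

```javascript
// Standalone tap: runs fn on a value for its side effect, then passes the value through
const tap = fn => value => {
  fn(value);
  return value;
};

const numbers = [0, 1, 2, 3, 4, 5];
const sumOfOdds = tap(odds => console.log(odds))(numbers.filter(x => x % 2))
  .reduce((acc, cur) => acc + cur, 0);

console.log(sumOfOdds); // 9
```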

Given an array of objects, count how many (possibly different) properties are defined

In a GeoJSON file, some properties are shared by all "features" (element) of the entire collection (array). But some properties are defined only for a subset of the collection.
I've found this question: "Counting properties of the objects in an array of objects", but it doesn't answer my problem.
Example:
const features =
[ {"properties":{"name":"city1","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4] ...]}},
{"properties":{"name":"city2","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4] ...]}},
{"properties":{"name":"city3"},"geometry":{"type":"multiPolygon","coordinates":[[[1,2],[3,4] ...]]}},
// ... for instance 1000 different cities
{"properties":{"name":"city1000","zip":1234,"updated":"May-2018"}, "geometry":{"type":"polygon","coordinates":[...]}}
];
Expected result: a list of all existing properties and their cardinality, letting us know how (in)complete the data-set is. For instance:
properties: 1000, properties.name: 1000, properties.zip: 890, properties.updated: 412,
geometry: 1000, geometry.type: 1000, geometry.coordinates: 1000
I have a (rather complicated) solution, but I suspect that some people have already faced the same issue (it seems a data-science classic) and have a better one (performance matters).
Here is my clumsy solution:
// 1: list all properties encountered in the features array, at least two levels deep
const countProps = af => af.reduce((pf, f) =>
  Array.from(new Set(pf.concat(Object.keys(f)))), []);
// adding all the properties of each individual feature, then removing duplicates using the array-set-array trick
const countProp2s = af => af.reduce((pf, f) =>
  Array.from(new Set(pf.concat(Object.keys(f.properties)))), []);
const countProp2g = af => af.reduce((pf, f) =>
  Array.from(new Set(pf.concat(Object.keys(f.geometry)))), []);

// 2: counting the number of defined occurrences of each property from list 1
const countPerProp = (ff) => pf => ` ${pf}:${ff.reduce((p, f) => p + !!f[pf], 0)}`;
const countPerProp2s = (ff) => pf => ` ${pf}:${ff.reduce((p, f) => p + !!f.properties[pf], 0)}`;
const countPerProp2g = (ff) => pf => ` ${pf}:${ff.reduce((p, f) => p + !!f.geometry[pf], 0)}`;

const cardinalities = countProps(features).map(kk => countPerProp(features)(kk)) +
  countProp2s(features).map(kk => countPerProp2s(features)(kk)) +
  countProp2g(features).map(kk => countPerProp2g(features)(kk));
Therefore, there are three issues:
- Step 1: this is a lot of work (adding everything before removing most of it) for a rather simple operation. Moreover, it isn't recursive, and the second level is "manually forced".
- Step 2: a recursive solution is probably better.
- Could steps 1 and 2 be performed in a single pass (starting to count as each new property is added)?
I would welcome any idea.
The JSON.parse reviver and the JSON.stringify replacer can be used to visit all key-value pairs:
var counts = {}, json = `[{"properties":{"name":"city1","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},{"properties":{"name":"city2","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},{"properties":{"name":"city3"},"geometry":{"type":"multiPolygon","coordinates":[[[1,2],[3,4]]]}},{"properties":{"name":"city1000","zip":1234,"updated":"May-2018"}, "geometry":{"type":"polygon","coordinates":[]}} ]`
var features = JSON.parse(json, (k, v) => (isNaN(k) && (counts[k] = counts[k] + 1 || 1), v))
console.log( counts, features )
Consider trying the following. It is just one reduce with a couple of nested forEach's inside. It checks whether the keys for the count exist in the object to be returned, and if not creates them initialized to 0. Then, whether or not those keys existed to begin with, their corresponding values get incremented by 1.
Repl is here: https://repl.it/#dexygen/countobjpropoccur2levels , code below:
const features =
[ {"properties":{"name":"city1","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},
{"properties":{"name":"city2","zip":1234}, "geometry":{"type":"polygon","coordinates":[[1,2],[3,4]]}},
{"properties":{"name":"city3"},"geometry":{"type":"multiPolygon","coordinates":[[[1,2],[3,4]]]}},
{"properties":{"name":"city1000","zip":1234,"updated":"May-2018"}, "geometry":{"type":"polygon","coordinates":[]}}
];
const featuresCount = features.reduce((count, feature) => {
  Object.keys(feature).forEach(key => {
    count[key] = count[key] || 0;
    count[key] += 1;
    Object.keys(feature[key]).forEach(key2 => {
      let count2key = `${key}.${key2}`;
      count[count2key] = count[count2key] || 0;
      count[count2key] += 1;
    });
  });
  return count;
}, {});
console.log(featuresCount);
/*
{ properties: 4,
'properties.name': 4,
'properties.zip': 3,
geometry: 4,
'geometry.type': 4,
'geometry.coordinates': 4,
'properties.updated': 1 }
*/
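To address the asker's wish for recursion, here is a sketch that generalizes the same counting to arbitrary depth, treating arrays as leaves so GeoJSON coordinate lists aren't walked (countPaths is a name invented here, not from the thread):

```javascript
// Count dotted key paths at any depth; arrays count as leaf values
const countPaths = objs => {
  const counts = {};
  const walk = (obj, prefix) => {
    for (const key of Object.keys(obj)) {
      const path = prefix ? `${prefix}.${key}` : key;
      counts[path] = (counts[path] || 0) + 1;
      const val = obj[key];
      // recurse into plain nested objects only
      if (val && typeof val === 'object' && !Array.isArray(val)) {
        walk(val, path);
      }
    }
  };
  objs.forEach(obj => walk(obj, ''));
  return counts;
};

console.log(countPaths([
  { properties: { name: 'a', zip: 1 } },
  { properties: { name: 'b' } }
]));
// properties: 2, properties.name: 2, properties.zip: 1
```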
Use polymorphic serialization of JSON with Jackson. It will look something like the snippet below. Your base interface holds all the common properties, and each variation gets its own subtype; counting instances of each type then gives what you need.
@JsonTypeInfo(use=JsonTypeInfo.Id.NAME, include=JsonTypeInfo.As.PROPERTY, property="name")
@JsonSubTypes({
  @JsonSubTypes.Type(value=Lion.class, name="lion"),
  @JsonSubTypes.Type(value=Tiger.class, name="tiger"),
})
public interface Animal { }

What is a faster way to get an object from array using find index, or getting value at a key of a map in JavaScript?

Method 1 uses an array which contains objects, and I want to get the object with a certain id.
The second method uses a Map, in which each object is stored under its id key.
To get the object from the array:
array.findIndex((x) => x.id == id)
and to get the object from the map is simple:
Map.get(id)
I want to know which method is faster and better.
Big O notation describes how long an algorithm takes to execute as a function of input length, usually considering the worst-case scenario.
When you use array.findIndex((x) => x.id == id), the whole array may be iterated: in the worst case, array.length times. So the complexity of this method is O(n).
When you use Map.get(id), a hash is calculated, and that hash leads directly to the desired item in the Map. So the complexity of this method is O(1). It is faster.
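Concretely, the usual pattern is to build the Map once, keyed by id, and reuse it for every lookup (illustrative data, not the asker's):

```javascript
const items = [
  { id: 1, name: 'alpha' },
  { id: 2, name: 'beta' },
  { id: 3, name: 'gamma' }
];

// Build once, O(n); every subsequent lookup is O(1)
const byId = new Map(items.map(item => [item.id, item]));

console.log(byId.get(2).name); // "beta"

// Compare: findIndex rescans the array on every call, O(n) per lookup
console.log(items[items.findIndex(x => x.id === 2)].name); // "beta"
```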
UPDATE:
Comment by VLAZ:
On the flip side, of course, a map takes extra space and processing.
So, if you only want to look up one or two items, it's probably not
worth it. However, if you have lots of lookups, it will have a massive
benefit speed-wise. Just throwing it out there. I'd personally opt for
a Map.
It depends.
In theory you should use Big O notation to decide. As noted, .findIndex is O(n) while map.get is O(1), which means map.get is more efficient. In practice, however, there are other considerations. For example:
What is the size of the data? Does it come already as an array that you would need to convert to a Map? How many lookups do you need to perform?
On a very small array, a simple lookup (a for loop rather than .findIndex, because of the additional function call) might be more efficient. On a very large array, building the Map takes time, so a few lookups may be more efficient without it.
Consider the following code: for a large array and only two lookups, mapping is highly inefficient. Only above roughly 15-20 lookups (depending on your CPU, memory speed, etc.) does mapping become more efficient.
'use strict';

const gen_arr = size => {
  const arr = [];
  while (size) {
    arr.push({ id: size, data: size.toString() });
    --size;
  }
  return arr;
};

const map_it = arr => {
  const map = new Map();
  for (let obj of arr) {
    map.set(obj.id, obj);
  }
  return map;
};

// Assume Node.js if performance is not defined
const perf = typeof performance != 'undefined' ? performance : { now: () => Number(process.hrtime.bigint() / 1000000n) };
let t = perf.now();
const timeit = msg => { const t1 = perf.now(); console.log(msg, 'took', t1 - t, 'ms'); t = t1; };
const start_timer = () => t = perf.now();

const arr_size = 1000000;
const arr = gen_arr(arr_size);

start_timer();
let r = arr.find(obj => obj.id === 1); // worst case: end of the array
timeit(`finding ${JSON.stringify(r)} in array`);
r = arr.find(obj => obj.id === arr_size); // best case: start of the array
timeit(`finding ${JSON.stringify(r)} in array`);

let map = map_it(arr);
timeit('mapping');

r = map.get(1);
timeit(`finding ${JSON.stringify(r)} in map`);
r = map.get(arr_size);
timeit(`finding ${JSON.stringify(r)} in map`);

JavaScript grabbing object/json by index

I know how to grab an array element by its index. I also know how to grab an object value by its key, but I don't want to grab it by key.
Here is my object:
var x = {"email":["This Position"]}
And I know I can grab This Position by writing x.email[0].
But the problem is, I can't always grab This Position with x.email[0],
because the server sometimes sends me this:
var x = {"phone":["This Position"]}
or even this: var x = {"name":["This Position"]}
So grabbing This Position that way is not possible, because for those variables I would have to write x.phone[0] or x.name[0].
And that is very tough to write for hundreds of variables.
So I want to grab This Position without writing x.email[0].
Can you give me a solution so that, whatever the key name is, I just grab the first value of the first key? Is it possible?
Use Object.values like so:
var x = {"email":["This Position"]};
const [[res]] = Object.values(x);
console.log(res);
You can use Object.values
var x = {"email":["This Position"]}
var value = Object.values(x)[0][0]
This is possible in implementations where JavaScript objects are ordered by insertion sequence. This answer along with the other answers in the thread offers a good overview of when you can rely on this and under what restrictions. The rest of this answer assumes ordering is guaranteed by the ES implementation.
The most direct solution is to use Object.values(obj)[0], Object.keys(obj)[0] or Object.entries(obj)[0]. However, these methods visit every entry in the object which results in O(n) time complexity, so these approaches are only valuable for simple use cases that aren't performance-critical.
const obj = {
  foobar: ["This position"],
  foobaz: ["That position"]
};

// succinct but linear:
console.log(Object.keys(obj)[0]);
console.log(Object.values(obj)[0]);
console.log(Object.entries(obj)[0]);

// since you want the first array element of the first value, use:
console.log(Object.values(obj)[0][0]);
Faster is to iterate over the keys with a for...in loop and return after the first one. This requires a bit more code but offers a massive complexity improvement for large objects.
// use a loop and return after the first element, O(1):
const getFirstVal = obj => {
  for (const k in obj) return obj[k];
};
const getFirstKey = obj => {
  for (const k in obj) return k;
};
const getFirstEntry = obj => {
  for (const k in obj) return [k, obj[k]];
};

const obj = {
  foobar: ["This position"],
  foobaz: ["That position"]
};

console.log(getFirstVal(obj));
console.log(getFirstKey(obj));
console.log(getFirstEntry(obj));
A general solution to the "first n entries" problem is to create a generator version of Object.entries. You can use the generator to step through the object's entries, only going as far as you need and as you need it.
The downside is that there's a bit of generator management for the caller, but this can be abstracted with a function. Use this for cases when you anticipate that you might need more than just the first entry in performance-critical code.
// use a generator function for maximum flexibility
function* iterEntries(obj) {
  for (const k in obj) yield [k, obj[k]];
}

const firstEntries = (obj, n = 1) => {
  const gen = iterEntries(obj);
  return Array(n).fill().map(_ => gen.next().value);
};

/* warning: linear search */
const entryAtIndex = (obj, i) => {
  const gen = iterEntries(obj);
  while (i--) gen.next();
  return gen.next().value;
};

const obj = Object.fromEntries(
  Array(26).fill().map((_, i) => [String.fromCharCode(i + 65), i])
);

console.log(iterEntries(obj).next().value);
console.log(firstEntries(obj, 4));
console.log(entryAtIndex(obj, 4));
You can make versions specific to values or keys as desired. As shown, this technique is also effective for "indexing" into the n-th key of an ordered object. If you find you're doing this frequently, though, consider using an array instead of (or alongside) an object; an array is the correct data structure for random index access.

Is it possible to group Observables in an object instead of an array?

I have several Observables, and I want to do something similar to Promise.all for them (perhaps .forkJoin):
const two = Observable.of(2);
const three = Observable.of(3);
const four = Observable.of(4);
But as a result I want to have an object instead of an array, because I don't want to access the data by numeric keys.
.subscribe(result => {
const two = result[0] // I don't want this
const three = result[1]
const four = result[2]
})
What I'd like is to do something like this:
.subscribe(result => {
const two = result.two // this is much better
const three = result.three
const four = result.four
})
For sure I can do something like this:
Observable
  .forkJoin(two, three, four)
  .flatMap(result => Observable.of({ two: result[0], three: result[1], four: result[2] }))
But eventually it's the same: I still pick the data out by numeric keys.
Any thoughts?
Thanks
P.S. One suggested solution is to use destructuring:
const [two, three, four] = [...result]
This feels much better, but it doesn't help at all if somebody changes the argument order of .forkJoin:
// was
.forkJoin(two, three, four)
// becomes
.forkJoin(two, four, three)
// ...
.subscribe(([two, three, four]) => {
  // say hello to "magic errors"
})
This is my main point: to avoid this kind of hard-to-find bug.
I have written a utility function that enables you to do this in RxJS 6. I am not sure what version you are using, but hopefully it is supported there.
function forkJoinObj(obj) {
  const keys = Object.keys(obj);
  const values = Object.values(obj);
  return forkJoin(values).pipe(
    map(result => result.reduce((acc, next, index) => {
      acc[keys[index]] = next;
      return acc;
    }, {}))
  );
}
See example here: https://stackblitz.com/edit/typescript-dvrff8
There's no built-in RxJS operator to do this, but you can use the following syntax:
.subscribe(([two, three, four]) => {
  // ...
})
...or this:
.subscribe(results => {
  const [two, three, four] = results
})
