Flattening an object and limiting the nesting depth - JavaScript

I would like to limit the number of flattened levels inside an object. Consider the example below.
Input:
const exampleObj = {
  foo: {
    bar: {
      biz: "hello"
    }
  }
};
Output:
{foo_bar_biz: "hello"}
I want to set a limiter (limit = 2) so that the function stops recursing at that depth and returns something like this:
{foo_bar: "[Object] object"}
Here's a snippet:
const { useState, useEffect } = React;

const exampleObj = {
  foo: {
    bar: {
      biz: "hello"
    }
  }
};

let limit = 0;

function App() {
  const [state, setState] = useState({});

  useEffect(() => {
    flatten(exampleObj);
  }, []);

  const flatten = (data, parent, result = {}) => {
    for (let key in data) {
      const propName = parent ? parent + "_" + key : key;
      if (typeof data[key] === "object") {
        limit++;
        if (limit <= 1) {
          flatten(data[key], propName, result);
        } else {
          setState(prevState => {
            return {
              ...prevState,
              [propName]:
                typeof data[key] === "object" ? "[Object] object" : data[key]
            };
          });
        }
      } else {
        result[propName] = data[key];
        setState({
          ...state,
          [propName]: data[key]
        });
      }
    }
  };

  console.log(state);

  return (
    <div>
      <p>Start editing to see some magic happen :)</p>
    </div>
  );
}

ReactDOM.render(<App />, document.getElementById("root"));
<script src="https://cdnjs.cloudflare.com/ajax/libs/react/16.8.4/umd/react.production.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/react-dom/16.8.4/umd/react-dom.production.min.js"></script>
<div id="root"></div>
I've prepared a solution for that, but it is quite cumbersome. https://stackblitz.com/edit/react-u5klvc

Here's one possible implementation of flatten. Notice how it's completely decoupled from React or any specific React component. It works on any JavaScript object -
const snakecase = s =>
  s.join("_")

const flatten = (t = {}, n = Infinity, join = snakecase) =>
{ const many = (t, n, path) =>
    n >= 0 && Object(t) === t
      ? Object.entries(t).flatMap(_ => one(_, n - 1, path))
      : [ [ join(path), t ] ]
  const one = ([ k, v ], n, path) =>
    many(v, n, [...path, k])
  return Object.fromEntries(many(t, n, []))
}
Given some example data -
const data =
  { a1: 11
  , a2: { b1: 21, b2: 22 }
  , a3: { b1: { c1: 311, c2: 312 }
        , b2: { c1: 321, c2: 322, c3: { d1: 3231 } }
        }
  }
A depth of 1 flattens one level -
flatten(data, 1)
{
  "a1": 11,
  "a2_b1": 21,
  "a2_b2": 22,
  "a3_b1": {
    "c1": 311,
    "c2": 312
  },
  "a3_b2": {
    "c1": 321,
    "c2": 322,
    "c3": {
      "d1": 3231
    }
  }
}
A depth of 2 flattens two levels -
flatten(data, 2) // => ...
{
  "a1": 11,
  "a2_b1": 21,
  "a2_b2": 22,
  "a3_b1_c1": 311,
  "a3_b1_c2": 312,
  "a3_b2_c1": 321,
  "a3_b2_c2": 322,
  "a3_b2_c3": {
    "d1": 3231
  }
}
The default depth is Infinity -
flatten(data) // => ...
{
"a1": 11,
"a2_b1": 21,
"a2_b2": 22,
"a3_b1_c1": 311,
"a3_b1_c2": 312,
"a3_b2_c1": 321,
"a3_b2_c2": 322,
"a3_b2_c3_d1": 3231
}
The default join is snakecase, but the parameter can be specified at the call site -
const camelCase = ([ first = "", ...rest ]) =>
  first + rest.map(upperFirst).join("")

const upperFirst = ([ first = "", ...rest ]) =>
  first.toUpperCase() + rest.join("")
flatten(data, 2, camelCase) // => ...
{
  "a1": 11,
  "a2B1": 21,
  "a2B2": 22,
  "a3B1C1": 311,
  "a3B1C2": 312,
  "a3B2C1": 321,
  "a3B2C2": 322,
  "a3B2C3": {
    "d1": 3231
  }
}
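Any other join works the same way. For instance (not part of the original answer, just an illustration), a kebab-case join is a one-liner:
const kebabcase = s =>
  s.join("-")

flatten(data, 2, kebabcase) // => { "a1": 11, "a2-b1": 21, "a2-b2": 22, ... }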
Expand the snippet below to verify the results in your own browser -
const data =
  { a1: 11
  , a2: { b1: 21, b2: 22 }
  , a3: { b1: { c1: 311, c2: 312 }
        , b2: { c1: 321, c2: 322, c3: { d1: 3231 } }
        }
  }

const snakecase = s =>
  s.join("_")

const flatten = (t = {}, n = Infinity, join = snakecase) =>
{ const many = (t, n, path) =>
    n >= 0 && Object(t) === t
      ? Object.entries(t).flatMap(_ => one(_, n - 1, path))
      : [ [ join(path), t ] ]
  const one = ([ k, v ], n, path) =>
    many(v, n, [...path, k])
  return Object.fromEntries(many(t, n, []))
}

const result =
  flatten(data, 2)

console.log(JSON.stringify(result, null, 2))

Thankyou's answer is great, but this is different enough to be worth posting, I think.
I keep handy functions like path and getPaths in my personal library. For this, I had to alter getPaths to accept a depth parameter (d) so it can stop recursing earlier.
With these and Object.fromEntries we can easily write a partialFlatten function that does what I think you want:
const path = (ps = []) => (obj = {}) =>
  ps .reduce ((o, p) => (o || {}) [p], obj)

const getPaths = (obj, d = Infinity) =>
  Object (obj) === obj && d > 0
    ? Object .entries (obj) .flatMap (
        ([k, v]) => getPaths (v, d - 1) .map (p => [Array.isArray(obj) ? Number(k) : k, ...p])
      )
    : [[]]

const partialFlatten = (obj, depth) =>
  Object .fromEntries (
    getPaths (obj, depth) .map (p => [p .join ('_'), path (p) (obj)])
  )

const exampleObj = {foo: {bar: {biz: "hello"}, baz: 'goodbye'}}

console .log ('depth 1:', partialFlatten (exampleObj, 1))
console .log ('depth 2:', partialFlatten (exampleObj, 2))
console .log ('depth 3:', partialFlatten (exampleObj, 3))
.as-console-wrapper {max-height: 100% !important; top: 0}


Iterative depth-first traversal with remembering value's paths

I need help with a specific implementation of an iterative depth-first traversal algorithm.
I have an object like this (it's just an example; the object might have more properties and be more deeply nested):
const root = {
  a: 1,
  b: {
    c: {
      d: {
        e: 2,
        f: 3,
      }
    },
    g: [
      {
        h: 4,
        i: 5,
      },
      {
        j: 6,
        k: 7,
      }
    ]
  }
}
What I need is a function that would traverse the whole object and return an array like this:
[
{"a": 1},
{"b.c.d.e": 2},
{"b.c.d.f": 3},
{"b.g.0.h": 4},
{"b.g.0.i": 5},
{"b.g.1.j": 6},
{"b.g.1.k": 7},
]
I managed to create an algorithm that sort of solves my problem, but it needs one additional step at the end. The result of the algorithm is an array of strings like this:
[
'a^1',
'b.c.d.e^2',
'b.c.d.f^3',
'b.g.0.h^4',
'b.g.0.i^5',
'b.g.1.j^6',
'b.g.1.k^7'
]
So in order to achieve what I want, I have to do one full iteration over the result of my algorithm, split the strings by the ^ symbol, and then create objects based on that.
This is the part that I need help with - how can I improve/change my solution so I don't need to do that last step?
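(For clarity, that final mapping step would be something like this - just a sketch, assuming the 'key^value' strings shown above:)
dft(root).map(s => {
  const [key, value] = s.split('^');
  return { [key]: value }; // note: the value stays a string here, e.g. {"a": "1"}
});
And here is the traversal function itself: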
function dft(root) {
  let stack = [];
  let result = [];
  const isObject = value => typeof value === "object";
  stack.push(root);
  while (stack.length > 0) {
    let node = stack.pop();
    if (isObject(node)) {
      Object.entries(node).forEach(([childNodeKey, childNodeValue]) => {
        if (isObject(childNodeValue)) {
          const newObject = Object.fromEntries(
            Object.entries(childNodeValue).map(([cnk, cnv]) => {
              return [`${childNodeKey}.${cnk}`, cnv];
            })
          );
          stack.push(newObject);
        } else {
          stack.push(`${childNodeKey}^${childNodeValue}`);
        }
      })
    } else {
      result.push(node);
    }
  }
  return result.reverse();
}
I'd keep (keys, value) pairs on the stack and only create a string key when storing a newly created result object:
function dft(obj) {
  let stack = []
  let res = []
  stack.push([[], obj])
  while (stack.length) {
    let [keys, val] = stack.pop()
    if (!val || typeof val !== 'object') {
      res.push({
        [keys.join('.')]: val
      })
    } else {
      Object.entries(val).forEach(p => stack.push([
        keys.concat(p[0]),
        p[1],
      ]))
    }
  }
  return res.reverse()
}

//

const root = {a: 1, b: {c: {d: {e: 2, f: 3}}, g: [{h: 4, i: 5}, {j: 6, k: 7}]}};

console.log(dft(root))
You can push the childNodeKey/childNodeValue pair directly as an object onto your result array.
Change
stack.push(`${childNodeKey}^${childNodeValue}`);
to
const newEntry = {}
newEntry[childNodeKey] = childNodeValue
result.push(newEntry);
or with ES2015 syntax (you would need a transpiler for browser compatibility)
result.push({[childNodeKey]: childNodeValue});
Complete function:
const root = {a: 1, b: {c: {d: {e: 2, f: 3}}, g: [{h: 4, i: 5}, {j: 6, k: 7}]}};

function dft(root) {
  let stack = [];
  let result = [];
  const isObject = value => typeof value === "object";
  stack.push(root);
  while (stack.length > 0) {
    let node = stack.pop();
    if (isObject(node)) {
      Object.entries(node).forEach(([childNodeKey, childNodeValue]) => {
        if (isObject(childNodeValue)) {
          const newObject = Object.fromEntries(
            Object.entries(childNodeValue).map(([cnk, cnv]) => {
              return [`${childNodeKey}.${cnk}`, cnv];
            })
          );
          stack.unshift(newObject);
        } else {
          result.push({[childNodeKey]: childNodeValue});
        }
      })
    } else {
      result.push(node);
    }
  }
  return result;
}

console.log(dft(root))
As you mentioned, you almost have it complete. Just turn each array entry into an object right before pushing it into result. By splitting with String.prototype.split('^') you can turn 'b.g.0.h^4' into ['b.g.0.h', '4']. The rest is a piece of cake:
if (isObject(node)) {
  ...
} else {
  const keyAndValue = node.split('^')
  // approach 1) dynamic key setting
  // const key = keyAndValue[0]
  // const value = keyAndValue[1]
  // result.push({[key]: value});

  // approach 2) or, in short, with dynamic key setting
  result.push({[keyAndValue[0]]: keyAndValue[1]});
}
You could use a stack where each item has an iterator over the children, and the path up to that point:
function collect(root) {
  const Node = (root, path) =>
    ({ iter: Object.entries(root)[Symbol.iterator](), path });
  const result = [];
  const stack = [Node(root, "")];
  while (stack.length) {
    const node = stack.pop();
    const {value} = node.iter.next();
    if (!value) continue;
    stack.push(node);
    const [key, child] = value;
    const path = node.path ? node.path + "." + key : key;
    if (Object(child) !== child) result.push({ [path]: child });
    else stack.push(Node(child, path));
  }
  return result;
}
const root = {a:1,b:{c:{d:{e:2,f:3}},g:[{h:4,i:5},{j:6,k:7}]}};
console.log(collect(root));
I would suggest that the quickest fix to your code is simply to replace
return result.reverse();
with
return result.reverse()
.map ((s, _, __, [k, v] = s .split ('^')) => ({[k]: v}));
But I also think that we can write code to do this more simply. A function I use often will convert your input into something like this:
[
[["a"], 1],
[["b", "c", "d", "e"], 2],
[["b", "c", "d", "f"], 3],
[["b", "g", 0, "h"], 4],
[["b", "g", 0, "i"], 5],
[["b", "g", 1, "j"], 6],
[["b", "g", 1, "k"], 7]
]
and a fairly trivial wrapper can then convert this to your output. It could look like this:
const pathEntries = (obj) =>
  Object (obj) === obj
    ? Object .entries (obj) .flatMap (
        ([k, x]) => pathEntries (x) .map (([p, v]) => [[Array.isArray(obj) ? Number(k) : k, ... p], v])
      )
    : [[[], obj]]

const transform = (o) =>
  pathEntries (o)
    .map (([k, v]) => ({[k .join ('.')] : v}))

const root = {a: 1, b: {c: {d: {e: 2, f: 3, }}, g: [{h: 4, i: 5, }, {j: 6, k: 7}]}}

console .log (transform (root))
.as-console-wrapper {max-height: 100% !important; top: 0}
I don't know your use case, but I would find this output generally more helpful:
{
"a": 1,
"b.c.d.e": 2,
"b.c.d.f": 3,
"b.g.0.h": 4,
"b.g.0.i": 5,
"b.g.1.j": 6,
"b.g.1.k": 7
}
(that is, one object with a number of properties, rather than an array of single-property objects.)
And we could do this nearly as easily, with a small change to transform:
const transform = (o) =>
  pathEntries (o)
    .reduce ((a, [k, v]) => ((a[k .join ('.')] = v), a), {})

How to obtain a master structure for a json file?

I have a JSON file as follows:
[
  {
    "dog": "lmn",
    "tiger": [
      {
        "bengoltiger": { "height": { "x": 4 } },
        "indiantiger": { "paw": "a", "foor": "b" }
      },
      {
        "bengoltiger": { "width": { "a": 8 } },
        "indiantiger": { "b": 3 }
      }
    ]
  },
  {
    "dog": "pqr",
    "tiger": [
      {
        "bengoltiger": { "width": { "m": 3 } },
        "indiantiger": { "paw": "a", "foor": "b" }
      },
      {
        "bengoltiger": { "height": { "n": 8 } },
        "indiantiger": { "b": 3 }
      }
    ],
    "lion": 90
  }
]
I want to transform this to obtain all possible properties of any object at any nesting level. For arrays, the first object should contain all the properties. The values themselves are trivial; the expected output below simply keeps the first encountered value for any property (for example, "lmn" is preserved for the "dog" property).
Expected output:
[
  {
    "dog": "lmn",
    "tiger": [
      {
        "bengoltiger": {
          "height": { "x": 4, "n": 8 },
          "width": { "a": 8, "m": 3 }
        },
        "indiantiger": { "paw": "a", "foor": "b", "b": 3 }
      }
    ],
    "lion": 90
  }
]
Here's a recursive function I tried before this nesting problem struck me
function consolidateArray(json) {
  if (Array.isArray(json)) {
    const reference = json[0];
    json.forEach(function(element) {
      for (var key in element) {
        if (!reference.hasOwnProperty(key)) {
          reference[key] = element[key];
        }
      }
    });
    json.splice(1);
    this.consolidateArray(json[0]);
  } else if (typeof json === 'object') {
    for (var key in json) {
      if (json.hasOwnProperty(key)) {
        this.consolidateArray(json[key]);
      }
    }
  }
};
var json = [
  {
    "dog": "lmn",
    "tiger": [
      { "bengoltiger": { "height": { "x": 4 } }, "indiantiger": { "paw": "a", "foor": "b" } },
      { "bengoltiger": { "width": { "a": 8 } }, "indiantiger": { "b": 3 } }
    ]
  },
  {
    "dog": "pqr",
    "tiger": [
      { "bengoltiger": { "width": { "m": 3 } }, "indiantiger": { "paw": "a", "foor": "b" } },
      { "bengoltiger": { "height": { "n": 8 } }, "indiantiger": { "b": 3 } }
    ],
    "lion": 90
  }
];
consolidateArray(json);
alert(JSON.stringify(json, null, 2));
Here is the general logic using this new JNode IIFE, with comments - ask someone cleverer than me if you don't understand something ;-)
Note that level starts from 1, as there is no root object at #start.
var json;
function DamnDemo() {
json = DemoJSON();
var it = new JNode(json), it2 = it;
var levelKeys = []; /* A bit crazy structure:
[
levelN:{
keyA:[JNode, JNode,...],
keyB:[JNode, JNode,...],
...
},
levelM:...
]
*/
do {
var el = levelKeys[it.level]; // array of level say LevelN or undefined
el = levelKeys[it.level] = el || {}; // set 2 empty it if does not exist
el = el[it.key] = el[it.key] || []; // key array in say levelN
el.push(it); // save current node indexing by level, key -> array
} while (it = it.DepthFirst()) // traverse all nodes
for(var l1 in levelKeys) { // let start simply by iterating levels
l2(levelKeys[l1]);
}
console.log(JSON.stringify(json, null, 2));
}
function l2(arr) { // fun starts here...
var len = 0, items = []; // size of arr, his items to simple Array
for(var ln in arr) { // It's a kind of magic here ;-) Hate recursion, but who want to rewrite it ;-)
if (arr[ln] instanceof JNode) return 1; // End of chain - our JNode for traverse of length 1
len += l2(arr[ln]);
items.push(arr[ln]);
}
if (len == 2) { // we care only about 2 items to move (getting even 3-5)
//console.log(JSON.stringify(json));
if (!isNaN(items[0][0].key) || (items[0][0].key == items[1][0].key)) { // key is number -> ignore || string -> must be same
console.log("Keys 2B moved:", items[0][0].key, items[1][0].key, "/ level:", items[0][0].level);
var src = items[1][0]; // 2nd similar JNode
moveMissing(items[0][0].obj, src.obj); // move to 1st
//console.log(JSON.stringify(json));
if (src.level == 1) { // top level cleaning
delete src.obj;
delete json[src.key]; // remove array element
if (!json[json.length-1]) json.length--; // fix length - hope it was last one (there are other options, but do not want to overcomplicate logic)
} else {
var parent = src.parent;
var end = 0;
for(var i in parent.obj) {
end++;
if (parent.obj[i] == src.obj) { // we found removed in parent's array
delete src.obj; // delete this empty object
delete parent.obj[i]; // and link on
end = 1; // stupid marker
}
}
if (end == 1 && parent.obj instanceof Array) parent.obj.length--; // fix length - now only when we are on last element
}
} else console.log("Keys left:", items[0][0].key, items[1][0].key, "/ level:", items[0][0].level); // keys did not match - do not screw it up, but report it
}
return len;
}
function moveMissing(dest, src) {
for(var i in src) {
if (src[i] instanceof Object) {
if (!dest[i]) { // uff object, but not in dest
dest[i] = src[i];
} else { // copy object over object - let it bubble down...
moveMissing(dest[i], src[i]);
}
delete src[i];
} else { // we have value here, check if it does not exist, move and delete source
if (!dest[i]) {
dest[i] = src[i];
delete src[i];
}
}
}
}
// JSON_Node_Iterator_IIFE.js
'use strict';
var JNode = (function (jsNode) {
function JNode(json, parent, pred, key, obj, fill) {
var node, pred = null;
if (parent === undefined) {
parent = null;
} else if (fill) {
this.parent = parent;
this.pred = pred;
this.node = null;
this.next = null;
this.key = key;
this.obj = obj;
return this;
}
var current;
var parse = (json instanceof Array);
for (var child in json) {
if (parse) child = parseInt(child);
var sub = json[child];
node = new JNode(null, parent, pred, child, sub, true);
if (pred) {
pred.next = node;
node.pred = pred;
}
if (!current) current = node;
pred = node;
}
return current;
}
JNode.prototype = {
get hasNode() {
if (this.node) return this.node;
return (this.obj instanceof Object);
},
get hasOwnKey() { return this.key && (typeof this.key != "number"); },
get level() {
var level = 1, i = this;
while(i = i.parent) level++;
return level;
},
Down: function() {
if (!this.node && this.obj instanceof Object) {
this.node = new JNode(this.obj, this);
}
return this.node;
},
Stringify: function() { // Raw test stringify - #s are taken same as strings
var res;
if (typeof this.key == "number") {
res = '[';
var i = this;
do {
if (i.node) res += i.node.Stringify();
else res += "undefined";
i = i.next;
if (i) res += ','
} while(i);
res += ']';
} else {
res = '{' + '"' + this.key + '":';
res += (this.node?this.node.Stringify():this.hasNode?"undefined":'"'+this.obj+'"');
var i = this;
while (i = i.next) {
res += ',' + '"' + i.key + '":';
if (i.node) res += i.node.Stringify();
else {
if (i.obj instanceof Object) res += "undefined";
else res += '"' + i.obj + '"';
}
};
res += '}';
}
return res;
},
DepthFirst: function () {
if (this == null) return 0; // exit sign
if (this.node != null || this.obj instanceof Object) {
return this.Down(); // moved down
} else if (this.next != null) {
return this.next;// moved right
} else {
var i = this;
while (i != null) {
if (i.next != null) {
return i.next; // returned up & moved next
}
i = i.parent;
}
}
return 0; // exit sign
}
}
return JNode;
})();
// Fire test
DamnDemo();
function DemoJSON() {
  return [
    {
      "dog": "lmn",
      "tiger": [
        { "bengoltiger": { "height": { "x": 4 } }, "indiantiger": { "paw": "a", "foor": "b" } },
        { "bengoltiger": { "width": { "a": 8 } }, "indiantiger": { "b": 3 } }
      ]
    },
    {
      "dog": "pqr",
      "tiger": [
        { "bengoltiger": { "width": { "m": 3 } }, "indiantiger": { "paw": "a", "foor": "b" } },
        { "bengoltiger": { "height": { "n": 8 } }, "indiantiger": { "b": 3 } }
      ],
      "lion": 90
    }
  ];
}
This was an interesting problem. Here's what I came up with:
// Utility functions
const isInt = Number.isInteger

const path = (ps = [], obj = {}) =>
  ps .reduce ((o, p) => (o || {}) [p], obj)

const assoc = (prop, val, obj) =>
  isInt (prop) && Array .isArray (obj)
    ? [... obj .slice (0, prop), val, ...obj .slice (prop + 1)]
    : {...obj, [prop]: val}

const assocPath = ([p = undefined, ...ps], val, obj) =>
  p == undefined
    ? obj
    : ps.length == 0
      ? assoc(p, val, obj)
      : assoc(p, assocPath(ps, val, obj[p] || (obj[p] = isInt(ps[0]) ? [] : {})), obj)

// Helper functions
function * getPaths(o, p = []) {
  if (Object(o) !== o || Object .keys (o) .length == 0) yield p
  if (Object(o) === o)
    for (let k of Object .keys (o))
      yield * getPaths (o[k], [...p, isInt (Number (k)) ? Number (k) : k])
}

const canonicalPath = (path) =>
  path.map (n => isInt (Number (n)) ? 0 : n)

const splitPaths = (xs) =>
  Object .values ( xs.reduce (
    (a, p, _, __, cp = canonicalPath (p), key = cp .join ('\u0000')) =>
      ({...a, [key]: a [key] || {canonical: cp, path: p} })
  , {}
  ))

// Main function
const canonicalRep = (data) => splitPaths ([...getPaths (data)])
  .reduce (
    (a, {path:p, canonical}) => assocPath(canonical, path(p, data), a),
    Array.isArray(data) ? [] : {}
  )

// Test
const data = [{"dog": "lmn", "tiger": [{"bengoltiger": {"height": {"x": 4}}, "indiantiger": {"foor": "b", "paw": "a"}}, {"bengoltiger": {"width": {"a": 8}}, "indiantiger": {"b": 3}}]}, {"dog": "pqr", "lion": 90, "tiger": [{"bengoltiger": {"width": {"m": 3}}, "indiantiger": {"foor": "b", "paw": "a"}}, {"bengoltiger": {"height": {"n": 8}}, "indiantiger": {"b": 3}}]}]

console .log (
  canonicalRep (data)
)
The first few functions are plain utility functions that I would keep in a system library. They have plenty of uses outside this code:
isInt is simply a first-class function alias to Number.isInteger
path finds the nested property of an object along a given pathway
path(['b', 1, 'c'], {a: 10, b: [{c: 20, d: 30}, {c: 40}], e: 50}) //=> 40
assoc returns a new object cloning your original, but with the value of a certain property set to or replaced with the supplied one.
assoc('c', 42, {a: 1, b: 2, c: 3, d: 4}) //=> {a: 1, b: 2, c: 42, d: 4}
Note that internal objects are shared by reference where possible.
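For instance (a quick illustration, not from the original answer), the untouched nested values point to the same objects:
const original = { a: { x: 1 }, b: 2 }
const updated = assoc ('b', 42, original)

updated                  //=> { a: { x: 1 }, b: 42 }
updated.a === original.a //=> true - the nested object is shared, not cloned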
assocPath does this same thing, but with a deeper path, building nodes as needed.
assocPath(['a', 'b', 1, 'c', 'd'], 42, {a: {b: [{x: 1}, {x: 2}], e: 3}})
//=> {a: {b: [{x: 1}, {c: {d: 42}, x: 2}], e: 3}}
Except for isInt, these borrow their APIs from Ramda. (Disclaimer: I'm a Ramda author.) But these are unique implementations.
The next function, getPaths, is an adaptation of one from another SO answer. It lists all the paths in your object in the format used by path and assocPath, returning arrays whose elements are integers if the relevant nested object is an array and strings otherwise. Unlike the function from which it was borrowed, it only returns paths to leaf values.
For your original object, it returns an iterator for this data:
[
[0, "dog"],
[0, "tiger", 0, "bengoltiger", "height", "x"],
[0, "tiger", 0, "indiantiger", "foor"],
[0, "tiger", 0, "indiantiger", "paw"],
[0, "tiger", 1, "bengoltiger", "width", "a"],
[0, "tiger", 1, "indiantiger", "b"],
[1, "dog"],
[1, "lion"],
[1, "tiger", 0, "bengoltiger", "width", "m"],
[1, "tiger", 0, "indiantiger", "foor"],
[1, "tiger", 0, "indiantiger", "paw"],
[1, "tiger", 1, "bengoltiger", "height", "n"],
[1, "tiger", 1, "indiantiger", "b"]
]
If I wanted to spend more time on this, I would replace that version of getPaths with a non-generator version, just to keep this code consistent. It shouldn't be hard, but I'm not interested in spending more time on it.
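For what it's worth, a non-generator version might look something like this (a sketch on my part, not code from the original answer):
// same traversal as the generator getPaths above, but returning a plain array
const getPathsArr = (o, p = []) =>
  Object (o) === o && Object .keys (o) .length > 0
    ? Object .keys (o) .flatMap (k => getPathsArr (o [k], [...p, isInt (Number (k)) ? Number (k) : k]))
    : [p]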
We can't use those results directly to build your output, since they refer to array elements beyond the first one. That's where splitPaths and its helper canonicalPath come in. We create the canonical paths by replacing all integers with 0, giving us a data structure like this:
[{
  canonical: [0, "dog"],
  path:      [0, "dog"]
}, {
  canonical: [0, "tiger", 0, "bengoltiger", "height", "x"],
  path:      [0, "tiger", 0, "bengoltiger", "height", "x"]
}, {
  canonical: [0, "tiger", 0, "indiantiger", "foor"],
  path:      [0, "tiger", 0, "indiantiger", "foor"]
}, {
  canonical: [0, "tiger", 0, "indiantiger", "paw"],
  path:      [0, "tiger", 0, "indiantiger", "paw"]
}, {
  canonical: [0, "tiger", 0, "bengoltiger", "width", "a"],
  path:      [0, "tiger", 1, "bengoltiger", "width", "a"]
}, {
  canonical: [0, "tiger", 0, "indiantiger", "b"],
  path:      [0, "tiger", 1, "indiantiger", "b"]
}, {
  canonical: [0, "lion"],
  path:      [1, "lion"]
}, {
  canonical: [0, "tiger", 0, "bengoltiger", "width", "m"],
  path:      [1, "tiger", 0, "bengoltiger", "width", "m"]
}, {
  canonical: [0, "tiger", 0, "bengoltiger", "height", "n"],
  path:      [1, "tiger", 1, "bengoltiger", "height", "n"]
}]
Note that this function also removes duplicate canonical paths. We originally had both [0, "tiger", 0, "indiantiger", "foor"] and [1, "tiger", 0, "indiantiger", "foor"], but the output only contains the first one.
It does this by storing them in an object under a key created by joining the path together with the non-printable character \u0000. This was the easiest way to accomplish the task, but there is an extremely unlikely failure mode [1], so if we really wanted, we could do more sophisticated duplicate checking. I wouldn't bother.
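To make the deduplication concrete (an illustration only, not from the original answer), two paths that differ solely in their array indices collapse to the same key:
canonicalPath ([0, "tiger", 0, "indiantiger", "foor"]) .join ('\u0000') ===
  canonicalPath ([1, "tiger", 0, "indiantiger", "foor"]) .join ('\u0000')
//=> true, so only the first {canonical, path} pair for that key is kept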
Finally, the main function, canonicalRep, builds a representation out of your object by calling splitPaths and folding over the result, using canonical to say where to put the new data, and applying the path function to the path property and the original object.
Our final output, as requested, looks like this:
[
  {
    dog: "lmn",
    lion: 90,
    tiger: [
      {
        bengoltiger: {
          height: { n: 8, x: 4 },
          width: { a: 8, m: 3 }
        },
        indiantiger: { b: 3, foor: "b", paw: "a" }
      }
    ]
  }
]
What's fascinating for me is that I saw this as an interesting programming challenge, although I couldn't really imagine any practical uses for it. But now that I've coded it, I realize it will solve a problem in my current project that I'd put aside a few weeks ago. I will probably implement this on Monday!
Update
Some comments discuss a problem where a subsequent empty value tries to override a prior filled value, causing a loss of data.
This version attempts to alleviate this with the following main function:
const canonicalRep = (data) => splitPaths ([...getPaths (data)])
  .reduce (
    (a, {path: p, canonical}, _, __, val = path(p, data)) =>
      isEmpty(val) && !isEmpty(path(canonical, a))
        ? a
        : assocPath(canonical, val, a),
    Array.isArray(data) ? [] : {}
  )
using a simple isEmpty helper function:
const isEmpty = (x) =>
x == null || (typeof x == 'object' && Object.keys(x).length == 0)
You might want to update or expand this helper in various ways.
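For example (my assumption of one possible extension, not something the answer prescribes), blank strings could also count as empty:
const isEmpty = (x) =>
  x == null
  || (typeof x == 'string' && x .trim () == '')   // hypothetical extra case
  || (typeof x == 'object' && Object .keys (x) .length == 0)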
My first pass worked fine with the alternate data supplied, but not when I switched the two entries in the outer array. I fixed that, and also made sure that an empty value is kept if it's not overridden with actual data (that's the z property in my test object.)
I believe this snippet solves the original problem and the new one:
// Utility functions
const isInt = Number.isInteger
const path = (ps = [], obj = {}) =>
ps .reduce ((o, p) => (o || {}) [p], obj)
const assoc = (prop, val, obj) =>
isInt (prop) && Array .isArray (obj)
? [... obj .slice (0, prop), val, ...obj .slice (prop + 1)]
: {...obj, [prop]: val}
const assocPath = ([p = undefined, ...ps], val, obj) =>
p == undefined
? obj
: ps.length == 0
? assoc(p, val, obj)
: assoc(p, assocPath(ps, val, obj[p] || (obj[p] = isInt(ps[0]) ? [] : {})), obj)
const isEmpty = (x) =>
x == null || (typeof x == 'object' && Object.keys(x).length == 0)
function * getPaths(o, p = []) {
if (Object(o) !== o || Object .keys (o) .length == 0) yield p
if (Object(o) === o)
for (let k of Object .keys (o))
yield * getPaths (o[k], [...p, isInt (Number (k)) ? Number (k) : k])
}
// Helper functions
const canonicalPath = (path) =>
path.map (n => isInt (Number (n)) ? 0 : n)
const splitPaths = (xs) =>
Object .values ( xs.reduce (
(a, p, _, __, cp = canonicalPath (p), key = cp .join ('\u0000')) =>
({...a, [key]: a [key] || {canonical: cp, path: p} })
, {}
))
// Main function
const canonicalRep = (data) => splitPaths ([...getPaths (data)])
.reduce (
(a, {path: p, canonical}, _, __, val = path(p, data)) =>
isEmpty(val) && !isEmpty(path(canonical, a))
? a
: assocPath(canonical, val, a),
Array.isArray(data) ? [] : {}
)
// Test data
const data1 = [{"dog": "lmn", "tiger": [{"bengoltiger": {"height": {"x": 4}}, "indiantiger": {"foor": "b", "paw": "a"}}, {"bengoltiger": {"width": {"a": 8}}, "indiantiger": {"b": 3}}]}, {"dog": "pqr", "lion": 90, "tiger": [{"bengoltiger": {"width": {"m": 3}}, "indiantiger": {"foor": "b", "paw": "a"}}, {"bengoltiger": {"height": {"n": 8}}, "indiantiger": {"b": 3}}]}]
const data2 = [{"d": "Foreign Trade: Export/Import: Header Data", "a": "false", "f": [{"g": "TRANSPORT_MODE", "i": "2"}, {"k": "System.String", "h": "6"}], "l": "true"}, {"a": "false", "f": [], "l": "false", "z": []}]
const data3 = [data2[1], data2[0]]
// Demo
console .log (canonicalRep (data1))
console .log (canonicalRep (data2))
console .log (canonicalRep (data3))
.as-console-wrapper {max-height: 100% !important; top: 0}
Why not change assoc?
This update grew out of discussion after I rejected an edit attempt to do the same sort of empty-checking inside assoc. I rejected that as too far removed from the original attempt. When I learned what it was supposed to do, I knew that what had to be changed was canonicalRep or one of its immediate helper functions.
The rationale is simple. assoc is a general-purpose utility function designed to do a shallow clone of an object, changing the named property to the new value. This should not have complex logic regarding whether the value is empty. It should remain simple.
By introducing the isEmpty helper function, we can do all this with only a minor tweak to canonicalRep.
[1] That failure mode could happen if you had certain nodes containing that separator, \u0000. For instance, if you had paths [...nodes, "abc\u0000", "def", ...nodes] and [...nodes, "abc", "\u0000def", ...nodes], they would both map to "...abc\u0000\u0000def...". If this is a real concern, we could certainly use other forms of deduplication.

sorting based on two parameters, using a closure

I have a list of objects. I want to sort it based on two parameters. I created a sort function, but I'm sure it can be refactored. Any suggestions?
const scoreArray = [
  { "code": "NOR", "g": 11, "s": 5, "b": 10 },
  { "code": "RUS", "g": 13, "s": 11, "b": 9 },
  { "code": "NED", "g": 8, "s": 7, "b": 9 }
]
var getCompareFunction = (primaryKey, secondaryKey) => {
  return ((primaryKey, secondaryKey) => {
    return (x, y) => {
      if (x[primaryKey] === y[primaryKey]) {
        if (x[secondaryKey] > y[secondaryKey]) {
          return -1
        } else if (x[secondaryKey] < y[secondaryKey]) {
          return 1
        } else {
          return 0
        }
      } else if (x[primaryKey] > y[primaryKey]) {
        return -1
      } else if (x[primaryKey] < y[primaryKey]) {
        return 1
      }
    }
  })(primaryKey, secondaryKey)
}
const compareFuncHolder = getCompareFunction('s', 'b')
scoreArray.sort(compareFuncHolder);
console.log(scoreArray);
You can use partial application without IIFE. In addition, you can use the difference from the primaries, and if the result is 0 take the difference from the secondaries using short-circuit evaluation:
const scoreArray = [{"code":"NOR","g":11,"s":5,"b":10},{"code":"RUS","g":13,"s":11,"b":9},{"code":"NED","g":8,"s":7,"b":9}]
const getCompareFunction = (primaryKey, secondaryKey) => (x, y) =>
y[primaryKey] - x[primaryKey] || y[secondaryKey] - x[secondaryKey]
const compareFuncHolder = getCompareFunction('b', 's')
scoreArray.sort(compareFuncHolder);
console.log(scoreArray);
If you need to compare string values as well as numbers, you can use a < or >, and cast the result of the comparison to a number (false -> 0, true -> 1) using the +/- operators (see bergi's comment):
const scoreArray = [{"code":"NOR","g":11,"s":5,"b":10},{"code":"RUS","g":13,"s":11,"b":9},{"code":"NED","g":8,"s":7,"b":9}]
const compare = (x, y) => +(x > y) || -(y > x);
const getCompareFunction = (primaryKey, secondaryKey) => (x, y) =>
compare(y[primaryKey], x[primaryKey]) || compare(y[secondaryKey], x[secondaryKey])
const compareFuncHolder = getCompareFunction('code', 's')
scoreArray.sort(compareFuncHolder);
console.log(scoreArray);
You could take a nested approach with a closure and a short circuit, iterating the keys and using their comparison results for sorting.
This works for an arbitrary count of compare keys.
const
genCompareFn = (...keys) => (v => (a, b) => keys.some(k => v = (a[k] > b[k]) - (a[k] < b[k])) && v)(),
scoreArray = [{ code: "NOR", g: 11, s: 5, b: 10 }, { code: "RUS", g: 13, s: 11, b: 9 }, { code: "NED", g: 8, s: 7, b: 9 }],
compareFn1 = genCompareFn('s', 'b'),
compareFn2 = genCompareFn('b', 'g');
console.log(scoreArray.sort(compareFn1));
console.log(scoreArray.sort(compareFn2));
.as-console-wrapper { max-height: 100% !important; top: 0; }
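Because genCompareFn takes rest parameters, nothing stops you from passing three or more keys, for example:
// not in the original snippet - just demonstrating the arbitrary key count
const compareFn3 = genCompareFn('g', 's', 'b');
console.log(scoreArray.sort(compareFn3));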

Make uniqBy with es6

const uniqBy = (arr, fn) => {
  let res = []
  const hash = new Set()
  for (let i = 0; i < arr.length; i++) {
    const foo = typeof fn === 'function' ? fn(arr[i]) : arr[i][fn]
    if (!hash.has(foo)) {
      res.push(arr[i])
      hash.add(foo)
    }
  }
  return res
}
console.log(uniqBy([2.1, 1.2, 2.3], Math.floor)) // [2.1, 1.2]
console.log(uniqBy([{ x: 1 }, { x: 2 }, { x: 1 }], 'x')) // [{x: 1}, {x: 2}]
That's my code for uniqBy, but it's too long. I'd like a better way with less code.
const uniqBy = (arr, fn, set = new Set) => arr.filter(el => (v => !set.has(v) && set.add(v))(typeof fn === "function" ? fn(el) : el[fn]));
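For reference, it passes the same checks as the original:
console.log(uniqBy([2.1, 1.2, 2.3], Math.floor)) // [2.1, 1.2]
console.log(uniqBy([{ x: 1 }, { x: 2 }, { x: 1 }], 'x')) // [{x: 1}, {x: 2}]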
Map is a decent alternative to Set when comparing the uniqueness of variables using keys.
// UniqBy.
const uniqBy = (arr, fn) => [...new Map(arr.reverse().map((x) => [typeof fn === 'function' ? fn(x) : x[fn], x])).values()]
// Proof.
console.log(uniqBy([2.1, 1.2, 2.3], Math.floor)) // [2.1, 1.2]
console.log(uniqBy([{x: 1}, {x: 2}, {x: 1}], 'x')) // [{x: 1},{x: 2}]
This is my take on it:
const uniqBy = (arr, fn, obj={}) => ((arr.reverse().forEach(item => obj[typeof fn === 'function' ? fn(item) : item[fn]] = item)), Object.values(obj))
console.log(uniqBy([2.1, 2.3, 3.2], Math.floor))
console.log(uniqBy([{x:1}, {x:2}, {x:1}, {x:4}], 'x'))

Swap key with value in object

I have an extremely large JSON object structured like this:
{A : 1, B : 2, C : 3, D : 4}
I need a function that can swap the values with keys in my object and I don't know how to do it. I would need an output like this:
{1 : A, 2 : B, 3 : C, 4 : D}
Is there any way I can do this without manually creating a new object where everything is swapped?
Thanks
function swap(json){
  var ret = {};
  for(var key in json){
    ret[json[key]] = key;
  }
  return ret;
}
Example here: FIDDLE. Don't forget to open your console to see the results.
ES6 versions:
static objectFlip(obj) {
  const ret = {};
  Object.keys(obj).forEach(key => {
    ret[obj[key]] = key;
  });
  return ret;
}
Or using Array.reduce() & Object.keys()
static objectFlip(obj) {
  return Object.keys(obj).reduce((ret, key) => {
    ret[obj[key]] = key;
    return ret;
  }, {});
}
Or using Array.reduce() & Object.entries()
static objectFlip(obj) {
  return Object.entries(obj).reduce((ret, entry) => {
    const [ key, value ] = entry;
    ret[ value ] = key;
    return ret;
  }, {});
}
Now that we have Object.fromEntries:
const f = obj => Object.fromEntries(Object.entries(obj).map(a => a.reverse()))
console.log(
f({A : 'a', B : 'b', C : 'c'})
) // => {a : 'A', b : 'B', c : 'C'}
or:
const f = obj => Object.fromEntries(Object.entries(obj).map(([k, v]) => [v, k]))
console.log(
f({A : 'a', B : 'b', C : 'c'})
) // => {a : 'A', b : 'B', c : 'C'}
(Updated to remove superfluous parentheses - thanks @devin-g-rhode)
You can use the lodash function _.invert. It can also be used with multiValue:
var object = { 'a': 1, 'b': 2, 'c': 1 };
_.invert(object);
// => { '1': 'c', '2': 'b' }
// with `multiValue`
_.invert(object, true);
// => { '1': ['a', 'c'], '2': ['b'] }
Using ES6:
const obj = { a: "aaa", b: "bbb", c: "ccc", d: "ddd" };
Object.assign({}, ...Object.entries(obj).map(([a,b]) => ({ [b]: a })))
Get the keys of the object, and then use the Array's reduce function to go through each key and set the value as the key, and the key as the value.
const data = {
  A: 1,
  B: 2,
  C: 3,
  D: 4
}

const newData = Object.keys(data).reduce(function(obj, key) {
  obj[data[key]] = key;
  return obj;
}, {});

console.log(newData);
In ES6/ES2015 you can combine use of Object.keys and reduce with the new Object.assign function, an arrow function, and a computed property name for a pretty straightforward single statement solution.
const foo = { a: 1, b: 2, c: 3 };
const bar = Object.keys(foo)
.reduce((obj, key) => Object.assign({}, obj, { [foo[key]]: key }), {});
If you're transpiling using the object spread operator (stage 3 as of writing this) that will simplify things a bit further.
const foo = { a: 1, b: 2, c: 3 };
const bar = Object.keys(foo)
.reduce((obj, key) => ({ ...obj, [foo[key]]: key }), {});
Finally, if you have Object.entries available (stage 4 as of writing), you can clean up the logic a touch more (IMO).
const foo = { a: 1, b: 2, c: 3 };
const bar = Object.entries(foo)
.reduce((obj, [key, value]) => ({ ...obj, [value]: key }), {});
2021 answer
Here is a concise way using ES6 syntax:
const obj = {A : 1, B : 2, C : 3, D : 4}
console.log(
Object.entries(obj).reduce((acc, [key, value]) => (acc[value] = key, acc), {})
);
Explanation:
(acc[value] = key, acc)
This uses the comma operator (,):
"The comma operator (,) evaluates each of its operands (from left to right) and returns the value of the last operand."
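A tiny standalone example of that behavior:
const acc = {};
const returned = (acc['1'] = 'A', acc); // the assignment runs, then acc is returned
console.log(returned === acc); // true
console.log(acc);              // { '1': 'A' }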
As a complement to @joslarson's and @jPO's answers:
Without needing ES6, you can use Object.keys, Array.prototype.reduce, and the comma operator:
Object.keys(foo).reduce((obj, key) => (obj[foo[key]] = key, obj), {});
Some may find it ugly, but it's "kinda" quicker as the reduce doesn't spread all the properties of the obj on each loop.
Using Ramda:
const swapKeysWithValues =
R.pipe(
R.keys,
R.reduce((obj, k) => R.assoc(source[k], k, obj), {})
);
const result = swapKeysWithValues(source);
Try
let swap = (o,r={})=> Object.keys(o).map(k=> r[o[k]]=k) && r;
let obj = {A : 1, B : 2, C : 3, D : 4};
let swap = (o,r={})=> Object.keys(o).map(k=> r[o[k]]=k) && r;
console.log(swap(obj));
With pure Ramda in a pure and point-free style:
const swapKeysAndValues = R.pipe(
R.toPairs,
R.map(R.reverse),
R.fromPairs,
);
Or, with a slightly more convoluted ES6 version, still purely functional:
const swapKeysAndValues2 = obj => Object
.entries(obj)
.reduce((newObj, [key, value]) => ({...newObj, [value]: key}), {})
Shortest one I came up with using ES6..
const original = {
first: 1,
second: 2,
third: 3,
fourth: 4,
};
const modified = Object
.entries(original)
.reduce((all, [key, value]) => ({ ...all, [value]: key }), {});
console.log('modified result:', modified);
var data = {A : 1, B : 2, C : 3, D : 4}
var newData = {};
Object.keys(data).forEach(function(key){newData[data[key]]=key});
console.log(newData);
Here is a pure functional implementation of flipping keys and values in ES6:
TypeScript
const flipKeyValues = (originalObj: {[key: string]: string}): {[key: string]: string} => {
if(typeof originalObj === "object" && originalObj !== null ) {
return Object
.entries(originalObj)
.reduce((
acc: {[key: string]: string},
[key, value]: [string, string],
) => {
acc[value] = key
return acc;
}, {})
} else {
return {};
}
}
JavaScript
const flipKeyValues = (originalObj) => {
if(typeof originalObj === "object" && originalObj !== null ) {
return Object
.entries(originalObj)
.reduce((acc, [key, value]) => {
acc[value] = key
return acc;
}, {})
} else {
return {};
}
}
const obj = {foo: 'bar'}
console.log("ORIGINAL: ", obj)
console.log("FLIPPED: ", flipKeyValues(obj))
function swapKV(obj) {
const entrySet = Object.entries(obj);
const reversed = entrySet.map(([k, v])=>[v, k]);
const result = Object.fromEntries(reversed);
return result;
}
This can make your object, {A : 1, B : 2, C : 3, D : 4}, array-like, so you can have
const o = {A : 1, B : 2, C : 3, D : 4}
const arrayLike = swapKV(o);
arrayLike.length = 5;
const array = Array.from(arrayLike);
array.shift(); // undefined
array; // ["A", "B", "C", "D"]
Here is an option that swaps keys with values without losing duplicates. If your object is { a: 1, b: 2, c: 2 }, it always returns arrays in the output:
function swapMap(map) {
  const invertedMap = {};
  for (const key in map) {
    const value = map[key];
    invertedMap[value] = invertedMap[value] || [];
    invertedMap[value].push(key);
  }
  return invertedMap;
}
swapMap({a: "1", b: "2", c: "2"})
// Returns => {"1": ["a"], "2":["b", "c"]}
A simple TypeScript variant:
const reverseMap = (map: { [key: string]: string }) => {
  return Object.keys(map).reduce((prev, key) => {
    const value = map[key];
    // collect every key that shares the same value under that value
    return { ...prev, [value]: [...(prev[value] || []), key] };
  }, {} as { [key: string]: string[] })
}
Usage:
const map = { "a":"1", "b":"2", "c":"2" };
const reversedMap = reverseMap(map);
console.log(reversedMap);
Prints:
{ "1":["a"], "2":["b", "c"] }
Rewriting the answer of @Vaidd4, but using Object.assign (instead of the comma operator):
/**
 * Swap object keys and values
 * @param {Object<*>} obj
 * @returns {Object<string>}
 */
function swapObject(obj) {
  return Object.keys(obj).reduce((r, key) => (Object.assign(r, {
    [obj[key]]: key,
  })), {});
}
Or, shorter:
Object.keys(obj).reduce((r, key) => (Object.assign(r, {[obj[key]]: key})), {});
function myFunction(obj) {
  return Object.keys(obj).reduce((acc, cur) => {
    return { ...acc, [obj[cur]]: cur };
  }, {});
}
This is the solution that I'm using:
function objSwap(obj, tl = false) {
return Object.entries(obj).reduce((a, [k, v]) => (a[v = tl ? v.toLowerCase() : v] = k = tl ? k.toLowerCase() : k, a), {});
}
As a bonus: if you need to lowercase the keys and values while swapping, I added the option to do so - simply set tl = true. If you don't need that, use the simpler version:
function objSwap(obj) {
return Object.entries(obj).reduce((a, [k, v]) => (a[v] = k, a), {});
}
Using a for...of loop:
let obj = {A : 1, B : 2, C : 3, D : 4}

for (let [key, value] of Object.entries(obj)){
  obj[value] = key
  delete obj[key]
}

console.log(obj) // {1: 'A', 2: 'B', 3: 'C', 4: 'D'}
One of the ES6 ways:
const invertObject = (object) =>Object.entries(object).reduce((result, value) => ({...result, [value[1]]: value[0] }), {});
let obj = invertObject({A : 1, B : 2, C : 3, D : 4});
Here's a type-safe way using TypeScript that has not been suggested before. This solution takes two generics, which means the return type will be typed as expected. It's faster than approaches using .reduce or Object.entries.
// Easier way to type `symbol | number | string` (the only valid keys of an object)
export type AnyKey = keyof any;
export function mirror<K extends AnyKey, V extends AnyKey>(
  object: Record<K, V>,
) {
  const ret: Partial<Record<V, K>> = {};
  for (const key in object) {
    ret[object[key]] = key;
  }
  return ret as Record<V, K>;
}
Usage:
const obj = mirror({
a: 'b',
c: 'd',
});
// {b: 'a', d: 'c'}
obj;
Modern JS solution:
const swapKeyValue = (object) =>
  Object.entries(object).reduce((swapped, [key, value]) => (
    { ...swapped, [value]: key }
  ), {});
Typescript:
type ValidKey = number | string;

const swapKeyValue = <K extends ValidKey, V extends ValidKey>(
  object: Record<K, V>
): Record<V, K> =>
  Object.entries(object)
    .reduce((swapped, [key, value]) => (
      { ...swapped, [value as ValidKey]: key }
    ), {} as Record<V, K>);
I believe it's better to do this task by using an npm module, like invert-kv.
invert-kv: Invert the key/value of an object. Example: {foo: 'bar'} → {bar: 'foo'}
https://www.npmjs.com/package/invert-kv
const invertKv = require('invert-kv');
invertKv({foo: 'bar', unicorn: 'rainbow'});
//=> {bar: 'foo', rainbow: 'unicorn'}
