Dynamic keys and MongoDB confusion - JavaScript

let time = "12:00";

{
  "10:00": {
    name: "john",
    status: "registered"
  },
  "11:00": {
    name: "jane",
    status: "pending"
  },
  "12:00": {
    name: "joe",
    status: "denied"
  }
}
How can I find() data in MongoDB by a dynamically changing key variable (time, in this case)?
Also, how can I access the name key afterwards as in plain JavaScript (someVar[time].name)? With the results of find().toArray() I can't really do that. Reading storedJson[2].name after storedJson = JSON.parse(JSON.stringify(result)), where result comes from dbo.collection(someCollection).find({}, { projection: { _id: 0 } }).toArray((err, result) => ...), seems a bit off and not how this should be done. (I'm also curious whether saving data this way is unwise, and would therefore be open to a suggested alternative.)
EDIT:
{
  time: [time],
  content: {
    name: "john",
    status: "pending"
  }
}
This doesn't really solve it either, since I'm struggling to grasp .toArray(), which forces me to do someVar[0].content.name instead of someVar[time].content.name.
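One way to get back the someVar[time].name access pattern is a minimal sketch like the following: store one document per time slot, then re-key the array that find().toArray() returns into a plain lookup object. The collection name "slots" and the driver calls shown in the comments are assumptions for illustration; the re-keying itself is plain JavaScript.

```javascript
// Sketch: one document per time slot instead of one big keyed object.
const time = "12:00";

// With the MongoDB Node.js driver you could then query the dynamic value directly:
//   const doc = await dbo.collection("slots").findOne({ time });
//   console.log(doc.name);

// Or, after fetching everything with find().toArray(), rebuild a lookup
// object keyed by the `time` field so plain bracket access works again.
const results = [ // stand-in for `await find(...).toArray()`
  { time: "10:00", name: "john", status: "registered" },
  { time: "11:00", name: "jane", status: "pending" },
  { time: "12:00", name: "joe",  status: "denied" },
];

const byTime = Object.fromEntries(results.map(doc => [doc.time, doc]));

console.log(byTime[time].name); // "joe"
```

Storing the slots as separate documents (rather than one document with dynamic keys) also lets MongoDB index and query the time field normally.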

Related

Object with values as keys that store each individual object related to it using reduce: Javascript

Just working through a problem and wondering if anyone could shed some light on an issue I'm having. I'm supposed to take an array of objects (a set of parks), take a key from an object nested within each one, use that as a new key, and store under it an array of every object that shares the value we pulled for the key. Here's an example:
I'm working with data that looks a lot like this:
const myParks = [
  {
    name: "Garden of the Gods",
    areaInSquareKm: 200,
    location: {
      state: "Colorado"
    }
  },
  {
    name: "Acadia",
    areaInSquareKm: 198.6,
    location: {
      state: "Maine"
    }
  },
  {
    name: "Mountain Pass",
    areaInSquareKm: 400.6,
    location: {
      state: "Colorado"
    }
  },
  {
    name: "Newleaf Forest",
    areaInSquareKm: 150.4,
    location: {
      state: "Maine"
    }
  },
];
And I need to take the location.state value, use each distinct state as a key of a new object, then assign each key an array of the parks in that state. If the state is Maine, the key should contain every park object with that state:
{
  "Maine": [
    {
      name: "Acadia",
      areaInSquareKm: 198.6,
      location: {
        state: "Maine"
      }
    },
    {
      name: "Newleaf Forest",
      areaInSquareKm: 150.4,
      location: {
        state: "Maine"
      }
    },
  ]
}
I was trying to do it like so:
function parkByState(parks) {
  let stateParks = parks.reduce((result, park) => {
    result[park.location.state] = parks.filter(park => { park.location.state })
  }, {});
  return stateParks
};
I can't for the life of me figure out how to do this properly. I can't change the structure, they MUST be assembled within an object, with the state names as keys, and have an array containing each individual park if the state matches the key. I just seriously don't know what I'm doing wrong. I keep getting an error:
Uncaught TypeError: Cannot set properties of undefined (setting 'Maine')
at <anonymous>:37:35
at Array.reduce (<anonymous>)
at parkByState (<anonymous>:36:26)
at <anonymous>:42:18
I'm relatively new to JavaScript and would love if anyone can shed some light on this. I needed to use the reduce() method to assemble the object. I figured using filter to filter the objects into each key would have made sense, but I can't seem to get the two methods to work together properly. You don't have to feed me the answer, but at the very least, please tell me what I'm doing wrong here. I've tried several different ways to do this and none seem to work. If I'm missing a key point of how to use these methods, please let me know what I can do to better understand what they are trying to do. I really appreciate all the help and I apologize if this is a dumb question. I'm just trying to understand how to reassemble an array of each park, stored in a key that is pulled from the state of each park. I'm sorry if the question isn't worded the absolute best, I really want to get better at this kind of stuff so I'm just trying to understand what I'm doing wrong here and how I can achieve the desired result.
Thank you in advance.
Without reduce we can do:
const myParks = [
  {
    name: "Garden of the Gods",
    areaInSquareKm: 200,
    location: {
      state: "Colorado"
    }
  },
  {
    name: "Acadia",
    areaInSquareKm: 198.6,
    location: {
      state: "Maine"
    }
  },
  {
    name: "Mountain Pass",
    areaInSquareKm: 400.6,
    location: {
      state: "Colorado"
    }
  },
  {
    name: "Newleaf Forest",
    areaInSquareKm: 150.4,
    location: {
      state: "Maine"
    }
  },
];
const stateNames = new Set(myParks.map(area => {
  return area.location.state;
}));

const collectionsByStateName = {};
[...stateNames].forEach(stateName => {
  collectionsByStateName[stateName] = [];
  myParks.forEach(area => {
    if (stateName === area.location.state) {
      collectionsByStateName[stateName].push(area);
    }
  });
});

console.log(collectionsByStateName);
The callback that you pass to .reduce() should return a value. When you give an arrow function a body {}, it must include return if you wish to return a value from it; otherwise, it will return undefined. Because you don't return anything from your reduce callback, it currently returns undefined, so on the next iteration result will be undefined. This causes your error, as you're trying to set a property of undefined, which you can't do. Moreover, your .filter() callback needs to return a truthy value when you want to keep an item, and a falsy value when you want to discard it.
While you could modify your code to get the filter to work, I recommend not filtering at all. Instead, check the state of the current park: if that state is already a key in result, push to the existing array; otherwise, create a new key in result for that state with a new array holding the current park:
const myParks = [ { name: "Garden of the Gods", areaInSquareKm: 200, location: { state: "Colorado" } }, { name: "Acadia", areaInSquareKm: 198.6, location: { state: "Maine" } }, { name: "Mountain Pass", areaInSquareKm: 400.6, location: { state: "Colorado" } }, { name: "Newleaf Forest", areaInSquareKm: 150.4, location: { state: "Maine" } }, ];
function parkByState(parks) {
  let stateParks = parks.reduce((result, park) => {
    const key = park.location.state;
    result[key] ??= []; // set `key` to an empty array if it doesn't exist (similar to result[key] = result[key] || [];)
    result[key].push(park);
    return result;
  }, {});
  return stateParks;
}
console.log(parkByState(myParks));
As you said that you're new to JavaScript, I would do away with .reduce() here, though. This particular example doesn't really need .reduce() and can be made more readable with a for loop. I've also replaced the nullish coalescing assignment with an if statement, which might read more clearly if you're unfamiliar with ??:
function parkByState(parks) {
  const stateParks = {};
  for (const park of parks) {
    const key = park.location.state;
    if (!stateParks[key]) {
      stateParks[key] = [];
    }
    stateParks[key].push(park);
  }
  return stateParks;
}

How to change the location of an object key value pair in JavaScript

I've seen similar questions to this one but in different languages and I am struggling to create a JavaScript equivalent.
I am receiving an object and through a function I want to change the location of one (or more) of the properties. For example,
With the original object of
{
  individual: [
    {
      dob: '2017-01-01',
      isAuthorized: true,
    },
  ],
  business: [
    {
      taxId: '123',
    },
  ],
  product: {
    code: '123',
  },
}
I would like to change the location of isAuthorized to be in the first object inside of the business array instead of individual.
Like so
{
  individual: [
    {
      dob: '2017-01-01',
    },
  ],
  business: [
    {
      taxId: '123',
      isAuthorized: true,
    },
  ],
  product: {
    code: '123',
  },
}
So far I was trying to create an object that would contain the key name and location to change it to, e.g.
{
  isAuthorized: obj.business[0]
}
And then loop over the original object as well as the object with the location values and then set the location of that key value pair.
Basically, in this function I want to see that if the original object contains a certain value (in this case isAuthorized) that it will take that key value pair and move it to the desired location.
What you want can easily be achieved using lodash; here's a working snippet that restructures the object based on a defined structure map. Extend this example to match what you want.
The example does a deep clone; if you are fine with modifying the original object, skip that step to avoid the overhead.
// input data
const data = {
  individual: [
    {
      dob: '2017-01-01',
      isAuthorized: true,
    },
  ],
  business: [
    {
      taxId: '123',
    },
  ],
  product: {
    code: '123',
  },
};

// the structure change map
const keyMap = {
  'individual[0].isAuthorized': 'business[0].isAuthorized'
};

function parseData(data, keyMap) {
  const newData = _.cloneDeep(data);
  for (let [source, dest] of Object.entries(keyMap)) {
    _.set(newData, dest, _.get(newData, source));
    _.unset(newData, source);
  }
  return newData;
}

console.log(parseData(data, keyMap));
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.15/lodash.min.js"></script>
Note: lodash's _.set considers any numeric path segment to be an array index, so if you are using a numeric object key, use _.setWith instead. I recommend reading the examples in the docs for a better understanding.
https://lodash.com/docs/4.17.15#set
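If pulling in lodash only for this feels heavy, here is a minimal vanilla-JS sketch of the same idea. It handles only simple dot/bracket paths like "business[0].isAuthorized" (not lodash's full path grammar), assumes every intermediate segment already exists, and mutates the object in place rather than cloning; the helper names are my own.

```javascript
// Turn "business[0].isAuthorized" into ["business", "0", "isAuthorized"].
const toSegments = path => path.replace(/\[(\d+)\]/g, ".$1").split(".");

// Read the value at a path (undefined if any segment is missing).
const getAt = (obj, path) =>
  toSegments(path).reduce((cur, key) => cur?.[key], obj);

// Move a value from sourcePath to destPath, mutating obj in place.
function moveKey(obj, sourcePath, destPath) {
  const srcSegs = toSegments(sourcePath);
  const dstSegs = toSegments(destPath);
  const value = getAt(obj, sourcePath);

  // Attach the value at the destination...
  const dstParent = dstSegs.slice(0, -1).reduce((cur, k) => cur[k], obj);
  dstParent[dstSegs[dstSegs.length - 1]] = value;

  // ...and delete it from the source.
  const srcParent = srcSegs.slice(0, -1).reduce((cur, k) => cur[k], obj);
  delete srcParent[srcSegs[srcSegs.length - 1]];
  return obj;
}

const data = {
  individual: [{ dob: "2017-01-01", isAuthorized: true }],
  business: [{ taxId: "123" }],
};
moveKey(data, "individual[0].isAuthorized", "business[0].isAuthorized");
console.log(data.business[0].isAuthorized); // true
```

For anything beyond trivial paths (missing intermediate objects, numeric object keys, cloning), the lodash version above is the safer choice.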

Deep Object Comparison and Property Targeting in JavaScript

I am trying to find out if there are any ES6 (or external library) ways to handle deep object comparison and parsing in JavaScript.
Take the following example, where I have a property history, which is an array, embedded within a property services, which is also an array:
{
  _id: ObjectId("4d39fe8b23dac43194a7f571"),
  name: {
    first: "Jane",
    last: "Smith"
  },
  services: [
    {
      service: "typeOne",
      history: [
        {
          _id: 121,
          completed: true,
          title: "rookie"
        },
        {
          _id: 122,
          completed: false,
          title: "novice"
        }
      ]
    },
    {
      service: "typeTwo",
      history: [
        {
          _id: 135,
          completed: true,
          title: "rookie"
        },
        {
          _id: 136,
          completed: false,
          title: "novice"
        }
      ]
    }
  ]
}
Now, say a new element is pushed onto the "history" array within the second "services" element, where (service : "typeTwo") -- on the "services" array. I need to identify that's happened, and pull out the entire parent element, because I also need to know what "service" within the "services" array had a new "history" element added.
Is there a way I can scan this entire object and not only determine when something's changed, but actually be able to pull out the section I need reference to? I'm open to either a native JS or JS library option here.
You can check for duplicates like this:
function isEqual(firstObject, secondObject) {
  function _equals(a, b) {
    let clone = { ...a }, cloneStr = JSON.stringify(clone);
    return cloneStr === JSON.stringify({ ...clone, ...b });
  }
  return _equals(firstObject, secondObject) && _equals(secondObject, firstObject);
}
https://jsfiddle.net/b1puL04w/
If you are considering libraries, as stated, then lodash has _.isEqual, which performs a deep comparison between two values to determine whether they are equal.
I have used it extensively for deep comparison in the past.
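For reference, a small hand-rolled recursive version of the same idea is below. Unlike the JSON.stringify trick, it does not depend on key order; note this is only a sketch and does not handle cycles, Dates, Maps, or Sets (lodash's _.isEqual covers those cases).

```javascript
// Recursive deep-equality sketch: primitives by ===, objects/arrays by
// comparing key sets and recursing into each value.
function deepEqual(a, b) {
  if (a === b) return true;
  if (typeof a !== "object" || typeof b !== "object" || a === null || b === null) {
    return false;
  }
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  return keysA.every(key => deepEqual(a[key], b[key]));
}

console.log(deepEqual({ x: 1, y: [2, 3] }, { y: [2, 3], x: 1 })); // true
console.log(deepEqual({ x: 1 }, { x: "1" }));                     // false
```

To also find *which* part of a nested document changed (the services/history case above), you would recurse the same way but record the path at which the first mismatch occurs instead of returning a boolean.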

How to access data in an array of objects in another file

Basically, I'm working on an HTML page where I have to print out text that's stored in another .js file. The thing is, the text is stored in an object... in an array... in a function... in a variable. So there's lots of digging to be done, just to access ONE piece of data.
Here's the data (or something similar) in the accompanying .js file:
var TestDataSet = (function() {
  var reviews = [
    {
      Id: "abcd1234",
      Title: "This Is Title Text",
      Number: 5,
      Body: "text",
      CreateDate: new Date(2012, 5, 23, 14, 12, 10, 0),
      Owner: {
        Id: "Person1234",
        Name: "James Smith",
      }
    },
  ]
How would I make the browser return "abcd1234" from the First ID? How about "Person1234" from the nested ID?
The best I've got so far is this:
var data1 = new reviews;
console.log(data1.reviews[0].Id);
But this does nothing. I get a whole lot of "not defined" errors.
Apparently you haven't posted your complete function.
However, I suggest returning the array from the function, as in a getter, and using that if it is safe for the rest of your script:
var TestDataSet = (function() {
  var reviews = [
    {
      Id: "abcd1234",
      Title: "This Is Title Text",
      Number: 5,
      Body: "text",
      CreateDate: new Date(2012, 5, 23, 14, 12, 10, 0),
      Owner: {
        Id: "Person1234",
        Name: "James Smith",
      }
    },
  ];
  return reviews;
});
Then you can access the data by calling the function:
TestDataSet()[0].Id
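Put together, a runnable sketch of that suggestion looks like this; note that TestDataSet must be invoked as a function (TestDataSet()) rather than indexed directly, since the variable holds the function itself, not the array:

```javascript
// The asker's data, wrapped in a function that returns the array.
var TestDataSet = (function () {
  var reviews = [
    {
      Id: "abcd1234",
      Title: "This Is Title Text",
      Number: 5,
      Body: "text",
      CreateDate: new Date(2012, 5, 23, 14, 12, 10, 0),
      Owner: { Id: "Person1234", Name: "James Smith" },
    },
  ];
  return reviews;
});

console.log(TestDataSet()[0].Id);       // "abcd1234"
console.log(TestDataSet()[0].Owner.Id); // "Person1234"
```

Alternatively, adding () after the closing brace (an IIFE: `})();`) makes TestDataSet hold the array itself, so `TestDataSet[0].Id` works without the call.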

MongoDB mapReduce method unexpected results

I have 100 documents in my MongoDB, and each of them may be a duplicate of other document(s) under different conditions, such as firstName & lastName, email, and mobile phone.
I am trying to mapReduce these 100 documents to have the key-value pairs, like grouping.
Everything works fine until I have the 101st duplicate records in the DB.
The output of the mapReduce result for the other documents which are duplicate with the 101st records are corrupted.
For example:
I am working on firstName & lastName now.
When the DB contains 100 documents, I can have the result containing
{
  _id: {
    firstName: "foo",
    lastName: "bar",
  },
  value: {
    count: 20,
    duplicate: [{
      id: ObjectId("/*an object id*/"),
      fullName: "foo bar",
      DOB: ISODate("2000-01-01T00:00:00.000Z")
    }, {
      id: ObjectId("/*another object id*/"),
      fullName: "foo bar",
      DOB: ISODate("2000-01-02T00:00:00.000Z")
    }, ...]
  },
}
It is what exactly I want, but...
when the DB contains more than 100 possible duplicated documents, the result became like this,
Let's say the 101st document is
{
  firstName: "foo",
  lastName: "bar",
  email: "foo@bar.com",
  mobile: "019894793"
}
containing 101 documents:
{
  _id: {
    firstName: "foo",
    lastName: "bar",
  },
  value: {
    count: 21,
    duplicate: [{
      id: undefined,
      fullName: undefined,
      DOB: undefined
    }, {
      id: ObjectId("/*another object id*/"),
      fullName: "foo bar",
      DOB: ISODate("2000-01-02T00:00:00.000Z")
    }]
  },
}
containing 102 documents:
{
  _id: {
    firstName: "foo",
    lastName: "bar",
  },
  value: {
    count: 22,
    duplicate: [{
      id: undefined,
      fullName: undefined,
      DOB: undefined
    }, {
      id: undefined,
      fullName: undefined,
      DOB: undefined
    }]
  },
}
I found another topic on Stack Overflow with a similar issue to mine, but the answer does not work for me:
MapReduce results seem limited to 100?
Any ideas?
Edit:
Original source code:
var map = function () {
  var value = {
    count: 1,
    userId: this._id
  };
  emit({ lastName: this.lastName, firstName: this.firstName }, value);
};

var reduce = function (key, values) {
  var reducedObj = {
    count: 0,
    userIds: []
  };
  values.forEach(function (value) {
    reducedObj.count += value.count;
    reducedObj.userIds.push(value.userId);
  });
  return reducedObj;
};
Source code now:
var map = function () {
  var value = {
    count: 1,
    users: [this]
  };
  emit({ lastName: this.lastName, firstName: this.firstName }, value);
};

var reduce = function (key, values) {
  var reducedObj = {
    count: 0,
    users: []
  };
  values.forEach(function (value) {
    reducedObj.count += value.count;
    reducedObj.users = reducedObj.users.concat(values.users); // or using the forEach method
    // value.users.forEach(function (user) {
    //   reducedObj.users.push(user);
    // });
  });
  return reducedObj;
};
I don't understand why it would fail as I was also pushing a value (userId) to reducedObj.userIds.
Are there some problems about the value that I emitted in map function?
Explaining the problem
This is a common mapReduce trap, but clearly part of the problem you have here is that the questions you are finding don't have answers that explain this clearly or even properly. So an answer is justified here.
The point in the documentation that is often missed or at least misunderstood is here in the documentation:
MongoDB can invoke the reduce function more than once for the same key. In this case, the previous output from the reduce function for that key will become one of the input values to the next reduce function invocation for that key.
And adding to that just a little later down the page:
the type of the return object must be identical to the type of the value emitted by the map function.
What this means in the context of your question is that at a certain point there are "too many" duplicate key values being passed in for a reduce stage to act on this in one single pass as it will be able to do for a lower number of documents. By design the reduce method is called multiple times, often taking the "output" from data that is already reduced as part of it's "input" for yet another pass.
This is how mapReduce is designed to handle very large datasets: by processing everything in "chunks" until it finally "reduces" down to a single grouped result per key. This is why the second documentation statement matters: what comes out of emit and what the reduce function returns need to be structured exactly the same in order for the reduce code to handle it correctly.
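The "previous output becomes the next input" behaviour can be simulated in plain JavaScript, which makes the failure mode visible. This is only a sketch of the server's chunking, not MongoDB code; the chunk size of 100 and the document shapes are illustrative. Because the emitted shape and the reduced shape are identical here, re-reducing works; with the original code, the second pass would read value.userId from an object that only has userIds, yielding the undefined values seen above.

```javascript
// Reduce values of shape { count, duplicate: [doc] } -- the same shape it
// returns, which is the invariant MongoDB requires.
function reduceFn(key, values) {
  const reduced = { count: 0, duplicate: [] };
  values.forEach(value => {
    reduced.count += value.count;
    value.duplicate.forEach(doc => reduced.duplicate.push(doc));
  });
  return reduced;
}

// 150 emitted values for the same key, reduced in chunks of 100 the way the
// server might do it: the first chunk's OUTPUT joins the second chunk's input.
const emitted = Array.from({ length: 150 }, (_, i) => ({
  count: 1,
  duplicate: [{ id: i }],
}));
const firstPass = reduceFn("foo bar", emitted.slice(0, 100));
const finalResult = reduceFn("foo bar", [firstPass, ...emitted.slice(100)]);

console.log(finalResult.count);            // 150
console.log(finalResult.duplicate.length); // 150
```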
Solving the problem
You correct this by fixing both how you emit the data in the map function and how you return and process it in the reduce function:
db.collection.mapReduce(
  function () {
    emit(
      { "firstName": this.firstName, "lastName": this.lastName },
      { "count": 1, "duplicate": [this] } // Note [this]
    );
  },
  function (key, values) {
    var reduced = { "count": 0, "duplicate": [] };
    values.forEach(function (value) {
      reduced.count += value.count;
      value.duplicate.forEach(function (duplicate) {
        reduced.duplicate.push(duplicate);
      });
    });
    return reduced;
  },
  {
    "out": { "inline": 1 }
  }
)
The key points can be seen in both the content to emit and the first line of the reduce function. Essentially these present a structure that is the same. In the case of the emit it does not matter that the array being produced only has a singular element, but you send it that way anyhow. Side by side:
{ "count": 1, "duplicate": [this] } // Note [this]
// Same as
var reduced = { "count": 0, "duplicate": [] };
That also means that the remainder of the reduce function will always assume that the "duplicate" content is in fact an array, because that is how it came as original input and is also how it will be returned:
values.forEach(function (value) {
  reduced.count += value.count;
  value.duplicate.forEach(function (duplicate) {
    reduced.duplicate.push(duplicate);
  });
});
return reduced;
Alternate Solution
The other reason for an answer is that, considering the output you are expecting, this would in fact be much better suited to the aggregation framework. It's going to do this a lot faster than mapReduce can, and it is far simpler to code up:
db.collection.aggregate([
  { "$group": {
    "_id": { "firstName": "$firstName", "lastName": "$lastName" },
    "duplicate": { "$push": "$$ROOT" },
    "count": { "$sum": 1 }
  }},
  { "$match": { "count": { "$gt": 1 } } }
])
That's all it is. You can write out to a collection by adding an $out stage where required. But with either mapReduce or aggregate, you are still subject to the same 16MB document-size limit when pushing your "duplicate" items into an array.
Also note that you can simply do something that mapReduce cannot here, and just "omit" any items that are not in fact a "duplicate" from the results. The mapReduce method cannot do this without first producing output to a collection and then "filtering" the results in a separate query.
That core documentation itself quotes:
NOTE
For most aggregation operations, the Aggregation Pipeline provides better performance and more coherent interface. However, map-reduce operations provide some flexibility that is not presently available in the aggregation pipeline.
So it's really a case of weighing up which is better suited to the problem at hand.
