Intersecting two objects in Angular 2 - JavaScript

What I want to do is intersect two objects.
I want to compare the objects, and if they have same values on same keys, just add them to another object.
obj1 = { "Projects": [ "test" ], "Companies": [ "facebook", "google", "yahoo" ], "Locations": [ "LA", "NY" ], "Interests": [] }
obj2 = { "Projects": [ "test" ], "Companies": [ "netflix", "skype", "facebook" ], "Locations": [ "sttugart", "torino", "LA" ], "Interests": [] }
The result will be:
obj3 = { "Projects": [ "test" ], "Companies": [ "facebook" ], "Locations": [ "LA" ], "Interests": [] }
What I tried is something like this:
intersect(obj1, obj2)
for(let key of obj1)
if(obj2[key] == obj1[key]) obj3[key] = obj2[key];
And yes, I did check SO for other solutions, but had no result.
EDIT
My attempt probably didn't work because my object is not an array or a string.

This isn't really a problem specific to Angular 2 but rather JavaScript itself; no Angular functions are likely to help you here.
Using lodash or underscore.js might prove to be more productive and useful.
However, if you insist on doing this your own way, there are two cases:
One is that you already know how many objects you will be comparing.
Two is that you don't know how many objects you will be comparing.
For case one, a simple for loop with && conditions for the logical comparisons would do.
For case two, I would suggest you first push all the objects that need to be compared into an array and iterate from there.
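For the two-object case above, a rough plain-JavaScript sketch might look like this (it assumes every key holds an array of primitive values, as in the example objects):
  function intersect(obj1, obj2) {
    var obj3 = {};
    Object.keys(obj1).forEach(function (key) {
      var other = obj2[key] || [];
      // keep only the values that also appear under the same key in obj2
      obj3[key] = (obj1[key] || []).filter(function (value) {
        return other.indexOf(value) !== -1;
      });
    });
    return obj3;
  }
  intersect(obj1, obj2);
  // => { "Projects": ["test"], "Companies": ["facebook"], "Locations": ["LA"], "Interests": [] }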

Use lodash
Here you will find good documentation:
https://lodash.com/docs/4.16.2#intersection
We use it often and have had good experience with it.
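Applied key by key to the objects from the question, a sketch could look like this (it assumes lodash 4.x and that every key holds an array):
  // build obj3 by intersecting each key's array with the matching array in obj2
  var obj3 = _.mapValues(obj1, function (values, key) {
    return _.intersection(values, obj2[key] || []);
  });
  // => { "Projects": ["test"], "Companies": ["facebook"], "Locations": ["LA"], "Interests": [] }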

Related

Is there a way to update items in an array with JsonPatch?

The API to be invoked uses JsonPatch. The following is a sample JSON.
{ "hello": false
, "array1":
[ { "subarray": [ "k2", "k1"] }
, { "subarray": [ "k1"] }
]
}
I would like to update both subarrays (the elements of array1). There could be any number of elements/items in array1, and I am not aware of how many when calling this API.
Now, I could do the following if I were aware of the size of array1.
[{ "op": "add", "path": "/array1/0/subarray/0", "value": "gk" }]
[{ "op": "add", "path": "/array1/1/subarray/0", "value": "gk" }]
But since I'm not aware of the size of array1, it does not seem that this can be achieved using JsonPointer. Is there something that can be done to update all the elements of array1 (i.e. all the subarrays) in one go? Something like this:
[{ "op": "add", "path": "/array1/*/subarray/0", "value": "gk-new" }]
After invocation, each subarray should then contain the additional element "gk-new" on top of what it already has.
There is no wildcard support in JsonPatch or JsonPointer. Therefore, what is asked in the question is not possible.
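A common workaround, outside the JsonPatch spec itself, is to fetch the target document first and generate one patch operation per element; a rough JavaScript sketch (the helper name buildAddOps is made up for illustration):
  // Read the document, then emit one "add" op per element of array1.
  function buildAddOps(doc, value) {
    return doc.array1.map(function (_, i) {
      return { op: "add", path: "/array1/" + i + "/subarray/0", value: value };
    });
  }
  // buildAddOps(sampleDoc, "gk-new") yields:
  // [ { op: "add", path: "/array1/0/subarray/0", value: "gk-new" },
  //   { op: "add", path: "/array1/1/subarray/0", value: "gk-new" } ]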

JSON list optimization

I want to create a JSON API that returns a list of objects. Each object has an id, a name and some other information. API is consumed using JavaScript.
The natural option for my JSON output seems to be:
"myList": [
{
"id": 1,
"name": "object1",
"details": {}
},
{
"id": 2,
"name": "object2",
"details": {}
},
{
"id": 3,
"name": "object3",
"details": {}
},
]
Now let's imagine that I use my API to get all the objects but want to first do something with id2 then something else with id1 and id3.
Then I may be interested to be able to directly get the object for a specific id:
"myList": {
"1": {
"name": "object1",
"details": {}
},
"2": {
"name": "object2",
"details": {}
},
"3": {
"name": "object3",
"details": {}
},
}
This second option may be less natural when somewhere else in the code I want to simply loop through all the elements.
Is there a good practice for these use cases, when the API is used both for looping through all elements and sometimes for using specific elements only (without doing a dedicated call for each element)?
In your example you've changed the ID value from 1 to id1. This would make operating on the data a bit annoying, because you have to add and remove id all the time.
If you didn't do that, and you were relying on the sorted order of the object, you may be in for a surprise, depending on JS engine:
var source = JSON.stringify({z: "first", a: "second", 0: "third"});
var parsed = JSON.parse(source);
console.log(Object.keys(parsed));
// ["0", "z", "a"]
My experience is to use arrays at the transport layer and index the data (i.e. convert the array to a map) when required.
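For example, a small sketch of that indexing step on the client, assuming the array form from the first option:
  // Keep the array as the transport format; build an id -> object map when needed.
  var byId = {};
  myList.forEach(function (item) {
    byId[item.id] = item;
  });
  // byId[2].name === "object2", while myList is still available for plain loops.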

lodash filter by property array of array

I have an array of users, each of whom has a 'rights' array property, and I want to filter out the users who have specific rights. I would like to filter by an array: for example, all users with full rights (['full']), or users with either full or edit rights (['full','edit']). I am fairly new to lodash; I think I can chain something together, but I am not sure whether there are more efficient ways of doing it.
Here is my plunker: http://plnkr.co/edit/5PCvaDJaXF4uxRowVBlK?p=preview
Result ['full'] :
[{
  "name": "Company1 Admin",
  "rights": [
    "full"
  ]
},
{
  "name": "FullRights Company1",
  "rights": [
    "full", "review"
  ]
}]
Result ['full','edit']:
[{
  "name": "Company1 Admin",
  "rights": [
    "full"
  ]
},
{
  "name": "FullRights Company1",
  "rights": [
    "full", "review"
  ]
},
{
  "name": "EditRights Company1",
  "rights": [
    "edit"
  ]
}]
Code:
var users = [
  {
    "name": "Company1 Admin",
    "rights": [
      "full"
    ]
  },
  {
    "name": "FullRights Company1",
    "rights": [
      "full", "review"
    ]
  },
  {
    "name": "ApproveRights Company1",
    "rights": [
      "approve", "review"
    ]
  },
  {
    "name": "EditRights Company1",
    "rights": [
      "edit"
    ]
  },
  {
    "name": "ReviewRights Company1",
    "rights": [
      "review"
    ]
  },
  {
    "name": "NoRights Company1",
    "rights": [
      "none"
    ]
  }
];
var tUsers = [];
var filterRights = ['full', 'edit'];
_.forEach(users, function(user) {
  if (_.intersection(user.rights, filterRights).length > 0) {
    tUsers.push(user);
  }
});
//console.log('users', JSON.stringify(users, null, 2));
console.log('tUsers', JSON.stringify(tUsers, null, 2));
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/3.10.1/lodash.min.js"></script>
From the docs
_.filter(collection, predicate, thisArg);
Arguments
collection (Array|Object|string): The collection to iterate over.
[predicate=_.identity] (Function|Object|string): The function invoked per iteration.
[thisArg] (*): The this binding of predicate.
Chaining is great when you want to connect different processing steps.
If your problem statement was to
filter by rights
sort by oldest person
take 10
Then chaining would make a lot of sense.
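For illustration only, such a chain might look roughly like this (the age field is hypothetical and not part of the question's data; lodash 3.x chaining syntax):
  var oldestTen = _(users)
    .filter(function (user) {
      // custom rights check, as discussed below
      return _.intersection(user.rights, filterRights).length > 0;
    })
    .sortBy('age')   // hypothetical field
    .reverse()       // oldest first
    .take(10)
    .value();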
This problem, though, seems to be mostly custom logic on filtering.
var users = [/* Your user data here */];

function filterByRights (users, rights) {
  return _.filter(users, function (user) {
    return _.any(user.rights, function (right) {
      return _.contains(rights, right);
    });
  });
}

filterByRights(users, ['full', 'edit']); // [/* Users with full or edit rights */]
I think my example is good because it doesn't depend on conditional logic. It uses lodash-defined methods like any and contains.
Performance concerns
I want to expand on what performance concerns you have. Here are a couple of points.
Your question code maintains its own mechanism for filtering out users. While it is a perfectly good solution, you should let the people who maintain lodash handle this logic; they have probably spent a lot of time optimizing how to create a new array from an original one.
_.any is more efficient than _.intersection. _.intersection needs to process every element to know what the intersection is, whereas _.any stops as soon as it hits the first element that passes the predicate; otherwise it checks each of them. This point is minor, since there is only a small number of "rights".
The example I've given is probably more "lodash standard". You can typically do data transformations entirely with lodash-defined methods and trivial predicates.
Here is an update to t3dodson's answer. You should now use the following snippet if you are on a current (4.17.4) lodash version:
function filterByRights (users, rights) {
  return _.filter(users, function (user) {
    return _.some(user.rights, function (right) {
      return _.includes(rights, right);
    });
  });
}
From the Changelog:
Removed _.contains in favor of _.includes
Removed _.any in favor of _.some
I think you were on the right path with intersection() (I've never seen any performance issues with this function). Here's how I would compose an iteratee using flow():
_.filter(users, _.flow(
  _.property('rights'),
  _.partial(_.intersection, filterRights),
  _.size
));
The property() function gets the rights property and passes it to intersection(). We've already partially applied the filterRights array. Lastly, the size() function is necessary to pass a truthy/falsy value to filter().

Access anonymous array using NgRepeat

I have a multi-level array containing some objects at its deepest level.
[
  [
    [
      "FUND",
      {
        "totassets": 10.9,
        "totdate": "2015-03-23",
        "expratiogross": 1.35,
        "exprationet": 1.08
      }
    ],
    [
      "DAILY",
      {
        "navdate": "2015-03-23",
        "nav": 10.05,
        "chgamt": 0,
        "chgpct": 0,
        "pop": 10.05,
        "ytdreturn": 2.03,
        "curr7dayyield": 0,
        "eff7dayyield": 0,
        "unsub7dayyield": 0,
        "30dayyield": 0,
        "30dayloadyield": 0
      }
    ]
  ]
]
I would like to use ngRepeat to display all the items in "FUND" or "DAILY" but I'm unsure how to access objects this deep without names for each of the arrays above.
Sorry if this is a basic question but I wasn't able to find an answer elsewhere.
You'll want to get the first element of your two outer arrays.
$scope.obj = [
  [
    [
      "FUND",
      {
        "totassets": 10.9,
        "totdate": "2015-03-23",
        "expratiogross": 1.35,
        "exprationet": 1.08
      }
    ],
    [
      "DAILY",
      {
        "navdate": "2015-03-23",
        "nav": 10.05,
        "chgamt": 0,
        "chgpct": 0,
        "pop": 10.05,
        "ytdreturn": 2.03,
        "curr7dayyield": 0,
        "eff7dayyield": 0,
        "unsub7dayyield": 0,
        "30dayyield": 0,
        "30dayloadyield": 0
      }
    ]
  ]
];
<div ng-repeat="el in obj[0]">
  <!-- el[0] is the label ("FUND" / "DAILY"), el[1] is the detail object -->
  <span>totassets: {{el[1].totassets}}</span>
  <span>navdate: {{el[1].navdate}}</span>
</div>
An issue you have with the array is that even when you ignore the outer arrays, you're still left with two individual arrays a la:
[
  "FUND",
  {
    "totassets": 10.9,
    "totdate": "2015-03-23",
    "expratiogross": 1.35,
    "exprationet": 1.08
  }
],
And:
[
  "DAILY",
  {
    "navdate": "2015-03-23",
    "nav": 10.05,
    "chgamt": 0,
    "chgpct": 0,
    "pop": 10.05,
    "ytdreturn": 2.03,
    "curr7dayyield": 0,
    "eff7dayyield": 0,
    "unsub7dayyield": 0,
    "30dayyield": 0,
    "30dayloadyield": 0
  }
]
So you will need two ngRepeat blocks to achieve what I assume you want, as well as going one level deeper to actually access the values you are after.
Here's a quick plnkr to demonstrate what I mean: http://plnkr.co/edit/ArCh8q8w2JoXsg107XwP?p=preview
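As an alternative (not what the plunker does, just a sketch), you could flatten the anonymous pairs into a keyed object in the controller, so the template can use named paths instead of array indexes:
  // Turn [["FUND", {...}], ["DAILY", {...}]] into { FUND: {...}, DAILY: {...} }
  $scope.data = {};
  angular.forEach($scope.obj[0], function (pair) {
    $scope.data[pair[0]] = pair[1]; // pair[0] is the label, pair[1] is the detail object
  });
  // The template can then use e.g. {{data.FUND.totassets}} and {{data.DAILY.navdate}}.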

MongoDb: How to get a field (sub document) from a document?

Consider this example collection:
{
  "_id": 0,
  "firstname": "Tom",
  "children": {
    "childA": {
      "toys": {
        "toy 1": "batman",
        "toy 2": "car",
        "toy 3": "train"
      },
      "movies": {
        "movie 1": "Ironman",
        "movie 2": "Deathwish"
      }
    },
    "childB": {
      "toys": {
        "toy 1": "doll",
        "toy 2": "bike",
        "toy 3": "xbox"
      },
      "movies": {
        "movie 1": "Frozen",
        "movie 2": "Barbie"
      }
    }
  }
}
Now I would like to retrieve ONLY the movies from a particular document.
I have tried something like this:
movies = users.find_one({'_id': 0}, {'_id': 0, 'children.ChildA.movies': 1})
However, I get the whole field structure from 'children' down to 'movies' and its content. How do I write a query that retrieves only the content of 'movies'?
To be specific I want to end up with this:
{
  "movie 1": "Frozen",
  "movie 2": "Barbie"
}
The problem here is that your current data structure is not really great for querying. This is mostly because you are using "keys" to actually represent "data points", and while that might initially seem like a logical idea, it is actually a very bad practice.
So rather than doing something like assigning "childA" and "childB" as keys of an object or "sub-document", you are better off assigning these as "values" of a generic key name, in a structure like this:
{
  "_id": 0,
  "firstname": "Tom",
  "children": [
    {
      "name": "childA",
      "toys": [
        "batman",
        "car",
        "train"
      ],
      "movies": [
        "Ironman",
        "Deathwish"
      ]
    },
    {
      "name": "childB",
      "toys": [
        "doll",
        "bike",
        "xbox"
      ],
      "movies": [
        "Frozen",
        "Barbie"
      ]
    }
  ]
}
It is not perfect, as there are nested arrays, which can be a potential problem, but there are workarounds for that too (more on this later). The main point is that this is a lot better than encoding data in "keys". The main problem with "keys" that are not consistently named is that MongoDB generally has no way to "wildcard" them, so you are stuck naming an "absolute path" in order to access elements, as in:
children -> childA -> toys
children -> childB -> toys
And that, in a nutshell, is bad when compared to this:
"children.toys"
Given the sample prepared above, I would say that is a much better approach to organizing your data.
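For example, with the restructured document a plain dot-path query can reach into the arrays (shell syntax):
  // Find parents where any child has the toy "batman"
  db.collection.find({ "children.toys": "batman" })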
Even so, just getting back something such as a "unique list of movies" is out of scope for standard .find() type queries in MongoDB. It actually requires something more like "document manipulation", which is well supported by MongoDB's aggregation framework. That has extensive manipulation capabilities that are not present in the query methods, and as a per-document response with the above structure you can do this:
db.collection.aggregate([
    # De-normalize the outer array content first
    { "$unwind": "$children" },
    # De-normalize the content of the inner array as well
    { "$unwind": "$children.movies" },
    # Group back (optionally) just the "movies" per document
    { "$group": {
        "_id": "$_id",
        "movies": { "$addToSet": "$children.movies" }
    }}
])
So now the "list" response in the document only contains the "unique" movies, which corresponds more to what you are asking. Alternately you could just $push instead and make a "non-unique" list. But stupidly that is actually the same as this:
db.collection.find({},{ "_id": False, "children.movies": True })
As a "collection wide" concept, then you could simplify this a lot by simply using the .distinct() method. Which basically forms a list of "distinct" keys based on the input you provide. This playes with arrays really well:
db.collection.distinct("children.toys")
And that is essentially a collection-wide analysis of all the "distinct" occurrences of each "toys" value in the collection, returned as a simple "array".
But as for your existing structure, it deserves a solution as well, though you really must understand that the solution is horrible. The problem here is that the "native", optimized operators available to general queries and the aggregation framework cannot be used at all, and the only option is JavaScript-based processing. Even though that is a little better thanks to the "v8" engine integration, it is still a complete slouch when compared side by side with native code methods.
So, working from the "original" form that you have (in JavaScript form, but the functions should be easy enough to translate):
db.collection.mapReduce(
    // Mapper: emit each toy value found under every child key
    function() {
        var id = this._id;
        var children = this.children;
        Object.keys(children).forEach(function(childKey) {
            var toys = children[childKey]["toys"];
            Object.keys(toys).forEach(function(toyKey) {
                emit(
                    id, { "toys": [ toys[toyKey] ] }
                );
            });
        });
    },
    // Reducer: merge the emitted single-toy arrays into one unique list
    function(key, values) {
        var output = { "toys": [] };
        values.forEach(function(value) {
            value.toys.forEach(function(toy) {
                if ( output.toys.indexOf( toy ) == -1 )
                    output.toys.push( toy );
            });
        });
        return output;
    },
    {
        "out": { "inline": 1 }
    }
)
So JavaScript evaluation is the "horrible" approach as this is much slower in execution, and you see the "traversing" code that needs to be implemented. Bad news for performance, so don't do it. Change the structure instead.
As a final part, you could model this differently to avoid the "nested array" concept. And understand that the only real problem with a "nested array" is that "updating" a nested element is really impossible without reading in the whole document and modifying it.
So the $push and $pull methods work fine. But using the "positional" $ operator just does not work, as the "outer" array index is always the "first" matched element. So if this really were a problem for you, then you could do something like this, for example:
{
  "_id": 0,
  "firstname": "Tom",
  "childtoys": [
    { "name": "childA", "toy": "batman" },
    { "name": "childA", "toy": "car" },
    { "name": "childA", "toy": "train" },
    { "name": "childB", "toy": "doll" },
    { "name": "childB", "toy": "bike" },
    { "name": "childB", "toy": "xbox" }
  ],
  "childMovies": [
    { "name": "childA", "movie": "Ironman" },
    { "name": "childA", "movie": "Deathwish" },
    { "name": "childB", "movie": "Frozen" },
    { "name": "childB", "movie": "Barbie" }
  ]
}
That would be one way to avoid the problem with nested updates if you did indeed need to "update" items on a regular basis rather than just $push and $pull items to the "toys" and "movies" arrays.
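For completeness, here is a sketch of the kind of positional update that the flattened structure makes possible (shell syntax; the new value "truck" is made up purely for illustration):
  // Rename childA's "car" toy to "truck" using the positional $ operator
  db.collection.update(
      { "_id": 0, "childtoys": { "$elemMatch": { "name": "childA", "toy": "car" } } },
      { "$set": { "childtoys.$.toy": "truck" } }
  )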
But the overall message here is to design your data around the access patterns you actually use. MongoDB generally does not like things hidden behind a "strict path" when it comes to querying or otherwise flexibly issuing updates.
Projections in MongoDB use 1 and 0, not True/False.
Moreover, ensure that the fields are specified with the right case (uppercase/lowercase).
The query should be as below:
db.users.findOne({'_id': 0}, {'_id': 0, 'children.childA.movies': 1})
Which will result in :
{
  "children": {
    "childA": {
      "movies": {
        "movie 1": "Ironman",
        "movie 2": "Deathwish"
      }
    }
  }
}
