Immutable.js: How to maintain immutability when exporting to array? - javascript

I'm passing an array to an Immutable.List. However, the objects inside the immutable list are modified when the list is converted to an array and that array is then updated.
As a result, the immutable list is effectively still mutable. How can I avoid this while still being able to return an array to work with?
Please provide an explanation as well.
Here's some pseudocode to illustrate the scenario:
var data = [{id:'a'}, {id:'b'}, {id:'c'}];
var immutables = Immutable.List(data).asImmutable(); //Immutable list?
var myData = immutables.toArray();
myData[0] = {id:'x'}; //object is updated in immutable list as well

Having mutable data inside immutable structures is almost always a bad idea (the other way round makes much more sense). To avoid this, use Immutable.fromJS(data) instead of Immutable.List(data) or Immutable.Map(data), just as #mostruash suggested: fromJS converts the nested objects and arrays to immutable Maps and Lists as well.
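For illustration, here is a minimal sketch (mirroring the question's variable names) of how fromJS keeps the nested data safe:
var data = [{id: 'a'}, {id: 'b'}, {id: 'c'}];
// fromJS converts the nested objects to Immutable Maps as well,
// so nothing inside the list can be mutated in place.
var immutables = Immutable.fromJS(data);
// toJS() produces a fresh, deep copy of plain objects...
var myData = immutables.toJS();
myData[0].id = 'x';
// ...so mutating the copy leaves the immutable list untouched.
console.log(immutables.getIn([0, 'id'])); // still 'a'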

Related

Avoiding duplication of key/data

I have a design annoyance with some existing code in JS. The code is working, so I have no desperate hurry to change it, but the duplication shown below does annoy me. What is the usual/recommended/official way of avoiding this situation?
The actual system is a large/complex financial system, so I have simplified it to the most basic example which demonstrates the problem:
var colours = {
  red:   {id: "red",   vals: [1, 0, 0]},
  green: {id: "green", vals: [0, 1, 0]},
  grey:  {id: "grey",  vals: [0.5, 0.5, 0.5]}
  // ...etc
};
// id needs to be known internally within the object - thus it is defined as a property.
// e.g:
colour.prototype.identify(console.log(this.id));
// id also needs to be used externally to find an object quickly.
// e.g:
function getcolour(s){return colours[s];}
// Although this works, it does mean duplicating data, with the theoretical possibility of a mismatch:
var colours={//...
blue:{id:"green", // oh dear...
How would this normally be handled by the experts?
This question is somewhat subjective.
When creating my applications I typically try to do the following:
- never define the same data in multiple places; the source should always be unambiguous
- if I need to create any indices for faster/easier access, I use utility methods to do it; those methods should be properly unit-tested, so that I have little doubt that they do the right thing
- use third-party libraries as much as possible (such as the already suggested lodash or underscore) to minimize the amount of code to be written/maintained
If your algorithms and utilities are properly unit-tested, you should not worry (too much) about getting the data into an inconsistent state. However, if these are critically important systems/interfaces, you may add some validation on output. And it is generally good practice to validate and marshal data on input.
Explanation on the utility methods:
if you have a data array, say
var data = [{"id":"i_1", ...}, {"id":"i_2", ...}, {"id":"i_3", ...}];
and you then have to create an index out of it, or create more data sets based on the original array, then you build yourself a library of utility methods that do the modification of the array, create derivative data sets, or iterate over the array and create a resulting item on the fly. For example:
var createIndex = function (arr) {
  // Convert a data array with the expected structure into an
  // object keyed by id:
  // {
  //   i_1: {"id": "i_1", ...},
  //   i_2: {"id": "i_2", ...},
  //   i_3: {"id": "i_3", ...}
  // }
  return arr.reduce(function (newObj, item) {
    newObj[item.id] = item;
    return newObj;
  }, {});
};
This method will create a hash map for accessing your data, which is faster than iterating over the original array every time. And now you can easily unit-test this method and be sure that when you use it on the source data to produce your intended dataset, there will be no inconsistency.
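For instance, a quick usage sketch with a data array like the one above:
var data = [{id: "i_1"}, {id: "i_2"}, {id: "i_3"}];
var index = createIndex(data);
// Constant-time lookup by id instead of scanning the array:
console.log(index.i_2); // {id: "i_2"}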
I wouldn't replace the direct colours[key] access with another method to avoid the duplication.
Any other attempt will lead to extra processing, and you have mentioned that you have a large amount of data.
I assume the duplication you are worried about is in the incoming data, where it is a waste of traffic.
One way to trade processing for network traffic would be to omit the ids from the payload and, on arrival, go over the map object and set each id dynamically according to its key:
colours[key].id = key
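A minimal sketch of that loop, assuming colours arrives from the server without its id properties:
// Derive each id from its key once on arrival, so the id never
// has to be transmitted or stored twice.
Object.keys(colours).forEach(function (key) {
  colours[key].id = key;
});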
You can filter your object by converting it to an array of objects and then filtering out duplicate values. Converting it to an array allows you to perform a lot of operations more quickly and easily.
So you can map your object to an array (note that plain objects have no .map method, so the keys are mapped instead):
var coloursArray = Object.keys(myObj).map(function (key) {
  return myObj[key];
});
Remove duplicates:
function removeDuplicates() {
  return coloursArray.filter((obj, pos, arr) => {
    return arr.map(mapObj => mapObj.id).indexOf(obj.id) === pos;
  });
}
You can remove duplicates from an array using, for example, underscore.js through its .uniq method:
var uniqueColoursArray = _.uniq(coloursArray, function(c){ return c.id; });
Moreover, this function is pretty useless because you can access your element directly:
function getcolour(s){return colours[s];}
Calling colours[s] is also shorter than getcolour(s). Your function would make sense if you also passed the array, because it is not accessible in some other scope.
Also, I can't understand why you pass a console.log call as a parameter here:
colour.prototype.identify(console.log(this.id));
maybe you meant to pass just this.id.

Avoid sorting using immutable.js

To me Immutable.js removes a lot of headaches and it's a great library, but now I'm facing a problem: my original object comes from the server, but when I use any of its functions like fromJS({myObj}), it works but saves a copy sorted "a-z". I'm building something that needs the original structure so the components keep the order in which they come from the server. Does anyone have any idea?
fromJS translates your objects into lists and maps by default. The former is ordered but not keyed, while the latter is keyed but not ordered, so neither fits your use case.
What you're looking for is an OrderedMap, which is a Map with an additional insertion order guarantee:
import { OrderedMap } from 'immutable';
const orderedMap = OrderedMap({key: "value"});
You can achieve this while still using fromJS: it has a second parameter called reviver, which can also be used to build OrderedMaps instead of standard Maps:
import Immutable from 'immutable';
const reviver = (key, value) =>
  Immutable.Iterable.isKeyed(value) ? value.toOrderedMap() : value.toList();
const data = Immutable.fromJS(js, reviver);
Javascript core objects explicitly provide no guarantees about key order. Immutable.Map (the expected result of your fromJS() call) just follows that.
If you want order, you should either specify the order as another property on each item, or, more conventionally, create an Immutable.List from an Array.
In other words, this sounds like a square peg/round hole problem. Make sure you're using the right data structure for your task.
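If you go the List route, here is a minimal sketch, assuming you can have the server send (or you can build) an array of entries:
import { fromJS } from 'immutable';
// An array preserves order on the wire, and fromJS turns it into
// a List, which keeps that insertion order.
const payload = [
  {key: 'header', value: 'h'},
  {key: 'body', value: 'b'},
  {key: 'footer', value: 'f'}
];
const ordered = fromJS(payload);
console.log(ordered.getIn([0, 'key'])); // 'header'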

How to traverse JS object and all arrays and objects inside to compare it with its copy?

I have a selectedItem object in Angular, it contains other objects and arrays. I create a deep copy using a JSON trick:
$scope.editableItem = JSON.parse(JSON.stringify($scope.selectedItem))
Then I use the editableItem model in inputs and change some values inside; selectedItem doesn't change. Then I want to send all the changes via PATCH, but not the fields which were not changed. So I need to strip editableItem of all the fields that are the same as in the unchanged selectedItem.
How can I do this efficiently? I was thinking about traversing the object recursively using Underscore, but I'd really like to know whether that is a good way of thinking before I tackle it.
Alternatively I could probably create third object which would only contain touched fields from the second one, added dynamically, but I'm not sure how to approach this.
EDITED:
To be clear, I expect the answer to be generic and to assume the most complicated object structure possible. For example, no answers from this question are applicable here, as they either assume the object has only simple fields or require an Angular watcher explicitly set for every field separately.
I do something similar with a function like this:
function getUpdateObject(orig, current) {
  var changes = {};
  for (var prop in orig) {
    // skip Angular's internal "$"-prefixed properties
    if (prop.indexOf("$") != 0 && orig[prop] !== current[prop]) {
      changes[prop] = current[prop];
    }
  }
  return changes;
}
I don't think this will get you all the way there. I'm not using it in any scenarios where the objects have member objects or arrays, but you should be able to test whether "prop" is an object or an array and call the function recursively. The biggest caveat I see with that approach is that, with a deep, nested structure, you may not detect a change until you're down several levels. You'd probably have to keep the full potential hierarchy for a changed property in memory, then, when you detect a change at a lower level, write the whole hierarchy to the output object.
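As a sketch of that recursive extension (getDeepChanges is a hypothetical name; Angular's "$" properties are skipped as above):
function getDeepChanges(orig, current) {
  var changes = {};
  for (var prop in orig) {
    if (prop.indexOf("$") == 0) continue; // skip Angular internals
    var before = orig[prop];
    var after = current[prop];
    if (before !== null && typeof before === "object" &&
        after !== null && typeof after === "object") {
      // recurse, and keep the branch only if something below changed
      var nested = getDeepChanges(before, after);
      if (Object.keys(nested).length > 0) {
        changes[prop] = nested;
      }
    } else if (before !== after) {
      changes[prop] = after;
    }
  }
  return changes;
}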
This is what I ended up with. Maybe it'll help someone. I used the DeepDiff library. The code is in CoffeeScript; it should be easy to translate to JavaScript if anyone needs it.
$scope.getChangesObject = () ->
  selected = $scope.selectedItem
  editable = $scope.editableItem
  changes = {}
  differences = DeepDiff(selected, editable)
  for diff in differences
    formattedPath = ""
    for pathPart, index in diff.path
      if index isnt diff.path.length - 1
        formattedPath += pathPart + "."
      else
        formattedPath += pathPart
    changes[formattedPath] = editable[formattedPath]
  changes
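For reference, a rough JavaScript translation of the above; note that it reads diff.rhs (the new value that DeepDiff reports) instead of re-reading editable[formattedPath], which would only resolve top-level paths:
$scope.getChangesObject = function () {
  var changes = {};
  // DeepDiff returns undefined when nothing changed.
  var differences = DeepDiff($scope.selectedItem, $scope.editableItem) || [];
  differences.forEach(function (diff) {
    // Join the path segments into a dotted key, e.g. "a.b.c".
    var formattedPath = diff.path.join(".");
    changes[formattedPath] = diff.rhs;
  });
  return changes;
};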

Updating objects in List in ImmutableJS

I am a little confused by the functionality of ImmutableJS when working with an array of objects. The following example shows that even though the List x is immutable, I can still modify properties of objects inside the list both with and without using Immutable List's update() function.
My question is, why would I use Immutable if I can still modify the contents of my objects? I expected this module to protect me from that. I realize that I will not be able to add or remove entire objects to/from the list, but that doesn't fully protect me from modifying the list, which when working with a list in React state, I do not want to be able to do.
The other interesting thing I noticed is that when I directly modify the name after first performing the update, x.get(0).name and y.get(0).name are both changed. I thought that the resulting list from update() would not contain references to the same objects in the list.
How and why is ImmutableJS really helping me in this case?
var x = Immutable.List.of({name: 'foo'});
console.log(x.get(0).name);
var y = x.update(0, (element) => {
  element.name = 'bar';
  return element;
});
console.log(x.get(0).name);
console.log(y.get(0).name);
x.get(0).name = 'baz';
console.log(x.get(0).name);
console.log(y.get(0).name);
Output:
foo
bar
bar
baz
baz
https://jsfiddle.net/shotolab/rwh116uw/1/
Example of #SpiderPig's suggestion of using Map:
var x = Immutable.List.of(new Immutable.Map({name: 'foo'}));
console.log(x.get(0).get('name'));
var y = x.update(0, (element) => {
  return element.set('name', 'bar');
});
console.log(x.get(0).get('name'));
console.log(y.get(0).get('name'));
Output:
foo
foo
bar
While the last example shows what I was trying to accomplish, ultimately I don't know whether I will end up using Map or List or even ImmutableJS at all. What I don't like are the alternate APIs (especially for a mapped object). I am afraid that when I hand my project off to another developer, or as others join the team, using these immutable objects and lists correctly will completely fall apart without proper governance.
Maybe this is more of a commentary on React, but if React intends the state to be immutable yet doesn't enforce it, it seems to me that this will end up a mess in a project that is moving quickly with multiple developers. I was trying my best not to mutate the state, but forgetting that modifying an object in a list/array is a very easy mistake to make.
Immutable.js does not provide true immutability in the sense that you could not modify your objects directly - it just provides an API which helps you maintain immutable state.
The update function should return a completely new version of the indexed object:
var y = x.update(0, (element) => {
  return { name: "bar" };
});
But doing something like this is a big no-no: x.get(0).name = 'baz';
Here is a much better explanation of the whole thing than I could ever write:
https://github.com/facebook/immutable-js/issues/481
The point of Immutable.js is to allow re-use of the objects that are not modified, which consumes less memory and gives good practical performance.
There is also the library seamless-immutable, which freezes objects so that they cannot be modified, but this comes with some performance penalty in JavaScript: https://github.com/rtfeldman/seamless-immutable
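For comparison, here is a minimal sketch of the freezing approach using only plain Object.freeze (deepFreeze is a hypothetical helper, not part of either library):
// Recursively freeze an object so direct writes like
// frozen[0].name = 'bar' are ignored (or throw in strict mode).
function deepFreeze(obj) {
  Object.getOwnPropertyNames(obj).forEach(function (name) {
    var value = obj[name];
    if (value !== null && typeof value === "object") {
      deepFreeze(value);
    }
  });
  return Object.freeze(obj);
}
var frozen = deepFreeze([{name: 'foo'}]);
frozen[0].name = 'bar';      // silently ignored (TypeError in strict mode)
console.log(frozen[0].name); // 'foo'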

What is the accepted convention for when to use an object containing objects vs an array of objects in JSON?

I am currently in the process of writing a GUI which fundamentally allows users to edit/populate/delete a number of settings files, where the settings are stored in JSON, using AJAX.
I have limited experience with JavaScript (and little experience with anything beyond MATLAB, to be frank), but I find myself restructuring my settings because of the semantics of working with an object containing more objects rather than an array of objects. In C# I would do this using a KeyValuePair, but the JSON structure prevents me from doing what I'd really like to do here, and I was wondering whether there is an accepted convention for this in JavaScript which I should adopt now, rather than making these changes and finding that I cause more issues than I solve.
The sample data structure, which has similar requirements to many of my structures, accepts any number of years, within these any number of events, and within these a set number of values.
Here is the previous structure:
{"2013":
{
"someEventName":
{
"data1":"foo",
"data2":"bar",
...},
...},
...}
Here is my ideal structure, where the year/event name operates as a key of type string for a value of type Array:
["2013":
[
"someEventName":
{
"data1":"foo",
"data2":"bar",
...},
...],
...]
As far as I am aware, this would be invalid JSON notation, so here is my proposed structure:
[{"Key":"2013",
"Value":
[{"Key":"someEventName",
"Value":
{
"data1":"foo",
"data2":"bar",
...}
},
...]
},
...]
My proposed "test" for whether something should be an object containing objects or an array of objects is "does my sub-structure take a fixed, known number of objects?" If yes, design as object containing objects; if no, design as array of objects.
I am required to filter through this structure frequently to find data/values, and I don't envisage ever exploiting the index functionality that using an array brings, however pushing and removing data from an array is much more flexible than to an object and it feels like using an object containing objects deviates from the class model of OOP; on the other hand, the methods for finding my data by "Key" all seem simpler if it is an object containing objects, and I don't envisage myself using Prototype methods on these objects anyway so who cares about breaking OOP.
Response 1
In the previous structure, to add a year, for example, the code would be OBJ["2014"] = {}; in the new structure it would be OBJ.push({"Key": "2014", "Value": {}}). Both solutions are similarly simple.
Deleting is similarly trivial in both cases.
However, suppose I want to manipulate the value of an event using a function. If I pass a reference to that object to the function and try to supersede the whole object through that reference, it won't work: I am forced to copy the original event (using jQuery or worse) and reinsert it at the parent level. With a "Value" attribute, I can overwrite the whole value element however I like, provided I pass the entire {"Key": "", "Value": ""} object to the function. It's an awful lot cleaner in this situation to use the array-of-objects method, as the sketch below shows.
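A minimal sketch of that difference (the function names are hypothetical):
// Reassigning the parameter only rebinds the local variable; the
// event object held by the parent structure is untouched.
function replaceEvent(event) {
  event = {data1: "new", data2: "values"}; // no effect outside
}
// With the {Key, Value} wrapper, the wrapper itself is passed in,
// so its Value property can be replaced through the reference.
function replaceValue(entry) {
  entry.Value = {data1: "new", data2: "values"};
}
var entry = {Key: "someEventName", Value: {data1: "foo", data2: "bar"}};
replaceValue(entry); // entry.Value now points at the new object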
I am also basing this change to arrays on the wealth of other responses on stackoverflow which encourage the use of them instead of objects.
If all you're going to do is iterate over your objects, then an array of objects makes more sense. If these are settings and people are going to need to look up a specific one, then the original object notation is better. The original allows people to write code like
var foo = settings['2013']['someEventName'].data1
whereas getting that data out of the array of objects requires iterating through it to find the entry with the key "2013", which, depending on the length of the list, may cause performance issues.
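For comparison, the same lookup against the proposed array-of-pairs structure (assuming Array.prototype.find is available):
var yearEntry = settings.find(function (e) { return e.Key === "2013"; });
var eventEntry = yearEntry.Value.find(function (e) { return e.Key === "someEventName"; });
var foo = eventEntry.Value.data1;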
Pushing new data to the object is as simple as
settings['2014'] = {...}
and deleting data from an object is also simple
delete settings['2014']
