let oldMessages = Object.assign({}, this.state.messages);
// this.state.messages[0].id = 718
console.log(oldMessages[0].id);
// Prints 718
oldMessages[0].id = 123;
console.log(this.state.messages[0].id);
// Prints 123
How can I prevent oldMessages from being a reference? I want to change the value of oldMessages without changing the value of state.messages.
You need to make a deep copy. Lodash's cloneDeep makes this easy:
import cloneDeep from 'lodash/cloneDeep';
const oldMessages = cloneDeep(this.state.messages);
oldMessages[0].id = 123;
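If you'd rather not add a dependency and your targets are recent (evergreen browsers, Node 17+), the built-in structuredClone does the same job; a minimal sketch:

const oldMessages = structuredClone(this.state.messages);
oldMessages[0].id = 123; // this.state.messages[0].id stays untouched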
First let's clarify the difference between shallow and deep clone:
A shallow clone is a clone that has its primitive properties copied by value, but whose REFERENCE properties still point to the original.
Allow me to clarify:
let original = {
foo: "brlja",
howBigIsUniverse: Infinity,
mrMethodLookAtMe: () => "they call me mr. Method",
moo: {
moo: "MOO"
}
};
// shallow copy
let shallow = Object.assign({}, original);
console.log(original, shallow); // looks OK
shallow.moo.moo = "NOT MOO";
console.log(original, shallow); // changing the copy changed the original
Notice how changing the inner properties of the shallow copy's non-primitive property was REFLECTED on the original object.
So why would we use shallow copy?
It is definitely FASTER.
It can be done in pure JS with a one-liner.
When would you use a shallow copy?
All of your object's properties are primitives (see the sketch after this list)
You are making a partial copy where all of your copied properties are primitives
You don't care about the fate of the original (though then, is there a reason to copy it and not just use it instead?)
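For instance, a shallow copy is perfectly safe when every property is a primitive (a made-up example):

const settings = { volume: 7, muted: false, theme: "dark" }; // primitives only
const copy = Object.assign({}, settings);
copy.volume = 0;
console.log(settings.volume); // 7, the two objects are fully decoupled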
OK, let's get into making a proper (deep) copy. A deep copy should obviously have the original object copied into the clone by value, not by reference. And this should persist as we drill deeper into the object: if we have an object nested X levels deep inside one of the original's properties, it should still be a copy, not a reference to the same thing in memory.
What most people suggest is to abuse the JSON API. They think that turning an object into a string and then back into an object will make a deep copy. Well, yes and NO. Let's attempt to do just that.
Extend our original example with:
let falseDeep = JSON.parse(JSON.stringify(original));
falseDeep.moo.moo = "HEY I CAN MOO AGAIN";
console.log(original, falseDeep); // moo.moo is decoupled
Seems ok, right? WRONG!
Take a look at what happened to the mrMethodLookAtMe and howBigIsUniverse properties that I sneaked in from the start :)
One gives back null, which is definitely not Infinity, and the other one is GONE. Well, that is no bueno.
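Extending the running example makes the damage explicit:

console.log(falseDeep.howBigIsUniverse); // null, JSON.stringify turned Infinity into null
console.log(falseDeep.mrMethodLookAtMe); // undefined, functions are silently dropped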
In short: there are problems with "smarter" values like NaN or Infinity that get turned into null by the JSON API. There are FURTHER problems if you use:
methods, RegExps, Maps, Sets, Blobs, FileLists, ImageDatas, sparse Arrays, Typed Arrays as your original object's properties.
Why does it matter? Well, this produces some of the nastiest bugs out there to track down. Before TypeScript became a thing, I had nightmares tracking a disappearing method, or a value whose type silently turned into another (one that passed someone's sloppy input-parameter check but then couldn't produce a valid result).
Time to wrap this up! So what is the correct answer?
You write your own implementation of a deep copy. I like you but please don't do this when we have a deadline to meet.
Use a deep cloning function provided to you by the library or framework you already use in the project.
Lodash's cloneDeep
Many people still use jQuery. So in our example (please put the import where it belongs, at the top of the file):
import jQ from "jquery";
let trueDeep = jQ.extend(true, {}, original); // note: {} must be the target, or you would just get original back, not a copy
console.log(original, trueDeep);
This works, it makes a nice deep copy, and it is a one-liner. But we had to import the entire jQuery library. Which is fine if it is already being used in the project, but I tend to avoid it since it is over-bloated and has terribly inconsistent naming.
Similarly, AngularJS users can use angular.copy().
But what if my framework/library does not have a similar function?
You can use my personal SUPERSTAR among JS libraries (I am not involved in the project, just a big fan) - Lodash (or _ for friends).
So extend our example with (again, mind the position of the import):
import _ from "lodash"; // cool kids know _ is low-dash
const fastAndDeepCopy = _.cloneDeep(original);
console.log(original, fastAndDeepCopy);
It is a simple one-liner, it works, and it is fast.
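You can verify that the clone is decoupled all the way down:

fastAndDeepCopy.moo.moo = "ANOTHER MOO";
console.log(original.moo.moo); // unchanged, the nested object was copied by value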
This is pretty much it :)
Now you know the difference between a shallow and a deep copy in JS. You realize that JSON API abuse is just that, abuse, not a true solution. If you are already using jQuery or AngularJS, you now know there is a solution there for you. If not, you can write your own or consider using Lodash.
The entire example can be found here:
codesandbox - entire example
Try using:
let tempVar = JSON.parse(JSON.stringify(this.state.statename))
What you are actually doing with
let oldMessages = Object.assign({}, this.state.messages);
is a shallow copy, which is similar to {...this.state.messages} with the spread operator.
An object has its own reference in memory; to break that reference you can use JSON.parse(JSON.stringify(object)).
No matter how deeply nested its keys are, this removes the reference to the original object and you get a new object.
This concept is called a deep copy or deep clone.
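A minimal sketch with a made-up state shape shows the idea:

const state = { messages: [{ id: 718, text: "hi" }] };
const oldMessages = JSON.parse(JSON.stringify(state.messages));
oldMessages[0].id = 123;
console.log(state.messages[0].id); // 718, the original is untouched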
Related
I like the functional programming paradigm, which in my opinion produces cleaner code that is easier to understand, but I'd like to know whether the performance loss is significant or not.
Let's take the example of a function that adds a new property to an object. A non-functional approach would look like this:
const addProp = (obj, prop, val) => {
obj[prop] = val; // Mutate the object
return obj;
}
While the functional approach would look like this:
const addProp = (obj, prop, val) => ({ ...obj, [prop]: val }); // Shallow clone
What's the cost of the shallow clone compared to the object mutation? I'd like to know how much functional programming patterns degrade performance (if they do).
The performance of making shallow copies is much worse than the performance of mutation. If I have an array with 400 elements and I make a shallow copy, that's going to be much more expensive than destructively modifying a single element of the array. And the longer the array is, the worse the gap becomes. The same problem occurs with an object with many properties.
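A rough micro-benchmark illustrates the gap; numbers vary by engine and JIT warm-up skews such tests, so treat this as a sketch only:

const arr = Array.from({ length: 400 }, (_, i) => i);

console.time("mutate");
for (let i = 0; i < 100000; i++) arr[0] = i; // destructive update of one slot
console.timeEnd("mutate");

console.time("shallow copy");
for (let i = 0; i < 100000; i++) {
  const copy = [...arr]; // copies all 400 slots every iteration
  copy[0] = i;
}
console.timeEnd("shallow copy");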
One essential element of efficient functional programming is using the right data structures. In this case, you want a data structure where modifying part of the data structure does not mean you have to completely recreate the data structure. You want a local modification of a data structure to be able to share most of its memory with the original.
An extremely basic example of this is a singly linked list. If you wish to prepend a single element to a linked list, this is an O(1) operation because the new list shares most of its memory with the old list. However, adding a single element to an array without mutating the original array is a linear-time operation because the whole array must be copied.
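You can sketch that sharing in plain JavaScript; cons and the list names here are made up for illustration:

const cons = (head, tail) => ({ head, tail });

const listA = cons(1, cons(2, cons(3, null)));
const listB = cons(0, listA); // O(1): listB's tail IS listA, nothing is copied

console.log(listB.tail === listA); // true, the two lists share memory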
For sequences and maps, one generally ends up using some sort of balanced tree. Functional programming languages include these data structures as part of their standard libraries, but I'm sure someone has written a library for functional Javascript.
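Immutable.js is one such library; its Map is a tree with structural sharing, so "updates" return a new map without copying everything:

import { Map } from "immutable";

const m1 = Map({ a: 1, b: 2 });
const m2 = m1.set("c", 3); // a new Map sharing most structure with m1

console.log(m1.has("c")); // false, m1 is unchanged
console.log(m2.get("c")); // 3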
If I have an existing object, array, or map and I want to delete or add an item, is copying a map first (a shallow copy), for example, and then using the delete method on the new map considered the correct way to maintain immutability?
EDIT
In functional languages I've learned, like Elixir, data structures such as Lists return a new List. JS doesn't work this way. Even array methods like reduce are, under the hood, still taking an empty array as an initial parameter and pushing items into it (mutating that initial array).
const map = new Map([
['dog', 'Dog'],
['cat', 'Cat'],
['chicken', 'Chicken'],
])
const map2 = new Map([
...map,
])
map2.delete('dog')
You are mutating map2. However, with reasonable encapsulation (such as putting the clone+delete in a function), it's still considered a pure operation, and the original that you pass as an argument stays unmutated.
The functions
function withoutDog(map) {
const map2 = new Map(map);
map2.delete('dog');
return map2;
}
function withoutDog(map) {
return new Map(Array.from(map).filter(([key, _]) => key !== 'dog'));
}
are indistinguishable from the outside.
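Pick either definition and you can check from the outside that the argument is left alone:

const pets = new Map([['dog', 'Dog'], ['cat', 'Cat']]);
const rest = withoutDog(pets);

console.log(pets.has('dog')); // true, the original is untouched
console.log(rest.has('dog')); // false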
When something is immutable, it simply means it does not change state.
Since you are not changing the object's state (i.e. object.foo = 'bar') but rather cloning it and mutating the clone, it is immutable.
No, this is not an example of immutability.
An object is immutable if it's not possible to modify its contents. Making a copy of an object allows you to modify the copy without modifying the original, but you can still modify the original if you want.
const map = new Map([
['dog', 'Dog'],
['cat', 'Cat'],
['chicken', 'Chicken'],
])
const map2 = new Map(map) // clone the Map
map2.delete('dog') // modify the clone -- doesn't affect the original
map.delete('cat') // modify the original -- it's not immutable
Is the OP asking if the data structure is immutable, or whether the patterns being used here are immutable? The question is inherently confusing because we typically use immutable as an adjective for data structures, not algorithm implementations; we typically describe algorithm implementations designed for use with immutable data structures as "pure functions" or "pure operations".
According to the usual definition of "immutable", @Barmar is correct that the answer is that the data structures in the question are not immutable, since objects in JS are mutable. Even when we use const declarations, the const keyword just makes the reference to the object immutable, but the values within the object can still be mutated (see the sketch below); we're now holding an immutable, atomic reference to a mutable, compound value.
But the OP's wording ("is cloning ... considered immutable") suggests that really the question is asking whether the process in question is immutable. Therefore, @Bergi's answer is a good attempt to answer the question as it seems intended, by parsing "immutable" as "pure". If this logic were encapsulated in an API, that API would provide a pure operation / function to callers, even though the implementation would not be internally pure, since it modifies a local value before returning it.
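To illustrate the const point:

const obj = { count: 0 };
obj.count = 1; // allowed: the value is mutable
// obj = { count: 2 }; // TypeError: the binding (the reference) is immutable
console.log(obj.count); // 1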
This has probably been discussed before, but I haven't found a solution to my problem yet.
So, my issue is this:
I'm saving into a variable the value of an object.
const aux = this.myObj;
The object (this.myObj) is changed by some operations after that, but I want to save its initial value and be able to re-assign it later.
Something like:
this.myObj = aux
but, as expected, aux is also modified when the original is.
Is there a solution to this?
You need to clone the object and save its data to the other variable.
var clone = Object.assign({}, obj);
In the case of a nested object, deep cloning can be achieved in various ways. One of them, for a simply structured object with nested key-value pairs:
var clone = JSON.parse(JSON.stringify(obj));
or use a library like Underscore or Lodash.
Object.assign({}, obj) simply copies the property values of the given object, so it wouldn't be a good fix. I would recommend Lodash's _.cloneDeep, as it does deep cloning for you.
You would just have to do the following.
const aux = _.cloneDeep(this.myObj)
I'm loading in a variable from my app-config.js file and then copying it with .slice() in an attempt to prevent its state from being mutated. Much to my chagrin, the function I'm using to alter data seems to be failing to respect this attempt at avoiding mutation. mySensitivityVars keeps changing, although I'm not quite sure how, since I'm only directly acting on mySeries. Any ideas as to why this is happening? Here's the code:
var mySeries = mySensitivityVars.slice();
//Dummy Processing Algorithm
function myDummyAlgo(sliderIndex, newValue, range) {
console.log(mySeries[sliderIndex].data)
var modifier = newValue/(range/2)
var newSeries = mySensitivityVars[sliderIndex].data.map(function(num){
return num * modifier
})
console.log(newSeries)
// mySeries[sliderIndex].data = newSeries
// console.log(sensitivityChart.series[0].data)
sensitivityChart.series[sliderIndex].setData(newSeries);
};
Slice can copy the array, but any objects referenced inside the array are not copied (only the references are).
Without seeing the contents of mySensitivityVars it's hard to tell, but my guess is you're mutating a reference to the original object rather than a duplicate object.
Do you have any objects in mySensitivityVars? If so, the corresponding objects in mySeries will be pointing to the original objects in mySensitivityVars rather than standalone duplicates, which is why you're probably seeing the mutation issues.
You should deep clone the array instead of shallow copying it if you'd like to mutate it without affecting the original. You can use JSON.parse(JSON.stringify(mySensitivityVars)), which is a pretty fast deep cloning technique.
That ensures new objects are created rather than copies of the references.
When you do:
mySeries[sliderIndex].data = newSeries;
The array mySeries is a copy of mySensitivityVars, but the objects in the array are not copies. Both arrays contain references to the same objects, and modifying the data property affects them both. You need to copy the objects as well:
mySeries = mySensitivityVars.map(o => Object.assign({}, o));
And if the objects contain references to other objects, you may need to do a deep copy. See What is the most efficient way to deep clone an object in JavaScript?
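With the one-level copy in place, reassigning a copied object's property no longer leaks back; a minimal sketch with made-up data:

const mySensitivityVars = [{ data: [1, 2, 3] }];
const mySeries = mySensitivityVars.map(o => Object.assign({}, o));
mySeries[0].data = [10, 20, 30]; // replaces the reference on the copy only
console.log(mySensitivityVars[0].data); // [1, 2, 3], the original is untouched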
I'm looking at the documentation for Immutable.js, specifically the following:
var map1 = Immutable.Map({a:1, b:2, c:3});
var clone = map1;
but I'm confused as to how simply assigning map1 to clone creates a clone rather than a reference?
Update:
The docs state "If an object is immutable, it can be "copied" simply by making another reference to it instead of copying the entire object. Because a reference is much smaller than the object itself, this results in memory savings and a potential boost in execution speed for programs which rely on copies (such as an undo-stack)."
I just tested this in a jsbin though, and clone does === map1. I think their use of the word 'clone' in the docs is a little misleading.
Since Immutable.Map is immutable, the notion of cloning is obsolete. Their point is that you don't have to bother about cloning or not, it doesn't matter.
The docs are indeed confusing, and indeed it is a reference, not a clone. The effect of cloning would be the same anyway.
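Concretely:

const map1 = Immutable.Map({ a: 1, b: 2, c: 3 });
const clone = map1;

console.log(clone === map1); // true, same reference, and that's fine here
const map2 = map1.set('b', 50); // a "modification" produces a new Map
console.log(map1.get('b')); // 2, the shared original never changed
console.log(map2.get('b')); // 50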