How to update a value with destructuring and a loop - javascript

What I am trying to achieve:
I want to update a value in an object, which is part of an element of an array. The code below will give you a better idea.
The issue is that I update the value of the object via a reference, instead of making a copy. This causes the state to behave strangely.
I am trying to change it to make a copy instead, but I am not sure how.
e.g.
const returnObj = {
  ...objs,
  fields: [{name, value}, {name, value}, {name, value_update_this_only}, ...],
};
// This is the current code
export function* onChange(action) {
  // get partial state from redux state
  const list = yield select((state) => state.list);
  let objs = list[action.index];
  // * e.g. objs.fields === [{name, value}, {name, value}, ...]
  // * basically the following: find the correct field and update its value
  // * the following has a problem, because we change the value via a reference,
  // * instead we should make a new copy, so redux can react
  objs.fields.map((field) => {
    if (field.name === action.fieldName) {
      field["value"] = action.fieldValue;
    }
    return field;
  });
  // fire to redux reducer
  yield put({
    type: "UPDATE",
    prop: objs,
    docIndex: action.index,
  });
}
// the problem: I don't know how to do it in a destructuring manner.
const returnObj = {
  ...objs,
  fields: [],
};

I think rather than try and come up with a single destructuring statement that makes this work, it's easier to digest (and arguably more readable) in smaller steps:
1. Make a shallow copy of objs; call it copy for now
2. Recreate the fields array and every item within it
3. For the desired array item, update its value
4. Set copy.fields to the array created in step 2
// Step 1: Shallow copy
let copy = { ...objs }

// Step 2: Recreate fields and every item
let fields = copy.fields.map((field) => ({
  ...field
}))

// Step 3: Update value of desired item
fields.forEach((field) => {
  if (field.name === action.fieldName)
    field.value = action.fieldValue
})

// Step 4: Reassign fields to the copy
copy.fields = fields
Refactoring this, steps 2-4 can be combined into one step without sacrificing that much readability:
let copy = { ...objs }
copy.fields = copy.fields.map((field) => ({
  ...field,
  value: field.name === action.fieldName ? action.fieldValue : field.value,
}))
It's been a long time since I've used redux or sagas, so I'm not sure whether fields needs to be an entirely new array or if just the changed object within fields needs to be new, but the above can be modified to accommodate either need.
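Putting it together, a minimal sketch of the whole saga with the copy made before dispatching (assuming the same action shape and UPDATE reducer as in the question):
import { put, select } from "redux-saga/effects";

export function* onChange(action) {
  const list = yield select((state) => state.list);
  const objs = list[action.index];

  // new object and new fields array, so redux can detect the change
  const copy = { ...objs };
  copy.fields = copy.fields.map((field) => ({
    ...field,
    value: field.name === action.fieldName ? action.fieldValue : field.value,
  }));

  yield put({
    type: "UPDATE",
    prop: copy,
    docIndex: action.index,
  });
}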

Related

How can the index of an object within an array be preserved when modifying an object property while using the spread operator?

I have a React useState variable that stores an array of objects
of this Question type:
type Question = {
  id: number,
  text: string,
  otherProps: string,
  // ...... (and so on)
}
Example of my useState
const [questions, setQuestions] = React.useState<Question[]>([{id: 1, text: "hello?", otherProps: "Lorem Ipsum"}])
The order of these Question objects in the useState array matters, so my question is: how should the following function be changed so that the text of the Question is changed but the array index of the modified object is maintained/kept?
I am aware that currently I am first deleting the object and then placing a newly created one at the end, but I can't figure out another way right now.
function setQuestionTextById(id: number, text: string) {
  if (!questions) return;
  const question: Question | undefined = questions.find(x => x.id === id);
  if (!question) return;
  const newQuestion: Question = {
    ...question,
    text,
  };
  const filteredQuestions = questions.filter(item => item.id !== id);
  setQuestions([...filteredQuestions, newQuestion]);
}
You should use map on the array with a function that checks whether the id is the one you want - if so, it returns a copy with the new text; otherwise it leaves the item as is.
This way, your whole function becomes:
function setQuestionTextById(id: number, text: string) {
  const updatedQuestions = questions.map(question => question.id === id ? { ...question, text } : question);
  setQuestions(updatedQuestions);
}
I find this much more readable than the original, and it preserves the order as you say you need.
One further refinement would be to use the functional update form of setQuestions so it doesn't depend on the previous value of questions, but I'll leave that up to you - it may not matter, depending on how this is being used in your component.
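For completeness, a minimal sketch of that functional update form (type annotations omitted; assumes the same setQuestions from the question):
function setQuestionTextById(id, text) {
  // the updater receives the latest questions, so it doesn't close over a stale value
  setQuestions(prevQuestions =>
    prevQuestions.map(question =>
      question.id === id ? { ...question, text } : question
    )
  );
}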

Redux modifying a state value without being asked to?

I'm using Redux to manage my state. My initial state in the reducer contains two arrays.
The first one (games) is the one that I want to modify; the second one (InitialGames) is the initial one that I don't want to be modified.
The problem is that I only make changes to the first array, but when I log my state after the logic, I see that both arrays got changed, which is confusing!
The case that I'm running into is PLAYER_DEAD.
My Reducer
import { ADD_GAME, PLAYER_DEAD, PUT_INFOS, RESET_GAME } from "./actions";

const initialState = {
  games: [],
  InitialGames: [],
};

export default (state = initialState, action) => {
  switch (action.type) {
    case RESET_GAME:
      state.games[action.payload.gameIndex] =
        state.InitialGames[action.payload.gameIndex];
      console.log(state);
      return state;
    case ADD_GAME:
      return {
        games: [...state.games, action.payload.game],
        InitialGames: [...state.games, action.payload.game],
      };
    case PUT_INFOS:
      return {
        gameInfos: action.gameInfos,
      };
    case PLAYER_DEAD:
      let newGames = state.games;
      let newInitialGames = state.InitialGames;
      console.log("Before Changings", newGames, newInitialGames);
      let newTeam = newGames[action.payload.indexGame].teams[
        action.payload.index
      ].players.splice(0, 1);
      console.log(
        "After changings",
        newGames,
        newInitialGames
      );
      return { games: newGames, InitialGames: newInitialGames };
  }
  return state;
};
This would be occurring because you're passing the same action.payload.game object to both of your arrays here:
return {
  games: [...state.games, action.payload.game],
  // same objects ---------^------v
  InitialGames: [...state.games, action.payload.game],
}
When you access the .teams array in your PLAYER_DEAD case, you're accessing the same array in memory shared by both games and InitialGames. The same goes for anything within that array, including the .players array within your .teams array's objects. Because you're updating your array in place in a non-immutable way by using .splice(), you end up modifying your state directly, and thus modifying the same .players array referenced by both games and InitialGames.
You need to ensure that you don't modify your state in place by using methods like .splice(). For your particular case, you would do something like so:
const newGames = state.games.map((game, i) => i === action.payload.indexGame
  ? {
      ...game,
      teams: game.teams.map((team, j) => j === action.payload.index
        ? { ...team, players: team.players.slice(1) } // note slice, not splice
        : team
      ),
    }
  : game
);
Above, we map your arrays, creating a copy of each item whose index matches the item to update. When updating the players array, we use .slice() to remove the first item from the array without mutating it.
Writing immutable code isn't always easy; that's why redux toolkit has built-in support for immer when you use an API such as createSlice(), which allows you to write code like you've been doing that mutates your state. See the Redux Toolkit documentation for more info.
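For illustration, a minimal sketch of that PLAYER_DEAD case written with createSlice (assuming the same payload shape as the question; the slice and action names are made up):
import { createSlice } from "@reduxjs/toolkit";

const gamesSlice = createSlice({
  name: "games",
  initialState: { games: [], InitialGames: [] },
  reducers: {
    playerDead(state, action) {
      // Immer records this "mutation" on a draft and produces a new immutable state
      const { indexGame, index } = action.payload;
      state.games[indexGame].teams[index].players.splice(0, 1);
    },
  },
});

export const { playerDead } = gamesSlice.actions;
export default gamesSlice.reducer;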

filter with react state not filtering values

I am stumped here and have no idea why, as I have filtered React state many times before, but this time it is doing something unexpected.
I have a functional component that has an input. The input has an onChange event. The onChange event sends e to a function called updateSocial. This function takes the param e, more specifically e.target.name and e.target.value, and creates an object to store in state: {inputType: e.target.name, value: e.target.value}.
The state, bandSocials, is an array of these objects... If the user edits the field, the function should filter out the old value and replace it with the new value.
Here's my code:
const updateSocial = (e) => {
  // Turn the bandSocials state into a new array so that I don't change the existing state yet.
  let socialsArray = Array.from(bandSocials)
  // Define the input type and the value.
  let inputType = e.target.name
  let value = e.target.value
  // Filter out the old value - this is where it is not working.
  socialsArray.filter((s) => s.inputType !== inputType);
  // Add in the new value
  socialsArray.push({inputType, value})
  // Replace the bandSocials with the new array.
  setSocials((band) => {
    return {
      ...band,
      bandSocials: socialsArray,
    };
  });
};
Each time the onChange event is called, it adds another object and does not filter out the old object.
Unlike push, which modifies the array, filter creates a new array and returns it.
let filteredArray = socialsArray.filter( ... )
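Applied to the function above, a minimal sketch of the fix (assuming the same band state shape implied by the question's setSocials call):
const updateSocial = (e) => {
  const inputType = e.target.name;
  const value = e.target.value;
  setSocials((band) => ({
    ...band,
    bandSocials: [
      // use filter's return value instead of discarding it
      ...band.bandSocials.filter((s) => s.inputType !== inputType),
      { inputType, value },
    ],
  }));
};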

Is mutating accumulator in reduce function considered bad practice?

I'm new to functional programming and I'm trying to rewrite some code to make it more functional-ish to grasp the concepts. Just now I've discovered the Array.reduce() function and used it to create an object of arrays of combinations (I used a for loop before that). However, I'm not sure about something. Look at this code:
const sortedCombinations = combinations.reduce(
  (accum, comb) => {
    if (accum[comb.strength]) {
      accum[comb.strength].push(comb);
    } else {
      accum[comb.strength] = [comb];
    }
    return accum;
  },
  {}
);
Obviously, this function mutates its argument accum, so it is not considered pure. On the other hand, the reduce function, if I understand it correctly, discards the accumulator from every iteration and doesn't use it after calling the callback function. Still, it's not a pure function. I can rewrite it like this:
const sortedCombinations = combinations.reduce(
  (accum, comb) => {
    const tempAccum = Object.assign({}, accum);
    if (tempAccum[comb.strength]) {
      tempAccum[comb.strength].push(comb);
    } else {
      tempAccum[comb.strength] = [comb];
    }
    return tempAccum;
  },
  {}
);
Now, in my understanding, this function is considered pure. However, it creates a new object every iteration, which consumes some time, and, obviously, memory.
So the question is: which variant is better and why? Is purity really so important that I should sacrifice performance and memory to achieve it? Or maybe I'm missing something, and there is some better option?
TL;DR: It isn't, if you own the accumulator.
It's quite common in JavaScript to use the spread operator to create nice-looking one-liner reducing functions. Developers often claim that it also makes their functions pure in the process.
const foo = xs => xs.reduce((acc, x) => ({...acc, [x.a]: x}), {});
//------------------------------------------------------------^
// (initial acc value)
But let's think about it for a second... What could possibly go wrong if you mutated acc? e.g.,
const foo = xs => xs.reduce((acc, x) => {
  acc[x.a] = x;
  return acc;
}, {});
Absolutely nothing.
The initial value of acc is an empty object literal created on the fly. Using the spread operator is only a "cosmetic" choice at this point. Both functions are pure.
Immutability is a trait, not a process per se. That means cloning data to achieve immutability is most likely both a naive and inefficient approach to it. Most people forget that the spread operator only does a shallow clone anyway!
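As a quick illustration of that shallow-clone pitfall:
const original = { nested: { count: 1 } };
const copy = { ...original };        // shallow clone: `nested` is shared
copy.nested.count = 2;               // mutates the object both share
console.log(original.nested.count);  // 2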
I wrote this article a little while ago where I claim that mutation and functional programming don't have to be mutually exclusive and I also show that using the spread operator isn't a trivial choice to make.
Creating a new object on every iteration is common practice, and sometimes recommended, despite any potential performance issues.
(EDIT:) I guess that is because if you want to give only one piece of general advice, copying is less likely to cause problems than mutating. The performance starts to become a "real" issue if you have more than, let's say, about 1000 iterations. (For more details see my update below.)
You can make your function pure in e.g. in this way:
const sortedCombinations = combinations.reduce(
  (accum, comb) => {
    return {
      ...accum,
      [comb.strength]: [
        ...(accum[comb.strength] || []),
        comb
      ]
    };
  },
  {}
);
Purity might become more important if your state and reducer are defined somewhere else:
const myReducer = (accum, comb) => {
  return {
    ...accum,
    [comb.strength]: [
      ...(accum[comb.strength] || []),
      comb
    ]
  };
};

const initialState = {};
const sortedCombinations = combinations.reduce( myReducer, initialState );
const otherSortedCombinations = otherCombinations.reduce( myReducer, initialState );
const otherThing = otherList.reduce( otherReducer, initialState );
Update (2021-08-22):
Preface to this update
As stated in the comments (and also mentioned in the question), copying on every iteration is of course less performant.
And I admit that in many cases, technically, I can't see any disadvantage in mutating the accumulator (if you know what you are doing!).
Actually, thinking about it again, inspired by the comments and other answers, I have changed my mind a bit, and will consider mutating more often now, at least where I don't see any risk that e.g. somebody else misunderstands my code later.
But then again, the question was explicitly about purity... anyway, here are some more details:
Purity
(Disclaimer: I must admit here that I know about React, but I don't know much about "the world of functional programming" and its arguments about the advantages, e.g. in Haskell.)
Using this "pure" approach is a tradeoff. You lose performance, and you win easier-to-understand and less coupled code.
E.g. in React, with many nested components, you can always rely on the consistent state of the current component. You know it will not be changed anywhere outside, except if you have passed down some 'onChange' callback explicitly.
If you define an object, you know for sure it will always stay unchanged. If you need a modified version, you make a new variable assignment; this way it is obvious that you are working with a new version of the data from here on down, and any code that might use the old object will not be affected:
const myObject = { a1: 1, a2: 2, a3: 3 }; // <-- stays unchanged
// ... much other code ...
const myOtherObject = modifySomehow( myObject ); // <-- new version of the data
Pros, Cons, and Caveats
I can't give general advice on which way (copy or mutate) is "the better one". Mutating is more performant, but it can cause lots of hard-to-debug problems if you aren't absolutely sure what's happening, at least in somewhat complex scenarios.
1. Problem with a non-pure reducer
As already mentioned in my original answer, a non-pure function might unintentionally change some outside state:
var initialValue = { a1: 1, a2: 2, a3: 3, a4: 4 };
var newKeys = [ 'n1', 'n2', 'n3' ];

var result = newKeys.reduce( (acc, key) => {
  acc[key] = 'new ' + key;
  return acc;
}, initialValue);

console.log( 'result:', result );             // We are interested in the 'result',
console.log( 'initialValue:', initialValue ); // but the initialValue has also changed.
Somebody might argue that you can copy the initial value beforehand:
var result = newKeys.reduce( (acc, key) => {
  acc[key] = 'new ' + key;
  return acc;
}, { ...initialValue }); // <-- copy beforehand
But this might be even less efficient in cases where e.g. the object is very big and nested, the reducer is called often, and there are maybe multiple conditionally applied small modifications inside the reducer, each changing only a little (think of useReducer in React, or the Redux reducer).
2. Shallow copies
Another answer stated correctly that even with the supposedly pure approach there might still be references to the original object. And this is indeed something to be aware of, but the problems arise only if you do not follow this 'immutable' approach consistently enough:
var initialValue = { a1: { value: '11' }, a2: { value: '22' } }; // <-- an object with nested 'non-primitive' values

var newObject = Object.keys(initialValue).reduce( (acc, key) => {
  return {
    ...acc,
    ['newkey_' + key]: initialValue[key], // <-- copies a reference to the original object
  };
}, {}); // <-- starting with empty new object, expected to be 'pure'

newObject.newkey_a1.value = 'new ref value'; // <-- changes the value of the reference
console.log( initialValue.a1 );              // <-- initialValue has changed as well
This is not a problem if care is taken that no references are copied (which might not be trivial sometimes):
var initialValue = { a1: { value: '11' }, a2: { value: '22' } };

var newObject = Object.keys(initialValue).reduce( (acc, key) => {
  return {
    ...acc,
    ['newkey_' + key]: { value: initialValue[key].value }, // <-- copies the value
  };
}, {});

newObject.newkey_a1.value = 'new ref value';
console.log( initialValue.a1 ); // <-- initialValue has not changed
3. Performance
Performance is no problem with a few elements, but if the object has several thousand items, it indeed becomes a significant issue:
// create a large object
var myObject = {}; for( var i = 0; i < 10000; i++ ){ myObject['key' + i] = i; }

// copying 10000 items takes seconds (increasing quadratically!)
// (creates a new object 10000 times, with 1, 2, 3, ..., 10000 properties respectively)
console.time('copy');
var result = Object.keys(myObject).reduce( (acc, key) => {
  return {
    ...acc,
    [key]: myObject[key] * 2
  };
}, {});
console.timeEnd('copy');

// mutating 10000 items takes milliseconds (increasing linearly)
console.time('mutate');
var result = Object.keys(myObject).reduce( (acc, key) => {
  acc[key] = myObject[key] * 2;
  return acc;
}, {});
console.timeEnd('mutate');

Design pattern to check if a JavaScript object has changed

I get from the server a list of objects
[{name:'test01', age:10},{name:'test02', age:20},{name:'test03', age:30}]
I load them into HTML controls for the user to edit.
Then there is a button to bulk save the entire list back to the database.
Instead of sending the whole list, I only want to send the subset of objects that were changed.
It can be any number of items in the array. I want to do something similar to frameworks like Angular, which mark an object property as "pristine" when no change has been made to it, and then use that flag to only post to the server the items that are not "pristine", i.e. the ones that were modified.
Here is a function below that will return an array/object of changed objects when supplied with an old and a new array/object of objects:
// intended to compare objects of identical shape; ideally static.
//
// any top-level key with a primitive value which exists in `previous` but not
// in `current` returns `undefined` while vice versa yields a diff.
//
// in general, the input type determines the output type. that is, if `previous`
// and `current` are objects then an object is returned. if arrays then an array
// is returned, etc.
const getChanges = (previous, current) => {
  if (isPrimitive(previous) && isPrimitive(current)) {
    if (previous === current) {
      return "";
    }
    return current;
  }

  if (isObject(previous) && isObject(current)) {
    const diff = getChanges(Object.entries(previous), Object.entries(current));

    return diff.reduce((merged, [key, value]) => {
      return {
        ...merged,
        [key]: value
      }
    }, {});
  }

  const changes = [];

  if (JSON.stringify(previous) === JSON.stringify(current)) {
    return changes;
  }

  for (let i = 0; i < current.length; i++) {
    const item = current[i];

    if (JSON.stringify(item) !== JSON.stringify(previous[i])) {
      changes.push(item);
    }
  }

  return changes;
};
For Example:
const arr1 = [1, 2, 3, 4]
const arr2 = [4, 4, 2, 4]

console.log(getChanges(arr1, arr2)) // [4,4,2]

const obj1 = {
  foo: "bar",
  baz: [
    1, 2, 3
  ],
  qux: {
    hello: "world"
  },
  bingo: "name-o",
}

const obj2 = {
  foo: "barx",
  baz: [
    1, 2, 3, 4
  ],
  qux: {
    hello: null
  },
  bingo: "name-o",
}

console.log(getChanges(obj1.foo, obj2.foo)) // barx
console.log(getChanges(obj1.bingo, obj2.bingo)) // ""
console.log(getChanges(obj1.baz, obj2.baz)) // [4]
console.log(getChanges(obj1, obj2)) // {foo:'barx',baz:[1,2,3,4],qux:{hello:null}}

const obj3 = [{ name: 'test01', age: 10 }, { name: 'test02', age: 20 }, { name: 'test03', age: 30 }]
const obj4 = [{ name: 'test01', age: 10 }, { name: 'test02', age: 20 }, { name: 'test03', age: 20 }]

console.log(getChanges(obj3, obj4)) // [{name:'test03', age:20}]
Utility functions used:
// not required for this example but they aid readability of the main function
const typeOf = o => Object.prototype.toString.call(o);
const isObject = o => o !== null && !Array.isArray(o) && typeOf(o).split(" ")[1].slice(0, -1) === "Object";

const isPrimitive = o => {
  switch (typeof o) {
    case "object": {
      return false;
    }
    case "function": {
      return false;
    }
    default: {
      return true;
    }
  }
};
You would simply have to export the full list of edited values client side, compare it with the old list, and then send the list of changes off to the server.
Hope this helps!
Here are a few ideas.
1. Use a framework. You spoke of Angular.
2. Use Proxies, though Internet Explorer has no support for them.
3. Instead of using classic properties, maybe use Object.defineProperty's set/get to achieve some kind of change tracking.
4. Use getter/setter functions to store data instead of properties: getName() and setName(), for example. Though this is the older way of doing what defineProperty now does.
5. Whenever you bind your data to your form elements, set a special property that indicates whether the property has changed. Something like __hasChanged. Set it to true if any property on the object changes.
6. The old-school brute-force way: keep your original list of data that came from the server, deep copy it into another list, bind your form controls to the new list, then when the user clicks submit, compare the objects in the original list to the objects in the new list, plucking out the changed ones as you go. Probably the easiest, but not necessarily the cleanest (see the sketch after this list).
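A minimal sketch of idea 6, assuming the data consists of plain, JSON-safe objects (serverData is a made-up name for the list from the server):
// deep copy the original (fine for plain data; loses functions, Dates, etc.)
const original = JSON.parse(JSON.stringify(serverData));

// ...bind form controls to serverData and let the user edit...

// on submit, pluck out the items that differ from their originals
function changedItems() {
  return serverData.filter(
    (item, i) => JSON.stringify(item) !== JSON.stringify(original[i])
  );
}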
A different take on #6: Attach a special property to each object that always returns the original version of the object:
var myData = [{name: "Larry", age: 47}];
var dataWithCopyOfSelf = myData.map(function(data) {
  return Object.assign({}, data, { original: data }); // note the return
});
// now bind your form to dataWithCopyOfSelf.
// now bind your form to dataWithCopyOfSelf.
Of course, this solution assumes a few things: (1) that your objects are flat and simple since Object.assign() doesn't deep copy, (2) that your original data set will never be changed, and (3) that nothing ever touches the contents of original.
There are a multitude of solutions out there.
With ES6 we can use Proxy
to accomplish this task: intercept an object write, and mark it as dirty.
Proxy allows us to create a handler object that can trap, manipulate, and then forward changes to the original target object, basically allowing us to reconfigure its behavior.
The trap we're going to adopt to intercept object writes is the set() handler.
At this point we can add a non-enumerable property flag, e.g. _isDirty, using Object.defineProperty() to mark our object as modified, dirty.
When using traps (in our case the handler's set()), no changes are applied or reflected to the objects; therefore we need to forward the argument values to the target object using Reflect.set().
Finally, to retrieve the modified objects, filter() the array of proxy objects in search of those having their own property "_isDirty".
// From server:
const dataOrg = [
  {id: 1, name: 'a', age: 10},
  {id: 2, name: 'b', age: 20},
  {id: 3, name: 'c', age: 30}
];

// Mirror data from server to observable Proxies:
const data = dataOrg.map(ob => new Proxy(ob, {
  set() {
    Object.defineProperty(ob, "_isDirty", {value: true}); // Flag
    return Reflect.set(...arguments); // Forward trapped args to ob
  }
}));

// From now on, use proxied data. Let's change some values:
data[0].name = "Lorem";
data[0].age = 42;
data[2].age = 31;

// Collect modified data
const dataMod = data.filter(ob => ob.hasOwnProperty("_isDirty"));

// Test what we're about to send back to server:
console.log(JSON.stringify(dataMod, null, 2));
Without using .defineProperty()
If for some reason you don't feel comfortable tapping into the original object by adding extra properties as flags, you could instead immediately populate dataMod (an array holding references to the modified objects):
const dataOrg = [
  {id: 1, name: 'a', age: 10},
  {id: 2, name: 'b', age: 20},
  {id: 3, name: 'c', age: 30}
];

// Prepare array to hold references to the modified Objects
const dataMod = [];

const data = dataOrg.map(ob => new Proxy(ob, {
  set() {
    if (dataMod.indexOf(ob) < 0) dataMod.push(ob); // Push reference
    return Reflect.set(...arguments);
  }
}));

data[0].name = "Lorem";
data[0].age = 42;
data[2].age = 31;

console.log(JSON.stringify(dataMod, null, 2));
Can I Use - Proxy (IE)
Proxy - handler.set()
Global Objects - Reflect
Reflect.set()
Object.defineProperty()
Object.hasOwnProperty()
Without having to get fancy with prototype properties, you could simply store items in another array whenever your form control element detects a change.
Something along the lines of:
var modified = [];
data.forEach(function(item){
  var domNode; // = whatever you use to match data to form control element
  domNode.addEventListener('input', function(){
    if (modified.indexOf(item) === -1){
      modified.push(item);
    }
  });
});
Then send the modified array to the server when it's time to save.
Why not use Ember.js observable properties? You can use the Ember.observer function to get and set changes in your data.
Ember.Object.extend({
  valueObserver: Ember.observer('value', function(sender, key, value, rev) {
    // Executes whenever the "value" property changes
    // See the addObserver method for more information about the callback arguments
  })
});
The Ember.Object actually does a lot of heavy lifting for you.
Once you define your object, add an observer like so:
object.addObserver('propertyKey', targetObject, targetAction)
My idea is to sort the object keys and serialize the object to a string for comparison:
// use this function to sort keys, and save key => value pairs in an array
function objectSerilize(obj) {
  let keys = Object.keys(obj)
  let results = []
  keys.sort((a, b) => a > b ? -1 : a < b ? 1 : 0)
  keys.forEach(key => {
    let value = obj[key]
    if (typeof value === 'object') {
      value = objectSerilize(value)
    }
    results.push({
      key,
      value,
    })
  })
  return results
}
// use this function to compare
function compareObject(a, b) {
  let aStr = JSON.stringify(objectSerilize(a))
  let bStr = JSON.stringify(objectSerilize(b))
  return aStr === bStr
}
This is what I came up with.
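For instance, a hypothetical usage check (key order deliberately differs between the two objects):
const before = { name: 'test01', age: 10 };
const after = { age: 10, name: 'test01' };
console.log(compareObject(before, after)); // true: key order doesn't matter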
It would be cleanest, I'd think, to have the object emit an event when a property is added, removed, or modified.
A simplistic implementation could involve an array with the object keys; whenever a setter (or, for that matter, the constructor) returns this, it first calls a static function that returns a promise resolving to a map of changed values in the array: things added, things removed, or neither. So one could call get('changed') or so forth, returning an array.
Similarly, every setter can emit an event with arguments for the initial value and the new value.
Assuming classes are used, you could easily have a static method in a generic parent class that can be called through its constructor, and so you could simplify most of this by passing the object either to itself, or to the parent through super(checkMeProperty).
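A minimal sketch of that changed-keys idea (the Tracked class and its method names are made up for illustration; it records changed keys in a set rather than emitting real events):
class Tracked {
  constructor(data) {
    this._data = { ...data };
    this._changed = new Set();
  }
  set(key, value) {
    if (this._data[key] !== value) {
      this._data[key] = value;
      this._changed.add(key); // remember which keys were touched
    }
    return this; // allow chaining, as described above
  }
  get(key) {
    // get('changed') returns the list of modified keys
    if (key === 'changed') return [...this._changed];
    return this._data[key];
  }
}

const person = new Tracked({ name: 'test01', age: 10 });
person.set('age', 11);
console.log(person.get('changed')); // ['age']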
