React JS class component
I know there have been many posts on this subject but I can't seem to get this scenario to work.
Basically, in my HandleClickSave event I want to update an item in my object in state without affecting the other values, and then pass this updated object on to my service to get updated in the db.
The 'item' in question is the 'design' from the (unLayer) React-email-editor.
The problem is that after the service runs in 'HandleClickSave' (point 3 below), the receiving field 'DesignStructure' in the db is NULL every time. The other two fields are fine, as they are saved to the state object elsewhere.
Part of the problem is that the Email-Editor doesn't have an 'onChange' property, which is where I would normally update the state. The other two values in the object are text inputs, and they do have an onChange, which is how their state counterparts are updated.
This is the object 'NewsletterDesign':
{
"DesignId": "1",
"DesignName": "DesignLayout 1 Test",
"DesignStructure": null
}
In my React class component...
this.state = {
NewsletterDesign: {}
}
And the HandleClickSave event....
HandleClickSave () {
const { NewsletterDesign } = this.state
this.editor.saveDesign((design) => {
this.setState(prevState => ({
NewsletterDesign: {
...prevState.NewsletterDesign,
DesignStructure: design
}
}));
// Update to database passing in the object 'NewsletterDesign'. Field 'DesignStructure' in db is null every time, but other two fields are updated.
NewsletterService.UpdateCreateNewsletterDesign(NewsletterDesign)
etc....etc..
React's setState does not update the state immediately; you can read more about this in the React docs.
You can simply do the service call inside the setState updater:
this.setState(prevState => {
const newState = {
NewsletterDesign: {
...prevState.NewsletterDesign,
DesignStructure: design
}
};
NewsletterService.UpdateCreateNewsletterDesign(newState.NewsletterDesign);
return newState;
});
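Alternatively, setState accepts a callback as its second argument that React runs after the update has been applied. A minimal sketch, assuming the same editor and service as above:
this.editor.saveDesign((design) => {
    this.setState(
        prevState => ({
            NewsletterDesign: {
                ...prevState.NewsletterDesign,
                DesignStructure: design
            }
        }),
        // second argument: runs only after the state update has been applied
        () => NewsletterService.UpdateCreateNewsletterDesign(this.state.NewsletterDesign)
    );
});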
setState is an asynchronous operation, meaning the new state you have just set is not guaranteed to be accessible right after the call. You can read more in the React documentation.
So in such cases, one way is to build the new value first and then use the result in multiple places:
HandleClickSave () {
    const { NewsletterDesign } = this.state;
    this.editor.saveDesign((design) => {
        const newNewsletterDesign = {
            ...NewsletterDesign,
            DesignStructure: design
        };
        this.setState({ NewsletterDesign: newNewsletterDesign });
        NewsletterService.UpdateCreateNewsletterDesign(newNewsletterDesign);
    });
}
For example, could I iterate over Vuex data in a Vue file, choose the data needing updating, pass the found data to an action, which commits it, and then have the mutation only make the update?
The reason I'm unsure about this is that the typical format of a Vuex mutation includes the 'state' parameter, so I assume it needs to be used, and the only ways to do that are either to do all the looping inside the mutation, or to pass indexes to it so it can find the exact fields needing changing more quickly.
For those who asked, here is a code example:
someVueFile.vue
computed: {
...mapState({
arrayOfObjects: (state) => state.someVuexStore.arrayOfObjects
}),
},
methods: {
myUpdateMethod() {
let toBePassedForUpdate = null;
let newFieldState = "oneValue";
this.arrayOfObjects.forEach((myObject) => {
if (myObject.someDataField !== "oneValue") {
toBePassedForUpdate = myObject.someDataField;
}
})
if (toBePassedForUpdate) {
let passObject = {
updateThis: toBePassedForUpdate,
newFieldState: newFieldState
}
this.$store.dispatch("updateMyObjectField", passObject)
}
}
}
someVuexStore.js
const state = {
arrayOfObjects: [],
/* contains some object such as:
myCoolObject: {
someDataField: "otherValue"
}
*/
}
const mutations = {
updateMyObjectField(state, data) {
data.updateThis = data.newFieldState;
}
}
const actions = {
updateMyObjectField(state, data) {
state.commit("updateMyObjectField", data);
}
}
Yes, it's alright to mutate state passed in through the payload argument rather than state. Vuex doesn't bother to distinguish between the two. In either case, it's the same state, and neither option detracts from the purposes of using mutations.
To feel more sure of that, ask what the purposes of mutations, and of enforcing their use, actually are. The answer is to keep a centralized, trackable location for concretely defined changes to state.
To illustrate why this is a good thing, imagine an app with 1000 components, each one changing state locally, outside of a mutation, and in different ways. This could be a nightmare to debug or comprehend as a third party, because you don't know how or where state changes.
So mutations enforce how and a centralized where. Neither of these is damaged by only using the payload argument in a mutation.
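To make that concrete, these two mutations are equivalent from Vuex's point of view, assuming the payload carries a reference to an object that already lives in the store (the names here are illustrative):
const mutations = {
    // classic form: walk the state to find the target
    updateViaState(state, { index, newFieldState }) {
        state.arrayOfObjects[index].someDataField = newFieldState;
    },
    // payload form: the caller already found the target object in the store
    updateViaPayload(state, { targetObject, newFieldState }) {
        targetObject.someDataField = newFieldState;
    }
}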
I would do all of the logic from one action; you can destructure the context object in the action signature like so:
actions: {
    myAction ({ state, commit, getters, dispatch }, anyOtherParameter) {
        let myVar = getters.myGetter // use a getter to get your data
        // execute logic
        commit('myCommit', myVar) // commit the change
    }
}
If you need to do the logic in your component you can easily extract the getter and the logic from the action.
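For example, if the lookup has to stay in the component, it could look roughly like this (the names are illustrative):
computed: {
    ...mapGetters(['myGetter'])
},
methods: {
    myUpdateMethod() {
        // do the lookup here, then hand the result to the action
        const myVar = this.myGetter;
        this.$store.dispatch('myAction', myVar);
    }
}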
I've read multiple similar questions about this here and elsewhere, but I can't figure it out.
I have a form with mapGetters and input values that should update based on Vuex state:
...mapGetters({
show: "getShow"
}),
sample form input (I'm using Bootstrap Vue):
<b-form-input
id="runtime"
name="runtime"
type="text"
size="sm"
v-model="show.runtime"
placeholder="Runtime"
></b-form-input>
Then I have this method on the form component:
async searchOnDB() {
var showId = this.show.showId;
if (!showId) {
alert("Please enter a showId");
return;
}
try {
await this.$store.dispatch("searchShowOnDB", showId);
} catch (ex) {
console.log(ex);
alert("error searching on DB");
}
},
and this action on the store:
async searchShowOnDB({ commit, rootState }, showId) {
var response = await SearchAPI.searchShowOnDB(showId);
var show = {
show_start: response.data.data.first_aired,
runtime: response.data.data.runtime,
description: response.data.data.overview
};
//I'm updating the object since it could already contain something
var new_show = Object.assign(rootState.shows.show, show);
commit("setShow", new_show);
}
mutation:
setShow(state, show) {
Vue.set(state, "show", show);
}
searchAPI:
export default {
searchShowOnDB: function (showId) {
return axios.get('/search/?id=' + showId);
},
}
Everything works, the API call is executed, I can even see the Vuex updated state in Vue Devtools, but the form is not updated.
As soon as I write something in an input field or hit commit in Vue Devtools, the form fields show_start, runtime, description all get updated.
Also, this works correctly and updates everything:
async searchShowOnDB({ commit, rootState }, showId) {
var show = {
show_start: "2010-03-12",
runtime: 60,
description: "something"
};
//I'm updating the object since it could already contain something
var new_show = Object.assign(rootState.shows.show, show);
commit("setShow", new_show);
}
I don't know what else to do, I tried by resolving Promises explicitly, remove async/await and use axios.get(...).then(...), moving stuff around... nothing seems to work.
On line 15 of your /modules/search.js you're using Object.assign() on rootState.search.show. This mutates the search prop of the state (which is wrong, by the way: you should only mutate inside mutations!). Read below to see why.
And then you're attempting to trigger the mutation. But, guess what? Vue sees it's the same value, so no component is notified, because there was no change. This is why you should never mutate outside of mutations!
So, instead of assigning the value to the state in your action, just commit the new show (replace lines 15-16 with):
commit('setShow', show);
See it here: https://codesandbox.io/s/sharp-hooks-kplp7?file=/src/modules/search.js
This will completely replace state.show with show. If you only want to merge the response into current state.show (to keep some custom stuff you added to current show), you could spread the contents of state.show and overwrite with contents of show:
commit("setShow", { ...rootState.search.show, ...show });
Also note you don't need Vue.set() in your mutation. You have the state in the first parameter of any mutation and that's the state of the current module. So just assign state.show = show.
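Putting those two notes together, the action and mutation could be reduced to something like this (a sketch based on the snippets above):
// action
async searchShowOnDB({ commit, rootState }, showId) {
    const response = await SearchAPI.searchShowOnDB(showId);
    const show = {
        show_start: response.data.data.first_aired,
        runtime: response.data.data.runtime,
        description: response.data.data.overview
    };
    // merge into whatever is already in the store, without mutating it
    commit("setShow", { ...rootState.search.show, ...show });
}

// mutation
setShow(state, show) {
    state.show = show;
}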
And one last note: when your vuex gets bigger, you might want to namespace your modules, to avoid any name clashes.
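Namespacing is mostly a matter of adding namespaced: true to the module (a sketch; the module name search is an assumption):
// store/modules/search.js
export default {
    namespaced: true,
    state: () => ({ show: {} }),
    mutations: {
        setShow(state, show) {
            state.show = show;
        }
    },
    actions: {
        // ...
    }
};

// in a component, dispatch with the module prefix:
// this.$store.dispatch('search/searchShowOnDB', showId)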
All props of objects in state that are used in templates must exist up front, or you should call Vue.set when adding such properties:
state: {
show: {
runtime: null // <- add this line
}
},
You call Vue.set for the whole object, but it already exists in the state and you do not replace it with a new one, you just replace its props. In your case you have an empty object and add the 'runtime' prop to it using Object.assign.
Also all manipulations with state should be done in mutations:
var new_show = {
    runtime: response.data.data.runtime
};
commit("setShow", new_show);
...
mutations: {
setShow(state, new_show) {
Object.assign(state.show, new_show)
}
},
I have a table with an editable quantity column (screenshot omitted).
When I edit the quantity, this onChange handler:
onChange={this.handleInputChange.bind(null, cellInfo)}
runs the code below:
handleInputChange = (cellInfo, event) => {
let data = { ...this.props.Data };
data[cellInfo.index][cellInfo.column.id] = parseInt(event.target.value);
this.props.APISummaryData(data);
};
The goal is to first get the data in the store, then reflect the value I changed, and then update it with the action this.props.APISummaryData(data). Both this.props.APISummaryData(data) and this.props.APISummaryData({ ...data }) give the same state mutation error.
Here's the reducer
case types.API_SUMMARY_DATA:
return {
...state,
Summary: {
...state.Summary,
Data: action.Summary
}
};
If I manually dispatch an action from inside Redux DevTools, like:
{
type: 'API_SUMMARY_DATA',
Summary: [
{
cusip: '019I',
quantity: 55,
}
]
}
This is the action
export const APISummaryData = Summary => ({ type: types.API_SUMMARY_DATA, Summary });
I don't get any error and the data gets updated. I am puzzled: where in this scheme do I mutate the state?
Note: it is possible I am not sharing some code that's important to take a look here, so please let me know and I'll share it.
[screenshot of the exact error]
I assume that you're using configureStore() from Redux Starter Kit, which sets up a mutation checking middleware by default. Good! This means that the mutation checker is doing its job correctly.
These lines right here are mutating:
let data = { ...this.props.Data };
data[cellInfo.index][cellInfo.column.id] = parseInt(event.target.value);
That's because the {...} object spread operator does a shallow copy, not a deep copy. This is a very common mistake.
I personally would recommend dispatching an action that looks like:
{type: "API_SUMMARY_DATA", payload: {index, columnId, inputValue}}
and then use the reducer to do all the updating.
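A sketch of that reducer case, assuming Data is an array of row objects and the action carries { index, columnId, inputValue } as above:
case types.API_SUMMARY_DATA: {
    const { index, columnId, inputValue } = action.payload;
    return {
        ...state,
        Summary: {
            ...state.Summary,
            // copy the array and only replace the edited row
            Data: state.Summary.Data.map((row, i) =>
                i === index ? { ...row, [columnId]: inputValue } : row
            )
        }
    };
}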
Also, if you are using Redux Starter Kit, you can use our createReducer() function to write "mutative" code in the reducer that actually does immutable updates.
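With createReducer() that could look roughly like this (a sketch assuming the redux-starter-kit package; Immer turns the "mutation" into an immutable update under the hood):
import { createReducer } from "redux-starter-kit";

const initialState = { Summary: { Data: [] } }; // assumption: matches your store shape

const summaryReducer = createReducer(initialState, {
    [types.API_SUMMARY_DATA]: (state, action) => {
        const { index, columnId, inputValue } = action.payload;
        // looks like a mutation, but Immer produces a new immutable state
        state.Summary.Data[index][columnId] = inputValue;
    }
});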
I've currently got a situation whereby I need Redux actions to be run consecutively. I've taken a look at various middlewares, such as redux-promise, which seem to be fine if you know what the successive actions are at the point of the root (for lack of a better term) action being triggered.
Essentially, I'd like to maintain a queue of actions that can be added to at any point. Each object has an instance of this queue in its state and dependent actions can be enqueued, processed and dequeued accordingly. I have an implementation, but in doing so I'm accessing state in my action creators, which feels like an anti-pattern.
I'll try and give some context on use case and implementation.
Use Case
Suppose you want to create some lists and persist them on a server. On list creation, the server responds with an id for that list, which is used in subsequent API end points pertaining to the list:
http://my.api.com/v1.0/lists/ // POST returns some id
http://my.api.com/v1.0/lists/<id>/items // API end points include id
Imagine that the client wants to perform optimistic updates against these API endpoints, to enhance UX - nobody likes looking at spinners. So when you create a list, your new list instantly appears, with an option to add items:
+-------------+----------+
| List Name | Actions |
+-------------+----------+
| My New List | Add Item |
+-------------+----------+
Suppose that someone attempts to add an item before the response from the initial create call has made it back. The items API is dependent on the id, so we know we can't call it until we have that data. However, we might want to optimistically show the new item and enqueue a call to the items API so that it triggers once the create call is done.
A Potential Solution
The method I'm using to get around this currently is by giving each list an action queue - that is, a list of Redux actions that will be triggered in succession.
The reducer functionality for a list creation might look something like this:
case ADD_LIST:
return {
id: undefined, // To be filled on server response
name: action.payload.name,
actionQueue: []
}
Then, in an action creator, we'd enqueue an action instead of directly triggering it:
export const createListItem = (name) => {
return (dispatch) => {
dispatch(addList(name)); // Optimistic action
dispatch(enqueueListAction(name, backendCreateListAction(name)));
}
}
For brevity, assume the backendCreateListAction function calls a fetch API, which dispatches messages to dequeue from the list on success/failure.
The Problem
What worries me here is the implementation of the enqueueListAction method. This is where I'm accessing state to govern the advancement of the queue. It looks something like this (ignore this matching on name - this actually uses a clientId in reality, but I'm trying to keep the example simple):
const enqueueListAction = (name, asyncAction) => {
    return (dispatch, getState) => {
        const state = getState();
        dispatch(enqueue(name, asyncAction));
        const thisList = state.lists.find((l) => {
            return l.name === name;
        });
        // If there's nothing in the queue then process immediately
        if (thisList.actionQueue.length === 0) {
            asyncAction(dispatch);
        }
    };
};
Here, assume that the enqueue method returns a plain action that inserts an async action into the lists actionQueue.
The whole thing feels a bit against the grain, but I'm not sure if there's another way to go with it. Additionally, since I need to dispatch in my asyncActions, I need to pass the dispatch method down to them.
There is similar code in the method to dequeue from the list, which triggers the next action should one exist:
const dequeueListAction = (name) => {
    return (dispatch, getState) => {
        dispatch(dequeue(name));
        const state = getState();
        const thisList = state.lists.find((l) => {
            return l.name === name;
        });
        // Process next action if one exists.
        if (thisList.actionQueue.length > 0) {
            thisList.actionQueue[0].asyncAction(dispatch);
        }
    };
};
Generally speaking, I can live with this, but I'm concerned that it's an anti-pattern and there might be a more concise, idiomatic way of doing this in Redux.
Any help is appreciated.
I have the perfect tool for what you are looking for. When you need a lot of control over redux, (especially anything asynchronous) and you need redux actions to happen sequentially there is no better tool than Redux Sagas. It is built on top of es6 generators giving you a lot of control since you can, in a sense, pause your code at certain points.
The action queue you describe is what is called a saga. Now since it is created to work with redux these sagas can be triggered to run by dispatching in your components.
Since Sagas use generators you can also ensure with certainty that your dispatches occur in a specific order and only happen under certain conditions. Here is an example from their documentation and I will walk you through it to illustrate what I mean:
import { take, call } from 'redux-saga/effects'

function* loginFlow() {
while (true) {
const {user, password} = yield take('LOGIN_REQUEST')
const token = yield call(authorize, user, password)
if (token) {
yield call(Api.storeItem, {token})
yield take('LOGOUT')
yield call(Api.clearItem, 'token')
}
}
}
Alright, it looks a little confusing at first, but this saga defines the exact order in which a login sequence needs to happen. The infinite loop is allowed because of the nature of generators. When your code gets to a yield, it will stop at that line and wait. It will not continue to the next line until you tell it to. So look where it says yield take('LOGIN_REQUEST'). The saga will yield, or wait, at this point until you dispatch 'LOGIN_REQUEST', after which the saga will call the authorize method and go until the next yield. The next effect is an asynchronous yield call(Api.storeItem, {token}), so it will not go to the next line until that code resolves.
Now, this is where the magic happens. The saga will stop again at yield take('LOGOUT') until you dispatch LOGOUT in your application. This is crucial since if you were to dispatch LOGIN_REQUEST again before LOGOUT, the login process would not be invoked. Now, if you dispatch LOGOUT it will loop back to the first yield and wait for the application to dispatch LOGIN_REQUEST again.
Redux Sagas are, by far, one of my favorite tools to use with Redux. It gives you so much control over your application and anyone reading your code will thank you since everything now reads one line at a time.
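For completeness, wiring a saga into the store looks roughly like this (a sketch; the file paths and the loginFlow export are assumptions):
import { createStore, applyMiddleware } from 'redux'
import createSagaMiddleware from 'redux-saga'
import rootReducer from './reducers'
import { loginFlow } from './sagas' // assumes you export the generator above

const sagaMiddleware = createSagaMiddleware()
const store = createStore(rootReducer, applyMiddleware(sagaMiddleware))
sagaMiddleware.run(loginFlow)

// the saga then wakes up when a component dispatches the action it takes:
// store.dispatch({ type: 'LOGIN_REQUEST', user, password })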
Have a look at this: https://github.com/gaearon/redux-thunk
The id alone shouldn't go through the reducer. In your action creator (thunk), fetch the list id first, and then() perform a second call to add the item to the list. After this, you can dispatch different actions based on whether or not the addition was successful.
You can dispatch multiple actions while doing this, to report when the server interaction has started and finished. This will allow you to show a message or a spinner, in case the operation is heavy and might take a while.
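A sketch of that thunk flow (api.createList and api.addItem are hypothetical promise-returning helpers, and the action types are illustrative):
const createListAndAddItem = (listName, itemData) => (dispatch) => {
    // optimistic update so the UI responds immediately
    dispatch({ type: 'ADD_LIST_OPTIMISTIC', payload: { name: listName } });
    return api.createList({ name: listName })
        // only call the items endpoint once the id is known
        .then((list) => api.addItem(list.id, itemData))
        .then(
            (item) => dispatch({ type: 'ADD_ITEM_SUCCESS', payload: item }),
            (error) => dispatch({ type: 'ADD_ITEM_FAILURE', error })
        );
};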
A more in-depth analysis can be found here: http://redux.js.org/docs/advanced/AsyncActions.html
All credit to Dan Abramov
I was facing a similar problem to yours. I needed a queue to guarantee that optimistic actions were committed, or eventually committed (in case of network problems), to the remote server in the same sequential order they were created, or rolled back if that wasn't possible. I found that Redux alone falls short for this, basically because I believe it was not designed for it, and doing it with promises alone can be a really hard problem to reason about, besides the fact that you need to manage your queue state somehow... IMHO.
I think @Pcriulan's suggestion of using redux-saga was a good one. At first sight, redux-saga doesn't seem to provide anything to help you until you get to channels. These open a door to dealing with concurrency the way other languages do, CSP specifically (see Go or Clojure's core.async, for example), thanks to JS generators. There are even questions on why it's named after the Saga pattern and not CSP, haha... anyway.
So here is how a saga could help you with your queue:
// `actionChannel`, `take`, `call` and `put` come from 'redux-saga/effects';
// `delay`, `ConflictError` and `ConnectionError` are assumed to be defined elsewhere
import { actionChannel, take, call, put } from 'redux-saga/effects';

export default function* watchRequests() {
while (true) {
// 1- Create a channel for request actions
const requestChan = yield actionChannel('ASYNC_ACTION');
let resetChannel = false;
while (!resetChannel) {
// 2- take from the channel
const action = yield take(requestChan);
// 3- Note that we're using a blocking call
resetChannel = yield call(handleRequest, action);
}
}
}
function* handleRequest({ asyncAction, payload }) {
while (true) {
try {
// Perform action
yield call(asyncAction, payload);
return false;
} catch(e) {
if(e instanceof ConflictError) {
// Could be a rollback or syncing again with server?
yield put({ type: 'ROLLBACK', payload });
// Store is out of consistency so
// don't let waiting actions come through
return true;
} else if(e instanceof ConnectionError) {
// try again
yield call(delay, 2000);
}
}
}
}
So the interesting part here is how the channel acts as a buffer (a queue) which keeps "listening" for incoming actions but won't proceed with future actions until it finishes with the current one. You might need to go over their documentation in order to grasp the code better, but I think it's worth it. The channel-resetting part may or may not work for your needs.
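Usage is then just dispatching plain actions that carry the work for the saga to run (a sketch; saveList is a hypothetical API-calling function):
// somewhere in a component or action creator
dispatch({
    type: 'ASYNC_ACTION',
    asyncAction: saveList, // hypothetical function the saga will `call`
    payload: newList
});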
Hope it helps!
This is how I would tackle this problem:
Make sure each local list has a unique identifier. I'm not talking about the backend id here. Name is probably not enough to identify a list? An "optimistic" list not yet persisted should be uniquely identifiable, and a user may try to create 2 lists with the same name, even if it's an edge case.
On list creation, add a promise of backend id to a cache
CreatedListIdPromiseCache[localListId] = createBackendList({...}).then(list => list.id);
On item add, try to get the backend id from the Redux store. If it does not exist, then try to get it from CreatedListIdPromiseCache. The returned id must be resolved asynchronously, because the cache holds promises.
const getListIdPromise = (localListId,state) => {
// Get id from already created list
if ( state.lists[localListId] ) {
return Promise.resolve(state.lists[localListId].id)
}
// Get id from pending list creations
else if ( CreatedListIdPromiseCache[localListId] ) {
return CreatedListIdPromiseCache[localListId];
}
// Unexpected error
else {
return Promise.reject(new Error("Unable to find backend list id for list with local id = " + localListId));
}
}
Use this method in your addItem, so that your addItem will be delayed automatically until the backend id is available
// Create item, but do not attempt creation until we are sure to get a backend id
const backendListItemPromise = getListIdPromise(localListId,reduxState).then(backendListId => {
return createBackendListItem(backendListId, itemData);
})
// Provide user optimistic feedback even if the item is not yet added to the list
dispatch(addListItemOptimistic());
backendListItemPromise.then(
backendListItem => dispatch(addListItemCommit()),
error => dispatch(addListItemRollback())
);
You may want to clean the CreatedListIdPromiseCache, but it's probably not very important for most apps unless you have very strict memory usage requirements.
Another option would be for the backend id to be computed on the frontend, with something like a UUID. Your backend would just need to verify the uniqueness of this id. Thus you would always have a valid backend id for all optimistically created lists, even if the backend hasn't replied yet.
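A sketch of that last option, assuming the uuid package and the addList / createBackendList helpers used earlier in this thread:
import { v4 as uuidv4 } from 'uuid';

const createList = (name) => (dispatch) => {
    // the id is final from the start, so nothing downstream has to wait for it
    const list = { id: uuidv4(), name };
    dispatch(addList(list)); // optimistic update
    return createBackendList(list); // backend only has to check the id is unique
};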
You don't have to deal with queuing actions. It will hide the data flow and it will make your app more tedious to debug.
I suggest you to use some temporary ids when creating a list or an item and then update those ids when you actually receive the real ones from the store.
Something like this, maybe? (not tested, but you get the idea):
EDIT : I didn't understand at first that the items need to be automatically saved when the list is saved. I edited the createList action creator.
/* REDUCERS & ACTIONS */

import { combineReducers } from 'redux'
// `uniqueId`, `FakeListAPI` and `FakeListItemAPI` are placeholders for your own
// id generator and API modules
// this "thunk" action creator is responsible for :
// - creating the temporary list item in the store with some
// generated unique id
// - dispatching the action to tell the store that a temporary list
// has been created (optimistic update)
// - triggering a POST request to save the list in the database
// - dispatching an action to tell the store the list is correctly
// saved
// - triggering a POST request for saving items related to the old
// list id and triggering the correspondant receiveCreatedItem
// action
const createList = (name) => {
const tempList = {
id: uniqueId(),
name
}
return (dispatch, getState) => {
dispatch(tempListCreated(tempList))
FakeListAPI
.post(tempList)
.then(list => {
dispatch(receiveCreatedList(tempList.id, list))
// when the list is saved we can now safely
// save the related items since the API
// certainly need a real list ID to correctly
// save an item
const itemsToSave = getState().items.filter(item => item.listId === tempList.id)
for (let tempItem of itemsToSave) {
FakeListItemAPI
.post(tempItem)
.then(item => dispatch(receiveCreatedItem(tempItem.id, item)))
}
            })
}
}
const tempListCreated = (list) => ({
type: 'TEMP_LIST_CREATED',
payload: {
list
}
})
const receiveCreatedList = (oldId, list) => ({
type: 'RECEIVE_CREATED_LIST',
payload: {
list
},
meta: {
oldId
}
})
const createItem = (name, listId) => {
const tempItem = {
id: uniqueId(),
name,
listId
}
return (dispatch) => {
dispatch(tempItemCreated(tempItem))
}
}
const tempItemCreated = (item) => ({
type: 'TEMP_ITEM_CREATED',
payload: {
item
}
})
const receiveCreatedItem = (oldId, item) => ({
type: 'RECEIVE_CREATED_ITEM',
payload: {
item
},
meta: {
oldId
}
})
/* given this state shape :
state = {
lists: {
ids: [ 'list1ID', 'list2ID' ],
byId: {
'list1ID': {
id: 'list1ID',
name: 'list1'
},
'list2ID': {
id: 'list2ID',
name: 'list2'
},
}
...
},
items: {
ids: [ 'item1ID','item2ID' ],
byId: {
'item1ID': {
id: 'item1ID',
name: 'item1',
listID: 'list1ID'
},
'item2ID': {
id: 'item2ID',
name: 'item2',
listID: 'list2ID'
}
}
}
}
*/
// Here i'm using a immediately invoked function just
// to isolate ids and byId variable to avoid duplicate
// declaration issue since we need them for both
// lists and items reducers
const lists = (() => {
const ids = (ids = [], action = {}) => {
switch (action.type) {
// when receiving the temporary list
// we need to add the temporary id
// in the ids list
case 'TEMP_LIST_CREATED':
return [...ids, action.payload.list.id]
// when receiving the real list
// we need to remove the old temporary id
// and add the real id instead
case 'RECEIVE_CREATED_LIST':
return ids
.filter(id => id !== action.meta.oldId)
.concat([action.payload.list.id])
default:
return ids
}
}
const byId = (byId = {}, action = {}) => {
switch (action.type) {
// same as above, when the the temp list
// gets created we store it indexed by
// its temp id
case 'TEMP_LIST_CREATED':
return {
...byId,
[action.payload.list.id]: action.payload.list
}
// when we receive the real list we first
// need to remove the old one before
// adding the real list
case 'RECEIVE_CREATED_LIST': {
const {
[action.meta.oldId]: oldList,
...otherLists
} = byId
return {
...otherLists,
[action.payload.list.id]: action.payload.list
}
    }
    default:
      return byId
  }
}
return combineReducers({
ids,
byId
})
})()
const items = (() => {
const ids = (ids = [], action = {}) => {
switch (action.type) {
case 'TEMP_ITEM_CREATED':
return [...ids, action.payload.item.id]
case 'RECEIVE_CREATED_ITEM':
return ids
.filter(id => id !== action.meta.oldId)
.concat([action.payload.item.id])
default:
return ids
}
}
const byId = (byId = {}, action = {}) => {
switch (action.type) {
case 'TEMP_ITEM_CREATED':
return {
...byId,
[action.payload.item.id]: action.payload.item
}
case 'RECEIVE_CREATED_ITEM': {
const {
[action.meta.oldId]: oldList,
...otherItems
} = byId
return {
...otherItems,
[action.payload.item.id]: action.payload.item
}
}
// when we receive a real list
// we need to reappropriate all
// the items that are referring to
// the old listId to the new one
case 'RECEIVE_CREATED_LIST': {
const oldListId = action.meta.oldId
const newListId = action.payload.list.id
const _byId = {}
for (let id of Object.keys(byId)) {
let item = byId[id]
_byId[id] = {
...item,
listId: item.listId === oldListId ? newListId : item.listId
}
}
return _byId
    }
    default:
      return byId
  }
}
return combineReducers({
ids,
byId
})
})()
const reducer = combineReducers({
lists,
items
})
/* REDUCERS & ACTIONS */