I currently have a large list of data (500 rows) in which many records can be updated each second. I'm using Firebase's Realtime Database with React and Redux, and whenever a record changes I fire a dispatch to update the state in my app. When many records are updated at once, the app slows down and almost crashes the browser.
I've narrowed my performance issues down to the app trying to dispatch 200+ actions at once, but since the updates arrive over websockets/Firebase, I have no way of receiving them in groups.
I am wondering if there is a library that will queue the dispatch requests and update the state one at a time, in order, instead of trying to do it all at once.
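I could hand-roll something like the sketch below, buffering updates from Firebase's child_changed listener and flushing them as one dispatch (the action type and one-second interval are arbitrary, and `ref`/`store` are my database reference and Redux store), but I'd prefer an established library if one exists:

```js
// Sketch: buffer incoming updates instead of dispatching one action per record
let buffer = [];

ref.on('child_changed', (snapshot) => {
  buffer.push({ key: snapshot.key, value: snapshot.val() });
});

// Flush everything at most once per second as a single dispatch
setInterval(() => {
  if (buffer.length > 0) {
    store.dispatch({ type: 'records/BATCH_UPDATE', payload: buffer });
    buffer = [];
  }
}, 1000);
```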
Are these issues by any chance occurring in development with Redux dev tools also running?
Redux is fairly optimised to handle large data sets (particularly if you normalise your data structure). However, if you are dispatching a large number of actions and also have a large amount of data in your Redux store, using Redux dev tools can give a somewhat false impression of poor performance.
In a production build of your application there would only ever be one instance of your Redux state at a particular moment - hence the first of the three Redux principles: single source of truth.
The difference when using Redux DevTools in development, however, is that the dev tools keep a history of your actions and of the state after every dispatch you trigger. That history can fill up large amounts of your browser's memory and so give the perception of poor performance.
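If the dev tools do turn out to be the cause, you can cap how much history they retain. A minimal sketch, assuming the `redux-devtools-extension` package and its `maxAge` option:

```js
import { createStore } from 'redux';
import { composeWithDevTools } from 'redux-devtools-extension';
import rootReducer from './reducers'; // your root reducer

// Keep only the last 50 actions in the DevTools history
const composeEnhancers = composeWithDevTools({ maxAge: 50 });

const store = createStore(rootReducer, composeEnhancers());
```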
You can also take a look at the Redux documentation on performance which has several further suggestions for how you can optimise your application.
If you would also like to show us how your data is structured in your reducer, or how you are handling your action dispatches, perhaps we can make further suggestions to improve your performance.
I couldn't find much information about my question despite a lot of googling, so maybe somebody has an answer.
I use Redux and redux-saga in my React Native app. I store a countdown timer's state in the store because I have to persist it.
The Redux store state is roughly 15 MB or more, because the user accumulates a lot of data while using the app. I noticed that the JS thread's frame rate drops sharply whenever an action is dispatched and the store state changes. I tried different sizes of data, and I believe the store size and the frame rate drop are connected.
Does anybody have the same experience, or a solution for this issue? I've checked a lot of performance blog posts about this but couldn't find a solution anywhere.
I am thinking about migrating to Zustand, where I can split the global state into separate little chunks, which may perform better. But the performance topic in the Redux docs discusses states much bigger than mine, so I have no idea at this point.
I also checked what renders on these actions in case that was the cause, but nothing re-renders unnecessarily.
So my conclusion is that returning the new state for some reason always generates a new large object, which can be slow when the state size is over 15-20 MB.
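To illustrate what I mean: as far as I understand, a reducer written with spread syntax only shallow-copies the top level and reuses references to the untouched branches, so in theory nothing large should be cloned. A made-up example of the shape I have in mind:

```js
// Made-up state shape: one tiny slice next to one very large slice
const initialState = {
  timer: { secondsLeft: 60 },
  bigData: { /* many MB of records */ },
};

function reducer(state = initialState, action) {
  switch (action.type) {
    case 'timer/tick':
      // Shallow copy: only the root object and `timer` are recreated;
      // `bigData` is reused by reference, so nothing large is cloned.
      return {
        ...state,
        timer: { ...state.timer, secondsLeft: state.timer.secondsLeft - 1 },
      };
    default:
      return state;
  }
}
```

If frame drops persist with updates like this, the cost may be coming from whatever serializes the store (persistence middleware, dev tools) rather than the reducer itself.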
I'm developing an application using React Native + Redux and Redux Thunk. In one of the actions, a large JSON payload is fetched from our server and then dispatched to the store. When the dispatch happens, the JS thread frame rate drops from 60 fps to 0 or 1 fps, so all touchables and buttons become unresponsive and it's impossible to navigate through the app for a couple of seconds, until the dispatch concludes and everything returns to normal.
We've already made sure that only the components that need this reducer's data re-render, but the problem persists.
The data we are downloading is JSON shaped like a Map that may hold hundreds or thousands of values.
Is there any way I can make React Native and Redux deal with this kind of data without dropping frames?
"[...], there's nothing inherently slow or inefficient about how Redux is implemented. In fact, React Redux in particular is heavily optimized to cut down on unnecessary re-renders [...]"
Shouldn't be a problem for react-redux. Try to optimize your overall component structure:
Implement the shouldComponentUpdate() method or use PureComponent to get away from pointless re-renders (see the sketch after this list). You could also consider designing a normalized Redux state.
Split components into many individual pieces before connecting them to the store.
Avoid overfetching and optimize lazy loading. You won't be able to display all of that data at once anyway.
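A minimal sketch of the first two points combined; the `Row` component and the `state.rows` shape are assumptions:

```js
import React from 'react';
import { Text } from 'react-native';
import { connect } from 'react-redux';

// PureComponent shallow-compares props, so a row only re-renders
// when its own record changes, not on every store update.
class Row extends React.PureComponent {
  render() {
    return <Text>{this.props.record.name}</Text>;
  }
}

// Connect each row individually and select only the slice it needs
const mapStateToProps = (state, ownProps) => ({
  record: state.rows[ownProps.id],
});

export default connect(mapStateToProps)(Row);
```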
This repository collects all the articles on the subject of 'Performance & Redux'. It's worth a visit ;)
I am currently dealing with a very large table of data. Some actions the user can take are quite complex - complex enough that I might end up dispatching a hundred actual actions to Redux, many of which depend on the state updated by previous actions, causing a hundred state changes. The result is that the page seems to lock up as numerous things in a very large table are rendered a hundred times, even though none of these intermediate updates is individually meaningful to the user (or they are actively confusing).
Is there a way to stop Redux/React from seeing these changes - to say "okay, don't bother pestering React about this stuff, don't recalculate any props, don't do anything but run this stuff through the reducers until I tell you it's done, and then return to normal behaviour"?
I know I could set some React state property and add shouldComponentUpdate to each of my many components, but I was hoping for a solution with less duplicate code scattered across dozens of files, and perhaps a bit more efficiency from not calling the same exact function dozens of times per update.
Any suggestions?
Dan Abramov himself wrote on Twitter an example of how to do this using batched actions and a higher-order reducer.
https://twitter.com/dan_abramov/status/656074974533459968?lang=en
The gist of the idea is to wrap the actions you want to batch inside another action, and to define a higher-order reducer (a function that takes a reducer and returns another reducer, e.g. redux-undo) that applies all of the wrapped actions in one pass when it handles the batch action.
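A minimal sketch of that pattern (the action type and helper names are my own):

```js
// Hypothetical action type and creator for a batch of actions
const BATCH = 'BATCHED_ACTIONS';

const batchActions = (actions) => ({ type: BATCH, payload: actions });

// Higher-order reducer: handles the batch by folding every wrapped
// action through the inner reducer, producing a single state change
// (and therefore a single notification to React).
const enableBatching = (reducer) => (state, action) =>
  action.type === BATCH
    ? action.payload.reduce(reducer, state)
    : reducer(state, action);

// Usage: wrap your root reducer, then dispatch many updates at once
// const store = createStore(enableBatching(rootReducer));
// store.dispatch(batchActions([action1, action2, action3]));
```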
I am currently using React version 15.0.1 in my web application. One feature needs to keep polling some information continuously, every 2 seconds. The response is a list of objects (700-1000 items) which we update and show in the React web application. The problem is that after some time the application becomes unresponsive and every operation takes too long. On profiling I found that render, batched updates, and event dispatch in React take the longest time. Is there a recommended way to get around this performance issue in React? The feature needs to refresh every 2 seconds and the list holds more than 1000 items each time.
The performance issue is observed in both IE and Chrome.
It's hard to tell without seeing your code - maybe you have a memory leak? You could try clearing references to your objects at the end of your methods so they can be garbage collected, e.g.:
listOfSomeObject = null; // drop the reference so the GC can reclaim the list
Here is a good article covering some methods for identifying and fixing memory leaks:
https://auth0.com/blog/four-types-of-leaks-in-your-javascript-code-and-how-to-get-rid-of-them/
I've worked with Redux/saga workflows on small projects based on this real-world example, but the logic in those is nowhere near as complex. How should I approach working with a more comprehensive API (e.g., Reddit's API) without making things overly verbose?
Do I make a const for every endpoint? E.g.,
export const fetchUser = login => callApi(`users/${login}`, userSchema)
Should I be worried about managing the entity cache?
Is there a way to further reduce complexity/boilerplate (e.g. by further grouping the get/put/post/delete request types for the same endpoint)?
Are there any examples out there that deal with APIs bigger/more complex than the real-world example?
I think the answer depends on how fluid you want your components to be.
I'm working on a large codebase using sagas; our pages are separated into "types", for example a "list" type, a "form" type, etc.
We have one saga responsible for fetching content, while each page component, when rendered, is responsible for supplying the endpoints.
This allows a very modular approach: to add a component you only need to touch one subsection of your file system.
Our pages are mostly a configuration file that contains all this information, and we use that configuration to render a "generic" component with the correct data.
Saga reusability
I see sagas as sequential processes: they can fetch data asynchronously, but they're also useful for anything that needs to be handled in sequence.
These "flows" are sometimes very similar across a codebase, and those are the ones you want to generalize.
Like you said, the most common operations are CRUD for any endpoint, and those can easily be grouped together.
Login is extremely different from loadUserList, and different things need to happen afterwards; however, loadUserList and loadRepoList are extremely similar.
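For instance, those similar fetch flows can collapse behind one factory; a minimal sketch, where the entity names, action types, and `api` module are assumptions:

```js
import { call, put, takeEvery } from 'redux-saga/effects';
import * as api from './api'; // hypothetical module exposing fetchUsers, fetchRepos

// Factory: builds the same fetch flow for any entity
function makeFetchSaga(entity, apiFn) {
  return function* fetchSaga(action) {
    try {
      const data = yield call(apiFn, action.payload);
      yield put({ type: `${entity}/FETCH_SUCCESS`, payload: data });
    } catch (err) {
      yield put({ type: `${entity}/FETCH_FAILURE`, error: err.message });
    }
  };
}

// loadUserList and loadRepoList become one-liners
export function* rootSaga() {
  yield takeEvery('users/FETCH_REQUEST', makeFetchSaga('users', api.fetchUsers));
  yield takeEvery('repos/FETCH_REQUEST', makeFetchSaga('repos', api.fetchRepos));
}
```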
Things that impact reusability
Your ability to control your APIs: if you can dictate the shape of the API you consume, you can get away with even more generalization on the front end.
The shape of the application (front-end-wise): are your pages strangely dependent on one another's state? For example, it's not uncommon for insurance programs to have forms that link to one another: you can fill in the first three forms in any order you want, but only once all three are complete does the fourth unlock.
Each of these dependencies will normally have its own saga that controls the flow of your user story.
Does your application require syncing? You can easily create sagas that automatically sync data with your different endpoints and update your Redux state. There is much to consider here, including whether to interrupt the user with new data (we might want to let them know that the form they are editing has outdated data). Syncs require a distinct saga because there are usually various business rules about when to sync what data; if the rules are very different, this can force you to create multiple sagas. A minimal polling sketch follows this list.
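Here is that sketch, assuming a hypothetical `api.fetchUpdates` endpoint and a 30-second interval:

```js
import { call, put, delay } from 'redux-saga/effects';
import * as api from './api'; // hypothetical module

// Poll the server periodically and push fresh data into the store
function* syncSaga() {
  while (true) {
    try {
      const data = yield call(api.fetchUpdates); // hypothetical endpoint
      yield put({ type: 'sync/SUCCESS', payload: data });
    } catch (err) {
      yield put({ type: 'sync/FAILURE', error: err.message });
    }
    yield delay(30 * 1000); // your business rules decide the interval
  }
}
```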
Common Sagas that can be unified
UserSagas - login, logout.
FetchData - fetch a single record or a collection.
DeleteData - delete a single record or a collection of IDs.
Data Syncing - update your local data from a remote source periodically.
Regarding the entity cache
"Entity cache" is just the name they picked, but this goes back to the points mentioned before.
Does your application run on stale data, or do you fetch from the server every time your component is loaded?
If the data is only fetched once, you'll store it in a kind of cache (which is basically what the Redux store is). If showing stale data is acceptable for your application, this is the way to go.