Cohars - 2 months ago

React/Redux performance with big states

I'm dealing with a big JSON document with a lot of editable values (big meaning > 1000), entirely rendered on the same page, so my state is simply { data: bigBigJson }.

The initial rendering is quite long, but that's OK.

The problem is that when an input triggers an onChange (and a Redux action), the value is updated in the state, and the whole rendering happens again.

I wonder how people deal with that. Are there simple solutions (even if not necessarily best practices)?

Notes:


  • The JSON document is provided by an external API; I can't change it

  • I could separate the state into several sub-states (it's a multi-level JSON document), but I'm hoping for a simpler/faster solution (I know that would probably be a best practice, though)

  • I'm using React and Redux, not Immutable.js, but everything is immutable (obviously)



––

Update (about DDS's answer)

• (Case 1) Let's say the state is:

{
  data: {
    key1: value1,
    // ...
    key1000: value1000
  }
}


If keyN is updated, the whole state would be re-rendered anyway, right? The reducer would return something like:

{
  data: {
    ...state.data,
    keyN: newValueN
  }
}


That's one thing but it's not really my case.
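For reference, the spread in that reducer creates a new data container, but every value that was not replaced keeps its old object identity, which is exactly what React-Redux's reference checks key off. A minimal sketch with hypothetical values:

```javascript
// Sketch (hypothetical values): the spread creates a new `data` object,
// but untouched entries keep their identity.
const value1 = { label: 'a' }
const value2 = { label: 'b' }
const state = { data: { key1: value1, key2: value2 } }

// Reducer result when only key2 changes:
const next = { data: { ...state.data, key2: { label: 'B' } } }

console.log(next.data === state.data)           // false: new container
console.log(next.data.key1 === state.data.key1) // true: key1 is reused
```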

• (Case 2) The state is more like this (oversimplified):

{
  data: {
    dataSet1: {
      key1: value1,
      // ...
      key10: value10
    },
    // ...
    dataSet100: {
      key1: value1,
      // ...
      key10: value10
    }
  }
}


If dataSetN.keyN is updated, I would return this in the reducer:

{
  data: {
    ...state.data,
    dataSetN: {
      ...state.data.dataSetN,
      keyN: newValueN
    }
  }
}


I guess I'm doing something wrong, as it doesn't look very nice.
Would it change anything to structure it like this instead:

// state
{
  dataSet1: {
    key1: value1,
    // ...
    key10: value10
  },
  // ...
  dataSet100: {
    key1: value1,
    // ...
    key10: value10
  }
}

// reducer
{
  ...state,
  dataSetN: {
    ...state.dataSetN,
    keyN: newValueN
  }
}
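Either way, what decides a re-render in React-Redux is a shallow comparison of the props returned by mapStateToProps: if each dataSet is connected through its own container component, only the one whose reference changed gets new props. A minimal sketch (not React-Redux's actual source; the names are hypothetical):

```javascript
// Minimal sketch of the shallow-equality check React-Redux applies to
// the props returned by mapStateToProps (not the library's real code).
function shallowEqual(a, b) {
  const keysA = Object.keys(a)
  const keysB = Object.keys(b)
  if (keysA.length !== keysB.length) return false
  return keysA.every(key => a[key] === b[key])
}

// Hypothetical per-dataSet mapStateToProps:
const mapStateToProps = (state, ownProps) => ({ dataSet: state[ownProps.id] })

const state = { dataSet1: { key1: 1 }, dataSet2: { key1: 2 } }
const next = { ...state, dataSet2: { ...state.dataSet2, key1: 20 } }

// Only the component connected to dataSet2 sees a prop change:
console.log(shallowEqual(
  mapStateToProps(state, { id: 'dataSet1' }),
  mapStateToProps(next,  { id: 'dataSet1' }))) // true → no re-render
console.log(shallowEqual(
  mapStateToProps(state, { id: 'dataSet2' }),
  mapStateToProps(next,  { id: 'dataSet2' }))) // false → re-render
```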


Finally, just to be more specific about my case, here is closer to what my reducer actually looks like (still a bit simplified):

import get from 'lodash/fp/get'
import set from 'lodash/fp/set'
// ...
// reducer:
// path = 'values[3].values[4].values[0]'
return {
  data: set(path, {
    ...get(path, state.data),
    value: newValue
  }, state.data)
}


• In case you are wondering, I can't just use:

data: set(path + '.value', newValue, state.data)


as other properties need to be updated as well.
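For what it's worth, lodash/fp's set already does structural sharing: as far as I know it clones only the objects along the updated path and reuses every untouched branch, so sibling dataSets keep their identity. A hand-rolled mini version of that idea (path given as an array, objects only, for illustration):

```javascript
// Hypothetical mini version of an immutable set: clone along the path,
// reuse every branch that is not on the path.
function setIn(obj, [head, ...rest], value) {
  return {
    ...obj,
    [head]: rest.length ? setIn(obj[head], rest, value) : value,
  }
}

const state = {
  dataSet1: { key1: 1, key2: 2 },
  dataSet2: { key1: 3, key2: 4 },
}
const next = setIn(state, ['dataSet1', 'key2'], 20)

console.log(next.dataSet1 === state.dataSet1) // false: on the updated path
console.log(next.dataSet2 === state.dataSet2) // true: untouched branch reused
```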

DDS
Answer

The reason everything gets rerendered is because everything in your store changes. It may look the same. All properties may have the same values. But all object references have changed. That is to say that even if two objects have the same properties, they still have separate identities.

Since React-Redux uses object identity to figure out whether an object has changed, you should always use the same object reference whenever an object has not changed. Since Redux state must be immutable, reusing the old object in the new state is guaranteed not to cause problems. Immutable objects can be reused the same way an integer or a string can be reused.

To solve your dilemma, in your reducer you can walk the incoming JSON and the store state's sub-objects and compare them; whenever they are equal, keep the store's object. By reusing the same reference, React-Redux will make sure the components that represent those objects are not re-rendered. This means that if only one of those 1000 objects changes, only one component will update.
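That comparison step could be sketched like this (a hedged sketch, assuming items keyed by ID and plain API JSON; JSON.stringify is a blunt but adequate deep-equal for that case):

```javascript
// Sketch: when a fresh JSON payload arrives, keep the previous reference
// for every item that is deep-equal to the incoming one, so React-Redux's
// identity check treats it as unchanged.
function mergeKeepingIdentity(oldItems, newItems) {
  const merged = {}
  for (const id of Object.keys(newItems)) {
    const prev = oldItems[id]
    merged[id] =
      prev !== undefined && JSON.stringify(prev) === JSON.stringify(newItems[id])
        ? prev          // unchanged: reuse the old reference
        : newItems[id]  // changed or new: take the incoming object
  }
  return merged
}

const oldItems = { a: { value: 1 }, b: { value: 2 } }
const newItems = { a: { value: 1 }, b: { value: 99 } }
const merged = mergeKeepingIdentity(oldItems, newItems)

console.log(merged.a === oldItems.a) // true: identity preserved
console.log(merged.b === newItems.b) // true: updated item replaced
```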

Also make sure to use the React key property correctly. Each of those 1000 items needs its own ID that stays the same from JSON to JSON.

Finally, consider making your state itself more amenable to such updates. You could transform the JSON when loading and updating the state. You could store the items keyed by ID for instance which would make the update process a lot faster.
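As a sketch of that last suggestion (assuming each item carries an id field, which the question doesn't state): index the incoming array once at load time, and a single-item update becomes a constant-time key access instead of a search through nested arrays.

```javascript
// Sketch: normalize an array of items into an object keyed by id
// (assumes each item has an `id`; the shape here is hypothetical).
const byId = items => Object.fromEntries(items.map(item => [item.id, item]))

const items = [{ id: 'x1', value: 1 }, { id: 'x2', value: 2 }]
const state = byId(items)

// Updating one item is now a direct key access:
const next = { ...state, x2: { ...state.x2, value: 20 } }
console.log(next.x1 === state.x1) // true: only x2 changed
```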