enhance: Maintain referential equality globally #403
Merged
Conversation
ntucker force-pushed the denormalize-cache branch from c80fd41 to d234fd3 on August 20, 2020 20:37
Size Change: +9.56 kB (9%) 🔍 Total Size: 105 kB
nickcherry reviewed Sep 4, 2020
ntucker force-pushed the denormalize-cache branch 2 times, most recently from 09b463e to 7872fb3 on September 15, 2020 18:51
ntucker force-pushed the denormalize-cache branch 3 times, most recently from fdc4a23 to 95aeee5 on January 12, 2021 08:06
ntucker force-pushed the denormalize-cache branch from 95aeee5 to da55507 on January 12, 2021 08:22
ntucker changed the title from "feat: Add WeakListMap data structure" to "feat: Use global entity+results cache" on Jan 12, 2021
ntucker changed the title from "feat: Use global entity+results cache" to "feat: Maintain referential equality globally" on Jan 12, 2021
ntucker changed the title from "feat: Maintain referential equality globally" to "enhance: Maintain referential equality globally" on Jan 12, 2021
ljharb reviewed Jan 12, 2021
Co-authored-by: Jordan Harband <ljharb@gmail.com>
ntucker force-pushed the denormalize-cache branch from 80b4a6f to acb376d on January 13, 2021 01:24
ljharb reviewed Jan 13, 2021
ljharb approved these changes Jan 13, 2021
LGTM but would be good to get more eyes on it if possible
Co-authored-by: Jordan Harband <ljharb@gmail.com>
Motivation
SSR
To make SSR easy, the cache itself should be easily serializable and deserializable. Currently the cache stores instances of Entities, which means any deserialization construct would have to know how to load the Entity classes. While this may be achievable, that amount of complexity while maintaining code splitting seems undesirable.
Entities nested in entities
Since the flat entities are stored as the results of fromJS() (normalized form), they simply contain the ids of their nested counterparts. This makes the typing somewhat odd, but more importantly it means that denormalizing them on the fly violates the referential-equality expectations of any entity. In the past this hasn't been a huge deal because we mostly don't use nested entities in retail.
Update Performance
Referential guarantees are currently provided by the useDenormalized() hook. Using useMemo() ensures concurrent mode compatibility, but it also means every component maintains its own distinct cache. This has several downsides: denormalization work is duplicated across components, and referential equality cannot be shared between them.
Solution
The denormalize function will take an extra optional argument: a denormalization cache. This will be used to short-circuit computation when the cached pieces already exist. It also means useDenormalized() will be simplified, and possibly removed later.
The denormalized cache will be provided via context, but maintains referential equality due to its mutable nature.
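The short-circuiting idea can be sketched as follows. This is illustrative only; the function and type names here are hypothetical, not the library's actual API. The cache maps a raw normalized record to the instance previously built from it, so repeated denormalizations of an unchanged record return the exact same reference:

```typescript
// Hypothetical sketch of cache-based short-circuiting.
type DenormalizeCache = WeakMap<object, object>;

function denormalizeWithCache(
  raw: object,                    // normalized entity record from state
  build: (raw: object) => object, // constructs the denormalized instance
  cache: DenormalizeCache,
): object {
  const cached = cache.get(raw);
  if (cached !== undefined) return cached; // short-circuit: same reference as before
  const instance = build(raw);
  cache.set(raw, instance);
  return instance;
}
```

Because the cache is a WeakMap keyed on the raw record itself, entries disappear automatically once the corresponding state is discarded.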
Design constraints
Since concurrent mode compatibility is necessary, extra care is needed when creating a global cache. Unlike component-level caches, which only need to retain the last value, a global cache must be able to answer lookups from disjoint state trees.
Hence, we need a mechanism to look up which cached result, if any, corresponds to the current state tree. Furthermore, once React has discarded a tree (and thus its state representation), the corresponding cache entries should be released for garbage collection.
This makes WeakMap an ideal supplementary data structure. However, there is an additional challenge in keying on a list of objects, which is outlined below.
Cache data structure
This structure mirrors the state cache itself, with the values of entities and results supplemented by a WeakMap.
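As a rough sketch (names and shapes here are assumptions for illustration, not the exact implementation), the entity side of such a cache could mirror the normalized state's entity key → primary key layout, with each leaf holding a WeakMap from the raw record to its denormalized instance:

```typescript
// Hypothetical cache shape mirroring the normalized state.
type EntityCache = Record<string, Record<string, WeakMap<object, object>>>;

// Lazily create the WeakMap leaf for a given entity key and primary key.
function getEntityWeakMap(
  cache: EntityCache,
  key: string,
  pk: string,
): WeakMap<object, object> {
  if (!cache[key]) cache[key] = {};
  if (!cache[key][pk]) cache[key][pk] = new WeakMap();
  return cache[key][pk];
}
```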
WeakListMap
Mapping a list of referential objects to a value is not actually provided by the WeakMap interface; a WeakMap can only key on a single object. To create the necessary data structure, we pull from Lisp's cons construct: we recursively apply a WeakMap to each entry in the list to achieve the appropriate mapping. This also resembles the trie data structure.
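The cons-style chaining can be sketched like this. It is a simplified WeakListMap, written under the assumptions above rather than copied from the PR's implementation: each key in the list indexes into the next WeakMap level, like a trie whose edges are weak references, so once any key in a chain is garbage collected the rest of that chain becomes unreachable too.

```typescript
// One node in the chain: weak edges to children, plus an optional stored value.
class Link<K extends object, V> {
  children = new WeakMap<K, Link<K, V>>();
  value?: V;
  hasValue = false;
}

// Maps a *list* of objects to a value by chaining WeakMaps recursively.
class WeakListMap<K extends object, V> {
  private root = new Link<K, V>();

  set(keys: K[], value: V): this {
    let cur = this.root;
    for (const key of keys) {
      let next = cur.children.get(key);
      if (!next) {
        next = new Link<K, V>();
        cur.children.set(key, next);
      }
      cur = next;
    }
    cur.value = value;
    cur.hasValue = true;
    return this;
  }

  get(keys: K[]): V | undefined {
    const link = this.traverse(keys);
    return link && link.hasValue ? link.value : undefined;
  }

  has(keys: K[]): boolean {
    const link = this.traverse(keys);
    return !!link && link.hasValue;
  }

  private traverse(keys: K[]): Link<K, V> | undefined {
    let cur: Link<K, V> | undefined = this.root;
    for (const key of keys) {
      cur = cur.children.get(key);
      if (!cur) return undefined;
    }
    return cur;
  }
}
```

Note that lookup is order-sensitive, just like a cons chain: the list [a, b] and the list [b, a] map to different entries.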
Open questions
Future work