Optionally measure size of cache by sum of length of values #1815

Merged

Merged 11 commits into develop on Jan 17, 2017


2 participants

@erikjohnston
Member

No description provided.

erikjohnston added some commits Jan 13, 2017
@erikjohnston Optionally measure size of cache by sum of length of values (2fae34b)
@erikjohnston Increase cache size limit (0152129)
erikjohnston added some commits Jan 16, 2017
@erikjohnston Add support for 'iterable' to ExpiringCache (46aebbb)
@erikjohnston Up cache max entries for state (897f875)
synapse/util/caches/lrucache.py
@@ -58,6 +58,18 @@ def __init__(self, max_size, keylen=1, cache_type=dict):
lock = threading.Lock()
+ def cache_len():
+ if size_callback is not None:
+ return sum(size_callback(node.value) for node in cache.itervalues())
@NegativeMjark (Contributor) Jan 16, 2017

This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.

@NegativeMjark (Contributor) Jan 16, 2017

If you want to be awful you can move the if statement outside the function def and define the function twice...
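Taken together, the two comments above can be sketched roughly as follows. This is a simplified, standalone Python 3 illustration, not synapse's actual code: the real LruCache wraps values in nodes and receives size_callback as a constructor argument. The trick is to hoist the `if` out of the function definition so the branch is decided once, at setup time, rather than on every call.

```python
# Sketch of the review suggestion (names mirror the snippet above,
# but cache and size_callback are stand-ins, not synapse's API).
cache = {}
size_callback = len  # assumption: measure each value by its length

if size_callback is not None:
    def cache_len():
        # Size is the sum of the callback over all cached values.
        return sum(size_callback(v) for v in cache.values())
else:
    def cache_len():
        # No callback: fall back to the plain entry count.
        return len(cache)

cache["a"] = [1, 2, 3]
cache["b"] = [4]
```

Note that this version still iterates the whole cache on every call, which is exactly the sub-optimality the reviewer flags; a later commit in this PR switches to a cached running total instead.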

synapse/util/caches/expiringcache.py
)
def __len__(self):
- return len(self._cache)
+ if self.iterable:
+ return sum(len(value.value) for value in self._cache.itervalues())
@NegativeMjark (Contributor) Jan 16, 2017

This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.

erikjohnston added some commits Jan 16, 2017
@erikjohnston Use OrderedDict in ExpiringCache (6d00213)
@erikjohnston Add ExpiringCache tests (f2f179d)
@erikjohnston Speed up cache size calculation (f85b6ca)

Instead of recalculating the size of the cache repeatedly, which can take a long time now that it can use a callback, cache the size and update it on insertion and deletion.

This requires changing the cache descriptors to have two caches, one for
pending deferreds and the other for the actual values. There's no reason
to evict from the pending deferreds as they won't take up any more
memory.
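The approach this commit message describes (keep a running size and adjust it on insert, replace, and delete) could look roughly like this. SizedCache is an illustrative name, not synapse's class; the real change lives in the LruCache/ExpiringCache internals.

```python
from collections import OrderedDict

class SizedCache:
    """Sketch of a cache with an O(1) cached size total."""

    def __init__(self, size_callback=None):
        self._cache = OrderedDict()
        # Without a callback, each entry counts as 1 (plain entry count).
        self._size_callback = size_callback or (lambda value: 1)
        self._size = 0

    def __setitem__(self, key, value):
        if key in self._cache:
            # Replacing: subtract the old value's contribution first.
            self._size -= self._size_callback(self._cache[key])
        self._cache[key] = value
        self._size += self._size_callback(value)

    def __delitem__(self, key):
        self._size -= self._size_callback(self._cache.pop(key))

    def __len__(self):
        # O(1): no iteration over the whole cache.
        return self._size
```

The design trade-off is bookkeeping on every mutation in exchange for a constant-time length, which matters once len() is polled frequently (e.g. for cache metrics).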
@erikjohnston Increase state_group_cache_size (d906206)
synapse/util/caches/lrucache.py
@synchronized
def cache_set_default(key, value):
node = cache.get(key, None)
if node is not None:
+ evict() # As the new node may be bigger than the old node.
@NegativeMjark (Contributor) Jan 17, 2017

This doesn't seem necessary if we aren't modifying the cache.
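A hypothetical miniature of the point being made: when the key is already present, set_default returns the existing value untouched, so the running size cannot have changed and an eviction pass on that path is redundant. All names here are illustrative, not synapse's API.

```python
from collections import OrderedDict

class MiniCache:
    """Toy LRU-ish cache with a size budget and a set_default."""

    def __init__(self, max_size, size_callback=len):
        self._cache = OrderedDict()
        self._max_size = max_size
        self._size_callback = size_callback
        self._size = 0

    def _evict(self):
        # Drop oldest entries until we are back under budget.
        while self._size > self._max_size and self._cache:
            _, old = self._cache.popitem(last=False)
            self._size -= self._size_callback(old)

    def set_default(self, key, value):
        existing = self._cache.get(key)
        if existing is not None:
            # Not modifying the cache: total size is unchanged,
            # so (per the review comment) no evict() is needed here.
            return existing
        self._cache[key] = value
        self._size += self._size_callback(value)
        self._evict()  # only a genuine insert can push us over budget
        return value
```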

tests/util/test_lrucache.py
@@ -128,7 +128,7 @@ def test_set(self):
m = Mock()
cache = LruCache(1)
- cache.set("key", "value", m)
+ cache.set("key", "value", [m])
@NegativeMjark (Contributor) Jan 17, 2017

Maybe use callbacks=[m] here?

erikjohnston added some commits Jan 17, 2017
@erikjohnston Rename and comment tree_to_leaves_iterator (d6c75cb)
@erikjohnston Tidy up test (9e8e236)
@NegativeMjark

LGTM

@erikjohnston merged commit d11d7cd into develop on Jan 17, 2017

0 of 5 checks passed

Sytest Dendron (Merged PR) Build triggered. sha1 is merged.
Sytest Postgres (Merged PR) Build triggered. sha1 is merged.
Sytest SQLite (Merged PR) Build triggered. sha1 is merged.
continuous-integration/travis-ci/pr The Travis CI build is in progress
continuous-integration/travis-ci/push The Travis CI build is in progress