

Optionally measure size of cache by sum of length of values #1815

Merged: 11 commits from erikj/iter_cache_size into develop on Jan 17, 2017

Conversation

erikjohnston (Owner) commented Jan 16, 2017

No description provided.

erikjohnston added some commits Jan 13, 2017

erikjohnston added some commits Jan 16, 2017

synapse/util/caches/lrucache.py
@@ -58,6 +58,18 @@ def __init__(self, max_size, keylen=1, cache_type=dict):
lock = threading.Lock()
+ def cache_len():
+ if size_callback is not None:
+ return sum(size_callback(node.value) for node in cache.itervalues())
NegativeMjark (Contributor) commented Jan 16, 2017

This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.
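The suggestion above can be sketched as follows. This is a hypothetical illustration, not the actual synapse code: a running total is updated on every insert and delete, so reading the size never has to iterate the cache.

```python
# Hypothetical sketch of incremental size tracking (names are illustrative,
# not from synapse). The running total is adjusted on set/pop, making
# __len__ O(1) instead of O(n).
class SizedCache:
    def __init__(self, size_callback=None):
        self._cache = {}
        self._size = 0                    # running total of reported sizes
        self._size_callback = size_callback

    def set(self, key, value):
        if key in self._cache:
            self.pop(key)                 # subtract the old entry's size first
        self._cache[key] = value
        if self._size_callback is not None:
            self._size += self._size_callback(value)

    def pop(self, key):
        value = self._cache.pop(key)
        if self._size_callback is not None:
            self._size -= self._size_callback(value)
        return value

    def __len__(self):
        # No iteration over the cache: just return the maintained total.
        if self._size_callback is not None:
            return self._size
        return len(self._cache)
```

This is essentially the direction the later "Speed up cache size calculation" commit takes: pay a small constant cost on each mutation rather than a full scan on each size query.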

NegativeMjark (Contributor) commented Jan 16, 2017

If you want to be awful you can move the if statement outside the function def and define the function twice...
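The "define the function twice" trick reads roughly like the sketch below (a hypothetical illustration, not the PR's code): the `size_callback is None` check runs once, up front, instead of on every call, and each resulting `cache_len` closure has no branch in its hot path.

```python
# Illustrative sketch: hoist the None check out of the function body by
# choosing one of two implementations at construction time.
def make_cache_len(cache, size_callback=None):
    if size_callback is not None:
        def cache_len():
            # Sum the callback-reported size of every value.
            return sum(size_callback(value) for value in cache.values())
    else:
        def cache_len():
            # Fall back to plain entry count.
            return len(cache)
    return cache_len
```

(The real diff iterates `node.value` objects; plain dict values are used here to keep the sketch self-contained.)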

synapse/util/caches/expiringcache.py
)
def __len__(self):
- return len(self._cache)
+ if self.iterable:
+ return sum(len(value.value) for value in self._cache.itervalues())
NegativeMjark (Contributor) commented Jan 16, 2017

This is probably sub-optimal since it iterates the entire cache. You will probably need to store the current size somewhere and update it as things are added or removed.

erikjohnston added some commits Jan 16, 2017

Speed up cache size calculation
Instead of recalculating the size of the cache on every query, which can
take a long time now that a callback may be involved, cache the size and
update it on insertion and deletion.

This requires changing the cache descriptors to have two caches, one for
pending deferreds and the other for the actual values. There's no reason
to evict from the pending deferreds as they won't take up any more
memory.
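The two-cache split described in this commit message can be sketched as below. This is an illustrative model with hypothetical names, not the actual descriptor code: completed results live in a size-limited cache, while in-flight lookups live in a plain dict that is never evicted, since a pending entry holds no result yet and costs almost nothing.

```python
from collections import OrderedDict

# Hypothetical sketch of the two-cache design: one bounded cache for
# finished values, one unbounded dict for pending computations.
class DescriptorCache:
    def __init__(self, max_size):
        self.max_size = max_size
        self.results = OrderedDict()   # finished values, evicted when over max_size
        self.pending = {}              # in-flight computations, never evicted

    def start(self, key, computation):
        # Record an in-flight lookup; it takes no space in the bounded cache.
        self.pending[key] = computation

    def complete(self, key, value):
        # Move the finished result into the bounded cache and evict as needed.
        self.pending.pop(key, None)
        self.results[key] = value
        while len(self.results) > self.max_size:
            self.results.popitem(last=False)  # drop the oldest entry
```

The point of the split is exactly what the commit message states: eviction pressure applies only to real values, never to pending entries.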
synapse/util/caches/lrucache.py
@synchronized
def cache_set_default(key, value):
node = cache.get(key, None)
if node is not None:
+ evict() # As the new node may be bigger than the old node.
NegativeMjark (Contributor) commented Jan 17, 2017

This doesn't seem necessary if we aren't modifying the cache.
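The review point can be illustrated with the hypothetical sketch below (not the PR's code): a setdefault-style operation only needs to trigger eviction when it actually inserts, because returning an already-present value cannot change the cache's size.

```python
# Illustrative sketch: evict only on the branch that modifies the cache.
# `evict` stands in for the cache's eviction routine.
def cache_set_default(cache, key, value, evict):
    if key in cache:
        return cache[key]  # no modification, so no eviction needed
    cache[key] = value
    evict()                # only an insert can push the cache over its limit
    return value
```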

tests/util/test_lrucache.py
@@ -128,7 +128,7 @@ def test_set(self):
m = Mock()
cache = LruCache(1)
- cache.set("key", "value", m)
+ cache.set("key", "value", [m])
NegativeMjark (Contributor) commented Jan 17, 2017

Maybe use callbacks=[m] here?

erikjohnston added some commits Jan 17, 2017

@erikjohnston erikjohnston merged commit d11d7cd into develop Jan 17, 2017

0 of 5 checks had passed at merge time: the Sytest Dendron, Sytest Postgres, and Sytest SQLite merged-PR builds had been triggered, and both Travis CI builds (pr and push) were still in progress.

@erikjohnston erikjohnston deleted the erikj/iter_cache_size branch Mar 29, 2017
