
Limit discovery cache by size #16

Merged
8 commits merged into master on Aug 24, 2020

Conversation

golanha
Member

@golanha golanha commented Aug 19, 2020


@golanha golanha requested a review from yehiyam August 19, 2020 14:00
Contributor

@yehiyam yehiyam left a comment


Reviewed 5 of 5 files at r1.
Reviewable status: all files reviewed, 6 unresolved discussions (waiting on @golanha)


hkube_python_wrapper/cache/caching.py, line 2 at r1 (raw file):

import datetime
import msgpack

Why do you use msgpack and not the configured encoding?
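A sketch of the serialize-to-measure idea behind this exchange, using stdlib `pickle` as a stand-in for msgpack (the helper name is assumed, not the PR's actual code):

```python
import pickle

def estimated_size(value):
    # Bytes-like values are measured directly; anything else is serialized
    # just to obtain a byte count. The serialized form is discarded, which
    # is why the exact codec (msgpack, pickle, ...) does not need to match
    # the configured storage encoding.
    if isinstance(value, (bytes, bytearray)):
        return len(value)
    return len(pickle.dumps(value))
```

This matches the author's reply below: the encoding only produces a size estimate, not the stored representation.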


hkube_python_wrapper/cache/caching.py, line 5 at r1 (raw file):

import hkube_python_wrapper.util.type_check as typeCheck

class CustomCache:

Maybe just Cache?


hkube_python_wrapper/cache/caching.py, line 20 at r1 (raw file):

            return None
        while (self.sumSize + size) >= self._maxCacheSize * 1000 * 1000:
            if (self._cache.keys()):

shouldn't it be `if not self._cache.keys()`? You mean to check whether the cache is empty
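For context, a minimal sketch of the size-bounded eviction being discussed (class, attribute, and method names are illustrative, not the PR's exact code):

```python
from collections import OrderedDict

class Cache:
    """Sketch of a cache capped by total byte size (assumed implementation)."""

    def __init__(self, max_cache_size_mb):
        self._max_bytes = max_cache_size_mb * 1000 * 1000
        self._cache = OrderedDict()
        self.sum_size = 0

    def update(self, key, value, size):
        if size >= self._max_bytes:
            return None  # a single item larger than the whole cache is rejected
        # Evict oldest entries until the new item fits.
        while self.sum_size + size >= self._max_bytes:
            if not self._cache:  # the empty-cache check this review comment is about
                break
            _, (old_size, _) = self._cache.popitem(last=False)
            self.sum_size -= old_size
        self._cache[key] = (size, value)
        self.sum_size += size
        return value

    def get(self, key):
        entry = self._cache.get(key)
        return entry[1] if entry else None
```

With the un-negated condition from r1, the loop would only pop entries while the cache was non-empty but never terminate the loop when it emptied, which is the bug the comment flags.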


hkube_python_wrapper/config/config.py, line 28 at r1 (raw file):

    "timeout": getIntEnv('DISCOVERY_TIMEOUT', 10000),
    "networkTimeout": getIntEnv('DISCOVERY_NETWORK_TIMEOUT', 1000),
    "maxCacheSize": getIntEnv('DISCOVERY_MAX_CACHE_SIZE_MB', 1500)

Isn't that too large as a default?
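For illustration, a minimal version of what a helper like `getIntEnv` presumably does; this implementation is assumed, not taken from the PR:

```python
import os

def get_int_env(name, default):
    # Read an integer configuration value from the environment,
    # falling back to the default when unset or malformed.
    try:
        return int(os.environ[name])
    except (KeyError, ValueError):
        return default

# e.g. the discovery cache cap discussed here, in megabytes
max_cache_mb = get_int_env('DISCOVERY_MAX_CACHE_SIZE_MB', 400)
```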


hkube_python_wrapper/config/config.py, line 39 at r1 (raw file):

    "mode": os.environ.get('STORAGE_PROTOCOL', 'v2'),
    "encoding": os.environ.get('STORAGE_ENCODING', 'bson'),
    "maxCacheSize": getIntEnv('STORAGE_MAX_CACHE_SIZE_MB', 1500),

Together with this cache it is 3GB of cache.


hkube_python_wrapper/wrapper/data_adapter.py, line 198 at r1 (raw file):

        return self._storageCache.get(path)

    def _setToCache(self, path, data, size):

maybe add @timing to this method to see if the encoding takes too long

Member Author

@golanha golanha left a comment


Reviewable status: all files reviewed, 6 unresolved discussions (waiting on @timing and @yehiyam)


hkube_python_wrapper/cache/caching.py, line 2 at r1 (raw file):

Previously, yehiyam wrote…

Why do you use msgpack and not the configured encoding?

It doesn't really matter which encoding I use, since the value isn't stored in encoded form anyway.


hkube_python_wrapper/cache/caching.py, line 5 at r1 (raw file):

Previously, yehiyam wrote…

Maybe just Cache?

refactored


hkube_python_wrapper/cache/caching.py, line 20 at r1 (raw file):

Previously, yehiyam wrote…

shouldn't it be `if not self._cache.keys()`? You mean to check whether the cache is empty

Right, Done.


hkube_python_wrapper/config/config.py, line 28 at r1 (raw file):

Previously, yehiyam wrote…

Isn't it too large as default?

Changed to 400 MB.


hkube_python_wrapper/config/config.py, line 39 at r1 (raw file):

Previously, yehiyam wrote…

Together with this cache it is 3GB of cache.

Changed to 400 MB for each cache.


hkube_python_wrapper/wrapper/data_adapter.py, line 198 at r1 (raw file):

Previously, yehiyam wrote…

maybe add @timing to this method to see if the encoding takes too long

Done.

Contributor

@yehiyam yehiyam left a comment


Reviewed 7 of 7 files at r2.
Reviewable status: all files reviewed, 1 unresolved discussion (waiting on @golanha)


hkube_python_wrapper/cache/caching.py, line 19 at r2 (raw file):

                size = len(value)
            else:
                size = asizeof(value)

fix
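For illustration, a sketch of the sizing logic under discussion; `sys.getsizeof` is used here as a shallow stdlib stand-in for pympler's deep `asizeof`, and the helper name is assumed:

```python
import sys

def value_size(value):
    # Bytes-like payloads: the length is the natural size.
    # Other objects: estimate the in-memory footprint. The PR uses
    # pympler's asizeof for a deep (recursive) measurement, whereas
    # sys.getsizeof only counts the object itself.
    if isinstance(value, (bytes, bytearray)):
        return len(value)
    return sys.getsizeof(value)
```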

Contributor

@yehiyam yehiyam left a comment


:lgtm:

Reviewed 1 of 1 files at r3, 1 of 1 files at r4.
Reviewable status: :shipit: complete! all files reviewed, all discussions resolved

@yehiyam yehiyam merged commit f1cf239 into master Aug 24, 2020
@yehiyam yehiyam deleted the limit_by_size branch August 24, 2020 15:45
hkube-ci pushed a commit that referenced this pull request Aug 24, 2020
* limit discovery cache by size

* pylint

* pylint

* cache for storage as well as discovery

* cr

* use size of

* sizeof.sizeof

* fix lint .... bump version [skip ci]