
Added helper functions to support fine-grained control over the cache #36

Closed
vladwing wants to merge 1 commit into tkem:master from vladwing:master

Conversation

vladwing

Hi,

I found the following two use cases useful for method calls, and I think they might be useful for function calls as well:

  1. Invalidating only one entry in the cache.
  2. Manually setting a cache entry, given the parameters of the method call.

Here is an example of how I see it being used:

import operator

from cachetools import LRUCache, cachedmethod

class Cached(object):
    def __init__(self):
        self._cache = LRUCache(maxsize=100)

    @cachedmethod(operator.attrgetter('_cache'))
    def get_id(self, id):
        return expensive_call(id)

    def set_id(self, id, value):
        # proposed helper: manually store a value for the given arguments
        self.get_id.cache_set(self, value, id)

    def del_id(self, id):
        # proposed helper: invalidate the entry for the given arguments
        self.get_id.cache_invalidate(self, id)

Tell me what you think; if you decide to merge my commit, I will update the function decorator as well.

Cheers,
Vlad

@coveralls

Coverage Status

Coverage decreased (-3.64%) to 95.88% when pulling eb3226a on vladwing:master into 1daf0c7 on tkem:master.


@tkem
Owner

tkem commented Apr 18, 2015

I see what you mean, but for now I don't want to add to the interface, especially not to the decorators. I'd also prefer if users could implement something like this themselves, since the cache is already exposed. So the makekey function could probably also be exposed, and maybe it should even be possible to specify your own key function (I never liked the typed parameter anyway, so this could be replaced with something more generic).
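For illustration, a generic key function replacing the typed parameter could look like the following sketch (the key parameter on the decorator is hypothetical at this point in the discussion; later cachetools releases did add one):

def typed_key(*args, **kwargs):
    # Sketch of a generic replacement for the 'typed' parameter:
    # include each positional argument's type in the key, so that
    # e.g. get_id(3) and get_id(3.0) are cached separately.
    key = args + tuple(sorted(kwargs.items()))
    return key + tuple(type(arg) for arg in args)

# hypothetical usage, assuming the decorator accepted a key function:
# @cachedmethod(operator.attrgetter('_cache'), key=typed_key)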

@tkem
Owner

tkem commented Apr 18, 2015

Thinking a little more about this: if you could pass your own key function, and it were exposed as a property of the wrapper, one could also define a special-purpose key function for arguments which are not hashable and so cannot be used as cache keys by default.
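As a sketch of such a special-purpose key function (the name and behavior are illustrative, not part of cachetools at the time):

def hashable_key(*args, **kwargs):
    # Turn common unhashable containers into hashable equivalents
    # so they can be used as cache keys.
    def freeze(value):
        if isinstance(value, list):
            return tuple(freeze(v) for v in value)
        if isinstance(value, set):
            return frozenset(freeze(v) for v in value)
        if isinstance(value, dict):
            return tuple(sorted((k, freeze(v)) for k, v in value.items()))
        return value
    return tuple(freeze(arg) for arg in args) + tuple(
        sorted((k, freeze(v)) for k, v in kwargs.items()))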

@vladwing
Author

Having a make_key parameter to the decorator would be more general, but I still don't see how you can use it to invalidate single keys, unless you put a reference to the decorated function/method outside the wrapper, for example wrapper.original = method.

My idea started from the impossibility of accessing the original method (which becomes a function) outside the decorator. The helpers used to invalidate or manually set the value of a single key should therefore be defined in a context where that reference is available (see the sketch after this list). I don't see many things one can possibly do with a memoizing decorator:

  • use it directly to cache results based on parameters
  • get cache statistics (hits, misses, etc.)
  • invalidate a single cache entry
  • invalidate all cache entries
  • set a single cache entry
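
A sketch of what exposing the key function and the proposed helpers on the wrapper could look like (cache_invalidate, cache_set, and wrapper.original are the hypothetical names from this pull request, not cachetools API):

def cachedmethod(cache):
    # Simplified sketch of a cachedmethod decorator; 'cache' is a function
    # mapping self to the cache, e.g. operator.attrgetter('_cache').
    def decorator(method):
        def makekey(*args, **kwargs):
            return args + tuple(sorted(kwargs.items()))

        def wrapper(self, *args, **kwargs):
            c = cache(self)
            k = makekey(*args, **kwargs)
            try:
                return c[k]
            except KeyError:
                v = method(self, *args, **kwargs)
                c[k] = v
                return v

        def cache_invalidate(self, *args, **kwargs):
            # drop the single entry for these arguments, if present
            cache(self).pop(makekey(*args, **kwargs), None)

        def cache_set(self, value, *args, **kwargs):
            # manually store a value for these arguments
            cache(self)[makekey(*args, **kwargs)] = value

        wrapper.original = method  # access to the wrapped method
        wrapper.cache_invalidate = cache_invalidate
        wrapper.cache_set = cache_set
        return wrapper
    return decorator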

@tkem
Owner

tkem commented Apr 18, 2015

For the original method, there's already the __wrapped__ attribute.
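
For reference, functools.wraps sets this attribute automatically on Python 3.2 and later:

import functools

def cached(func):
    @functools.wraps(func)  # sets wrapper.__wrapped__ = func (Python 3.2+)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

# cached(f).__wrapped__ is f  ->  True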

@vladwing
Author

For Python 2.7 as well?

@tkem
Owner

tkem commented Apr 18, 2015

Good point, I didn't know this was only introduced in Python 3.2. So that's another reason for people to move to Python 3 ;-)

@tkem
Owner

tkem commented Aug 28, 2015

Thanks for your input; I think this has been solved (somewhat differently) with cachetools 1.1 and the new @cached and @cachedmethod decorators.
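
A sketch of how this looks with the newer decorators, assuming a later cachetools release where key functions are exposed via cachetools.keys (expensive_call is the stand-in from the original example):

import operator

from cachetools import LRUCache, cachedmethod
from cachetools.keys import hashkey  # key functions exposed in later releases

class Cached(object):
    def __init__(self):
        self._cache = LRUCache(maxsize=100)

    @cachedmethod(operator.attrgetter('_cache'), key=hashkey)
    def get_id(self, id):
        return expensive_call(id)

    def set_id(self, id, value):
        # The cache object and the key function are both exposed,
        # so a single entry can be set directly...
        self._cache[hashkey(id)] = value

    def del_id(self, id):
        # ...or invalidated, without extra decorator helpers.
        self._cache.pop(hashkey(id), None)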

@tkem tkem closed this Aug 28, 2015