
Supply cache object to decorator #567

Closed
padraic-shafer opened this issue Apr 21, 2022 · 4 comments

Comments

padraic-shafer (Contributor) commented Apr 21, 2022

Is it possible to supply a cache object to the decorator @cached?

It seems like there are two separate tracks for using aiocache:

  1. Create a cache with the Cache() constructor. This allows inspection and manual reuse of the cache. In particular, this is the track used in the example for plugins, such as inspecting the hit_miss_ratio stats on a HitMissRatioPlugin.
  2. Decorate a coroutine with @cached. This track accepts a class type for a cache rather than a cache object. This nominally allows plugins, and indeed the generated cache object receives and contains the list of plugins. However, the generated cache does not appear to implement the plugin functionality. For example, after passing a HitMissRatioPlugin() object to the plugins parameter of @cached, the cache injected into the coroutine does not have a member called hit_miss_ratio.

Is it possible to pass a cache object, rather than a cache class, to the @cached decorator? This would allow the same cache to be shared between multiple function calls. It also seems helpful for utilizing plugin functionality.

On the flip side: Is there a driving force for supplying a cache class (rather than a cache object) to the @cached decorator?

Or am I missing something: does this functionality already exist and I am simply not using it correctly?
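To make the class-vs-object distinction concrete, here is a generic sketch (plain Python, not aiocache's API; all names are made up for illustration) of a decorator that accepts a ready-made cache object, so two decorated coroutines can share one cache and its stats stay inspectable:

```python
import asyncio


class DictCache:
    """Minimal stand-in for a cache object such as aiocache's Cache()."""

    def __init__(self):
        self.store = {}
        self.hits = 0
        self.misses = 0

    async def get(self, key):
        if key in self.store:
            self.hits += 1
            return self.store[key]
        self.misses += 1
        return None

    async def set(self, key, value):
        self.store[key] = value


def cached_with(cache):
    """Decorator factory: the *same* cache object can back many coroutines."""

    def decorator(func):
        async def wrapper(*args):
            key = (func.__name__, args)
            hit = await cache.get(key)
            if hit is not None:
                return hit
            result = await func(*args)
            await cache.set(key, result)
            return result

        wrapper.cache = cache  # the cache stays reachable from outside
        return wrapper

    return decorator


shared_cache = DictCache()


@cached_with(shared_cache)
async def double(x):
    return 2 * x


@cached_with(shared_cache)
async def triple(x):
    return 3 * x


async def main():
    await double(1)
    await double(1)  # served from the shared cache
    await triple(1)
    return shared_cache.hits, shared_cache.misses


hits, misses = asyncio.run(main())
print(hits, misses)  # both coroutines report into one cache object
```

Passing a class instead, each decorated function would construct its own cache, and the stats object would be buried inside the wrapper.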

Thanks!

padraic-shafer (Contributor, Author) commented:

Update: The issue I was experiencing with the plugins arose because I was trying to access the hit_miss_ratio member before cache.get() had been called, so the member hit_miss_ratio did not yet exist. I will update the post above to reflect this.

My primary question still remains: Why does the @cached decorator accept a cache class rather than a cache object?

Accepting an object would:

  1. Allow finer control over whether caches are shared between decorated coroutines;
  2. Enable simple access to the cache -- for example, to read plugin stats. Currently I resort to awkward workarounds (see below), which get quite intrusive for nested coroutines;
  3. Make the aiocache library self-consistent -- the same mechanism would serve both the Cache() constructor paradigm and the @cached paradigm.

How to access plugin stats for a decorated coroutine (and a nested coroutine):

```python
import asyncio

from aiocache import Cache, cached
from aiocache.plugins import HitMissRatioPlugin, TimingPlugin


def inner_key_builder(func, *args, **kwargs):
    # Placeholder key builder (details not shown in the original post)
    return f"inner:{func.__name__}:{sorted(kwargs.items())}"


# Configure the caches
DEFAULT_CACHE_TTL = 300

default_cached_config = {
    "cache": Cache.MEMORY,
    "ttl": DEFAULT_CACHE_TTL,
    "noself": True,  # True for @classmethod
    "plugins": [HitMissRatioPlugin(), TimingPlugin()],
}
inner_cached_config = {
    **default_cached_config,
    "key_builder": inner_key_builder,
}
outer_cached_config = {
    **default_cached_config,
}


# Cache the calls on the members of this class
class MyClass(object):

    @classmethod
    @cached(**outer_cached_config)
    async def outer_coroutine(cls, **kwargs):

        @cached(**inner_cached_config)
        async def inner_coroutine(**kwargs):
            inner_result = ...  # Do some work here
            return inner_result

        # Leak details of implementation here, to get access to inner cache
        cls._inner_cache = inner_coroutine.cache

        result = await inner_coroutine(**kwargs)
        outer_result = result  # Do some more work here
        return outer_result


# Make some calls that set/get the cached values
async def main():
    await MyClass.outer_coroutine(x=1)
    await MyClass.outer_coroutine(x=2)
    await MyClass.outer_coroutine(x=1)
    await MyClass.outer_coroutine(x=1)

asyncio.run(main())

# Get cache stats
try:
    cache = MyClass._inner_cache
    print("MyClass.inner_coroutine")
    print(f'{cache.hit_miss_ratio["hit_ratio"]=}')
    print(f'{cache.hit_miss_ratio["total"]=}')
    print(f'{cache.hit_miss_ratio["hits"]=}')
except AttributeError as e:
    print(e)

try:
    cache = MyClass.outer_coroutine.cache
    print("MyClass.outer_coroutine")
    print(f'{cache.hit_miss_ratio["hit_ratio"]=}')
    print(f'{cache.hit_miss_ratio["total"]=}')
    print(f'{cache.hit_miss_ratio["hits"]=}')
except AttributeError as e:
    print(e)
```

It would be much simpler and cleaner to do this instead:

```python
# Hypothetical API; assumes the same imports and inner_key_builder as above.

# Configure the caches
DEFAULT_CACHE_TTL = 300

default_cached_config = {
    "cache": Cache.MEMORY,
    "ttl": DEFAULT_CACHE_TTL,
    "noself": True,  # True for @classmethod
    "plugins": [HitMissRatioPlugin(), TimingPlugin()],
}
inner_cached_config = {
    **default_cached_config,
    "key_builder": inner_key_builder,
}
outer_cached_config = {
    **default_cached_config,
}

inner_cache = Cache(**inner_cached_config)  # This could be reused for multiple classes
outer_cache = Cache(**outer_cached_config)  # So could this


# Cache the calls on the members of this class
class MyClass(object):

    @classmethod
    @cached(cache=outer_cache)  # Pass the cache *object*, not a class
    async def outer_coroutine(cls, **kwargs):

        @cached(cache=inner_cache)
        async def inner_coroutine(**kwargs):
            inner_result = ...  # Do some work here
            return inner_result

        # NOT NEEDED -- no implementation details leaked:
        # cls._inner_cache = inner_coroutine.cache

        result = await inner_coroutine(**kwargs)
        outer_result = result  # Do some more work here
        return outer_result


# Make some calls that set/get the cached values
async def main():
    await MyClass.outer_coroutine(x=1)
    await MyClass.outer_coroutine(x=2)
    await MyClass.outer_coroutine(x=1)
    await MyClass.outer_coroutine(x=1)

asyncio.run(main())

# Get cache stats; inner_cache and outer_cache are already in an accessible scope
try:
    print("MyClass.inner_coroutine")
    print(f'{inner_cache.hit_miss_ratio["hit_ratio"]=}')
    print(f'{inner_cache.hit_miss_ratio["total"]=}')
    print(f'{inner_cache.hit_miss_ratio["hits"]=}')
except AttributeError as e:
    print(e)

try:
    print("MyClass.outer_coroutine")
    print(f'{outer_cache.hit_miss_ratio["hit_ratio"]=}')
    print(f'{outer_cache.hit_miss_ratio["total"]=}')
    print(f'{outer_cache.hit_miss_ratio["hits"]=}')
except AttributeError as e:
    print(e)
```

padraic-shafer (Contributor, Author) commented:

After some poking around, I figured out that the intended way of re-using cache objects is to store a configuration with caches.add() or caches.set_config(), and then supply an alias name to @cached. This was not intuitive for me, but I suppose it works...at least for SimpleMemoryCache.
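For anyone landing here, a sketch of that alias mechanism as I understand it from the aiocache docs (the alias name "default" and its settings are my own example, not from the original post):

```python
from aiocache import caches, cached

# Register a named cache configuration once, at startup.
caches.set_config({
    "default": {
        "cache": "aiocache.SimpleMemoryCache",
        "ttl": 300,
    },
})


# Every decorator that names the same alias shares the configured cache.
@cached(alias="default")
async def get_answer():
    return 42


# The configured cache object is also reachable directly for inspection:
# cache = caches.get("default")
```

So sharing is expressed indirectly through the registry rather than by handing the decorator a cache object.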

I noticed that this approach works only as long as namespace is None. There are some parts of aiocache that do not honor namespaces; that is, BaseCache.build_key(key, namespace) is never called to join the namespace with the key. I plan to submit a PR to resolve this inconsistency. I've worked around it for now by deriving custom classes from several of the aiocache classes.
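For reference, the expected key-joining behavior looks like this (an illustrative sketch of what build_key should do, not aiocache's exact implementation):

```python
def build_key(key, namespace=None):
    """Join a configured namespace onto a cache key.

    When a namespace is set, it should be prefixed onto every key so
    that entries from different namespaces cannot collide; with no
    namespace, the key passes through unchanged.
    (Illustrative only -- not aiocache's exact implementation.)
    """
    if namespace is not None:
        return f"{namespace}{key}"
    return key


print(build_key("user:42"))                     # -> user:42
print(build_key("user:42", namespace="prod-"))  # -> prod-user:42
```

The inconsistency described above amounts to some code paths writing or reading the bare key even when a namespace is configured.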

padraic-shafer (Contributor, Author) commented:

Submitted PR #570: Use build_key(key, namespace) consistently across modules and classes

Dreamsorcerer (Member) commented:

Tracking this in #609.
