
Enhanced cache access API for functools.lru_cache #54795

ncoghlan opened this issue Nov 30, 2010 · 5 comments

type-feature A feature request or enhancement


BPO 10586
Nosy @birkenfeld, @rhettinger, @ncoghlan
  • functools_lru_cache_introspection.diff: Code and test changes for f.cache.* introspection API
  • functools_lru_cache_info_method.diff: Simpler alternative using a cache_info() method
  • Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.

    GitHub fields:

    assignee = ''
    closed_at = <Date 2010-11-30.06:21:52.931>
    created_at = <Date 2010-11-30.04:12:04.301>
    labels = ['type-feature']
    title = 'Enhanced cache access API for functools.lru_cache'
    updated_at = <Date 2010-11-30.06:35:27.648>
    user = ''

    bugs.python.org fields:

    activity = <Date 2010-11-30.06:35:27.648>
    actor = 'rhettinger'
    assignee = 'ncoghlan'
    closed = True
    closed_date = <Date 2010-11-30.06:21:52.931>
    closer = 'ncoghlan'
    components = []
    creation = <Date 2010-11-30.04:12:04.301>
    creator = 'ncoghlan'
    dependencies = []
    files = ['19880', '19882']
    hgrepos = []
    issue_num = 10586
    keywords = ['patch']
    message_count = 5.0
    messages = ['122879', '122881', '122882', '122886', '122888']
    nosy_count = 3.0
    nosy_names = ['georg.brandl', 'rhettinger', 'ncoghlan']
    pr_nums = []
    priority = 'high'
    resolution = 'accepted'
    stage = 'resolved'
    status = 'closed'
    superseder = None
    type = 'enhancement'
    url = ''
    versions = ['Python 3.2']

    ncoghlan (issue author) commented:

    As per private email to Raymond, I would like to move the lru_cache access attributes and methods into a CacheInfo class, exposed as a "cache" attribute with several methods and read-only properties.

    Read-only properties: hits, misses, maxsize
    Methods: clear(), __len__()

    As an implementation detail, cache_misses and cache_hits become nonlocal variables rather than attributes of the wrapper function.

    Priority set to high, since the current API will be locked in as soon as the first beta goes out.
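A minimal sketch of what this first proposal describes — an LRU-cached function exposing a "cache" attribute with hits/misses/maxsize and clear()/__len__(). This is not the patch attached to the issue; the decorator name, the eviction-free dict store, and the use of plain attributes instead of read-only properties are simplifications for illustration:

```python
import functools
from threading import RLock

def lru_cache_with_cache_attr(maxsize=100):
    """Hypothetical decorator sketch exposing a CacheInfo-style 'cache' attribute."""
    class _CacheView:
        # Plain attributes stand in for the read-only properties in the proposal.
        def __init__(self, store, lock):
            self._store = store
            self._lock = lock
            self.hits = 0
            self.misses = 0
            self.maxsize = maxsize

        def clear(self):
            with self._lock:
                self._store.clear()
                self.hits = self.misses = 0

        def __len__(self):
            return len(self._store)

    def decorator(func):
        store = {}
        lock = RLock()
        view = _CacheView(store, lock)

        @functools.wraps(func)
        def wrapper(*args):
            with lock:
                if args in store:
                    view.hits += 1
                    return store[args]
                view.misses += 1
            result = func(*args)
            with lock:
                store[args] = result  # LRU eviction elided for brevity
            return result

        wrapper.cache = view
        return wrapper
    return decorator
```

With this shape, `f.cache.hits`, `len(f.cache)`, and `f.cache.clear()` work as the comment describes.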

    ncoghlan (issue author) commented:

    Raymond suggested a simple cache_info() method that returns a named tuple as a possible alternative.

    I like that idea, as it makes displaying debugging information (the intended use case for these attributes) absolutely trivial:

    >>> import functools
    >>> @functools.lru_cache()
    ... def f(x):
    ...   return x
    >>> f.cache_info()
    lru_cache_info(maxsize=100, currsize=0, hits=0, misses=0)

    (Alternative patch attached)
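For comparison, here is how the accepted `cache_info()` API behaves in the `functools.lru_cache` that ultimately shipped — note the field order (hits, misses, maxsize, currsize) and the tuple name differ slightly from the patch-era transcript above:

```python
import functools

@functools.lru_cache(maxsize=100)
def square(x):
    return x * x

square(2)  # first call: a miss, result computed and cached
square(2)  # second call: a hit, result served from the cache
info = square.cache_info()
# info is a named tuple: CacheInfo(hits=1, misses=1, maxsize=100, currsize=1)
```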

    rhettinger commented:
    Okay, go ahead with the second patch.
    With the following changes:

    _CacheInfo = namedtuple("CacheInfo", "maxsize size hits misses")

    Change the variable names:
    cache_hits --> hits
    cache_misses --> misses

    Add a "with lock:" to the cache_info() function.

    Update the docs.
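The requested changes can be sketched as follows. This is a simplified stand-in for the real patch to functools.py (eviction and keyword-argument handling are elided), but it shows the `_CacheInfo` named tuple, the `hits`/`misses` renames as nonlocal variables, and the `with lock:` guard in `cache_info()`:

```python
import functools
from collections import namedtuple
from threading import RLock

# The named tuple rhettinger asked for: maxsize, size, hits, misses.
_CacheInfo = namedtuple("CacheInfo", "maxsize size hits misses")

def lru_cache(maxsize=100):
    def decorator(func):
        cache = {}
        hits = misses = 0  # renamed from cache_hits / cache_misses
        lock = RLock()

        @functools.wraps(func)
        def wrapper(*args):
            nonlocal hits, misses
            with lock:
                if args in cache:
                    hits += 1
                    return cache[args]
                misses += 1
            result = func(*args)
            with lock:
                cache[args] = result  # LRU eviction elided for brevity
            return result

        def cache_info():
            with lock:  # the requested "with lock:" guard
                return _CacheInfo(maxsize, len(cache), hits, misses)

        wrapper.cache_info = cache_info
        return wrapper
    return decorator
```

Holding the lock in `cache_info()` ensures the returned snapshot of counters and cache size is internally consistent even while other threads are calling the wrapped function.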


    @rhettinger rhettinger assigned ncoghlan and unassigned rhettinger Nov 30, 2010
    ncoghlan (issue author) commented:

    Committed in r86878.

    I put an XXX note in the relevant part of the 3.2 What's New rather than updating it directly.

    @ncoghlan ncoghlan added the type-feature A feature request or enhancement label Nov 30, 2010
    @ezio-melotti ezio-melotti transferred this issue from another repository Apr 10, 2022