
[functools] provide an async-compatible version of functools.lru_cache #79221

Closed
LiraNuna mannequin opened this issue Oct 22, 2018 · 7 comments
Labels
3.8 only security fixes topic-asyncio type-feature A feature request or enhancement

Comments


LiraNuna mannequin commented Oct 22, 2018

BPO 35040
Nosy @brettcannon, @asvetlov, @1st1, @LiraNuna, @tirkarthi
Files
  • test-case.py
  • Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.


    GitHub fields:

    assignee = None
    closed_at = None
    created_at = <Date 2018-10-22.01:39:51.011>
    labels = ['type-feature', '3.8', 'expert-asyncio']
    title = '[functools] provide an async-compatible version of functools.lru_cache'
    updated_at = <Date 2019-05-28.20:44:49.370>
    user = 'https://github.com/LiraNuna'

    bugs.python.org fields:

    activity = <Date 2019-05-28.20:44:49.370>
    actor = 'Liran Nuna'
    assignee = 'none'
    closed = False
    closed_date = None
    closer = None
    components = ['asyncio']
    creation = <Date 2018-10-22.01:39:51.011>
    creator = 'Liran Nuna'
    dependencies = []
    files = ['47887']
    hgrepos = []
    issue_num = 35040
    keywords = []
    message_count = 6.0
    messages = ['328228', '328237', '328274', '343687', '343816', '343818']
    nosy_count = 5.0
    nosy_names = ['brett.cannon', 'asvetlov', 'yselivanov', 'Liran Nuna', 'xtreak']
    pr_nums = []
    priority = 'normal'
    resolution = None
    stage = None
    status = 'open'
    superseder = None
    type = 'enhancement'
    url = 'https://bugs.python.org/issue35040'
    versions = ['Python 3.8']


    LiraNuna mannequin commented Oct 22, 2018

lru_cache is a very useful decorator, but it does not work well with coroutines, since a coroutine object can only be awaited once.

Take, for example, the attached code (test-case.py): it throws a RuntimeError because you cannot reuse an already awaited coroutine.

    A solution would be to call asyncio.ensure_future on the result of the coroutine if detected.
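The attached test-case.py is not reproduced in the migrated issue, but a minimal sketch of the failure might look like the following (the `double` coroutine is an illustrative stand-in, not the original attachment):

```python
import asyncio
import functools

# Applying the plain lru_cache to a coroutine function caches the
# coroutine *object*, not its result.
@functools.lru_cache(maxsize=None)
async def double(x):
    await asyncio.sleep(0)
    return x * 2

async def main():
    first = await double(21)       # first call works: a fresh coroutine is awaited
    try:
        second = await double(21)  # lru_cache hands back the *same* coroutine object
    except RuntimeError as exc:    # awaiting it again raises RuntimeError
        second = exc
    return first, second

first, second = asyncio.run(main())
```

The second call hits the cache, receives the already-awaited coroutine, and fails with "cannot reuse already awaited coroutine".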

    @LiraNuna LiraNuna mannequin added 3.7 (EOL) end of life 3.8 only security fixes topic-asyncio labels Oct 22, 2018
    @asvetlov
    Contributor

Coroutine detection is a relatively slow check.
    I don't think we need to do it in functools.lru_cache.

There is a specialized asyncio-compatible version: https://github.com/aio-libs/async_lru
    Please use it.

    @brettcannon
    Member

    Making this a feature request.

    @brettcannon brettcannon removed the 3.7 (EOL) end of life label Oct 22, 2018
    @brettcannon brettcannon changed the title functools.lru_cache does not work with coroutines [functools] provide an async-compatible version of functools.lru_cache Oct 22, 2018
    @brettcannon brettcannon added the type-feature A feature request or enhancement label Oct 22, 2018
    @asvetlov
    Contributor

Brett, please elaborate.
    Do you want to incorporate async_lru library into CPython Core?

    @brettcannon
    Member

    I was just saying that this is an enhancement request, no judgment about whether we want to solve the enhancement request.


    LiraNuna mannequin commented May 28, 2019

Coroutine detection is a relatively slow check.
    I don't think we need to do it in functools.lru_cache.

Wouldn't the coroutine check only happen at decoration time? To solve this easily and efficiently, we only really need to wrap the result in asyncio.ensure_future if the decorated function is a coroutine, and that wrapping only happens when a result comes back from the decorated function, which would have minimal impact.

Of course, I don't know much about the internals of lru_cache, so my assumptions could be wrong. I should familiarize myself with the implementation and figure out how doable it would be.
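The wrapping idea suggested in this comment can be sketched as a hypothetical decorator (`async_lru_cache` is an illustrative name, not a stdlib or proposed API): instead of caching the coroutine itself, cache the Task returned by asyncio.ensure_future, since a finished Task can be awaited any number of times.

```python
import asyncio
import functools

def async_lru_cache(maxsize=128):
    """Hypothetical sketch: cache the Task, not the raw coroutine."""
    def decorator(coro_func):
        @functools.lru_cache(maxsize=maxsize)
        def wrapper(*args, **kwargs):
            # ensure_future schedules the coroutine and returns a Task;
            # awaiting a finished Task just returns its stored result.
            return asyncio.ensure_future(coro_func(*args, **kwargs))
        return wrapper
    return decorator

@async_lru_cache()
async def double(x):
    await asyncio.sleep(0)
    return x * 2

async def main():
    a = await double(21)
    b = await double(21)  # cache hit: awaiting the same Task again is fine
    return a, b

a, b = asyncio.run(main())
```

One caveat of this sketch: the cached Tasks are bound to the event loop that created them, so reusing the cache across different loops would fail.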

    @ezio-melotti ezio-melotti transferred this issue from another repository Apr 10, 2022
    @kumaraditya303
    Contributor

    Duplicate of #90780

    @kumaraditya303 kumaraditya303 marked this as a duplicate of #90780 Sep 29, 2022
@kumaraditya303 kumaraditya303 closed this as not planned Sep 29, 2022