The recent discussions on python-ideas showed that people have a hard time finding the infinity-cache option for lru_cache(). Also, in the context of straight caching without limits, the name *lru_cache()* makes the tool seem complex and heavy when in fact it is simple, lightweight, and fast (doing no more than a simple dictionary lookup).
We could easily solve both problems with a helper function:
```python
def cache(user_function):
    'Simple unbounded cache. Sometimes called "memoize".'
    return lru_cache(maxsize=None, typed=False)(user_function)
```
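As a usage sketch (factorial is chosen here purely as an illustrative example, not part of the proposal), the helper behaves exactly like today's lru_cache(maxsize=None), dictionary lookups and all:

```python
from functools import lru_cache

def cache(user_function):
    'Simple unbounded cache. Sometimes called "memoize".'
    return lru_cache(maxsize=None, typed=False)(user_function)

@cache
def factorial(n):
    # Each distinct n is computed once; repeat calls are plain dict lookups.
    return n * factorial(n - 1) if n else 1

print(factorial(10))            # 3628800
print(factorial.cache_info())   # maxsize=None, so the cache never evicts
```

Since this is just lru_cache() underneath, the wrapped function keeps cache_info() and cache_clear() for free.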
There was some discussion about a completely new decorator with different semantics (holding a lock across a call to an arbitrary user function and being limited to zero-argument functions). In all the examples that were presented, this @cache decorator would suffice. None of the examples presented actually needed locking behavior.
FWIW, this doesn't preclude the other proposal if it ever gains traction and moves forward. This just takes existing functionality and improves clarity and discoverability.
The core issue is that if you only want a simple unbounded cache, it isn't obvious that that behavior is buried in the lru_cache() API. In hindsight, this always should have been separate functionality. And some day we may deprecate the maxsize=None option, which is somewhat opaque.