If you're going to be doing a large number of `table.lookup(...)` calls, and you know that no other script will be modifying the database at the same time, you can presumably get a big speedup using a Python in-memory cache - maybe even an LRU one to avoid memory bloat.
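A minimal sketch of the pattern, using the standard library's `functools.lru_cache` around a plain `sqlite3` get-or-create function; the `tags` table and `lookup()` helper here are illustrative stand-ins for sqlite-utils' `table.lookup()`, not the real API:

```python
import sqlite3
from functools import lru_cache

# Toy stand-in for a sqlite-utils lookup table: an indexed SQLite
# lookup that returns the id for a value, inserting it if missing.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tags (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")

def lookup(name):
    row = conn.execute("SELECT id FROM tags WHERE name = ?", (name,)).fetchone()
    if row:
        return row[0]
    cursor = conn.execute("INSERT INTO tags (name) VALUES (?)", (name,))
    return cursor.lastrowid

# The caching pattern: wrap the lookup in an in-memory LRU cache.
# Only safe if no other script modifies the database concurrently.
cached_lookup = lru_cache(maxsize=100)(lookup)

first = cached_lookup("python")   # hits SQLite, inserts the row
second = cached_lookup("python")  # served from the in-memory cache
assert first == second
```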
This could default to a 100-item LRU cache. You could perhaps modify that with `cache_size=500`, or disable the size limit entirely with `cache_size=None`.
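One way such a `cache_size=` option could behave is to map directly onto `functools.lru_cache`'s `maxsize` semantics, where `None` means unbounded. A sketch (the `make_cached_lookup` helper and `fake_lookup` function are hypothetical, not part of sqlite-utils):

```python
from functools import lru_cache

def make_cached_lookup(lookup, cache_size=100):
    # cache_size=None maps to lru_cache(maxsize=None): no size limit
    return lru_cache(maxsize=cache_size)(lookup)

calls = []
def fake_lookup(value):
    # Stand-in for the real database lookup; records each cache miss
    calls.append(value)
    return len(calls)

cached = make_cached_lookup(fake_lookup, cache_size=2)
cached("a"); cached("b"); cached("a")  # "a" served from cache: two real lookups
cached("c")                            # evicts "b", the least recently used entry
cached("b")                            # miss again, hits the underlying lookup
assert calls == ["a", "b", "c", "b"]
```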
Would be interesting to micro-benchmark this to get an idea of how much of a performance boost it is, since the indexed SQLite lookups used by `table.lookup()` should be really fast already.
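One way to run that micro-benchmark, using `timeit` against a toy in-memory table; the schema and `lookup()` function are illustrative stand-ins, not sqlite-utils code:

```python
import sqlite3
import timeit
from functools import lru_cache

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tags (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
conn.execute("INSERT INTO tags (name) VALUES ('python')")

def lookup(name):
    # Indexed lookup: the UNIQUE constraint gives name an index,
    # so this query is already fast without any caching
    return conn.execute("SELECT id FROM tags WHERE name = ?", (name,)).fetchone()[0]

cached_lookup = lru_cache(maxsize=100)(lookup)

uncached_time = timeit.timeit(lambda: lookup("python"), number=10_000)
cached_time = timeit.timeit(lambda: cached_lookup("python"), number=10_000)
print(f"uncached: {uncached_time:.4f}s  cached: {cached_time:.4f}s")
```

On a hot cache the LRU path avoids SQLite entirely, so the cached timing should come out well ahead even though the indexed query is cheap.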
Inspired by work on git-history, where I used this pattern.