
keep loaded models in a shared LRU cache #124

Closed
ssube opened this issue Feb 11, 2023 · 0 comments · Fixed by #136
Labels
status/fixed (issues that have been fixed and released), type/feature (new features)
ssube commented Feb 11, 2023

Each model-loading module currently has its own internal cache, with no way to clear them all together.

Add an LRU cache for models to some shared object, probably the ServerContext, and update each loader to use that instead of the module globals.

There should be a way to empty the cache and run garbage collection via an admin endpoint.
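A shared cache along these lines could look like the following minimal sketch. The names (`ModelCache`, the `limit` parameter, the `drop` method) are illustrative assumptions, not the project's actual API:

```python
from collections import OrderedDict
import gc


class ModelCache:
    """A least-recently-used cache for loaded models.

    Hypothetical sketch: a single instance would hang off the shared
    server context, replacing the per-module global caches.
    """

    def __init__(self, limit: int = 4):
        self.limit = limit
        self.cache = OrderedDict()

    def get(self, key):
        if key in self.cache:
            # move to the end so this entry is evicted last
            self.cache.move_to_end(key)
            return self.cache[key]
        return None

    def set(self, key, value):
        self.cache[key] = value
        self.cache.move_to_end(key)
        while len(self.cache) > self.limit:
            # evict the least-recently-used entry (front of the dict)
            self.cache.popitem(last=False)

    def drop(self):
        """Empty the cache and run garbage collection,
        as an admin endpoint might."""
        self.cache.clear()
        gc.collect()
```

Each loader would then call `get` before loading a model and `set` after, and the admin endpoint would call `drop` to release all cached models at once.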

ssube added the status/new (issues that have not been confirmed yet) and type/feature (new features) labels on Feb 11, 2023
ssube added this to the v0.7 milestone on Feb 11, 2023
ssube added the status/progress (issues that are in progress and have a branch) label and removed the status/new label on Feb 14, 2023
ssube mentioned this issue on Feb 14, 2023
ssube added the status/fixed (issues that have been fixed and released) label and removed the status/progress label on Feb 14, 2023
ssube closed this as completed on Feb 14, 2023