Reduce high memory usage #114
I think this is caused by HibiAPI's caching feature. Caching is enabled by default, but with a poor cache hit rate it can waste a lot of memory. This is especially true with multiple workers and no Redis, because the default in-memory cache is duplicated in every worker process. So for your current problem, we have the following solutions:
Due to my limited algorithmic knowledge, I haven't found a better asynchronous caching method that is both easy to implement and scalable. Perhaps you can help us improve the current caching algorithm (e.g. by using LRU or a better algorithm)?
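Since the comment above asks for an LRU-based approach, here is a minimal sketch of what an async-safe LRU cache with a hard entry limit could look like. This is a hypothetical illustration, not HibiAPI's actual cache class; the class name, lock strategy, and size limit are all assumptions:

```python
import asyncio
from collections import OrderedDict


class AsyncLRUCache:
    """Hypothetical sketch of an LRU cache with a hard entry limit.

    Bounding the number of entries caps memory use even when the hit
    rate is poor, because the least recently used entries are evicted.
    """

    def __init__(self, maxsize: int = 1024):
        self.maxsize = maxsize
        self._data: OrderedDict[str, object] = OrderedDict()
        self._lock = asyncio.Lock()

    async def get(self, key: str):
        async with self._lock:
            if key not in self._data:
                return None
            self._data.move_to_end(key)  # mark as most recently used
            return self._data[key]

    async def set(self, key: str, value: object) -> None:
        async with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self.maxsize:
                self._data.popitem(last=False)  # evict least recently used


async def demo():
    cache = AsyncLRUCache(maxsize=2)
    await cache.set("a", 1)
    await cache.set("b", 2)
    await cache.get("a")     # "a" becomes most recently used
    await cache.set("c", 3)  # evicts "b", the least recently used
    return await cache.get("a"), await cache.get("b"), await cache.get("c")


print(asyncio.run(demo()))  # (1, None, 3)
```

A real implementation would also need per-entry expiry; `functools.lru_cache` offers the same eviction policy for synchronous functions out of the box.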
How do I set it up to use Redis as the external cache?
Install
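The answer above appears truncated, so HibiAPI's actual Redis configuration keys are not shown here. As a generic illustration only (the URL, key name, and helper function below are assumptions, not HibiAPI's API), the usual get-or-compute pattern with the redis-py client looks roughly like this; it falls back to a process-local dict so the sketch runs even without a Redis server:

```python
import json

try:
    import redis  # third-party: pip install redis

    _client = redis.Redis.from_url("redis://localhost:6379/0")
    _client.ping()  # verify a server is actually reachable
except Exception:
    _client = None  # no Redis available; use a local dict for this demo

_fallback: dict[str, str] = {}


def cached_fetch(key: str, compute, ttl: int = 300):
    """Return the cached value for `key`, computing and storing it on a miss.

    With Redis, entries expire server-side after `ttl` seconds, so memory
    is bounded by the Redis server instead of each worker process.
    """
    if _client is not None:
        raw = _client.get(key)
        if raw is not None:
            return json.loads(raw)
        value = compute()
        _client.setex(key, ttl, json.dumps(value))
        return value
    # demo-only fallback path (no expiry handling)
    if key in _fallback:
        return json.loads(_fallback[key])
    value = compute()
    _fallback[key] = json.dumps(value)
    return value


print(cached_fetch("demo:hypothetical:key", lambda: {"items": [1, 2, 3]}))
```

Because Redis is a single shared process, multiple workers stop duplicating the cache in their own memory.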
Now HibiAPI supports the diskcache backend, which means the cache can live on disk instead of consuming a large amount of memory. You can refer to
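To show why a disk-backed cache keeps resident memory flat, here is a tiny stdlib sketch of the idea behind diskcache: entries live in an SQLite file rather than in RAM. The class name and schema are illustrative assumptions, not diskcache's internals; the real diskcache package provides the same get/set-with-expiry API with proper eviction and much better performance:

```python
import json
import os
import sqlite3
import tempfile
import time


class DiskCache:
    """Minimal disk-backed cache: values are stored in an SQLite file,
    so process memory stays small regardless of how many entries exist."""

    def __init__(self, path: str):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS cache (k TEXT PRIMARY KEY, v TEXT, exp REAL)"
        )

    def set(self, key, value, expire: float = 300.0):
        # REPLACE keeps the key unique; exp is an absolute expiry timestamp
        self.conn.execute(
            "REPLACE INTO cache VALUES (?, ?, ?)",
            (key, json.dumps(value), time.time() + expire),
        )
        self.conn.commit()

    def get(self, key):
        row = self.conn.execute(
            "SELECT v, exp FROM cache WHERE k = ?", (key,)
        ).fetchone()
        if row is None or row[1] < time.time():
            return None  # missing or expired
        return json.loads(row[0])


path = os.path.join(tempfile.mkdtemp(), "cache.db")
c = DiskCache(path)
c.set("rank", {"page": 1})
print(c.get("rank"))  # {'page': 1}
```

The trade-off is higher read latency than an in-memory cache, which is usually acceptable for an API proxy whose responses are already network-bound.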
Is it normal for the Python script to use a lot of memory? It currently uses about 1 GB of RAM. Is there any way to reduce it?