Huge memory usage after some time #1226
Comments
Hey @shyim, we are aware that MeiliSearch currently uses a lot of memory; we are working on a new indexing system that is much more efficient. Can you provide us with the MeiliSearch logs? I would also like to know the size the 6898 entries take on disk, uncompressed. As a small piece of advice, I highly recommend indexing all your documents in one batch. Thank you for helping us debug.
Log: https://drive.google.com/uc?id=1XXFA3YbBrtwMrhvsp9gRUqQ8uTO-pgs6 The data.ms folder is 20.9 GB 😱. I guess deleting the index and recreating it every hour does not clean anything up 😅
Hello @shyim and everyone following this issue! The first RC of MeiliSearch v0.21.0 is out. You can test this new release by downloading the binaries available in this release, or with Docker:

docker run -p 7700:7700 getmeili/meilisearch:v0.21.0rc1 ./meilisearch

We will keep improving this after the release of v0.21.0. We would rather ship a not-completely-optimized version than delay it and, at the same time, delay the release of new features. Be sure we are doing our best to keep improving these indexation issues. As a reminder:
Thanks for your patience and your help with this! ❤️
Looks better. Thanks! |
Describe the bug
After some time it uses a huge amount of RAM (58 GB).
To Reproduce
I don't know what exactly triggers it.
I have a job that deletes the index every hour and then indexes all entries again.
Script: https://github.com/FriendsOfShopware/packages/blob/live/src/Command/PackageIndexerCommand.php#L31-L70
We are talking about only one index with 6898 entries, each with few fields.
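The hourly job above deletes and recreates the index, then re-adds every entry; the maintainers' advice is to index all documents in one batch instead. A minimal sketch of that batching pattern, in Python for illustration (the linked script is PHP): `send_batch` is a hypothetical stand-in for a real client call such as an add-documents request, and the batch size is an assumption, not a MeiliSearch requirement.

```python
def chunk(documents, batch_size=1000):
    """Split a list of documents into batches of at most batch_size."""
    for i in range(0, len(documents), batch_size):
        yield documents[i:i + batch_size]


def index_all(documents, send_batch, batch_size=1000):
    """Send all documents in a few large batches instead of one by one.

    `send_batch` is a placeholder for whatever client call actually
    pushes a batch of documents to the search engine.
    Returns the total number of documents sent.
    """
    sent = 0
    for batch in chunk(documents, batch_size):
        send_batch(batch)
        sent += len(batch)
    return sent
```

With 6898 entries and a batch size of 1000, this results in 7 requests rather than thousands of single-document additions.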
Expected behavior
It should not use this much memory.
Screenshots
Server (please complete the following information):
Additional context
I am running it in Docker