Keep memory pool of scrape caches per target set #3048
fabxc added the component/scraping, dev-2.0, kind/enhancement, priority/P3 labels on Aug 10, 2017
fabxc closed this on Sep 14, 2017
lock bot commented Mar 23, 2019
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
lock bot locked and limited conversation to collaborators on Mar 23, 2019
fabxc commented Aug 10, 2017
On SD updates we abandon all disappeared scrape loops. On reload we abandon all scrape loops.
This causes scrape caches to be fully rebuilt, which in turn causes moderate memory spikes.
We should be able to avoid this to some degree by keeping a memory pool of scrape caches that can then be reused. Doing it per target set sounds intuitively correct, as we are likely dealing with uniform sets of series there.