vmagent and scrape_timeout #357
Labels: bug

Comments
valyala
added a commit
that referenced
this issue
Mar 9, 2020
This should prevent the following unexpected side effects of retrying idempotent requests:
- an actual scrape timeout longer than the configured scrape_timeout
- increased load on the target

Updates #357
The issue should be fixed in commit 7c432da. It will be available in the next release. In the meantime it is possible to build
Thank you, everything works.
The bugfix is available starting from v1.34.3.
Describe the bug
The global `scrape_timeout: 30s` parameter is not ignored, but it is not actually applied.
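For context, a minimal scrape config of the shape involved here, with a globally set timeout — a sketch with hypothetical addresses and ports, not the reporter's actual scrape.yml:

```yaml
global:
  scrape_interval: 1m
  scrape_timeout: 30s      # each scrape is expected to finish within 30s

scrape_configs:
  - job_name: snmp
    static_configs:
      - targets:
          - 10.0.0.209:9116   # hypothetical snmp_exporter address
```

With such a config, a scrape of an unreachable host should fail after roughly 30s, which is the behavior the bug report contrasts against.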
To Reproduce
Run snmp_exporter and scrape unreachable hosts.
scrape.yml
vmagent logs
Note `scrape_duration=150.007s` alongside `with timeout 30s: timeout`, which matches the timeout set in the config (150s is exactly five back-to-back 30s attempts, consistent with the request being retried).
The number of targets differs for a while; this is visible in the logs in both cases.
That is, the total number of targets did not change: the address 10.0.0.211 was actually replaced with 10.0.0.209, but /targets shows both hosts for some time.
Expected behavior
The scrape likewise takes 4 minutes.
Data from Prometheus at http://10.0.1.20:9090/targets
Scrape Duration is 30s, caused by context deadline exceeded.
Version
Used command-line flags