Consider passing down scrape timeout as header #2508

brian-brazil commented Mar 17, 2017

I'd like to propose passing the scrape timeout down as a header when scraping. For exporters like snmp and blackbox it'd be good if the user didn't have to set timeouts in two places. More generally for expensive exporters, it'd be good to stop querying when the other side has given up. Whether this would ever be exposed to client libraries is a separate question.

This would be something like:
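A hedged sketch of such a header, using the name the follow-up PR #2565 eventually adopted (the value is illustrative):

```
X-Prometheus-Scrape-Timeout-Seconds: 10.000000
```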
Comments
Why a header?
brian-brazil commented Mar 17, 2017

Headers could be automatic, whereas URL parameters are currently explicitly configured by the user.
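For context, URL parameters today come from an explicit `params` block in the scrape config, so the user has to mirror the timeout by hand. A hedged sketch (the `timeout` param name is exporter-specific and purely illustrative):

```yaml
scrape_configs:
  - job_name: "blackbox"
    scrape_timeout: 10s
    params:
      timeout: ["9.5"]   # must be kept in sync with scrape_timeout manually
    static_configs:
      - targets: ["blackbox-exporter:9115"]
```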
Anything in the way of us adding default URL params?
brian-brazil commented Mar 17, 2017

It doesn't seem quite as clean to me. It'd be exposing a minor internal detail in places like the /status page.
Closing the connection has signaled such a timeout so far, and most services at SoundCloud react to that event by aborting the processing of the request. Though with the discussion around keep-alive, this sounds like a useful hint. Intuitively, a header also sounds like the better option to me for transporting such meta information.
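A minimal Go sketch of that abort-on-disconnect pattern, using the request context that is canceled when the scraper hangs up (handler and collection function names are hypothetical):

```go
package main

import (
	"context"
	"fmt"
	"net/http"
	"time"
)

// doExpensiveCollection stands in for a slow backend query; it gives
// up early if ctx is canceled.
func doExpensiveCollection(ctx context.Context) <-chan string {
	out := make(chan string, 1)
	go func() {
		select {
		case <-time.After(30 * time.Second): // pretend this is real work
			out <- "expensive_metric 1\n"
		case <-ctx.Done():
		}
	}()
	return out
}

func collectHandler(w http.ResponseWriter, r *http.Request) {
	ctx := r.Context() // canceled when the client closes the connection
	select {
	case result := <-doExpensiveCollection(ctx):
		fmt.Fprint(w, result)
	case <-ctx.Done():
		// The scraper gave up; stop processing instead of finishing the work.
	}
}

func main() {
	http.HandleFunc("/metrics", collectHandler)
	http.ListenAndServe(":9100", nil)
}
```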
Agreed on header in this case.
mdlayher referenced this issue on Apr 4, 2017: retrieval: add Scrape-Timeout-Seconds header to each scrape request #2565 (merged)
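With that header in place, an exporter can derive its own deadline from each scrape request. A hedged sketch (package name and fallback value are illustrative; the header carries the `X-Prometheus-` prefix Prometheus sends):

```go
// Package scrapeutil is a hypothetical helper package.
package scrapeutil

import (
	"net/http"
	"strconv"
	"time"
)

// TimeoutFromScrape reads the scrape timeout Prometheus passes down
// as a header and falls back to a default when it is absent or
// malformed. The 10s fallback is illustrative.
func TimeoutFromScrape(r *http.Request) time.Duration {
	if v := r.Header.Get("X-Prometheus-Scrape-Timeout-Seconds"); v != "" {
		if s, err := strconv.ParseFloat(v, 64); err == nil && s > 0 {
			return time.Duration(s * float64(time.Second))
		}
	}
	return 10 * time.Second
}
```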
brian-brazil closed this in #2565 on Apr 4, 2017
brian-brazil added a commit that referenced this issue on Apr 4, 2017
lock bot commented Mar 23, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
lock bot locked and limited conversation to collaborators on Mar 23, 2019