
WithRetry capability #73

Closed
teivah wants to merge 2 commits into gavv:master from teivah:withretry

Conversation

@teivah commented Apr 29, 2020

First of all, thanks for this cool library.

In my context, we would have been interested in a retry capability before asserting that a test has failed. Our test synchronizes an internal cache and then triggers an HTTP request. As we have no way to query the internal cache, we don't know when the HTTP request will start succeeding. Hence, we had to add an ugly time.Sleep() before calling the httpexpect library.

Instead, it would be much cleaner to be able to retry a failing query. That would be similar to what Gomega offers with the Eventually assertion, for example: https://onsi.github.io/gomega/#making-assertions

WithRetry would accept a max retry value and a retry delay.
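
To make the proposal concrete, a hypothetical usage could look like the sketch below (the endpoint, retry count, and delay are made-up values; the WithRetry signature is the one proposed in this PR):

// Hypothetical usage of the proposed WithRetry API; "/orders" and the values are placeholders.
e := httpexpect.New(t, "http://localhost:8080")
e.GET("/orders").
	WithRetry(10, 500*time.Millisecond). // up to 10 attempts, waiting 500ms between them
	Expect().
	Status(http.StatusOK)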

@coveralls (Collaborator) commented Apr 29, 2020

Coverage Status

Coverage increased (+0.04%) to 93.991% when pulling da0e32c on teivah:withretry into 4cfa969 on gavv:master.

@teivah (Author) commented May 18, 2020

@gavv Any problem with the approach?

@gavv (Owner) commented May 18, 2020

Hi, I'm a bit busy with other projects at the moment. I'll find time for httpexpect in a week or two. Sorry for the inconvenience, and thanks for the PR!

@teivah (Author) commented May 18, 2020

No worries, thank you :)

@gavv (Owner) left a comment

Thanks, here are some comments.

Comment on lines +1002 to +1003
for i := 0; i < r.maxRetry; i++ {
	resp, err = r.config.Client.Do(r.http)
@gavv (Owner):

The problem here is that, in general, an http.Request is not reusable after calling http.Client.Do, even if the call failed. In particular, http.Request.Body is an io.Reader, and http.Client may read the whole body, so a retry will see an EOF, as if the body were empty.

This problem doesn't appear for requests without a body. We should add a test with a POST request with a non-empty body that would catch this bug.

The typical way to solve this problem is:

  1. read the whole body into buffer
  2. create a new reader from the buffer
  3. call Client.Do
  4. on failure, go to step (2)

Some optimizations are possible, e.g. if the body reader is an io.ReadSeeker, we can just seek back to the beginning before each retry.

Quick googling showed that there is also a package that handles this issue: https://github.com/hashicorp/go-retryablehttp

Maybe we can use it here? Or at least reproduce part of its logic.
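
As a sketch of the buffering approach described in steps (1)-(4) above (illustrative only; doWithRetry is not part of httpexpect or of this PR):

import (
	"bytes"
	"io/ioutil"
	"net/http"
)

// doWithRetry buffers the request body once, then rebuilds the body reader
// before every attempt so retries don't see an EOF.
func doWithRetry(client *http.Client, req *http.Request, maxRetry int) (*http.Response, error) {
	// Step 1: read the whole body into a buffer.
	var body []byte
	if req.Body != nil {
		b, err := ioutil.ReadAll(req.Body)
		if err != nil {
			return nil, err
		}
		req.Body.Close()
		body = b
	}

	var resp *http.Response
	var err error
	for i := 0; i < maxRetry; i++ {
		// Step 2: create a fresh reader from the buffer before each attempt.
		if body != nil {
			req.Body = ioutil.NopCloser(bytes.NewReader(body))
		}
		// Step 3: perform the request.
		resp, err = client.Do(req)
		if err == nil {
			return resp, nil
		}
		// Step 4: on failure, loop and rebuild the body reader.
	}
	return resp, err
}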

@@ -851,6 +853,18 @@ func (r *Request) WithMultipart() *Request {
	return r
}

// WithRetry allows retrying an HTTP request up to a given maximum number of attempts.
// It also accepts a retry delay that is applied before each retry on failure.
func (r *Request) WithRetry(maxRetry int, retryDelay time.Duration) *Request {
@gavv (Owner):

Is the retryDelay parameter useful in practice (in your case, or in general)? Maybe we should instead remove it and use exponential back-off, to make the API a bit simpler?
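
For illustration, a back-off along those lines might look like the sketch below (the helper name and the base/cap values are made up, not part of this PR):

// Illustrative exponential back-off: base, 2*base, 4*base, ..., capped at max.
// Needs "time" from the standard library.
func backoffDelay(attempt int, base, max time.Duration) time.Duration {
	d := base << uint(attempt)
	if d <= 0 || d > max { // guard against overflow and cap the delay
		d = max
	}
	return d
}

// e.g. time.Sleep(backoffDelay(i, 100*time.Millisecond, 5*time.Second)) between attempts.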

Comment on lines +1004 to +1007
if err != nil {
	time.Sleep(r.retryDelay)
	continue
}
@gavv (Owner):

It usually makes sense to retry only temporary network errors, like:

if nerr, ok := err.(net.Error); ok && nerr.Temporary() {

Also, it makes sense to retry timeout errors (err.Timeout()); I don't remember whether a timeout error is considered temporary or not.

In addition, it usually makes sense to retry some HTTP status codes, likely all 5xx errors except 501. However, for httpexpect that doesn't always make sense, so we probably need a separate option for it.

I guess we need to allow the user to specify something like a "retry policy" (what to retry and how).
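
Putting those pieces together, such a check could look roughly like this (an illustrative sketch; shouldRetry is not an httpexpect function):

// Illustrative predicate: retry temporary or timeout network errors,
// and 5xx responses other than 501 Not Implemented.
// Needs "net" and "net/http" from the standard library.
func shouldRetry(resp *http.Response, err error) bool {
	if err != nil {
		nerr, ok := err.(net.Error)
		return ok && (nerr.Temporary() || nerr.Timeout())
	}
	return resp.StatusCode >= 500 && resp.StatusCode != http.StatusNotImplemented
}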

@teivah (Author):

Not necessarily. Please check my use case in the description. It's not simply a transient-error retry.

@gavv (Owner):

I see. But I guess this is not the most common use case, so maybe handle it as a special retry policy?

E.g. we could have the following policies:

  • retry temporary network errors
  • retry temporary network errors and temporary server errors (5xx) - likely should be default when retrying is enabled
  • retry any errors

What exact error are you retrying in your case?
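
For illustration, those policies could be modelled along these lines (hypothetical names, not library API):

// Hypothetical policy names mirroring the options listed above.
type RetryPolicy int

const (
	RetryTemporaryNetworkErrors          RetryPolicy = iota // transient network errors only
	RetryTemporaryNetworkAndServerErrors                     // plus temporary (5xx) server errors; suggested default
	RetryAllErrors                                           // any failure, e.g. the cache warm-up case above
)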

	resp http.Response
	err  error
}

func (c *mockClient) Do(req *http.Request) (*http.Response, error) {
	c.req = req
	if c.retries != nil {
		currentRetry := c.retries[c.retryIndex]
@gavv (Owner):

Let's add an explicit panic with a meaningful message when the index is out of bounds.
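
For example, a sketch of that bounds check inside mockClient.Do (field names are taken from the snippet above; "fmt" is assumed to be imported):

if c.retryIndex >= len(c.retries) {
	// Explicit panic with a meaningful message instead of an index-out-of-range runtime error.
	panic(fmt.Sprintf("mockClient: no scripted result for attempt %d (only %d configured)",
		c.retryIndex+1, len(c.retries)))
}
currentRetry := c.retries[c.retryIndex]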

@@ -165,6 +167,53 @@ func TestRequestClient(t *testing.T) {
	req3.chain.assertFailed(t)
}

func TestRequestWithRetrySuccess(t *testing.T) {
@gavv (Owner):

Can we also check the resulting retryIndex in these tests, to ensure that httpexpect made the desired number of retry attempts?
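
For example, each test could end with a check along these lines (hypothetical; it assumes the test's client variable is the mockClient and that retryIndex is incremented on every Do call):

// Hypothetical assertion that the expected number of attempts was made.
if got, want := client.retryIndex, len(client.retries); got != want {
	t.Errorf("unexpected number of attempts: got %d, want %d", got, want)
}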

@gavv added the "needs revision" label (Pull request should be revised by its author) on Jun 3, 2020
@gavv added this to the v2 milestone on Jun 3, 2020
@teivah closed this on Sep 29, 2020
@gavv mentioned this pull request on Dec 21, 2020
@gavv (Owner) commented Dec 21, 2020

I've added an issue: #85.
