
Adapter is eating MaxRetriesError's and throwing other errors #6301

Closed
robs-nice99 opened this issue Dec 6, 2022 · 5 comments

Comments

@robs-nice99

        # From requests/adapters.py (HTTPAdapter.send): urllib3's MaxRetryError
        # is caught here and re-raised as a requests exception.
        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)

            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)

            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)

            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
                raise SSLError(e, request=request)

            raise ConnectionError(e, request=request)

When using an HTTPAdapter with urllib3's Retry, I never catch the documented MaxRetryError exception. urllib3's documentation for Retry says:

Errors will be wrapped in MaxRetryError unless retries are disabled, in which case the causing exception will be raised.

This seems to have been discussed in #4389, #5639 and #1198, although not in a straightforward way.

The question is: why does Requests re-raise the enclosed exception, making it impossible to rely on the urllib3 Retry logic, where errors are retried up to a maximum count and only then is a MaxRetryError raised?

This is additionally confusing because this wrapping behavior is not documented.
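
To make this concrete, here is a minimal sketch of the setup I'm describing (the endpoint URL and retry values are just illustrative):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retries = Retry(total=5, backoff_factor=0.1,
                status_forcelist=[500, 502, 503, 504])
session.mount("http://", HTTPAdapter(max_retries=retries))
session.mount("https://", HTTPAdapter(max_retries=retries))

try:
    # Hypothetical endpoint that keeps failing while the service boots.
    session.get("http://localhost:8080/health")
except requests.exceptions.RetryError as err:
    # Raised when the retries are exhausted by "bad" status codes.
    print("RetryError:", err)
except requests.exceptions.ConnectionError as err:
    # Raised when the retries are exhausted by connection failures.
    print("ConnectionError:", err)
# urllib3's MaxRetryError never reaches this level.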

@sigmavirus24
Contributor

I think you have some fundamental misunderstanding of what's happening, but I can't tell exactly what you're asking.

In case I can guess what you're asking: urllib3 retries don't function based on Requests exceptions. If urllib3 raises a MaxRetryError, that means all of the retries you configured in urllib3 failed and you reached the maximum you specified. If we can, we try to provide a consistent exception API across both use of retries and not, and we include the original exception in the new one so users can unwrap things and find the origin.
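
For example, something along these lines should dig the original exception back out (a rough sketch with a hypothetical endpoint; per the adapter code quoted above, the wrapped urllib3 exception is the first positional argument of the requests exception):

import requests
from requests.adapters import HTTPAdapter
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=Retry(total=3)))

try:
    session.get("http://localhost:8080/health")
except requests.exceptions.RequestException as err:
    # The adapter passes the original urllib3 exception through as the
    # first positional argument, so it can still be inspected if needed.
    wrapped = err.args[0] if err.args else None
    if isinstance(wrapped, MaxRetryError):
        print("Retries were exhausted; underlying reason:", wrapped.reason)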

@robs-nice99
Author

Thanks for your help. Let me try to explain it better.

I configured Retry and created some tests to make sure everything is working correctly. I'm testing the healthcheck endpoint of a service that is being bootstrapped. I want to retry 10 times until I get an HTTP 200 OK response. During that time, I want to retry if I receive other HTTP status codes (since during boot the server behavior might be undefined) and also if any connection problem occurs (timeout, connection error, etc.).

To test this, I created some unit tests and was expecting to always get a MaxRetryError. What actually happens is that I get ConnectionError or RetryError. This is confusing, because I don't know whether I misconfigured Retry and it's only retrying on non-200 HTTP codes but not on connection errors. Also, I'm receiving a RetryError exception, which I was not expecting (I can't find any documentation about it).

"If we can, we try to provide a consistent exception API across both use of retries and not"

So, that's the issue. Since I'm using Retry, I was expecting to only get MaxRetryError. Because there's no documentation about Retry on the Requests site, I assumed it would work as documented in urllib3.

@sigmavirus24
Contributor

So, that's the issue. Since I'm using Retry, I was expecting to only get MaxRetryError. Because there's no documentation about Retry on the Requests site

It's not documented because you shouldn't be getting a retry error directly. It's all meant to look the same so you don't need a million branches to handle some logic.
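
In other words, a single branch can cover both the with-retries and without-retries configurations; a rough sketch (the endpoint is hypothetical):

import requests

try:
    requests.get("http://localhost:8080/health", timeout=2)
except requests.exceptions.RequestException as err:
    # Same handling whether the request failed on the first attempt or
    # only after a configured Retry was exhausted.
    print("Request failed:", err)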

@robs-nice99
Author

I see. That seems reasonable.

So let me just share my experience. I was looking for a way to do retries, and I found out it was already implemented. I think I found out how to use it from SO, but even so, the documentation in the comments only says to import urllib3's Retry. With no more details, I went to read urllib3's Retry documentation and expected to get that behavior.

I do believe it makes more sense to always get a MaxRetryError when the max retries are exhausted. Otherwise I have no way of knowing whether I got an error on the first try or only after the 5 retries I configured. Also notice that instead of only catching MaxRetryError, I have to catch ConnectTimeout, RetryError, ProxyError and SSLError. And notice that RetryError is a new and undocumented way of signalling that all retries were used up (so there is a retry error we get directly).

But I do see the value in having the same API for all the cases. There's never a way to make everyone happy.

I was just sharing my experience, and pointing out that documenting this would have made my coding easier.

@matthewarmand
Contributor

There's now simple documentation on retries in the Advanced Usage section, added by #6258: https://requests.readthedocs.io/en/latest/user/advanced/#example-automatic-retries

It's very basic at the moment, but I'm sure the maintainers would accept contributions with whatever additional information you think would be useful.

@sigmavirus24 closed this as not planned on Sep 21, 2023