
[feature request] auto-retry when HTTP 409 Conflict: Resource lock is received #2325

Closed
pspacek opened this issue Oct 18, 2022 · 5 comments · Fixed by #2326

Comments

@pspacek
Contributor

pspacek commented Oct 18, 2022

Description of the problem, including code/CLI snippet

Gitlab API requests can fail with the error Conflict: Resource lock, and it is tedious to write a retry loop around every call site.
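
For illustration, a minimal sketch of the kind of wrapper each call site needs today, assuming only the public gitlab.exceptions.GitlabError attributes visible in the traceback below (the helper name, attempt count, and backoff are arbitrary):

import time

import gitlab

def cancel_job_with_retry(job, attempts=5, delay=1.0):
    # Hypothetical wrapper: retry job.cancel() while GitLab reports "Resource lock".
    for attempt in range(attempts):
        try:
            return job.cancel()
        except gitlab.exceptions.GitlabError as exc:
            locked = exc.response_code == 409 and "Resource lock" in str(exc.error_message)
            if not locked or attempt == attempts - 1:
                raise  # not a resource-lock conflict, or out of attempts
            time.sleep(delay * (attempt + 1))  # simple linear backoff

An auto-retry inside python-gitlab would make this boilerplate unnecessary.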

I propose to add an (optional?) feature to auto-retry API calls which fail with this particular error.

In my opinion this is the same case as https://peps.python.org/pep-0475/ .

I'm happy to discuss details like retry limit etc. Thank you for considering this.

Expected Behavior

API calls are automatically retried when the server replies with "HTTP 409 Conflict: Resource lock".

Actual Behavior

Traceback (most recent call last):
  File "/usr/lib/python3.10/site-packages/gitlab/exceptions.py", line 333, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/lib/python3.10/site-packages/gitlab/v4/objects/jobs.py", line 32, in cancel
    result = self.manager.gitlab.http_post(path, **kwargs)
  File "/usr/lib/python3.10/site-packages/gitlab/client.py", line 1015, in http_post
    result = self.http_request(
  File "/usr/lib/python3.10/site-packages/gitlab/client.py", line 798, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 409: 409 Conflict: Resource lock

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/pspacek/w/isc/gitlab/docs_test.py", line 99, in <module>
    main()
  File "/home/pspacek/w/isc/gitlab/docs_test.py", line 53, in main
    proj.jobs.get(job.id, lazy=True).cancel()
  File "/usr/lib/python3.10/site-packages/gitlab/cli.py", line 71, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/lib/python3.10/site-packages/gitlab/exceptions.py", line 335, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabJobCancelError: 409: 409 Conflict: Resource lock

Specifications

  • python-gitlab version: 3.10.0
  • API version you are using (v3/v4): v4
  • Gitlab server version (or gitlab.com): gitlab.isc.org
@nejch
Member

nejch commented Oct 18, 2022

Thanks for the report @pspacek. A simple retry on 409 would be pretty trivial to add; we would just need to extend the list of retryable response codes:

RETRYABLE_TRANSIENT_ERROR_CODES = [500, 502, 503, 504] + list(range(520, 531))

We already have a backoff mechanism in place for the existing ones, so I wouldn't try anything more custom unless you find more issues.


But looking at this again, not every 409 response warrants a retry. If we can reliably inspect the message (resource lock and maybe more), I'd add it to the retries based on that, probably by extending this condition.

if (429 == result.status_code and obey_rate_limit) or (
    result.status_code in RETRYABLE_TRANSIENT_ERROR_CODES
    and retry_transient_errors
):
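
One possible shape for that extension, sketched against the fragment above (whether the check reads result.reason or the response body, and whether it stays behind retry_transient_errors, is exactly the kind of detail to settle in the PR):

if (429 == result.status_code and obey_rate_limit) or (
    (
        result.status_code in RETRYABLE_TRANSIENT_ERROR_CODES
        or (409 == result.status_code and "Resource lock" in result.reason)
    )
    and retry_transient_errors
):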

A quick code search (https://gitlab.com/search?search=409&nav_source=navbar&project_id=278964&group_id=9970&search_code=true&repository_ref=master) can perhaps give us some clues.

Would you be interested in opening a PR yourself @pspacek? We have hacktoberfest enabled on this repo.

@pspacek
Contributor Author

pspacek commented Oct 18, 2022

You are right, most HTTP 409 responses do not warrant a retry. From a casual browse through the Gitlab sources, I think we should go for "Resource lock" only and add others later if we find any; I did not notice anything else.

As for the PR, sure, let me give it a stab.

@nejch
Member

nejch commented Oct 18, 2022

Sounds great @pspacek. If the conditional gets too unwieldy we can factor it out into a _should_retry helper method as well.
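
For example, such a helper might look roughly like this (a sketch only; only the name _should_retry comes from the comment above, and the message check is a placeholder):

def _should_retry(self, result, retry_transient_errors, obey_rate_limit):
    # Rate-limited requests are retried whenever obey_rate_limit is set.
    if result.status_code == 429 and obey_rate_limit:
        return True
    # Everything else is retried only when retry_transient_errors is set:
    # the known transient 5xx codes, plus 409 when the reason signals a resource lock.
    if not retry_transient_errors:
        return False
    if result.status_code in RETRYABLE_TRANSIENT_ERROR_CODES:
        return True
    return result.status_code == 409 and "Resource lock" in result.reason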

@pspacek
Contributor Author

pspacek commented Oct 18, 2022

I was thinking that I could reuse the test_http_request_with_retry_on_method_for_transient_failures() test, but there is a catch: I cannot see a documented way for the responses library to customize the result.reason text. https://github.com/getsentry/responses/blob/873ddb4e3c44b0a72d2eb3665158931cd6e3f73b/responses/__init__.py#L548 is not very reassuring.

Do you have a preferred hack for this?
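
One possible hack, purely as an illustration, assuming the retry check reads result.reason and that http_request ultimately sends through requests.Session.send: build requests.Response objects by hand (reason is a plain attribute there) and patch the session to return them in sequence. The maintainer points to a project helper in the next comment, so treat this only as a sketch:

from unittest import mock

import requests

def make_response(status_code, reason, body=b"{}"):
    # requests.Response fields are plain attributes, so a fake can be assembled directly.
    resp = requests.Response()
    resp.status_code = status_code
    resp.reason = reason
    resp._content = body
    resp.headers["Content-Type"] = "application/json"
    return resp

locked = make_response(409, "409 Conflict: Resource lock")
created = make_response(201, "Created")

with mock.patch.object(requests.Session, "send", side_effect=[locked, created]) as send:
    # ... exercise the client's http_post() here and assert send.call_count == 2 ...
    pass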

@nejch
Member

nejch commented Oct 18, 2022


@pspacek I've answered in the PR :) I think the httmock_response helper will do here.

nejch pushed a commit to pspacek/python-gitlab that referenced this issue Dec 5, 2022
nejch pushed a commit to pspacek/python-gitlab that referenced this issue Dec 18, 2022
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Dec 25, 2023