Re-Prepare request for retry when using dynamic auth token #1680

Merged
merged 7 commits into hub4j:main on Aug 15, 2023

Conversation

emrebasar
Contributor

Description

Currently, a retried request fails if the authentication token expires between retries. This can be a common occurrence, as the GitHub API can throttle a user for up to an hour, which is longer than the lifetime of an authentication token retrieved via a GitHub App installation.

This change ensures that a fresh token is used in each request by re-creating the connector request for every retry.
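
As a rough sketch of the idea (simplified; the executeRequest helper and the loop shape here are illustrative, not the actual GitHubClient code):

GitHubConnectorResponse sendWithRetries(GitHubRequest request, int retries) throws IOException {
    while (true) {
        // Prepare the connector request on every attempt, so an authorization provider
        // backed by a GitHub App installation can supply a fresh token after a long wait.
        GitHubConnectorRequest connectorRequest = prepareConnectorRequest(request);
        try {
            return executeRequest(connectorRequest);
        } catch (RetryRequestException e) {
            if (retries-- <= 0) {
                throw e;
            }
            // Otherwise loop around and prepare a new connector request for the next attempt.
        }
    }
}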

Before submitting a PR:

  • Changes must not break binary backwards compatibility. If you are unclear on how to make the change you think is needed while maintaining backward compatibility, see CONTRIBUTING.md for details.
  • Add JavaDocs and other comments explaining the behavior.
  • When adding or updating methods that fetch entities, add @link JavaDoc entries to the relevant documentation on https://docs.github.com/en/rest.
  • Add tests that cover any added or changed code. This generally requires capturing snapshot test data. See CONTRIBUTING.md for details.
  • Run mvn -D enable-ci clean install site locally. If this command doesn't succeed, your change will not pass CI.
  • Push your changes to a branch other than main. You will create your PR from that branch.

When creating a PR:

  • Fill in the "Description" above with a clear summary of the changes. This includes:
    • If this PR fixes one or more issues, include "Fixes #" lines for each issue.
    • Provide links to relevant documentation on https://docs.github.com/en/rest where possible. If not including links, explain why not.
  • All lines of new code should be covered by tests as reported by code coverage. Any lines that are not covered must have PR comments explaining why they cannot be covered. For example, "Reaching this particular exception is hard and is not a particularly common scenario."
  • Enable "Allow edits from maintainers".

@bitwiseman
Member

@emrebasar
I see what you're saying; this is probably an acceptable change, but you'll at least need to update the tests to match the new behavior.

Also, retry scenarios tend to be quick turnarounds. What you're describing is generally only going to happen with rate limiting and long waits. Maybe you should add a special error-detection scenario for the case you want to address. Make it similar to

private void detectInvalidCached404Response(GitHubConnectorResponse connectorResponse, GitHubRequest request)
        throws IOException {
    // WORKAROUND FOR ISSUE #669:
    // When the Requester detects a 404 response with an ETag (only happens when the server's 304
    // is bogus and would cause cache corruption), try the query again with a new request header
    // that forces the server to not return 304 and return new data instead.
    //
    // This solution is transparent to users of this library and automatically handles a
    // situation that was causing insidious and hard to debug bad responses in caching
    // scenarios. If GitHub ever fixes their issue and/or begins providing accurate ETags to
    // their 404 responses, this will result in at worst two requests being made for each 404
    // response. However, only the second request will count against the rate limit.
    if (connectorResponse.statusCode() == 404 && Objects.equals(connectorResponse.request().method(), "GET")
            && connectorResponse.header("ETag") != null
            && !Objects.equals(connectorResponse.request().header("Cache-Control"), "no-cache")) {
        LOGGER.log(FINE,
                "Encountered GitHub invalid cached 404 from " + connectorResponse.request().url()
                        + ". Retrying with \"Cache-Control\"=\"no-cache\"...");
        // Setting "Cache-Control" to "no-cache" stops the cache from supplying
        // "If-Modified-Since" or "If-None-Match" values.
        // This makes GitHub give us current data (not incorrectly cached data)
        throw new RetryRequestException(
                prepareConnectorRequest(request.toBuilder().setHeader("Cache-Control", "no-cache").build()));
    }
}
where you detect the situation and throw a RetryRequestException.

The detector should check the HTTP error code and prepare a new request. If the authorization string for the new request is the same as the one on the current connectorRequest, then you have a valid failure. Otherwise, throw a RetryRequestException with the updated GitHubConnectorRequest that the main loop will use for the retry.
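
Something like this, just a sketch inside GitHubClient (the detectExpiredAuthorization name and the 401 status check are placeholders, not existing code):

private void detectExpiredAuthorization(GitHubConnectorResponse connectorResponse, GitHubRequest request)
        throws IOException {
    // Placeholder sketch: react only to an authentication failure.
    if (connectorResponse.statusCode() == 401) {
        // Re-preparing the request asks the credential provider again, which may
        // produce a fresh installation token.
        GitHubConnectorRequest updatedRequest = prepareConnectorRequest(request);
        if (Objects.equals(connectorResponse.request().header("Authorization"),
                updatedRequest.header("Authorization"))) {
            // Same authorization string as before: this is a genuine failure, let it surface.
            return;
        }
        // The token changed, so retry with the refreshed request.
        throw new RetryRequestException(updatedRequest);
    }
}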

Make sense?

Member

@bitwiseman bitwiseman left a comment

For most users, preparing the request again is not needed. As noted, we should detect the scenario you describe and update the request only in that case.

@emrebasar
Contributor Author

Thank you for the feedback. In our use case, rate limiting is the primary cause of retries, and we observe this specific scenario fairly often, though I agree that it might not be the case for the majority of use cases. I will update the code accordingly and fix the failing tests.

@bitwiseman bitwiseman self-requested a review July 4, 2023 00:19
@codecov

codecov bot commented Jul 4, 2023

Codecov Report

Patch coverage: 90.90% and project coverage change: +0.06% 🎉

Comparison is base (c283716) 80.04% compared to head (3544ce6) 80.10%.
Report is 6 commits behind head on main.

Additional details and impacted files
@@             Coverage Diff              @@
##               main    #1680      +/-   ##
============================================
+ Coverage     80.04%   80.10%   +0.06%     
- Complexity     2231     2237       +6     
============================================
  Files           215      215              
  Lines          6760     6771      +11     
  Branches        365      367       +2     
============================================
+ Hits           5411     5424      +13     
+ Misses         1134     1132       -2     
  Partials        215      215              
Files Changed                                         Coverage Δ
src/main/java/org/kohsuke/github/GitHubClient.java   86.49% <90.90%> (+0.18%) ⬆️

... and 2 files with indirect coverage changes

Member

@bitwiseman bitwiseman left a comment

All we need now is a test covering the newly added code.

@emrebasar
Contributor Author

Hi @bitwiseman, I dove into this and tried implementing some tests, but got stuck configuring the correct WireMock behavior. I am planning to continue looking at this beginning in week 32. How would you like to proceed? Should we close this PR and open a new one when the tests are fixed, or should we keep this one open?

@bitwiseman
Member

@emrebasar
Take your time, we'll keep this open. Update this PR as needed. Thanks for working on this. Feel free to ask questions if you get stuck.

@emrebasar
Contributor Author

Hi again. I have now returned, and added two tests that verify the behavior. I think there might be some cleanup opportunities in the Wiremock fixtures, but I haven't looked at them in detail yet. I will check that once the CI verifies that the test coverage is at a satisfactory level.

@bitwiseman bitwiseman self-requested a review August 14, 2023 19:40
@bitwiseman
Member

Missed javadoc comments on tests.

@bitwiseman bitwiseman merged commit 238331d into hub4j:main Aug 15, 2023
11 checks passed
@bitwiseman bitwiseman changed the title Re-Prepare request for every retry Re-Prepare request for retry when using dynamic auth tokens Aug 15, 2023
@bitwiseman bitwiseman changed the title Re-Prepare request for retry when using dynamic auth tokens Re-Prepare request for retry when using dynamic auth token Aug 15, 2023
@emrebasar emrebasar deleted the prepare_request_on_retries branch August 16, 2023 07:12