feat: retries for resumable bucket.upload and file.save #1511

Merged: 2 commits merged into googleapis:master on Jul 23, 2021

Conversation

ddelgrosso1 (Contributor)

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

  • Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • Ensure the tests and linter pass
  • Code coverage does not decrease (if any source code was changed)
  • Appropriate docs were updated (if necessary)

Fixes #<issue_number_goes_here> 🦕

@ddelgrosso1 ddelgrosso1 requested a review from a team July 22, 2021 19:05
@ddelgrosso1 ddelgrosso1 requested a review from a team as a code owner July 22, 2021 19:05
@product-auto-label product-auto-label bot added the api: storage Issues related to the googleapis/nodejs-storage API. label Jul 22, 2021
@google-cla google-cla bot added the cla: yes This human has signed the Contributor License Agreement. label Jul 22, 2021
src/bucket.ts Outdated
@@ -3737,7 +3737,7 @@ class Bucket extends ServiceObject {
       .pipe(writable)
       .on('error', err => {
         if (
-          isMultipart &&
+          (isMultipart || options.resumable) &&
Contributor

This will always evaluate to true, right? Because each upload is either multipart or resumable. I think this should be checking for (isMultipart || err.message) or something like that. We want to retry all multipart uploads and then resumable uploads only where it is a certain error (because otherwise gcs-resumable-upload will handle it)
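
For illustration, a minimal sketch of the gate being suggested here; isRetryableCode and shouldRetryAtThisLevel are hypothetical names for this discussion, not the code in this PR:

// Hypothetical predicate standing in for the library's retryable-error check:
// treat 429 and 5xx status codes as retryable.
const isRetryableCode = (code?: number): boolean =>
  code !== undefined && (code === 429 || code >= 500);

// Sketch of the intended gate: always consider retrying multipart uploads at
// this level, but for resumable uploads only act on errors that carry a
// retryable status code, on the assumption that gcs-resumable-upload has
// already handled everything else downstream.
function shouldRetryAtThisLevel(
  isMultipart: boolean,
  err: {code?: number; message: string}
): boolean {
  return isMultipart || isRetryableCode(err.code);
}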

ddelgrosso1 (Contributor, Author)

In that case maybe we don't need this check at all. Certain errors are being explicitly handled down at the gcs-resumable-upload level here. This basically tells Gaxios to let gcs-resumable-upload handle everything. However, during the URL creation only certain codes are handled at that level and the rest bubble up. I think we should handle anything that bubbles up and meets our retryableFn criteria at this level. What do you think?

Contributor

If we try to retry a 500 error downstream and it never works out, it will be bubbled up here and then we will send it back downstream. This creates exponentially more retries than maxRetries.

ddelgrosso1 (Contributor, Author)

It actually doesn't cause an exponential increase in retries because the error that occurs when retries are exhausted is not being checked for in the retry function (it only has an error message and no error code). As a result the code at this level does not send it back downstream. Everything appears to get handled correctly. I will circle up with you to show you what I mean in case I missed anything obvious.
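
A small illustration of the point above, assuming (as in the sketch earlier) that the upper-level retry predicate only matches errors that carry a status code; the names here are hypothetical:

// Same hypothetical predicate as above: only errors with a retryable HTTP
// status code are retried at this level.
const isRetryableCode = (code?: number): boolean =>
  code !== undefined && (code === 429 || code >= 500);

// When gcs-resumable-upload exhausts its own retries it is assumed to surface
// an error with only a message and no status code ...
const exhausted: {code?: number; message: string} = {message: 'Retry limit exceeded'};

// ... so the upper-level predicate rejects it and nothing is sent back downstream.
console.log(isRetryableCode(exhausted.code)); // false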

test/file.ts Outdated
@@ -4118,7 +4118,7 @@ describe('File', () => {
       await file.save(DATA, options);
       throw Error('unreachable');
     } catch (e) {
-      assert.strictEqual(e.message, 'first error');
+      assert.strictEqual(e.message, 'unreachable');
Contributor

We shouldn't test this like this. If we're checking that it retries, we shouldn't throw the "unreachable" error (because it will be reachable) and instead assert the retryCount
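
As a rough illustration of that suggestion (the helper and test names below are hypothetical, not the ones in test/file.ts): count the attempts made by a handler that fails once with a retryable error, and assert that count instead of relying on an 'unreachable' sentinel.

import * as assert from 'assert';

// Hypothetical retry wrapper, standing in for the retry behavior under test.
async function saveWithRetry(
  doSave: () => Promise<void>,
  maxRetries: number
): Promise<void> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await doSave();
    } catch (e) {
      if (attempt >= maxRetries) throw e;
    }
  }
}

it('retries the save and succeeds on the second attempt', async () => {
  let attempts = 0;
  const failOnce = async () => {
    attempts++;
    if (attempts === 1) {
      throw Object.assign(new Error('first error'), {code: 500});
    }
  };
  await saveWithRetry(failOnce, 2);
  assert.strictEqual(attempts, 2); // assert the retry count directly
});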

@ddelgrosso1 ddelgrosso1 added the owlbot:run Add this label to trigger the Owlbot post processor. label Jul 23, 2021
@gcf-owl-bot gcf-owl-bot bot removed the owlbot:run Add this label to trigger the Owlbot post processor. label Jul 23, 2021
@shaffeeullah shaffeeullah self-requested a review July 23, 2021 19:27
@shaffeeullah shaffeeullah merged commit 9bf163c into googleapis:master Jul 23, 2021
@ddelgrosso1 ddelgrosso1 deleted the resumable-retry branch July 23, 2021 19:34