
[Feature Request]: Making number of retries configurable in BigQuery Storage Write connector #25382

@fpoon

Description


What would you like to happen?

Hi,
Is it possible to make this instance of the retry manager configurable?

new RetryManager<>(Duration.standardSeconds(1), Duration.standardSeconds(10), 1000);

Retrying the request to the BQ Storage Write API a thousand times seems a bit too generous, especially for nonrecoverable errors; I suppose it may have its justification in some cases, but in streaming mode it clogs the pipeline for an unreasonable amount of time. Maybe clients should be allowed to configure this value?
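
For illustration only, here is a rough sketch of how the retry budget could be surfaced as a pipeline option instead of the hard-coded 1000. The option name (storageWriteApiMaxRetries), the options interface, and the call-site wiring are assumptions made up for this example, not existing Beam API:

import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;

// Hypothetical options interface; the option name and default are placeholders.
public interface StorageWriteRetryOptions extends PipelineOptions {

  @Description("Maximum retry attempts for BigQuery Storage Write API appends")
  @Default.Integer(1000)
  Integer getStorageWriteApiMaxRetries();

  void setStorageWriteApiMaxRetries(Integer value);
}

// The connector could then read the option where the RetryManager is built,
// rather than hard-coding the third argument:
//
//   new RetryManager<>(
//       Duration.standardSeconds(1),
//       Duration.standardSeconds(10),
//       options.getStorageWriteApiMaxRetries());

With a default of 1000, existing pipelines would keep today's behavior, while streaming users hitting nonrecoverable errors could lower the value.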

Issue Priority

Priority: 2 (default / most feature requests should be filed as P2)

Issue Components

  • Component: Python SDK
  • Component: Java SDK
  • Component: Go SDK
  • Component: Typescript SDK
  • Component: IO connector
  • Component: Beam examples
  • Component: Beam playground
  • Component: Beam katas
  • Component: Website
  • Component: Spark Runner
  • Component: Flink Runner
  • Component: Samza Runner
  • Component: Twister2 Runner
  • Component: Hazelcast Jet Runner
  • Component: Google Cloud Dataflow Runner
