
Conversation

@githubwua (Contributor) commented May 18, 2023

When Apache Beam tries to insert a BigQuery row that is between 9 and 10 MB in size, the following error occurs and confuses users: it cites BigQuery's 10 MB request limit even though BigQueryIO rejects the row at 9 MB.

"Error message from worker: java.lang.RuntimeException: We have observed a row that is 9641699 bytes in size. BigQuery supports request sizes up to 10MB, and this row is too large. You may change your retry strategy to unblock this pipeline, and the row will be output as a failed insert."

Updating the error message to display the actual size limit.
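As context for reviewers, here is a minimal, hypothetical sketch of the kind of size check involved. The class and member names (RowSizeCheck, MAX_ROW_BYTES, checkRowSize) are illustrative, not Beam's actual source, and the exact byte threshold is an assumption; only the revised message wording (quoted in full later in this thread) is taken from this change:

```java
// Hypothetical sketch of a row-size check that raises the revised error.
// RowSizeCheck, MAX_ROW_BYTES, and checkRowSize are illustrative names,
// not the actual identifiers in BigQueryServicesImpl.
public class RowSizeCheck {

  // BigQuery accepts requests up to 10 MB; BigQueryIO caps individual rows
  // at 9 MB to leave room for request overhead (per the revised message).
  // Whether Beam uses a binary or decimal megabyte here is an assumption.
  static final long MAX_ROW_BYTES = 9L * 1024 * 1024;

  static void checkRowSize(long rowBytes) {
    if (rowBytes > MAX_ROW_BYTES) {
      throw new RuntimeException(
          "We have observed a row that is " + rowBytes
              + " bytes in size and exceeded BigQueryIO limit of 9MB. While BigQuery"
              + " supports request sizes up to 10MB, BigQueryIO sets the limit at 9MB"
              + " to leave room for request overhead. You may change your retry"
              + " strategy to unblock this pipeline, and the row will be output as a"
              + " failed insert.");
    }
  }
}
```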


@github-actions (Contributor) commented:

Assigning reviewers. If you would like to opt out of this review, comment assign to next reviewer:

R: @bvolpato for label java.
R: @johnjcasey for label io.

Available commands:

  • stop reviewer notifications - opt out of the automated review tooling
  • remind me after tests pass - tag the comment author after tests pass
  • waiting on author - shift the attention set back to the author (any comment or push by the author will return the attention set to the reviewers)

The PR bot will only process comments in the main thread (not review comments).

Update BigQueryServicesImplTest.java to test against revised error message

Revised test to address the following error message change:

Old Message:
"We have observed a row that is 9446414 bytes in size. BigQuery supports request sizes up to 10MB, and this row is too large.  You may change your retry strategy to unblock this pipeline, and  the row will be output as a failed insert."


New Message:
"We have observed a row that is 9446414 bytes in size and exceeded BigQueryIO limit of 9MB. While BigQuery supports request sizes up to 10MB, BigQueryIO sets the limit at 9MB to leave room for request overhead. You may change your retry strategy to unblock this pipeline, and the row will be output as a failed insert."
@github-actions (Contributor) commented:

Reminder, please take a look at this pr: @bvolpato @johnjcasey

@johnjcasey merged commit bd274b7 into apache:master on May 30, 2023
cushon pushed a commit to cushon/beam that referenced this pull request May 24, 2024
* Revise error message to state the actual size limit

* Update BigQueryServicesImplTest.java to test against revised error message
