If the DATE type is specified in the schema, BigQueryIO with the Storage Write API may fail with the following error:
Got error com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: The proto field mismatched with BigQuery field at <table>, the proto field type string, BigQuery field type DATE Entity: <write_stream_name>
The DATE type in Beam for BigQuery is converted to a string in the proto request [1], while the BigQuery Storage Write API requires an integer for BigQuery's DATE type [2].
A workaround is to declare the field as INTEGER in Beam and manually compute the number of days since the epoch.
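The workaround above can be sketched as follows. This is a minimal illustration, not Beam API code; the class and method names are hypothetical. It relies on java.time.LocalDate.toEpochDay(), which returns the day count since 1970-01-01, matching the Storage Write API's integer encoding of DATE values.

```java
import java.time.LocalDate;

// Hypothetical helper: convert an ISO-8601 date string to days since the
// Unix epoch, so the field can be declared INTEGER in the Beam schema
// instead of DATE.
public class DateToEpochDays {
    static long toEpochDays(String isoDate) {
        // LocalDate.toEpochDay() counts days since 1970-01-01 (day 0).
        return LocalDate.parse(isoDate).toEpochDay();
    }

    public static void main(String[] args) {
        System.out.println(toEpochDays("1970-01-01")); // 0
        System.out.println(toEpochDays("1970-01-02")); // 1
    }
}
```

The resulting long would then be set on the TableRow in place of the date string before handing the row to BigQueryIO.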
A feature request has been filed with BigQuery at [3]. Once it is implemented, the DATE type can be used directly.
[1] https://github.com/apache/beam/blob/v2.35.0/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/TableRowToStorageApiProto.java#L55-L73
[2] https://cloud.google.com/bigquery/docs/write-api#data_type_conversions
[3] https://issuetracker.google.com/issues/205174128
Imported from Jira BEAM-13753. Original Jira may contain additional context.
Reported by: baeminbo.