bigquery.repackaged.io.grpc.StatusRuntimeException: INVALID_ARGUMENT: request failed #197
Comments
@kmjung I think the error comes from the BigQuery Storage API. Can you please check?
Submitted the following Scala job (with the same SQL query) to the same Dataproc cluster, using the same BigQuery connector version, and it works:

```scala
val df = spark.read.format("bigquery")
  .option("table", table)
  .load()
  .cache()

df.createOrReplaceTempView("myEvents")
spark.sql("select * from myEvents where geo is not null").show(2)
```

TL;DR: works in Scala but not in PySpark.

EDIT: this fails without the cache() option.
@martinKindall Notice that in the Scala version you cached the result, so the SQL may have run against the cached data. The error you got comes from the BigQuery Storage API, when this specific filter was pushed down.
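That observation suggests a possible short-term workaround: materialize the table in Spark before filtering, so the `geo is not null` predicate is evaluated by Spark itself rather than pushed down to the Storage API. A minimal PySpark sketch, assuming an existing SparkSession `spark` with the connector on the classpath; `table` is a placeholder:

```python
# Hedged workaround sketch: cache() materializes the table in Spark,
# so the struct IS NOT NULL filter below runs against the cached data
# instead of being pushed down to the BigQuery Storage API.
df = (spark.read.format("bigquery")
      .option("table", table)  # placeholder table id
      .load()
      .cache())

df.createOrReplaceTempView("myEvents")
spark.sql("select * from myEvents where geo is not null").show(2)
```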
@davidrabinowitz Tried the Scala script without the cache option and obtained the same error; here's the trace:
Any news on this?
This filter is not supported by the Storage API at the moment, which is something that we're hoping to be able to remedy in Q3 of this year. It's possible that we can provide a short-term workaround in the Spark connector itself. @Gaurangi94, any chance you can help triage this?
Should be fixed in 0.17.0 |
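For anyone still hitting this, one way to pick up the fixed release is to pin the connector version explicitly. A minimal sketch, assuming the standard Maven coordinates for the with-dependencies artifact (adjust the Scala suffix to match your cluster image):

```python
from pyspark.sql import SparkSession

# Sketch: pull in the 0.17.0 connector via spark.jars.packages.
# Assumes the com.google.cloud.spark with-dependencies artifact; use
# the _2.11 or _2.12 suffix that matches your Dataproc image.
spark = (
    SparkSession.builder
    .appName("bq-struct-filter-check")
    .config(
        "spark.jars.packages",
        "com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.17.0",
    )
    .getOrCreate()
)
```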
Original issue description:

Conditions:
- On Dataproc, using the spark-bigquery-connector (latest 2.12 jar) and PySpark.
- Image: Preview 2.0-debian10 (also tried 1.5-debian, without success).
Code that fails:
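The original snippet isn't preserved in this capture; based on the Scala equivalent quoted in the comments above, the failing PySpark job was presumably along these lines (a hedged reconstruction; `table` is a placeholder):

```python
# Hedged reconstruction of the failing PySpark job, inferred from the
# Scala version posted in the comments; `table` is a placeholder id.
df = (spark.read.format("bigquery")
      .option("table", table)
      .load())

df.createOrReplaceTempView("myEvents")

# The IS NOT NULL filter on the `geo` struct column gets pushed down to
# the BigQuery Storage API, which rejects it with INVALID_ARGUMENT.
spark.sql("select * from myEvents where geo is not null").show(2)
```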
The geo column is a Struct.
Here's part of the stacktrace:
Fact: this query works.
Other fact: the failing query runs fine in the BigQuery editor.
Questions: