We observe the following issue in v10: if the Google service account has permission to read datasets from multiple locations in the BigQuery source, then the INFORMATION_SCHEMA refresh query fails because the datasets live in different locations. The error has the form:

dataset xyz was not found in location xyz

In our view, the BigQuery schema refresh should not depend on the location of the datasets. Is it possible to bring back the v8 behaviour in the v10 BigQuery schema refresh?
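One way to see why this fails: BigQuery resolves INFORMATION_SCHEMA views only within the location the query job runs in, so a single refresh query cannot span datasets in different regions. As a hypothetical workaround (not Redash code; the helper name and the `datasets` input shape are our own), the refresh could group datasets by location and issue one query per region:

```python
from collections import defaultdict

def schema_queries_by_location(datasets):
    """Build one INFORMATION_SCHEMA refresh query per location.

    `datasets` is a list of (dataset_name, location) pairs, e.g. as
    returned by a datasets.list call. Each returned query only touches
    datasets from a single location, so it can be run as a query job
    in that location without a "not found in location" error.
    """
    groups = defaultdict(list)
    for name, location in datasets:
        groups[location].append(name)

    queries = {}
    for location, names in groups.items():
        # UNION ALL the per-dataset INFORMATION_SCHEMA.COLUMNS views
        # that share this location into one refresh query.
        queries[location] = " UNION ALL ".join(
            f"SELECT table_name, column_name "
            f"FROM `{name}`.INFORMATION_SCHEMA.COLUMNS"
            for name in sorted(names)
        )
    return queries
```

The caller would then execute each query with the job location set to its key, instead of assuming every dataset shares one location.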
Issue Summary (BigQuery Schema Refresh)
In v8, the BigQuery schema refresh is implemented with REST APIs:
(https://github.com/getredash/redash/blob/v8.0.0/redash/query_runner/big_query.py#L262)
In v10, the BigQuery schema refresh is implemented with an INFORMATION_SCHEMA query:
(https://github.com/getredash/redash/blob/v10.1.0/redash/query_runner/big_query.py#L264)
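For comparison, the v8-style behaviour can be sketched with the `google-cloud-bigquery` client, whose dataset and table listing methods wrap the REST API and are not tied to a single query-job location. This is an illustrative sketch of that approach, not the actual v8 code (which calls the REST endpoints directly); the function name and output shape are our own:

```python
def refresh_schema_via_rest(client):
    """Walk every dataset and table the service account can see,
    using REST-backed listing calls rather than a query job, so
    datasets in different locations are all reachable."""
    schema = []
    for dataset in client.list_datasets():
        dataset_id = dataset.dataset_id
        for table in client.list_tables(dataset_id):
            # get_table fetches the full table, including its schema.
            full = client.get_table(table.reference)
            schema.append({
                "name": f"{dataset_id}.{full.table_id}",
                "columns": [field.name for field in full.schema],
            })
    return schema
```

With a real `google.cloud.bigquery.Client` this iterates every dataset regardless of region, which is the location-independent behaviour the issue asks to restore.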