
Terraform BigQuery Table Hive partitioning support #2121

Conversation

modular-magician
Collaborator

Fixes: hashicorp/terraform-provider-google#5664
As of March 2, 2020, range partitioning and hive partitioning are GA; see https://cloud.google.com/bigquery/docs/release-notes.
Note: the `require_partition_filter` attribute is not supported, as it is not available in the BigQuery SDK used here.

Release Note Template for Downstream PRs (will be copied)

bigquery: Added support for `google_bigquery_table` `hive_partitioning_options`
bigquery: Added `google_bigquery_table` `range_partitioning` to GA
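For reference, the new `hive_partitioning_options` block nests inside `external_data_configuration` on the `google_bigquery_table` resource. A minimal sketch (the dataset, table, and bucket names below are hypothetical):

```hcl
resource "google_bigquery_table" "events" {
  dataset_id = "external"                # hypothetical dataset
  table_id   = "events_hive_partitioned" # hypothetical table name

  external_data_configuration {
    autodetect    = true
    source_format = "NEWLINE_DELIMITED_JSON"
    source_uris   = ["gs://example-bucket/data/*"] # hypothetical bucket

    hive_partitioning_options {
      # AUTO infers partition key names and types from the object paths;
      # CUSTOM requires the key schema to be encoded in source_uri_prefix.
      mode              = "AUTO"
      source_uri_prefix = "gs://example-bucket/data/"
    }
  }
}
```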

Derived from GoogleCloudPlatform/magic-modules#3335

* range partitioning for BigQuery is GA

* add hive partitioning options to google_bigquery_table

* improve on formatting of bigquery table hive partitioning options

* correct indenting on resource_bigquery_table_test.go

* minor fix on the documentation of bigquery table

* align bigquery table test with upstream changes

* gofmt on resource_bigquery_table, resource_bigquery_table_test.go

Signed-off-by: Modular Magician <magic-modules@google.com>
@modular-magician modular-magician merged commit 34ccce3 into hashicorp:master May 29, 2020
@LoekL

LoekL commented Jun 23, 2020

FYI I tried using this today, no dice. The table does get created:

[screenshot]

But there's no sign of the partitions in the table schema:

[screenshot]

If I however run:

bq mkdef \
  --autodetect \
  --ignore_unknown_values \
  --source_format=NEWLINE_DELIMITED_JSON \
  --hive_partitioning_mode=CUSTOM \
  --hive_partitioning_source_uri_prefix=gs://anhistous-metonymic-7578834-analytics-staging-batched/data_type=jsonl/event_schema=playfab/{event_category:STRING}/{event_environment:STRING}/{event_date:DATE}/{event_hour:STRING}/{event_minute:STRING} \
  gs://anhistous-metonymic-7578834-analytics-staging-batched/data_type=jsonl/event_schema=playfab/\* \
  AnalyticsEnvironment:STRING,PlayFabEnvironment:STRING,SourceType:STRING,Source:STRING,EventNamespace:STRING,TitleId:STRING,GroupBatchId:STRING,BatchId:STRING,EventId:STRING,EventName:STRING,EntityType:STRING,EntityId:STRING,Timestamp:TIMESTAMP,ReceivedTimestamp:TIMESTAMP,BatchedTimestamp:TIMESTAMP,BatchJobName:STRING,EventAttributes:STRING \
  > /Users/loek/Desktop/events_playfab_staging_hive_partitioned_batched

Followed by:

bq mk --table --location=US --external_table_definition=/Users/loek/Desktop/events_playfab_staging_hive_partitioned_batched external.events_playfab_hive_partitioned_batched

It does work:

[screenshot]

&

[screenshot]

The only difference I can see in the BQ UI is that when the table is created via the bq CLI, I don't see Compression = GZIP in the table details. I'm using Terraform v0.12.18 and provider.google v3.26.0.
