
Add force_destroy field for BigQuery dataset #2050

Closed
ocervell opened this issue Sep 13, 2018 · 3 comments

Comments


ocervell commented Sep 13, 2018

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment. If the issue is assigned to the "modular-magician" user, it is either in the process of being autogenerated, or is planned to be autogenerated soon. If the issue is assigned to a user, that user is claiming responsibility for the issue. If the issue is assigned to "hashibot", a community member has claimed the issue already.

Description

Currently, we have to write our own custom script to empty a BigQuery dataset (delete all of its tables) before the dataset itself can be deleted.

The shell script looks like:

#!/bin/sh
# Delete every table in a dataset so the dataset itself can be destroyed.
PROJECT_ID=$1
DATASET_NAME=$2

# "bq ls" prints two header lines; strip them and keep the table name column.
for table in $(bq ls "$PROJECT_ID:$DATASET_NAME" | sed 1,2d | awk '{print $1}'); do
  bq rm -f -t "$PROJECT_ID:$DATASET_NAME.$table"
done

and the resource definition:

# BigQuery dataset
resource "google_bigquery_dataset" "dataset" {
  count      = "${local.destination_type == "bigquery" ? 1 : 0}"
  dataset_id = "${local.destination_name}"
  project    = "${local.destination_project}"

  # Delete all tables in dataset on destroy.
  # This is required because a dataset cannot be deleted if it contains any data.
  provisioner "local-exec" {
    when    = "destroy"
    command = "sh ${path.module}/scripts/delete-bq-tables.sh ${local.destination_project} ${local.destination_name}"
  }

  depends_on = [
    "google_project_service.enable_destination_api",
  ]
}

Similar to the google_storage_bucket resource, google_bigquery_dataset should have a force_destroy flag that, when set, deletes all tables in the dataset on destroy.
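
For comparison, this is roughly how the existing flag is used on google_storage_bucket (a minimal sketch; the variable names are placeholders):

resource "google_storage_bucket" "bucket" {
  name    = "${var.bucket_name}"
  project = "${var.project_id}"

  # Delete all objects in the bucket when the bucket is destroyed.
  force_destroy = true
}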

New or Affected Resource(s)

  • google_bigquery_dataset

Potential Terraform Configuration

resource "google_bigquery_dataset" "dataset" {
  dataset_id = "${var.dataset_id}"
  project    = "${var.project_id}"
  force_destroy = true
}
@ocervell (Author)

@danawillow Any update on this?

@rileykarson (Collaborator)

Fixed by #2986; delete_contents_on_destroy is available in 2.0.0.
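
For anyone landing here, the released field is set on the dataset itself; a minimal sketch (the variable names are placeholders):

resource "google_bigquery_dataset" "dataset" {
  dataset_id = "${var.dataset_id}"
  project    = "${var.project_id}"

  # Delete all tables in the dataset when the dataset is destroyed.
  delete_contents_on_destroy = true
}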


ghost commented Mar 18, 2019

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 hashibot-feedback@hashicorp.com. Thanks!

ghost locked and limited conversation to collaborators Mar 18, 2019