
+++
title = "Overview"
date = 2020-04-20T23:34:03-04:00
weight = 5
chapter = false
+++

# Quickstart: Building and Using the Provider

## Quick install

To install the binary quickly, run the following curl command in your shell:

```bash
$ curl https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/godownloader-databricks-provider.sh | bash -s -- -b $HOME/.terraform.d/plugins
```

The command downloads the binary into your `~/.terraform.d/plugins` folder. You can `ls` that directory to verify.
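For example, a quick check might look like the following (the exact binary filename is an assumption here and may carry a version suffix depending on the release):

```bash
# List the Terraform plugin directory to confirm the provider binary is in place
$ ls ~/.terraform.d/plugins
terraform-provider-databricks
```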

## Requirements

Please note that there is a Makefile containing all the commands you need to build and test this project.
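As an illustration, a contributor session driven by that Makefile might look like the sketch below; the target names are assumptions for the example, so check the Makefile itself for the real targets:

```bash
$ make build   # hypothetical target: compile the provider binary
$ make test    # hypothetical target: run the unit test suite
```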

Contributing to this code base requires the following software:

- [Go](https://golang.org/) (1.13 or later, per the version check below)
- [Terraform](https://www.terraform.io/) (0.12 or later, per the version check below)
- make, to run the Makefile targets

To make sure everything is installed correctly, run the following commands:

Testing the Go installation:

```bash
$ go version
go version go1.13.3 darwin/amd64
```

Testing the Terraform installation:

```bash
$ terraform --version
Terraform v0.12.19

Your version of Terraform is out of date! The latest version
is 0.12.24. You can update by downloading from https://www.terraform.io/downloads.html
```

## Basic Terraform Workflow

Sample Terraform code that configures the provider and declares a user:

```hcl
provider "databricks" {
  host  = "http://databrickshost.com"
  token = "dapitokenhere"
}

resource "databricks_scim_user" "my-user" {
  # count is required so that the count.index reference below is valid
  count        = 2
  user_name    = join("", ["test-user", "+", count.index, "@databricks.com"])
  display_name = "Test User"
}
```
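As a side note, hardcoding a token in HCL is generally avoided; the provider can also pick up credentials from the environment, leaving the `provider "databricks" {}` block empty. A minimal sketch, assuming the provider honors the `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables:

```bash
# Export credentials so they do not have to be hardcoded in the HCL
$ export DATABRICKS_HOST="http://databrickshost.com"
$ export DATABRICKS_TOKEN="dapitokenhere"
```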

Then run `terraform init` followed by `terraform apply` to apply the HCL code to your Databricks workspace, as shown below.
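A typical end-to-end run of that workflow (the `terraform plan` step is optional but useful for previewing changes before applying them):

```bash
$ terraform init    # download providers and initialize the working directory
$ terraform plan    # preview the changes Terraform will make
$ terraform apply   # create the resources in the Databricks workspace
```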

Please refer to the HTML documentation for detailed usage of the provider's resources and data sources.

## Project Components

### Databricks Terraform Provider Resources State

| Resource | Implemented | Import Support | Acceptance Tests | Documentation | Reviewed | Finalize Schema |
| --- | --- | --- | --- | --- | --- | --- |
| databricks_token | | | | | | |
| databricks_secret_scope | | | | | | |
| databricks_secret | | | | | | |
| databricks_secret_acl | | | | | | |
| databricks_instance_pool | | | | | | |
| databricks_scim_user | | | | | | |
| databricks_scim_group | | | | | | |
| databricks_notebook | | | | | | |
| databricks_cluster | | | | | | |
| databricks_job | | | | | | |
| databricks_dbfs_file | | | | | | |
| databricks_dbfs_file_sync | | | | | | |
| databricks_instance_profile | | | | | | |
| databricks_aws_s3_mount | | | | | | |
| databricks_azure_blob_mount | | | | | | |
| databricks_azure_adls_gen1_mount | | | | | | |
| databricks_azure_adls_gen2_mount | | | | | | |

### Databricks Terraform Data Sources State

| Data Source | Implemented | Acceptance Tests | Documentation | Reviewed |
| --- | --- | --- | --- | --- |
| databricks_notebook | | | | |
| databricks_notebook_paths | | | | |
| databricks_dbfs_file | | | | |
| databricks_dbfs_file_paths | | | | |
| databricks_zones | | | | |
| databricks_runtimes | | | | |
| databricks_instance_pool | | | | |
| databricks_scim_user | | | | |
| databricks_scim_group | | | | |
| databricks_cluster | | | | |
| databricks_job | | | | |
| databricks_mount | | | | |
| databricks_instance_profile | | | | |
| databricks_database | | | | |
| databricks_table | | | | |

## Testing

⬜ Integration tests should be run at the client level against both Azure and AWS to maintain SDK parity against both APIs (currently run against only one cloud).

⬜ Terraform acceptance tests should be run against both AWS and Azure to maintain provider parity across both cloud services (currently run against only one cloud).
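For reference, Terraform acceptance tests are conventionally gated behind the `TF_ACC` environment variable, so a typical invocation might look like the sketch below; the exact flags and any cloud credentials are assumptions that depend on the workspace being targeted:

```bash
# TF_ACC=1 enables acceptance tests in the Terraform plugin SDK's test harness
$ TF_ACC=1 go test ./... -v -timeout 120m
```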

## Project Support

Please note that all projects in the `databrickslabs` GitHub account are provided for your exploration only and are not formally supported by Databricks with Service Level Agreements (SLAs). They are provided AS-IS, and we do not make any guarantees of any kind. Please do not submit a support ticket relating to any issues arising from the use of these projects.

Any issues discovered through the use of this project should be filed as GitHub issues on the repo. They will be reviewed as time permits, but there are no formal SLAs for support.