Update table names for databricks_workspace_workspace and databricks_catalog_catalog #4
Conversation
@karanpopat Please see comments, thanks!
docs/index.md
Outdated
## Multi-Account Connections

You may create multiple databricks connections:

```hcl
connection "databricks_dev" {
  plugin         = "databricks"
  profile        = "databricks_dev"
  config_profile = "databricks_dev"
}
```
Can this config arg just be called `profile` instead? Are there multiple types of profiles you can use with the Databricks CLI/SDK?
Reference points for naming:

For env vars:
- DATABRICKS_CONFIG_PROFILE
- DATABRICKS_CONFIG_FILE

From Terraform (https://registry.terraform.io/providers/databricks/databricks/latest/docs#argument-reference):
- profile
- config_file

In their Config struct (https://github.com/databricks/databricks-sdk-go/blob/main/config/config.go#L54-L60):
- Profile
- ConfigFile
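For illustration, a minimal sketch of what a connection config struct aligned with the SDK/Terraform naming might look like. The struct and tag names here are hypothetical, not the plugin's actual schema:

```go
package main

import "fmt"

// Hypothetical connection config struct if the plugin adopted the
// SDK/Terraform naming: `profile` and `config_file` rather than
// `config_profile`. The hcl tags mirror the config argument names.
type databricksConfig struct {
	Profile    *string `hcl:"profile"`
	ConfigFile *string `hcl:"config_file"`
}

func main() {
	p := "databricks_dev"
	cfg := databricksConfig{Profile: &p}
	fmt.Println(*cfg.Profile)
}
```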
docs/index.md
Outdated
````diff
@@ -114,27 +114,27 @@ connection "databricks" {
 }
 ```
 
-By default, all options are commented out in the default connection, thus Steampipe will resolve your credentials using the same mechanism as the Databricks CLI (Databricks environment variables, default profile, etc). This provides a quick way to get started with Steampipe, but you will probably want to customize your experience using configuration options for [querying multiple regions](#multi-account-connections), [configuring credentials](#configuring-databricks-credentials) from your [Databricks Profiles](#databricks-profile-credentials).
+You can customize your experience using configuration options for [querying multiple accounts](#multi-account-connections), [configuring credentials](#configuring-databricks-credentials) from your [Databricks Profiles](#databricks-profile-credentials).
````
If we can add support to the plugin to pick up the DEFAULT profile by default when no other auth config args are set, we can re-add this line with the appropriate info. I'm good with the change in this PR as is, though.
```diff
@@ -23,7 +23,7 @@ func Plugin(ctx context.Context) *plugin.Plugin {
 		Schema: ConfigSchema,
 	},
 	TableMap: map[string]*plugin.Table{
-		"databricks_catalog_catalog":           tableDatabricksCatalogCatalog(ctx),
+		"databricks_catalog":                   tableDatabricksCatalog(ctx),
 		"databricks_catalog_connection":        tableDatabricksCatalogConnection(ctx),
 		"databricks_catalog_external_location": tableDatabricksCatalogExternalLocation(ctx),
 		"databricks_catalog_function":          tableDatabricksCatalogFunction(ctx),
```
Even though it's making an exception to the rule of following the SDK directory naming for services (the most consistent source we could find), we could treat `job` and `pipeline` as the service names instead of `jobs` and `pipelines` respectively. This would mean we'd have the table names:

- databricks_job
- databricks_job_run
- databricks_pipeline
- databricks_pipeline_event
- databricks_pipeline_update

These seem like better table names than what we originally have.