Merge pull request #132 from databrickslabs/permissions-api
Add support for general permissions
nfx committed Jun 25, 2020
2 parents 6f50437 + ed678fe commit 6a54e54
Showing 33 changed files with 1,868 additions and 132 deletions.
5 changes: 1 addition & 4 deletions .travis.yml
@@ -9,8 +9,6 @@ jobs:
script:
- curl -sSL "https://github.com/gotestyourself/gotestsum/releases/download/v0.4.2/gotestsum_0.4.2_linux_amd64.tar.gz" | sudo tar -xz -C /usr/local/bin gotestsum
- curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b $(go env GOPATH)/bin v1.25.0
- time make provider-test
- time make client-test
- time make build
- go: 1.14.x
gemfile: gemfiles/Gemfile.rails-3.0.x
@@ -21,15 +19,14 @@ jobs:
script:
- curl -sSL "https://github.com/gotestyourself/gotestsum/releases/download/v0.4.2/gotestsum_0.4.2_linux_amd64.tar.gz" | sudo tar -xz -C /usr/local/bin gotestsum
- curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b $(go env GOPATH)/bin v1.25.0
- time make provider-test
- time make client-test
- time make build


after_success:
- echo "travis go version='$TRAVIS_GO_VERSION'"
- bash <(curl -s https://codecov.io/bash) -f client-coverage.txt -F client
- bash <(curl -s https://codecov.io/bash) -f provider-coverage.txt -F provider
- bash <(curl -s https://codecov.io/bash) -f coverage.txt -F repository

notifications:
webhooks: https://www.travisbuddy.com/
72 changes: 69 additions & 3 deletions CONTRIBUTING.md
@@ -7,11 +7,13 @@ Contributing to Databricks Terraform Provider
- [Developing with Visual Studio Code Devcontainers](#developing-with-visual-studio-code-devcontainers)
- [Building and Installing with Docker](#building-and-installing-with-docker)
- [Testing](#testing)
- [Code conventions](#code-conventions)
- [Linting](#linting)
- [Unit testing resources](#unit-testing-resources)
- [Integration Testing](#integration-testing)
- [Project Components](#project-components)
- [Databricks Terraform Provider Resources State](#databricks-terraform-provider-resources-state)
- [Databricks Terraform Data Sources State](#databricks-terraform-data-sources-state)
- [Databricks Terraform Provider Resources State](#databricks-terraform-provider-resources-state)
- [Databricks Terraform Data Sources State](#databricks-terraform-data-sources-state)

We happily welcome contributions to databricks-terraform. We use GitHub Issues to track community-reported issues and GitHub Pull Requests for accepting changes.

@@ -123,11 +125,76 @@ $ docker run -it -v $(pwd):/workpace -w /workpace databricks-terraform apply
* [ ] Integration tests should be run at a client level against both azure and aws to maintain sdk parity against both apis **(currently only on one cloud)**
* [x] Terraform acceptance tests should be run against both aws and azure to maintain parity of provider between both cloud services **(currently only on one cloud)**

## Code conventions

* Import statements should be grouped in this order: Go standard library, vendor packages, then current provider packages. Within each group, imports must be in alphabetical order.

## Linting

Please use the Makefile for linting: run `make lint`. Running `golangci-lint` by itself will fail, because different build tags contain the same functions.

## Unit testing resources

To unit test a resource in a way that runs fast and counts toward code coverage, use `ResourceTester`, which launches an embedded HTTP server configured with `HTTPFixture`s describing every call that should be made in the given scenario. Some may argue this is not a pure unit test, because it creates a side effect in the form of an embedded server; however, the server always listens on a different random port, so these tests can run in parallel. Review comments about non-pure unit tests will therefore be ignored, as long as the tests use the `ResourceTester` helper.

```go
func TestPermissionsCreate(t *testing.T) {
_, err := ResourceTester(t, []HTTPFixture{
{
Method: http.MethodPatch,
// requires full URI
Resource: "/api/2.0/preview/permissions/clusters/abc",
		// works with entities, not JSON. A diff is displayed in case of a mismatch
ExpectedRequest: model.AccessControlChangeList{
AccessControlList: []*model.AccessControlChange{
{
UserName: &TestingUser,
PermissionLevel: "CAN_USE",
},
},
},
},
{
Method: http.MethodGet,
Resource: "/api/2.0/preview/permissions/clusters/abc?",
Response: model.AccessControlChangeList{
AccessControlList: []*model.AccessControlChange{
{
UserName: &TestingUser,
PermissionLevel: "CAN_MANAGE",
},
},
},
},
{
Method: http.MethodGet,
Resource: "/api/2.0/preview/scim/v2/Me?",
Response: model.User{
UserName: "chuck.norris",
},
},
},
	// the next argument is the function that creates the resource (to build the schema for ResourceData)
resourcePermissions,
	// state is represented as a native structure (though a bit clunky)
map[string]interface{}{
"cluster_id": "abc",
"access_control": []interface{}{
map[string]interface{}{
"user_name": TestingUser,
"permission_level": "CAN_USE",
},
},
},
	// the last argument is the function that performs a stage on the resource (create/read/update/delete)
resourcePermissionsCreate)
assert.NoError(t, err, err)
}
```

Each resource should have both unit and integration tests.

## Integration Testing

Currently Databricks supports two cloud providers, `azure` and `aws`, thus integration testing with the correct cloud service provider is
@@ -162,7 +229,6 @@ DATABRICKS_AZURE_WORKSPACE_NAME=<azure databricks workspace name>
Note that Azure integration tests use service-principal-based auth. Even though a service principal is used, it still generates a personal access token to perform the creation of resources.


## Project Components

### Databricks Terraform Provider Resources State
45 changes: 25 additions & 20 deletions Makefile
@@ -1,17 +1,5 @@
default: build

test: lint
@echo "==> Running tests..."
@gotestsum --format short-verbose --raw-command go test -v -json -short -coverprofile=coverage.txt ./...

client-test:
@echo "==> Running tests..."
@gotestsum --format short-verbose --raw-command go test -v -json -short -coverprofile=client-coverage.txt ./client/...

provider-test:
@echo "==> Running tests..."
@gotestsum --format short-verbose --raw-command go test -v -json -short -coverprofile=provider-coverage.txt ./databricks/...

int:
@echo "==> Running tests..."
@gotestsum --raw-command go test -v -json -coverprofile=coverage.txt ./...
@@ -26,14 +14,6 @@ coverage-int: int

int-build: int build

build: lint test
@echo "==> Building source code with go build..."
@go build -mod vendor -v -o terraform-provider-databricks

lint:
@echo "==> Linting source code with golangci-lint make sure you run make fmt ..."
@golangci-lint run --skip-dirs-use-default --timeout 5m

fmt:
@echo "==> Formatting source code with gofmt..."
@goimports -w client
@@ -44,6 +24,31 @@ fmt:
@gofmt -s -w main.go
@go fmt ./...

lint:
@echo "==> Linting source code with golangci-lint make sure you run make fmt ..."
@golangci-lint run --skip-dirs-use-default --timeout 5m

client-test:
@echo "==> Running tests..."
@gotestsum --format short-verbose --raw-command go test -v -json -short -coverprofile=client-coverage.txt ./client/...

provider-test:
@echo "==> Running tests..."
@gotestsum --format short-verbose --raw-command go test -v -json -short -coverprofile=provider-coverage.txt ./databricks/...

test: lint client-test provider-test
@echo "==> Running tests..."
@gotestsum --format short-verbose --raw-command go test -v -json -short -coverprofile=coverage.txt ./...

build: lint test
@echo "==> Building source code with go build..."
@go build -mod vendor -v -o terraform-provider-databricks

install: build
@echo "==> Installing provider into ~/.terraform.d/plugins ..."
@rm ~/.terraform.d/plugins/terraform-provider-databricks*
@mv terraform-provider-databricks ~/.terraform.d/plugins

vendor:
@echo "==> Filling vendor folder with library code..."
@go mod vendor
4 changes: 2 additions & 2 deletions client/model/notebook.go
@@ -32,8 +32,8 @@ const (
LibraryObject ObjectType = "LIBRARY"
)

// NotebookInfo contains information when doing a get request or list request on the workspace api
type NotebookInfo struct {
// WorkspaceObjectStatus contains information when doing a get request or list request on the workspace api
type WorkspaceObjectStatus struct {
ObjectID int64 `json:"object_id,omitempty"`
ObjectType ObjectType `json:"object_type,omitempty"`
Path string `json:"path,omitempty"`
75 changes: 75 additions & 0 deletions client/model/permissions.go
@@ -0,0 +1,75 @@
package model

// ObjectACL is a structure to generically describe access control
type ObjectACL struct {
ObjectID string `json:"object_id,omitempty"`
ObjectType string `json:"object_type,omitempty"`
AccessControlList []*AccessControl `json:"access_control_list"`
}

// AccessControl is a structure to describe user/group permissions
type AccessControl struct {
UserName *string `json:"user_name,omitempty"`
GroupName *string `json:"group_name,omitempty"`
AllPermissions []*Permission `json:"all_permissions,omitempty"`
}

// Permission is a structure to describe permission level
type Permission struct {
PermissionLevel string `json:"permission_level"`
Inherited bool `json:"inherited,omitempty"`
InheritedFromObject []string `json:"inherited_from_object,omitempty"`
}

// AccessControlChangeList is a wrapper around ACL changes for the REST API
type AccessControlChangeList struct {
AccessControlList []*AccessControlChange `json:"access_control_list"`
}

// AccessControlChange is an API wrapper for changing permissions
type AccessControlChange struct {
UserName *string `json:"user_name,omitempty"`
GroupName *string `json:"group_name,omitempty"`
ServicePrincipalName *string `json:"service_principal_name,omitempty"`
PermissionLevel string `json:"permission_level"`
}

// ToAccessControlChangeList converts data formats
func (oa *ObjectACL) ToAccessControlChangeList() *AccessControlChangeList {
acl := new(AccessControlChangeList)
for _, accessControl := range oa.AccessControlList {
for _, permission := range accessControl.AllPermissions {
if permission.Inherited {
continue
}
item := new(AccessControlChange)
acl.AccessControlList = append(acl.AccessControlList, item)
item.PermissionLevel = permission.PermissionLevel
if accessControl.UserName != nil {
item.UserName = accessControl.UserName
} else if accessControl.GroupName != nil {
item.GroupName = accessControl.GroupName
}
}
}
return acl
}

// AccessControl exports data for TF
func (acl *AccessControlChangeList) AccessControl(me string) []map[string]string {
result := []map[string]string{}
for _, control := range acl.AccessControlList {
item := map[string]string{}
if control.UserName != nil && *control.UserName != "" {
if me == *control.UserName {
continue
}
item["user_name"] = *control.UserName
} else if control.GroupName != nil && *control.GroupName != "" {
item["group_name"] = *control.GroupName
}
item["permission_level"] = control.PermissionLevel
result = append(result, item)
}
return result
}
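The filtering in `ToAccessControlChangeList` above (drop inherited permissions, turn the rest into change requests) can be exercised in isolation. The following sketch uses trimmed copies of the structs so it compiles on its own; the sample user and group names are made up:

```go
package main

import "fmt"

// Trimmed copies of the structs from client/model/permissions.go,
// kept self-contained for illustration (JSON tags omitted).
type Permission struct {
	PermissionLevel string
	Inherited       bool
}

type AccessControl struct {
	UserName       *string
	GroupName      *string
	AllPermissions []*Permission
}

type ObjectACL struct {
	ObjectID          string
	AccessControlList []*AccessControl
}

type AccessControlChange struct {
	UserName        *string
	GroupName       *string
	PermissionLevel string
}

type AccessControlChangeList struct {
	AccessControlList []*AccessControlChange
}

// toChanges mirrors ObjectACL.ToAccessControlChangeList: inherited
// permissions are skipped, everything else becomes a change request.
func toChanges(oa *ObjectACL) *AccessControlChangeList {
	acl := new(AccessControlChangeList)
	for _, accessControl := range oa.AccessControlList {
		for _, permission := range accessControl.AllPermissions {
			if permission.Inherited {
				continue
			}
			item := &AccessControlChange{PermissionLevel: permission.PermissionLevel}
			if accessControl.UserName != nil {
				item.UserName = accessControl.UserName
			} else if accessControl.GroupName != nil {
				item.GroupName = accessControl.GroupName
			}
			acl.AccessControlList = append(acl.AccessControlList, item)
		}
	}
	return acl
}

// sampleACL builds a hypothetical ACL with one inherited permission.
func sampleACL() *ObjectACL {
	alice, admins := "alice", "admins"
	return &ObjectACL{
		ObjectID: "/clusters/abc",
		AccessControlList: []*AccessControl{
			{UserName: &alice, AllPermissions: []*Permission{
				{PermissionLevel: "CAN_MANAGE"},
				{PermissionLevel: "CAN_ATTACH_TO", Inherited: true}, // skipped by toChanges
			}},
			{GroupName: &admins, AllPermissions: []*Permission{
				{PermissionLevel: "CAN_MANAGE"},
			}},
		},
	}
}

func main() {
	for _, c := range toChanges(sampleACL()).AccessControlList {
		if c.UserName != nil {
			fmt.Printf("user %s -> %s\n", *c.UserName, c.PermissionLevel)
		} else if c.GroupName != nil {
			fmt.Printf("group %s -> %s\n", *c.GroupName, c.PermissionLevel)
		}
	}
}
```

Only the two non-inherited entries survive the conversion, which is what keeps inherited cluster permissions out of the Terraform-managed change list.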
5 changes: 5 additions & 0 deletions client/service/apis.go
@@ -116,6 +116,11 @@ func (c *DBApiClient) MWSCustomerManagedKeys() MWSCustomerManagedKeysAPI {
return MWSCustomerManagedKeysAPI{Client: c}
}

// Permissions returns an instance of PermissionsAPI
func (c *DBApiClient) Permissions() PermissionsAPI {
return PermissionsAPI{Client: c}
}

func (c *DBApiClient) performQuery(method, path string, apiVersion string, headers map[string]string, data interface{}, secretsMask *SecretsMask) ([]byte, error) {
err := c.Config.getOrCreateToken()
if err != nil {
5 changes: 3 additions & 2 deletions client/service/client.go
@@ -29,7 +29,8 @@ const (
Azure CloudServiceProvider = "Azure"
)

type apiErrorBody struct {
// APIErrorBody maps "proper" databricks rest api errors to a struct
type APIErrorBody struct {
ErrorCode string `json:"error_code,omitempty"`
Message string `json:"message,omitempty"`
// The following two are for scim api only for RFC 7644 Section 3.7.3 https://tools.ietf.org/html/rfc7644#section-3.7.3
@@ -154,7 +155,7 @@ func checkHTTPRetry(ctx context.Context, resp *http.Response, err error) (bool,
if err != nil {
return false, err
}
var errorBody apiErrorBody
var errorBody APIErrorBody
err = json.Unmarshal(body, &errorBody)
// this is most likely HTML... since un-marshalling JSON failed
if err != nil {
4 changes: 2 additions & 2 deletions client/service/clusters_test.go
@@ -459,7 +459,7 @@ func TestClustersAPI_WaitForClusterRunning(t *testing.T) {
requestMethod []string
args []interface{}
wantURI []string
want []model.NotebookInfo
want []model.WorkspaceObjectStatus
wantErr bool
}{
{
@@ -605,7 +605,7 @@ func TestClustersAPI_WaitForClusterTerminated(t *testing.T) {
requestMethod []string
args []interface{}
wantURI []string
want []model.NotebookInfo
want []model.WorkspaceObjectStatus
wantErr bool
}{
{
14 changes: 7 additions & 7 deletions client/service/notebooks.go
@@ -32,8 +32,8 @@ func (a NotebooksAPI) Create(path string, content string, language model.Languag
}

// Read returns the notebook metadata and not the contents
func (a NotebooksAPI) Read(path string) (model.NotebookInfo, error) {
var notebookInfo model.NotebookInfo
func (a NotebooksAPI) Read(path string) (model.WorkspaceObjectStatus, error) {
var notebookInfo model.WorkspaceObjectStatus
notebookGetStatusRequest := struct {
Path string `json:"path,omitempty" url:"path,omitempty"`
}{}
@@ -79,9 +79,9 @@ func (a NotebooksAPI) Mkdirs(path string) error {

// List will list all objects in a path on the workspace and with the recursive flag it will recursively list
// all the objects
func (a NotebooksAPI) List(path string, recursive bool) ([]model.NotebookInfo, error) {
func (a NotebooksAPI) List(path string, recursive bool) ([]model.WorkspaceObjectStatus, error) {
if recursive {
var paths []model.NotebookInfo
var paths []model.WorkspaceObjectStatus
err := a.recursiveAddPaths(path, &paths)
if err != nil {
return nil, err
@@ -91,7 +91,7 @@ func (a NotebooksAPI) List(path string, recursive bool) ([]model.NotebookInfo, e
return a.list(path)
}

func (a NotebooksAPI) recursiveAddPaths(path string, pathList *[]model.NotebookInfo) error {
func (a NotebooksAPI) recursiveAddPaths(path string, pathList *[]model.WorkspaceObjectStatus) error {
notebookInfoList, err := a.list(path)
if err != nil {
return err
@@ -109,9 +109,9 @@ func (a NotebooksAPI) recursiveAddPaths(path string, pathList *[]model.NotebookI
return err
}

func (a NotebooksAPI) list(path string) ([]model.NotebookInfo, error) {
func (a NotebooksAPI) list(path string) ([]model.WorkspaceObjectStatus, error) {
var notebookList struct {
Objects []model.NotebookInfo `json:"objects,omitempty" url:"objects,omitempty"`
Objects []model.WorkspaceObjectStatus `json:"objects,omitempty" url:"objects,omitempty"`
}
listRequest := struct {
Path string `json:"path,omitempty" url:"path,omitempty"`
