Better error message when private link enabled workspaces reject requests #647
Merged
Conversation
This PR breaks backwards compatibility for databrickslabs/blueprint downstream. See the build logs for more details. (Running from downstreams #123)
Codecov Report

@@            Coverage Diff             @@
##             main     #647      +/-   ##
==========================================
+ Coverage   57.62%   57.66%   +0.03%
==========================================
  Files          47       48       +1
  Lines       32650    32680      +30
==========================================
+ Hits        18815    18844      +29
- Misses      13835    13836       +1

View full report in Codecov by Sentry.
hectorcast-db approved these changes on May 17, 2024.
hectorcast-db added a commit that referenced this pull request on May 22, 2024:
### Backward incompatible changes
* `CredentialsProvider` class renamed to `CredentialsStrategy` and `HeaderFactory` class renamed to `CredentialsProvider`

### Improvements and new features
* Better error message when private link enabled workspaces reject requests ([#647](#647)).
* Create a method to generate OAuth tokens ([#644](#644)).

API Changes:
* Changed `list()` method for [w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html) workspace-level service to require request of `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.
* Removed [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service.
* Added [w.quality_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/quality_monitors.html) workspace-level service.
* Renamed `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.DeleteQualityMonitorRequest`.
* Changed `schema_name` field for `databricks.sdk.service.catalog.DisableRequest` to `str` dataclass.
* Removed `databricks.sdk.service.catalog.DisableSchemaName` dataclass.
* Changed `schema_name` field for `databricks.sdk.service.catalog.EnableRequest` to `str` dataclass.
* Removed `databricks.sdk.service.catalog.EnableSchemaName` dataclass.
* Renamed `databricks.sdk.service.catalog.GetLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.GetQualityMonitorRequest`.
* Added `next_page_token` field for `databricks.sdk.service.catalog.ListConnectionsResponse`.
* Added `dashboard_id` field for `databricks.sdk.service.catalog.UpdateMonitor`.
* Added `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.
* Added `databricks.sdk.service.catalog.MonitorRefreshListResponse` dataclass.
* Changed `cluster_status()` method for [w.libraries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/libraries.html) workspace-level service to return `databricks.sdk.service.compute.ClusterLibraryStatuses` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterAttributes`.
* Changed `spec` and `cluster_source` fields for `databricks.sdk.service.compute.ClusterDetails` to `databricks.sdk.service.compute.ClusterSpec` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterSpec`.
* Removed `databricks.sdk.service.compute.ClusterStatusResponse` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.CreateCluster`.
* Removed `clone_from` and `cluster_source` fields for `databricks.sdk.service.compute.EditCluster`.
* Removed `sort_by_spec` field for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `is_ascending` and `sort_by` fields for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `is_ascending` field for `databricks.sdk.service.marketplace.SearchListingsRequest`.
* Removed `databricks.sdk.service.marketplace.SortBySpec` dataclass.
* Removed `databricks.sdk.service.marketplace.SortOrder` dataclass.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.CreatePipeline`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.EditPipeline`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.ManagedIngestionPipelineDefinition`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.PipelineSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.SchemaSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.TableSpec`.
* Added `databricks.sdk.service.pipelines.IngestionGatewayPipelineDefinition` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfig` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfigScdType` dataclass.
* Added `deployment_artifacts` field for `databricks.sdk.service.serving.AppDeployment`.
* Added `route_optimized` field for `databricks.sdk.service.serving.CreateServingEndpoint`.
* Added `contents` field for `databricks.sdk.service.serving.ExportMetricsResponse`.
* Changed `openai_api_key` field for `databricks.sdk.service.serving.OpenAiConfig` to no longer be required.
* Added `microsoft_entra_client_id`, `microsoft_entra_client_secret` and `microsoft_entra_tenant_id` fields for `databricks.sdk.service.serving.OpenAiConfig`.
* Added `endpoint_url` and `route_optimized` fields for `databricks.sdk.service.serving.ServingEndpointDetailed`.
* Added `databricks.sdk.service.serving.AppDeploymentArtifacts` dataclass.
* Added `storage_root` field for `databricks.sdk.service.sharing.CreateShare`.
* Added `storage_location` and `storage_root` fields for `databricks.sdk.service.sharing.ShareInfo`.
* Added `storage_root` field for `databricks.sdk.service.sharing.UpdateShare`.
* Added `scan_index()` method for [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html) workspace-level service.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecRequest`.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
* Added `databricks.sdk.service.vectorsearch.ListValue` dataclass.
* Added `databricks.sdk.service.vectorsearch.MapStringValueEntry` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexRequest` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexResponse` dataclass.
* Added `databricks.sdk.service.vectorsearch.Struct` dataclass.
* Added `databricks.sdk.service.vectorsearch.Value` dataclass.

OpenAPI SHA: 7eb5ad9a2ed3e3f1055968a2d1014ac92c06fe92, Date: 2024-05-21
Merged
hectorcast-db added a commit that referenced this pull request on May 23, 2024:
### Improvements and new features
* Better error message when private link enabled workspaces reject requests ([#647](#647)).

### API Changes:
* Changed `list()` method for [w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html) workspace-level service to require request of `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.
* Removed [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service.
* Added [w.quality_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/quality_monitors.html) workspace-level service.
* Renamed `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.DeleteQualityMonitorRequest`.
* Changed `schema_name` field for `databricks.sdk.service.catalog.DisableRequest` to `str` dataclass.
* Removed `databricks.sdk.service.catalog.DisableSchemaName` dataclass.
* Changed `schema_name` field for `databricks.sdk.service.catalog.EnableRequest` to `str` dataclass.
* Removed `databricks.sdk.service.catalog.EnableSchemaName` dataclass.
* Renamed `databricks.sdk.service.catalog.GetLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.GetQualityMonitorRequest`.
* Added `next_page_token` field for `databricks.sdk.service.catalog.ListConnectionsResponse`.
* Added `dashboard_id` field for `databricks.sdk.service.catalog.UpdateMonitor`.
* Added `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.
* Added `databricks.sdk.service.catalog.MonitorRefreshListResponse` dataclass.
* Changed `cluster_status()` method for [w.libraries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/libraries.html) workspace-level service to return `databricks.sdk.service.compute.ClusterLibraryStatuses` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterAttributes`.
* Changed `spec` and `cluster_source` fields for `databricks.sdk.service.compute.ClusterDetails` to `databricks.sdk.service.compute.ClusterSpec` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterSpec`.
* Removed `databricks.sdk.service.compute.ClusterStatusResponse` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.CreateCluster`.
* Removed `clone_from` and `cluster_source` fields for `databricks.sdk.service.compute.EditCluster`.
* Removed `sort_by_spec` field for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `is_ascending` and `sort_by` fields for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `is_ascending` field for `databricks.sdk.service.marketplace.SearchListingsRequest`.
* Removed `databricks.sdk.service.marketplace.SortBySpec` dataclass.
* Removed `databricks.sdk.service.marketplace.SortOrder` dataclass.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.CreatePipeline`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.EditPipeline`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.ManagedIngestionPipelineDefinition`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.PipelineSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.SchemaSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.TableSpec`.
* Added `databricks.sdk.service.pipelines.IngestionGatewayPipelineDefinition` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfig` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfigScdType` dataclass.
* Added `deployment_artifacts` field for `databricks.sdk.service.serving.AppDeployment`.
* Added `route_optimized` field for `databricks.sdk.service.serving.CreateServingEndpoint`.
* Added `contents` field for `databricks.sdk.service.serving.ExportMetricsResponse`.
* Changed `openai_api_key` field for `databricks.sdk.service.serving.OpenAiConfig` to no longer be required.
* Added `microsoft_entra_client_id`, `microsoft_entra_client_secret` and `microsoft_entra_tenant_id` fields for `databricks.sdk.service.serving.OpenAiConfig`.
* Added `endpoint_url` and `route_optimized` fields for `databricks.sdk.service.serving.ServingEndpointDetailed`.
* Added `databricks.sdk.service.serving.AppDeploymentArtifacts` dataclass.
* Added `storage_root` field for `databricks.sdk.service.sharing.CreateShare`.
* Added `storage_location` and `storage_root` fields for `databricks.sdk.service.sharing.ShareInfo`.
* Added `storage_root` field for `databricks.sdk.service.sharing.UpdateShare`.
* Added `scan_index()` method for [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html) workspace-level service.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecRequest`.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
* Added `databricks.sdk.service.vectorsearch.ListValue` dataclass.
* Added `databricks.sdk.service.vectorsearch.MapStringValueEntry` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexRequest` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexResponse` dataclass.
* Added `databricks.sdk.service.vectorsearch.Struct` dataclass.
* Added `databricks.sdk.service.vectorsearch.Value` dataclass.

OpenAPI SHA: 7eb5ad9a2ed3e3f1055968a2d1014ac92c06fe92, Date: 2024-05-21
github-merge-queue bot pushed a commit that referenced this pull request on May 23, 2024:
### Improvements and new features
* Better error message when private link enabled workspaces reject requests ([#647](#647)).

### API Changes
* Renamed [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service to [w.quality_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/quality_monitors.html).
* Added `databricks.sdk.service.vectorsearch.ListValue` dataclass.
* Added `databricks.sdk.service.vectorsearch.MapStringValueEntry` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexRequest` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexResponse` dataclass.
* Added `databricks.sdk.service.vectorsearch.Struct` dataclass.
* Added `databricks.sdk.service.vectorsearch.Value` dataclass.
* Added `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.
* Added `databricks.sdk.service.catalog.MonitorRefreshListResponse` dataclass.
* Added `databricks.sdk.service.pipelines.IngestionGatewayPipelineDefinition` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfig` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfigScdType` dataclass.
* Added `databricks.sdk.service.serving.AppDeploymentArtifacts` dataclass.
* Removed `databricks.sdk.service.catalog.EnableSchemaName` dataclass.
* Removed `databricks.sdk.service.catalog.DisableSchemaName` dataclass.
* Removed `databricks.sdk.service.marketplace.SortBySpec` dataclass.
* Removed `databricks.sdk.service.marketplace.SortOrder` dataclass.
* Renamed `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.DeleteQualityMonitorRequest`.
* Renamed `databricks.sdk.service.catalog.GetLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.GetQualityMonitorRequest`.
* Added `next_page_token` field for `databricks.sdk.service.catalog.ListConnectionsResponse`.
* Added `dashboard_id` field for `databricks.sdk.service.catalog.UpdateMonitor`.
* Added `is_ascending` and `sort_by` fields for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `is_ascending` field for `databricks.sdk.service.marketplace.SearchListingsRequest`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.CreatePipeline`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.EditPipeline`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.ManagedIngestionPipelineDefinition`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.PipelineSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.SchemaSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.TableSpec`.
* Added `deployment_artifacts` field for `databricks.sdk.service.serving.AppDeployment`.
* Added `route_optimized` field for `databricks.sdk.service.serving.CreateServingEndpoint`.
* Added `contents` field for `databricks.sdk.service.serving.ExportMetricsResponse`.
* Added `microsoft_entra_client_id`, `microsoft_entra_client_secret` and `microsoft_entra_tenant_id` fields for `databricks.sdk.service.serving.OpenAiConfig`.
* Added `endpoint_url` and `route_optimized` fields for `databricks.sdk.service.serving.ServingEndpointDetailed`.
* Added `storage_root` field for `databricks.sdk.service.sharing.CreateShare`.
* Added `storage_location` and `storage_root` fields for `databricks.sdk.service.sharing.ShareInfo`.
* Added `storage_root` field for `databricks.sdk.service.sharing.UpdateShare`.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecRequest`.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
* Changed `schema_name` field for `databricks.sdk.service.catalog.DisableRequest` to `str` dataclass.
* Changed `schema_name` field for `databricks.sdk.service.catalog.EnableRequest` to `str` dataclass.
* Changed `cluster_status()` method for [w.libraries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/libraries.html) workspace-level service to return `databricks.sdk.service.compute.ClusterLibraryStatuses` dataclass.
* Changed `spec` and `cluster_source` fields for `databricks.sdk.service.compute.ClusterDetails` to `databricks.sdk.service.compute.ClusterSpec` dataclass.
* Changed `openai_api_key` field for `databricks.sdk.service.serving.OpenAiConfig` to no longer be required.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterAttributes`.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterSpec`.
* Removed `databricks.sdk.service.compute.ClusterStatusResponse` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.CreateCluster`.
* Removed `clone_from` and `cluster_source` fields for `databricks.sdk.service.compute.EditCluster`.
* Removed `sort_by_spec` field for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `scan_index()` method for [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html) workspace-level service.
* Changed `list()` method for [w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html) workspace-level service to require request of `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.

OpenAPI SHA: 7eb5ad9a2ed3e3f1055968a2d1014ac92c06fe92, Date: 2024-05-21
Changes
This PR ports databricks/databricks-sdk-go#924 to the Python SDK.
When a user tries to access a Private Link-enabled workspace configured with no public internet access from a network other than the one the VPC endpoint belongs to, the Private Link backend redirects the user to the login page rather than outright rejecting the request. The login page, however, is not a JSON document and cannot be parsed by the SDK, so the user sees an unhelpful parsing error instead of an explanation of what went wrong.
To address this, I add one additional check to the error-mapper logic that inspects whether the user was redirected to the login page with the private link validation error response code. If so, we return a custom error, `PrivateLinkValidationError`, with error code `PRIVATE_LINK_VALIDATION_ERROR`, which inherits from `PermissionDenied` and carries a synthetic 403 status code. After this change, users see a clear, actionable error message.
The error message is tuned to the specific cloud, inferred from the request URI, so that we can redirect users to the appropriate documentation.
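The detection and cloud-inference logic described above can be sketched roughly as follows. This is an illustrative sketch only: the function and class names, the `/login.html` path, and the `error=private-link-validation-error` query parameter are assumptions based on this description and the ported Go change, not the SDK's actual internals.

```python
from urllib.parse import parse_qs, urlparse

# Hypothetical stand-ins for the SDK's error classes.
PRIVATE_LINK_ERROR_CODE = "private-link-validation-error"


class PermissionDenied(Exception):
    """Stand-in for the SDK's 403 error class."""


class PrivateLinkValidationError(PermissionDenied):
    """Raised when a Private Link-enabled workspace rejects the request."""
    error_code = "PRIVATE_LINK_VALIDATION_ERROR"
    status_code = 403  # synthetic: the redirect itself is not a 403


def is_private_link_redirect(url: str) -> bool:
    """Detect a redirect to the login page that carries the private link
    validation error code in its query string."""
    parsed = urlparse(url)
    errors = parse_qs(parsed.query).get("error", [])
    return parsed.path.endswith("/login.html") and PRIVATE_LINK_ERROR_CODE in errors


def infer_cloud(url: str) -> str:
    """Guess the cloud from the workspace hostname, so the error message
    can point at cloud-specific Private Link documentation."""
    host = urlparse(url).netloc
    if host.endswith(".azuredatabricks.net"):
        return "Azure"
    if host.endswith(".gcp.databricks.com"):
        return "GCP"
    return "AWS"
```

An error mapper using these helpers would raise `PrivateLinkValidationError` with a cloud-specific documentation link whenever `is_private_link_redirect` matches the final response URL, instead of attempting to parse the login page as JSON.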
Tests
Unit tests cover the private link error message mapping.
To manually test this, I created a Private Link workspace in Azure, created an access token, restricted access to the workspace, and then ran the `last_job_runs.py` example using that host and token.

* `make test` run locally
* `make fmt` applied