
Support HEAD operation and response Headers #547

Merged
merged 18 commits into main from support-headers-operation on Feb 19, 2024

Conversation

hectorcast-db
Contributor

@hectorcast-db hectorcast-db commented Feb 16, 2024

Changes

Support HEAD operation and response Headers

Test

  • make fmt
  • make test
  • run integration tests (both with current sha and master sha)
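
For context, a minimal sketch of what the new header-backed metadata call looks like from a `WorkspaceClient`, assuming the `get_metadata()` method added in this change and header-derived fields mirroring those listed for `DownloadResponse` in the changelog below (the exact field names on the returned dataclass are an assumption here):

```python
from databricks.sdk import WorkspaceClient

# Assumes authentication is configured via the environment.
w = WorkspaceClient()

# HEAD request against a file in a UC volume; no body is downloaded,
# only response headers are surfaced on the returned dataclass.
# The path and field names below are illustrative.
meta = w.files.get_metadata("/Volumes/main/default/my_volume/data.csv")
print(meta.content_length)  # assumed: derived from the Content-Length header
print(meta.content_type)    # assumed: derived from the Content-Type header
print(meta.last_modified)   # assumed: derived from the Last-Modified header
```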

@codecov-commenter

codecov-commenter commented Feb 16, 2024

Codecov Report

Attention: 18 lines in your changes are missing coverage. Please review.

Comparison is base (94fc5e4) 57.92% compared to head (341b8f4) 57.91%.
Report is 1 commit behind head on main.

Files                                 Patch %   Lines
databricks/sdk/service/billing.py     37.50%    5 Missing ⚠️
databricks/sdk/service/files.py       37.50%    5 Missing ⚠️
databricks/sdk/service/serving.py     40.00%    3 Missing ⚠️
databricks/sdk/core.py                33.33%    2 Missing ⚠️
databricks/sdk/mixins/workspace.py     0.00%    2 Missing ⚠️
databricks/sdk/service/iam.py         50.00%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #547      +/-   ##
==========================================
- Coverage   57.92%   57.91%   -0.02%     
==========================================
  Files          45       45              
  Lines       27993    28015      +22     
==========================================
+ Hits        16216    16225       +9     
- Misses      11777    11790      +13     


@hectorcast-db hectorcast-db changed the title from "[Test] Support header operations" to "Support HEAD operation and response Headers" on Feb 19, 2024
@@ -153,7 +153,7 @@ class AccountClient:
:param workspace: The workspace to construct a client for.
:return: A ``WorkspaceClient`` for the given workspace.
"""
-        config = self._config.copy()
+        config = self._config.deep_copy()
Contributor Author

Bug fix. The generated code was already updated here

9af2630#diff-3847ef3d682f5902da2252a8d7b3a0745d15a4b81ef42cd73f73be158622c5d6R812
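
To illustrate why the fix matters, here is a generic shallow-vs-deep-copy sketch using the standard library (not the SDK's actual `Config.copy()`/`Config.deep_copy()` implementations): with a shallow copy, mutations to nested state on the derived workspace config would leak back into the account-level config.

```python
import copy


class Config:
    def __init__(self):
        self.host = "https://accounts.cloud.databricks.com"
        self.headers = {"X-Example": "account"}  # nested mutable state


base = Config()

# Shallow copy: the nested dict is shared with the original.
shallow = copy.copy(base)
shallow.headers["X-Example"] = "workspace"
print(base.headers["X-Example"])  # "workspace" -- the original was mutated

# Deep copy: nested state is duplicated, so the original stays intact.
base.headers["X-Example"] = "account"
deep = copy.deepcopy(base)
deep.headers["X-Example"] = "workspace"
print(base.headers["X-Example"])  # "account"
```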

@@ -221,7 +221,9 @@ def test_files_api_upload_download(ucws, random):
f = io.BytesIO(b"some text data")
target_file = f'/Volumes/main/{schema}/{volume}/filesit-{random()}.txt'
w.files.upload(target_file, f)

# TODO: Enable after generating with the latest spec
Contributor Author

Added so that the integration tests can also run against the master sha. I will uncomment it later today during the release.

Contributor

@mgyucht mgyucht left a comment

The ApiClient.do() interface is really hard to work around. It assumes too much about how responses are deserialized, resulting in really strange workarounds like this. We do need to revisit this, but it may be a bit more work than we can afford given our current timelines.

One idea would be to introduce a "Response" wrapper type like we have in the Go/Java SDKs. This would expose the core response attributes, like json() to deserialize the response as JSON, headers, etc. In the other SDKs, the type is a parameter to the ApiClient, meaning that we can deserialize reflectively in the API client depending on what is requested. Again, this isn't fleshed out, so we would need to work through it.

In the meantime, I think we should try to preserve some consistency by making from_dict work well with the response of ApiClient.do without such a drastic change to the interface.
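
For illustration only, a rough sketch of the "Response" wrapper idea floated above, loosely modeled on the Go/Java SDKs; none of these names exist in the Python SDK, and this is not the design that was merged:

```python
import json
from dataclasses import dataclass, field
from typing import Any, Dict, Optional


@dataclass
class Response:
    """Hypothetical wrapper exposing core response attributes."""
    status_code: int
    headers: Dict[str, str] = field(default_factory=dict)
    contents: Optional[bytes] = None

    def json(self) -> Any:
        # Deserialize the raw body as JSON on demand.
        return json.loads(self.contents) if self.contents else None


# Generated service code (or a caller) could then pick what it needs:
# resp = api_client.do("HEAD", "/api/2.0/fs/files/...")   # hypothetical
# size = resp.headers.get("content-length")
# body = resp.json()
```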

.codegen/service.py.tmpl (resolved)
databricks/sdk/core.py (outdated, resolved)
.codegen/service.py.tmpl (resolved)
Contributor

@mgyucht mgyucht left a comment

One small nit, otherwise LGTM. The change in behavior of the raw flag on ApiClient.do() is worth calling out in the release notes.

.codegen/service.py.tmpl (outdated, resolved)
Co-authored-by: Miles Yucht <miles@databricks.com>
Signed-off-by: hectorcast-db <hector.castejon@databricks.com>
@hectorcast-db hectorcast-db added this pull request to the merge queue Feb 19, 2024
Merged via the queue into main with commit f21b2a7 Feb 19, 2024
9 checks passed
@hectorcast-db hectorcast-db deleted the support-headers-operation branch February 19, 2024 15:19
hectorcast-db added a commit that referenced this pull request Feb 19, 2024
Major Changes:

* Updated behaviour for the `raw` parameter in the `ApiClient.do()` method. The `BinaryIO` is not returned directly anymore, but as part of a dict with the `contents` key. This dict will also contain response headers if returned by the API.

Internal Changes:

* Add get_workspace_id to docgen blocklist ([#549](#549)).
* Support HEAD operation and response Headers ([#547](#547)).

API Changes:

 * Changed `delete()` method for [w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html) workspace-level service with new required argument order.
 * Changed `get()` method for [w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html) workspace-level service with new required argument order.
 * Changed `update()` method for [w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html) workspace-level service with new required argument order.
 * Changed `update()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Changed `delete()` method for [w.volumes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/volumes.html) workspace-level service with new required argument order.
 * Changed `read()` method for [w.volumes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/volumes.html) workspace-level service with new required argument order.
 * Changed `update()` method for [w.volumes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/volumes.html) workspace-level service with new required argument order.
 * Added [w.online_tables](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/online_tables.html) workspace-level service.
 * Removed `name_arg` field for `databricks.sdk.service.catalog.DeleteConnectionRequest`.
 * Added `name` field for `databricks.sdk.service.catalog.DeleteConnectionRequest`.
 * Removed `full_name_arg` field for `databricks.sdk.service.catalog.DeleteVolumeRequest`.
 * Added `name` field for `databricks.sdk.service.catalog.DeleteVolumeRequest`.
 * Removed `name_arg` field for `databricks.sdk.service.catalog.GetConnectionRequest`.
 * Added `name` field for `databricks.sdk.service.catalog.GetConnectionRequest`.
 * Added `max_results` field for `databricks.sdk.service.catalog.ListVolumesRequest`.
 * Added `page_token` field for `databricks.sdk.service.catalog.ListVolumesRequest`.
 * Added `next_page_token` field for `databricks.sdk.service.catalog.ListVolumesResponseContent`.
 * Removed `full_name_arg` field for `databricks.sdk.service.catalog.ReadVolumeRequest`.
 * Added `name` field for `databricks.sdk.service.catalog.ReadVolumeRequest`.
 * Removed `name_arg` field for `databricks.sdk.service.catalog.UpdateConnection`.
 * Added `name` field for `databricks.sdk.service.catalog.UpdateConnection`.
 * Removed `assets_dir` field for `databricks.sdk.service.catalog.UpdateMonitor`.
 * Removed `full_name_arg` field for `databricks.sdk.service.catalog.UpdateVolumeRequestContent`.
 * Added `name` field for `databricks.sdk.service.catalog.UpdateVolumeRequestContent`.
 * Added `databricks.sdk.service.catalog.ContinuousUpdateStatus` dataclass.
 * Added `databricks.sdk.service.catalog.DeleteOnlineTableRequest` dataclass.
 * Added `databricks.sdk.service.catalog.FailedStatus` dataclass.
 * Added `databricks.sdk.service.catalog.GetOnlineTableRequest` dataclass.
 * Added `databricks.sdk.service.catalog.OnlineTable` dataclass.
 * Added `databricks.sdk.service.catalog.OnlineTableSpec` dataclass.
 * Added `databricks.sdk.service.catalog.OnlineTableState` dataclass.
 * Added `databricks.sdk.service.catalog.OnlineTableStatus` dataclass.
 * Added `databricks.sdk.service.catalog.PipelineProgress` dataclass.
 * Added `databricks.sdk.service.catalog.ProvisioningStatus` dataclass.
 * Added `databricks.sdk.service.catalog.TriggeredUpdateStatus` dataclass.
 * Added `databricks.sdk.service.catalog.ViewData` dataclass.
 * Added `get_directory_metadata()` method for [w.files](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/files.html) workspace-level service.
 * Added `get_metadata()` method for [w.files](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/files.html) workspace-level service.
 * Added `content_length` field for `databricks.sdk.service.files.DownloadResponse`.
 * Added `content_type` field for `databricks.sdk.service.files.DownloadResponse`.
 * Added `last_modified` field for `databricks.sdk.service.files.DownloadResponse`.
 * Added `databricks.sdk.service.files.FileSize` dataclass.
 * Added `databricks.sdk.service.files.GetDirectoryMetadataRequest` dataclass.
 * Added `databricks.sdk.service.files.GetMetadataRequest` dataclass.
 * Added `databricks.sdk.service.files.GetMetadataResponse` dataclass.
 * Added `databricks.sdk.service.files.LastModifiedHttpDate` dataclass.
 * Removed `trigger_history` field for `databricks.sdk.service.jobs.Job`.
 * Removed `databricks.sdk.service.jobs.TriggerEvaluation` dataclass.
 * Removed `databricks.sdk.service.jobs.TriggerHistory` dataclass.
 * Added `table` field for `databricks.sdk.service.jobs.TriggerSettings`.
 * Added `databricks.sdk.service.jobs.Condition` dataclass.
 * Added `databricks.sdk.service.jobs.TableTriggerConfiguration` dataclass.
 * Removed `config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Added `ai21labs_config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Added `anthropic_config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Added `aws_bedrock_config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Added `cohere_config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Added `databricks_model_serving_config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Added `openai_config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Added `palm_config` field for `databricks.sdk.service.serving.ExternalModel`.
 * Removed `databricks.sdk.service.serving.ExternalModelConfig` dataclass.
 * Added `max_provisioned_throughput` field for `databricks.sdk.service.serving.ServedEntityInput`.
 * Added `min_provisioned_throughput` field for `databricks.sdk.service.serving.ServedEntityInput`.
 * Added `max_provisioned_throughput` field for `databricks.sdk.service.serving.ServedEntityOutput`.
 * Added `min_provisioned_throughput` field for `databricks.sdk.service.serving.ServedEntityOutput`.
 * Changed `delete()` method for [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service with new required argument order.
 * Changed `get()` method for [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service with new required argument order.
 * Changed `update()` method for [w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html) workspace-level service with new required argument order.
 * Removed `name_arg` field for `databricks.sdk.service.sharing.DeleteCleanRoomRequest`.
 * Added `name` field for `databricks.sdk.service.sharing.DeleteCleanRoomRequest`.
 * Removed `name_arg` field for `databricks.sdk.service.sharing.GetCleanRoomRequest`.
 * Added `name` field for `databricks.sdk.service.sharing.GetCleanRoomRequest`.
 * Removed `name_arg` field for `databricks.sdk.service.sharing.UpdateCleanRoom`.
 * Added `name` field for `databricks.sdk.service.sharing.UpdateCleanRoom`.
 * Added `enum_options` field for `databricks.sdk.service.sql.Parameter`.
 * Added `multi_values_options` field for `databricks.sdk.service.sql.Parameter`.
 * Added `query_id` field for `databricks.sdk.service.sql.Parameter`.
 * Added `databricks.sdk.service.sql.MultiValuesOptions` dataclass.

OpenAPI SHA: cdd76a98a4fca7008572b3a94427566dd286c63b, Date: 2024-02-19
@hectorcast-db hectorcast-db mentioned this pull request Feb 19, 2024
github-merge-queue bot pushed a commit that referenced this pull request Feb 19, 2024
Major Changes:

* Updated behaviour for raw parameter in `ApiClient.do()` method. The
raw data is not returned directly anymore, but as part of a dict with
the `contents` key. This dict will also contain response headers if
returned by the API.
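
A minimal sketch of how a caller of `ApiClient.do(..., raw=True)` might adapt to the new return shape described above; the endpoint path and header handling are illustrative:

```python
from databricks.sdk.core import ApiClient, Config

# Assumes authentication is configured via the environment.
client = ApiClient(Config())

# Before this change, `raw=True` returned the BinaryIO stream directly.
# Per the note above, it now returns a dict: the stream sits under the
# `contents` key, and any response headers surfaced by the API appear as
# additional entries. The path below is illustrative.
resp = client.do("GET",
                 "/api/2.0/fs/files/Volumes/main/default/vol/data.bin",
                 raw=True)
data = resp["contents"].read()
headers = {k: v for k, v in resp.items() if k != "contents"}
```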

Internal Changes:

* Add get_workspace_id to docgen blocklist
([#549](#549)).
* Support HEAD operation and response Headers
([#547](#547)).

API Changes:

* Changed `delete()`, `get()` and `update()` methods for
[w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html)
workspace-level service with new required argument order.
* Changed `update()` method for
[w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html)
workspace-level service with new required argument order.
* Changed `delete()`, `get()` and `update()` methods for
[w.volumes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/volumes.html)
workspace-level service with new required argument order.
* Added
[w.online_tables](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/online_tables.html)
workspace-level service.
* Renamed `name_arg` field to `name` for the following dataclasses:
`databricks.sdk.service.catalog.DeleteConnectionRequest`,
   `databricks.sdk.service.catalog.GetConnectionRequest`,
   `databricks.sdk.service.catalog.UpdateConnection`,
   `databricks.sdk.service.sharing.DeleteCleanRoomRequest`, 
   `databricks.sdk.service.sharing.GetCleanRoomRequest` and
   `databricks.sdk.service.sharing.UpdateCleanRoom`.
* Removed `full_name_arg` field for
`databricks.sdk.service.catalog.DeleteVolumeRequest`.
* Added `name` field for
`databricks.sdk.service.catalog.DeleteVolumeRequest`.
* Added `max_results` field for
`databricks.sdk.service.catalog.ListVolumesRequest`.
* Added `page_token` field for
`databricks.sdk.service.catalog.ListVolumesRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.catalog.ListVolumesResponseContent`.
* Removed `full_name_arg` field for
`databricks.sdk.service.catalog.ReadVolumeRequest`.
* Added `name` field for
`databricks.sdk.service.catalog.ReadVolumeRequest`.
* Removed `assets_dir` field for
`databricks.sdk.service.catalog.UpdateMonitor`.
* Removed `full_name_arg` field for
`databricks.sdk.service.catalog.UpdateVolumeRequestContent`.
* Added `name` field for
`databricks.sdk.service.catalog.UpdateVolumeRequestContent`.
* Added the following catalog dataclasses: `ContinuousUpdateStatus`,
`DeleteOnlineTableRequest`, `FailedStatus`,
`GetOnlineTableRequest`, `OnlineTable`, `OnlineTableSpec`,
`OnlineTableState`, `OnlineTableStatus`,
`PipelineProgress`, `ProvisioningStatus`, `TriggeredUpdateStatus` and
`ViewData`.
* Added `get_directory_metadata()` method for
[w.files](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/files.html)
workspace-level service.
* Added `get_metadata()` method for
[w.files](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/files.html)
workspace-level service.
* Added `content_length`, `content_type` and `last_modified` fields for
`databricks.sdk.service.files.DownloadResponse`.
* Added the following files dataclasses: `FileSize`,
`GetDirectoryMetadataRequest`, `GetMetadataRequest`,
   `GetMetadataResponse` and `LastModifiedHttpDate`.
* Removed `trigger_history` field for `databricks.sdk.service.jobs.Job`.
 * Removed `databricks.sdk.service.jobs.TriggerEvaluation` dataclass.
 * Removed `databricks.sdk.service.jobs.TriggerHistory` dataclass.
* Added `table` field for `databricks.sdk.service.jobs.TriggerSettings`.
 * Added `databricks.sdk.service.jobs.Condition` dataclass.
* Added `databricks.sdk.service.jobs.TableTriggerConfiguration`
dataclass.
* Removed `config` field for
`databricks.sdk.service.serving.ExternalModel`.
* Removed `databricks.sdk.service.serving.ExternalModelConfig`
dataclass. Fields moved to
`databricks.sdk.service.serving.ExternalModel`.
* Added `max_provisioned_throughput` and `min_provisioned_throughput`
fields for `databricks.sdk.service.serving.ServedEntityInput`.
* Added `max_provisioned_throughput` and `min_provisioned_throughput`
fields for `databricks.sdk.service.serving.ServedEntityOutput`.
* Changed `delete()` method for
[w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html)
workspace-level service with new required argument order.
* Changed `get()` method for
[w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html)
workspace-level service with new required argument order.
* Changed `update()` method for
[w.clean_rooms](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clean_rooms.html)
workspace-level service with new required argument order.
* Added `enum_options` field for `databricks.sdk.service.sql.Parameter`.
* Added `multi_values_options` field for
`databricks.sdk.service.sql.Parameter`.
 * Added `query_id` field for `databricks.sdk.service.sql.Parameter`.
 * Added `databricks.sdk.service.sql.MultiValuesOptions` dataclass.

OpenAPI SHA: cdd76a98a4fca7008572b3a94427566dd286c63b, Date: 2024-02-19