Support Databricks WebAPI 2.1 and the existing_cluster_id and new_cluster options to create a Job
#4361
TL;DR
- Fix the Delete function: the POST request data format was wrong.
- Add a message in TaskInfo when the run is in the PENDING lifeCycleState. (SKIPPED doesn't have a resultState, but TERMINATING does: https://docs.databricks.com/en/workflows/jobs/jobs-2.0-api.html#runresultstate)
- Currently, only the new_cluster key is supported in the Databricks config. However, some users want to use Databricks API 2.1 with existing_cluster_id in the Databricks config. Here's the screenshot from the official documentation; you can find the difference between existing_cluster_id and new_cluster at https://docs.databricks.com/en/workflows/jobs/jobs-2.0-api.html#request-structure. A request sketch follows this list.
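To make the difference concrete, here is a minimal sketch of a runs/submit request body with either option set. The struct names, cluster ID, and cluster settings are illustrative placeholders rather than the plugin's actual types; only one of existing_cluster_id or new_cluster should be present in a request.

```go
package main

import (
	"encoding/json"
	"os"
)

// newCluster and submitRequest loosely mirror the documented Jobs API
// request structure; they are illustrative, not the plugin's real structs.
type newCluster struct {
	SparkVersion string `json:"spark_version"`
	NodeTypeID   string `json:"node_type_id"`
	NumWorkers   int    `json:"num_workers"`
}

type submitRequest struct {
	RunName           string      `json:"run_name,omitempty"`
	ExistingClusterID string      `json:"existing_cluster_id,omitempty"` // reuse a running cluster
	NewCluster        *newCluster `json:"new_cluster,omitempty"`         // or create one per run
}

func main() {
	enc := json.NewEncoder(os.Stdout)
	enc.SetIndent("", "  ")

	// Option 1: attach the run to an existing cluster by ID (placeholder value).
	enc.Encode(submitRequest{RunName: "flyte-task", ExistingClusterID: "1234-567890-abcde123"})

	// Option 2: let Databricks create a new cluster for this run (placeholder settings).
	enc.Encode(submitRequest{RunName: "flyte-task", NewCluster: &newCluster{
		SparkVersion: "13.3.x-scala2.12", NodeTypeID: "i3.xlarge", NumWorkers: 2,
	}})
}
```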
Type
Are all requirements met?
Complete description
Config yaml in dev mode
Example Code
Test it
Screenshots
Delete Function works correctly
Note: I cancelled the job 24 seconds after triggering it; it terminated at 27 seconds.
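For reference, a minimal sketch of the cancel call this fix concerns: posting a JSON body with run_id to the documented runs/cancel endpoint. The host, token, and run ID are placeholders, and cancelRun is a hypothetical helper, not the plugin's actual Delete implementation.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// cancelRun posts {"run_id": ...} to the Jobs API runs/cancel endpoint.
// The endpoint and body shape follow the Databricks docs; the function
// itself is a hypothetical helper for illustration only.
func cancelRun(host, token string, runID int64) error {
	body, err := json.Marshal(map[string]int64{"run_id": runID})
	if err != nil {
		return err
	}
	req, err := http.NewRequest(http.MethodPost, host+"/api/2.1/jobs/runs/cancel", bytes.NewReader(body))
	if err != nil {
		return err
	}
	req.Header.Set("Authorization", "Bearer "+token)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("cancel failed: %s", resp.Status)
	}
	return nil
}

func main() {
	// Placeholder workspace host, token, and run ID.
	if err := cancelRun("https://example.cloud.databricks.com", "<token>", 123456); err != nil {
		fmt.Println(err)
	}
}
```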
Add Message in TaskInfo when in Pending State
Create a job. Note: I print the log in databricks/plugin.go.
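As a rough illustration of the state handling this message relies on, here is a sketch that decodes the state object returned by runs/get and surfaces state_message while the run is still pending. The struct and the grouping of states are assumptions for illustration, not the plugin's actual code.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// runState mirrors the documented "state" object returned by runs/get.
// It is a sketch for illustration, not the plugin's actual type.
type runState struct {
	LifeCycleState string `json:"life_cycle_state"`
	ResultState    string `json:"result_state,omitempty"` // absent for SKIPPED runs
	StateMessage   string `json:"state_message,omitempty"`
}

func main() {
	// Example payload for a run that is still starting up.
	raw := []byte(`{"life_cycle_state":"PENDING","state_message":"Waiting for cluster"}`)

	var s runState
	if err := json.Unmarshal(raw, &s); err != nil {
		panic(err)
	}

	switch s.LifeCycleState {
	case "PENDING", "RUNNING", "TERMINATING":
		// Surface the message while the run has not finished yet.
		fmt.Printf("run is %s: %s\n", s.LifeCycleState, s.StateMessage)
	case "SKIPPED", "INTERNAL_ERROR":
		// No result_state here; use state_message as the error message.
		fmt.Printf("run failed: %s\n", s.StateMessage)
	default: // TERMINATED carries a result_state (SUCCESS, FAILED, ...)
		fmt.Printf("run finished with result_state=%s\n", s.ResultState)
	}
}
```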
Tracking Issue
#3936
#4362
Related PRs
flyteorg/flytekit#1935