Add full_refresh attribute to the pipeline_task in databricks_job
This allows forcing a full refresh of the pipeline from the job.

This fixes #2362.
alexott committed Jun 29, 2023
1 parent 37b4320 commit e44a234
Showing 2 changed files with 3 additions and 1 deletion.
1 change: 1 addition & 0 deletions docs/resources/job.md
@@ -226,6 +226,7 @@ You can invoke Spark submit tasks only on new clusters. **In the `new_cluster` s
### pipeline_task Configuration Block

* `pipeline_id` - (Required) The pipeline's unique ID.
* `full_refresh` - (Optional) (Bool) Specifies if there should be a full refresh of the pipeline.

-> **Note** The following configuration blocks are only supported inside a `task` block
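A minimal sketch of how the new attribute can be used in a `databricks_job` resource; the job name, task key, and the referenced `databricks_pipeline.this` resource are hypothetical:

```hcl
resource "databricks_job" "this" {
  name = "refresh-pipeline" # hypothetical job name

  task {
    task_key = "refresh"

    pipeline_task {
      # assumes a databricks_pipeline resource named "this" exists
      pipeline_id  = databricks_pipeline.this.id
      full_refresh = true
    }
  }
}
```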

3 changes: 2 additions & 1 deletion jobs/resource_job.go
@@ -57,7 +57,8 @@ type PythonWheelTask struct {

// PipelineTask contains the information for pipeline jobs
type PipelineTask struct {
-	PipelineID string `json:"pipeline_id"`
+	PipelineID  string `json:"pipeline_id"`
+	FullRefresh bool   `json:"full_refresh,omitempty"`
}

type SqlQueryTask struct {
