Describe the issue
In the current implementation, I am unable to deploy a job with a table trigger that depends on one (or more) tables that do not yet exist in Unity Catalog.
This hinders CI/CD scenarios in which the table has not been created in the upper environment yet, because the table is only created by a job run that has not happened yet.
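As a minimal sketch of the scenario (job and notebook names here are hypothetical, not from my actual bundle): the table consumed by the trigger is itself produced by a job in the same bundle, so it cannot exist before the first post-deploy run.

```yaml
# Hypothetical sketch: demo.table.trigger is only created by this job's
# notebook after deployment, so it cannot pre-exist in the upper environment.
jobs:
  producer_job:
    name: "Producer - ${bundle.target}"
    tasks:
      - task_key: create_trigger_table
        notebook_task:
          # the notebook would run something like:
          #   CREATE TABLE IF NOT EXISTS demo.table.trigger AS SELECT ...
          notebook_path: ../notebooks/create_trigger_table.py
```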
Configuration
A DAB with the configuration below is not deployable unless the table demo.table.trigger already exists:
```yaml
jobs:
  demo_job:
    name: "Demo - ${bundle.target}"
    tasks:
      - task_key: demo
        notebook_task:
          notebook_path: ../notebooks/run_notebook.py
    trigger:
      pause_status: UNPAUSED
      table_update:
        table_names:
          - demo.table.trigger
```
Steps to reproduce the behavior
- Run `databricks bundle deploy ...`
- See error
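Spelled out as commands (my assumption is that validate passes, since the table existence check only seems to happen server-side during job creation):

```sh
# assumes the bundle configuration above, and that demo.table.trigger
# does not exist in the target workspace's Unity Catalog
databricks bundle validate   # presumably succeeds; the table is not checked here
databricks bundle deploy     # fails during terraform apply (see Actual Behavior)
```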
Expected Behavior
I would expect a WARNING instead of an error.
It makes sense to warn the user that their trigger might never fire, but it does not make sense to block the deployment.
Actual Behavior
The deploy fails with the error message below, stating that I cannot create a trigger on a job for a table that does not exist yet:

```
Error: terraform apply: exit status 1

Error: cannot create job: The table 'demo.table.trigger' does not exist. Please create a table in Unity Catalog and grant SELECT privilege to xxxx@xxxx.com
```
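The only workaround I see, purely as an assumption on my side and not something documented, is to pre-create an empty placeholder table before deploying (for example via the SQL Statement Execution API) and let the real pipeline overwrite it later:

```sh
# Hypothetical workaround: create a placeholder table on a SQL warehouse
# (<warehouse-id> is a placeholder), then re-run the deploy.
databricks api post /api/2.0/sql/statements --json '{
  "warehouse_id": "<warehouse-id>",
  "statement": "CREATE TABLE IF NOT EXISTS demo.table.trigger (placeholder STRING)"
}'
databricks bundle deploy
```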
OS and CLI version
Databricks CLI v0.283.0 on Windows
Is this a regression?
Did this work in a previous version of the CLI? If so, which versions did you try?
Debug Logs
Output logs if you run the command with debug logs enabled. Example: `databricks bundle deploy --log-level=debug`. Redact if needed.