Describe the issue
Description:
Similar to the strict OpenAPI schema validation issues reported in #4883, the Databricks CLI currently fails to compile and deploy valid, officially documented configurations for the Lakeflow Connect Meta Ads connector.
Specifically, the CLI rejects the meta_ads_options block nested under connector_options during a bundle deploy or pipelines update, effectively locking users out of pulling conversion data (action breakdowns) via IaC.
Official Documentation Reference:
According to the official Lakeflow Connect documentation for Meta Ads, users must configure action_breakdowns and action_attribution_windows inside the meta_ads_options block to retrieve conversion data:
Ingest ad_insights with meta_ads_options
Technical Explanation & Troubleshooting Journey:
When attempting to deploy a Delta Live Tables / Lakeflow Connect pipeline via Databricks Asset Bundles, we defined the ad_insights table configuration to include the meta_ads_options block exactly as documented.
When executing `databricks bundle deploy`, the local CLI schema validator emits an `unknown field: meta_ads_options` error/warning and strips the configuration from the payload before sending it to the backend. Attempting to bypass the bundle with `databricks pipelines update --json @file.json` hits the same schema validation, which strips the block in exactly the same way.
To confirm this was strictly a CLI validation bug and not a backend API limitation, we bypassed the CLI's validation entirely by using the raw REST API passthrough: `databricks api put /api/2.0/pipelines/{pipeline_id} --json @update.json`.
This workaround hit the backend REST API successfully, and the Databricks UI immediately reflected the injected action_breakdowns and action_attribution_windows configuration, confirming that the control plane accepts this schema.
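For reference, the update.json used in the passthrough call looked roughly like the following. The pipeline ID is a placeholder, and the payload shape beyond the documented ingestion_definition fields is an assumption based on mirroring the bundle YAML into the REST request body:

```json
{
  "id": "<pipeline-id>",
  "name": "meta_ads_ingestion_pipeline",
  "ingestion_definition": {
    "connection_name": "meta_ads_connection",
    "objects": [
      {
        "table": {
          "source_schema": "act_123456789",
          "source_table": "ad_insights",
          "destination_catalog": "raw_catalog",
          "destination_schema": "meta_ads",
          "destination_table": "ad_insights_action_breakdowns",
          "table_configuration": {
            "scd_type": "SCD_TYPE_1",
            "connector_options": {
              "meta_ads_options": {
                "level": "ad",
                "action_breakdowns": ["action_type"],
                "action_attribution_windows": ["7d_click", "1d_view"]
              }
            }
          }
        }
      }
    ]
  }
}
```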
However, this introduces an impassable Catch-22 for teams relying on CI/CD and Infrastructure as Code:
Because the pipeline is managed by a Databricks Asset Bundle, the authoritative state is dictated by the bundle's metadata.json state file. Since the CLI compiler stripped meta_ads_options during the initial deployment, the official bundle state does not contain the configuration. When we manually trigger the pipeline in the UI (after successfully injecting the JSON via the API), the DLT engine compares the live configuration against the metadata.json state file, detects drift, and immediately auto-reverts the pipeline to the stripped configuration.
Because the CLI rejects the valid code, it cannot be added to the state file. Because it is not in the state file, the backend auto-erases any manual API overrides. We are completely blocked from configuring Meta Ads conversion ingestion via the CLI.
Steps to reproduce the behavior
Attempt to run `databricks bundle deploy` with the following valid configuration in `databricks.yml` or a resources file:
```yaml
resources:
  pipelines:
    meta_ads_ingestion_pipeline:
      name: meta_ads_ingestion_pipeline
      ingestion_definition:
        connection_name: meta_ads_connection
        objects:
          - table:
              source_schema: act_123456789
              source_table: ad_insights
              destination_catalog: raw_catalog
              destination_schema: meta_ads
              destination_table: ad_insights_action_breakdowns
              table_configuration:
                scd_type: SCD_TYPE_1
                connector_options:
                  meta_ads_options:
                    level: ad
                    action_breakdowns:
                      - action_type
                    action_attribution_windows:
                      - 7d_click
                      - 1d_view
```
Expected Behavior
The Databricks CLI should recognize meta_ads_options (and its nested parameters) as a valid block under connector_options for Meta Ads ingestion pipelines, compile it into the bundle state, and successfully pass it to the backend without throwing validation errors or stripping the payload.
Actual Behavior
When executing `databricks bundle deploy`, the local CLI schema validator emits an `unknown field: meta_ads_options` error/warning and strips the configuration from the payload before sending it to the backend.
Environment and CLI version
Databricks CLI Version: v0.299.1
Target Environment: Azure Databricks (Lakeflow Connect Preview)
Is this a regression?
N/A