Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks Bundle Validation Error After CLI Upgrade (0.274.0 → 0.276.0)

databricksero
New Contributor II

After upgrading the Databricks CLI from version 0.274.0 to 0.276.0, bundle validation is failing with an error indicating that my configuration is formatted for "open-source Spark Declarative Pipelines" while the CLI now only supports "Lakeflow Declarative Pipelines".

Error Message

Error: /path/to/databricks-asset-bundle/jobs/jobs.yml seems to be formatted for open-source Spark Declarative Pipelines. Pipelines CLI currently only supports Lakeflow Declarative Pipelines development. To see an example of a supported pipelines template, create a new Pipelines CLI project with "pipelines init".

This configuration was working correctly with CLI version 0.274.0. The error occurs when running databricks bundle validate --target dev.
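A minimal reproduction from the bundle root (--version is the CLI's standard version flag; the comments reflect my setup):

databricks --version                      # reports 0.276.0 after the upgrade
databricks bundle validate --target dev   # fails with the error above; the same command validated cleanly on 0.274.0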

Configuration Files

pipelines.yml

# reusable yaml anchors
definitions:
  environment: &environment
    dependencies:
      - ../../dist/data_platform*.whl

  permissions: &permissions
    - group_name: users
      level: CAN_RUN

resources:
  pipelines:
    example_pipeline:
      name: example-pipeline
      serverless: true
      channel: PREVIEW
      catalog: ${var.catalog}
      schema: ${resources.schemas.bronze_schema.name}
      root_path: ${workspace.root_path}/files
      environment: *environment
      permissions: *permissions
      configuration:
        BRONZE_SCHEMA: ${var.catalog}.${resources.schemas.bronze_schema.name}
        SILVER_SCHEMA: ${var.catalog}.${resources.schemas.silver_schema.name}
      libraries:
        - file:
            path: ${var.datasources_path}/datasource/example/bronze_pipeline.py
        - file:
            path: ${var.datasources_path}/datasource/example/silver_pipeline.py

jobs.yml

# reusable yaml anchors
definitions:
  environments: &environments
    - environment_key: default
      spec:
        environment_version: "3"
        dependencies:
          - ../../dist/data_platform*.whl

  permissions: &permissions
    - group_name: users
      level: CAN_MANAGE_RUN

resources:
  jobs:
    # Example data source job
    example_job:
      name: example-full-job
      tasks:
        - task_key: example_ingestion
          spark_python_task:
            python_file: ${var.datasources_path}/datasource/example/ingest.py
            source: WORKSPACE
          environment_key: default
        - task_key: example_pipeline
          pipeline_task:
            pipeline_id: ${resources.pipelines.example_pipeline.id}
          depends_on:
            - task_key: example_ingestion
      schedule:
        quartz_cron_expression: "0 0 6 ? * MON" # Weekly on Monday at 6 AM UTC (7 AM CET)
        timezone_id: "UTC"
        pause_status: "UNPAUSED"
      email_notifications: ${var.email_notifications}
      permissions: *permissions
      environments: *environments

 

Questions

 

1. What changed between CLI versions 0.274.0 and 0.276.0 that causes this validation error?
2. Is there a migration path or configuration change required to make existing bundle configurations compatible with the new CLI version?
3. Are there any breaking changes documented for this upgrade that I should be aware of?

 

Any guidance on resolving this issue would be greatly appreciated.

 

1 ACCEPTED SOLUTION


szymon_dybczak
Esteemed Contributor III

Hi @databricksero ,

It's a bug. I've checked, and the PR fixing it has already been merged to the main branch. Check the GitHub thread below; once they build a new release, just update the Databricks CLI (they should soon release a version without the bug). 🙂

Fix oss-source pipeline errors when using the Databricks CLI by shreyas-goenka · Pull Request #3889 ...
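In the meantime, one possible stop-gap (not part of this thread's answer) is to temporarily pin the CLI back to 0.274.0 until the fixed release ships. A rough sketch; the release asset name and archive layout below are assumptions, so verify them on the databricks/cli GitHub releases page for your OS/architecture:

# download and unpack the 0.274.0 release next to the project instead of upgrading in place
# NOTE: asset name and binary path are assumptions -- check https://github.com/databricks/cli/releases
curl -fLo databricks_cli_0.274.0.zip \
  https://github.com/databricks/cli/releases/download/v0.274.0/databricks_cli_0.274.0_linux_amd64.zip
unzip -o databricks_cli_0.274.0.zip -d ./cli-0.274.0
./cli-0.274.0/databricks --version                      # should report 0.274.0
./cli-0.274.0/databricks bundle validate --target dev   # validates as before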


2 REPLIES

ManojkMohan
Honored Contributor

@databricksero  

1) What changed between CLI versions 0.274.0 and 0.276.0 that causes this validation error?

The CLI dropped support for open-source Spark Declarative Pipelines and now supports only Lakeflow Declarative Pipelines. This means configurations formatted for the older open-source pipeline format are no longer recognized and cause validation errors. It is a breaking change: the CLI returns errors for bundles whose configuration is not compliant with Lakeflow Declarative Pipelines.

2) Is there a migration path or configuration change required to make existing bundle configurations compatible with the new CLI version?

Yes. Existing configurations must be migrated to the Lakeflow Declarative Pipelines format.

3) Are there any breaking changes documented for this upgrade that I should be aware of?

Yes, the main breaking changes are:

Removal of support for open-source Spark Declarative Pipelines bundle validation. The CLI now only supports Lakeflow Declarative Pipelines.

Details:

Generate a new Lakeflow-compliant project to see the updated and supported template:

Run databricks pipelines init in your environment.

This demonstrates the expected structure and configuration syntax for pipelines.yml and related files.
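A rough sketch of that comparison workflow; the scaffold directory and file paths below are hypothetical, so adjust them to whatever the init command actually generates:

# scaffold a sample Lakeflow project, then diff it against the existing bundle config
databricks pipelines init                                     # command as referenced above
diff -u sample_pipeline_project/pipelines.yml ./pipelines.yml # scaffold path is hypothetical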

Official documentation for Lakeflow Declarative Pipelines:

Developer guide with reference examples and pipeline lifecycle details:
https://docs.databricks.com/aws/en/ldp/index.html

Release notes and updates for Lakeflow Declarative Pipelines CLI:
https://learn.microsoft.com/en-us/azure/databricks/release-notes/dlt/

Migration and moving tables between pipelines:

Guidance on moving tables between Lakeflow Declarative Pipelines (important if your existing job configurations include table moves or splits):
https://docs.databricks.com/aws/en/ldp/move-table.html
https://learn.microsoft.com/en-us/azure/databricks/ldp/move-table

Practical migration advice:

Compare your current pipelines.yml and jobs.yml closely against a generated Lakeflow project from pipelines init.

Adjust resource, permissions, environment, and path definitions to align with Lakeflow syntax and features such as Unity Catalog support and serverless compute.

Validate your bundle with databricks bundle validate --target dev after migration.
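As a minimal sketch of that final check (both are standard bundle commands already mentioned in this thread; deploy only once validation is clean):

databricks bundle validate --target dev   # should no longer report the Spark Declarative Pipelines error
databricks bundle deploy --target dev     # optional: deploy to the dev target after validation passes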

These references provide official, up-to-date information to help you adjust your deployment pipeline bundles to the new supported format and resolve the validation errors encountered in CLI 0.276.0. If you need detailed syntax or examples, the sample generated with pipelines init, together with the online documentation, will be most useful.

 
