Data Engineering

Databricks Asset Bundles issue

DatabricksEngi1
New Contributor III

Hi all,

I’m working with Databricks Asset Bundles (DAB) and trying to move from a single repository-level bundle to a structure where each workflow (folder under resources/jobs) has its own bundle.
My repository contains:
• Shared src/variables.yml and src/env/*.yml for common variables and permissions
• Multiple job definitions in resources/jobs/<workflow_name>/*.yml
• A separate databricks.yml inside each workflow folder, so that each workflow can be deployed independently as its own bundle (see the sketch below)
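A minimal sketch of the layout I mean (workflow and file names like workflow_a and job_a.yml are made up for illustration):

repo-root/
├── src/
│   ├── variables.yml               # shared variables
│   └── env/                        # per-environment permissions
└── resources/
    └── jobs/
        ├── workflow_a/
        │   ├── databricks.yml      # per-workflow bundle root
        │   └── job_a.yml
        └── workflow_b/
            ├── databricks.yml
            └── job_b.yml

with each per-workflow databricks.yml along these lines (placeholder host, hypothetical names):

bundle:
  name: workflow_a

include:
  - job_a.yml                       # job definition(s) next to this file

targets:
  dev:
    default: true
    workspace:
      host: https://<your-workspace-host>

As far as I know, include globs are resolved relative to the folder containing databricks.yml, which is part of what makes sharing the src/*.yml files across per-workflow bundles awkward in the first place.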

What happens:
• When I keep a single databricks.yml at the repo root and include all jobs in one bundle, deployment works fine.
• When I move to one bundle per workflow (placing databricks.yml inside each job folder), databricks bundle validate passes or gives only warnings, but databricks bundle deploy fails with:

Error: exit status 1
Error: failed to read provider configuration schema for registry.terraform.io/databricks/databricks:
failed to instantiate provider "registry.terraform.io/databricks/databricks" to obtain schema:
Unrecognized remote plugin message:
This usually means that the plugin is either invalid or simply needs to be recompiled to support the latest protocol.

This happens even when the two bundles are nearly identical (just adapted per workflow/job). One workflow deploys successfully, another fails with this error.
Has anyone seen this Databricks provider schema error only when splitting bundles by workflow?

Thank you

1 ACCEPTED SOLUTION

DatabricksEngi1
New Contributor III

I solved it.

For some reason, the Terraform folder created under the bundles wasn’t set up correctly.

I copied it from a working bundle, and everything completed successfully.
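For anyone who hits this later: as far as I know, the CLI materializes a Terraform working directory, including the provider binary, under a hidden .databricks folder inside the bundle root, and a corrupted or partially written provider there produces exactly this "Unrecognized remote plugin message" failure. Instead of copying the folder from a working bundle, deleting the cache and letting the next deploy recreate it should also work; a sketch, assuming the per-workflow layout above (paths hypothetical):

cd resources/jobs/workflow_a      # bundle root whose deploy is failing
rm -rf .databricks                # drop the cached Terraform working directory
databricks bundle deploy -t dev   # the CLI re-downloads the provider and redeploys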


2 REPLIES

Coffee77
New Contributor II

Unfortunately, some variables cannot be read or substituted in the configuration YAML files, so those files can't be made as dynamic as you'd like when covering multiple environments. Not sure if this is your case. Take a look at other topics where different jobs need to be deployed per environment; there are some workarounds that might fit 🙂
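One workaround along those lines, as a sketch (variable and target names invented, not from this thread): a bundle can declare variables with defaults and override them per target, and job YAML can then reference them as ${var.catalog}:

variables:
  catalog:
    description: Catalog to deploy into
    default: dev_catalog

targets:
  dev:
    default: true
  prod:
    variables:
      catalog: prod_catalog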

https://www.youtube.com/@CafeConData

