Data Engineering

Deploying using Databricks asset bundles (DABs) in a closed network

dbxlearner
New Contributor II

Hello, I'm trying to deploy DBX workflows with DABs via an Azure DevOps pipeline, in a network that cannot download the required Terraform Databricks provider package online due to firewall/network restrictions.

I have followed this post on how to do this by providing the Terraform Databricks provider file locally: https://github.com/databricks/cli/issues/1238. However, the provider version that the 'databricks bundle deploy' command tries to use (1.71.0) does not match my file's version (1.50.0), so I'm getting the following error: "Error: failed to query available provider packages, could not retrieve the list of available versions for provider databricks/databricks: no available releases match the given constraints 1.71.0".

I had a look at the bundle.tf.json file that 'databricks bundle deploy' generates, and it states version 1.71.0. Is there a way to set this to the version I want?

3 REPLIES

dbxlearner
New Contributor II

Another thing I noticed is, when running the 'databricks bundle debug terraform' command, it mentions these variables:

(screenshot of the environment variables listed by the command)

I have tried setting these variables as environment variables in my ADO pipeline, especially the Databricks Terraform provider variable, but it has not had any effect. If anyone has previously used these variables, I would appreciate it if you could let me know how to set them properly.

 

sunilr8
New Contributor II

@dbxlearner were you able to fix the issue? I am running into the same one.

mark_ott
Databricks Employee

There is currently no direct configuration field inside a Databricks Asset Bundle (DAB) to override the Terraform Databricks provider version that databricks bundle deploy uses. The provider version (like 1.71.0) is automatically chosen by the Databricks CLI based on the bundle schema it was built against. However, there are workarounds that allow you to force DAB to use your local provider (like 1.50.0) even in a closed or air-gapped network.
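
Before staging anything, it can help to re-run the command mentioned earlier in this thread to confirm which provider version and which environment variables your CLI build expects (the exact output varies by CLI version):

bash
# Prints the Terraform/provider versions and related environment variables the CLI will use
databricks bundle debug terraform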

Recommended Workaround

To pin the provider version offline and bypass the registry check, you can pre-create a Terraform lock file and local mirror.

  1. Create a Terraform CLI config file (terraform.rc or cli.tfrc):

    text
    provider_installation {
      filesystem_mirror {
        path    = "/path/to/providers"
        include = ["*/*/*"]
      }
      direct {
        exclude = ["*/*/*"]
      }
    }

    Then set the environment variable:

    bash
    export TF_CLI_CONFIG_FILE=/path/to/terraform.rc
  2. Manually create a .terraform.lock.hcl file:
    Before running databricks bundle deploy, create the file
    .databricks/bundle/stage/terraform/.terraform.lock.hcl (where stage is your bundle target name) and specify your desired provider version and checksum (matching your downloaded 1.50.0 provider).
    This locks the CLI to that version and disables remote version queries; a sketch of both the mirror layout and this lock file appears after this list.

  3. Set Databricks CLI environment variables in your DevOps pipeline:

    bash
    export DATABRICKS_TF_PROVIDER_VERSION=1.50.0
    export DATABRICKS_TF_EXEC_PATH=/path/to/terraform
    export DATABRICKS_TF_CLI_CONFIG_FILE=/path/to/terraform.rc

    While DATABRICKS_TF_PROVIDER_VERSION does not always override the internal version constraint, setting it along with the local lock file and mirror ensures Terraform uses your supplied local provider.

  4. Avoid regeneration of bundle.tf.json between runs:
    The CLI auto-produces this on each deploy, so rely on overriding behavior with lock files and mirrors rather than editing it directly.
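
To make steps 1 and 2 concrete, here is a minimal sketch of how the mirror and lock file could be prepared before the deploy step. The paths, the provider zip filename, the stage target name, and the placeholder hash are assumptions; the hash must come from your actual 1.50.0 download (for example the SHA256SUMS file published with the provider release, or 'terraform providers lock' run once on a machine that does have internet access).

bash
#!/usr/bin/env bash
set -euo pipefail

# --- Assumed locations; adjust to your pipeline layout ---
MIRROR_DIR=/path/to/providers                                        # same path as in terraform.rc
PROVIDER_ZIP=terraform-provider-databricks_1.50.0_linux_amd64.zip    # downloaded out-of-band
BUNDLE_TF_DIR=.databricks/bundle/stage/terraform                     # "stage" = your bundle target name

# Step 1: lay the provider zip out in the structure Terraform expects for a filesystem mirror:
#   <mirror>/registry.terraform.io/<namespace>/<type>/<zip>
mkdir -p "${MIRROR_DIR}/registry.terraform.io/databricks/databricks"
cp "${PROVIDER_ZIP}" "${MIRROR_DIR}/registry.terraform.io/databricks/databricks/"

# Step 2: pre-create the dependency lock file so Terraform pins 1.50.0 and never queries the registry.
# The hash below is a placeholder -- replace it with the real value for your provider build.
mkdir -p "${BUNDLE_TF_DIR}"
cat > "${BUNDLE_TF_DIR}/.terraform.lock.hcl" <<'EOF'
provider "registry.terraform.io/databricks/databricks" {
  version     = "1.50.0"
  constraints = "1.50.0"
  hashes = [
    "h1:REPLACE_WITH_REAL_HASH_FOR_YOUR_DOWNLOAD",
  ]
}
EOF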

Why editing bundle.tf.json does not work

bundle.tf.json is auto-generated and overwritten by the Databricks CLI on each deployment. Even if you manually set the version inside it, the CLI will regenerate it based on the internal template schema that hardcodes the supported provider (currently 1.71.0).

Summary

Goal | Action
Force an older provider version | Pre-create .databricks/bundle/stage/terraform/.terraform.lock.hcl referencing your desired version
Avoid internet downloads | Configure TF_CLI_CONFIG_FILE to point at a local provider_installation mirror
Databricks CLI ignores DATABRICKS_TF_PROVIDER_VERSION | Use it together with the lock file and mirror; the variable alone is not enough
 
 

By combining a manually created local mirror and lockfile, you can reliably use a pinned Databricks provider (like 1.50.0) inside a restricted network where automatic provider fetching fails.
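
If it helps, here is roughly what the deploy step in the Azure DevOps job could look like once the mirror and lock file above are in place (the paths, the stage target name, and the location of a pre-installed terraform binary are assumptions):

bash
# Point Terraform and the Databricks CLI at the offline configuration prepared earlier
export TF_CLI_CONFIG_FILE=/path/to/terraform.rc
export DATABRICKS_TF_CLI_CONFIG_FILE=/path/to/terraform.rc
export DATABRICKS_TF_EXEC_PATH=/path/to/terraform        # pre-installed terraform binary
export DATABRICKS_TF_PROVIDER_VERSION=1.50.0

# Deploy; Terraform should now resolve databricks/databricks 1.50.0
# from the filesystem mirror instead of the public registry.
databricks bundle deploy -t stage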