Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Workspace Base Environment not applied when promoting DAB‑deployed notebooks to QA

Charansai
New Contributor III

Hi Team,

I’m trying to understand how Workspace Base Environments interact with serverless compute when using Databricks Asset Bundles (DAB).

According to the documentation:

  • Workspace Base Environments are supported only for serverless Python, Python wheel, and notebook task types

  • Jobs do not support Workspace Base Environments

  • Notebook tasks can use Workspace Base Environments only when the environment is configured directly in the notebook’s settings

  • Serverless notebook compute ≠ serverless job compute

This is causing confusion in my setup.

 

My Setup

  • I created a custom Workspace Base Environment in DEV

  • I attached this base environment to my notebooks using Notebook Settings → Environment

  • When I run the notebook manually in DEV, it correctly uses my custom base environment

  • I deploy the same notebooks to QA using Databricks Asset Bundles

  • But in QA, the notebook always uses the Standard base environment, not my custom one

Here is a simplified version of my DAB job:

```yaml
tasks:
  - task_key: Create_external_location_conformance
    notebook_task:
      notebook_path: /Workspace/Shared/Unity_Catalog/...
    environment_key: Default
```

My Questions

  1. When a notebook task runs inside a DAB‑deployed job, does it ever use the Workspace Base Environment?

  2. If I want QA to use the same custom base environment that I configured in DEV notebooks, how can I enforce that through DAB?

  3. Is the only supported method to manually open each notebook in QA and re‑select the base environment in Notebook Settings?

  4. Is there any way to automate or propagate the notebook’s environment selection during DAB deployment?

 

What I Observed

  • Running the notebook manually → uses custom base environment

  • Running the notebook via a job → uses serverless job compute, not the base environment

  • DAB’s environment_key seems unrelated to Workspace Base Environments

  • Promotion to QA does not preserve the notebook’s environment selection

 

Goal

I want:

  • DEV notebooks → use custom base environment

  • QA notebooks → also use the same custom base environment (I created an identical one in QA)

  • I want this to happen automatically via DAB, without manually opening each notebook in QA and re‑selecting the environment.

  • I'd also like to know: can the Workspace Base Environment for serverless (under Settings → Compute) be created with DAB or Terraform?

Is this possible today? If not, what is the recommended pattern?

Thanks!

1 ACCEPTED SOLUTION

emma_s
Databricks Employee

Hi, 

As you've correctly identified, Workspace Base Environments aren't currently supported by DABs, as they're a relatively new feature. They're intended to give workspace users a quick base environment, rather than to deploy notebooks as jobs through DABs.

The correct way to do this is to specify the environment in the DAB and use that environment for the job, rather than trying to use the Workspace Base Environment. See the snippet below:

```yaml
resources:
  jobs:
    serverless_job_environment:
      name: serverless_job_environment
      environments:
        - environment_key: default
          spec:
            environment_version: "2"
            dependencies:
              - "my-private-lib==1.2.3"
              - "requests==2.32.*"

      tasks:
        - task_key: nb_task
          notebook_task:
            notebook_path: ../src/notebook.ipynb
          environment_key: default
```

Docs for this are here: https://docs.databricks.com/aws/en/dev-tools/bundles/examples
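Because the environment spec lives in the bundle itself, it travels with the job when you promote it. A minimal sketch of how this might look with bundle targets — the bundle name, target names, and workspace hosts below are illustrative placeholders, not taken from your setup:

```yaml
# databricks.yml (sketch; names and hosts are illustrative)
bundle:
  name: my_bundle

targets:
  dev:
    workspace:
      host: https://dev-workspace.cloud.databricks.com
  qa:
    workspace:
      host: https://qa-workspace.cloud.databricks.com

# The job resource above (including its serverless environment spec and
# dependencies) is shared across targets, so deploying with
# `databricks bundle deploy -t qa` promotes the same environment to QA
# without having to open each notebook and re-select anything.
```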

 

I hope this helps.


Thanks,

Emma

 
