I'm playing around a little with the Databricks free environment and I'm quite confused by the documentation versus the actual behavior. Maybe you can help me understand it better.
For the workspace I can define a base environment that I can use with serverless compute. For example, I defined mine as:
environment_version: '4'
dependencies:
  - --index-url https://pypi.org/simple
  - databricks-sdk>=0.71.0
  - databricks-labs-dqx>=0.9.3
  - openpyxl
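(As a quick sanity check once a notebook is attached to this environment, a cell like the following prints what actually got resolved; it's plain standard-library Python, nothing Databricks-specific:)

import importlib.metadata

# Print the resolved version of each package from the base environment above
for pkg in ["databricks-sdk", "databricks-labs-dqx", "openpyxl"]:
    print(pkg, importlib.metadata.version(pkg))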
I started a notebook, switched to my base environment, and it worked. As per the documentation:
"For jobs, only notebook tasks can use base environments".
I was really happy to see this, because I had played with serverless jobs before and couldn't get environments to work in them. So I tried again with a very simple job definition:
resources:
  jobs:
    New_Job_Oct_31_2025_12_37_AM:
      name: New Job Oct 31, 2025, 12:37 AM
      tasks:
        - task_key: test
          notebook_task:
            notebook_path: /Workspace/Users/hidden@gmail.com/test2
            source: WORKSPACE
          environment_key: some_environment_key
      queue:
        enabled: true
      performance_target: PERFORMANCE_OPTIMIZED
      environments:
        - environment_key: some_environment_key
          spec:
            client: "4"
I know the definition isn't complete, since the environment spec is missing its dependencies, but already upon saving the job definition I get:
"A task environment can not be provided for notebook task test. Please use the %pip magic command to install notebook-scoped Python libraries and Python wheel packages".
Hence my confusion.
Is it possible to use environments with notebook tasks?
It's strange that switching to my base environment works with serverless compute in an interactive notebook, but the same environment can't be attached to a notebook task in a job.
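For comparison, my understanding from the docs is that the environments block is intended for serverless tasks other than notebook tasks, e.g. a Python script task. A minimal sketch of what I mean (job name, task key, and script path are made-up placeholders; the dependencies mirror my base environment):

resources:
  jobs:
    env_test_job:
      name: env test job
      tasks:
        - task_key: script_test
          spark_python_task:
            python_file: /Workspace/Users/hidden@gmail.com/test_script.py  # placeholder script
          environment_key: some_environment_key
      environments:
        - environment_key: some_environment_key
          spec:
            client: "4"
            dependencies:
              - databricks-sdk>=0.71.0
              - databricks-labs-dqx>=0.9.3
              - openpyxl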