<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Environment in serverless in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/environment-in-serverless/m-p/136845#M50657</link>
    <description>&lt;P&gt;I'm playing around a little with the free Databricks environment and I'm confused by the gap between the documentation and the actual behavior. Maybe you can help me understand it better.&lt;/P&gt;&lt;P&gt;For the workspace I can define a base environment that I can then use with serverless compute. For example, I defined mine as:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;environment_version: '4'
dependencies:
  - --index-url https://pypi.org/simple
  - databricks-sdk&amp;gt;=0.71.0
  - databricks-labs-dqx&amp;gt;=0.9.3
  - openpyxl&lt;/LI-CODE&gt;&lt;P&gt;I started a notebook, switched to my base environment, and it worked. The &lt;A href="https://docs.databricks.com/aws/en/admin/workspace-settings/base-environment#limitations" target="_self"&gt;documentation&lt;/A&gt; states:&lt;/P&gt;&lt;P&gt;"&lt;SPAN&gt;For jobs, only notebook tasks can use base environments&lt;/SPAN&gt;".&lt;/P&gt;&lt;P&gt;I was glad to see this because I had played with serverless jobs before and could not use environments in them. So I tried again with a very simple job definition:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;resources:
  jobs:
    New_Job_Oct_31_2025_12_37_AM:
      name: New Job Oct 31, 2025, 12:37 AM
      tasks:
        - task_key: test
          notebook_task:
            notebook_path: /Workspace/Users/hidden@gmail.com/test2
            source: WORKSPACE
          environment_key: some_environment_key
      queue:
        enabled: true
      performance_target: PERFORMANCE_OPTIMIZED
      environments:
        - environment_key: some_environment_key
          spec:
            client: "4"&lt;/LI-CODE&gt;&lt;P&gt;I know that the definition is not complete because I'm missing dependencies but upon saving the job definition, I'm still getting:&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;"A task environment can not be provided for notebook task test. Please use the %pip magic command to install notebook-scoped Python libraries and Python wheel packages".&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;Hence my confusion.&lt;/P&gt;&lt;P&gt;Is it possible to use environments with notebook tasks?&lt;/P&gt;&lt;P&gt;It's strange that it works with serverless in interactive notebook&amp;nbsp; when I switch to my base environment, and doesn't work with job task.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Fri, 31 Oct 2025 00:00:14 GMT</pubDate>
    <dc:creator>pepco</dc:creator>
    <dc:date>2025-10-31T00:00:14Z</dc:date>
    <item>
      <title>Environment in serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/environment-in-serverless/m-p/136845#M50657</link>
      <description>&lt;P&gt;I'm playing around a little with the free Databricks environment and I'm confused by the gap between the documentation and the actual behavior. Maybe you can help me understand it better.&lt;/P&gt;&lt;P&gt;For the workspace I can define a base environment that I can then use with serverless compute. For example, I defined mine as:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;environment_version: '4'
dependencies:
  - --index-url https://pypi.org/simple
  - databricks-sdk&amp;gt;=0.71.0
  - databricks-labs-dqx&amp;gt;=0.9.3
  - openpyxl&lt;/LI-CODE&gt;&lt;P&gt;I started a notebook, switched to my base environment, and it worked. The &lt;A href="https://docs.databricks.com/aws/en/admin/workspace-settings/base-environment#limitations" target="_self"&gt;documentation&lt;/A&gt; states:&lt;/P&gt;&lt;P&gt;"&lt;SPAN&gt;For jobs, only notebook tasks can use base environments&lt;/SPAN&gt;".&lt;/P&gt;&lt;P&gt;I was glad to see this because I had played with serverless jobs before and could not use environments in them. So I tried again with a very simple job definition:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;resources:
  jobs:
    New_Job_Oct_31_2025_12_37_AM:
      name: New Job Oct 31, 2025, 12:37 AM
      tasks:
        - task_key: test
          notebook_task:
            notebook_path: /Workspace/Users/hidden@gmail.com/test2
            source: WORKSPACE
          environment_key: some_environment_key
      queue:
        enabled: true
      performance_target: PERFORMANCE_OPTIMIZED
      environments:
        - environment_key: some_environment_key
          spec:
            client: "4"&lt;/LI-CODE&gt;&lt;P&gt;I know that the definition is not complete because I'm missing dependencies but upon saving the job definition, I'm still getting:&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;"A task environment can not be provided for notebook task test. Please use the %pip magic command to install notebook-scoped Python libraries and Python wheel packages".&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;Hence my confusion.&lt;/P&gt;&lt;P&gt;Is it possible to use environments with notebook tasks?&lt;/P&gt;&lt;P&gt;It's strange that it works with serverless in interactive notebook&amp;nbsp; when I switch to my base environment, and doesn't work with job task.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 31 Oct 2025 00:00:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/environment-in-serverless/m-p/136845#M50657</guid>
      <dc:creator>pepco</dc:creator>
      <dc:date>2025-10-31T00:00:14Z</dc:date>
    </item>
    <item>
      <title>Re: Environment in serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/environment-in-serverless/m-p/136926#M50666</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/154800"&gt;@pepco&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;&lt;STRONG&gt;Is it possible to use environments with notebook tasks?&lt;/STRONG&gt;&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;Yes, but only in a very specific way.&lt;/P&gt;
&lt;P&gt;Notebook tasks can use base environments, but you don't attach them in the job's YAML. You pick the base environment in the notebook's Environment side panel, and notebook job runs inherit that choice. If you put environment_key on a notebook task, the Jobs API rejects it with the exact error you saw: that setting is for task environments on non-notebook tasks (Python script, Python wheel, and dbt tasks).&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Doc&lt;/STRONG&gt;: &lt;A href="https://docs.databricks.com/aws/en/admin/workspace-settings/base-environment" target="_blank"&gt;https://docs.databricks.com/aws/en/admin/workspace-settings/base-environment&lt;/A&gt;&lt;/P&gt;</description>
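To illustrate the distinction in the reply above, here is a minimal sketch of a job definition where environment_key is accepted: the environment is attached to a spark_python_task rather than a notebook_task. The task key, file path, and dependency list are illustrative assumptions, not values taken from the thread.

```yaml
# Sketch: on serverless jobs, a task environment (environment_key)
# attaches to non-notebook tasks such as spark_python_task.
# Task key, path, and dependencies below are illustrative assumptions.
resources:
  jobs:
    example_job:
      name: example_job
      tasks:
        - task_key: py_script
          spark_python_task:
            python_file: /Workspace/Users/someone@example.com/script.py
          environment_key: some_environment_key   # valid here, not on notebook_task
      environments:
        - environment_key: some_environment_key
          spec:
            client: "4"
            dependencies:
              - openpyxl
```

A notebook_task in the same job would instead rely on the base environment selected in the notebook UI, or on %pip installs inside the notebook.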
      <pubDate>Fri, 31 Oct 2025 11:15:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/environment-in-serverless/m-p/136926#M50666</guid>
      <dc:creator>K_Anudeep</dc:creator>
      <dc:date>2025-10-31T11:15:22Z</dc:date>
    </item>
    <item>
      <title>Re: Environment in serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/environment-in-serverless/m-p/136988#M50676</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;You are correct that I can choose an environment in the notebook. But in my opinion the documentation is misleading, because it says: "&lt;SPAN&gt;For jobs, only notebook tasks can use base environments.&lt;/SPAN&gt;"&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Based on your explanation, it really means that a base environment can be used &lt;STRONG&gt;only when the notebook is run manually&lt;/STRONG&gt; on attached serverless compute with the custom environment specification selected.&lt;/SPAN&gt;&lt;/P&gt;</description>
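For a notebook that must also run as a scheduled serverless job task, the fallback named in the Jobs API error message is notebook-scoped installs in the notebook itself. A minimal sketch of a first cell, assuming the same packages as the base environment in the original post:

```
# First notebook cell: notebook-scoped installs, the fallback the
# Jobs API error suggests for notebook tasks on serverless jobs.
%pip install "databricks-sdk>=0.71.0" "databricks-labs-dqx>=0.9.3" openpyxl

# Restart the Python process so the newly installed packages are importable.
%restart_python
```

These magic commands run per notebook session, so the installs apply to both interactive runs and job runs of the notebook, at the cost of install time on each run.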
      <pubDate>Fri, 31 Oct 2025 13:47:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/environment-in-serverless/m-p/136988#M50676</guid>
      <dc:creator>pepco</dc:creator>
      <dc:date>2025-10-31T13:47:29Z</dc:date>
    </item>
  </channel>
</rss>

