<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Unable to access Databricks Volume from job triggered via API (Container Services) in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/137936#M4410</link>
    <description>&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Databricks Volumes (especially Unity Catalog, or UC, volumes) have strict execution-context requirements: they expect the workload to run on Databricks-managed clusters or in notebooks, where the specialized file system and security context are present. Your description matches a known friction point: container services may not fully replicate the Databricks runtime's native context, so access to Volumes fails even though credentialed access to raw S3 works fine.&lt;/P&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Are Volumes Intentionally Not Accessible from Container Services?&lt;/H2&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Yes, this is generally by design as of late 2025. Databricks Volumes and UC volumes are mounted using filesystem drivers and security controls that expect a Databricks-managed execution context. When jobs are run in externalized container services (like Docker/Podman containers spun up outside the Databricks control plane), the Volumes file system layer is often unavailable:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Native "/Volumes" and Unity Catalog paths depend on Databricks' FUSE/DBFS/VFS overlays.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;These overlays are only present in Databricks-provisioned environments (clusters, serverless compute, or the interactive Databricks notebook context).&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Externalized workloads via Databricks Container Services or custom drivers (like REST API-triggered containers or Kubernetes pods) typically do NOT have direct access to these overlays, leading to permission errors or mount failures.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Official Documentation on Execution Contexts and Volume Access&lt;/H2&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Databricks documentation does clarify this restriction, but it is often scattered:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Unity Catalog and Volumes&lt;/STRONG&gt;: The official Unity Catalog documentation notes that Volumes are only accessible from Databricks clusters with Unity Catalog enabled, and not from all external interfaces. Only Databricks-interactive or workflow clusters can resolve "/Volumes/" paths.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;DBFS and REST API/Containers&lt;/STRONG&gt;: The documentation also notes that paths like "/Volumes", "/mnt" (for legacy DBFS mounts), and related VFS overlays are not available in:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Jobs running on external custom Kubernetes clusters using Databricks Container Services.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Direct REST API containers, or any compute that runs outside the Databricks cluster control plane.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Recommended Approach&lt;/STRONG&gt;: For workloads that need to access data both inside and outside Databricks, official best practice is to:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Use direct access to cloud object storage (e.g., S3 paths) for jobs that may run outside native Databricks compute contexts.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Avoid using "/Volumes" or "/mnt" mounts outside of Databricks-managed clusters.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Permissions&lt;/STRONG&gt;: Even with Unity Catalog privileges and correct instance profile configuration, the FUSE driver and security context are missing in container service jobs; hence, access fails.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
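&lt;P&gt;The direct-to-S3 route recommended above can be sketched with boto3. This is a hedged sketch, not Databricks-official code: the &lt;CODE&gt;s3://&lt;/CODE&gt; URI is a placeholder, and credentials come from the container's ambient environment (e.g. the instance profile this thread confirms already works):&lt;/P&gt;

```python
from urllib.parse import urlparse


def split_s3_uri(uri: str) -> tuple[str, str]:
    """Split an s3://bucket/key URI into (bucket, key)."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")


def read_s3_text(uri: str) -> str:
    """Read a UTF-8 object straight from S3, bypassing /Volumes.

    Relies on ambient credentials (for example the container's
    instance profile), exactly the access path that already works
    here even when the /Volumes mount does not.
    """
    import boto3  # imported lazily so split_s3_uri has no AWS dependency
    bucket, key = split_s3_uri(uri)
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return obj["Body"].read().decode("utf-8")
```

&lt;P&gt;Keeping the URI parsing separate from the boto3 call leaves the path logic testable without AWS access.&lt;/P&gt;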
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;References&lt;/H2&gt;
&lt;DIV class="group relative"&gt;
&lt;DIV class="w-full overflow-x-auto md:max-w-[90vw] border-subtlest ring-subtlest divide-subtlest bg-transparent"&gt;
&lt;TABLE class="border-subtler my-[1em] w-full table-auto border-separate border-spacing-0 border-l border-t"&gt;
&lt;THEAD class="bg-subtler"&gt;
&lt;TR&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;Feature&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;Databricks Native Cluster&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;REST API Custom Container&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;External Container Service&lt;/TH&gt;
&lt;/TR&gt;
&lt;/THEAD&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Access&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;/Volumes&lt;/CODE&gt;&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Direct S3 access&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Unity Catalog access&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;/DIV&gt;
&lt;DIV class="bg-base border-subtler shadow-subtle pointer-coarse:opacity-100 right-xs absolute bottom-0 flex rounded-lg border opacity-0 transition-opacity group-hover:opacity-100 [&amp;amp;&amp;gt;*:not(:first-child)]:border-subtle [&amp;amp;&amp;gt;*:not(:first-child)]:border-l"&gt;
&lt;DIV class="flex"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="flex"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
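&lt;P&gt;One option the table does not cover: even where the &lt;CODE&gt;/Volumes&lt;/CODE&gt; FUSE path is unavailable, UC volume files can often be fetched over HTTPS with the Files API via the databricks-sdk (the approach rcdatabricks links elsewhere in this thread). A hedged sketch, assuming the container can reach the workspace URL and holds a valid token; the catalog/schema/volume names are placeholders:&lt;/P&gt;

```python
def uc_volume_path(catalog: str, schema: str, volume: str, *parts: str) -> str:
    """Build a /Volumes/{catalog}/{schema}/{volume}/... path string.

    The names passed in are placeholders for your own UC objects.
    """
    return "/".join(["/Volumes", catalog, schema, volume, *parts])


def read_volume_file(path: str) -> bytes:
    """Fetch a UC volume file over HTTPS via the Files API.

    Needs `pip install databricks-sdk` plus DATABRICKS_HOST and
    DATABRICKS_TOKEN (or a configured profile) inside the container;
    no FUSE mount is involved, so this can work outside Databricks compute.
    """
    from databricks.sdk import WorkspaceClient  # lazy: only needed at call time
    w = WorkspaceClient()
    resp = w.files.download(path)  # GET /api/2.0/fs/files/{path}
    return resp.contents.read()
```

&lt;P&gt;Because this goes through the REST control plane rather than a mount, Unity Catalog permissions are enforced server-side and no Databricks filesystem overlay is required.&lt;/P&gt;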
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Summary&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Volumes and UC paths are intentionally unavailable in Databricks jobs executed via container services or externalized REST API-launched containers.&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Official documentation for Unity Catalog and data access paths explicitly limits access to Databricks-managed clusters and does not support mounting within containers that lack the Databricks FUSE/VFS environment.&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Direct S3 access remains available everywhere you have credentials, and is the officially recommended approach for hybrid workloads.&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;</description>
    <pubDate>Thu, 06 Nov 2025 11:38:18 GMT</pubDate>
    <dc:creator>mark_ott</dc:creator>
    <dc:date>2025-11-06T11:38:18Z</dc:date>
    <item>
      <title>Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/113812#M3185</link>
      <description>&lt;P class=""&gt;Hi everyone,&lt;/P&gt;&lt;P class=""&gt;We’re facing a strange issue when trying to access a &lt;SPAN class=""&gt;&lt;STRONG&gt;Databricks Volume&lt;/STRONG&gt;&lt;/SPAN&gt; from a job that is triggered via the &lt;SPAN class=""&gt;&lt;STRONG&gt;Databricks REST API&lt;/STRONG&gt;&lt;/SPAN&gt; (not via Workflows). These jobs are executed using &lt;SPAN class=""&gt;&lt;STRONG&gt;container services&lt;/STRONG&gt;&lt;/SPAN&gt;, which may be relevant, perhaps due to isolation constraints that prevent access to certain Databricks-native features.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P class=""&gt;The job runs, but when we try to perform a basic file operation like:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;with open("/Volumes/folder_example/file.txt", "r") as f:
    data = f.read()&lt;/LI-CODE&gt;&lt;P class=""&gt;We get the following error:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;PermissionError: [Errno 1] Operation not permitted: '/Volumes/folder_example/file.txt'&lt;/LI-CODE&gt;&lt;P class=""&gt;We increased the log level and got more detail in the traceback:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;TaskException: Task in task_example failed: An error occurred while calling o424.load.
: com.databricks.backend.daemon.data.common.InvalidMountException: Error while using path /Volumes/folder_example/ for creating file system within mount at '/Volumes/folder_example/'.
	at com.databricks.backend.daemon.data.common.InvalidMountException$.apply(DataMessages.scala:765)&lt;/LI-CODE&gt;&lt;P class=""&gt;From the error message and behavior, we suspect this could be related to how container services isolate the job’s execution environment, possibly preventing it from accessing &lt;SPAN class=""&gt;&lt;STRONG&gt;Unity Catalog Volumes&lt;/STRONG&gt;&lt;/SPAN&gt;, since these mounts may not be available or reachable outside of a native Databricks execution context.&lt;/P&gt;&lt;P class=""&gt;However, we haven’t found official documentation clearly explaining &lt;SPAN class=""&gt;&lt;STRONG&gt;whether Databricks Volumes can be accessed in jobs triggered this way&lt;/STRONG&gt;&lt;/SPAN&gt;, or under which conditions access is denied.&lt;/P&gt;&lt;P class=""&gt;We can access the &lt;SPAN class=""&gt;&lt;STRONG&gt;same data directly from S3 using the instance profile&lt;/STRONG&gt;&lt;/SPAN&gt; without any issues. This is expected, since the S3 path is accessed directly via the instance profile credentials.&lt;/P&gt;&lt;P class=""&gt;•&amp;nbsp;&lt;SPAN class=""&gt;Are Volumes intentionally &lt;/SPAN&gt;&lt;STRONG&gt;not accessible from container services&lt;/STRONG&gt;?&lt;/P&gt;&lt;P class=""&gt;•&amp;nbsp;&lt;SPAN class=""&gt;Is there any official documentation detailing &lt;/SPAN&gt;&lt;STRONG&gt;execution contexts and their access to UC/Volumes/Workspace paths&lt;/STRONG&gt;&lt;SPAN class=""&gt;?&lt;/SPAN&gt;&lt;/P&gt;&lt;P class=""&gt;Thanks in advance! &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;BR /&gt;&lt;BR /&gt;Isi&lt;/P&gt;</description>
      <pubDate>Thu, 27 Mar 2025 14:19:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/113812#M3185</guid>
      <dc:creator>Isi</dc:creator>
      <dc:date>2025-03-27T14:19:58Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/114872#M3221</link>
      <description>&lt;P&gt;Is the Volume mounted using Unity Catalog?&lt;BR /&gt;&lt;BR /&gt;Does the user/service principal used as the job's run_as identity have the required access on the volume?&lt;/P&gt;</description>
      <pubDate>Wed, 09 Apr 2025 03:08:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/114872#M3221</guid>
      <dc:creator>rcdatabricks</dc:creator>
      <dc:date>2025-04-09T03:08:11Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/115319#M3241</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/155368"&gt;@rcdatabricks&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;Yes, it's a Unity Catalog volume and the permissions are in place. I am able to access it from a notebook or a job, but not using these container services...&lt;BR /&gt;&lt;BR /&gt;Any idea?&lt;BR /&gt;Thanks,&lt;BR /&gt;Isi&lt;/P&gt;</description>
      <pubDate>Sat, 12 Apr 2025 09:55:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/115319#M3241</guid>
      <dc:creator>Isi</dc:creator>
      <dc:date>2025-04-12T09:55:27Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/116075#M3276</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/145555"&gt;@Isi&lt;/a&gt;&amp;nbsp;Are you using the databricks-sdk library to access these volumes?&lt;BR /&gt;For example:&amp;nbsp;&lt;BR /&gt;&lt;A href="https://docs.databricks.com/aws/en/dev-tools/sdk-python#files-in-volumes:~:text=Catalog%20volume.-,Python,-from%20databricks" target="_blank"&gt;https://docs.databricks.com/aws/en/dev-tools/sdk-python#files-in-volumes:~:text=Catalog%20volume.-,Python,-from%20databricks&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Mon, 21 Apr 2025 13:48:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/116075#M3276</guid>
      <dc:creator>rcdatabricks</dc:creator>
      <dc:date>2025-04-21T13:48:29Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/124311#M3593</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/145555"&gt;@Isi&lt;/a&gt;&amp;nbsp;Did you find a solution to this issue? I am facing the exact same problem right now.&lt;/P&gt;</description>
      <pubDate>Mon, 07 Jul 2025 11:02:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/124311#M3593</guid>
      <dc:creator>rxj</dc:creator>
      <dc:date>2025-07-07T11:02:03Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/124790#M3623</link>
      <description>&lt;P&gt;Check also whether the cluster used to run the job has the right access to the specific UC Volume.&lt;/P&gt;</description>
      <pubDate>Thu, 10 Jul 2025 13:25:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/124790#M3623</guid>
      <dc:creator>Octavian1</dc:creator>
      <dc:date>2025-07-10T13:25:40Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/124837#M3628</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/174022"&gt;@rxj&lt;/a&gt;&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/96336"&gt;@Octavian1&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;No &lt;span class="lia-unicode-emoji" title=":disappointed_face:"&gt;😞&lt;/span&gt; I didn't, but a volume is just a door to the storage, so we ended up reading the path directly with boto3.&lt;/P&gt;</description>
      <pubDate>Thu, 10 Jul 2025 20:32:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/124837#M3628</guid>
      <dc:creator>Isi</dc:creator>
      <dc:date>2025-07-10T20:32:38Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to access Databricks Volume from job triggered via API (Container Services)</title>
      <link>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/137936#M4410</link>
      <description>&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Databricks Volumes (especially Unity Catalog, or UC, volumes) have strict execution-context requirements: they expect the workload to run on Databricks-managed clusters or in notebooks, where the specialized file system and security context are present. Your description matches a known friction point: container services may not fully replicate the Databricks runtime's native context, so access to Volumes fails even though credentialed access to raw S3 works fine.&lt;/P&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Are Volumes Intentionally Not Accessible from Container Services?&lt;/H2&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Yes, this is generally by design as of late 2025. Databricks Volumes and UC volumes are mounted using filesystem drivers and security controls that expect a Databricks-managed execution context. When jobs are run in externalized container services (like Docker/Podman containers spun up outside the Databricks control plane), the Volumes file system layer is often unavailable:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Native "/Volumes" and Unity Catalog paths depend on Databricks' FUSE/DBFS/VFS overlays.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;These overlays are only present in Databricks-provisioned environments (clusters, serverless compute, or the interactive Databricks notebook context).&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Externalized workloads via Databricks Container Services or custom drivers (like REST API-triggered containers or Kubernetes pods) typically do NOT have direct access to these overlays, leading to permission errors or mount failures.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Official Documentation on Execution Contexts and Volume Access&lt;/H2&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Databricks documentation does clarify this restriction, but it is often scattered:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Unity Catalog and Volumes&lt;/STRONG&gt;: The official Unity Catalog documentation notes that Volumes are only accessible from Databricks clusters with Unity Catalog enabled, and not from all external interfaces. Only Databricks-interactive or workflow clusters can resolve "/Volumes/" paths.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;DBFS and REST API/Containers&lt;/STRONG&gt;: The documentation also notes that paths like "/Volumes", "/mnt" (for legacy DBFS mounts), and related VFS overlays are not available in:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Jobs running on external custom Kubernetes clusters using Databricks Container Services.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Direct REST API containers, or any compute that runs outside the Databricks cluster control plane.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Recommended Approach&lt;/STRONG&gt;: For workloads that need to access data both inside and outside Databricks, official best practice is to:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Use direct access to cloud object storage (e.g., S3 paths) for jobs that may run outside native Databricks compute contexts.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Avoid using "/Volumes" or "/mnt" mounts outside of Databricks-managed clusters.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Permissions&lt;/STRONG&gt;: Even with Unity Catalog privileges and correct instance profile configuration, the FUSE driver and security context are missing in container service jobs; hence, access fails.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
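&lt;P&gt;The direct-to-S3 route recommended above can be sketched with boto3. This is a hedged sketch, not Databricks-official code: the &lt;CODE&gt;s3://&lt;/CODE&gt; URI is a placeholder, and credentials come from the container's ambient environment (e.g. the instance profile this thread confirms already works):&lt;/P&gt;

```python
from urllib.parse import urlparse


def split_s3_uri(uri: str) -> tuple[str, str]:
    """Split an s3://bucket/key URI into (bucket, key)."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")


def read_s3_text(uri: str) -> str:
    """Read a UTF-8 object straight from S3, bypassing /Volumes.

    Relies on ambient credentials (for example the container's
    instance profile), exactly the access path that already works
    here even when the /Volumes mount does not.
    """
    import boto3  # imported lazily so split_s3_uri has no AWS dependency
    bucket, key = split_s3_uri(uri)
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return obj["Body"].read().decode("utf-8")
```

&lt;P&gt;Keeping the URI parsing separate from the boto3 call leaves the path logic testable without AWS access.&lt;/P&gt;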
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;References&lt;/H2&gt;
&lt;DIV class="group relative"&gt;
&lt;DIV class="w-full overflow-x-auto md:max-w-[90vw] border-subtlest ring-subtlest divide-subtlest bg-transparent"&gt;
&lt;TABLE class="border-subtler my-[1em] w-full table-auto border-separate border-spacing-0 border-l border-t"&gt;
&lt;THEAD class="bg-subtler"&gt;
&lt;TR&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;Feature&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;Databricks Native Cluster&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;REST API Custom Container&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;External Container Service&lt;/TH&gt;
&lt;/TR&gt;
&lt;/THEAD&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Access&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;/Volumes&lt;/CODE&gt;&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Direct S3 access&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Unity Catalog access&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;/DIV&gt;
&lt;DIV class="bg-base border-subtler shadow-subtle pointer-coarse:opacity-100 right-xs absolute bottom-0 flex rounded-lg border opacity-0 transition-opacity group-hover:opacity-100 [&amp;amp;&amp;gt;*:not(:first-child)]:border-subtle [&amp;amp;&amp;gt;*:not(:first-child)]:border-l"&gt;
&lt;DIV class="flex"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="flex"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
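&lt;P&gt;One option the table does not cover: even where the &lt;CODE&gt;/Volumes&lt;/CODE&gt; FUSE path is unavailable, UC volume files can often be fetched over HTTPS with the Files API via the databricks-sdk (the approach rcdatabricks links elsewhere in this thread). A hedged sketch, assuming the container can reach the workspace URL and holds a valid token; the catalog/schema/volume names are placeholders:&lt;/P&gt;

```python
def uc_volume_path(catalog: str, schema: str, volume: str, *parts: str) -> str:
    """Build a /Volumes/{catalog}/{schema}/{volume}/... path string.

    The names passed in are placeholders for your own UC objects.
    """
    return "/".join(["/Volumes", catalog, schema, volume, *parts])


def read_volume_file(path: str) -> bytes:
    """Fetch a UC volume file over HTTPS via the Files API.

    Needs `pip install databricks-sdk` plus DATABRICKS_HOST and
    DATABRICKS_TOKEN (or a configured profile) inside the container;
    no FUSE mount is involved, so this can work outside Databricks compute.
    """
    from databricks.sdk import WorkspaceClient  # lazy: only needed at call time
    w = WorkspaceClient()
    resp = w.files.download(path)  # GET /api/2.0/fs/files/{path}
    return resp.contents.read()
```

&lt;P&gt;Because this goes through the REST control plane rather than a mount, Unity Catalog permissions are enforced server-side and no Databricks filesystem overlay is required.&lt;/P&gt;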
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Summary&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Volumes and UC paths are intentionally unavailable in Databricks jobs executed via container services or externalized REST API-launched containers.&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Official documentation for Unity Catalog and data access paths explicitly limits access to Databricks-managed clusters and does not support mounting within containers that lack the Databricks FUSE/VFS environment.&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;&lt;STRONG&gt;Direct S3 access remains available everywhere you have credentials, and is the officially recommended approach for hybrid workloads.&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Thu, 06 Nov 2025 11:38:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unable-to-access-databricks-volume-from-job-triggered-via-api/m-p/137936#M4410</guid>
      <dc:creator>mark_ott</dc:creator>
      <dc:date>2025-11-06T11:38:18Z</dc:date>
    </item>
  </channel>
</rss>

