<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Workspace Client dbutils issue in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/workspace-client-dbutils-issue/m-p/139297#M51140</link>
    <description>&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;The error where files and directories can be read at the root ADLS level but not at the blob/subdirectory level, combined with a "No file or directory exists on path" message, is frequently due to permission configuration, incorrect path usage, or networking restrictions in Azure Databricks.​&lt;/P&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Troubleshooting Permissions and Access&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Databricks may require "Storage Blob Data Reader" or "Contributor" roles at the storage account level, not just at the container or subdirectory level, especially when using Databricks Compute or serverless workloads.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Unity Catalog and workspace browsing may succeed with container-level permissions, but DBFS commands or direct SDK access from code (like your use of WorkspaceClient) may need broader account-level access due to how requests are made.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If multiple Service Principals are used, ensure the correct one with proper access is referenced.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Potential bugs or limitations exist where even with correct setup, Databricks fails to access subdirectories unless permissions are provided at the top storage account level.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Common Path and Configuration Issues&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Double-check the&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;abfss://&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;path syntax is correct. Ensure there are no typos or encoding errors; the path must be formatted exactly for ADLS Gen2.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;The difference between mounting a container to DBFS and direct access can affect success. If not mounted, ensure Spark or DatabricksConnect configurations explicitly support direct ADLS access.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If using DBFS mounts (&lt;CODE&gt;/mnt/*&lt;/CODE&gt;), check if remounting or restarting the cluster helps after rotating keys or permissions.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Network and Firewall Considerations&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If the storage account restricts access to specific VNets, IPs, or uses Private Endpoints, the Databricks cluster must be correctly networked. Sometimes allowing traffic from all networks momentarily can confirm if this is the blocker.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Verify that the workspace, cluster, and storage account networking settings (VNet integration, service endpoints, firewalls) explicitly allow Databricks traffic.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Recommendations&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Ensure "Storage Blob Data Reader" is assigned at the storage account level (not just to the container/subdirectory), at least for testing.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Validate the path and string encoding for your&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;abfss://...&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;references.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Cross-check which Service Principal or Managed Identity is being used; permissions must match.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If working with DBFS mounts, try remounting and restarting the cluster after role changes.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Review all networking and firewall settings if you suspect connectivity issues.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If these suggestions do not resolve the issue, it is possible you are encountering a limitation or bug with how Databricks Connect or the WorkspaceClient makes permission checks for blob-level paths, which may require raising a support ticket or checking with Databricks for known issues with container-level or subdirectory access&lt;/P&gt;</description>
    <pubDate>Mon, 17 Nov 2025 11:28:03 GMT</pubDate>
    <dc:creator>mark_ott</dc:creator>
    <dc:date>2025-11-17T11:28:03Z</dc:date>
    <item>
      <title>Workspace Client dbutils issue</title>
      <link>https://community.databricks.com/t5/data-engineering/workspace-client-dbutils-issue/m-p/83971#M37088</link>
      <description>&lt;P&gt;We are using Databricks Connect:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;host = "https://adb-xxxxxx.xx.azuredatabricks.net"
token = "dapxxxxxxx"

from databricks.sdk import WorkspaceClient

dbutil = WorkspaceClient(host=host, token=token).dbutils
files = dbutil.fs.ls("abfss://container-name@storage-account-name.dfs.core.windows.net/some_path")
print(files)&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I'm able to read the directories/files at the root level, but not at the blob level. I have permissions to the ADLS external location.&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;DatabricksError: No file or directory exists on path container-name@storage-account-name.dfs.core.windows.net/some_path.
DatabricksError                          Traceback (most recent call last)
File &amp;lt;command-2860011694601089&amp;gt;, line 7
      4 from databricks.sdk import WorkspaceClient
      6 dbutil = WorkspaceClient(host=host, token=token).dbutils
----&amp;gt; 7 files = dbutil.fs.ls("abfss://container-name@storage-account-name.dfs.core.windows.net/some_path")
      8 print(files)
File /databricks/python/lib/python3.10/site-packages/databricks/sdk/dbutils.py:52, in _FsUtil.ls(self, dir)
     50 """Lists the contents of a directory"""
     51 result = []
---&amp;gt; 52 for f in self._dbfs.list(dir):
     53     name = f.path.split('/')[-1]
     54     result.append(FileInfo(f'dbfs:{f.path}', name, f.file_size, f.modification_time))
File /databricks/python/lib/python3.10/site-packages/databricks/sdk/mixins/dbfs.py:331, in DbfsExt.list(self, path, recursive)
    329 while queue:
    330     path, queue = queue[0], queue[1:]
--&amp;gt; 331 for file_info in super().list(path):
    332     if recursive and file_info.is_dir:&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;However, I am able to list the contents using dbutils.fs commands the regular way.&lt;/P&gt;&lt;P&gt;Reference Link:&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/databricks-utilities" target="_blank" rel="noopener"&gt;https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect/python/databricks-utilities&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 22 Aug 2024 20:38:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/workspace-client-dbutils-issue/m-p/83971#M37088</guid>
      <dc:creator>SrinuM</dc:creator>
      <dc:date>2024-08-22T20:38:17Z</dc:date>
    </item>
    <item>
      <title>Re: Workspace Client  dbutils issue</title>
      <link>https://community.databricks.com/t5/data-engineering/workspace-client-dbutils-issue/m-p/139297#M51140</link>
      <description>&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;The error where files and directories can be read at the root ADLS level but not at the blob/subdirectory level, combined with a "No file or directory exists on path" message, is frequently due to permission configuration, incorrect path usage, or networking restrictions in Azure Databricks.​&lt;/P&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Troubleshooting Permissions and Access&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Databricks may require "Storage Blob Data Reader" or "Contributor" roles at the storage account level, not just at the container or subdirectory level, especially when using Databricks Compute or serverless workloads.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Unity Catalog and workspace browsing may succeed with container-level permissions, but DBFS commands or direct SDK access from code (like your use of WorkspaceClient) may need broader account-level access due to how requests are made.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If multiple Service Principals are used, ensure the correct one with proper access is referenced.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Potential bugs or limitations exist where even with correct setup, Databricks fails to access subdirectories unless permissions are provided at the top storage account level.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
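To test the account-level theory quickly, the role can be granted at the storage-account scope with the Azure CLI. This is a sketch only: the principal object ID, subscription ID, resource group, and storage account name below are placeholders, not values from this thread.

```shell
# Grant "Storage Blob Data Reader" at the storage-account scope (not just the container).
# PRINCIPAL_OBJECT_ID, SUBSCRIPTION_ID, my-rg, and mystorageacct are placeholders.
az role assignment create \
  --assignee "PRINCIPAL_OBJECT_ID" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/SUBSCRIPTION_ID/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mystorageacct"

# Verify what is currently assigned at that scope:
az role assignment list \
  --scope "/subscriptions/SUBSCRIPTION_ID/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mystorageacct" \
  --output table
```

If listing at the blob level starts working after the account-scope assignment, the original container-level grant was the gap.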
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Common Path and Configuration Issues&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Double-check the&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;abfss://&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;path syntax is correct. Ensure there are no typos or encoding errors; the path must be formatted exactly for ADLS Gen2.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;The difference between mounting a container to DBFS and direct access can affect success. If not mounted, ensure Spark or DatabricksConnect configurations explicitly support direct ADLS access.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If using DBFS mounts (&lt;CODE&gt;/mnt/*&lt;/CODE&gt;), check if remounting or restarting the cluster helps after rotating keys or permissions.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
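As a sanity check on the path format, here is a small sketch in plain Python (no Databricks dependencies) that validates an abfss URI against the expected container-name@storage-account.dfs.core.windows.net shape; the helper name parse_abfss is hypothetical, not from any SDK.

```python
import re

# Validates the expected ADLS Gen2 URI shape:
#   abfss://container@account.dfs.core.windows.net/optional/path
# Container names: 3-63 chars of lowercase letters, digits, and hyphens;
# account names: 3-24 chars of lowercase letters and digits.
_ABFSS = re.compile(
    r"^abfss://([a-z0-9](?:[a-z0-9-]{1,61}[a-z0-9]))"
    r"@([a-z0-9]{3,24})\.dfs\.core\.windows\.net(/.*)?$"
)

def parse_abfss(uri):
    """Return (container, account, path) or raise ValueError."""
    m = _ABFSS.match(uri)
    if m is None:
        raise ValueError("not a valid abfss:// ADLS Gen2 URI: " + uri)
    container, account, path = m.group(1), m.group(2), m.group(3)
    return container, account, path or "/"

print(parse_abfss(
    "abfss://container-name@storageacct.dfs.core.windows.net/some_path"
))
# prints ('container-name', 'storageacct', '/some_path')
```

Running this over every path string before handing it to dbutil.fs.ls catches a wasbs:// scheme, a .blob. endpoint instead of .dfs., or a stray character before the error surfaces as a misleading "No file or directory exists" message.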
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Network and Firewall Considerations&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If the storage account restricts access to specific VNets, IPs, or uses Private Endpoints, the Databricks cluster must be correctly networked. Sometimes allowing traffic from all networks momentarily can confirm if this is the blocker.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Verify that the workspace, cluster, and storage account networking settings (VNet integration, service endpoints, firewalls) explicitly allow Databricks traffic.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
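The firewall state can be inspected without changing anything, again via the Azure CLI; mystorageacct and my-rg are placeholder names.

```shell
# Show the firewall default action (Allow vs Deny) for the storage account.
# mystorageacct and my-rg are placeholders.
az storage account show \
  --name mystorageacct \
  --resource-group my-rg \
  --query "networkRuleSet.defaultAction" \
  --output tsv

# List the VNet and IP rules currently allowed through the firewall:
az storage account network-rule list \
  --account-name mystorageacct \
  --resource-group my-rg
```

A default action of Deny with no rule covering the Databricks-side network would explain access that works from the workspace UI (which may route differently) but fails from Databricks Connect.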
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Recommendations&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Ensure "Storage Blob Data Reader" is assigned at the storage account level (not just to the container/subdirectory), at least for testing.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Validate the path and string encoding for your&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;abfss://...&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;references.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Cross-check which Service Principal or Managed Identity is being used; permissions must match.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If working with DBFS mounts, try remounting and restarting the cluster after role changes.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Review all networking and firewall settings if you suspect connectivity issues.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;If these suggestions do not resolve the issue, it is possible you are encountering a limitation or bug with how Databricks Connect or the WorkspaceClient makes permission checks for blob-level paths, which may require raising a support ticket or checking with Databricks for known issues with container-level or subdirectory access&lt;/P&gt;</description>
      <pubDate>Mon, 17 Nov 2025 11:28:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/workspace-client-dbutils-issue/m-p/139297#M51140</guid>
      <dc:creator>mark_ott</dc:creator>
      <dc:date>2025-11-17T11:28:03Z</dc:date>
    </item>
  </channel>
</rss>

