<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/101087#M2463</link>
    <description>&lt;P&gt;HTTP 403 is the correct response, as Databricks is forbidden from accessing the resource.&amp;nbsp; You need to add your VNet to the allowlist for this to work.&lt;/P&gt;</description>
    <pubDate>Thu, 05 Dec 2024 14:54:50 GMT</pubDate>
    <dc:creator>Rjdudley</dc:creator>
    <dc:date>2024-12-05T14:54:50Z</dc:date>
    <item>
      <title>Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/82445#M1535</link>
      <description>&lt;P&gt;&lt;STRONG&gt;The problem&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;After setting up Unity Catalog and a managed Volume, I can upload/download files to/from the volume in the Databricks Workspace UI.&lt;/P&gt;&lt;P&gt;However, I cannot access the volume from a notebook. I created an All-purpose compute and ran dbutils.fs.ls("/Volumes/catalog1/schema1/volumn11"), and got the error:&lt;/P&gt;&lt;BLOCKQUOTE&gt;&lt;P&gt;Operation failed: "This request is not authorized to perform this operation.", 403, GET&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;&lt;STRONG&gt;How we set up Unity Catalog and the Managed Volume&lt;/STRONG&gt;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;I am the Azure Databricks Account Admin, Metastore Admin, and Workspace Admin.&lt;/LI&gt;&lt;LI&gt;I created an Azure Databricks Workspace (Premium Tier).&lt;/LI&gt;&lt;LI&gt;I created a Databricks Metastore, named &lt;FONT color="#339966"&gt;metastore1&lt;/FONT&gt;.&lt;/LI&gt;&lt;LI&gt;I created an Azure ADLS Gen2 storage account (Hierarchical namespace enabled), named &lt;FONT color="#339966"&gt;adsl_gen2_1&lt;/FONT&gt;.&lt;/LI&gt;&lt;LI&gt;I created an Azure Access Connector for Azure Databricks (an Azure Managed Identity), named &lt;FONT color="#339966"&gt;access_connector_for_dbr_1&lt;/FONT&gt;.&lt;/LI&gt;&lt;LI&gt;On &lt;FONT color="#339966"&gt;adsl_gen2_1&lt;/FONT&gt;, I assigned the roles Storage Blob Data Contributor and Storage Queue Data Contributor to &lt;FONT color="#339966"&gt;access_connector_for_dbr_1&lt;/FONT&gt;.&lt;/LI&gt;&lt;LI&gt;I created two ADLS Gen2 containers under &lt;FONT color="#339966"&gt;adsl_gen2_1&lt;/FONT&gt;:&lt;UL&gt;&lt;LI&gt;one named &lt;FONT color="#339966"&gt;adsl_gen2_1_container_catalog_default&lt;/FONT&gt;&lt;/LI&gt;&lt;LI&gt;another named &lt;FONT color="#339966"&gt;adsl_gen2_1_container_schema1&lt;/FONT&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;I created a Databricks Storage Credential, named &lt;FONT color="#339966"&gt;dbr_strg_cred_1&lt;/FONT&gt;:&lt;UL&gt;&lt;LI&gt;its connector id is the resource id of &lt;FONT color="#339966"&gt;access_connector_for_dbr_1&lt;/FONT&gt;&lt;/LI&gt;&lt;LI&gt;the Permissions of the Storage Credential were not set (empty)&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;I created two Databricks External Locations, both using &lt;FONT color="#339966"&gt;dbr_strg_cred_1&lt;/FONT&gt;:&lt;UL&gt;&lt;LI&gt;one named &lt;FONT color="#339966"&gt;dbr_ext_loc_catalog_default&lt;/FONT&gt;, pointing to the container &lt;FONT color="#339966"&gt;adsl_gen2_1_container_catalog_default&lt;/FONT&gt;; its Permissions were not set (empty)&lt;/LI&gt;&lt;LI&gt;another named &lt;FONT color="#339966"&gt;dbr_ext_loc_schema1&lt;/FONT&gt;, pointing to the container &lt;FONT color="#339966"&gt;adsl_gen2_1_container_schema1&lt;/FONT&gt;; its Permissions were not set (empty)&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;I created a Databricks Catalog, named &lt;FONT color="#339966"&gt;catalog1&lt;/FONT&gt;, under &lt;FONT color="#339966"&gt;metastore1&lt;/FONT&gt;, and set &lt;FONT color="#339966"&gt;dbr_ext_loc_catalog_default&lt;/FONT&gt; as the catalog's Storage Location.&lt;/LI&gt;&lt;LI&gt;I created a Databricks Schema, named &lt;FONT color="#339966"&gt;schema1&lt;/FONT&gt;, under &lt;FONT color="#339966"&gt;catalog1&lt;/FONT&gt;, and set &lt;FONT color="#339966"&gt;dbr_ext_loc_schema1&lt;/FONT&gt; as the schema's Storage Location.&lt;/LI&gt;&lt;LI&gt;I created a Databricks Volume, named &lt;FONT color="#339966"&gt;volumn11&lt;/FONT&gt;, under &lt;FONT color="#339966"&gt;schema1&lt;/FONT&gt;.&lt;/LI&gt;&lt;LI&gt;In the Databricks UI, I can upload files to and download files from &lt;FONT color="#339966"&gt;volumn11&lt;/FONT&gt;.&lt;/LI&gt;&lt;LI&gt;However, when I created an All-purpose compute and ran the Python code below, I always got the error "Operation failed: "This request is not authorized to perform this operation.", 403, GET":&lt;UL&gt;&lt;LI&gt;dbutils.fs.ls("/Volumes/catalog1/schema1/volumn11")&lt;/LI&gt;&lt;LI&gt;dbutils.fs.ls("dbfs:/Volumes/catalog1/schema1/volumn11")&lt;/LI&gt;&lt;LI&gt;spark.read.format("csv").option("header","True").load("/Volumes/catalog1/schema1/volumn11/123.csv")&lt;/LI&gt;&lt;LI&gt;spark.read.format("csv").option("header","True").load("dbfs:/Volumes/catalog1/schema1/volumn11/123.csv")&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;STRONG&gt;Details about the All-purpose compute&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Type: Single node&lt;/LI&gt;&lt;LI&gt;Access mode: Single user&lt;/LI&gt;&lt;LI&gt;Single user access: myself&lt;/LI&gt;&lt;LI&gt;Runtime version: 14.3 LTS&lt;/LI&gt;&lt;LI&gt;Enable credential passthrough for user-level data access: disabled&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Thu, 08 Aug 2024 21:42:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/82445#M1535</guid>
      <dc:creator>AlbertWang</dc:creator>
      <dc:date>2024-08-08T21:42:08Z</dc:date>
    </item>
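Editor's note: a way to read the error in the post above is that the 403 text ("This request is not authorized to perform this operation") comes from Azure Storage itself, which means the request cleared Unity Catalog authorization and was then blocked by the storage account's network rules. The sketch below triages the message along those lines; the string matching is a heuristic based on the messages quoted in this thread, not a documented API:

```python
# Heuristic triage for the 403 seen when listing a UC volume from a notebook.
# The matched phrases come from this thread; this is a sketch, not an API.

def classify_volume_error(message: str) -> str:
    """Map an error from dbutils.fs.ls on a /Volumes path to a likely cause."""
    msg = message.lower()
    if "this request is not authorized to perform this operation" in msg:
        # Azure Storage returned 403: the request reached ADLS Gen2 but was
        # rejected by its network rules (firewall/VNet), not by Unity Catalog.
        return "storage-network"
    if "permission_denied" in msg or "insufficient privileges" in msg:
        # Unity Catalog grants (e.g. READ VOLUME) are the likelier culprit.
        return "unity-catalog-grants"
    return "unknown"

print(classify_volume_error(
    'Operation failed: "This request is not authorized to perform this operation.", 403, GET'
))  # -> storage-network
```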
    <item>
      <title>Re: Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/82452#M1536</link>
      <description>&lt;P&gt;I found the reason and a solution, but I feel this is a bug, and I wonder what the best practice is.&lt;/P&gt;&lt;P&gt;When I enable the ADLS Gen2 account's Public network access from all networks, as shown below, I &lt;STRONG&gt;can&lt;/STRONG&gt; access the volume from a notebook.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="1.png" style="width: 584px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/10248i958E401E67AFF871/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="1.png" alt="1.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;However, if I enable the ADLS Gen2 account's Public network access from selected virtual networks and IP addresses, as shown below, I &lt;STRONG&gt;cannot&lt;/STRONG&gt; access the volume from a notebook, even though I added the VM's public IP to the allowlist, added the resource Microsoft.Databricks/accessConnectors to the resource instances, and enabled the exception "Allow Azure services on the trusted services list to access this storage account". As I understand it, because my compute has the Unity Catalog badge, it reaches the ADLS Gen2 account via the Access Connector for Azure Databricks (Managed Identity), so the request should be authorized.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2.png" style="width: 604px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/10249iC16132009F6CC9EB/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="2.png" alt="2.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="3.png" style="width: 999px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/10250i887ECB422368FD84/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="3.png" alt="3.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 08 Aug 2024 23:27:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/82452#M1536</guid>
      <dc:creator>AlbertWang</dc:creator>
      <dc:date>2024-08-08T23:27:35Z</dc:date>
    </item>
    <item>
      <title>Re: Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/97458#M2222</link>
      <description>&lt;P&gt;I had this exact issue, though for me the problem was that I had not configured private endpoints for the "dfs" and "queue" services, only for "blob". Once I added the missing private endpoints, I could list and write to the catalog from a notebook without issues.&lt;/P&gt;</description>
      <pubDate>Mon, 04 Nov 2024 06:59:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/97458#M2222</guid>
      <dc:creator>RichardDriven</dc:creator>
      <dc:date>2024-11-04T06:59:46Z</dc:date>
    </item>
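Editor's note: the reply above turns on which private-endpoint subresources (group IDs) exist for the storage account. A minimal sketch of that check, assuming the three group IDs named in the reply are the ones required ("dfs" is the ADLS Gen2 endpoint, "queue" is used alongside blob events); everything else here is illustrative:

```python
# Group IDs (private-endpoint subresources) the reply says were needed for a
# UC managed volume on ADLS Gen2. "blob" alone was not enough.

REQUIRED_GROUP_IDS = {"blob", "dfs", "queue"}

existing = {"blob"}  # what the poster had configured initially
missing = REQUIRED_GROUP_IDS - existing

print(sorted(missing))  # -> ['dfs', 'queue']  (the endpoints still to create)
```

Each missing group ID corresponds to one additional private endpoint on the same storage account, each with its own DNS zone (`privatelink.dfs.core.windows.net`, `privatelink.queue.core.windows.net`).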
    <item>
      <title>Re: Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/100270#M2410</link>
      <description>&lt;P&gt;Thank you for this answer! I had exactly the same issue and your post solved my problem.&lt;/P&gt;&lt;P&gt;It really shouldn't throw a 403 error if that is the issue.&lt;/P&gt;</description>
      <pubDate>Wed, 27 Nov 2024 18:11:09 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/100270#M2410</guid>
      <dc:creator>jhruzik</dc:creator>
      <dc:date>2024-11-27T18:11:09Z</dc:date>
    </item>
    <item>
      <title>Re: Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/101086#M2462</link>
      <description>&lt;P&gt;No no no, don't do this!&amp;nbsp; You should have your Databricks workspace running in a VNet (ref:&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/vnet-inject" target="_blank"&gt;Deploy Azure Databricks in your Azure virtual network (VNet injection) - Azure Databricks | Microsoft Learn&lt;/A&gt;).&lt;/P&gt;&lt;P&gt;You then select "Enabled from selected virtual networks and IP addresses" and add your VNet to the allowlist.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Rjdudley_0-1733410358405.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/13341i3EAD30BD27696B65/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Rjdudley_0-1733410358405.png" alt="Rjdudley_0-1733410358405.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;When you set up Serverless compute, you will be given a list of VNets to add to this list; add those here as well.&lt;/P&gt;</description>
      <pubDate>Thu, 05 Dec 2024 14:53:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/101086#M2462</guid>
      <dc:creator>Rjdudley</dc:creator>
      <dc:date>2024-12-05T14:53:56Z</dc:date>
    </item>
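Editor's note: the advice above maps onto the storage account's `networkAcls` settings in ARM terms. A hedged sketch of what that block looks like with a VNet-injected workspace subnet and the access connector allowlisted; every subscription, tenant, and resource name below is a placeholder, not a value from this thread:

```python
# Sketch of the ARM "networkAcls" settings matching the reply: "Enabled from
# selected virtual networks and IP addresses", the workspace subnet added as a
# virtual network rule, and the Databricks access connector as a resource
# instance. All IDs are placeholders.

sub = "00000000-0000-0000-0000-000000000000"   # placeholder subscription id
rg = "my-rg"                                   # placeholder resource group

subnet_id = (
    f"/subscriptions/{sub}/resourceGroups/{rg}/providers/Microsoft.Network"
    "/virtualNetworks/dbr-vnet/subnets/dbr-private-subnet"  # placeholder names
)
connector_id = (
    f"/subscriptions/{sub}/resourceGroups/{rg}/providers"
    "/Microsoft.Databricks/accessConnectors/access_connector_for_dbr_1"
)

network_acls = {
    "defaultAction": "Deny",            # i.e. "selected networks" mode
    "bypass": "AzureServices",          # trusted-services exception
    "virtualNetworkRules": [{"id": subnet_id}],  # VNet-injected subnet(s)
    "resourceAccessRules": [
        {"tenantId": "11111111-1111-1111-1111-111111111111",  # placeholder
         "resourceId": connector_id}
    ],
}
```

Note the allowlist works at subnet granularity: with VNet injection you add both workspace subnets (host and container), and as the reply says, serverless compute later supplies additional subnets to append here.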
    <item>
      <title>Re: Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/101087#M2463</link>
      <description>&lt;P&gt;HTTP 403 is the correct response, as Databricks is forbidden from accessing the resource.&amp;nbsp; You need to add your VNet to the allowlist for this to work.&lt;/P&gt;</description>
      <pubDate>Thu, 05 Dec 2024 14:54:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-unity-catalog-cannot-access-managed-volume-in/m-p/101087#M2463</guid>
      <dc:creator>Rjdudley</dc:creator>
      <dc:date>2024-12-05T14:54:50Z</dc:date>
    </item>
  </channel>
</rss>

