<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: External Locations to Azure Storage via Private Endpoint in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/external-locations-to-azure-storage-via-private-endpoint/m-p/120961#M3441</link>
    <description>&lt;P&gt;I've read that in the documentation, and when I tried again with an Access Connector for Azure Databricks instead of my own service principal, it worked, shockingly, even with network access on the storage account completely blocked and zero private endpoints. No idea how, but for anyone coming across this: the solution is to use an Access Connector for Azure Databricks.&lt;/P&gt;</description>
    <pubDate>Wed, 04 Jun 2025 18:47:07 GMT</pubDate>
    <dc:creator>Marthinus</dc:creator>
    <dc:date>2025-06-04T18:47:07Z</dc:date>
    <item>
      <title>External Locations to Azure Storage via Private Endpoint</title>
      <link>https://community.databricks.com/t5/administration-architecture/external-locations-to-azure-storage-via-private-endpoint/m-p/120895#M3438</link>
      <description>&lt;P&gt;When working with Azure Databricks (with VNet injection) to connect securely to an Azure Storage account via private endpoint, there are a few places it needs to connect from. First is the VNet that Databricks is injected into, which works well when connecting with a blob client in a notebook.&lt;BR /&gt;Next is serverless compute, configured by following this guide:&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-private-link" target="_blank"&gt;Configure private connectivity from serverless compute - Azure Databricks | Microsoft Learn&lt;/A&gt;. That doesn't seem to work: performing the same operation fails with `This request is not authorized to perform this operation.`&lt;BR /&gt;&lt;BR /&gt;Most importantly, none of this touches the control plane, which Unity Catalog uses for external locations, and I can't find any documentation on how to create a private endpoint to the control plane at all.&lt;BR /&gt;&lt;BR /&gt;Is there any guidance on how to create an external location against a private-endpoint-only Azure Storage account? Trying to create one fails with the error `Failed to access cloud storage: [AbfsRestOperationException] () exceptionTraceId=XXX` and no other details.&lt;/P&gt;</description>
      <pubDate>Wed, 04 Jun 2025 09:27:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/external-locations-to-azure-storage-via-private-endpoint/m-p/120895#M3438</guid>
      <dc:creator>Marthinus</dc:creator>
      <dc:date>2025-06-04T09:27:56Z</dc:date>
    </item>
    <item>
      <title>Re: External Locations to Azure Storage via Private Endpoint</title>
      <link>https://community.databricks.com/t5/administration-architecture/external-locations-to-azure-storage-via-private-endpoint/m-p/120908#M3440</link>
      <description>&lt;DIV class="paragraph"&gt;Here are some things to consider:&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;To securely connect Azure Databricks to an Azure Storage account via a private endpoint using Unity Catalog, here are key considerations and steps aligning with the documentation:&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;Setting up External Locations with Unity Catalog 1. &lt;STRONG&gt;Use Managed Identities&lt;/STRONG&gt;: - Unity Catalog supports storage credentials using Azure managed identities, which eliminate the need for secret rotation and can access storage accounts protected by network rules. Configure the managed identity to have "Storage Blob Data Contributor" or "Storage Blob Delegator" roles on the Azure Storage account.&lt;/DIV&gt;
&lt;OL start="2"&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;&lt;STRONG&gt;Create Storage Credentials&lt;/STRONG&gt;:
&lt;UL&gt;
&lt;LI&gt;Create a storage credential linked to the managed identity. This serves as the authentication mechanism to access the Azure Data Lake Storage Gen2 account.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;&lt;STRONG&gt;Define External Locations&lt;/STRONG&gt;:
&lt;UL&gt;
&lt;LI&gt;Link the storage credential to an external location that specifies the path within the storage account. Access to these external locations can be controlled using dedicated ACLs within Unity Catalog.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;&lt;STRONG&gt;Networking Configuration&lt;/STRONG&gt;:
&lt;UL&gt;
&lt;LI&gt;Ensure private endpoints are established for the storage account to allow access from the Databricks workspace. The private endpoints should cover the appropriate sub-resources: &lt;CODE&gt;dfs&lt;/CODE&gt; for ABFS (Data Lake Storage Gen2) operations and &lt;CODE&gt;blob&lt;/CODE&gt; for Blob API clients.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;&lt;STRONG&gt;Verify Network Rules&lt;/STRONG&gt;:
&lt;UL&gt;
&lt;LI&gt;If public network access is disabled for the storage account, ensure that the managed identity is added to the allowed list of network rules. Alternatively, enable private link connectivity.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/DIV&gt;
&lt;/LI&gt;
&lt;/OL&gt;
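As a sketch of steps 2 and 3 above: the storage credential itself is typically created in the UI or via the account console, while the external location can be created with a SQL statement. A minimal helper that builds that statement (all names here, such as the credential, container, and storage account, are illustrative placeholders, not values from this thread):

```python
def create_external_location_sql(name, container, account, path, credential):
    """Build a Unity Catalog CREATE EXTERNAL LOCATION statement.

    All identifiers are placeholders for illustration; substitute your
    own external location name, container, storage account, and the
    storage credential that wraps the managed identity.
    """
    url = f"abfss://{container}@{account}.dfs.core.windows.net/{path}"
    return (
        f"CREATE EXTERNAL LOCATION IF NOT EXISTS {name} "
        f"URL '{url}' "
        f"WITH (STORAGE CREDENTIAL {credential})"
    )

# On a Databricks cluster this would be executed as:
#   spark.sql(create_external_location_sql(...))
sql = create_external_location_sql(
    "finance_data", "data", "mystorageacct", "finance", "my_credential"
)
print(sql)
```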
&lt;DIV class="paragraph"&gt;Addressing Access Issues (e.g., AbfsRestOperationException) To resolve errors such as "[AbfsRestOperationException] Operation failed: 'This request is not authorized to perform this operation,'" ensure the following: - Validate the storage path against the appropriate external location registered in Unity Catalog, confirming the storage credential has sufficient permissions. - Test connectivity to the storage account from Databricks using tools like &lt;CODE&gt;curl&lt;/CODE&gt; or &lt;CODE&gt;nslookup&lt;/CODE&gt; to confirm private endpoints and network configurations are operational.&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;Private Endpoint Configuration for Unity Catalog Control Plane The control plane in Unity Catalog can utilize back-end private link connections for secure operations when deployed in VNets with private endpoint support: - Each Azure Databricks workspace's control plane can connect privately to core services through back-end private link connections. This guards sensitive control plane traffic against public exposure, enhancing security for governance and metadata management.&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;Best Practices for External Locations - Avoid mounting storage accounts to DBFS directly to ensure Unity Catalog ACLs are enforced. External locations should be used solely with ACLs defined at the Unity Catalog level.&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;These steps and best practices are essential to ensure secure and efficient connectivity between Azure Databricks and Azure Storage accounts via Unity Catalog. For troubleshooting and specific configurations, refer to networking guides and Unity Catalog troubleshooting documentation.&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;Cheers, Louis.&lt;/DIV&gt;</description>
      <pubDate>Wed, 04 Jun 2025 11:26:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/external-locations-to-azure-storage-via-private-endpoint/m-p/120908#M3440</guid>
      <dc:creator>Louis_Frolio</dc:creator>
      <dc:date>2025-06-04T11:26:42Z</dc:date>
    </item>
    <item>
      <title>Re: External Locations to Azure Storage via Private Endpoint</title>
      <link>https://community.databricks.com/t5/administration-architecture/external-locations-to-azure-storage-via-private-endpoint/m-p/120961#M3441</link>
      <description>&lt;P&gt;I've read that in the documentation, and when I tried again with an Access Connector for Azure Databricks instead of my own service principal, it worked, shockingly, even with network access on the storage account completely blocked and zero private endpoints. No idea how, but for anyone coming across this: the solution is to use an Access Connector for Azure Databricks.&lt;/P&gt;</description>
      <pubDate>Wed, 04 Jun 2025 18:47:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/external-locations-to-azure-storage-via-private-endpoint/m-p/120961#M3441</guid>
      <dc:creator>Marthinus</dc:creator>
      <dc:date>2025-06-04T18:47:07Z</dc:date>
    </item>
  </channel>
</rss>

