<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector in Get Started Discussions</title>
    <link>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/110503#M9335</link>
    <description>&lt;P&gt;We are trying to connect Databricks to OneLake to read data from a Fabric workspace into Databricks, using a notebook. We also use Unity Catalog. We are able to read data from the workspace with a Service Principal like this:&lt;/P&gt;&lt;P&gt;from pyspark.sql.types import *&lt;BR /&gt;from pyspark.sql.functions import *&lt;/P&gt;&lt;P&gt;# Credentials&lt;BR /&gt;client_id = xxx&lt;BR /&gt;tenant_id = xxx&lt;BR /&gt;client_secret = xxx&lt;/P&gt;&lt;P&gt;spark.conf.set("fs.azure.account.auth.type", "OAuth")&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth.provider.type", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth2.client.id", client_id)&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth2.client.secret", client_secret)&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth2.client.endpoint", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")&lt;/P&gt;&lt;P&gt;# Define the OneLake parameters&lt;BR /&gt;lakehouse_name = "testlakehouse01"&lt;BR /&gt;workspace_name = "fabrictest"&lt;/P&gt;&lt;P&gt;fullpathtotablesinworkspace = f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse/Tables"&lt;BR /&gt;tablename = "publicholidays"&lt;BR /&gt;publicholidaysdf = spark.read.format("delta").load(f"{fullpathtotablesinworkspace}/{tablename}")&lt;BR /&gt;display(publicholidaysdf.limit(10))&lt;/P&gt;&lt;P&gt;As per this documentation: &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/#path-based-access-to-cloud-storage" target="_blank"&gt;https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/#path-based-access-to-cloud-storage&lt;/A&gt;, we need (or want?) to use an external location instead of the direct URI, because we use Unity Catalog, right?&lt;BR /&gt;We tried to 'mount' the OneLake tables in Databricks using the access connector we already have (storage based), but we get errors.&lt;/P&gt;&lt;P&gt;Using the GUI:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Judith_0-1739892045239.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14928i3572792DF6831781/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Judith_0-1739892045239.png" alt="Judith_0-1739892045239.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Judith_1-1739891020619.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14927i0DE8537D281D4985/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Judith_1-1739891020619.png" alt="Judith_1-1739891020619.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Using a cluster:&lt;BR /&gt;PERMISSION_DENIED: The contributor role on the storage account is not set or Managed Identity does not have READ permissions on url abfss://fabrictest@onelake.dfs.core.windows.net/testlakehouse01.Lakehouse/Tables. Please contact your account admin to update the storage credential. PERMISSION_DENIED: Failed to authenticate with the configured service principal. Please contact your account admin to update the configuration. exceptionTraceId=a5e324b9-3bb7-4663-b1cb-8143f30cf830 SQLSTATE: 42501&lt;/P&gt;&lt;P&gt;Is the URI correct?&lt;BR /&gt;The error message on a cluster implies we have to grant permissions on the OneLake storage, but how? And where exactly?&lt;/P&gt;&lt;P&gt;Thanx,&lt;/P&gt;&lt;P&gt;Judith&lt;/P&gt;</description>
    <pubDate>Tue, 18 Feb 2025 15:30:49 GMT</pubDate>
    <dc:creator>Judith</dc:creator>
    <dc:date>2025-02-18T15:30:49Z</dc:date>
    <item>
      <title>Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector</title>
      <link>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/110503#M9335</link>
      <description>&lt;P&gt;We are trying to connect Databricks to OneLake to read data from a Fabric workspace into Databricks, using a notebook. We also use Unity Catalog. We are able to read data from the workspace with a Service Principal like this:&lt;/P&gt;&lt;P&gt;from pyspark.sql.types import *&lt;BR /&gt;from pyspark.sql.functions import *&lt;/P&gt;&lt;P&gt;# Credentials&lt;BR /&gt;client_id = xxx&lt;BR /&gt;tenant_id = xxx&lt;BR /&gt;client_secret = xxx&lt;/P&gt;&lt;P&gt;spark.conf.set("fs.azure.account.auth.type", "OAuth")&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth.provider.type", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth2.client.id", client_id)&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth2.client.secret", client_secret)&lt;BR /&gt;spark.conf.set("fs.azure.account.oauth2.client.endpoint", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")&lt;/P&gt;&lt;P&gt;# Define the OneLake parameters&lt;BR /&gt;lakehouse_name = "testlakehouse01"&lt;BR /&gt;workspace_name = "fabrictest"&lt;/P&gt;&lt;P&gt;fullpathtotablesinworkspace = f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse/Tables"&lt;BR /&gt;tablename = "publicholidays"&lt;BR /&gt;publicholidaysdf = spark.read.format("delta").load(f"{fullpathtotablesinworkspace}/{tablename}")&lt;BR /&gt;display(publicholidaysdf.limit(10))&lt;/P&gt;&lt;P&gt;As per this documentation: &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/#path-based-access-to-cloud-storage" target="_blank"&gt;https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/#path-based-access-to-cloud-storage&lt;/A&gt;, we need (or want?) to use an external location instead of the direct URI, because we use Unity Catalog, right?&lt;BR /&gt;We tried to 'mount' the OneLake tables in Databricks using the access connector we already have (storage based), but we get errors.&lt;/P&gt;&lt;P&gt;Using the GUI:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Judith_0-1739892045239.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14928i3572792DF6831781/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Judith_0-1739892045239.png" alt="Judith_0-1739892045239.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Judith_1-1739891020619.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14927i0DE8537D281D4985/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Judith_1-1739891020619.png" alt="Judith_1-1739891020619.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Using a cluster:&lt;BR /&gt;PERMISSION_DENIED: The contributor role on the storage account is not set or Managed Identity does not have READ permissions on url abfss://fabrictest@onelake.dfs.core.windows.net/testlakehouse01.Lakehouse/Tables. Please contact your account admin to update the storage credential. PERMISSION_DENIED: Failed to authenticate with the configured service principal. Please contact your account admin to update the configuration. exceptionTraceId=a5e324b9-3bb7-4663-b1cb-8143f30cf830 SQLSTATE: 42501&lt;/P&gt;&lt;P&gt;Is the URI correct?&lt;BR /&gt;The error message on a cluster implies we have to grant permissions on the OneLake storage, but how? And where exactly?&lt;/P&gt;&lt;P&gt;Thanx,&lt;/P&gt;&lt;P&gt;Judith&lt;/P&gt;</description>
      <pubDate>Tue, 18 Feb 2025 15:30:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/110503#M9335</guid>
      <dc:creator>Judith</dc:creator>
      <dc:date>2025-02-18T15:30:49Z</dc:date>
    </item>
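A minimal sketch of the path-based read from the question above, runnable as plain Python outside Spark. It fixes the doubled slash in the original (the `Tables/` prefix plus `/{tablename}` produced `Tables//publicholidays`) and collects the Hadoop ABFS OAuth settings into one dict. The workspace, lakehouse, and table names are the example values from the post; credentials are placeholders, and applying the confs still requires a cluster that permits `spark.conf.set` for these keys.

```python
def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build the abfss:// URI for a Delta table in a Fabric lakehouse via OneLake."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{table}"
    )

def oauth_spark_confs(client_id: str, client_secret: str, tenant_id: str) -> dict:
    """The Hadoop ABFS OAuth settings from the post, as a dict to apply with
    spark.conf.set(key, value) before calling spark.read."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": client_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

path = onelake_table_path("fabrictest", "testlakehouse01", "publicholidays")
print(path)
# → abfss://fabrictest@onelake.dfs.fabric.microsoft.com/testlakehouse01.Lakehouse/Tables/publicholidays
```

On a cluster one would then loop over the dict with `spark.conf.set` and call `spark.read.format("delta").load(path)` as in the original notebook.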
    <item>
      <title>Re: Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector</title>
      <link>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/115387#M9336</link>
      <description>&lt;P&gt;Hi, I am facing the same problem. Have you already been able to solve the problem?&lt;/P&gt;</description>
      <pubDate>Mon, 14 Apr 2025 06:48:23 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/115387#M9336</guid>
      <dc:creator>behema1074</dc:creator>
      <dc:date>2025-04-14T06:48:23Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector</title>
      <link>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/128982#M10559</link>
      <description>&lt;P&gt;UC is now displaying a new error when trying to add an external location pointing to a OneLake abfss path. The error says that OneLake URLs are not supported as external locations.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Aug 2025 13:09:44 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/128982#M10559</guid>
      <dc:creator>adriennn</dc:creator>
      <dc:date>2025-08-20T13:09:44Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector</title>
      <link>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/128993#M10561</link>
      <description>&lt;P&gt;One thing you can check is whether the Databricks access connector has Storage Blob Data Contributor access on the data lake.&lt;/P&gt;</description>
      <pubDate>Wed, 20 Aug 2025 14:32:19 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/128993#M10561</guid>
      <dc:creator>nayan_wylde</dc:creator>
      <dc:date>2025-08-20T14:32:19Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector</title>
      <link>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/139240#M11030</link>
      <description>&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;To connect Databricks to OneLake using Unity Catalog and access data with a service principal, and to address the "PERMISSION_DENIED" error you encountered, here are the key points and steps:&lt;/P&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Use External Location with Unity Catalog&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;When using Unity Catalog, you typically do not access cloud storage directly by URI. Instead, you create an external location in Unity Catalog that references your OneLake storage path.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;External locations allow controlled, managed access to storage with permissions enforced via Unity Catalog rather than raw storage permissions.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Creating and managing external locations requires appropriate privileges in the metastore (e.g., metastore admin or external location owner role).​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Permissions for OneLake Storage Access&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;The "PERMISSION_DENIED" error indicates that the service principal does not have sufficient permissions on the OneLake storage.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;You need to grant your Databricks service principal both Azure RBAC roles and OneLake workspace access:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;At the Azure level, assign your service principal roles like&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Storage Blob Data Contributor&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;or&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Storage Account Contributor&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;on the OneLake storage account or relevant resource group.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Within OneLake (Microsoft Fabric workspace), assign the contributor role or equivalent access for the service principal to the target Fabric workspace or lakehouse.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;The access control model for OneLake uses deny-by-default, so explicit granting in both Azure portal (IAM role assignments) and Fabric workspace access control is required.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Steps to Grant Permissions&lt;/H2&gt;
&lt;OL class="marker:text-quiet list-decimal"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;In the Azure portal, go to your OneLake storage account or resource group.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Open "Access Control (IAM)" and add a role assignment for your service principal with the role&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Storage Blob Data Contributor&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;or&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Storage Account Contributor&lt;/STRONG&gt;.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;In the Fabric portal, navigate to the target workspace, open&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Manage Access&lt;/STRONG&gt;, and add your service principal with at least the&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Contributor&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;role so it can access lakehouse data.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Confirm that the service principal has the necessary permissions to authenticate and read from the storage URI.&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Using External Location in Databricks&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Create an external location in Databricks referencing your OneLake path using the same service principal/credential.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Assign this external location to the appropriate workspace(s).&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Use Unity Catalog tables via this external location for fine-grained access control rather than mounting the storage manually.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;About Mounting OneLake Storage&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Mounting OneLake storage via DBFS is generally not recommended when using Unity Catalog.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Instead, use external locations tied to Unity Catalog and the service principal access model for secured, governed data access.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Mount attempts often fail with permission errors due to missing Contributor roles or managed identity rights on the OneLake storage account.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;HR /&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;This guidance should help resolve permission issues and align with best practices using Unity Catalog external locations for OneLake data in Databricks&lt;/P&gt;</description>
      <pubDate>Sun, 16 Nov 2025 17:53:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/139240#M11030</guid>
      <dc:creator>mark_ott</dc:creator>
      <dc:date>2025-11-16T17:53:46Z</dc:date>
    </item>
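The "Using an External Location in Databricks" steps above can be sketched as the Databricks SQL one would run from a notebook. This is a hedged sketch: the location name `onelake_fabrictest` and credential name `fabric_sp_cred` are hypothetical, the credential is assumed to already exist, and (per adriennn's reply in this thread) the metastore may reject `onelake.dfs.fabric.microsoft.com` URLs outright, so this may fail regardless of roles.

```python
def create_external_location_sql(name: str, url: str, credential: str) -> str:
    """Build the CREATE EXTERNAL LOCATION statement for a given storage URL
    and a pre-existing Unity Catalog storage credential."""
    return (
        f"CREATE EXTERNAL LOCATION IF NOT EXISTS {name}\n"
        f"URL '{url}'\n"
        f"WITH (STORAGE CREDENTIAL {credential})"
    )

sql = create_external_location_sql(
    "onelake_fabrictest",  # hypothetical external location name
    "abfss://fabrictest@onelake.dfs.fabric.microsoft.com/testlakehouse01.Lakehouse",
    "fabric_sp_cred",      # hypothetical storage credential name
)
print(sql)
```

In a notebook this would be executed with `spark.sql(sql)`, followed by grants such as `GRANT READ FILES ON EXTERNAL LOCATION onelake_fabrictest TO ...` to the principals that need access.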
    <item>
      <title>Re: Connect to Onelake using Service Principal, Unity Catalog and Databricks Access Connector</title>
      <link>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/139251#M11031</link>
      <description>&lt;P&gt;As commented, you need to assign &lt;STRONG&gt;Storage Blob Data Contributor&lt;/STRONG&gt; or &lt;STRONG&gt;Storage Account Contributor&lt;/STRONG&gt; to the service principal you're using in the "connection" provided to the "external location".&lt;/P&gt;&lt;P&gt;Another, more advanced and even better, option would be to use the "managed identity" associated with an "Azure Access Connector for Databricks", so that you can avoid the use of secrets or passwords. That "managed identity" should be given the same roles.&lt;/P&gt;&lt;P&gt;I explain that in this video, but it's only in Spanish so far. Maybe I'll make it in English soon &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&amp;nbsp;&lt;A href="https://youtu.be/HSSWP5UbkNY?si=DdzKx-KGJJQUXb3k" target="_blank"&gt;https://youtu.be/HSSWP5UbkNY?si=DdzKx-KGJJQUXb3k&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 16 Nov 2025 20:49:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/connect-to-onelake-using-service-principal-unity-catalog-and/m-p/139251#M11031</guid>
      <dc:creator>Coffee77</dc:creator>
      <dc:date>2025-11-16T20:49:49Z</dc:date>
    </item>
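The secret-free option Coffee77 describes can be sketched the same way: a storage credential backed by the access connector's managed identity instead of a client secret. The credential name and the access connector resource ID below are illustrative placeholders, not values from the thread, and the `AZURE_MANAGED_IDENTITY` clause is a sketch of the Databricks SQL form rather than a tested statement.

```python
def managed_identity_credential_sql(name: str, access_connector_id: str) -> str:
    """Build a CREATE STORAGE CREDENTIAL statement backed by the managed
    identity of an Azure Access Connector for Databricks (no secret stored)."""
    return (
        f"CREATE STORAGE CREDENTIAL IF NOT EXISTS {name}\n"
        f"WITH (AZURE_MANAGED_IDENTITY (ACCESS_CONNECTOR_ID '{access_connector_id}'))"
    )

# Placeholder resource ID; the real one comes from the access connector in Azure.
connector_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Databricks/accessConnectors/<connector-name>"
)
print(managed_identity_credential_sql("fabric_mi_cred", connector_id))
```

The managed identity behind the connector then needs the same Azure roles and Fabric workspace access discussed above, in place of the service principal.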
  </channel>
</rss>

