<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87830#M37450</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/118879"&gt;@stefanberreiter&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;You need to grant access to the storage account used by your Microsoft Fabric instance, not to the Fabric workspace itself.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;You also need to assign the following role to the access connector:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;-&amp;nbsp;&lt;STRONG&gt;Storage Blob Data Contributor&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;So Viewer and Contributor are not correct.&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-identities#--step-2-grant-the-managed-identity-access-to-the-storage-account" target="_blank" rel="noopener"&gt;Use Azure managed identities in Unity Catalog to access storage - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 03 Sep 2024 10:12:11 GMT</pubDate>
    <dc:creator>szymon_dybczak</dc:creator>
    <dc:date>2024-09-03T10:12:11Z</dc:date>
    <item>
      <title>[Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87826#M37449</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I want to create an external location from Azure Databricks to a Microsoft Fabric Lakehouse, but it seems I am missing something.&lt;/P&gt;&lt;P&gt;What did I do:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;I created an "Access Connector for Azure Databricks" in the Azure Portal&lt;/LI&gt;&lt;LI&gt;I created a storage credential for the access connector&lt;/LI&gt;&lt;LI&gt;Granted the access connector access to the Microsoft Fabric workspace (tried both the Viewer and Contributor roles)&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Now I want to create an external location in Azure Databricks with the OneLake path, but I get an error:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Failed to access cloud storage: [AbfsRestOperationException]&lt;/LI-CODE&gt;&lt;P&gt;The paths I tried follow these patterns:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/
abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}
abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse
abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse/&lt;/LI-CODE&gt;&lt;P&gt;As I have struggled to find documentation for this use case (connecting from Databricks to Fabric, not the other way round), I may also be on the wrong path.&lt;/P&gt;&lt;P&gt;Any tips on what the issue might be?&lt;/P&gt;&lt;P&gt;Best, Stefan&lt;BR /&gt;&lt;BR /&gt;PS:&amp;nbsp;The Fabric Lakehouse has an abfss:// path from which I have already validated reading data (within a Fabric notebook).&lt;/P&gt;&lt;LI-CODE lang="python"&gt;import pandas as pd

pd.read_parquet(f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse/Tables/{table_name}")&lt;/LI-CODE&gt;&lt;P&gt;Sources:&lt;BR /&gt;[1]&amp;nbsp;&lt;A href="https://www.youtube.com/watch?v=aAswROA1bM8" target="_blank" rel="noopener"&gt;Advancing Spark - External Tables with Unity Catalog - YouTube&lt;/A&gt;&amp;nbsp;(I tried this approach, granting Fabric workspace access instead of ADLS Gen2 access in the Azure Portal)&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 10:01:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87826#M37449</guid>
      <dc:creator>stefanberreiter</dc:creator>
      <dc:date>2024-09-03T10:01:27Z</dc:date>
    </item>
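The OneLake path patterns tried in the question above can be captured in a small helper. This is a sketch only; `my_workspace` and `my_lakehouse` are placeholder names, and building the URI says nothing about whether access will actually succeed.

```python
# Sketch: build the OneLake abfss:// URI patterns tried in the post above.
# Workspace and lakehouse names here are placeholders, not real resources.

ONELAKE_HOST = "onelake.dfs.fabric.microsoft.com"

def onelake_uri(workspace_name, lakehouse_name=None, item_suffix=".Lakehouse"):
    """Build an abfss:// URI for a OneLake workspace or a lakehouse item.

    Only constructs the string; it does not validate that the workspace
    exists or that the caller has permission to read it.
    """
    base = f"abfss://{workspace_name}@{ONELAKE_HOST}/"
    if lakehouse_name is None:
        return base
    return f"{base}{lakehouse_name}{item_suffix}"

print(onelake_uri("my_workspace", "my_lakehouse"))
# abfss://my_workspace@onelake.dfs.fabric.microsoft.com/my_lakehouse.Lakehouse
```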
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87830#M37450</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/118879"&gt;@stefanberreiter&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;You need to grant access to the storage account used by your Microsoft Fabric instance, not to the Fabric workspace itself.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;You also need to assign the following role to the access connector:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;-&amp;nbsp;&lt;STRONG&gt;Storage Blob Data Contributor&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;So Viewer and Contributor are not correct.&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-identities#--step-2-grant-the-managed-identity-access-to-the-storage-account" target="_blank" rel="noopener"&gt;Use Azure managed identities in Unity Catalog to access storage - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 10:12:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87830#M37450</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-09-03T10:12:11Z</dc:date>
    </item>
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87872#M37454</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/110502"&gt;@szymon_dybczak&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;thanks for helping out.&lt;BR /&gt;It seems (at least from this blog post) that you can grant access directly from within a Fabric workspace once you have enabled the OneLake tenant setting "Users can access data stored in OneLake with apps external to Fabric".&lt;BR /&gt;&lt;BR /&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="stefanberreiter_0-1725358819777.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/10893i9D7F8A72D49136DB/image-size/medium?v=v2&amp;amp;px=400" role="button" title="stefanberreiter_0-1725358819777.png" alt="stefanberreiter_0-1725358819777.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;from source: "&lt;SPAN&gt;The second setting can be found a bit further down under&amp;nbsp;&lt;/SPAN&gt;&lt;EM&gt;OneLake settings&lt;/EM&gt;&lt;SPAN&gt;. This setting allows you to use non-Fabric applications like a Python SDK, Databricks, and more to read and write to the OneLake.&lt;/SPAN&gt;"&lt;BR /&gt;&lt;BR /&gt;Do you know what else one would need to configure (and where) to add the managed identity? Can you guide me a bit more on what you said, i.e. "&lt;SPAN&gt;You need to grant access for the&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;Access Connector for Azure Databricks to the storage account your Fabric instance uses&lt;/SPAN&gt;", as I believe storage is managed automatically in OneLake ("&lt;SPAN&gt;OneLake comes automatically with every Microsoft Fabric tenant&lt;/SPAN&gt;").&lt;/P&gt;&lt;P&gt;&lt;A href="https://dataroots.io/blog/how-to-use-service-principal-authentication-to-access-microsoft-fabrics-onelake" target="_blank"&gt;How to use service principal authentication to access Microsoft Fabric's OneLake (dataroots.io)&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 10:25:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87872#M37454</guid>
      <dc:creator>stefanberreiter</dc:creator>
      <dc:date>2024-09-03T10:25:35Z</dc:date>
    </item>
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87923#M37456</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/118879"&gt;@stefanberreiter&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;Ok, so it looks like you need to enable Azure Data Lake Storage credential passthrough to make it work. Did you do this step?&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Slash_0-1725360388026.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/10895i21F200DBD3CEE1A9/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Slash_0-1725360388026.png" alt="Slash_0-1725360388026.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Below are step-by-step instructions from the documentation:&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/onelake/onelake-azure-databricks" target="_blank"&gt;Integrate OneLake with Azure Databricks - Microsoft Fabric | Microsoft Learn&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;You can also take a look at the video below:&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://www.youtube.com/watch?v=AyYDLTvoXNk" target="_blank"&gt;Leverage OneLake with Azure Databricks (youtube.com)&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 10:46:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87923#M37456</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-09-03T10:46:34Z</dc:date>
    </item>
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87929#M37457</link>
      <description>&lt;P&gt;And if you want to use service principal authentication (assuming you already have one), then you need to add this service principal to the Fabric workspace (as in the URL you sent:&amp;nbsp;&lt;A href="https://dataroots.io/blog/how-to-use-service-principal-authentication-to-access-microsoft-fabrics-onelake" target="_blank"&gt;How to use service principal authentication to access Microsoft Fabric's OneLake (dataroots.io)&lt;/A&gt;).&lt;BR /&gt;Then you can use service principal authentication in the following way in Databricks:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;storage_account = "&amp;lt;storage_account&amp;gt;"
tenant_id = "&amp;lt;tenant_id&amp;gt;"
service_principal_id = "&amp;lt;service_principal_id&amp;gt;"
service_principal_password = "&amp;lt;service_principal_password&amp;gt;"

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", service_principal_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", service_principal_password)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# read with spn
df = spark.read.format("parquet").load(f"abfss://default@{storage_account}.dfs.core.windows.net/data/unmanaged/t_unmanag_parquet")
df.show(10)

# write
df.write.format("delta").mode("overwrite").save(f"abfss://default@{storage_account}.dfs.core.windows.net/data/unmanaged/fab_unmanag_delta_spn")&lt;/LI-CODE&gt;</description>
      <pubDate>Tue, 03 Sep 2024 10:54:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87929#M37457</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-09-03T10:54:00Z</dc:date>
    </item>
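The per-key `spark.conf.set` calls in the reply above can also be grouped into a helper that returns the OAuth settings as a dict, which is easier to reuse and unit-test. A sketch only; the storage account, tenant, and service principal values are placeholders, and applying the settings still requires a running Spark session.

```python
# Sketch: collect the fs.azure OAuth configs from the snippet above into one
# helper. All argument values below are placeholders for real credentials.

def spn_oauth_confs(storage_account, tenant_id, client_id, client_secret):
    """Return the Spark configs for service principal (OAuth) auth to ADLS."""
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# Usage on a Databricks cluster (not runnable locally):
# for key, value in spn_oauth_confs("<acct>", "<tenant>", "<id>", "<secret>").items():
#     spark.conf.set(key, value)
```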
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87946#M37459</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/110502"&gt;@szymon_dybczak&lt;/a&gt;&amp;nbsp;,&lt;BR /&gt;&lt;BR /&gt;thanks for replying. I've researched it a bit (thanks for the sources) - now a few more questions are popping up.&lt;/P&gt;&lt;P&gt;It seems Credential Passthrough is being deprecated and only works in conjunction with a cluster - while I am looking for a way to have an external table. So the idea would be to point at the storage of the Lakehouse data in Fabric, not to read and then copy it to Databricks (which I believe is the use case in the video).&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="stefanberreiter_0-1725361593979.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/10896i6D65114A6379AE79/image-size/medium?v=v2&amp;amp;px=400" role="button" title="stefanberreiter_0-1725361593979.png" alt="stefanberreiter_0-1725361593979.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 11:13:39 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87946#M37459</guid>
      <dc:creator>stefanberreiter</dc:creator>
      <dc:date>2024-09-03T11:13:39Z</dc:date>
    </item>
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87953#M37462</link>
      <description>&lt;P&gt;I guess I'm now looking into Lakehouse Federation for the SQL endpoint of the Fabric Lakehouse, which comes closest to the external table experience.&lt;BR /&gt;&lt;A href="https://murggu.medium.com/running-federated-queries-from-unity-catalog-on-microsoft-fabric-sql-endpoint-1485da1d450b" target="_blank"&gt;Running Federated Queries from Unity Catalog on Microsoft Fabric SQL Endpoint | by Aitor Murguzur | Sep, 2024 | Medium&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 11:20:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87953#M37462</guid>
      <dc:creator>stefanberreiter</dc:creator>
      <dc:date>2024-09-03T11:20:28Z</dc:date>
    </item>
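The federation approach from the linked article can be sketched roughly as below, treating the Fabric SQL endpoint as a SQL Server source in Lakehouse Federation. All names here are hypothetical placeholders (connection name, host, secret scope, lakehouse name), and this is a configuration sketch to run on a Databricks cluster, not a verified end-to-end setup.

```python
# Sketch: federate a Fabric SQL endpoint into Unity Catalog via Lakehouse
# Federation. Connection name, host, secret scope/keys, and lakehouse name
# are hypothetical placeholders; requires a Databricks SQL/Spark session.

spark.sql("""
  CREATE CONNECTION IF NOT EXISTS fabric_sql_endpoint TYPE sqlserver
  OPTIONS (
    host '<your-endpoint>.datawarehouse.fabric.microsoft.com',
    port '1433',
    user secret('<scope>', '<client-id-key>'),        -- service principal id
    password secret('<scope>', '<client-secret-key>') -- service principal secret
  )
""")

spark.sql("""
  CREATE FOREIGN CATALOG IF NOT EXISTS fabric_lakehouse
  USING CONNECTION fabric_sql_endpoint
  OPTIONS (database '<lakehouse_name>')
""")

# Tables then appear read-only under the foreign catalog, e.g.:
# spark.sql("SELECT * FROM fabric_lakehouse.dbo.<table_name> LIMIT 10")
```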
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87963#M37464</link>
      <description>&lt;P&gt;Yeah, that seems like a good option, though it also uses a service principal to authenticate. I think in the future they will add the ability to use the Databricks access connector (MSI) as a valid authentication option for OneLake.&lt;/P&gt;</description>
      <pubDate>Tue, 03 Sep 2024 11:32:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/87963#M37464</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-09-03T11:32:40Z</dc:date>
    </item>
    <item>
      <title>Re: [Azure Databricks] Create an External Location to Microsoft Fabric Lakehouse</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/113061#M44410</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/118879"&gt;@stefanberreiter&lt;/a&gt;&amp;nbsp;were you able to access Fabric OneLake data from Databricks Unity Catalog using a service principal?&lt;/P&gt;</description>
      <pubDate>Wed, 19 Mar 2025 15:40:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-create-an-external-location-to-microsoft-fabric/m-p/113061#M44410</guid>
      <dc:creator>skpathi</dc:creator>
      <dc:date>2025-03-19T15:40:01Z</dc:date>
    </item>
  </channel>
</rss>

