<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: AWS Databricks and Fabric OneLake integration in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/aws-databricks-and-fabric-onelake-integration/m-p/128921#M48373</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/160453"&gt;@de2298&lt;/a&gt;,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Currently, Microsoft Fabric does not offer a built-in connector that allows direct querying or exposure of Delta Share tables from AWS Databricks into a Fabric Warehouse. The Unity Catalog mirroring feature is supported only with Azure Databricks and has not been extended to AWS Databricks at this time.&lt;/P&gt;&lt;P&gt;- As a workaround, you can automate the ingestion of data from AWS Databricks Delta Share into an intermediate staging layer such as Azure Blob Storage or Azure SQL Database. From there, the data can be ingested into a Fabric Warehouse using T-SQL notebooks or Dataflows Gen2.&lt;BR /&gt;- It's also worth noting that while Lakehouse is currently the only Fabric-native destination with potential support for open Delta Sharing, Warehouse-based direct ingestion from Delta Share is not supported at this stage.&lt;BR /&gt;- Additionally, Fabric now supports direct access to Azure Databricks Unity Catalog tables through the Mirrored Azure Databricks Catalog feature.&lt;BR /&gt;- If you're working with Azure Databricks, this provides a seamless way to integrate Unity Catalog-managed data across other Fabric experiences, including Power BI.&lt;BR /&gt;- For AWS-hosted Delta Shares, however, a middle-tier ingestion strategy remains necessary until native support becomes available.&lt;/P&gt;</description>
    <pubDate>Tue, 19 Aug 2025 23:21:52 GMT</pubDate>
    <dc:creator>WiliamRosa</dc:creator>
    <dc:date>2025-08-19T23:21:52Z</dc:date>
    <item>
      <title>AWS Databricks and Fabric OneLake integration</title>
      <link>https://community.databricks.com/t5/data-engineering/aws-databricks-and-fabric-onelake-integration/m-p/126506#M47701</link>
      <description>&lt;P&gt;Bit of a weird scenario, and I wanted to hear from the experts in this community.&lt;/P&gt;&lt;P&gt;Let's say I have a Fabric Lakehouse (OneLake) and I want to read that data into Databricks (AWS) Unity Catalog to play with that data. What is the recommended mechanism to do this (without actually reading/writing the data)?&lt;/P&gt;&lt;P&gt;I was looking into creating external tables in UC pointing to the blob path in OneLake, but some posts/docs out there say this is not possible.&lt;/P&gt;&lt;P&gt;Any insights?&lt;/P&gt;</description>
      <pubDate>Fri, 25 Jul 2025 18:52:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/aws-databricks-and-fabric-onelake-integration/m-p/126506#M47701</guid>
      <dc:creator>de2298</dc:creator>
      <dc:date>2025-07-25T18:52:17Z</dc:date>
    </item>
    <item>
      <title>Re: AWS Databricks and Fabric OneLake integration</title>
      <link>https://community.databricks.com/t5/data-engineering/aws-databricks-and-fabric-onelake-integration/m-p/128921#M48373</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/160453"&gt;@de2298&lt;/a&gt;,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Currently, Microsoft Fabric does not offer a built-in connector that allows direct querying or exposure of Delta Share tables from AWS Databricks into a Fabric Warehouse. The Unity Catalog mirroring feature is supported only with Azure Databricks and has not been extended to AWS Databricks at this time.&lt;/P&gt;&lt;P&gt;- As a workaround, you can automate the ingestion of data from AWS Databricks Delta Share into an intermediate staging layer such as Azure Blob Storage or Azure SQL Database. From there, the data can be ingested into a Fabric Warehouse using T-SQL notebooks or Dataflows Gen2.&lt;BR /&gt;- It's also worth noting that while Lakehouse is currently the only Fabric-native destination with potential support for open Delta Sharing, Warehouse-based direct ingestion from Delta Share is not supported at this stage.&lt;BR /&gt;- Additionally, Fabric now supports direct access to Azure Databricks Unity Catalog tables through the Mirrored Azure Databricks Catalog feature.&lt;BR /&gt;- If you're working with Azure Databricks, this provides a seamless way to integrate Unity Catalog-managed data across other Fabric experiences, including Power BI.&lt;BR /&gt;- For AWS-hosted Delta Shares, however, a middle-tier ingestion strategy remains necessary until native support becomes available.&lt;/P&gt;</description>
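      The middle-tier workaround in the reply above can be sketched in Python: pull the table from the AWS Databricks Delta Share with the open delta-sharing client, then stage it in Azure Blob Storage as Parquet for later ingestion into a Fabric Warehouse via Dataflows Gen2. All share, schema, container, and credential names here are hypothetical placeholders; this is a minimal sketch, not a production pipeline.

      ```python
      # Sketch of the Delta Share -> Azure Blob staging step described in the reply.
      # Requires: pip install delta-sharing azure-storage-blob pandas pyarrow
      import io


      def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
          """Build a delta-sharing table URL: <profile-file>#<share>.<schema>.<table>."""
          return f"{profile_path}#{share}.{schema}.{table}"


      def stage_shared_table(profile_path: str, share: str, schema: str, table: str,
                             account_url: str, container: str, credential) -> None:
          """Read a shared Delta table and upload it to a staging blob container."""
          import delta_sharing
          from azure.storage.blob import BlobServiceClient

          # Read the shared table into pandas via the open Delta Sharing protocol.
          df = delta_sharing.load_as_pandas(
              table_url(profile_path, share, schema, table))

          # Serialize to Parquet in memory and upload to the staging container.
          buf = io.BytesIO()
          df.to_parquet(buf, index=False)
          buf.seek(0)
          blob = (BlobServiceClient(account_url=account_url, credential=credential)
                  .get_blob_client(container=container,
                                   blob=f"staging/{table}.parquet"))
          blob.upload_blob(buf, overwrite=True)
      ```

      From the staging container, a Dataflow Gen2 or T-SQL COPY INTO step can then load the Parquet file into the Fabric Warehouse, as the reply suggests.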
      <pubDate>Tue, 19 Aug 2025 23:21:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/aws-databricks-and-fabric-onelake-integration/m-p/128921#M48373</guid>
      <dc:creator>WiliamRosa</dc:creator>
      <dc:date>2025-08-19T23:21:52Z</dc:date>
    </item>
  </channel>
</rss>

