<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic how to use azure one lake in aws databricks unity catalog in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-one-lake-in-aws-databricks-unity-catalog/m-p/113437#M44530</link>
    <description>&lt;P&gt;I'm trying to connect Azure OneLake to AWS Databricks Unity Catalog, but I can't create a storage credential, since Unity Catalog on AWS currently allows S3 locations only. In the Hive catalog I can connect to OneLake, but not in Unity Catalog.&lt;/P&gt;</description>
    <pubDate>Mon, 24 Mar 2025 16:24:03 GMT</pubDate>
    <dc:creator>ashish31negi</dc:creator>
    <dc:date>2025-03-24T16:24:03Z</dc:date>
    <item>
      <title>how to use azure one lake in aws databricks unity catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-one-lake-in-aws-databricks-unity-catalog/m-p/113437#M44530</link>
      <description>&lt;P&gt;I'm trying to connect Azure OneLake to AWS Databricks Unity Catalog, but I can't create a storage credential, since Unity Catalog on AWS currently allows S3 locations only. In the Hive catalog I can connect to OneLake, but not in Unity Catalog.&lt;/P&gt;</description>
      <pubDate>Mon, 24 Mar 2025 16:24:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-azure-one-lake-in-aws-databricks-unity-catalog/m-p/113437#M44530</guid>
      <dc:creator>ashish31negi</dc:creator>
      <dc:date>2025-03-24T16:24:03Z</dc:date>
    </item>
    <item>
      <title>Re: how to use azure one lake in aws databricks unity catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-one-lake-in-aws-databricks-unity-catalog/m-p/136638#M50623</link>
      <description>&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Azure OneLake cannot be directly connected or credentialed in AWS Databricks Unity Catalog at this time, because AWS Databricks Unity Catalog supports only storage credentials for S3 and a select few options (like Cloudflare R2), rather than Azure-based offerings such as OneLake or ADLS Gen2. In contrast, the Hive Metastore is less restrictive and can be configured to access OneLake using ABFS URIs or other Azure storage connectivity features.&lt;/P&gt;
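&lt;P&gt;The restriction can be illustrated with a small, purely hypothetical Python check: AWS Unity Catalog accepts S3 URIs for external locations, while OneLake paths use the ABFS scheme, which is rejected at credential/location creation. The helper below is not a Databricks API, just a sketch of the scheme rule described here.&lt;/P&gt;

```python
# Hypothetical illustration of the scheme restriction described above:
# AWS Unity Catalog external locations accept S3 paths, while OneLake
# uses ABFS URIs, which are rejected. Not a real Databricks API.
from urllib.parse import urlparse

ALLOWED_AWS_UC_SCHEMES = {"s3"}  # per the answer: S3 only (R2 is S3-compatible)

def is_valid_aws_uc_location(path: str) -> bool:
    """Return True if the URI scheme is one AWS Unity Catalog accepts."""
    return urlparse(path).scheme in ALLOWED_AWS_UC_SCHEMES

print(is_valid_aws_uc_location("s3://my-bucket/tables/"))  # True
print(is_valid_aws_uc_location(
    "abfss://myworkspace@onelake.dfs.fabric.microsoft.com/lh.Lakehouse/Tables/t"))  # False
```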
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Why Unity Catalog Blocks OneLake&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Unity Catalog in AWS currently permits storage credentials only for S3-based paths; attempts to use ABFS for OneLake or ADLS Gen2 are blocked at the permission/credential creation step.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;In Azure Databricks, Unity Catalog can connect to OneLake via Azure-based identities (service principal, managed identity, or credential passthrough), but these authentication and storage mechanisms are not available in the AWS product.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;There is no announced support for extending Unity Catalog's external location feature to non-S3 endpoints on AWS Databricks.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;How Hive Metastore Differs&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Hive Metastore does not deeply enforce cloud-native identity or storage credential constraints, so it can register and mount external tables from OneLake, provided the correct driver and endpoint details are supplied.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;This flexibility lets Hive Metastore support cross-cloud sources, but comes at the cost of weaker permissions, lineage, and governance compared to Unity Catalog.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
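&lt;P&gt;As a concrete sketch of the Hive-metastore path above: OneLake is reached over ABFS with an Entra ID (Azure AD) service principal by setting the standard hadoop-azure OAuth options on the cluster. The tenant, client ID, secret, and workspace/lakehouse names below are placeholders, and this assumes the hadoop-azure ABFS driver that ships with Databricks runtimes.&lt;/P&gt;

```python
# Sketch: Hadoop/Spark options for reaching OneLake over ABFS with an
# Entra ID service principal (values marked "your-..." are placeholders).
# These are the standard hadoop-azure OAuth keys, scoped to the OneLake host.
tenant_id = "your-tenant-id"
onelake_host = "onelake.dfs.fabric.microsoft.com"

abfs_oauth_conf = {
    f"fs.azure.account.auth.type.{onelake_host}": "OAuth",
    f"fs.azure.account.oauth.provider.type.{onelake_host}":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{onelake_host}": "your-app-client-id",
    f"fs.azure.account.oauth2.client.secret.{onelake_host}": "your-app-client-secret",
    f"fs.azure.account.oauth2.client.endpoint.{onelake_host}":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

# A OneLake table path has the shape workspace@host/item/Tables/name:
onelake_path = (
    "abfss://your-workspace@onelake.dfs.fabric.microsoft.com/"
    "your-lakehouse.Lakehouse/Tables/your_table"
)

# On a real cluster you would apply each pair with spark.conf.set(key, value)
# and then register the external table in the Hive metastore against onelake_path.
for key, value in abfs_oauth_conf.items():
    print(key, "=", value)
```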
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Workarounds and Alternatives&lt;/H2&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;For advanced cross-cloud data scenarios, some users attempt to bridge data from OneLake to S3 (or vice versa) using integration pipelines, Delta Sharing, or direct data replication, but this is manual and typically outside Unity Catalog's governance model.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;Azure Databricks with Unity Catalog offers native, seamless connectivity to OneLake, but AWS Databricks users must wait for future product support or use non-Unity Catalog patterns.​&lt;/P&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;API-only approaches: In rare cases, admins may use Databricks APIs to attempt custom credential injection, but this requires privileged access and is not supported for OneLake on Unity Catalog in AWS today.​&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
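&lt;P&gt;Of these, Delta Sharing is the most structured bridge: the provider side shares a table, and the recipient reads it from the other cloud without copying storage credentials across clouds. The share, schema, and table names below are hypothetical; the open-source delta-sharing Python client addresses tables as "profile#share.schema.table".&lt;/P&gt;

```python
# Sketch of the Delta Sharing bridge mentioned above. The profile file is
# issued by the sharing provider; share/schema/table names are hypothetical.
profile_file = "config.share"
share, schema, table = "onelake_share", "sales", "orders"

# The delta-sharing client addresses a shared table as profile#share.schema.table
table_url = f"{profile_file}#{share}.{schema}.{table}"
print(table_url)  # config.share#onelake_share.sales.orders

# With the delta-sharing package installed and a real profile file:
# import delta_sharing
# df = delta_sharing.load_as_pandas(table_url)
```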
&lt;H2 class="mb-2 mt-4 font-display font-semimedium text-base first:mt-0"&gt;Summary Table&lt;/H2&gt;
&lt;DIV class="group relative"&gt;
&lt;DIV class="w-full overflow-x-auto md:max-w-[90vw] border-subtlest ring-subtlest divide-subtlest bg-transparent"&gt;
&lt;TABLE class="border-subtler my-[1em] w-full table-auto border-separate border-spacing-0 border-l border-t"&gt;
&lt;THEAD class="bg-subtler"&gt;
&lt;TR&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;Feature / Platform&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;AWS Databricks Unity Catalog&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;Azure Databricks Unity Catalog&lt;/TH&gt;
&lt;TH class="border-subtler p-sm break-normal border-b border-r text-left align-top"&gt;Hive Metastore (any cloud)&lt;/TH&gt;
&lt;/TR&gt;
&lt;/THEAD&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;OneLake Connectivity&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;No (S3 only)&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;​&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes (native, managed)&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;​&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Yes (manual)&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;​&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Storage Credential Options&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;S3, R2 only&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;​&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Azure AD, Service Principal&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Custom/driver-based&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;External Location Support&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;S3&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;S3, ADLS Gen2, OneLake&lt;/TD&gt;
&lt;TD class="px-sm border-subtler min-w-[48px] break-normal border-b border-r"&gt;Any, via URI/driver&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;/DIV&gt;
&lt;DIV class="bg-base border-subtler shadow-subtle pointer-coarse:opacity-100 right-xs absolute bottom-0 flex rounded-lg border opacity-0 transition-opacity group-hover:opacity-100 [&amp;amp;&amp;gt;*:not(:first-child)]:border-subtle [&amp;amp;&amp;gt;*:not(:first-child)]:border-l"&gt;
&lt;DIV class="flex"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="flex"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;For now, AWS Databricks Unity Catalog is limited to S3 locations for storage credentials and external locations, so direct integration with Azure OneLake is not possible. Using Hive Metastore remains the only practical workaround until this limitation changes or official cross-cloud support is added to Unity Catalog.​&lt;/P&gt;</description>
      <pubDate>Wed, 29 Oct 2025 20:40:57 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-azure-one-lake-in-aws-databricks-unity-catalog/m-p/136638#M50623</guid>
      <dc:creator>mark_ott</dc:creator>
      <dc:date>2025-10-29T20:40:57Z</dc:date>
    </item>
  </channel>
</rss>

