<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Data selection from adls2 in Serverless Warehouse in Warehousing &amp; Analytics</title>
    <link>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/117521#M2040</link>
    <description>&lt;P&gt;Hi, thanks for the detailed explanation.&lt;BR /&gt;Unfortunately, configuring fs.azure.account.key in the Serverless advanced options didn’t help (I’m sure I wrote it correctly); I’m still receiving the same error.&lt;BR /&gt;Several sources online suggest creating an external location instead, so I’ll try that approach too.&lt;/P&gt;</description>
    <pubDate>Fri, 02 May 2025 13:44:10 GMT</pubDate>
    <dc:creator>ogs</dc:creator>
    <dc:date>2025-05-02T13:44:10Z</dc:date>
    <item>
      <title>Data selection from adls2 in Serverless Warehouse</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/117165#M2034</link>
      <description>&lt;P class=""&gt;Hi everyone,&lt;/P&gt;&lt;P class=""&gt;I'm trying to query data from our adls2 delta lake using a serverless sql warehouse. We've already set up private connectivity via NCC, but hitting a snag when running queries like:&lt;/P&gt;&lt;DIV class=""&gt;&lt;PRE&gt;&lt;SPAN&gt;SELECT &lt;/SPAN&gt;* &lt;SPAN&gt;FROM &lt;/SPAN&gt;delta.`abfss://container@xxx.dfs.core.windows.net/delta_tbl`&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;"Invalid configuration value detected for fs.azure.account.key"&lt;BR /&gt;&lt;/SPAN&gt;&lt;/PRE&gt;&lt;/DIV&gt;&lt;P class=""&gt;My questions:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;P class=""&gt;Where exactly should I configure this storage key in a Serverless setup? (It's not a traditional cluster where I'd use spark.conf.set)&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;Is creating an external table in UC a better approach? Something like:&lt;BR /&gt;&lt;PRE&gt;&lt;SPAN&gt;CREATE EXTERNAL TABLE &lt;/SPAN&gt;xxx&lt;BR /&gt;&lt;SPAN&gt;USING &lt;/SPAN&gt;DELTA&lt;BR /&gt;LOCATION &lt;SPAN&gt;'abfss://container@xxx.dfs.core.windows.net/xxx'&lt;BR /&gt;&lt;/SPAN&gt;&lt;/PRE&gt;and what config should be made for this to work Thanks.&lt;/LI&gt;&lt;/OL&gt;&lt;P class=""&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 30 Apr 2025 14:08:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/117165#M2034</guid>
      <dc:creator>ogs</dc:creator>
      <dc:date>2025-04-30T14:08:01Z</dc:date>
    </item>
    <item>
      <title>Re: Data selection from adls2 in Serverless Warehouse</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/117191#M2036</link>
      <description>&lt;P&gt;Here is something to try:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;DIV class="paragraph"&gt;To resolve your query and simplify access to your ADLS Gen2 Delta Lake via a serverless SQL warehouse, here are the steps and considerations:&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;1. Configuring &lt;CODE&gt;fs.azure.account.key&lt;/CODE&gt; in a Serverless Setup The error indicates that your storage account key isn't properly registered in the configuration. Unlike traditional clusters (where you'd use &lt;CODE&gt;spark.conf.set&lt;/CODE&gt;), with a serverless SQL warehouse, you must specify this configuration in the SQL Warehouse's "Advanced Options" under the &lt;CODE&gt;Spark Config&lt;/CODE&gt; section.&lt;/DIV&gt;
&lt;UL&gt;
&lt;LI&gt;Navigate to the Databricks console.&lt;/LI&gt;
&lt;LI&gt;Go to &lt;STRONG&gt;SQL&lt;/STRONG&gt; &amp;gt; &lt;STRONG&gt;SQL Warehouses&lt;/STRONG&gt;, then select the serverless SQL warehouse you're using.&lt;/LI&gt;
&lt;LI&gt;In the &lt;STRONG&gt;Advanced Options&lt;/STRONG&gt;, add the following in the &lt;STRONG&gt;Spark Config&lt;/STRONG&gt; textbox:&lt;BR /&gt;&lt;PRE&gt;fs.azure.account.key.&amp;lt;storage_account&amp;gt;.dfs.core.windows.net "&amp;lt;storage_account_key&amp;gt;"&lt;/PRE&gt;Replace &lt;CODE&gt;&amp;lt;storage_account&amp;gt;&lt;/CODE&gt; with your Azure Storage account name and &lt;CODE&gt;&amp;lt;storage_account_key&amp;gt;&lt;/CODE&gt; with your actual storage account key.&lt;/LI&gt;
&lt;/UL&gt;
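&lt;DIV class="paragraph"&gt;For example, with a hypothetical storage account named &lt;CODE&gt;mystorageacct&lt;/CODE&gt; (the key value remains a placeholder you copy from the Azure portal), the line would read:&lt;/DIV&gt;
&lt;PRE&gt;fs.azure.account.key.mystorageacct.dfs.core.windows.net "&amp;lt;storage_account_key&amp;gt;"&lt;/PRE&gt;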
&lt;DIV class="paragraph"&gt;2. External Table Using Unity Catalog (UC) Creating an external table through Unity Catalog can streamline security and simplify access management across different users and services. The key configurations for this approach would depend on using the necessary access credentials and ensuring private connectivity is correctly established:&lt;/DIV&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;&lt;STRONG&gt;External Table Syntax&lt;/STRONG&gt;: &lt;CODE&gt;sql
CREATE EXTERNAL TABLE catalog_name.schema_name.table_name
USING DELTA
LOCATION 'abfss://container@xxx.dfs.core.windows.net/delta_tbl';
&lt;/CODE&gt; Replace &lt;CODE&gt;catalog_name&lt;/CODE&gt;, &lt;CODE&gt;schema_name&lt;/CODE&gt;, and other parameters as per your setup.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;&lt;STRONG&gt;Steps to Configure Access&lt;/STRONG&gt;:
&lt;UL&gt;
&lt;LI&gt;If using a service principal or OAuth, ensure the following Spark configurations are added either through the SQL Warehouse interface, an admin console, or a direct configuration file: &lt;CODE&gt;plaintext
fs.azure.account.auth.type &amp;lt;auth_type&amp;gt;
fs.azure.account.oauth2.client.id &amp;lt;client_id&amp;gt;
fs.azure.account.oauth2.client.secret &amp;lt;client_secret&amp;gt;
fs.azure.account.oauth2.client.endpoint &amp;lt;oauth_endpoint&amp;gt;
&lt;/CODE&gt; Adjust these values based on your exact setup and Azure credentials.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/DIV&gt;
&lt;UL&gt;
&lt;LI&gt;If relying on a shared key, you would still add the &lt;CODE&gt;fs.azure.account.key.&amp;lt;storage_account&amp;gt;.dfs.core.windows.net&lt;/CODE&gt; property directly, as explained in Section 1 above.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
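&lt;DIV class="paragraph"&gt;As a sketch of the Unity Catalog route (all object names below are placeholders, and a storage credential backed by an Azure Databricks access connector must already exist): first register the container path as an external location, then create the table over the existing Delta data:&lt;/DIV&gt;
&lt;PRE&gt;-- Register the container path against an existing storage credential
CREATE EXTERNAL LOCATION my_ext_loc
URL 'abfss://container@xxx.dfs.core.windows.net/'
WITH (STORAGE CREDENTIAL my_cred);

-- Create the table over the Delta data already at that path
CREATE TABLE catalog_name.schema_name.delta_tbl
USING DELTA
LOCATION 'abfss://container@xxx.dfs.core.windows.net/delta_tbl';&lt;/PRE&gt;
&lt;DIV class="paragraph"&gt;Once the external location exists and you have been granted access on it, ad hoc &lt;CODE&gt;SELECT ... FROM delta.`abfss://...`&lt;/CODE&gt; queries against that path should also work from serverless without any account-key configuration.&lt;/DIV&gt;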
&lt;DIV class="paragraph"&gt;3. Network Connectivity Considerations with NCC Since your setup already utilizes private connectivity via NCC (Network Connectivity Configuration), ensure the following: - The NCC is correctly configured and attached to your workspace. - Private endpoints for both your SQL warehouse and ADLS Gen2 are properly set up. - You have allowed the required subnets or IP ranges in the storage firewall settings to enable communication between the Databricks serverless compute plane and your storage account.&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;Recommendation Using an external table along with Unity Catalog is indeed a better approach in terms of centralizing metadata and managing access consistently. Ensure your Spark configurations, storage connectivity, and table definitions align with the recommendations above.&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;If required, further troubleshooting can be performed for connectivity/failures using utilities like &lt;CODE&gt;nslookup&lt;/CODE&gt; to verify private endpoint reachability, or by reviewing network access rules within NCC and Azure.&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;Hope this helps, Big Roux.&lt;/DIV&gt;</description>
      <pubDate>Wed, 30 Apr 2025 18:18:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/117191#M2036</guid>
      <dc:creator>Louis_Frolio</dc:creator>
      <dc:date>2025-04-30T18:18:02Z</dc:date>
    </item>
    <item>
      <title>Re: Data selection from adls2 in Serverless Warehouse</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/117521#M2040</link>
      <description>&lt;P&gt;Hi, thanks for the detailed explanation.&lt;BR /&gt;Unfortunately, configuring fs.azure.account.key in the Serverless advanced options didn’t help (I’m sure I wrote it correctly); I’m still receiving the same error.&lt;BR /&gt;Several sources online suggest creating an external location instead, so I’ll try that approach too.&lt;/P&gt;</description>
      <pubDate>Fri, 02 May 2025 13:44:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/117521#M2040</guid>
      <dc:creator>ogs</dc:creator>
      <dc:date>2025-05-02T13:44:10Z</dc:date>
    </item>
    <item>
      <title>Re: Data selection from adls2 in Serverless Warehouse</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/131822#M2239</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I am following these steps:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Navigate to the Databricks console.&lt;/LI&gt;&lt;LI&gt;Go to &lt;STRONG&gt;SQL&lt;/STRONG&gt; &amp;gt; &lt;STRONG&gt;SQL Warehouses&lt;/STRONG&gt;, then select the serverless SQL warehouse you're using.&lt;/LI&gt;&lt;LI&gt;In the &lt;STRONG&gt;Advanced Options&lt;/STRONG&gt;, add &lt;CODE&gt;fs.azure.account.key.&amp;lt;storage_account&amp;gt;.dfs.core.windows.net "&amp;lt;storage_account_key&amp;gt;"&lt;/CODE&gt; in the &lt;STRONG&gt;Spark Config&lt;/STRONG&gt; textbox, replacing &amp;lt;storage_account&amp;gt; with the Azure Storage account name and &amp;lt;storage_account_key&amp;gt; with the actual storage key.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;But I don't see any Spark Config textbox in the Advanced Options. Do I have to enable something in the admin settings?&lt;/P&gt;&lt;P&gt;** Trying to read a Delta table in a storage account and create another table using SQL Serverless &amp;amp; the SQL query editor **&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Fri, 12 Sep 2025 20:37:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/data-selection-from-adls2-in-serverless-warehouse/m-p/131822#M2239</guid>
      <dc:creator>kbaig8125</dc:creator>
      <dc:date>2025-09-12T20:37:01Z</dc:date>
    </item>
  </channel>
</rss>

