<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Access for delta lake with serverless in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110589#M43612</link>
    <description>&lt;P&gt;I have an issue when trying to use the command&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;display(dbutils.fs.ls("abfss://test@test.dfs.core.windows.net")). When I execute the command on my &lt;STRONG&gt;personal cluster&lt;/STRONG&gt;, it works, and I can see the files. Before that, I set the following configurations:&lt;/P&gt;&lt;PRE&gt;spark.conf.set("fs.azure.account.auth.type.test.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.test.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.test.dfs.core.windows.net", client_id)
spark.conf.set("fs.azure.account.oauth2.client.secret.test.dfs.core.windows.net", client_secret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.test.dfs.core.windows.net", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")&lt;/PRE&gt;&lt;P&gt;However, when I perform the same action on a &lt;STRONG&gt;serverless&lt;/STRONG&gt; environment, I get the following error:&lt;/P&gt;&lt;PRE&gt;Configuration fs.azure.account.auth.type.test.dfs.core.windows.net is not available. SQLSTATE: 42K0I&lt;/PRE&gt;&lt;P&gt;How can I access files stored in Data Lake with serverless?&lt;/P&gt;&lt;P&gt;Thank you.&lt;/P&gt;</description>
    <pubDate>Wed, 19 Feb 2025 11:43:11 GMT</pubDate>
    <dc:creator>DataEnginerrOO1</dc:creator>
    <dc:date>2025-02-19T11:43:11Z</dc:date>
    <item>
      <title>Access for delta lake with serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110589#M43612</link>
      <description>&lt;P&gt;I have an issue when trying to use the command&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;display(dbutils.fs.ls("abfss://test@test.dfs.core.windows.net")). When I execute the command on my &lt;STRONG&gt;personal cluster&lt;/STRONG&gt;, it works, and I can see the files. Before that, I set the following configurations:&lt;/P&gt;&lt;PRE&gt;spark.conf.set("fs.azure.account.auth.type.test.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.test.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.test.dfs.core.windows.net", client_id)
spark.conf.set("fs.azure.account.oauth2.client.secret.test.dfs.core.windows.net", client_secret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.test.dfs.core.windows.net", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")&lt;/PRE&gt;&lt;P&gt;However, when I perform the same action on a &lt;STRONG&gt;serverless&lt;/STRONG&gt; environment, I get the following error:&lt;/P&gt;&lt;PRE&gt;Configuration fs.azure.account.auth.type.test.dfs.core.windows.net is not available. SQLSTATE: 42K0I&lt;/PRE&gt;&lt;P&gt;How can I access files stored in Data Lake with serverless?&lt;/P&gt;&lt;P&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Wed, 19 Feb 2025 11:43:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110589#M43612</guid>
      <dc:creator>DataEnginerrOO1</dc:creator>
      <dc:date>2025-02-19T11:43:11Z</dc:date>
    </item>
    <item>
      <title>Re: Access for delta lake with serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110600#M43616</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/149889"&gt;@DataEnginerrOO1&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Could you try setting these configurations at the notebook level? Also make sure the variable values are correct:&lt;/P&gt;
&lt;P class="p1"&gt;service_credential = dbutils.secrets.get(scope="&amp;lt;secret-scope&amp;gt;",key="&amp;lt;service-credential-key&amp;gt;")&lt;/P&gt;
&lt;P class="p1"&gt;spark.conf.set("fs.azure.account.auth.type.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "OAuth")&lt;/P&gt;
&lt;P class="p1"&gt;spark.conf.set("fs.azure.account.oauth.provider.type.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")&lt;/P&gt;
&lt;P class="p1"&gt;spark.conf.set("fs.azure.account.oauth2.client.id.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "&amp;lt;application-id&amp;gt;")&lt;/P&gt;
&lt;P class="p1"&gt;spark.conf.set("fs.azure.account.oauth2.client.secret.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", service_credential)&lt;/P&gt;
&lt;P class="p1"&gt;spark.conf.set("fs.azure.account.oauth2.client.endpoint.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "&lt;A href="https://login.microsoftonline.com/" target="_blank"&gt;https://login.microsoftonline.com/&lt;/A&gt;&amp;lt;directory-id&amp;gt;/oauth2/token")&lt;/P&gt;
&lt;P class="p1"&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage" target="_blank"&gt;https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage&lt;/A&gt;&lt;/P&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;&lt;SPAN&gt;&lt;STRONG&gt;Set Spark Properties in SQL Warehouse Settings:&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL class="_1t7bu9h7 _1t7bu9h2"&gt;
&lt;LI&gt;&lt;SPAN&gt;Instead of setting them directly in your notebook, configure these properties in the SQL warehouse settings if you are working in Databricks SQL, or adjust the serverless cluster configuration.&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Wed, 19 Feb 2025 12:39:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110600#M43616</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2025-02-19T12:39:34Z</dc:date>
    </item>
    <item>
      <title>Re: Access for delta lake with serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110602#M43618</link>
      <description>&lt;P&gt;Can your serverless compute access any other storage in that storage account? Something else to check is whether your network connectivity configuration (NCC) is set up correctly:&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-private-link" target="_blank"&gt;Configure private connectivity from serverless compute - Azure Databricks | Microsoft Learn&lt;/A&gt;. However, if your serverless compute can access other storage in that account, the NCC is probably working.&lt;/P&gt;</description>
      <pubDate>Wed, 19 Feb 2025 13:44:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110602#M43618</guid>
      <dc:creator>Rjdudley</dc:creator>
      <dc:date>2025-02-19T13:44:52Z</dc:date>
    </item>
    <item>
      <title>Re: Access for delta lake with serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110636#M43629</link>
      <description>&lt;P&gt;Yes, I can access tables that are stored in the catalog on the same data lake.&lt;/P&gt;</description>
      <pubDate>Wed, 19 Feb 2025 16:35:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110636#M43629</guid>
      <dc:creator>DataEnginerrOO1</dc:creator>
      <dc:date>2025-02-19T16:35:27Z</dc:date>
    </item>
    <item>
      <title>Re: Access for delta lake with serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110637#M43630</link>
      <description>&lt;P&gt;Are these settings specific to serverless? Because, as I said, the configurations above work on a personal cluster.&lt;/P&gt;</description>
      <pubDate>Wed, 19 Feb 2025 16:37:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110637#M43630</guid>
      <dc:creator>DataEnginerrOO1</dc:creator>
      <dc:date>2025-02-19T16:37:49Z</dc:date>
    </item>
    <item>
      <title>Re: Access for delta lake with serverless</title>
      <link>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110752#M43673</link>
      <description>&lt;P&gt;&amp;gt;&amp;nbsp;&lt;SPAN&gt;Are these settings specific to serverless? Because, as I said, the configurations above work on a personal cluster.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Yes. Personal compute uses classic compute, which runs in your Azure subscription. Serverless runs in Databricks' cloud estate and needs special permissions; see &lt;A href="https://docs.databricks.com/aws/en/security/network/#secure-network-connectivity" target="_blank"&gt;https://docs.databricks.com/aws/en/security/network/#secure-network-connectivity&lt;/A&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 20 Feb 2025 14:13:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/access-for-delta-lake-with-serverless/m-p/110752#M43673</guid>
      <dc:creator>Rjdudley</dc:creator>
      <dc:date>2025-02-20T14:13:46Z</dc:date>
    </item>
  </channel>
</rss>

