02-19-2025 03:39 AM - edited 02-19-2025 03:43 AM
I have an issue when trying to use the command display(dbutils.fs.ls("abfss://test@test.dfs.core.windows.net")). When I execute the command on my personal cluster, it works, and I can see the files. Before that, I set the following configurations:
spark.conf.set("fs.azure.account.auth.type.test.dfs.core.windows.net", "OAuth") spark.conf.set("fs.azure.account.oauth.provider.type.test.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider") spark.conf.set("fs.azure.account.oauth2.client.id.test.dfs.core.windows.net", client_id) spark.conf.set("fs.azure.account.oauth2.client.secret.test.dfs.core.windows.net", client_secret) spark.conf.set("fs.azure.account.oauth2.client.endpoint.test.dfs.core.windows.net", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")
However, when I perform the same action on a serverless environment, I get the following error:
Configuration fs.azure.account.auth.type.test.dfs.core.windows.net is not available. SQLSTATE: 42K0I
How can I access files stored in Data Lake with serverless?
Thank you.
02-19-2025 04:39 AM
Hi @DataEnginerrOO1,
Can you try with these configurations set at the notebook level? Also make sure the variable values are correct:
service_credential = dbutils.secrets.get(scope="<secret-scope>",key="<service-credential-key>")
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net", "<application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net", service_credential)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net", "https://login.microsoftonline.com/<directory-id>/oauth2/token")
https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage
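Once those properties are set in the same notebook session, you can verify access with the same listing call from your question. The container and storage account names below are placeholders to replace with your own values:

# Placeholders: replace <container> and <storage-account> with your values
display(dbutils.fs.ls("abfss://<container>@<storage-account>.dfs.core.windows.net/"))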
Alternatively, you can set the Spark properties in the SQL warehouse settings:
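For reference, a data access configuration in the SQL warehouse settings would look roughly like the sketch below. The placeholders and the secret scope/key names are assumptions to adapt to your workspace; the secret is referenced with the {{secrets/...}} syntax rather than pasted in plain text:

fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<secret-scope>/<service-credential-key>}}
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<directory-id>/oauth2/token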
02-19-2025 08:37 AM
Are these settings specific to serverless? Because, as I said, the configurations above work on my personal cluster.
02-20-2025 06:13 AM
> Are these settings specific to serverless? Because, as I said, the configurations above work on my personal cluster.
Yes, personal compute uses classic compute, which runs in your Azure subscription. Serverless runs in Databricks' cloud estate and needs special permissions; see https://docs.databricks.com/aws/en/security/network/#secure-network-connectivity
02-19-2025 05:44 AM
Can your serverless compute access any other storage in that storage account? Something else to check is whether your NCC (network connectivity configuration) is set up correctly: Configure private connectivity from serverless compute - Azure Databricks | Microsoft Learn. However, if your serverless compute can access other storage in that account, the NCC is probably working.
02-19-2025 08:35 AM
Yes, I can access tables that are stored in the catalog on the same data lake.