06-26-2025 01:36 PM
I am new to Databricks, and I am using the Free Edition. I have tried

```python
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", dbutils.secrets.get(scope="<scope>", key="<sas-token-key>"))
```

and

```python
spark.conf.set(
    "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope>", key="<storage-account-access-key>"))
```

but I keep getting the same error:

```
[CONFIG_NOT_AVAILABLE] Configuration fs.azure.account.key.ygustorage.dfs.core.windows.net is not available. SQLSTATE: 42K0I
```
06-27-2025 04:07 AM
Hi @yugz, this looks similar to an issue discussed in another thread (https://community.databricks.com/t5/data-engineering/access-adls-with-serverless-config-not-availabl...), could you try the solution mentioned there and see if it helps?
06-27-2025 07:03 AM
Hi @yugz ,
This is a deprecated approach to configuring storage in Databricks. If you're using the Databricks Free Edition, your workspace includes serverless compute and default storage, so you can immediately start exploring and building. The point is: you don't have to configure storage, because it has already been configured for you.
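To illustrate what "already configured" means in practice (a hypothetical sketch, not a confirmed part of your workspace): on the Free Edition your data typically lives in Unity Catalog managed tables, so you reference it by a three-level name instead of a storage path. The catalog, schema, and table names below are placeholders.

```python
# Sketch: querying default storage via a Unity Catalog table name.
# 'workspace', 'default', and 'my_table' are placeholder names.
def table_ref(catalog: str, schema: str, table: str) -> str:
    """Build the three-level name Unity Catalog expects in a query."""
    return f"{catalog}.{schema}.{table}"

query = f"SELECT * FROM {table_ref('workspace', 'default', 'my_table')} LIMIT 10"
# In a Databricks notebook you would then run:
#   display(spark.sql(query))
print(query)
```

No `spark.conf.set` calls are needed for tables stored this way.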
06-27-2025 08:01 AM
Hello @szymon_dybczak ,
Thanks for addressing my concern. I have only just been introduced to Databricks. How can I access the data I have in my bronze folder on ADLS?
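For context, ADLS Gen2 paths in Spark use the `abfss://` scheme. A small helper to build one (a sketch; the container and storage-account names below are placeholders for yours):

```python
def abfss_uri(container: str, storage_account: str, path: str = "") -> str:
    """Build an abfss:// URI for a path inside an ADLS Gen2 container."""
    base = f"abfss://{container}@{storage_account}.dfs.core.windows.net"
    return f"{base}/{path.lstrip('/')}" if path else base

# e.g. a bronze folder (placeholder names):
bronze = abfss_uri("data", "mystorageacct", "bronze")
# Once access is configured, a notebook could read it with:
#   spark.read.format("delta").load(bronze)
print(bronze)
```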
06-27-2025 08:27 AM - edited 06-27-2025 08:36 AM
Hi,
I don't know whether you can create an external location in Databricks Free Edition, but this would be my approach (assuming you want to bring in your own storage account):
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-external-locations
Here's a really good video guide that explains those concepts and shows how to configure it:
06-27-2025 11:07 AM
The only options here are for AWS or Cloudflare to create credentials
06-27-2025 08:15 AM
This is the old way of accessing a data lake. With the Free Edition and serverless compute it is not supported. Try creating a storage credential and an external location pointing to the data lake instead.
https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-external-locations
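As a sketch of the DDL that approach involves (every name below is a placeholder, and the credential must already exist in your workspace):

```python
# Hypothetical sketch: the CREATE EXTERNAL LOCATION statement you would run.
# 'my_adls_location', 'mystorageacct', and 'my_azure_credential' are placeholders.
ddl = """
CREATE EXTERNAL LOCATION IF NOT EXISTS my_adls_location
URL 'abfss://data@mystorageacct.dfs.core.windows.net/bronze'
WITH (STORAGE CREDENTIAL my_azure_credential)
""".strip()

# In a Databricks notebook you would run: spark.sql(ddl)
print(ddl)
```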
06-27-2025 12:55 PM
06-27-2025 01:00 PM
All I see are options for AWS and Cloudflare, and nothing for Azure cloud. I am using Azure cloud.