How to use Azure Data Lake as a storage location to store Delta Live Tables?
01-20-2022 06:32 AM
I am trying to write data into Azure Data Lake. I am reading files from Azure Blob Storage; however, when I try to create the Delta Live Table in Azure Data Lake, I get the following error:
shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key
I also tried to create the DLT in Azure Blob Storage, but it doesn't seem to recognize the container in the Azure Storage Account.
I just specify the storage location via the DLT UI; I'm not sure if there are any additional parameters that need to be configured to make it work.
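For reference, the setting the error is complaining about looks like the following. This is only a sketch with placeholder names (the storage account and the secret scope/key are mine to fill in); in a regular notebook it can be set on the session, and for a DLT pipeline the same key/value pair would go into the pipeline's Spark configuration.

```python
# Sketch only: supply the ADLS Gen2 account access key that the
# fs.azure.account.key error refers to. All angle-bracket names are
# placeholders, and the key is read from a Databricks secret scope.
spark.conf.set(
    "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
    dbutils.secrets.get(scope="<secret-scope>", key="<storage-account-key>"),
)
```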
- Labels: Azure, Azure Storage, Delta

01-20-2022 08:40 AM
Hi there,
My name is Piper, and I'm a moderator for Databricks. Welcome to the community! Thank you for your question. Let's give your peers a chance to respond before we circle back to this.
Thank you for your patience!
10-05-2022 01:29 PM
@Kaniz Fatma I don't think you quite understand the question. I'm running into the same problem. When creating a Delta Live Table pipeline to write to Azure Data Lake Storage (abfss://etc...) as the Storage Location, the pipeline fails with the error @Shikha Mathew mentioned in the original post.
Typically, when you're creating a cluster to run a regular notebook, you have to set the various OAuth configurations in the cluster's advanced settings. I've tried adding those same configurations to the Delta Live Table pipeline, but I get the same error.
Also, we're not using the fs.azure.account.key settings since we're using OAuth authentication to ADLS.
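For context, these are the per-account OAuth settings I mean, shown as `spark.conf.set` calls for illustration; in practice they go into the cluster's (or pipeline's) Spark config as key/value pairs, and every angle-bracket value is a placeholder:

```python
# Sketch of the standard ADLS Gen2 OAuth (service principal) configuration.
# <storage-account>, <application-id>, <tenant-id>, and the secret scope/key
# are placeholders for our own values.
account = "<storage-account>.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{account}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{account}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{account}", "<application-id>")
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{account}",
    dbutils.secrets.get(scope="<secret-scope>", key="<client-secret-key>"),
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{account}",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)
```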
Any suggestions?
11-24-2022 05:45 PM
Same issue using an abfss://... path as the DLT pipeline storage location. @Robert Thornton, I mounted the abfss path as '/mnt/data' using the same service principal and secret I used when I configured OAuth. I then changed my DLT pipeline's storage location to use /mnt/data/..path.. and it worked.
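In case it helps anyone, here is a minimal sketch of the mount I described, using the standard OAuth mount pattern; the container, storage account, application ID, tenant ID, and secret scope/key are placeholders for your own values:

```python
# Sketch: mount the ADLS Gen2 container with the same service principal used
# for OAuth, then point the DLT pipeline's storage location at /mnt/data/...
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<secret-scope>", key="<client-secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/data",
    extra_configs=configs,
)
```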

