Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Publish data to Azure ADLS Gen2 from Delta Live Tables pipelines?


I would like to publish data from a Delta Live Tables (DLT) pipeline to Azure ADLS Gen2 storage. I checked the documentation but cannot get it to work. Any insights on how to do so?

I get the error "Invalid configuration value detected for location". I've tried to specify the "Storage location" with many combinations of abfs:// and also abfss://, without any success.

So far I have only succeeded in writing to hive_metastore and DBFS. But I want to write to an external location (ADLS Gen2) so the tables can be referenced by Unity Catalog.


Honored Contributor II

If you are using the abfs protocol, you need some additional configuration to read files over it. So first mount that location in your Databricks workspace, then you can use the abfs protocol, I guess.

Rishabh Pandey

I can write to and read from ADLS Gen2 using both the abfs and abfss protocols from regular notebooks. But for some reason it is not working for DLT.

Do you know if there is a need to mount ADLS Gen2 to make it work specifically with DLT?
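For context, here is a minimal sketch of the per-account Spark settings that usually make abfss:// paths work in a regular notebook with service-principal OAuth (the helper name and placeholder values are mine, not from the original post; on Databricks you would apply each pair with `spark.conf.set`):

```python
# Build the standard Hadoop ABFS OAuth configuration keys for one storage
# account. The key names are the documented fs.azure.* settings; all argument
# values below are illustrative placeholders.
def abfss_oauth_confs(storage_account: str, client_id: str,
                      client_secret: str, directory_id: str) -> dict:
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{directory_id}/oauth2/token",
    }

confs = abfss_oauth_confs("mystorageaccount", "<app-id>", "<secret>", "<tenant-id>")
# In a notebook you would then do:
#   for k, v in confs.items():
#       spark.conf.set(k, v)
```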

New Contributor III

Mounting an ADLS container is no longer recommended. If you want DLT to materialize your data in ADLS, you need to do two things:

  1. In the DLT pipeline settings, configure ADLS credentials using either a SAS token or a service principal. See here for more details. If you are using a SAS token, we also recommend using Secrets to store the key. See here for a concrete example.
  2. Specify the ADLS location as your DLT pipeline storage path, e.g. abfss://<container-name>@<storage-account-name>
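As a side note, a fully qualified ABFS URI also carries the `.dfs.core.windows.net` host suffix. A tiny sketch of assembling such a storage path (the helper name is mine):

```python
# Hypothetical helper to assemble an abfss:// storage path from its parts;
# .dfs.core.windows.net is the standard ADLS Gen2 endpoint suffix.
def abfss_path(container: str, storage_account: str, path: str = "") -> str:
    base = f"abfss://{container}@{storage_account}.dfs.core.windows.net"
    return f"{base}/{path.lstrip('/')}" if path else base

print(abfss_path("dlt", "mystorage", "pipelines/retail"))
# abfss://dlt@mystorage.dfs.core.windows.net/pipelines/retail
```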

Esteemed Contributor III

In the DLT pipeline settings you need to go to Compute -> Advanced and add the configuration (of course, each key needs the value required in your case):

fs.azure.account.auth.type OAuth
fs.azure.account.oauth.provider.type org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id {{secrets/scope/databricks-clientid}}
fs.azure.account.oauth2.client.secret {{secrets/scope/databricks-secret}}
fs.azure.account.oauth2.client.endpoint https://login.microsoftonline.com/<YOUR_APP_ID>/oauth2/token

If you still have problems, you can prefix all keys with spark_conf.spark.hadoop.
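The prefixing trick above can be sketched as a one-liner (the helper name is mine; the key in the example is one of the standard ABFS settings):

```python
# Add a prefix to every configuration key so the settings reach the Hadoop
# filesystem layer when placed in a DLT pipeline's "configuration" map.
def with_prefix(confs: dict, prefix: str = "spark_conf.spark.hadoop.") -> dict:
    return {prefix + k: v for k, v in confs.items()}

base = {"fs.azure.account.auth.type": "OAuth"}
print(with_prefix(base))
# {'spark_conf.spark.hadoop.fs.azure.account.auth.type': 'OAuth'}
```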


Great, will try this. Thanks a lot!


Finally got it to work and here's what we did:


  • Service Principal created
  • Secret created in Azure KeyVault
  • Scope & secret created in Databricks (append "#secrets/createScope" to the workspace URL to get the Databricks page where you create secret scopes)


  • Only one DLT pipeline per external storage location
  • The keys, related to ADLS access, are prefixed with "spark.hadoop."
  • You won't get lineage in Unity Catalog when using DLT; however, you can see the lineage in DLT instead.

DLT Pipeline settings - JSON (example of what worked; substitute your own parameter values):

```json
{
    "clusters": [
        {
            "label": "default",
            "autoscale": {
                "min_workers": 1,
                "max_workers": 5,
                "mode": "ENHANCED"
            }
        },
        {
            "label": "maintenance"
        }
    ],
    "development": true,
    "continuous": false,
    "channel": "PREVIEW",
    "edition": "ADVANCED",
    "photon": false,
    "libraries": [
        {
            "notebook": {
                "path": "/Repos/repo_folder/delta-live-tables-notebooks/sql/Retail Sales"
            }
        }
    ],
    "name": "Retail Sales SQL External Tables",
    "storage": "abfss://<container>@<azure-storage-account>",
    "configuration": {
        "spark.hadoop.fs.azure.account.auth.type": "OAuth",
        "spark.hadoop.fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "spark.hadoop.fs.azure.account.oauth2.client.id": "<application-id>",
        "spark.hadoop.fs.azure.account.oauth2.client.secret": "{{secrets/<scope-name-in-databricks>/<secret-name-in-azure-keyvault>}}",
        "spark.hadoop.fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<directory-id>/oauth2/token"
    }
}
```
Unity Catalog - Register a new External Table

```sql
CREATE TABLE <catalog>.<schema>.<table-name>
LOCATION 'abfss://<container-name>@<storage-account-name>';
```

Summary - How to get DLT & Unity Catalog to Work Together

This way you can get Delta Live Tables (DLT) to work together with Unity Catalog by referring to external tables.

New Contributor III

Thanks for the write-up. It worked for me. Question: in my case, I let DLT write its tables to an abfss://data@... storage path using your config pattern above. It successfully writes the tables, but when I view them in hive_metastore in Data Explorer, I get the fs configuration error. If I manually run the OAuth routine to connect to that account, it connects, but I still can't view the data in Data Explorer. However, I can create an external table in Unity just fine (as it uses the Unity storage credentials).

New Contributor III

I am trying to load from ADLS with DLT. After adding the Spark configurations I got the error below:

org.apache.spark.sql.streaming.StreamingQueryException: [STREAM_FAILED] Query [id = 818323fc-80d5-4833-9f46-7d1afc9c5bf7, runId = 722e9aac-0fdd-4206-9d49-683bb151f0bf] terminated with exception: The container in the file event `{"backfill":{"bucket":"root@dbstoragelhdp7mflfxe2y","key":"5810201264315799/Data/Temp/test_account.csv","size":1801,"eventTime":1682522202000,"newerThan$default$2":false}}` is different from expected by the source: `my_container@my_storageaccount`.

Any suggestions?
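One way to read that error: the container named inside the file event (`root@dbstoragelhdp7mflfxe2y`, the workspace's DBFS root) is not the container the stream now expects, which usually indicates the streaming checkpoint was created against a different source location; a full refresh (fresh checkpoint) typically clears it. A small sketch that parses the event JSON from the message to show the mismatch (variable names are mine):

```python
import json

# Event JSON copied from the error message above; parse it and compare the
# bucket it names against the container@account the source now expects.
event = json.loads(
    '{"backfill":{"bucket":"root@dbstoragelhdp7mflfxe2y",'
    '"key":"5810201264315799/Data/Temp/test_account.csv",'
    '"size":1801,"eventTime":1682522202000,"newerThan$default$2":false}}'
)
actual = event["backfill"]["bucket"]       # container@account seen in the event
expected = "my_container@my_storageaccount"
print(actual == expected)  # False: the event points at a different container
```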

