Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks DLT with Hive Metastore and ADLS Access Issues

yvishal519
Contributor

We are currently working with Databricks DLT tables to transform data from bronze to silver. We have been specifically instructed not to use mount paths for accessing data from ADLS Gen2. To comply, I configured a storage credential and created an external location, which allows us to access data in ADLS from any notebook using ABFSS URLs. This setup works as expected.
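To illustrate, here is a minimal example of the kind of direct access that works for us in notebooks (storage account, container, and path names below are placeholders, not our real values):

# Direct read over an ABFSS URL; access is granted through the Unity Catalog
# storage credential / external location, so no account key, SAS token, or
# mount point is involved. All names below are placeholders.
df = spark.read.format("parquet").load(
    "abfss://<container>@<storage-account>.dfs.core.windows.net/bronze/orders/"
)
display(df)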

Initially, we created DLT tables using the Unity Catalog method without any issues. However, we later learned that Unity Catalog cannot be used in production, prompting us to switch to the Hive Metastore. Unfortunately, while configuring DLT with Hive Metastore, we encountered an error asking us to configure the storage account key and token.
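For context, the Unity Catalog pipeline used straightforward DLT definitions along these lines (a simplified sketch; table names, file format, and paths are placeholders):

import dlt
from pyspark.sql import functions as F

# Bronze: ingest raw files directly from ADLS over ABFSS (placeholder path).
@dlt.table(name="bronze_orders")
def bronze_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "parquet")
        .load("abfss://<container>@<storage-account>.dfs.core.windows.net/raw/orders/")
    )

# Silver: basic cleanup of the bronze table.
@dlt.table(name="silver_orders")
def silver_orders():
    return (
        dlt.read_stream("bronze_orders")
        .filter(F.col("order_id").isNotNull())
        .withColumn("ingested_at", F.current_timestamp())
    )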

We are puzzled as to why our workspace can access data without issues, yet the DLT Hive Metastore approach is failing to do so. Given that we are strictly prohibited from creating mount points or configuring SAS tokens and storage keys, any suggestions or solutions would be greatly appreciated.

Error:

shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.KeyProviderException: Failure to initialize configuration for storage account <storage-account-name>.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key
shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key
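From what we understand, this error is the ABFS driver looking for storage credentials in the pipeline's Spark configuration. For illustration only, the kind of settings it appears to be asking for look like this (the account-key option is exactly what we are prohibited from using; the service-principal values are placeholders, not our actual setup):

# What the error refers to directly: an account key (not allowed in our case).
# spark.conf.set("fs.azure.account.key.<storage-account>.dfs.core.windows.net", "<account-key>")

# Service-principal (OAuth) configuration for ABFS; every value is a placeholder.
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "OAuth")
spark.conf.set(
    "fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net", "<application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net", "<client-secret>")
spark.conf.set(
    "fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)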

 


1 ACCEPTED SOLUTION


szymon_dybczak
Contributor III

Hi @yvishal519,

Since you're using the Hive Metastore, you have no option other than mount points. Storage credentials and external locations are only supported in Unity Catalog.
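For reference, a minimal sketch of the mount-point approach (all identifiers and the secret scope are placeholders; it authenticates with a service principal via OAuth rather than an account key or SAS token):

# Mount ADLS Gen2 with a service principal; placeholders throughout.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<secret-scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)

The mounted path (/mnt/<mount-name>) can then be referenced from the DLT pipeline running against the Hive Metastore.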


2 REPLIES

yvishal519
Contributor

Thank you for your quick responses! Could you please share any documentation that mentions that storage credentials and external locations are only supported in Unity Catalog?
