We are currently working on Databricks DLT tables to transform data from bronze to silver. We have been specifically instructed not to use mount paths to access data in ADLS Gen2. To comply, I configured a storage credential and created an external location, which lets us access the ADLS data from any notebook using ABFSS URLs; this setup works as expected.
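For reference, this is roughly what the working notebook access looks like. The container, account, and path names below are placeholders, not our real values:

```python
def abfss_url(container: str, account: str, path: str) -> str:
    """Build an ABFSS URI of the form abfss://<container>@<account>.dfs.core.windows.net/<path>."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path.lstrip('/')}"

# In a notebook, a read against the external location succeeds without any
# mount point, storage key, or SAS token, e.g.:
# df = spark.read.format("delta").load(abfss_url("bronze", "mystorageacct", "events"))
```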
Initially, we created the DLT tables using the Unity Catalog method without any issues. However, we later learned that Unity Catalog cannot be used in our production environment, so we switched to the Hive Metastore. Unfortunately, while configuring DLT with the Hive Metastore, we encountered an error asking us to configure a storage account key or token.
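For context, our Hive Metastore pipeline settings look roughly like this sketch (the pipeline name, target database, and storage path are illustrative); the error appears when the pipeline tries to write to the abfss storage location:

```json
{
  "name": "bronze-to-silver-dlt",
  "target": "silver_db",
  "storage": "abfss://dlt@<storage-account-name>.dfs.core.windows.net/pipelines"
}
```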
We are puzzled as to why our workspace can access the data without issues, yet the DLT Hive Metastore pipeline fails to do so. Given that we are strictly prohibited from creating mount points or configuring SAS tokens and storage account keys, any suggestions or solutions would be greatly appreciated.
Error:

shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.KeyProviderException: Failure to initialize configuration for storage account <storage-account-name>.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key
shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key