Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Hive metastore external tables and service principals

mysura
Visitor

We are using the mount-point approach (mount points created with a service principal) to connect to the storage account, and the same mount points to create external tables in the Hive metastore. Now we are moving to a direct service-principal setup, so we need to change the tables' external locations from the dbfs:/mnt/... paths to the abfss:// protocol (the physical location stays the same; only the access method changes to an abfss URL). I can alter the existing tables' locations and queries work fine, but the tables will not open in Catalog Explorer.
I also tried dropping the tables and recreating them with the new abfss location, but they are still inaccessible in Catalog Explorer.
The service-principal setup itself is running fine: I can load data from the storage account as well as from the tables in the Hive metastore.
*The cluster is not Unity Catalog enabled, and it does not need to be.
Code:

1. Altering the existing table location:

spark.sql("""
    ALTER TABLE schema.table
    SET LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/foldername/filename.delta'
""")

2. Creating the external table:

CREATE EXTERNAL TABLE hive_metastore.schema.table
USING DELTA
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/foldername/filename.delta'

Error:
Failure to initialize configuration for storage account stalyceprdevbdls001.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key

1 REPLY

Alberto_Umana
Databricks Employee

Hi @mysura,

The error message indicates an invalid configuration value for fs.azure.account.key. This typically means the cluster has no valid authentication configured for that storage account, so the ABFS driver falls back to account-key authentication and fails.

Since your cluster is not Unity Catalog enabled, ensure that the cluster configuration includes the necessary settings for accessing ADLS. This includes setting the appropriate Spark configurations for OAuth.
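On a non-UC cluster, direct service-principal access is configured through the hadoop-azure ABFS OAuth settings. A minimal sketch of the well-known configuration keys (the storage account name, client ID, secret, and tenant ID below are placeholders; in practice the secret should come from a secret scope via dbutils.secrets.get, and on an all-purpose cluster these keys can also go in the cluster's Spark config so every session picks them up):

```python
def abfss_oauth_confs(storage_account: str,
                      client_id: str,
                      client_secret: str,
                      tenant_id: str) -> dict:
    """Build the hadoop-azure ABFS OAuth settings for a service principal."""
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        # Switch the account from key-based auth to OAuth
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        # Service-principal credentials (placeholders)
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a notebook you would then apply these to the session, e.g.:
# for k, v in abfss_oauth_confs("mystorage", app_id, secret, tenant).items():
#     spark.conf.set(k, v)
```

With these set, both direct spark.read loads and the Hive metastore tables pointing at abfss:// locations resolve through the service principal instead of an account key.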

Verify that the service principal has the necessary permissions to access the storage account. The service principal should have at least the Storage Blob Data Contributor role assigned at the storage account level.
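If the role is missing, it can be granted with the Azure CLI. A sketch with placeholder values (the subscription, resource group, account, and application IDs are illustrative; substitute your own):

```shell
# Application (client) ID of the service principal -- placeholder
APP_ID="<application-client-id>"

# Scope at the storage-account level -- placeholder path segments
SCOPE="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"

# Grant the data-plane role the ABFS driver needs for read/write access
az role assignment create \
  --assignee "$APP_ID" \
  --role "Storage Blob Data Contributor" \
  --scope "$SCOPE"
```

Note that role assignments can take a few minutes to propagate before the cluster can authenticate successfully.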
