Unable to open external table created under hive_metastore (Data view) in Azure

karthik_p
Esteemed Contributor

We have enabled Unity Catalog in Azure, and we have a requirement to create an external table in hive_metastore. We have configured ADLS Gen2 and, using the different access methods that ADLS Gen2 supports, we have created an external table. We are able to view data when we run a SELECT query through a notebook, but when we try to view the table under the hive_metastore catalog in Data, we can see the table name; when we select that table, we get the error below.

ADLS Gen2 access config:

# Service principal (OAuth) access to ADLS Gen2, set for the current Spark session
service_credential = dbutils.secrets.get(scope="<scope>", key="<service-credential-key>")

spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net", "<application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net", service_credential)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net", "https://login.microsoftonline.com/<directory-id>/oauth2/token")

"Failure to initialize configuration"

We have configured the data access configuration settings and added the service credential. We have also tried adding "spark.hadoop.fs.azure.account.key.hiveexternalstore.blob.core.windows.net <Key>", but we are still not able to view the table.
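For reference, one way to check how the table is registered, from a notebook or SQL warehouse (the table name is the one used in the query later in this thread):

-- Diagnostic sketch: inspect the table's provider and abfss location as hive_metastore sees it
DESCRIBE TABLE EXTENDED hive_metastore.test.hiveextmount;
SHOW CREATE TABLE hive_metastore.test.hiveextmount;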

3 REPLIES

Hubert-Dudek
Esteemed Contributor III

Hi, Unity Catalog and the Hive metastore are two different things. If the cluster is Unity Catalog enabled (and especially a shared one), it doesn't support the above configuration. The best approach may be to register a storage credential and an external location in Unity Catalog, and then register the external table.
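As a rough sketch of that approach (the storage credential is typically created first in Catalog Explorer using an Azure managed identity / access connector; the names adls_credential, hiveext_loc and data-team below are placeholders):

-- Register the abfss path as a Unity Catalog external location backed by an existing storage credential
CREATE EXTERNAL LOCATION IF NOT EXISTS hiveext_loc
  URL 'abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables'
  WITH (STORAGE CREDENTIAL adls_credential);

-- Grant the groups that need it access to the location
GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION hiveext_loc TO `data-team`;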

karthik_p
Esteemed Contributor

@Hubert Dudek Yes, you are right. When we enable UC and create an external table using an external location, we are able to view it using a SELECT statement from a notebook (we need to run the access-key code). But when we add the access key to the cluster or SQL warehouse and log in to Data --> hive_metastore, we are able to see the table; when we click on the table, we get the issue.

Please find the query below:

CREATE EXTERNAL TABLE test.hiveextmount
USING delta
LOCATION 'abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables'
AS SELECT * FROM csv.`abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables`;

As per Databricks, even with UC enabled, if we do not specify a catalog we should be able to create external/managed tables in hive_metastore, which is the default catalog. We are able to view the old tables that we created using mounts when we were on the wasbs scheme. After changing wasbs to abfss and creating the table with the above query, we started seeing the issue.

If we create the table under a new catalog with the three-level namespace, that works fine. But as a backup we are also checking hive_metastore, since UC has a lot of limitations.
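For reference, the variant that works looks roughly like this (main is a placeholder catalog name; the path is the one used above):

-- External table registered under a UC catalog with the three-level namespace
CREATE TABLE main.test.hiveextmount
USING delta
LOCATION 'abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables';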

Anonymous
Not applicable

@karthik p Unity Catalog is supported on DBR 11.2 and higher. If you attempt to create Hive external tables on these DBR versions, the storage credential will be used by default for authentication to the storage account. You can create a storage credential and an external location, then create the external table with the three-level namespace notation `hive_metastore`.`schema`.`table`. If you want to configure storage access using Spark configuration instead, use DBR 11.1 or lower for your use case.
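A minimal sketch of that suggestion, assuming an external location covering the path has already been registered (schema and table names are the ones used earlier in the thread):

-- Three-level-namespace notation with hive_metastore as the catalog; on DBR 11.2+
-- the Unity Catalog storage credential is used to authenticate to the storage account
CREATE TABLE hive_metastore.test.hiveextmount
USING delta
LOCATION 'abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables';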
