
Hive metastore CANNOT access storage

cobba16
Visitor

I'm new to Azure Databricks and I'm facing an issue when trying to create a schema or table that points to my Azure Storage account. I keep getting this error:

```
[EXTERNAL_METASTORE_CLIENT_ERROR.OPERATION_FAILED] Client operation failed: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.AbfsRestOperationException Operation failed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.", 403, HEAD, , https://s<storage_acc_name>dfs.core.windows.net/data/?upn=false&action=getAccessControl&timeout=90) SQLSTATE: 58000
```

Here's what I've done so far:

  1. I can successfully list files in my storage using dbutils.fs.ls() with my storage account key

  2. I've tried granting access control roles in my storage account

  3. But when I run CREATE SCHEMA with a LOCATION pointing to my storage, it fails


```python
# This works:
dbutils.fs.ls("abfss://data@<storage_acc_name>.dfs.core.windows.net/")
# Returns: [FileInfo(path='abfss://.../hotel-weather/', ...)]

# This fails:
spark.sql("CREATE SCHEMA IF NOT EXISTS bronze LOCATION 'abfss://data@<storage_acc_name>.dfs.core.windows.net/bronze'")
```

It seems like Databricks can read from my storage, but cannot write/create schemas there. Has anyone faced this before? What permissions am I missing?

What I've tried:

  • Added the Storage Blob Data Contributor role to various identities (the service-principal setup I have in mind is sketched after this list)

  • Verified my storage key is correct (since listing works)
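
For the role-based attempt, this is the kind of service-principal (OAuth) configuration I understand the Storage Blob Data Contributor role is meant to be used with. Every value below is a placeholder and I haven't confirmed this resolves the metastore error:

```python
# Hypothetical service-principal (OAuth) auth for ABFS; all values are placeholders.
# The Storage Blob Data Contributor role applies to the identity used here,
# unlike account-key auth, which does not go through Azure RBAC.
storage_account = "<storage_acc_name>"
configs = {
    f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net": "OAuth",
    f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net": "<application-id>",
    f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net":
        dbutils.secrets.get(scope="<my-scope>", key="<sp-client-secret>"),
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

for conf_key, conf_value in configs.items():
    spark.conf.set(conf_key, conf_value)
```

Do these (or the account key) need to go into the cluster's Spark config with the spark.hadoop. prefix, rather than being set at session level, for the metastore client to pick them up?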
