I'm new to Azure Databricks and I'm facing an issue when trying to create a schema or table that points to my Azure Storage account. I keep getting this error:
```
[EXTERNAL_METASTORE_CLIENT_ERROR.OPERATION_FAILED] Client operation failed: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.AbfsRestOperationException Operation failed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.", 403, HEAD, , https://<storage_acc_name>.dfs.core.windows.net/data/?upn=false&action=getAccessControl&timeout=90) SQLSTATE: 58000
```
Here's what I've done so far:
- I can successfully list files in my storage with `dbutils.fs.ls()` using my storage account key (roughly as in the config sketch below)
- I've tried granting access control roles on my storage account
- But when I run `CREATE SCHEMA` with a `LOCATION` pointing to my storage, it fails
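For context, this is roughly how I'm setting up account key access in my notebook (the secret scope and key names below are placeholders):

```python
# Sketch of my setup (placeholder names): set the storage account key on the
# Spark session so abfss:// paths authenticate with the account key.
storage_account = "<storage_acc_name>"
account_key = dbutils.secrets.get(scope="<my-scope>", key="<storage-account-key>")

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    account_key,
)
```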
```python
# This works:
dbutils.fs.ls("abfss://data@<storage_acc_name>.dfs.core.windows.net/")
# Returns: [FileInfo(path='abfss://.../hotel-weather/', ...)]

# This fails:
spark.sql("CREATE SCHEMA IF NOT EXISTS bronze LOCATION 'abfss://data@<storage_acc_name>.dfs.core.windows.net/bronze'")
```
It seems like Databricks can read from my storage, but cannot write/create schemas there. Has anyone faced this before? What permissions am I missing?