I have created an external table as shown below:
# create table
spark.sql(f"""
    CREATE EXTERNAL TABLE IF NOT EXISTS {database_schema}.{tableName}
    USING PARQUET
    OPTIONS (
        'path' '{raw_storage}/{folder_path}',
        'forward_spark_azure_storage_credentials' 'true'
    )
""")
Then I try to add a partition to the table:
spark.sql(f"""
ALTER TABLE {database_schema}.{tableName}
ADD IF NOT EXISTS PARTITION (local_date_part='{partition_date}')
LOCATION '{raw_pcmd_storage}/{folder_path}/local_date_part={partition_date}'
""")
Azure Databricks throws the exception below:
AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.KeyProviderException Failure to initialize configuration)
I have verified the abfss path and the OAuth configuration, and everything checks out; I am able to list the files at that location as well.
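For reference, the OAuth setup follows the standard ABFS service-principal pattern, roughly like this (the storage account, secret scope, and key names here are placeholders):

# sketch of the session OAuth configuration (placeholder names)
storage_account = "<storage-account>"
spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net",
               dbutils.secrets.get(scope="my-scope", key="sp-client-id"))
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net",
               dbutils.secrets.get(scope="my-scope", key="sp-client-secret"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# listing the same location works without errors
dbutils.fs.ls(f"{raw_storage}/{folder_path}")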
Any help would be appreciated.