Adding a partition location to an external table throws an exception
03-27-2023 10:55 PM
I have created an external table as shown below:
# create table
spark.sql(f"""
    CREATE EXTERNAL TABLE IF NOT EXISTS {database_schema}.{tableName}
    USING PARQUET
    OPTIONS (
        'path' '{raw_storage}/{folder_path}',
        'forward_spark_azure_storage_credentials' 'true'
    )
""")
Then I try to add a partition to the table:
spark.sql(f"""
    ALTER TABLE {database_schema}.{tableName}
    ADD IF NOT EXISTS PARTITION (local_date_part='{partition_date}')
    LOCATION '{raw_pcmd_storage}/{folder_path}/local_date_part={partition_date}'
""")
Azure Databricks throws the exception below:
AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.KeyProviderException Failure to initialize configuration)
I have verified the abfss path and the OAuth configuration; everything looks good, and I am able to list files as well.
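For reference, the listing check was roughly the following (a minimal sketch; {raw_storage} and {folder_path} are the same variables used above, in a Databricks notebook where spark and dbutils are predefined):

# Sanity check: listing the target location succeeds, so the session
# credentials can read the storage account.
display(dbutils.fs.ls(f"{raw_storage}/{folder_path}"))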
Any help would be appreciated.
Labels: Azure, External Table
04-02-2023 07:13 AM
@sri bet:
The error message suggests a problem with how the storage credentials are configured (for example, in Azure Key Vault) or with the authentication process. Here are some possible solutions:
- Check that the Azure Key Vault backing your secret scope is properly configured and accessible from the Databricks cluster. You may need to grant the appropriate access policies to the Databricks service principal in the Key Vault.
- Ensure that the credentials Databricks uses to access the Azure storage account are valid and up to date. You can verify this by listing the files in the storage account from Databricks.
- Try setting the ABFS OAuth configuration properties explicitly via spark.conf so the driver can authenticate against the storage account: fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net, fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net, fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net, and fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net (see the sketch after this list).
- Check the Databricks cluster logs for more detailed error messages or warnings that could provide additional insights into the issue.
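As a rough illustration of the spark.conf suggestion above, here is a minimal sketch, assuming a service principal whose credentials are kept in a Databricks secret scope. The storage account name, scope name, key names, and tenant ID lookup are hypothetical placeholders; substitute your own values.

# Minimal sketch: configure the ABFS driver for OAuth with a service
# principal. All names below (storage_account, "my-scope", the secret
# keys) are hypothetical placeholders, not values from this thread.
storage_account = "mystorageaccount"
tenant_id = dbutils.secrets.get(scope="my-scope", key="tenant-id")
client_id = dbutils.secrets.get(scope="my-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")

host = f"{storage_account}.dfs.core.windows.net"

# Use OAuth with the client-credentials token provider.
spark.conf.set(f"fs.azure.account.auth.type.{host}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{host}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{host}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{host}", client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{host}",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
)

If listing files works but ALTER TABLE ... ADD PARTITION still fails, it may also help to put the same properties in the cluster's Spark config rather than only in the notebook session, since partition DDL goes through the Hive metastore client, which may not see session-scoped settings.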
I hope this helps! Let me know if you have any other questions.
04-03-2023 11:35 PM
Hi @sri bet
Thank you for your question! Please take a moment to review the answer above and let us know whether it fits your needs.
If it does, please help us select the best solution by clicking "Select As Best".
Your feedback helps us ensure that we are providing the best possible service. Thank you!

