Data Engineering

Adding a partition location to an external table throws an exception

sribet
New Contributor

I have created an external table as shown below.

# create table
spark.sql(f"""
    CREATE EXTERNAL TABLE IF NOT EXISTS {database_schema}.{tableName}
    USING PARQUET
    OPTIONS
    (
        'path' '{raw_storage}/{folder_path}',
        'forward_spark_azure_storage_credentials' 'true'
    )
""")

Then I try to add a partition to the table:

spark.sql(f"""
     ALTER TABLE {database_schema}.{tableName}
     ADD IF NOT EXISTS PARTITION (local_date_part='{partition_date}')
     LOCATION '{raw_pcmd_storage}/{folder_path}/local_date_part={partition_date}'
""")

Azure Databricks throws the exception below:

AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.KeyProviderException Failure to initialize configuration)

I have verified the abfss path and the OAuth configuration, and everything looks good; I am able to list files from the same location as well, for example with the check below.
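A listing along these lines (using the same path variables as in the CREATE TABLE snippet above) succeeds:

# List the raw storage folder backing the table; this works, so the
# notebook session itself can authenticate to the storage account.
display(dbutils.fs.ls(f"{raw_storage}/{folder_path}"))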

Any help would be appreciated.

2 REPLIES

Anonymous
Not applicable

@sribet:

The exception (KeyProviderException: Failure to initialize configuration) indicates that the ABFS driver could not initialize credentials for the storage account. Being able to list files only proves that your Spark session can authenticate; metastore operations such as ALTER TABLE ... ADD PARTITION go through the Hive client, which reads the cluster-level Hadoop configuration and may not see settings made only in the notebook session. Here are some possible solutions:

  1. If the service principal's secret is stored in an Azure Key Vault-backed secret scope, check that the Key Vault access policies allow the Databricks workspace to read it.
  2. Ensure that the credentials Databricks uses to access the Azure storage account are valid and up to date.
  3. Set the standard ABFS OAuth properties explicitly (fs.azure.account.auth.type, fs.azure.account.oauth.provider.type, fs.azure.account.oauth2.client.id, fs.azure.account.oauth2.client.secret, and fs.azure.account.oauth2.client.endpoint for your storage account), and prefix them with spark.hadoop. in the cluster's Spark config so they are visible outside the notebook session; see the sketch after this list.
  4. Check the Databricks cluster driver logs for more detailed error messages or warnings that could provide additional insight into the issue.
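A minimal sketch of item 3 at session level; the account name, secret scope, and key names are placeholders you would replace with your own. For cluster-wide effect (including metastore operations), set the same keys with a spark.hadoop. prefix in the cluster's Spark config instead:

storage_account = "mystorageaccount"  # placeholder: your ADLS Gen2 account name
# Placeholders: a Databricks secret scope holding the service principal details
tenant_id = dbutils.secrets.get("my-scope", "tenant-id")
client_id = dbutils.secrets.get("my-scope", "sp-client-id")
client_secret = dbutils.secrets.get("my-scope", "sp-client-secret")

suffix = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{suffix}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{suffix}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{suffix}",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

If file listing works with these session-level settings but ALTER TABLE still fails, that points to the metastore client not seeing them, which is why moving the keys into the cluster's Spark config is worth trying first.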

I hope this helps! Let me know if you have any other questions.

Anonymous
Not applicable

Hi @sribet,

Thank you for your question! To help us assist you better, please take a moment to review the answer above and let us know whether it fits your needs.

Please help us select the best solution by clicking on "Select As Best" if it does.

Your feedback will help us ensure that we are providing the best possible service to you. Thank you!
