I'm receiving this error:

KeyProviderException: Failure to initialize configuration for storage account adlspersonal.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key

I used the Hive metastore to save my table:
%python
(
    spark.read.table("catalog.default.table")
    .write
    .option("path", "abfss://container@storage_acc.dfs.core.windows.net/dir1")
    .saveAsTable("annual_enterprise_survey")
)
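For reference, the property the error complains about (fs.azure.account.key) is the usual account-key setting for ABFS access, normally configured like this; the secret scope and key names below are only placeholders, not my actual setup:

%python
# Illustrative only: "my-scope" / "storage-account-key" are placeholder secret names
spark.conf.set(
    "fs.azure.account.key.storage_acc.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-account-key")
)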
But when I run:
%python
dbutils.fs.ls("abfss://container@storage_acc.dfs.core.windows.net/dir1")
it works perfectly and lists the directory without any error. Why is that?
Apart from this, saving tables in Unity Catalog also works fine, as the sketch below illustrates.
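For example, a plain UC write along these lines (the target catalog and schema names are placeholders for illustration) goes through without the error:

%python
# Illustrative UC managed-table write; no explicit abfss:// path, placeholder names
spark.read.table("catalog.default.table").write.saveAsTable("catalog.default.annual_enterprise_survey")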