Dear Databricks Community Members:
The symptom:
The DLT pipeline failed with the error message:
Failure to initialize configuration for storage account storageaccount.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key Invalid configuration value detected for fs.azure.account.key
The context:
An external location was created and tested for the above storage path in Unity Catalog, and the workspace is enabled for Unity Catalog.
A notebook that lists or loads a CSV file from that location runs successfully. I also noticed that the cluster running the notebook must have Unity Catalog enabled.
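For reference, the notebook test was roughly the following (the container, storage account, and file names here are the same placeholders used in the DLT code below):

# Run on a Unity Catalog-enabled cluster; access is granted through the external location.
path = "abfss://container@storageaccount.dfs.core.windows.net/test.csv"

# Listing the parent directory works.
display(dbutils.fs.ls("abfss://container@storageaccount.dfs.core.windows.net/"))

# Reading the CSV also works.
df = spark.read.csv(path, header=True, inferSchema=True)
display(df)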
The DLT code looks like this:
import dlt

@dlt.table(name="test_data")
def test_data():
    path = "abfss://container@storageaccount.dfs.core.windows.net/test.csv"
    return (
        spark.read.csv(path, header=True, inferSchema=True)
    )
The same code runs successfully in a notebook.
However, it fails in the DLT pipeline with the account key configuration error shown above.
Please advise if there is anything I am missing.
Thanks.