Option 1: Use an Azure Service Principal with ABFS OAuth Authentication (Recommended for Production)
1. Register a service principal in Azure and grant it access to the storage account (for example, the Storage Blob Data Contributor role)
2. Mount using OAuth credentials
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get(scope="<scope>", key="<client-id-key>"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope>", key="<client-secret-key>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)
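The source URI and mount point above follow a fixed pattern, so a small helper can build them from the container, storage account, and mount name. This is a hypothetical convenience function, not part of dbutils; the names `abfss_source` and `dbfs_mount_point` are my own:

```python
def abfss_source(container: str, storage_account: str) -> str:
    """Build the abfss:// URI for a container in an ADLS Gen2 account."""
    return f"abfss://{container}@{storage_account}.dfs.core.windows.net/"


def dbfs_mount_point(name: str) -> str:
    """Build the DBFS path where the container will be mounted."""
    return f"/mnt/{name}"


# Example: abfss_source("raw", "mydatalake")
# -> "abfss://raw@mydatalake.dfs.core.windows.net/"
```

These helpers just make the placeholder substitution explicit; you can pass their results directly as the `source` and `mount_point` arguments to `dbutils.fs.mount`.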
Note: This works on serverless clusters and avoids using account keys, which are less secure.
Supported in serverless:
- OAuth (service principal) - Yes (recommended)
- dbutils.secrets - Yes

Not supported in serverless:
- spark.conf.set for sensitive keys - No
- Environment variables - No
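The supported pattern above can be sketched as a helper that assembles the OAuth config dict. The secret values are passed in as already-resolved strings (in a notebook you would obtain them via `dbutils.secrets.get`, as shown earlier) so that nothing sensitive ever goes through `spark.conf.set`. The function name `oauth_mount_configs` is my own:

```python
def oauth_mount_configs(client_id: str, client_secret: str, tenant_id: str) -> dict:
    """Build the extra_configs dict for an ABFS OAuth (service principal) mount."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        # Secrets are injected as values, never set via spark.conf.set
        "fs.azure.account.oauth2.client.id": client_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }
```

In a notebook this would be called as `oauth_mount_configs(dbutils.secrets.get(...), dbutils.secrets.get(...), "<tenant-id>")` and passed to `dbutils.fs.mount` via `extra_configs`.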
Databricks Solution Architect