06-12-2024 01:12 PM
As part of an S360 action to eliminate SPN secrets, we were asked to move to SPN + certificate, MSI, or a user-assigned managed identity.
We tried connecting from Databricks using a custom Azure Active Directory (AD) token rather than a client secret. A PEM certificate was used to generate the custom AD token through Java code, and we then tried to pass that AD token via the OAuth 2.0 method by setting the Spark configuration below.
%scala
// Set up Spark configurations for ADLS Gen2 access with an Azure AD token
spark.conf.set("fs.azure.account.auth.type.<Your storage Account>.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.<Your storage Account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.<Your storage Account>.dfs.core.windows.net", "clientid")
spark.conf.set("fs.azure.account.oauth2.client.secret.<Your storage Account>.dfs.core.windows.net", "")
spark.conf.set("fs.azure.account.oauth2.client.endpoint.<Your storage Account>.dfs.core.windows.net",https://login.microsoftonline.com/<Tenanant-Id>/oauth2/v2.0/token)
spark.conf.set("fs.azure.account.oauth2.access.token.provider", access_token)
With this configuration, when we link to the ADLS Gen 2 storage account and try to access files, we encounter the following issue.
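For context, the custom AD token mentioned above is generated outside the notebook. A minimal sketch of what that certificate-based token acquisition might look like with MSAL4J, assuming the msal4j library is on the classpath and the private key and certificate have already been loaded from the PEM files (all IDs and names here are placeholders, not our actual code):
%scala
import com.microsoft.aad.msal4j.{ClientCredentialFactory, ClientCredentialParameters, ConfidentialClientApplication}
import java.security.PrivateKey
import java.security.cert.X509Certificate
import scala.jdk.CollectionConverters._

// Acquire an AD access token for Azure Storage using a certificate credential
// instead of a client secret (client and tenant IDs are placeholders).
def acquireStorageToken(clientId: String, tenantId: String,
                        privateKey: PrivateKey, certificate: X509Certificate): String = {
  val credential = ClientCredentialFactory.createFromCertificate(privateKey, certificate)
  val app = ConfidentialClientApplication
    .builder(clientId, credential)
    .authority(s"https://login.microsoftonline.com/$tenantId")
    .build()
  // Request a token scoped to Azure Storage (ADLS Gen2)
  val params = ClientCredentialParameters
    .builder(Set("https://storage.azure.com/.default").asJava)
    .build()
  app.acquireToken(params).get().accessToken()
}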
07-13-2024 01:43 PM
I don't think SPN certificate authentication is supported via Spark configuration.
07-13-2024 11:36 PM - edited 07-13-2024 11:37 PM
This is an old way of mounting a storage account. Is there any reason why you aren't using Unity Catalog? If you have Unity Catalog assigned to your workspace, you can simply create a Databricks access connector (a Databricks managed identity) and configure access to an external location using that managed identity, as sketched below.
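A minimal sketch of that setup, run from a notebook, assuming a storage credential named my_access_connector_credential has already been registered for the access connector (the location name, container, account, path, and group are placeholders):
%scala
// Register an external location backed by the access connector's managed identity
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS my_external_location
  URL 'abfss://<container>@<storage-account>.dfs.core.windows.net/<path>'
  WITH (STORAGE CREDENTIAL my_access_connector_credential)
""")

// Grant read access to a group, then read directly from the governed path
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION my_external_location TO `data_engineers`")
val df = spark.read.format("delta").load("abfss://<container>@<storage-account>.dfs.core.windows.net/<path>")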
07-14-2024 01:00 AM - edited 07-14-2024 01:01 AM
Hello, @szymon_dybczak. We want to authenticate using a custom token or a Key Vault certificate rather than a secret value or other keys. With a managed identity, access has to be configured at the level of the whole resource to restrict it; we would prefer to authenticate at the cluster or notebook level...
07-14-2024 01:39 AM - edited 07-14-2024 01:40 AM
Unfortunately, the way you're currently trying to do this is not supported by Databricks. The only valid authentication options are listed in the article below:
07-14-2024 05:56 AM
@ramesitexp Yes, @szymon_dybczak is correct. For now, the only valid options are the following:
OAuth 2.0 with a Microsoft Entra ID service principal
Shared access signatures (SAS)
Account keys
For now, we are using OAuth 2.0 with a Microsoft Entra ID service principal with a client secret, not a certificate.
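For reference, a minimal sketch of that working pattern, with the client secret read from a Key Vault-backed secret scope rather than hard-coded (the scope and key names, storage account, and IDs are placeholders):
%scala
// OAuth 2.0 with a Microsoft Entra ID service principal and a client secret
val storageAccount = "<storage-account>"
val clientSecret = dbutils.secrets.get(scope = "<kv-secret-scope>", key = "<sp-client-secret-key>")

spark.conf.set(s"fs.azure.account.auth.type.$storageAccount.dfs.core.windows.net", "OAuth")
spark.conf.set(s"fs.azure.account.oauth.provider.type.$storageAccount.dfs.core.windows.net",
  "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(s"fs.azure.account.oauth2.client.id.$storageAccount.dfs.core.windows.net", "<application-client-id>")
spark.conf.set(s"fs.azure.account.oauth2.client.secret.$storageAccount.dfs.core.windows.net", clientSecret)
spark.conf.set(s"fs.azure.account.oauth2.client.endpoint.$storageAccount.dfs.core.windows.net",
  "https://login.microsoftonline.com/<tenant-id>/oauth2/token")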