
Mounting adls gen2 from databricks RBAC issue

keer1392
New Contributor

 

I am trying to mount my ADLS Gen2 storage account in Databricks. I have assigned the Storage Blob Data Contributor role on the storage account, but I am getting the error below:
Invalid permissions on the specified KeyVault https://kvmigrationnew.vault.azure.net/. Wrapped Message: Status code 403, {"error":{"code":"Forbidden","message":"Caller is not authorized to perform action on resource.\r\nIf role assignments, deny assignments or role definitions were changed recently, please observe propagation time.\r\nCaller: name=AzureDatabricks;appid=123abcdbhjdllajddaddd;iss=https://sts.windows.net/984433r41f5ssadfgfsf/\r\nAction: 'Microsoft.KeyVault/vaults/secrets/getSecret/action'\r\nResource: '/subscriptions/xxxx-xxx-xxx/resourcegroups/rsggen2demo/providers/microsoft.keyvault/vaults/kvmigrationnew/secrets/clientid'\r\nAssignment: (not found)\r\nDecisionReason: 'DeniedWithNoValidRBAC' \r\nVault: kvmigrationnew;location=eastus2\r\n","innererror":{"code":"ForbiddenByRbac"}}}
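
For context, the 403 is raised while the notebook reads credentials from the Key Vault-backed secret scope, before the mount itself is attempted; the Caller field in the error shows that the AzureDatabricks application identity is the one hitting the vault. A minimal check (a sketch, using the gen2mig scope name from the configuration below) of whether the secrets can be read from the scope at all:

# Sketch: confirm the Key Vault-backed scope exists in this workspace
print([s.name for s in dbutils.secrets.listScopes()])

try:
    # Listing keys also goes through the AzureDatabricks application identity,
    # so it will keep failing while that identity has no access to the vault
    print([k.key for k in dbutils.secrets.list("gen2mig")])
except Exception as e:
    print("Secret scope not readable:", e)
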
Configuration
-------------
Resource group: rsggen2demo
Storage account name: stggen2demo
Key Vault: kvmigrationnew
Databricks secret scope: gen2mig

Code:
adlsAccountName = "stggen2demo"
adlsContainerName = "output"
adlsFolderName = "schemas_new"
mountPoint = "/mnt/schemas_new"

# Application (Client) ID
applicationId = dbutils.secrets.get(scope="gen2mig", key="ClientID")

# Application (Client) Secret Key
authenticationKey = dbutils.secrets.get(scope="gen2mig", key="ClientSecret")

# Directory (Tenant) ID
tenantId = dbutils.secrets.get(scope="gen2mig", key="TenantID")

endpoint = "https://login.microsoftonline.com/" + tenantId + "/oauth2/token"
source = "abfss://" + adlsContainerName + "@" + adlsAccountName + ".dfs.core.windows.net/" + adlsFolderName

# Connect using service principal secrets and OAuth
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": applicationId,
           "fs.azure.account.oauth2.client.secret": authenticationKey,
           "fs.azure.account.oauth2.client.endpoint": endpoint}

# Mount ADLS storage to DBFS only if the directory is not already mounted
if not any(mount.mountPoint == mountPoint for mount in dbutils.fs.mounts()):
    dbutils.fs.mount(
        source=source,
        mount_point=mountPoint,
        extra_configs=configs)
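
For completeness, a small sketch (using the mountPoint defined above) to verify the mount once it succeeds, and to unmount before retrying:

# Confirm the mount is present and list the mounted folder
print([m.mountPoint for m in dbutils.fs.mounts() if m.mountPoint == mountPoint])
display(dbutils.fs.ls(mountPoint))

# Unmount before retrying after fixing permissions
# dbutils.fs.unmount(mountPoint)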




 

