
How to create Storage Credential using Service Principal [Azure]

AlbertWang
Contributor

As the documentation indicates,

An Azure Databricks access connector is a first-party Azure resource that lets you connect managed identities to an Azure Databricks account. You must have the Contributor role or higher on the access connector resource in Azure to add the storage credential.

However, the `Access Connector for Azure Databricks` is created by Databricks when the Databricks workspace is deployed. It lives in the Databricks managed resource group, and Databricks adds a deny assignment to that resource group, so I cannot assign the `Contributor` role to the Azure service principal.

How can I bypass this limitation?

Thank you.

1 ACCEPTED SOLUTION

szymon_dybczak
Contributor III

Hi, 

You need to create the Databricks access connector yourself. You're not supposed to use the one in the managed resource group. Create it via the portal and then assign the proper role in the storage account IAM.
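
In case it helps, a rough Terraform sketch of the same thing (untested; it assumes a resource group `azurerm_resource_group.uc` and a storage account `azurerm_storage_account.adls` are defined elsewhere, and all names are placeholders):

```hcl
# Untested sketch: self-managed access connector for Unity Catalog.
resource "azurerm_databricks_access_connector" "uc" {
  name                = "ac-unity-catalog"
  resource_group_name = azurerm_resource_group.uc.name
  location            = azurerm_resource_group.uc.location

  identity {
    type = "SystemAssigned"
  }
}

# Let the connector's managed identity read and write the ADLS Gen2 account.
resource "azurerm_role_assignment" "connector_on_storage" {
  scope                = azurerm_storage_account.adls.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azurerm_databricks_access_connector.uc.identity[0].principal_id
}
```

`Storage Blob Data Contributor` on the storage account is the role the connector's managed identity typically needs for Unity Catalog access.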


5 REPLIES

AlbertWang
Contributor

Thank you, @szymon_dybczak, for your kind reply. Do you know what the Databricks managed `Access Connector for Azure Databricks` is supposed to be used for?

szymon_dybczak
Contributor III

You can think of it like a managed identity. When you create this connector, you need to assign it a proper role in the storage account IAM. After that, UC will use that identity on your behalf to access that account.
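
In Terraform, the storage credential and an external location that uses it would look roughly like this (untested sketch; it reuses the connector and storage account from the sketch above, and the container and location names are placeholders):

```hcl
# Untested sketch: register the connector as a UC storage credential,
# then point an external location at a container.
resource "databricks_storage_credential" "adls" {
  name = "adls-managed-identity"

  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.uc.id
  }
}

resource "databricks_external_location" "raw" {
  name            = "raw"
  url             = "abfss://raw@${azurerm_storage_account.adls.name}.dfs.core.windows.net/"
  credential_name = databricks_storage_credential.adls.name
}
```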

AlbertWang
Contributor

Thank you, @szymon_dybczak. This is what I thought. Deploying the Databricks workspace automatically creates the Databricks managed `Access Connector for Azure Databricks` in the Databricks managed resource group.

As I understand it, I should create a Storage Credential that refers to the Databricks managed `Access Connector for Azure Databricks`, then create an External Location. After that, I can create Unity Catalog managed tables and volumes.

However, because I want to use Terraform with a Microsoft Entra ID service principal to create the Storage Credential, I need to assign the Contributor role on the Databricks managed `Access Connector for Azure Databricks` to that service principal. But I cannot assign any role on it because of the deny assignment.

Therefore, as you suggested, I now create my own `Access Connector for Azure Databricks`.

However, if so, what is the point of having the Databricks managed `Access Connector for Azure Databricks`? 😂
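
For what it's worth, the role assignment that the deny assignment blocked on the managed connector is straightforward on the self-created one (untested sketch; the variable holding the service principal object ID is a placeholder):

```hcl
# Untested sketch: the service principal that runs Terraform needs Contributor
# on the self-created access connector in order to create the storage credential.
variable "terraform_sp_object_id" {
  description = "Object ID of the Microsoft Entra ID service principal running Terraform"
  type        = string
}

resource "azurerm_role_assignment" "sp_on_connector" {
  scope                = azurerm_databricks_access_connector.uc.id
  role_definition_name = "Contributor"
  principal_id         = var.terraform_sp_object_id
}
```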
