
Hubert-Dudek
Esteemed Contributor III

How to connect your Azure Data Lake Storage to Azure Databricks

Standard Workspace

👉 Private link

In your storage account, please go to “Networking” -> “Private endpoint connections” and click “Add Private Endpoint”.

It is important to add the private links in the same region and the same virtual network as your Databricks workspace. For the data lake, Databricks will need one private link for the target sub-resource “dfs” and one for “blob”.

In the Virtual Network options for the private link, please select the virtual network which has the PrivateDatabricks and PublicDatabricks subnets. You can use a ServiceEndpoints subnet for your private link (if you don’t have one, please create it).


👉 Application

You need to create an Azure application which will authorize access to your data lake storage. Search for “app registrations” and create one with a friendly name:

After creating the app, please copy the following values, as you will need them later:

- app_id: go to the app’s main page and copy “Application (client) ID”

- tenant_id: go to the app’s main page and copy “Directory (tenant) ID”

- secret: go to the app’s “Certificates & secrets” page, create a new client secret, and copy its “Value”
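
The client secret shouldn’t sit in plain text in a notebook. One option is to put all three values into a Databricks secret scope and read them back with dbutils.secrets.get. A minimal sketch, assuming a scope named “adls” with keys you have created yourself:

# Assumes a secret scope named "adls" with these keys already created
# (scope and key names here are placeholders - use your own)
app_id    = dbutils.secrets.get(scope="adls", key="app-id")
tenant_id = dbutils.secrets.get(scope="adls", key="tenant-id")
secret    = dbutils.secrets.get(scope="adls", key="client-secret")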

👉 Grant your application access to the storage account

Please go back to your data lake storage account. Go to “Access Control (IAM)” and add the role “Storage Blob Data Contributor”.


Click “Select members” and find the app which we’ve just created.
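
Before moving on to Databricks, you can sanity-check the role assignment with the service principal itself. A minimal sketch using the azure-identity and azure-storage-file-datalake packages (the account and container names are placeholders):

from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Authenticate as the app we registered, using the values copied earlier
credential = ClientSecretCredential(tenant_id, app_id, secret)

# Hypothetical account and container names - replace with your own
service = DataLakeServiceClient(
    account_url="https://mystorageaccount.dfs.core.windows.net",
    credential=credential)

# Listing paths should succeed once "Storage Blob Data Contributor" is granted
for path in service.get_file_system_client("mycontainer").get_paths():
    print(path.name)

Note that role assignments can take a few minutes to propagate, so an initial authorization error may resolve itself shortly.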

👉 Databricks

Now we can finally go to Databricks to mount containers from our storage. A mount is permanent, so it is enough to do it only once. It is good to store the code used for the mount (for example, in a repo we can create an “infrastructure” folder) so we can easily recreate it. We just need to put the values we copied earlier into our code.

# Values copied earlier (or read from a secret scope, as shown above)
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": app_id,
           "fs.azure.account.oauth2.client.secret": secret,
           "fs.azure.account.oauth2.client.endpoint": f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"}

# container and storage_name identify the ADLS container to mount
dbutils.fs.mount(
  source = f"abfss://{container}@{storage_name}.dfs.core.windows.net/",
  mount_point = "/mnt/your_folder",
  extra_configs = configs)
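
Once the mount command succeeds, you can quickly verify it; a short check, assuming the mount point above:

# List the mounted container and confirm the mount point is registered
display(dbutils.fs.ls("/mnt/your_folder"))
print([m.mountPoint for m in dbutils.fs.mounts()])

# If you ever need to recreate the mount, unmount it first:
# dbutils.fs.unmount("/mnt/your_folder")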

👉 Troubleshooting

It is good to use the nslookup command to check whether your data lake storage resolves to a private IP:

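If you prefer to check from the notebook itself, the same DNS lookup can be done in Python (the storage account name here is a placeholder):

import socket

# Hypothetical storage account name - replace with your own
host = "mystorageaccount.dfs.core.windows.net"

# With the private link working, this should print a private IP from
# your virtual network (e.g. 10.x.x.x), not a public address
print(socket.gethostbyname(host))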

4 REPLIES

Hubert-Dudek
Esteemed Contributor III

I've created this post as it is a recurring question in the Databricks community. I will keep it updated. Any suggestions are welcome.

Anonymous
Not applicable

@Hubert Dudek - Have I told you lately that you're the best!?!

Hubert-Dudek
Esteemed Contributor III

you know how to motivate me 🙂

dollyb
Contributor

This should be updated for Unity Catalog workspaces.

 
