01-26-2022 01:42 PM
How to connect your Azure Data Lake Storage to Azure Databricks
Standard Workspace
👉 Private link
In your storage account, go to “Networking” -> “Private endpoint connections” and click Add Private Endpoint.
It is important to create the private endpoints in the same region and the same virtual network as your Databricks workspace. For the data lake, Databricks needs one private endpoint for the target sub-resource “dfs” and one for “blob”. Azure registers these in the privatelink.dfs.core.windows.net and privatelink.blob.core.windows.net private DNS zones, respectively.
In the Virtual Network options for the private endpoint, select the virtual network that contains the PrivateDatabricks and PublicDatabricks subnets. You can use the ServiceEndpoints subnet for your private endpoint (if you don’t have one, create it).
👉 Application
You need to create an Azure AD application (service principal) that will authorize access to your data lake storage. Search for “app registration” and create one with a friendly name:
After creating the app, copy the following values; you will need them later:
- app_id: go to the app’s main page and copy “Application (client) ID”
- tenant_id: go to the app’s main page and copy “Directory (tenant) ID”
- secret: go to the app’s “Certificates & secrets” page, create a new client secret, and copy its “Value”.
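Before wiring these three values into Databricks, you can sanity-check them by requesting a token directly from Azure AD with the OAuth 2.0 client-credentials flow. A minimal sketch using only the standard library; the function names are my own, and the values passed in are the ones you copied above:

```python
import json
import urllib.parse
import urllib.request


def token_endpoint(tenant_id: str) -> str:
    """OAuth2 token endpoint — the same URL used in the mount config below."""
    return f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"


def fetch_token(app_id: str, secret: str, tenant_id: str) -> str:
    """Request an access token for Azure Storage via the client-credentials flow."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": app_id,
        "client_secret": secret,
        "resource": "https://storage.azure.com/",
    }).encode()
    req = urllib.request.Request(token_endpoint(tenant_id), data=body, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["access_token"]
```

If `fetch_token` raises a 401/invalid_client error, the app_id, tenant_id, or secret was copied incorrectly.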
👉 Grant your application access to storage account
Go back to your data lake storage account, open “Access Control (IAM)”, and add the role “Storage Blob Data Contributor”.
Click Select members and find the app we’ve just created.
👉 Databricks
Now we can finally go to Databricks and mount containers from our storage. A mount is permanent, so it is enough to do it only once. It is good to keep the code used for the mount (for example, in an infrastructure folder in the repo) so we can easily recreate it. We just need to plug in the values we copied earlier.
# Values copied earlier from the app registration
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": app_id,
    "fs.azure.account.oauth2.client.secret": secret,
    "fs.azure.account.oauth2.client.endpoint": f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

dbutils.fs.mount(
    source=f"abfss://{container}@{storage_name}.dfs.core.windows.net/",
    mount_point="/mnt/your_folder",
    extra_configs=configs,
)
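If you mount several containers, it can help to factor the snippet into small helpers so the infrastructure folder stays reusable. A sketch — the function and parameter names here are my own, not a Databricks API:

```python
def mount_config(app_id: str, secret: str, tenant_id: str) -> dict:
    """Build the OAuth extra_configs dict used by dbutils.fs.mount."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": app_id,
        "fs.azure.account.oauth2.client.secret": secret,
        "fs.azure.account.oauth2.client.endpoint": f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }


def abfss_source(container: str, storage_name: str) -> str:
    """abfss URL for a container in an ADLS Gen2 account."""
    return f"abfss://{container}@{storage_name}.dfs.core.windows.net/"


# In a Databricks notebook you would then call, once per container:
# dbutils.fs.mount(source=abfss_source("raw", "mystorage"),
#                  mount_point="/mnt/raw",
#                  extra_configs=mount_config(app_id, secret, tenant_id))
```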
👉 Troubleshooting
It is good to use the nslookup command to check whether your data lake storage is resolving to a private IP:
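The same check can be scripted from a notebook or any VM inside the VNet. A small sketch using only the standard library; the storage account name in the commented calls is a placeholder:

```python
import ipaddress
import socket


def is_private_ip(addr: str) -> bool:
    """True if the address falls in a private (RFC 1918) range."""
    return ipaddress.ip_address(addr).is_private


def check_endpoint(host: str) -> None:
    """Resolve the storage endpoint and report whether it is private."""
    addr = socket.gethostbyname(host)
    kind = "private" if is_private_ip(addr) else "PUBLIC - private link not in effect"
    print(f"{host} -> {addr} ({kind})")


# From inside the VNet, both endpoints should resolve to private IPs:
# check_endpoint("mystorage.dfs.core.windows.net")
# check_endpoint("mystorage.blob.core.windows.net")
```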
01-26-2022 01:43 PM
I've created that post as it is returning question in databricks community. I will keep it updated. Any suggestions are welcome.
01-26-2022 05:33 PM
@Hubert Dudek - Have I told you lately that you're the best!?!
01-27-2022 04:00 AM
you know how to motivate me 🙂
Thursday
This should be updated for Unity Catalog workspaces.