12-16-2021 07:36 AM
I'm mounting a Storage Account to a Databricks cluster in Azure. All the resources are included in a VNET, and private and public subnets have been associated with the Databricks resource. Below I've attached the guide we use for mounting the ADLS Gen2 storage to Databricks.
The problem is that when mounting, Databricks does not pass instructions on to the executor nodes: it will run for hours without doing anything. Upon deeper investigation in the Spark cluster UI there is a running application, "Databricks Shell". When looking into this application I get the following error message: HTTP ERROR 403 - Problem accessing /stages/stage/. Reason: Invalid or missing CSRF token.
Has anyone from the community encountered this error before? Your suggestions are welcome.
Steps we followed for mounting: https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/adls-gen2/azure-datalake-g...
Accepted Solutions
12-16-2021 07:43 AM
- if you have Private Link, check that you have endpoints for both dfs and blob on the same storage resource,
- check the DNS you are using with nslookup; a common error is adding a private link with the private-link DNS name when the normal address should be used,
- the application used for the mount needs the IAM role "Storage Blob Data Contributor" to access your ADLS (Azure: App registrations, then assign it the role under the storage account's IAM),
- unmount and mount again,
- example code to mount:

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "your_app_client_id",
    "fs.azure.account.oauth2.client.secret": "your_app_client_secret",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/your_tenant/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://container@storage_account.dfs.core.windows.net/",
    mount_point="/mnt/your_folder",
    extra_configs=configs,
)
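The nslookup check in point two can also be scripted from a notebook. Below is a minimal sketch (the account name `your_storage_account` is a placeholder, not from this thread) that resolves both the dfs and blob endpoints and reports whether each resolves to a private address. If an endpoint resolves to a private IP but your workspace is not supposed to use Private Link (or vice versa), that is the DNS mismatch described above:

```python
import socket
import ipaddress

def is_private(ip: str) -> bool:
    # True for private ranges such as 10.x.x.x, 172.16-31.x.x, 192.168.x.x
    return ipaddress.ip_address(ip).is_private

def check_endpoint(host: str) -> None:
    # Rough equivalent of `nslookup <host>`: resolve and classify the address.
    try:
        ip = socket.gethostbyname(host)
        kind = "private (Private Link?)" if is_private(ip) else "public"
        print(f"{host} -> {ip} ({kind})")
    except socket.gaierror as exc:
        print(f"{host} -> resolution failed: {exc}")

account = "your_storage_account"  # placeholder: substitute your account name
for suffix in ("dfs", "blob"):
    check_endpoint(f"{account}.{suffix}.core.windows.net")
```

Run this from the cluster itself, since DNS may resolve differently inside the VNET than on your laptop.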
12-17-2021 02:53 AM
Dear @Hubert Dudek, thanks for your response. We have performed your suggested points one and two. With regards to mounting the storage using the app registration, we've followed the proper instructions.
Interestingly, today I created a Databricks resource without VNET injection and allowed all connections from Azure services to the storage account. Via this new workspace I am able to mount and read data from the storage account. Reading from the VNET-injected Databricks workspace remains unsuccessful.
We are open to more suggestions.
05-20-2022 07:53 AM
Hey there @Derrick Bakhuis
Hope you are well. Just wanted to check whether you found an answer to your question, and if so, would you like to mark it as best? It would be really helpful for the other members too.
Cheers!