12-16-2021 07:36 AM
I'm mounting a Storage Account to a Databricks cluster in Azure. All the resources are in a VNET, and a private and a public subnet have been associated with the Databricks resource. Below I've attached the guide we use for mounting the ADLS Gen2 storage to Databricks.
The problem is that, when mounting, Databricks does not dispatch any work to the executor nodes: the command runs for hours without making progress. Digging deeper in the Spark Cluster UI, there is a running application called "Databricks Shell". When I look into this application I get the following error: HTTP ERROR 403 - Problem accessing /stages/stage/. Reason: Invalid or missing CSRF token.
Has anyone from the community encountered this error before? Your suggestions are welcome.
Steps we followed for mounting: https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/adls-gen2/azure-datalake-g...
12-16-2021 07:43 AM
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": "your_app_client_id",
           "fs.azure.account.oauth2.client.secret": "your_app_client_secret",
           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/your_tenant/oauth2/token"}

dbutils.fs.mount(
    source = "abfss://container@storage_account.dfs.core.windows.net/",
    mount_point = "/mnt/your_folder",
    extra_configs = configs)
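As a small pre-flight sketch (not part of the original post, and the key set here is an assumption based on the snippet above): it can help to verify that the configs dict carries every key the ABFS OAuth flow expects before calling dbutils.fs.mount, so a typo in a key name fails fast instead of hanging.

```python
# Hypothetical sanity check on the OAuth configs dict used for the mount.
# The required key names are taken from the snippet above.
REQUIRED_OAUTH_KEYS = {
    "fs.azure.account.auth.type",
    "fs.azure.account.oauth.provider.type",
    "fs.azure.account.oauth2.client.id",
    "fs.azure.account.oauth2.client.secret",
    "fs.azure.account.oauth2.client.endpoint",
}

def missing_oauth_keys(configs):
    """Return the set of required ABFS OAuth keys absent from configs."""
    return REQUIRED_OAUTH_KEYS - set(configs)
```

Calling `missing_oauth_keys(configs)` in the notebook before the mount and raising if the result is non-empty catches a mistyped key immediately.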
12-17-2021 02:53 AM
Dear @Hubert Dudek, thanks for your response. We have performed your suggested points one and two. Regarding mounting the storage using the app registration, we have followed the proper instructions.
Interestingly, today I created a Databricks resource without VNET injection and allowed all connections from Azure services to the storage account. Via this new workspace I am able to mount and read data from the storage account. Reading from the VNET-injected Databricks workspace remains unsuccessful.
We are open to more suggestions.
05-20-2022 07:53 AM
Hey there @Derrick Bakhuis
Hope you are well. Just wanted to see if you were able to find an answer to your question and would you like to mark an answer as best? It would be really helpful for the other members too.
Cheers!