VNET injected Databricks cluster not able to mount - 403 error

TheDataDexter
New Contributor III

I'm mounting a Storage Account to a Databricks cluster in Azure. All the resources are included in a VNet, and a private and a public subnet have been associated with the Databricks resource. Below I've attached the guide we use for mounting ADLS Gen2 to Databricks.

The problem is that when mounting, Databricks does not pass instructions on to the executor nodes. The command runs for hours without doing anything. On deeper investigation in the Spark cluster UI there is a running application, "Databricks Shell". When I look into this application I get the following error message: HTTP ERROR 403 - Problem accessing /stages/stage/. Reason: Invalid or missing CSRF token.

Has anyone from the community encountered this error before? Your suggestions are welcome.

Steps we followed for mounting: https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/adls-gen2/azure-datalake-g...
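As a quick sanity check before digging further, one can run something like the following from a notebook cell on the affected cluster to see whether the driver can resolve and reach the storage endpoints at all through the VNet. This is an illustrative sketch: the hostnames are placeholders for your own account, and `is_private_ip` is a small helper added here, not part of any Databricks API.

```python
import ipaddress
import socket

def is_private_ip(ip: str) -> bool:
    """True if the address is in a private range, i.e. the name
    resolved to a Private Endpoint rather than a public IP."""
    return ipaddress.ip_address(ip).is_private

# Placeholders: replace with your own storage account name.
hosts = ["storage_account.dfs.core.windows.net",
         "storage_account.blob.core.windows.net"]

for host in hosts:
    try:
        ip = socket.gethostbyname(host)
        print(f"{host} -> {ip} (private endpoint: {is_private_ip(ip)})")
        # Confirm port 443 is actually reachable through the subnet/NSG rules.
        with socket.create_connection((host, 443), timeout=5):
            print("  TCP 443 reachable")
    except OSError as exc:
        print(f"{host}: {exc}")
```

If the name resolves to a public IP when a Private Endpoint is configured (or vice versa), that points at a DNS misconfiguration rather than a permissions problem.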

1 ACCEPTED SOLUTION


Hubert-Dudek
Esteemed Contributor III
  • If you use a Private Link, please check that you have it for both dfs and blob on the same resource.
  • Please check which DNS you are using with nslookup; a common error is adding a Private Link with the Private Link DNS when the regular address should be used.
  • The application used for the mount needs the IAM role "Storage Blob Data Contributor" to access your ADLS (in Azure: App registrations, then assign the role under the ADLS account's IAM settings).
  • Please unmount and mount again.
  • Example code to mount:
# Service-principal (OAuth) configuration for ABFS
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": "your_app_client_id",
           # Prefer dbutils.secrets.get(...) over a hard-coded secret
           "fs.azure.account.oauth2.client.secret": "your_app_client_secret",
           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/your_tenant/oauth2/token"}
 
dbutils.fs.mount(
  source = "abfss://container@storage_account.dfs.core.windows.net/",
  mount_point = "/mnt/your_folder",
  extra_configs = configs)
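The "unmount and mount again" step can be made idempotent by first checking `dbutils.fs.mounts()`. A minimal sketch, assuming the same `configs` dictionary as above; the `mount_exists` helper and the guard around `dbutils` are illustrative additions (`dbutils` only exists inside a Databricks notebook):

```python
def mount_exists(mounts, mount_point: str) -> bool:
    """True if mount_point already appears in the list
    returned by dbutils.fs.mounts()."""
    return any(m.mountPoint == mount_point for m in mounts)

try:
    dbutils  # defined only inside a Databricks notebook
except NameError:
    dbutils = None

if dbutils is not None:
    mount_point = "/mnt/your_folder"
    if mount_exists(dbutils.fs.mounts(), mount_point):
        dbutils.fs.unmount(mount_point)  # drop the stale mount first
    dbutils.fs.mount(
        source="abfss://container@storage_account.dfs.core.windows.net/",
        mount_point=mount_point,
        extra_configs=configs)  # configs from the snippet above
```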


4 REPLIES


TheDataDexter
New Contributor III

Dear @Hubert Dudek​, thanks for your response. We have performed your suggested points one and two. With regard to mounting storage using the app registration, we have followed the proper instructions.

Interestingly, today I created a Databricks resource without VNet injection and allowed all connections from Azure services to the storage account. Via this new workspace I am able to mount and read data from the storage account. Reading from the VNet-injected Databricks workspace remains unsuccessful.

We are open to more suggestions.

Hi @Derrick Bakhuis​, did you get a chance to go through this link?

Anonymous
Not applicable

Hey there @Derrick Bakhuis​ 

Hope you are well. Just wanted to see if you were able to find an answer to your question and would you like to mark an answer as best? It would be really helpful for the other members too.

Cheers!
