Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Upload to volume inside Unity Catalog not possible?

boriste
New Contributor II
 

I want to upload a simple CSV file to a volume that was created in our Unity Catalog. We are using secure cluster connectivity, and our storage account (metastore) is not publicly accessible; we injected the storage account into our VNet.

I am getting the following error while uploading:

"An error occurred while uploading. Please try again"
Should I open the IP address of the control plane in our storage account firewall? Is there a way to fix this?

1 ACCEPTED SOLUTION


Ahdri
Databricks Employee

Hi @Martinitus 

I believe this might be related to one of the known limitations listed in our documentation: 
"You cannot upload or download files for volumes backed by Azure storage accounts configured with Azure Firewall or Private Link" Source

We are actively working on this and hope for a fix in the coming weeks.  


11 REPLIES

karthik_p
Esteemed Contributor

@boriste Can you please check whether you have read/write permissions on the volume? If not, please reach out to your admin to grant you read/write access to the volume.
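For reference, the grants on a Unity Catalog volume can be inspected and, if needed, extended with Databricks SQL; the catalog, schema, volume, and principal names below are placeholders:

```sql
-- Inspect existing grants on the volume (placeholder three-level name)
SHOW GRANTS ON VOLUME main.default.my_volume;

-- Grant read and write access to a user or group (placeholder principal)
GRANT READ VOLUME, WRITE VOLUME
ON VOLUME main.default.my_volume
TO `user@example.com`;
```

(As it turns out later in the thread, the failure here was networking-related rather than a missing grant.)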

boriste
New Contributor II

@karthik_p I have full admin rights and I'm logged in with my Azure AD account. I also have read/write permissions.

But I'm getting the message:

{
  "error_code" : "BAD_REQUEST",
  "message" : "Missing credentials to access the DBFS root storage container in Azure."
}

I don't know if Databricks uses another identity, but somehow I can't upload.

Martinitus
New Contributor III

We (boriste and I) have tested creating volumes from a notebook, which worked just fine. This means networking and auth from the cluster (data plane) to the metastore storage account work.
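For context, the notebook route works because Unity Catalog volumes are mounted on the cluster under /Volumes/&lt;catalog&gt;/&lt;schema&gt;/&lt;volume&gt;/, so plain Python file I/O from a notebook goes through the data plane. A minimal sketch (the volume path in the docstring is a placeholder):

```python
import csv
from pathlib import Path

def write_csv(path: str, header: list, rows: list) -> int:
    """Write header + rows to a CSV file at `path`; return the number of data rows.

    On Databricks, a Unity Catalog volume is mounted on the cluster under
    /Volumes/<catalog>/<schema>/<volume>/, so passing e.g.
    "/Volumes/main/default/my_volume/data.csv" (placeholder names) writes the
    file from the cluster (data plane) instead of the control-plane upload UI.
    """
    p = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    with p.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
    return len(rows)
```

This is only a sketch of the notebook-side workaround, not a fix for the UI upload itself.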

What was not working was uploading a file via the web UI. Our suspicion is that this upload tries to save the file to the metastore storage account from the Databricks control plane, which does _not_ have access to our metastore storage account (networking is blocked by an NSG).


I assume one solution would be to whitelist the IP of the Databricks control plane in the networking settings of our metastore storage account. We could try that, but we don't know the appropriate IP range.

We checked the storage account logs of the metastore. Whenever we try to upload a file to a volume from the workspace UI, we get an authorization error. The request seems to come from a Databricks Java backend service (control plane?). We tried adding the caller IP to the storage account networking whitelist, but that is not possible because it is not a public IP.

(Screenshots from the storage account logs showing the authorization error: Martinitus_1-1690794892372.png, Martinitus_0-1690794771666.png)


UstOldfield
New Contributor II

Hello! Do you have an idea as to when this will be fixed, and which release it is aligned to? This is still an issue and is causing me and my customers problems, considering that you can't use DBFS with Unity Catalog.

Ahdri
Databricks Employee

Hi @UstOldfield 

A fix was released for the previous issue where upload/download was not working for Azure storage accounts configured with Azure Firewall or Private Link.

Can you try to see if this addresses your problem?


UstOldfield
New Contributor II

Hey @Ahdri - it works. Thanks for letting me know. 


jeroenvs
New Contributor III

@Ahdri We are running into the same issue. It took a while to figure out that the error message is related to this limitation. Any updates on when we can expect the limitation to be lifted? We want to secure access to our storage accounts with a firewall; setting up a public storage account is not an option.

Ahdri
Databricks Employee

Hi @jeroenvs 

A fix was released for the previous issue where upload/download was not working for Azure storage accounts configured with Azure Firewall or Private Link.

Can you try to see if this addresses your problem?
