Data Engineering

upload files to dbfs:/volume using databricks cli

Hardy
New Contributor III

In our Azure pipeline we are using the Databricks CLI to upload jar files to dbfs:/FileStore, and that works perfectly fine.

But when we try to use the same command to upload files to dbfs:/Volume/dev/default/files, it does not work and gives the error below:

Error: dbfs open: /Volume cannot be opened for writing
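
For reference, a rough sketch of the two CLI calls described above (the jar name and target paths are illustrative, not the actual pipeline values):

# Works: uploading to the DBFS FileStore root
databricks fs cp ./app.jar dbfs:/FileStore/jars/

# Fails with "dbfs open: /Volume cannot be opened for writing"
databricks fs cp ./app.jar dbfs:/Volume/dev/default/files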
7 REPLIES

saikumar246
New Contributor III

Hi @Hardy,

What is the command you are running to upload files to "dbfs:/Volume/dev/default/files"?

The specific command to use with the Databricks CLI for uploading files to a volume in Databricks is:

databricks fs cp /path/to/local/file dbfs:/Volumes/my_catalog/my_schema/my_volume/

Please replace /path/to/local/file with the path of your local file and dbfs:/Volumes/my_catalog/my_schema/my_volume/ with the path of your volume in Databricks.

Make sure that the user has all the required permissions on the catalog, schema and volume to upload the file.
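
If the copy succeeds, listing the volume is a quick sanity check (the path below just mirrors the illustrative example above):

databricks fs ls dbfs:/Volumes/my_catalog/my_schema/my_volume/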

Hardy
New Contributor III

@saikumar246 

Yes, I am using the same command, but it throws the error mentioned in the question.

saikumar246
New Contributor III

@Hardy I think you are using the word volume in the path but it should be Volumes(plural), not Volume(singular). Do one thing, copy the volume path directly from the Workspace and try.
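
For example, using the catalog/schema/volume names from your path (illustrative only):

# Fails: "Volume" (singular) is not a valid root under dbfs:/
databricks fs cp ./test.txt dbfs:/Volume/dev/default/files

# Should work, assuming the volume exists and you have write access to it
databricks fs cp ./test.txt dbfs:/Volumes/dev/default/files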

Hardy
New Contributor III

Thanks for the suggestion @saikumar246.

I tried everything with singular, plural, lowercase & capitalized (volume, volumes, Volume, Volumes), but unfortunately it is not working in either case. 😞

saikumar246
New Contributor III

@Hardy I am not sure why you are getting that error. I was able to upload the file when I tried it in my local terminal. Can you tell me where you are running this command? Is it your local terminal or somewhere else?

And make sure that you are using the latest Databricks CLI.
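
A quick way to confirm which CLI you are running (sketch; the unified Databricks CLI that understands Unity Catalog volumes is the 0.2xx series):

databricks --version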

Hardy
New Contributor III

@saikumar246 

I changed my role to Admin, so that error is gone, but now another error pops up:

Error: Put "https://mycompany-dev.cloud.databricks.com/api/2.0/fs/files/Volumes%2Fqa%2Fdefault%2Flibraries%2Ftes...": context deadline exceeded (Client.Timeout exceeded while awaiting headers)

My command is: databricks fs cp C:\test.txt dbfs:/Volumes/qa/default/libraries --recursive --overwrite

I am executing this command from my local machine, and the Databricks CLI version is 0.212.2.
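
One way to narrow down whether the timeout comes from the CLI itself or from the workspace endpoint is to call the Files API (the same endpoint shown in the error) directly. The host, token, and file name below are placeholders, so treat this as a rough sketch rather than a known fix:

# Upload one file into the volume via the Files API (PUT with the file contents as the request body)
curl -sS -X PUT \
  "https://mycompany-dev.cloud.databricks.com/api/2.0/fs/files/Volumes/qa/default/libraries/test.txt" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  --data-binary @test.txt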

Kaniz
Community Manager

Hey there! Thanks a bunch for being part of our awesome community! 🎉 

We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution for you. And remember, if you ever need more help, we're here for you!

Keep being awesome! 😊🚀

 
