Data Engineering

upload files to dbfs:/volume using databricks cli

Hardy
New Contributor III

In our Azure pipeline we are using the Databricks CLI to upload JAR files to the dbfs:/FileStore location, and that works perfectly fine.

But when we try to use the same command to upload files to dbfs:/Volume/dev/default/files, it does not work and gives the error below:

Error: dbfs open: /Volume cannot be opened for writing
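
For context, a rough sketch of the two invocations (local paths and file names are illustrative, not the actual ones from our pipeline):

# Works: copy a JAR to the legacy DBFS FileStore path
databricks fs cp ./target/app.jar dbfs:/FileStore/jars/

# Fails with the error above: the same copy targeting the volume path
databricks fs cp ./target/app.jar dbfs:/Volume/dev/default/files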
6 REPLIES

saikumar246
Databricks Employee

Hi @Hardy,

What command are you running to upload files to "dbfs:/Volume/dev/default/files"?

The specific command to use with the Databricks CLI to upload files to a volume location in Databricks is:

databricks fs cp /path/to/local/file dbfs:/Volumes/my_catalog/my_schema/my_volume/

Please replace /path/to/local/file with the path of your local file and dbfs:/Volumes/my_catalog/my_schema/my_volume/ with the path of your volume in Databricks.
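
For example, assuming a catalog named dev, a schema named default, and a volume named files (matching the path in the question), the upload would look roughly like this:

# Copy a single local JAR into the volume
databricks fs cp ./target/myapp.jar dbfs:/Volumes/dev/default/files/

# Or copy a whole directory of build artifacts recursively
databricks fs cp ./target/libs dbfs:/Volumes/dev/default/files/libs --recursive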

Make sure that the user has all the required permissions on the catalog, schema and volume to upload the file.
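
If your CLI version includes the Unity Catalog grants commands, here is a quick sketch of how to check who you are and what is granted on the volume (dev.default.files is just the example name from above):

# Show the identity the CLI is authenticated as
databricks current-user me

# List the privileges granted on the target volume; uploading requires WRITE VOLUME
# on the volume plus USE CATALOG and USE SCHEMA on its parents
databricks grants get volume dev.default.files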

Hardy
New Contributor III

@saikumar246 

Yes, I am using the same command, but it throws the error mentioned in the question.

saikumar246
Databricks Employee

@Hardy I think you are using the word Volume in the path, but it should be Volumes (plural), not Volume (singular). Try copying the volume path directly from the workspace and running the command again.
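
For reference, a correctly shaped path starts with /Volumes (plural, capital V) followed by catalog, schema, and volume; the names below are placeholders only:

# Note the Volumes segment, not Volume
databricks fs cp ./test.jar dbfs:/Volumes/dev/default/files/test.jar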

Hardy
New Contributor III

Thanks for the suggestion @saikumar246 

I tried every variation: singular, plural, lowercase, and capitalized (volume, volumes, Volume, Volumes), but unfortunately it does not work in any case. 😞

saikumar246
Databricks Employee

@Hardy I am not sure why you are getting that error. I was able to upload the file when I tried it from my local terminal. Can you tell me where you are running this command? Is it your local terminal or somewhere else?

And make sure that you are using the latest Databricks CLI.
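
A quick way to confirm the installed version is shown below; the upgrade step depends on how the CLI was installed, so the Homebrew line is only one possibility:

# Print the installed Databricks CLI version
databricks --version

# Example upgrade if the CLI was installed via Homebrew (adjust for your installer)
brew upgrade databricks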

Hardy
New Contributor III

@saikumar246 

I changed my role to Admin.

That error is gone now, but another error is showing up:

Error: Put "https://mycompany-dev.cloud.databricks.com/api/2.0/fs/files/Volumes%2Fqa%2Fdefault%2Flibraries%2Ftes...": context deadline exceeded (Client.Timeout exceeded while awaiting headers)

My command is: databricks fs cp C:\test.txt dbfs:/Volumes/qa/default/libraries --recursive --overwrite

I am executing this command from my local machine, and the Databricks CLI version is 0.212.2.
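
For what it's worth, one way to see where the request stalls is to re-run the copy with the CLI's debug logging (the --debug flag in recent CLI versions); since only a single file is being copied, --recursive is dropped here:

# Re-run the single-file copy with verbose logging to diagnose the timeout
databricks fs cp C:\test.txt dbfs:/Volumes/qa/default/libraries/test.txt --overwrite --debug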
