02-05-2024 02:07 AM
In our Azure pipeline we use the Databricks CLI to upload JAR files to the dbfs:/FileStore location, and that works perfectly fine.
But when we try to use the same command to upload files to dbfs:/Volume/dev/default/files, it does not work and gives the error below:
02-05-2024 05:52 AM
Hi @Hardy,
What is the command you are running to upload files at "dbfs:/Volume/dev/default/files"?
The specific command to upload files to a volume location in Databricks with the Databricks CLI is:
```bash
databricks fs cp /path/to/local/file dbfs:/Volumes/my_catalog/my_schema/my_volume/
```
Please replace /path/to/local/file with the path of your local file, and dbfs:/Volumes/my_catalog/my_schema/my_volume/ with the path of your volume in Databricks.
Make sure that the user has all the required permissions on the catalog, schema and volume to upload the file.
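A quick way to verify the path and your access before copying is to list the volume (a minimal sketch, reusing the example catalog/schema/volume names above):
```bash
# List the volume to confirm the path exists and that you can read it.
# Replace my_catalog/my_schema/my_volume with your own names.
databricks fs ls dbfs:/Volumes/my_catalog/my_schema/my_volume/
```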
02-05-2024 07:51 PM
Yes, I am using the same command, but it throws the error mentioned in the question.
02-05-2024 10:34 PM
@Hardy I think you are using the word Volume in the path, but it should be Volumes (plural), not Volume (singular). Try copying the volume path directly from the Workspace and using that.
02-05-2024 11:54 PM
Thanks for the suggestion @saikumar246
I tried every variation: singular, plural, lowercase, and capitalized (volume, volumes, Volume, Volumes), but unfortunately it's not working in any case.
02-06-2024 01:21 AM
@Hardy I am not sure why you are getting that error. I was able to upload the file when I tried it in my local terminal. Can you tell me where you are running this command? Is this your local terminal or somewhere else?
And make sure that you are using the latest Databricks CLI.
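To check which CLI you have installed (a minimal sketch; the upgrade commands assume the Homebrew tap and winget package names from the Databricks docs):
```bash
# Print the installed CLI version; Unity Catalog volumes need the
# newer Go-based Databricks CLI, not the legacy Python databricks-cli.
databricks --version

# Upgrade, depending on how it was installed:
brew upgrade databricks                    # macOS/Linux, via the databricks/tap tap
winget upgrade Databricks.DatabricksCLI    # Windows
```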
02-06-2024 01:48 AM - edited 02-06-2024 01:51 AM
I changed my role to Admin.
That error is gone, but now it is throwing another error:
Error: Put "https://mycompany-dev.cloud.databricks.com/api/2.0/fs/files/Volumes%2Fqa%2Fdefault%2Flibraries%2Ftes...": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
My command is: databricks fs cp C:\test.txt dbfs:/Volumes/qa/default/libraries --recursive --overwrite
I am executing this command from my local machine, and the Databricks CLI version is 0.212.2.
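One way to narrow this down is to copy the single file directly to a file path and turn on debug logging (a minimal sketch; --debug is a global flag in the newer CLI, and the destination file name here is illustrative):
```bash
# Copy a single file directly to a file path inside the volume;
# --recursive is only needed when the source is a directory.
# --debug prints request/response logs, which helps diagnose
# timeouts caused by proxies or firewalls.
databricks fs cp C:\test.txt dbfs:/Volumes/qa/default/libraries/test.txt --overwrite --debug
```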