02-05-2024 02:07 AM
In our Azure pipeline we use the databricks-cli command to upload jar files to dbfs:/FileStore, and that works perfectly fine.
But when we try to use the same command to upload files to dbfs:/Volume/dev/default/files, it does not work and gives the error below:
02-05-2024 05:52 AM
Hi @Hardy,
What is the command you are running to upload files at "dbfs:/Volume/dev/default/files" ?
The specific command to use with the Databricks CLI for uploading files to the volume's location in Databricks is:
```bash
databricks fs cp /path/to/local/file dbfs:/Volumes/my_catalog/my_schema/my_volume/
```
Please replace /path/to/local/file with the path of your local file, and dbfs:/Volumes/my_catalog/my_schema/my_volume/ with the path of your volume in Databricks.
Make sure that the user has all the required permissions on the catalog, schema and volume to upload the file.
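For example, a pipeline step might assemble the command like this (a minimal sketch; the local jar path and the catalog/schema/volume names below are placeholders, not values from this thread):

```shell
# Placeholder values -- substitute your own local file and volume path.
LOCAL_FILE="/path/to/local/app.jar"
VOLUME_DIR="dbfs:/Volumes/my_catalog/my_schema/my_volume/"

# Assemble the upload command; note the plural "Volumes" segment in the target.
CMD="databricks fs cp ${LOCAL_FILE} ${VOLUME_DIR}"
echo "$CMD"
```

Printing the command first (a dry run) makes it easy to spot a malformed target path in the pipeline logs before the actual copy step runs.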
02-05-2024 07:51 PM
Yes I am using the same command but it throws error as mentioned in the question.
02-05-2024 10:34 PM
@Hardy I think you are using the word "Volume" in the path, but it should be "Volumes" (plural), not "Volume" (singular). Try copying the volume path directly from the Workspace and using that.
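The plural-vs-singular prefix can also be checked mechanically before calling the CLI. A minimal pure-shell sketch (the path below is just the wrong-form example from the question):

```shell
TARGET="dbfs:/Volume/dev/default/files"   # singular "Volume" -- the wrong form

# Unity Catalog volume paths start with the plural "dbfs:/Volumes/".
case "$TARGET" in
  dbfs:/Volumes/*) echo "ok: path uses the plural Volumes prefix" ;;
  *)               echo "error: volume paths must start with dbfs:/Volumes/" ;;
esac
```

Run against the singular path above, this takes the error branch; a guard like this in the pipeline fails fast with a readable message instead of an opaque CLI error.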
02-05-2024 11:54 PM
Thanks for the suggestion @saikumar246
I tried every variation, singular, plural, lowercase, and capitalized (volume, volumes, Volume, Volumes), but unfortunately it's not working in any case. 😞
02-06-2024 01:21 AM
@Hardy I am not sure why you are getting that error. I was able to upload the file when I tried it in my local terminal. Can you tell me where you are running this command? Is it your local terminal or somewhere else?
And make sure that you are using the latest Databricks CLI.
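A pipeline can sanity-check the installed CLI before uploading. In this sketch the minimum version is an assumed placeholder (the idea being that dbfs:/Volumes/ paths need the new Go-based CLI rather than the legacy Python one); adjust the threshold to whatever the release notes for your CLI state:

```shell
# Hypothetical guard: "205" below is an assumed placeholder for the first
# new-generation CLI minor release, not a verified minimum.
version="0.212.2"               # e.g. parsed from: databricks --version
minor=$(echo "$version" | cut -d. -f2)

if [ "$minor" -lt 205 ]; then
  echo "CLI may be too old for dbfs:/Volumes/ paths"
else
  echo "CLI version looks recent enough"
fi
```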
02-06-2024 01:48 AM - edited 02-06-2024 01:51 AM
I changed my role to Admin, so that error is gone, but now it is throwing another error:
Error: Put "https://mycompany-dev.cloud.databricks.com/api/2.0/fs/files/Volumes%2Fqa%2Fdefault%2Flibraries%2Ftes...": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
My command is: databricks fs cp C:\test.txt dbfs:/Volumes/qa/default/libraries --recursive --overwrite
I am executing this command from my local machine and databricks cli version is: 0.212.2