Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Need to move files from one Volume to other

navi_bricks
Visitor

We recently enabled Unity Catalog on our workspace. As part of certain transformations (custom clustered data pipelines in Python), we need to move files from one volume to another volume.

Since the job itself runs as a service principal that has access to the external storage, we don't want to pass in any credentials. Can we achieve this? We tried os, dbutils, and WorkspaceClient, all of which need service principal credentials. We did manage to read from the volume through the Spark context itself, but for moving files we need another way. Please help.

3 REPLIES

Walter_C
Databricks Employee

You should be able to use dbutils.fs.cp to copy the file; you just need to ensure that the SP has the WRITE VOLUME permission on the destination volume.
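On a Unity Catalog cluster, volumes are exposed as ordinary paths under `/Volumes/<catalog>/<schema>/<volume>/`, so a move is just a copy followed by a delete of the source (`dbutils.fs.mv` does both in one call). A minimal sketch of that pattern, assuming hypothetical volume paths; locally, `shutil`/`os` stand in for the same copy-then-delete steps that `dbutils.fs.cp`/`dbutils.fs.rm` perform on a cluster:

```python
import os
import shutil


def move_file(src: str, dst: str) -> str:
    """Move src to dst via copy-then-delete, creating the destination
    directory if needed. On Databricks, src/dst would be UC volume paths,
    e.g. /Volumes/main/etl/vol_src/f.csv -> /Volumes/main/etl/vol_dst/f.csv
    (paths here are placeholders, not from the original thread)."""
    os.makedirs(os.path.dirname(dst), exist_ok=True)
    shutil.copy2(src, dst)  # dbutils.fs.cp(src, dst) on a cluster
    os.remove(src)          # dbutils.fs.rm(src) once the copy succeeded
    return dst
```

On a cluster you could equivalently call `dbutils.fs.mv(src, dst)` directly. Note that because a move deletes the source file, the SP needs WRITE VOLUME on the source volume as well as on the destination.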

navi_bricks
Visitor

Thanks for that,

But I have a Python data pipeline running on a custom cluster, and it's not working from there.

 

Walter_C
Databricks Employee

What is the error being received? And does the SP have the mentioned permission in UC?
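For reference, the UC permissions in question can be granted and inspected in SQL (the catalog/schema/volume names and the SP identifier below are placeholders, not from the original thread):

```sql
-- Moving a file deletes the source, so the SP needs WRITE VOLUME on both sides.
GRANT READ VOLUME, WRITE VOLUME ON VOLUME main.etl.vol_src TO `<sp-application-id>`;
GRANT READ VOLUME, WRITE VOLUME ON VOLUME main.etl.vol_dst TO `<sp-application-id>`;

-- Check what the SP currently holds on the destination:
SHOW GRANTS ON VOLUME main.etl.vol_dst;
```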
