Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Multiple volumes from same external location?

dsmoore
New Contributor II

Hey all,

Do you know if it's possible to create multiple volumes referencing the same S3 bucket from the same external location?

For example, I have two workspaces (test and prod) that test different versions of pipeline code against the same static data, so I'd like both to read from the same external volume. After setting it up in one workspace, I go to create the volume in the other and get the following error:

Input path url '<s3 path>' overlaps with other external tables or volumes within 'CreateVolume' call. Conflicting tables/volumes: <original_volume>

Any guidance, or is this just a limitation of volumes in Databricks?
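For reference, the two statements involved are essentially the following (catalog, schema, volume, and bucket names are placeholders):

-- From the test workspace: succeeds
CREATE EXTERNAL VOLUME test_catalog.pipelines.static_data
LOCATION 's3://bucket/static-data';

-- From the prod workspace, pointing at the same path: fails with the
-- "overlaps with other external tables or volumes" error above
CREATE EXTERNAL VOLUME prod_catalog.pipelines.static_data
LOCATION 's3://bucket/static-data';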

Thanks!

Accepted Solution

ozaaditya
Contributor

Yes, this is a limitation: Unity Catalog does not allow a volume's path to overlap with the path of another volume or external table, so the same S3 path cannot be registered as more than one volume. This restriction ensures consistency and prevents conflicts when accessing the same data source.

Possible solution:

  1. Use non-overlapping subdirectories within the same bucket (see the sketch below). For example:
    • s3://bucket/prod
    • s3://bucket/test
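A minimal sketch of that layout, assuming the test and prod data sit under separate prefixes of the same bucket (catalog, schema, and volume names are placeholders):

-- Two volumes under the same external location are allowed
-- as long as their paths do not overlap:
CREATE EXTERNAL VOLUME test_catalog.pipelines.static_data
LOCATION 's3://bucket/test';

CREATE EXTERNAL VOLUME prod_catalog.pipelines.static_data
LOCATION 's3://bucket/prod';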

Below are some links for reference:
https://community.databricks.com/t5/data-governance/external-locations/td-p/67356 

https://kb.databricks.com/unity-catalog/invalid_parameter_valuelocation_overlap-overlaps-with-manage... 


