
Multiple volumes from same external location?

dsmoore
New Contributor

Hey all,

Do you know if it's possible to create multiple volumes referencing the same S3 bucket from the same external location?

For example, I have two workspaces (test and prod) that run different versions of pipeline code against the same static data, and I'd like both to read from the same external volume. After setting the volume up in one workspace, I go to create it in the other and get the error below.
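Roughly the call I run in the second workspace (the catalog, schema, and volume names here are placeholders, and the path is the same one the first workspace's volume already uses):

    CREATE EXTERNAL VOLUME main.test_schema.static_data
      LOCATION 's3://<same path as the original volume>';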
Input path url '<s3 path>' overlaps with other external tables or volumes within 'CreateVolume' call. Conflicting tables/volumes: <original_volume>

Any guidance, or is this just a limitation of volumes in Databricks?

Thanks!

1 ACCEPTED SOLUTION

ozaaditya
New Contributor II

Yes, this is a limitation: Unity Catalog does not allow the path of an external volume to overlap with the path of any other external volume or external table, so you cannot create a second volume pointing at the same S3 path. This restriction keeps governance consistent and prevents conflicts when two objects would cover the same data.

Possible Solution:

  1. Use non-overlapping subdirectories within the same bucket (see the sketch below). For example:
    • s3://bucket/prod
    • s3://bucket/test
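A minimal sketch in Databricks SQL, assuming an external location already covers s3://bucket and using placeholder catalog/schema/volume names:

    -- Two volumes over non-overlapping prefixes of the same bucket;
    -- both succeed because their paths do not overlap.
    CREATE EXTERNAL VOLUME main.prod_schema.static_data
      LOCATION 's3://bucket/prod';

    CREATE EXTERNAL VOLUME main.test_schema.static_data
      LOCATION 's3://bucket/test';

Files in each volume are then addressable under /Volumes/<catalog>/<schema>/<volume>/... from notebooks and jobs.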

Below are some links for reference:
https://community.databricks.com/t5/data-governance/external-locations/td-p/67356 

https://kb.databricks.com/unity-catalog/invalid_parameter_valuelocation_overlap-overlaps-with-manage... 


