Hey all,
Do you know if it's possible to create multiple volumes referencing the same S3 bucket from the same external location?
For example, I have two workspaces (test and prod) that test different versions of pipeline code against the same static data, so I'd like both to read from the same external volume. After setting it up in one workspace, I go to create the volume in the other and get the following error:
Input path url '<s3 path>' overlaps with other external tables or volumes within 'CreateVolume' call. Conflicting tables/volumes: <original_volume>
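For context, this is roughly what I'm running in each workspace. The catalog, schema, volume names, and bucket path below are placeholders, but the shape of the calls is the same:

```sql
-- Workspace 1 (test): this succeeds.
CREATE EXTERNAL VOLUME test_catalog.static.shared_data
LOCATION 's3://my-bucket/static-data/';

-- Workspace 2 (prod): same external location, same S3 path.
-- This is the call that fails with the overlap error above.
CREATE EXTERNAL VOLUME prod_catalog.static.shared_data
LOCATION 's3://my-bucket/static-data/';
```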
Any guidance, or is this just a limitation of volumes in Databricks?
Thanks!