INVALID_PARAMETER_VALUE.LOCATION_OVERLAP when trying to copy from s3 location

dplatform_user
New Contributor

Hi,

We are currently running into an issue when trying to copy a file from an S3 location using dbutils.fs.cp. Please see the example below:

source = "s3://test-bucket/external/zones/{database_name}/{table_name}/test.csv"

destination = "s3://test-bucket/external/destination/test.csv"

dbutils.fs.cp(source, destination)

When we execute the statement above, we get the following error:

AnalysisException: [RequestId=b8701964-4ec0-4b20-a25d-ce4835e1398e ErrorClass=INVALID_PARAMETER_VALUE.LOCATION_OVERLAP] Input path url 's3://test-bucket/external/zones/' overlaps with other external tables or volumes within 'GenerateTemporaryPathCredential' call.

Note: the curly braces are a literal part of the source location name.

 

1 REPLY

Brahmareddy
Esteemed Contributor

Hi dplatform_user,

How are you doing today? This error is a common one when working with external storage paths that overlap with Unity Catalog-managed locations. The error message is saying that your source and/or destination S3 paths sit inside a directory (s3://test-bucket/external/) that is already registered or referenced in Databricks as an external location or volume, and Databricks protects that space to avoid unintended data loss or access conflicts.
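To see which registered locations are causing the overlap, you can list the Unity Catalog external locations from a notebook and compare their URLs against the prefix in the error. Here is a minimal sketch, assuming a Unity Catalog-enabled workspace and that the SHOW EXTERNAL LOCATIONS result exposes name and url columns; the overlap check itself is just illustrative:

# List every external location registered in Unity Catalog and flag any
# whose URL contains, or is contained by, the prefix from the error message.
copy_prefix = "s3://test-bucket/external/"  # prefix reported in the error

for loc in spark.sql("SHOW EXTERNAL LOCATIONS").collect():
    # Each row is assumed to expose `name` and `url` columns.
    if loc.url.startswith(copy_prefix) or copy_prefix.startswith(loc.url):
        print(f"Overlapping location: {loc.name} -> {loc.url}")

Any location printed here is one that Unity Catalog will protect, which is what triggers the LOCATION_OVERLAP error.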

Also, while curly braces {} in a path would normally be placeholders, if they are being passed literally in the code (rather than replaced with actual values), that alone can cause problems when the path is resolved.

To fix this:

  1. Make sure you replace {database_name} and {table_name} with actual values (see the sketch after this list).

  2. Use a completely separate S3 path outside of any Unity Catalog-managed external location or volume. For example, copy from s3://test-bucket/tmp-zone/ to s3://test-bucket/destination/ instead of touching anything under /external/.
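Putting both steps together, a minimal sketch might look like this. The tmp-zone prefix and the database/table values are illustrative assumptions; the point is that neither side of the copy sits under a registered external location or volume:

# Substitute real values instead of passing literal curly braces in the URL.
# (sales_db and orders are assumed example values.)
database_name = "sales_db"
table_name = "orders"

# Keep both paths outside the Unity Catalog-managed prefix
# (s3://test-bucket/external/ in this case).
source = f"s3://test-bucket/tmp-zone/{database_name}/{table_name}/test.csv"
destination = "s3://test-bucket/destination/test.csv"

dbutils.fs.cp(source, destination)

If the files genuinely have to live under external/ long term, one option is to register that area as a Unity Catalog volume and work through the volume path instead of the raw S3 URL.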

Databricks restricts copy operations like this to avoid conflicts with external table management, so keeping them outside of those managed zones will prevent this error. Let me know if you’d like help restructuring the S3 layout to avoid overlaps!

Regards,

Brahma
