Administration & Architecture

INVALID_PARAMETER_VALUE.LOCATION_OVERLAP: overlaps with managed storage error with S3 paths

shawnbarrick
New Contributor III

We're trying to read from an S3 bucket using Unity Catalog and are selectively getting "INVALID_PARAMETER_VALUE.LOCATION_OVERLAP: overlaps with managed storage" errors within the same bucket.

This works:

"dbutils.fs.ls("s3://BUCKETNAME/dev/health")"

But within the same bucket we get the location overlap error when running: "dbutils.fs.ls("s3://BUCKETNAME/dev/claims/")"

I reviewed the article https://kb.databricks.com/en_US/unity-catalog/invalid_parameter_valuelocation_overlap-overlaps-with-... but it's unclear to me why one path works.  Is there something I should check other than IAM permissions for the bucket in question?
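For context on why one sibling path can work while another fails: Unity Catalog rejects direct file access to any path that falls under (or contains) a storage location reserved for managed tables, so the check is per-prefix, not per-bucket. A minimal sketch of that kind of prefix-overlap check, where the managed location shown is a hypothetical stand-in and not taken from this thread:

```python
from urllib.parse import urlparse

def paths_overlap(path_a: str, path_b: str) -> bool:
    """Return True if one S3 path is equal to, or nested under, the other."""
    a, b = urlparse(path_a), urlparse(path_b)
    if a.netloc != b.netloc:
        return False  # different buckets never overlap
    # Normalize so "dev/claims" and "dev/claims/" compare equally.
    key_a = a.path.strip("/").split("/")
    key_b = b.path.strip("/").split("/")
    shorter, longer = sorted([key_a, key_b], key=len)
    return longer[: len(shorter)] == shorter

# Hypothetical managed storage location for a schema (assumption for illustration):
managed = "s3://BUCKETNAME/dev/claims"

print(paths_overlap("s3://BUCKETNAME/dev/claims/", managed))  # True  -> would trigger LOCATION_OVERLAP
print(paths_overlap("s3://BUCKETNAME/dev/health", managed))   # False -> listing can succeed
```

If the failing prefix matches a catalog or schema managed location like this, IAM permissions on the bucket won't change the outcome, since the block comes from Unity Catalog itself.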

2 REPLIES

Debayan
Databricks Employee

Hi,
Could you please elaborate on the issue here? 
Running the list command on a managed directory is not supported in Unity Catalog. Catalog/schema storage locations are reserved for managed storage.
Please tag @Debayan in your next comment, which will notify me. Thanks!

shawnbarrick
New Contributor III

Thanks for the response. We are trying to verify S3 bucket access, as well as helping a user troubleshoot a permissions issue.

When I run this command I get a list of the bucket contents for that folder: "dbutils.fs.ls("s3://BUCKETNAME/dev/health")"

Likewise this will show the beginning of a file:

"dbutils.fs.head("s3://gradientai-databricks/dev/health/Q2_2023/1_30.csv")"

However one of our users gets "PermissionError: Forbidden" when running this code:

import pandas as pd
import pickle
data = pd.read_pickle('s3://BUCKETNAME/dev/health/Q2_2023/1_30.csv')
display(data)
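Two things may matter here beyond the Forbidden error itself: pandas reads s3:// URLs through s3fs/fsspec with the cluster's own AWS credentials rather than the Unity Catalog credential that dbutils uses, and the file has a .csv extension, so read_pickle would likely fail to unpickle it even once access is sorted out. A minimal sketch of reading such a file with read_csv, using an in-memory stand-in since the real S3 path can't be reached here:

```python
import io
import pandas as pd

# In-memory stand-in for the CSV on S3 (hypothetical columns, not from the thread).
csv_buffer = io.StringIO("member_id,claim_amount\n1,100.0\n2,250.5\n")

# read_csv, not read_pickle: a .csv file is plain text, not a pickle.
data = pd.read_csv(csv_buffer)
print(data.shape)  # (2, 2)
```

With the real path, the same call would be pd.read_csv("s3://...") once the user's S3 credentials are in place.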
