
[13.3] Vacuum on table fails if shallow clone without write access exists

pyter
New Contributor III

Hello everyone,

We use Unity Catalog, separating our dev, test, and prod data into individual catalogs.

We run weekly vacuums on our prod catalog using a service principal that only has (read+write) access to this production catalog, but no access to our dev/test catalogs. I'd like to keep it this way if possible.
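For reference, a weekly job along these lines can be a small notebook run as the production service principal; a minimal PySpark sketch, with hypothetical catalog and schema names:

# Runs as the production service principal, which only has access to prod_catalog.
# VACUUM removes data files that are no longer referenced by each Delta table
# and are older than the default 7-day retention threshold.
for t in spark.sql("SHOW TABLES IN prod_catalog.sales").collect():
    spark.sql(f"VACUUM prod_catalog.sales.`{t.tableName}`")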

For testing, we do shallow clones from these production tables to the test catalog, using a separate service principal that has (read-only) access to the prod catalog, as well as (read+write) access to the test catalog.
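For reference, such a clone is created with something like the following, run as the cloning service principal (table names are hypothetical):

# A shallow clone copies only table metadata; the data files stay with the source table.
spark.sql("""
    CREATE OR REPLACE TABLE test_catalog.sales.orders
    SHALLOW CLONE prod_catalog.sales.orders
""")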

However, after we do such a shallow clone of a (managed) table on Databricks Runtime 13.3, our production service principal can no longer run VACUUM on the *source table*:

com.databricks.sql.managedcatalog.acl.UnauthorizedAccessException: PERMISSION_DENIED: User does not have USE CATALOG on Catalog 'test-catalog'

This is very unexpected. It seems the service principal needs access to the clone in order to vacuum the original table. The reverse dependency would make sense, but needing access to the cloned table to vacuum the original does not. Right now, anyone can disrupt our production job simply by cloning the table somewhere else.

Is this a bug? Is there any workaround?

Cheers, Peter

ACCEPTED SOLUTION

Lakshay
Databricks Employee

Are you using Unity Catalog in single user access mode? If yes, could you try using shared access mode?
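On job clusters, the access mode maps to the data_security_mode field of the cluster spec; a minimal sketch of a new_cluster block for the Jobs/Clusters API, with hypothetical node type and version:

new_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",  # hypothetical node type
    "num_workers": 1,
    # "USER_ISOLATION" is shared access mode; "SINGLE_USER" is single user access mode.
    "data_security_mode": "USER_ISOLATION",
}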


REPLIES

pyter
New Contributor III

Thanks for the suggestion, I will try that now.

pyter
New Contributor III

It actually worked, thank you! Do you happen to know the technical reason why this does not work in single-user mode? I wonder if I should report this as a bug to Databricks.

Lakshay
Databricks Employee
Databricks Employee

Hi @pyter, this is actually expected behavior when working with Unity Catalog in single user access mode. It is also documented here: https://docs.databricks.com/en/delta/clone-unity-catalog.html#work-with-shallow-cloned-tables-in-sin...

Here is an excerpt from this document: Databricks recommends working with Unity Catalog clones on compute with shared access mode as this allows independent evolution of permissions for Unity Catalog shallow clone targets and their source tables.
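Concretely, "independent evolution of permissions" means the grants on the source and on the clone target can be managed separately; a hedged sketch of the setup described in the question, with hypothetical principal names:

# Vacuum principal: read+write on the prod catalog only; no grants on the test catalog.
spark.sql("GRANT USE CATALOG, USE SCHEMA, SELECT, MODIFY ON CATALOG `prod-catalog` TO `vacuum-sp`")
# Cloning principal: read-only on prod, read+write on test.
spark.sql("GRANT USE CATALOG, USE SCHEMA, SELECT ON CATALOG `prod-catalog` TO `clone-sp`")
spark.sql("GRANT USE CATALOG, USE SCHEMA, SELECT, MODIFY, CREATE TABLE ON CATALOG `test-catalog` TO `clone-sp`")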

pyter
New Contributor III

I actually read through this section before posting here. As worded, I did not think it applied to my situation, since the failing job does not touch the cloned table at all; it only deals with the original table. The creator of the original table might not even know that a clone exists, so it is surprising that cloning changes operations on the original table.

I don't fully understand how shared access mode differs from single-user mode on job clusters (which aren't actually used by multiple people), but I will enable it for my vacuum job since that seems to help. I still think the documentation is not ideal in this regard, and an error message that points at access rights on a completely different table (the clone) is not helpful either.
