01-17-2024 06:12 AM
Hello everyone,
We use Unity Catalog, separating our dev, test, and prod data into individual catalogs.
We run weekly VACUUMs on our prod catalog using a service principal that only has read+write access to this production catalog, and no access to our dev/test catalogs. I'd like to keep it this way if possible.
For testing, we create shallow clones of these production tables in the test catalog, using a separate service principal that has read-only access to the prod catalog as well as read+write access to the test catalog.
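For concreteness, the clone step looks roughly like this (catalog, schema, and table names below are placeholders, and this assumes a Databricks notebook or job where `spark` is predefined):

```python
# Sketch with placeholder names; run as the test service principal,
# which has read-only access to prod-catalog and read+write on test-catalog.
spark.sql("""
    CREATE OR REPLACE TABLE `test-catalog`.staging.orders
    SHALLOW CLONE `prod-catalog`.gold.orders
""")
```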
However, when we shallow clone a managed table this way on Databricks Runtime 13.3, our production service principal can no longer run VACUUM on the *source table*:
com.databricks.sql.managedcatalog.acl.UnauthorizedAccessException: PERMISSION_DENIED: User does not have USE CATALOG on Catalog 'test-catalog'
This is very unexpected: it seems the service principal needs access to the clone in order to vacuum the original table. The reverse would make sense, but needing access to the cloned table to vacuum the original does not. As it stands, anyone can disturb our production job simply by cloning the table somewhere else.
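To illustrate, this is roughly the maintenance command that now fails (same placeholder names as in the clone sketch above):

```python
# Run as the production service principal, which has no grants at all on
# test-catalog; this worked fine until the shallow clone was created.
spark.sql("VACUUM `prod-catalog`.gold.orders")
```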
Is this a bug? Is there any workaround?
Cheers, Peter
01-17-2024 07:33 AM
Are you using Unity Catalog with compute in single-user access mode? If so, could you try shared access mode?
01-18-2024 12:59 AM
Thanks for the suggestion, I will try that now.
01-18-2024 01:44 AM
It actually worked, thank you! Do you happen to know the technical reason why this does not work in single-user mode? I wonder if I should report this as a bug to Databricks.
01-18-2024 03:54 AM - edited 01-18-2024 03:55 AM
Hi @pyter, this is actually expected behavior when working with Unity Catalog in single-user access mode. It is documented here: https://docs.databricks.com/en/delta/clone-unity-catalog.html#work-with-shallow-cloned-tables-in-sin...
Here is an excerpt from that document: "Databricks recommends working with Unity Catalog clones on compute with shared access mode as this allows independent evolution of permissions for Unity Catalog shallow clone targets and their source tables."
01-18-2024 04:05 AM
I actually read through that section before posting here. The way it is worded, I did not think it applied to my situation: the failing job does not touch the cloned table at all; it only deals with the original table. The creator of the original table might not even know that a clone exists, so it is surprising to me that cloning changes what operations are possible on the original table.
I don't fully understand how shared access mode differs from single-user mode on job clusters (which aren't actually used by multiple people), but I will enable it for my vacuum job since that seems to help. I still think the documentation is not ideal in this regard, and the error message, which mentions access rights to a completely different table (the clone), is also not helpful.
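For anyone else who hits this: as far as I can tell, the change amounts to switching the job cluster's access mode, which corresponds to the data_security_mode field in the cluster spec. Roughly (Spark version and node type below are placeholders):

```python
# Sketch of the relevant part of a job cluster spec (Clusters API field names).
new_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 1,
    # "SINGLE_USER" is where the clone/VACUUM issue bites;
    # "USER_ISOLATION" is shared access mode.
    "data_security_mode": "USER_ISOLATION",
}
```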