Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Databricks AWS permission question

Lennart
New Contributor II

Hello,

I'm currently using Databricks on AWS for some basic ETL where the resulting data is stored as Hive external delta tables.

Even though Unity Catalog is disabled, table access control is disabled for the workspace, and the cluster runs in shared/no-isolation mode with no access control (that I'm aware of), there are still "Permissions" buttons in the catalog interface, and I can see exceptions related to Hive permissions being thrown in the driver log.

I'm wondering if anyone is familiar with this: could it be a cluster misconfiguration causing silent errors, or should it simply be ignored when deliberately not doing access management in the workspace?

Best regards


1 ACCEPTED SOLUTION

Isi
Contributor III

Hey @Lennart 

My opinion is:

Even if Unity Catalog is disabled, Table Access Control (TAC) is off, and you’re running shared clusters with no isolation, Databricks still shows the “Permissions” buttons in the UI. That’s expected: they’re part of the default interface and don’t actually enforce anything unless Unity Catalog or legacy Hive ACLs are active. So unless you’ve configured specific permissions in Hive or UC, you can safely ignore those UI elements.
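For reference, the legacy table-ACL feature only enforces anything when it is explicitly switched on in the cluster's Spark config, roughly along these lines (a sketch based on the documented legacy table access control settings; verify against your runtime's docs):

```
spark.databricks.acl.dfAclsEnabled true
spark.databricks.repl.allowedLanguages python,sql
```

If nothing like this appears in your cluster configuration, the "Permissions" buttons are cosmetic.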

Regarding the Hive-related permission errors in your driver logs:

These messages come from internal system queries, usually triggered automatically by the catalog browser, data preview panels, or Spark background jobs like profiling. Even when TAC is off, the system might still try to validate or list permissions, especially when interacting with the metastore.

These messages are harmless in most cases and don’t block your jobs, unless there’s an actual permissions issue at the metastore or S3 level. If everything is running as expected, you can ignore them.
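To make that check concrete, here is a small, self-contained Python sketch that triages driver-log lines into Hive-permission noise versus everything else. The log lines and message formats below are hypothetical examples, not actual Databricks output:

```python
import re

# Hypothetical driver-log excerpt; real message formats vary by
# Databricks runtime, so treat these lines as illustrative only.
SAMPLE_LOG = """\
25/03/14 09:01:12 INFO DriverCorral: starting job
25/03/14 09:01:13 WARN HiveMetaStore: MetaException(message:No privilege check ...)
25/03/14 09:01:14 ERROR SecurityManager: org.apache.hadoop.hive.ql.security authorization failed
25/03/14 09:01:15 INFO TaskSetManager: job finished successfully
"""

# Pattern for the Hive-permission chatter that, per the discussion
# above, is usually harmless background noise from metastore probes.
HIVE_PERM = re.compile(r"hive.*(privilege|authorization|permission)", re.IGNORECASE)

def triage(log_text):
    """Split log lines into Hive-permission noise and everything else."""
    noise, other = [], []
    for line in log_text.splitlines():
        (noise if HIVE_PERM.search(line) else other).append(line)
    return noise, other

noise, other = triage(SAMPLE_LOG)
print(f"{len(noise)} Hive-permission lines, {len(other)} other lines")
```

If every Hive-permission line lands in the noise bucket while your jobs still finish successfully, that matches the "safe to ignore" case described above.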

Hope that helps! 🙂

Isi

 


3 REPLIES


Lennart
New Contributor II

@Isi Thanks for taking the time to answer 🙂

Things are generally working well and the reason I started looking into these errors was actually two other unrelated errors happening in short succession.

In any case, it's nice to know that this can be ignored.

Best regards

Isi
Contributor III

@Lennart 

Glad to hear it helped! If you think this solves your question, please consider marking it as the accepted answer so it can assist other users as well.

Best regards 🙂

Isi
