Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Error in SQL statement: SparkException: Table Access Control is not enabled on this cluster.

Nishin
New Contributor

Hi Experts,

I am using the 'Data Science and Engineering' workspace in Azure Databricks and want to test table access control on the legacy Hive metastore on a cluster.

I did everything mentioned in the link 'https://learn.microsoft.com/en-us/azure/databricks/data-governance/table-acls/table-acl#--enable-table-access-control-for-your-workspace'.

I am using a shared cluster with the policy set to 'Unrestricted'.

However, when I run the command I get the following error (see attached screenshot):

Error in SQL statement: SparkException: Table Access Control is not enabled on this cluster.

Can anyone please advise what mistake I am making, or whether there is anything else I need to do before running the command?

2 REPLIES

Anonymous
Not applicable

@nishin kumar:

The error message indicates that there is a Spark exception while executing the SQL statement. There could be several reasons for this error. Here are a few things you can try to troubleshoot this issue:

  1. Check if the table exists: Make sure that the table you are trying to apply table access control to exists in the database. You can check this by running the following command:
SHOW TABLES IN <database_name>;

Replace <database_name> with the name of your database.

  2. Check if the table is accessible: Make sure that the table is accessible from your cluster. You can test this by running a simple SQL query on the table:
SELECT * FROM <database_name>.<table_name> LIMIT 10;

Replace <database_name> and <table_name> with the names of your database and table respectively.

  3. Check the SQL syntax: Double-check the SQL statement you are running to make sure there are no syntax errors. You can also try running the same statement on a Databricks SQL endpoint to see if it works there.

  4. Check the cluster logs: Check the cluster logs for any error messages related to the Spark exception. You can find the cluster logs in the Azure portal by going to the Databricks workspace, selecting the "Clusters" tab, clicking the cluster name, and choosing "Logs" from the menu.
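
Since the error says table access control is not enabled on this cluster, it is also worth checking the cluster's own configuration, not just the workspace-level setting. For legacy Hive metastore table ACLs, the Microsoft guide linked above ties the feature to the cluster's Spark configuration; a sketch of the relevant settings (verify the exact keys and supported cluster modes against your Databricks Runtime version):

spark.databricks.acl.dfAclsEnabled true
spark.databricks.repl.allowedLanguages python,sql

With table access control active on the cluster, a GRANT statement along these lines should then succeed (the table and user names here are hypothetical placeholders):

GRANT SELECT ON TABLE default.my_table TO `user@example.com`;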

Anonymous
Not applicable

Hi @nishin kumar

Hope everything is going great.

Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can help you. 

Cheers!

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won't want to miss the chance to attend and share knowledge.

If there isn't a group near you, start one and help create a community that brings people together.

Request a New Group