In a Databricks database I was able to set table permissions for groups, but now I get this error when using a cluster:
Error getting permissions
summary: SparkException: Trying to perform permission action on Hive Metastore /CATALOG/`hive_metastore`/DATABASE/`db_name`/TABLE/`tbl_name` but Table Access Control is not enabled on this cluster., data: com.databricks.backend.common.rpc.SparkDriverExceptions$SQLExecutionException: org.apache.spark.SparkException:
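For reference, this is the kind of permission action involved, a minimal sketch from a notebook assuming the legacy Hive metastore GRANT/SHOW GRANT syntax; the database, table, and group names are placeholders:

```python
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` is already defined; getOrCreate() returns it.
spark = SparkSession.builder.getOrCreate()

# Grant read access on the Hive metastore table to a group (placeholder names).
spark.sql("GRANT SELECT ON TABLE db_name.tbl_name TO `my_group`")

# Looking up the grants is presumably the "getting permissions" step that fails.
spark.sql("SHOW GRANT ON TABLE db_name.tbl_name").show(truncate=False)
```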
Table Access Control is indeed enabled in the Security tab of the Admin settings.
The cluster configuration I am using is:
Policy --> Unrestricted
Access mode --> No Isolation Shared
Worker type --> Standard_D8_v3
Driver type --> Standard_D8_v3
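From a notebook on this cluster I can also check whether the table-ACL flag is actually set, a quick sketch assuming the legacy conf key spark.databricks.acl.dcEnabled is the relevant one:

```python
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` is already defined; getOrCreate() returns it.
spark = SparkSession.builder.getOrCreate()

# Returns "false" (the supplied default) if the conf is not set on this cluster.
print(spark.conf.get("spark.databricks.acl.dcEnabled", "false"))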