As we start building our Lakehouse solution on Databricks, we need table access control (table ACLs) to be enabled. So far I have found two options:
- via the UI or Terraform: create a high-concurrency cluster and enable table access control for Python and SQL. In Terraform the relevant configuration looks like this:
spark_conf = {
  "spark.databricks.cluster.profile"       = "serverless"
  "spark.databricks.repl.allowedLanguages" = "python,sql"
  "spark.databricks.acl.dfAclsEnabled"     = "true"
}
- only in Terraform: create a "standard" cluster with table access control enabled for Python, SQL, and R. In Terraform the configuration is as follows (a full resource sketch is shown after this list):
spark_conf = {
  "spark.databricks.repl.allowedLanguages" = "python,sql,r"
  "spark.databricks.acl.dfAclsEnabled"     = "true"
}
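To put either spark_conf in context, here is a minimal sketch of a complete databricks_cluster resource for the second option. The cluster name, Spark version, node type, and sizing below are placeholder assumptions, not values from our setup; for the first option you would add "spark.databricks.cluster.profile" = "serverless" and drop r from the allowed languages.

resource "databricks_cluster" "table_acl_cluster" {
  # Placeholder values -- adjust to your workspace and cloud provider.
  cluster_name            = "table-acl-cluster"
  spark_version           = "11.3.x-scala2.12"
  node_type_id            = "Standard_DS3_v2"
  num_workers             = 2
  autotermination_minutes = 30

  spark_conf = {
    # Option 2: "standard" cluster with table ACLs for Python, SQL and R.
    "spark.databricks.repl.allowedLanguages" = "python,sql,r"
    "spark.databricks.acl.dfAclsEnabled"     = "true"
  }
}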
We have tested that this second option only lets me see tables I have been granted access to. Am I missing something in the documentation? Is this the correct way to deploy a single-user cluster with table ACLs?
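For completeness, the grants themselves can also be managed from Terraform. Below is a minimal sketch using the provider's databricks_sql_permissions resource; the table name and principal are made-up examples, not part of our actual setup.

resource "databricks_sql_permissions" "customers_table" {
  # Hypothetical table and group -- replace with your own.
  table = "customers"

  privilege_assignments {
    principal  = "data-analysts"
    privileges = ["SELECT", "READ_METADATA"]
  }
}

As I understand it, this resource issues the equivalent GRANT statements against a table-ACL-enabled cluster, so a cluster like the one above would only expose the table to the granted principals.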