Hi @NC ,
The error message you're encountering, "Spark Conf: 'spark.databricks.acl.enabled' is not allowed when choosing an access mode," is likely due to the job cluster's access mode not being set to "assigned" or "no isolation shared," as required by Databricks Runtime 12.0 ML and above.
To resolve this issue, follow these steps:
- Adjust the access mode of your job cluster to either "assigned" or "no isolation shared." The exact steps for doing this may vary depending on your setup and the interface you are using (UI, API, etc.).
- Alternatively, run the job on an existing cluster that is already set to one of the required access modes. All-Purpose Clusters are often configured this way.
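If you define the job cluster via the Jobs API, the access mode is controlled by the `data_security_mode` field: `"SINGLE_USER"` corresponds to the "assigned" mode and `"NONE"` to "no isolation shared." A minimal sketch of a job cluster spec is below; the `spark_version` and `node_type_id` values are placeholders you should replace with your own.

```json
{
  "new_cluster": {
    "spark_version": "12.0.x-ml-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    "data_security_mode": "SINGLE_USER"
  }
}
```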
By making these adjustments, you should be able to resolve the ACL-related error and run your job successfully. Of the two options, changing the job cluster's access mode is generally recommended, as it ensures long-term compatibility without tying the job to a separately managed cluster.