Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Unable to use job cluster for task in workflows

NC
New Contributor III

Hi,

I have a workflow set up in Databricks using Databricks Runtime 12.2 LTS ML.

I am trying to use a job cluster for the task, but I am getting the following error:

Spark Conf: 'spark.databricks.acl.enabled' is not allowed when choosing an access mode

As a result, I have to use my All-Purpose Cluster.

Can anyone let me know what is preventing me from using a job cluster for my workflow and how to fix this?

Thank you.

1 ACCEPTED SOLUTION

Kaniz_Fatma
Community Manager

Hi @NC, I apologize for the inconvenience you're facing.

Let's explore additional steps to troubleshoot this issue:

Cluster Restart:

  • Sometimes, changes in cluster configurations require a restart. Try restarting your job cluster after setting the access mode to "No Isolation Shared". This can help apply the changes effectively.

Cluster Permissions:

  • Ensure that you have the necessary permissions to modify cluster configurations. If you're not the owner or have restricted permissions, you might encounter issues while saving changes.

Cluster Initialization Scripts:

  • Check whether any initialization scripts are associated with your cluster. These scripts can override Spark configurations. If you find any, review their content and make sure they don't conflict with your desired settings.

Workspace Permissions:

  • Verify that you have the appropriate permissions within the Databricks workspace. Sometimes, workspace-level permissions can affect cluster configuration changes.

Temporary Workaround:

  • While investigating the issue, consider using your All-Purpose Cluster for now. Although it isn't ideal, it will let you keep working while you troubleshoot the job cluster.

Contact Databricks Support:

  • If none of the above steps resolve the problem, I recommend filing a support ticket with Databricks.

I hope this helps you get closer to a solution!
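The init-script and Spark-conf checks above can be partly automated by fetching the cluster spec (for example with the Databricks CLI's `databricks clusters get`) and scanning its `spark_conf`. A minimal sketch; the helper name and the list of conflicting keys are assumptions based on the error in this thread, not an official API:

```python
# Illustrative helper: scan a cluster spec (as returned by the Clusters API)
# for Spark confs that can conflict with access-mode selection.
# The key list is an assumption based on the error message in this thread,
# not an exhaustive official list.

CONFLICTING_KEYS = ("spark.databricks.acl.enabled",)

def find_conflicting_confs(cluster_spec: dict) -> list[str]:
    """Return the Spark conf keys that may block saving an access mode."""
    spark_conf = cluster_spec.get("spark_conf", {})
    return [key for key in spark_conf if key in CONFLICTING_KEYS]

# Example: a pared-down job cluster spec with the problematic conf set.
spec = {
    "spark_version": "12.2.x-cpu-ml-scala2.12",
    "data_security_mode": "NONE",  # "No Isolation Shared" in the UI
    "spark_conf": {
        "spark.databricks.acl.enabled": "true",
        "spark.speculation": "false",
    },
}

print(find_conflicting_confs(spec))  # -> ['spark.databricks.acl.enabled']
```

If this reports any keys, remove them from the cluster's Spark config (or from the init script that sets them) before saving the access mode.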


3 REPLIES

Kaniz_Fatma
Community Manager

Hi @NC , 

The error message you're encountering, "Spark Conf: 'spark.databricks.acl.enabled' is not allowed when choosing an access mode," is likely because the job cluster's access mode is not set to "assigned" or "no isolation shared", as required by Databricks Runtime 12.0 ML and above.

To resolve this issue, follow these steps:

  1. Adjust the access mode of your job cluster to either "assigned" or "no isolation shared." The exact steps for doing this may vary depending on your setup and the interface you are using (UI, API, etc.).

  2. Ensure that the cluster you're using for running jobs is already set to one of the required access modes. All-Purpose Clusters are often configured in this way.

By making these adjustments, you should be able to resolve the ACL-related error and run your job successfully. It's generally recommended to use the first solution, changing the access mode, to ensure long-term compatibility and avoid temporary changes to the cluster configuration.
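As a sketch of what step 1 looks like when defining the job cluster through the API rather than the UI: the access mode is controlled by the `data_security_mode` field on the cluster spec. The payload below is illustrative only (node type and Spark version are placeholders; adjust them to your workspace):

```python
# Illustrative job-cluster payload for the Jobs API. In the cluster spec,
# "SINGLE_USER" corresponds to the "Assigned" access mode in the UI and
# "NONE" corresponds to "No Isolation Shared".
new_cluster = {
    "spark_version": "12.2.x-cpu-ml-scala2.12",
    "node_type_id": "i3.xlarge",  # placeholder; use a node type from your workspace
    "num_workers": 2,
    "data_security_mode": "SINGLE_USER",
    # Deliberately empty: setting "spark.databricks.acl.enabled" here
    # alongside an access mode triggers the error from this thread.
    "spark_conf": {},
}

assert "spark.databricks.acl.enabled" not in new_cluster["spark_conf"]
```

The key point is that `spark_conf` must not set `spark.databricks.acl.enabled` once an access mode is chosen; the access mode itself governs the ACL behaviour.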

NC
New Contributor III

Hi @Kaniz_Fatma,

I have tried setting the access mode to "No Isolation Shared", but I still get the same error when saving.

[Screenshot: the same error shown when saving the cluster configuration]

 
