Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Inquiry Regarding Enabling Unity Catalog in Databricks Cluster Configuration via API

Himanshu4
New Contributor II

Dear Databricks Community,

I hope this message finds you well. I am currently working on automating cluster configuration updates in Databricks using the API. As part of this automation, I need to ensure that Unity Catalog is enabled in the cluster configuration.

I understand that enabling Unity Catalog is typically done manually through the Databricks UI during cluster creation or modification. However, I would like to inquire if there is a way to programmatically enable Unity Catalog via the Databricks API.

In my current script, I fetch job details and cluster configuration settings, but the cluster summary does not show Unity Catalog as enabled. I would greatly appreciate any guidance on how to achieve this programmatically via the API.
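For context, this is roughly how I am fetching the job and its cluster configuration today (the workspace URL, token, and job ID below are placeholders):

```python
import requests

# Placeholder values - substitute your own workspace URL, token, and job ID
HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"
JOB_ID = 12345

headers = {"Authorization": f"Bearer {TOKEN}"}

# Fetch the job definition, which includes any job cluster specs
resp = requests.get(f"{HOST}/api/2.1/jobs/get",
                    headers=headers,
                    params={"job_id": JOB_ID})
resp.raise_for_status()
job = resp.json()

# Inspect the cluster configuration attached to the job
for jc in job.get("settings", {}).get("job_clusters", []):
    spec = jc["new_cluster"]
    print(jc["job_cluster_key"],
          spec.get("spark_version"),
          spec.get("data_security_mode"))  # access-mode field relevant to Unity Catalog
```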

Thank you very much for your assistance and support.

4 REPLIES

raphaelblg
Contributor III

Hi @Himanshu4 ,

To run Unity Catalog workloads, compute resources must comply with certain security requirements. Non-compliant compute resources cannot access data or other objects in Unity Catalog. SQL warehouses always comply with Unity Catalog requirements, but some cluster access modes do not. See Access modes.
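As a rough illustration (placeholder values, not an official snippet), a Unity Catalog-capable cluster spec carries a compliant access mode in its data_security_mode field:

```python
# Sketch of a UC-capable cluster spec (placeholder values).
# "USER_ISOLATION" corresponds to the Shared access mode in the UI;
# "SINGLE_USER" (together with single_user_name) corresponds to Single user.
uc_cluster_spec = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    "data_security_mode": "USER_ISOLATION",
}
```

Legacy access modes (or omitting the field on older clusters) generally leave the cluster unable to access Unity Catalog data.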

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

Himanshu4
New Contributor II

Hi @raphaelblg ,

Thanks for your response. My task is to upgrade jobs. One scenario involves upgrading job clusters, where I am trying to move non-Unity Catalog clusters to Unity Catalog-enabled clusters. Is it possible to do this simply by upgrading the Spark version of the cluster using the Databricks API?

raphaelblg
Contributor III

@Himanshu4 As long as the job cluster uses the Shared access mode, UC should be accessible.
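A minimal sketch of what that could look like through the Jobs API (single job, placeholder values; the job_clusters array is read, modified, and written back):

```python
import requests

HOST = "https://<workspace-url>"     # placeholder
TOKEN = "<personal-access-token>"    # placeholder
JOB_ID = 12345                       # placeholder

headers = {"Authorization": f"Bearer {TOKEN}"}

# Read the current job settings
job = requests.get(f"{HOST}/api/2.1/jobs/get",
                   headers=headers, params={"job_id": JOB_ID}).json()
job_clusters = job["settings"]["job_clusters"]

# Switch each job cluster to the Shared access mode
for jc in job_clusters:
    jc["new_cluster"]["data_security_mode"] = "USER_ISOLATION"

# Partially update the job with the modified cluster specs
resp = requests.post(f"{HOST}/api/2.1/jobs/update",
                     headers=headers,
                     json={"job_id": JOB_ID,
                           "new_settings": {"job_clusters": job_clusters}})
resp.raise_for_status()
```

Upgrading the Spark version alone does not change the access mode; the data_security_mode field is what determines Unity Catalog access.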

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

Himanshu4
New Contributor II

Hi Raphael,
Can we fetch job details from one workspace and create a new job in another workspace with the same job ID and configuration?
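Something along these lines is what I have in mind (placeholders for both workspaces; from what I can tell, jobs/create does not accept a job_id, so I am not sure the original ID can be preserved):

```python
import requests

SRC_HOST, SRC_TOKEN = "https://<source-workspace>", "<source-token>"   # placeholders
DST_HOST, DST_TOKEN = "https://<target-workspace>", "<target-token>"   # placeholders
JOB_ID = 12345                                                         # placeholder

# Fetch the job settings from the source workspace
src = requests.get(f"{SRC_HOST}/api/2.1/jobs/get",
                   headers={"Authorization": f"Bearer {SRC_TOKEN}"},
                   params={"job_id": JOB_ID}).json()

# Create a job with the same settings in the target workspace;
# the target workspace assigns its own job ID
resp = requests.post(f"{DST_HOST}/api/2.1/jobs/create",
                     headers={"Authorization": f"Bearer {DST_TOKEN}"},
                     json=src["settings"])
resp.raise_for_status()
print("New job ID in target workspace:", resp.json()["job_id"])
```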
