Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Inquiry Regarding Enabling Unity Catalog in Databricks Cluster Configuration via API

Himanshu4
New Contributor II

Dear Databricks Community,

I hope this message finds you well. I am currently working on automating cluster configuration updates in Databricks using the API. As part of this automation, I want to ensure that Unity Catalog is enabled within the cluster configuration.

I understand that enabling Unity Catalog is typically done manually through the Databricks UI during cluster creation or modification. However, I would like to ask whether there is a way to enable Unity Catalog programmatically via the Databricks API.

In my current script, I am fetching job details and cluster configuration settings, but I cannot get the cluster summary to reflect Unity Catalog as enabled. I would greatly appreciate any insight or guidance on how to achieve this programmatically via the API.
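
For reference, here is a minimal sketch of the fetch I am doing, assuming the Jobs 2.1 REST API; the workspace URL, token, and job ID are placeholders:

```python
# Rough sketch of my current fetch, using the REST API directly.
# The workspace URL, token, and job ID are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
JOB_ID = 123                                             # placeholder

# Fetch the job definition, which includes the job cluster specs.
resp = requests.get(
    f"{HOST}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"job_id": JOB_ID},
)
resp.raise_for_status()
settings = resp.json()["settings"]

# Print each job cluster's spec; nothing here is obviously reporting
# Unity Catalog as enabled.
for jc in settings.get("job_clusters", []):
    print(jc["job_cluster_key"], jc["new_cluster"])
```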

Thank you very much for your assistance and support.

5 REPLIES

raphaelblg
Databricks Employee

Hi @Himanshu4 ,

To run Unity Catalog workloads, compute resources must comply with certain security requirements. Non-compliant compute resources cannot access data or other objects in Unity Catalog. SQL warehouses always comply with Unity Catalog requirements, but some cluster access modes do not. See Access modes.
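
As a rough illustration only: for API-created compute, the access mode shown in the UI corresponds to the data_security_mode field of the cluster spec, and SINGLE_USER or USER_ISOLATION (Shared) are the Unity Catalog-capable values. The node type and runtime version below are placeholders:

```python
# Illustrative new_cluster spec with a Unity Catalog-capable access mode.
# "USER_ISOLATION" corresponds to Shared mode in the UI; "SINGLE_USER"
# (together with "single_user_name") is the single-user mode.
# The node type and runtime version are placeholders; pick values that are
# valid for your cloud and workspace.
uc_ready_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "<node-type-id>",
    "num_workers": 2,
    "data_security_mode": "USER_ISOLATION",
}
```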

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

Himanshu4
New Contributor II

Hi @raphaelblg ,

Thanks for your response. My task is to upgrade jobs. One scenario involves upgrading job clusters from non-Unity Catalog clusters to Unity Catalog-enabled clusters. Is this possible by upgrading the cluster's Spark version via the Databricks API?

raphaelblg
Databricks Employee

@Himanshu4 As long as the job cluster uses the Shared access mode, Unity Catalog should be accessible.
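
As a minimal sketch of that kind of upgrade, assuming the Jobs 2.1 REST API with a placeholder workspace URL, token, and job ID (jobs/reset replaces the whole settings object, so the current settings are fetched first):

```python
# Sketch: switch a job's clusters to the Shared (USER_ISOLATION) access mode.
# The workspace URL, token, and job ID are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
JOB_ID = 123                                             # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# 1. Read the current job settings.
resp = requests.get(
    f"{HOST}/api/2.1/jobs/get", headers=HEADERS, params={"job_id": JOB_ID}
)
resp.raise_for_status()
settings = resp.json()["settings"]

# 2. Set each job cluster to Shared access mode and, if needed, a
#    Unity Catalog-supported runtime.
for jc in settings.get("job_clusters", []):
    jc["new_cluster"]["data_security_mode"] = "USER_ISOLATION"
    jc["new_cluster"]["spark_version"] = "13.3.x-scala2.12"  # example runtime

# 3. jobs/reset overwrites the job's settings with new_settings.
resp = requests.post(
    f"{HOST}/api/2.1/jobs/reset",
    headers=HEADERS,
    json={"job_id": JOB_ID, "new_settings": settings},
)
resp.raise_for_status()
print("Job clusters updated to Shared access mode.")
```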

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

Himanshu4
New Contributor II

Hi Raphael,
Can we fetch job details from one workspace and create a new job in a different workspace with the same job ID and configuration?

raphaelblg
Databricks Employee

@Himanshu4 You can fetch the job details from one workspace, but the job ID in the new workspace is auto-generated and cannot be customized.
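
As a rough sketch of cloning a job across workspaces with the Jobs 2.1 REST API (hostnames, tokens, and the source job ID are placeholders), note that the target workspace returns its own newly generated job_id:

```python
# Sketch: copy a job definition from one workspace to another.
# Hostnames, tokens, and the source job ID are placeholders.
import requests

SRC_HOST = "https://<source-workspace>.cloud.databricks.com"  # placeholder
DST_HOST = "https://<target-workspace>.cloud.databricks.com"  # placeholder
SRC_TOKEN = "<source-token>"                                  # placeholder
DST_TOKEN = "<target-token>"                                  # placeholder
SRC_JOB_ID = 123                                              # placeholder

# 1. Export the job's settings (name, tasks, job_clusters, schedule, ...).
src = requests.get(
    f"{SRC_HOST}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {SRC_TOKEN}"},
    params={"job_id": SRC_JOB_ID},
)
src.raise_for_status()
settings = src.json()["settings"]

# 2. Create a job with the same settings in the target workspace.
dst = requests.post(
    f"{DST_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {DST_TOKEN}"},
    json=settings,
)
dst.raise_for_status()

# The job ID is generated by the target workspace and cannot be reused.
print("New job_id in target workspace:", dst.json()["job_id"])
```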

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks
