Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Job with a cluster defined in DAB YML has error [UC_NOT_ENABLED] on cluster

RolandoCM2020
Visitor

Hello,

The error is: [UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.

We have a job that uses a cluster defined in yml as:

  small_cluster_id:
    description: "Small cluster, singleNode, for longer jobs"
    type: complex
    default:
      spark_version: "15.4.x-scala2.12"
      node_type_id: "Standard_D4ds_v5"
      num_workers: 0
      spark_conf:
        'spark.databricks.cluster.profile': 'singleNode'
        'spark.master': 'local[*]'
      custom_tags:
        'ResourceClass': 'SingleNode'

We defined this job some months ago, and it ran for a while with no issue. I need some help, as the DAB documentation is a bit lacking on clearing this up. Recently, our end user reported an issue; apparently they have had it for a while now. Between our handing off the project to them and this error being communicated to us, they changed the user that runs the job. I am not sure if that is related.

The job and cluster were written around September of last year. Has anyone run into something similar, or does anyone know what configuration needs to be updated in the YAML for Unity Catalog to be accessible? Or could this be from the user lacking access permissions in some way?

I appreciate any help, thanks.

1 REPLY

RahulPathakDBX
Databricks Employee

UC_NOT_ENABLED is a cluster configuration issue, not a permissions issue on the user. Databricks throws UC_NOT_ENABLED when the compute is not Unity Catalog enabled, i.e., its access mode is neither Standard nor Dedicated (data_security_mode is not USER_ISOLATION or SINGLE_USER).

Your YAML cluster configuration is missing the access mode (data_security_mode) field. For Bundles/Jobs clusters, you need to set it explicitly if you want UC access.

If data_security_mode is not set in the job cluster definition, Databricks can fall back to the workspace's "Default access mode for jobs compute" setting; if that default is No isolation shared, the cluster is not UC-enabled and you get UC_NOT_ENABLED. So either someone changed the workspace Default access mode for jobs compute, or the workspace behavior changed when the Jobs/UC defaults were rolled out. Changing the run-as user alone would normally cause permission errors (missing privileges), not UC_NOT_ENABLED.
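As a quick sanity check, the rule above can be expressed in a few lines. This is a hypothetical helper for reasoning about the spec, not a Databricks API:

```python
# UC-enabled access modes: USER_ISOLATION (Standard) and SINGLE_USER (Dedicated).
UC_ENABLED_MODES = {"USER_ISOLATION", "SINGLE_USER"}

def is_uc_enabled(cluster_spec: dict) -> bool:
    """Return True if a job-cluster spec would be Unity Catalog enabled.

    A missing data_security_mode falls back to the workspace default, which
    may be No isolation shared, so we treat an unset field as not UC-enabled
    to surface the misconfiguration.
    """
    return cluster_spec.get("data_security_mode") in UC_ENABLED_MODES

# The cluster spec from the original post has no data_security_mode set:
spec = {"spark_version": "15.4.x-scala2.12", "num_workers": 0}
print(is_uc_enabled(spec))                                          # False
print(is_uc_enabled({**spec, "data_security_mode": "SINGLE_USER"}))  # True
```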

Add data_security_mode (and single_user_name if you choose single-user) to the cluster definition that the job uses.

Option 1: Standard access mode (multi-user UC cluster):

data_security_mode: "USER_ISOLATION" # Standard access mode (UC-enabled)

Option 2: Dedicated / single-user UC cluster (best for a "run as" user or service principal):

data_security_mode: "SINGLE_USER"
single_user_name: "user-or-spn@your-domain.com" # or ${workspace.current_user.userName}

This mirrors the Databricks Bundles cluster schema and examples, which use data_security_mode: SINGLE_USER plus single_user_name for UC-enabled clusters.
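Putting it together with the cluster definition from the original post, a sketch of the updated YAML using Option 2 might look like this (the single_user_name choice is an assumption; set it to your actual run-as user or service principal):

```yaml
small_cluster_id:
  description: "Small cluster, singleNode, for longer jobs"
  type: complex
  default:
    spark_version: "15.4.x-scala2.12"
    node_type_id: "Standard_D4ds_v5"
    num_workers: 0
    data_security_mode: "SINGLE_USER"  # Dedicated access mode (UC-enabled)
    single_user_name: ${workspace.current_user.userName}  # or the run-as user/SPN
    spark_conf:
      'spark.databricks.cluster.profile': 'singleNode'
      'spark.master': 'local[*]'
    custom_tags:
      'ResourceClass': 'SingleNode'
```

After updating, running `databricks bundle validate` and redeploying the bundle should pick up the new access mode on the next job run.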

Let us know if it helps!