Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Job with a cluster defined in DAB YML has error [UC_NOT_ENABLED] on cluster

RolandoCM2020
Visitor

Hello,

The error is: [UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.

We have a job that uses a cluster defined in yml as:

  small_cluster_id:
    description: "Small cluster, singleNode, for longer jobs"
    type: complex
    default:
      spark_version: "15.4.x-scala2.12"
      node_type_id: "Standard_D4ds_v5"
      num_workers: 0
      spark_conf:
        'spark.databricks.cluster.profile': 'singleNode'
        'spark.master': 'local[*]'
      custom_tags:
        'ResourceClass': 'SingleNode'
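
For comparison, here is a sketch of the same single-node definition with a Unity Catalog access mode set explicitly via the `data_security_mode` field (the values shown, `SINGLE_USER` and the alternative `USER_ISOLATION`, are assumptions about what fits this workload; clusters without an access mode fall back to no-isolation, which Unity Catalog does not support):

```yaml
  small_cluster_id:
    description: "Small cluster, singleNode, for longer jobs"
    type: complex
    default:
      spark_version: "15.4.x-scala2.12"
      node_type_id: "Standard_D4ds_v5"
      num_workers: 0
      # Access mode for Unity Catalog; use USER_ISOLATION for a shared cluster
      data_security_mode: "SINGLE_USER"
      # Run as a specific identity (hypothetical service principal shown)
      single_user_name: "some-service-principal@example.com"
      spark_conf:
        'spark.databricks.cluster.profile': 'singleNode'
        'spark.master': 'local[*]'
      custom_tags:
        'ResourceClass': 'SingleNode'
```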

We defined this job some months ago, and it ran for a while with no issue. I need some help, as the DAB documentation is a bit lacking on this point. Recently, our end user reported the error; apparently they have been hitting it for a while. Between our handing the project off to them and the error being communicated to us, they changed the user that runs the job; I am not sure if that is related.

The job and cluster were written around September of last year. Has anyone run into something similar, or does anyone know what configuration needs to be updated in the yml so that Unity Catalog is accessible? Or could this come from the user lacking access permissions in some way?

I appreciate any help, thanks.

0 REPLIES