
Unity Catalog Not Enabled on Job Cluster When Creating DLT in GCP Databricks

vidya_kothavale
Contributor

I am trying to create a Delta Live Tables (DLT) pipeline in my GCP Databricks workspace, but I am running into an issue where Unity Catalog is not enabled on the job cluster.

Steps I followed:

  1. Created a DLT pipeline using the Databricks UI.
  2. Selected the appropriate Compute settings for the pipeline.
  3. Configured the pipeline to use Unity Catalog.
  4. Attempted to run the pipeline, but got the error:

    Unity Catalog is not enabled on this job cluster
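
For reference, the pipeline source is just a minimal DLT notebook along these lines (a sketch; the table name and source path are placeholders):

    # Minimal DLT source notebook (sketch; names and paths are placeholders).
    import dlt

    @dlt.table(comment="Example table that triggers the pipeline run")
    def demo_events():
        # `spark` is provided by the Databricks notebook runtime.
        return spark.read.format("json").load("/tmp/demo_events")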

Observations:

  • My cluster is running Databricks Runtime dlt:15.4.8
  • The Access Mode is set to Standard (formerly Shared)
  • The worker type is e2-standard-8 (32 GB Memory, 8 Cores)

Troubleshooting done:

  • Verified that Unity Catalog is enabled in the workspace.
  • Checked that the correct metastore is assigned to the workspace.
  • Ensured that the cluster is configured with the right IAM roles and policies.

Additional Finding:

  • If I create a standard Databricks job, Unity Catalog is enabled on the job compute.
  • The issue occurs only when I create a Delta Live Table (DLT) pipeline, where Unity Catalog is not enabled on the job cluster.
1 Reply

mark_ott
Databricks Employee

The error “Unity Catalog is not enabled on this job cluster” during Delta Live Table (DLT) pipeline execution in your GCP Databricks workspace is a common issue, especially after Unity Catalog onboarding. Your troubleshooting steps cover most essentials; however, the most likely cause relates to job cluster access mode and DLT pipeline configuration requirements for Unity Catalog.

Key Issue Identified

Delta Live Tables pipelines that use Unity Catalog require the job cluster to run in a Unity Catalog-enabled access mode, which is not the same as the default Standard (formerly Shared) mode. DLT job clusters must be in this mode to access Unity Catalog tables, and Databricks manages this itself when the pipeline is configured properly. Manually configured Standard or No Isolation Shared access modes do not give a DLT pipeline Unity Catalog access.
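
To confirm what access mode the DLT-created cluster actually received, you can inspect it with the Databricks SDK for Python. This is a minimal sketch, assuming you copy the cluster ID from the pipeline update's details page (the ID below is hypothetical):

    # Sketch: check the access mode of the cluster a DLT update spun up.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    cluster = w.clusters.get(cluster_id="0123-456789-abcdefgh")  # hypothetical ID
    # USER_ISOLATION = Standard (formerly Shared), SINGLE_USER = Dedicated,
    # NONE = no Unity Catalog access at all.
    print(cluster.data_security_mode)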

Steps to Resolve

  1. Delete Existing DLT Pipeline Compute Configuration

    • If you manually configured a cluster as DLT compute, delete that custom compute configuration in the pipeline. DLT pipelines do not use user-managed clusters; they create their own ephemeral job clusters according to the pipeline settings.

  2. Update the Access Mode via Pipeline Settings

    • In the DLT pipeline creation or editing screen, make sure you do not specify a pre-existing cluster or manually set any advanced cluster options that force the compute back to Standard mode.

    • Instead, let Databricks manage the job cluster for the DLT pipeline; when Unity Catalog is enabled for the pipeline in the UI, Databricks manages the access mode automatically (see the SDK sketch after this list).

    • Double-check that “Enable Unity Catalog” is set to true in the pipeline settings.

  3. Workspace Metastore Assignment

    • Confirm that the workspace is assigned to the correct Unity Catalog-enabled metastore, especially for DLT.

  4. Databricks Runtime Version

    • You’re using Databricks Runtime dlt:15.4.8, which is recent enough, but confirm in the Databricks docs that this version explicitly supports DLT with Unity Catalog, since some minor versions required critical fixes.

  5. IAM Roles

    • Ensure the service principal and all identity access for the pipeline have all required Unity Catalog permissions, including data access and DLT permissions.
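
As referenced in step 2, here is a minimal sketch of creating a Unity Catalog-enabled pipeline with the Databricks SDK for Python while leaving compute entirely Databricks-managed; the catalog, schema, and notebook path are placeholders for your own objects:

    # Sketch: UC-enabled DLT pipeline with Databricks-managed compute.
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service.pipelines import NotebookLibrary, PipelineLibrary

    w = WorkspaceClient()
    created = w.pipelines.create(
        name="uc_dlt_demo",  # hypothetical name
        catalog="main",      # publishing to a catalog makes the pipeline UC-backed
        target="dlt_demo",   # schema inside that catalog
        libraries=[PipelineLibrary(notebook=NotebookLibrary(path="/Repos/me/dlt_source"))],
        # No `clusters` argument: compute stays Databricks-managed, so the
        # ephemeral job cluster gets a Unity Catalog-capable access mode.
    )
    print(created.pipeline_id)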

Additional Troubleshooting

  • Try Creating a New DLT Pipeline: Sometimes stale configuration persists. Create a new DLT pipeline, this time making sure Unity Catalog is enabled and leaving the compute settings uncustomized.

  • Check the DLT Pipeline JSON/YAML: If you configure the pipeline via JSON or an asset-bundle YAML, confirm that "catalog": "<uc-catalog>" and "target": "<schema>" are both set; the top-level catalog field is what marks a pipeline as Unity Catalog-backed (see the verification sketch after this list).

  • Disable Custom Cluster Policies: Do not apply cluster policies that might override pipeline defaults and inadvertently disable Unity Catalog access.
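
For the settings check above, rather than hand-reading JSON you can pull a pipeline's stored spec through the SDK; a small sketch (the pipeline ID is hypothetical):

    # Sketch: verify the pipeline's stored catalog/target settings.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    spec = w.pipelines.get(pipeline_id="1234-abcd-5678").spec  # hypothetical ID
    # A non-empty `catalog` is what marks the pipeline as Unity Catalog-backed.
    print(spec.catalog, spec.target)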

Why It Only Affects DLT

Standard Databricks jobs let you set a Unity Catalog-capable access mode directly on the job cluster (see the sketch below). DLT pipelines run on ephemeral job clusters whose configuration is governed by the pipeline settings, not by user-provided clusters.
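
For contrast, this is roughly what a standard job can do that a DLT pipeline cannot: pin the access mode on its own cluster spec. A sketch with placeholder names and paths:

    # Sketch: a standard job pinning its cluster's access mode directly.
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import compute, jobs

    w = WorkspaceClient()
    job = w.jobs.create(
        name="uc_job_demo",  # hypothetical name
        tasks=[jobs.Task(
            task_key="main",
            notebook_task=jobs.NotebookTask(notebook_path="/Repos/me/job_nb"),
            new_cluster=compute.ClusterSpec(
                spark_version="15.4.x-scala2.12",
                node_type_id="e2-standard-8",
                num_workers=1,
                # Standard (formerly Shared) access mode, set explicitly:
                data_security_mode=compute.DataSecurityMode.USER_ISOLATION,
            ),
        )],
    )
    print(job.job_id)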


Summary:
Set your DLT pipeline to use Databricks-managed compute without customizing access mode, ensure “Enable Unity Catalog” is checked, and make sure all permissions and runtime versions align. Do not use clusters in Standard access mode for DLT with Unity Catalog.