The error "Unity Catalog is not enabled on this job cluster" during Delta Live Tables (DLT) pipeline execution in your GCP Databricks workspace is a common issue, especially right after Unity Catalog onboarding. Your troubleshooting steps cover most of the essentials; the most likely cause, however, is the job cluster's access mode and how the DLT pipeline itself is configured for Unity Catalog.
Key Issue Identified
DLT pipelines that publish to Unity Catalog must run on a job cluster whose access mode supports Unity Catalog. You do not set this mode yourself: DLT provisions its own ephemeral compute and applies a Unity Catalog-capable access mode automatically, but only when the pipeline is configured for Unity Catalog. The "No Isolation Shared" mode, or any manually pinned cluster configuration that forces it, does not support Unity Catalog, and that mismatch is exactly what this error reports.
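To confirm this is what's happening, you can inspect the job cluster that DLT spun up for the failed update. Below is a minimal sketch using the Databricks Python SDK (assuming `databricks-sdk` is installed and authentication is configured; the cluster ID is a placeholder). `data_security_mode` is the API-level name for access mode:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # auth from env vars or ~/.databrickscfg

# Cluster ID of the DLT-created job cluster (shown on the failed pipeline update)
cluster = w.clusters.get(cluster_id="<job-cluster-id>")

# NONE corresponds to "No Isolation Shared" and has no Unity Catalog access;
# SINGLE_USER and USER_ISOLATION are the Unity Catalog-capable modes.
print(cluster.data_security_mode)
```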
Steps to Resolve
- Delete Existing DLT Pipeline Compute Configuration
  - If you manually configured a cluster as DLT compute, delete that custom compute configuration from the pipeline. DLT pipelines do not use user-managed clusters; they create their own ephemeral job clusters according to the pipeline settings.
- Update the Access Mode via Pipeline Settings
  - In the DLT pipeline creation or editing screen, make sure you do not attach a pre-existing cluster or set advanced cluster options that force the compute into a non-Unity-Catalog access mode.
  - Instead, let Databricks manage the job cluster for the DLT pipeline. When the pipeline publishes to Unity Catalog, Databricks sets the access mode automatically (see the creation sketch after this list).
  - Double-check that the pipeline's destination is set to Unity Catalog (a catalog plus target schema) rather than the Hive metastore; this is the setting that enables Unity Catalog for the pipeline.
- Workspace Metastore Assignment
  - Confirm that the workspace is assigned to the correct Unity Catalog-enabled metastore; without that assignment, no compute in the workspace can use Unity Catalog (a quick check is sketched after this list).
- Databricks Runtime Version
  - You're using Databricks Runtime dlt:15.4.8, which is new enough, but confirm in the Databricks release notes that this version explicitly supports DLT with Unity Catalog, since some minor versions required critical fixes.
- IAM Roles
  - Ensure the pipeline owner, whether a user or a service principal, has all required Unity Catalog privileges: USE CATALOG on the target catalog, plus USE SCHEMA and CREATE TABLE on the target schema, in addition to DLT permissions (a grants sketch follows this list).
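For the pipeline-settings step, here is a minimal sketch of creating a Unity Catalog-backed pipeline with the Databricks Python SDK, leaving compute entirely to DLT. The pipeline name, catalog, schema, and notebook path are placeholders; setting `catalog` is what makes the pipeline Unity Catalog-enabled:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.pipelines import NotebookLibrary, PipelineLibrary

w = WorkspaceClient()

created = w.pipelines.create(
    name="uc_dlt_pipeline",
    catalog="main",       # Unity Catalog catalog: this marks the pipeline as UC
    target="dlt_schema",  # target schema inside that catalog
    libraries=[
        PipelineLibrary(notebook=NotebookLibrary(path="/Repos/me/project/dlt_notebook")),
    ],
    # Deliberately no `clusters` field: DLT provisions its own job cluster
    # and picks a Unity Catalog-capable access mode for it.
)
print(created.pipeline_id)
```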
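For the metastore-assignment step, a quick check you can run in a notebook attached to any Unity Catalog-capable cluster (`spark` is provided by the notebook runtime); if `current_metastore()` fails, the workspace is not attached to a metastore:

```python
# Returns the metastore the workspace is assigned to, if any.
spark.sql("SELECT current_metastore()").show(truncate=False)

# If the assignment is correct, your Unity Catalog catalogs appear here.
spark.sql("SHOW CATALOGS").show(truncate=False)
```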
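For the IAM step, a sketch of verifying and granting the minimum Unity Catalog privileges the pipeline owner typically needs (catalog, schema, and principal are placeholders; run as a catalog or metastore admin):

```python
# Inspect existing grants on the target catalog.
spark.sql("SHOW GRANTS ON CATALOG main").show(truncate=False)

# Minimum privileges usually required by the DLT pipeline owner:
spark.sql("GRANT USE CATALOG ON CATALOG main TO `pipeline-owner@example.com`")
spark.sql(
    "GRANT USE SCHEMA, CREATE TABLE ON SCHEMA main.dlt_schema "
    "TO `pipeline-owner@example.com`"
)
```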
Additional Troubleshooting
- Try Creating a New DLT Pipeline: Sometimes stale configuration persists. Create a fresh pipeline, select Unity Catalog as the destination this time, and do not customize the compute settings.
- Check the DLT Pipeline JSON/YAML: If you configure via JSON or YAML, confirm that "catalog": "<uc-catalog>" and "target": "<schema>" are both set. The presence of the catalog field is what marks the pipeline as Unity Catalog-enabled; there is no separate "unity_catalog": true flag in the pipeline spec. (An inspection sketch follows this list.)
- Disable Custom Cluster Policies: Do not apply cluster policies that override the pipeline defaults and inadvertently force the compute into an access mode without Unity Catalog support.
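To inspect an existing pipeline's effective settings without the UI, a sketch with the Databricks Python SDK (the pipeline ID is a placeholder):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

pipeline = w.pipelines.get(pipeline_id="<pipeline-id>")
spec = pipeline.spec

# A Unity Catalog pipeline has `catalog` set; a Hive-metastore pipeline
# only has `target`/`storage`. Manual cluster overrides show up in `clusters`.
print("catalog:", spec.catalog)
print("target: ", spec.target)
print("clusters:", spec.clusters)
```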
Why It Only Affects DLT
Regular Databricks jobs let you choose the cluster and its access mode directly, so you can attach Unity Catalog-capable compute yourself. DLT pipelines instead run on ephemeral job clusters whose configuration is derived from the pipeline settings, not from user-provided clusters, so the pipeline itself must be Unity Catalog-enabled for its compute to be.
Summary:
Configure the DLT pipeline to use Databricks-managed compute without any manual access-mode overrides, make sure it publishes to a Unity Catalog catalog and schema, and confirm that metastore assignment, permissions, and runtime versions all line up. Do not pin the pipeline to compute in an access mode that does not support Unity Catalog, such as No Isolation Shared.