Unity Catalog blocks DML (UPDATE, DELETE) on static Delta tables — unable to use spark.sql

varni
New Contributor III

Hello,

We’ve started migrating from Azure Databricks (Hive Metastore) to AWS Databricks with Unity Catalog. Our entire codebase was deliberately designed around spark.sql('...') using DML operations (UPDATE, DELETE, MERGE) for two reasons:

  • In many cases, performing an UPDATE through the DataFrame API took tens of minutes (e.g. updating 100 rows could take up to 30 minutes), while the same operation via spark.sql completed in seconds (see the sketch after this list).

  • SQL-based logic significantly improves readability and flexibility, especially when collaborating with analysts who are not proficient in PySpark.
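
For context, here is a minimal sketch of the two patterns (table and column names are placeholders, not our real schema). The DataFrame version has to read the table, patch the rows, and rewrite the whole thing, which is what made small updates so slow for us:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Pattern 1: SQL DML. Delta rewrites only the files that contain matching rows.
spark.sql("""
    UPDATE main.sales.orders
    SET status = 'cancelled'
    WHERE order_id = 42
""")

# Pattern 2: DataFrame-style "update". Read the whole table, patch the
# matching rows, then overwrite everything. The full rewrite is the slow part.
# (On Delta, snapshot isolation allows overwriting a table you read from.)
df = spark.table("main.sales.orders")
patched = df.withColumn(
    "status",
    F.when(F.col("order_id") == 42, F.lit("cancelled")).otherwise(F.col("status")),
)
patched.write.mode("overwrite").saveAsTable("main.sales.orders")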

Problem:

After migrating to Unity Catalog, all spark.sql('UPDATE ...') calls on static Delta tables now fail with:

[UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] UpdateTable are not supported in Unity Catalog.

What’s worse, the error class (WITHOUT_RECOMMENDATION) explicitly signals that no alternative or workaround is suggested.

Current options we see:

  1. Rewrite all SQL logic to use the DataFrame API, which is expensive and undermines the original design and its performance benefits.

  2. Stay on Hive Metastore and forgo Unity Catalog — losing key features like audit logging, fine-grained ACLs, lineage, and external catalogs.

Questions:

  • Is support for UPDATE, DELETE, MERGE on static Delta tables planned in Unity Catalog?

  • Is there any officially supported way to retain SQL-based DML compatibility under Unity Catalog?

  • Are there any planned mechanisms for migrating SQL-based workloads to Unity Catalog without switching entirely to DataFrame logic?

Thank you!

 

1 ACCEPTED SOLUTION


varni
New Contributor III

[RESOLVED] The issue was caused by the source tables being stored in plain Parquet format. After rewriting them as Delta tables, everything worked as expected, including DML operations like UPDATE via DataFrame logic. Thanks!
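
For anyone hitting the same wall, a quick way to check whether a source table is still plain Parquet before migrating (a sketch; the table name is a placeholder):

# The "Provider" row reports the table format: 'delta' vs. 'parquet'.
spark.sql("DESCRIBE TABLE EXTENDED my_schema.my_table").show(50, truncate=False)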


2 REPLIES


HLEGUA
New Contributor III

Just to clarify: Delta tables still store data in Parquet under the hood, but Delta adds a transaction log (_delta_log) that enables ACID operations like UPDATE, DELETE, and MERGE.

That log layer is what Unity Catalog expects for full SQL DML support — which explains why converting the tables to Delta resolved the issue.
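
If it helps to see this on disk, here is a sketch (the path is a placeholder; dbutils is available in Databricks notebooks):

# A Delta table directory holds ordinary *.parquet data files
# plus the _delta_log/ folder that records every transaction.
display(dbutils.fs.ls("s3://your-bucket/path/to/table/"))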

Thanks for sharing the resolution! It will help others hitting the same blocker.

In case it's useful to someone, here are the commands:
Convert existing Parquet folder to Delta:

CONVERT TO DELTA parquet.`s3://your-bucket/path/to/parquet-data`

Convert existing external table to Delta:

CONVERT TO DELTA your_schema.your_parquet_table

Create a Delta table directly:

CREATE TABLE your_table
USING DELTA
LOCATION 's3://your-bucket/path'
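
Once converted, a quick way to confirm that DML now works under Unity Catalog (a sketch; table and column names are placeholders):

# UPDATE now succeeds instead of raising UC_COMMAND_NOT_SUPPORTED.
spark.sql("""
    UPDATE your_schema.your_parquet_table
    SET some_column = 'new_value'
    WHERE id = 1
""")

# DESCRIBE HISTORY shows the CONVERT and UPDATE entries in the Delta log.
spark.sql("DESCRIBE HISTORY your_schema.your_parquet_table") \
    .select("version", "operation").show()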

Greetings!
