
DBR 16.4 LTS - Spark 3.5.2 is not compatible with Delta Lake 3.3.1

leireroman
New Contributor III

I'm migrating to Databricks Runtime 16.4 LTS, which uses Spark 3.5.2 and Delta Lake 3.3.1 according to the documentation: Databricks Runtime 16.4 LTS - Azure Databricks | Microsoft Learn

I've updated my conda environment to use those versions, but when I try to rebuild the environment I get this error message:

[Screenshot "Captura de pantalla 2025-06-09 084355.png": the dependency-conflict error]
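
For reference, the same conflict can be reproduced with pip inside the conda environment (a sketch; the exact resolver message varies by pip and conda version):

    # Requesting the documented pair fails to resolve:
    pip install "pyspark==3.5.2" "delta-spark==3.3.1"
    # -> the resolver rejects the combination (delta-spark 3.3.1 wants a newer pyspark)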

According to Delta Lake releases (Releases · delta-io/delta), the last version compatible with Spark 3.5.2 is 3.2.0, because the next one (3.2.1) is built on Spark 3.5.3.

Is Databricks really using Delta Lake version 3.3.1? How can I check this from a cluster with DBR 16.4 LTS?

1 ACCEPTED SOLUTION

Renu_
Valued Contributor II

Hi @leireroman, Databricks Runtime 16.4 LTS includes Delta Lake 3.3.1, paired with Spark 3.5.2. This combination works within Databricks because it's a custom build. In your conda environment, the conflict occurs because delta-spark 3.3.1 requires pyspark >=3.5.3, but you've set it to 3.5.2.
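
You can check this yourself from a notebook on the cluster: spark.version reports the Spark version, and the bundled Delta jars can be listed from a shell cell (a quick sketch; /databricks/jars is the standard jar location on Databricks nodes, and the Delta version is embedded in the jar file names):

    %sh
    # List the Delta Lake jars bundled with the runtime on a DBR 16.4 LTS cluster;
    # the file names carry the Delta version (expect 3.3.1 here).
    ls /databricks/jars | grep -i delta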

To resolve this, you can either:

  • Upgrade pyspark to 3.5.3 to work with delta-spark 3.3.1, or
  • Downgrade to delta-spark 3.2.0 to stay compatible with Spark 3.5.2.
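
For example, with pip (a sketch; the same pins work as conda dependencies):

    # Option 1: move up to the pyspark that delta-spark 3.3.1 requires
    pip install "pyspark==3.5.3" "delta-spark==3.3.1"

    # Option 2: stay on pyspark 3.5.2 and pair it with delta-spark 3.2.0
    pip install "pyspark==3.5.2" "delta-spark==3.2.0"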


2 REPLIES


SamAdams
Contributor

@leireroman I encountered the same issue and used an override (like a pip constraints.txt file or a PDM resolution override) to make sure my local development environment matched the runtime.
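
A minimal sketch of the pip route (file names hypothetical). Note that a pip constraints file can pin versions but cannot relax delta-spark's own pyspark>=3.5.3 requirement, so with pip the runtime-matching pair is delta-spark 3.2.0; a PDM resolution override is what lets you force the exact pairing that DBR ships:

    # constraints.txt (hypothetical): pin local dev to a resolvable set
    # that matches the runtime as closely as pip allows:
    #   pyspark==3.5.2
    #   delta-spark==3.2.0

    # Install your project's requirements with the constraints applied:
    pip install -r requirements.txt -c constraints.txt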