<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: DBR 16.4 LTS - Spark 3.5.2 is not compatible with Delta Lake 3.3.1 in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/dbr-16-4-lts-spark-3-5-2-is-not-compatible-with-delta-lake-3-3-1/m-p/121970#M46611</link>
    <pubDate>Tue, 17 Jun 2025 11:50:14 GMT</pubDate>
    <dc:creator>Renu_</dc:creator>
    <dc:date>2025-06-17T11:50:14Z</dc:date>
    <item>
      <title>DBR 16.4 LTS - Spark 3.5.2 is not compatible with Delta Lake 3.3.1</title>
      <link>https://community.databricks.com/t5/data-engineering/dbr-16-4-lts-spark-3-5-2-is-not-compatible-with-delta-lake-3-3-1/m-p/121221#M46380</link>
      <description>&lt;P&gt;I'm migrating to Databricks Runtime 16.4 LTS, which uses Spark 3.5.2 and Delta Lake 3.3.1 according to the documentation:&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/16.4lts" target="_blank" rel="noopener"&gt;Databricks Runtime 16.4 LTS - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I've pinned my conda environment to those versions, but I get this error when I try to update the environment:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Captura de pantalla 2025-06-09 084355.png" style="width: 425px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/17405iCC2FA34206DD7FED/image-size/large?v=v2&amp;amp;px=999" role="button" title="Captura de pantalla 2025-06-09 084355.png" alt="Captura de pantalla 2025-06-09 084355.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;According to the Delta Lake releases page (&lt;A href="https://github.com/delta-io/delta/releases?page=1" target="_blank" rel="noopener"&gt;Releases · delta-io/delta&lt;/A&gt;), the last version compatible with Spark 3.5.2 is 3.2.0, because the next one (3.2.1) is built on Spark 3.5.3.&lt;/P&gt;&lt;P&gt;Is Databricks really using Delta Lake 3.3.1? How can I check this from a cluster running DBR 16.4 LTS?&lt;/P&gt;</description>
      <pubDate>Mon, 09 Jun 2025 06:45:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/dbr-16-4-lts-spark-3-5-2-is-not-compatible-with-delta-lake-3-3-1/m-p/121221#M46380</guid>
      <dc:creator>leireroman</dc:creator>
      <dc:date>2025-06-09T06:45:53Z</dc:date>
    </item>
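The question above asks how to verify the bundled Delta Lake version from a DBR 16.4 LTS cluster. One hedged way to start, from a notebook attached to the cluster, is to ask pip what it sees; note this is only a sketch, and on Databricks the Delta build ships as runtime jars, so the pip-visible view can differ from (or be absent next to) the actual runtime. Inspecting the jar directory on the driver is another commonly suggested route, but that path is an assumption here, not something stated in the thread.

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(pkg: str) -> str:
    """Return the pip-visible version of a package, or a marker if absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed via pip"


# In a notebook cell on the cluster, compare what pip sees with the
# versions the DBR 16.4 LTS release notes claim (Spark 3.5.2, Delta 3.3.1):
for pkg in ("pyspark", "delta-spark"):
    print(pkg, installed_version(pkg))
```

If `delta-spark` reports as not pip-installed, that by itself suggests Delta is provided by the runtime jars rather than a wheel, which is consistent with the "custom build" explanation in the reply below.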
    <item>
      <title>Re: DBR 16.4 LTS - Spark 3.5.2 is not compatible with Delta Lake 3.3.1</title>
      <link>https://community.databricks.com/t5/data-engineering/dbr-16-4-lts-spark-3-5-2-is-not-compatible-with-delta-lake-3-3-1/m-p/121970#M46611</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/103578"&gt;@leireroman&lt;/a&gt;, Databricks Runtime 16.4 LTS includes Delta Lake 3.3.1 paired with Spark 3.5.2. This combination works inside Databricks because the runtime ships custom builds of both. In your Conda environment, the conflict occurs because delta-spark 3.3.1 requires pyspark &amp;gt;=3.5.3, but you’ve pinned pyspark to 3.5.2.&lt;/P&gt;&lt;P&gt;To resolve this, you can either:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Upgrade pyspark to 3.5.3 to work with delta-spark 3.3.1.&lt;/LI&gt;&lt;LI&gt;Downgrade to delta-spark 3.2.0 to stay compatible with Spark 3.5.2.&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Tue, 17 Jun 2025 11:50:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/dbr-16-4-lts-spark-3-5-2-is-not-compatible-with-delta-lake-3-3-1/m-p/121970#M46611</guid>
      <dc:creator>Renu_</dc:creator>
      <dc:date>2025-06-17T11:50:14Z</dc:date>
    </item>
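The two options in the reply above map directly onto Conda environment pins. A minimal sketch of each (the environment name and Python version are illustrative, not from the thread; the package versions are the ones the reply gives):

```yaml
# environment.yml - option 1: move pyspark forward to satisfy delta-spark 3.3.1
name: dbr164-dev
dependencies:
  - python=3.12
  - pip
  - pip:
      - pyspark==3.5.3
      - delta-spark==3.3.1

# Option 2: hold pyspark at 3.5.2 and pin delta-spark back instead:
#      - pyspark==3.5.2
#      - delta-spark==3.2.0
```

Option 2 keeps the Spark version aligned with the DBR 16.4 LTS runtime at the cost of an older local delta-spark; option 1 matches the Delta version instead. Neither reproduces the runtime exactly, since Databricks ships custom builds.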
    <item>
      <title>Re: DBR 16.4 LTS - Spark 3.5.2 is not compatible with Delta Lake 3.3.1</title>
      <link>https://community.databricks.com/t5/data-engineering/dbr-16-4-lts-spark-3-5-2-is-not-compatible-with-delta-lake-3-3-1/m-p/133589#M49886</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/103578"&gt;@leireroman&lt;/a&gt;&amp;nbsp;I encountered the same issue and used an override (such as a pip constraints.txt file or the &lt;A href="https://pdm-project.org/latest/usage/dependency/#dependency-overrides" target="_self"&gt;PDM resolution override specification&lt;/A&gt;) to make sure my local development environment matched the runtime.&lt;/P&gt;</description>
      <pubDate>Thu, 02 Oct 2025 22:21:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/dbr-16-4-lts-spark-3-5-2-is-not-compatible-with-delta-lake-3-3-1/m-p/133589#M49886</guid>
      <dc:creator>SamAdams</dc:creator>
      <dc:date>2025-10-02T22:21:00Z</dc:date>
    </item>
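For the override approach mentioned above, a PDM pyproject.toml sketch might look like the following (the override table comes from PDM's dependency-overrides feature; the project name and exact pins are assumptions based on the thread). One caveat worth noting: a pip constraints file narrows the candidate versions the resolver may pick but does not rewrite delta-spark's declared pyspark&gt;=3.5.3 requirement, so for this particular conflict a resolver-level override like PDM's is the tool that actually forces the match.

```toml
# pyproject.toml fragment: force the resolver to accept pyspark 3.5.2
# even though delta-spark 3.3.1 declares pyspark>=3.5.3.
[project]
name = "dbr164-dev"
version = "0.1.0"
dependencies = [
    "pyspark==3.5.2",
    "delta-spark==3.3.1",
]

[tool.pdm.resolution.overrides]
pyspark = "3.5.2"
```

This pins the local environment to the runtime's advertised Spark version while keeping Delta at 3.3.1, at the cost of running a combination the delta-spark package itself does not claim to support.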
  </channel>
</rss>

