<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Issue with spark version in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/issue-with-spark-version/m-p/136990#M4321</link>
    <description>&lt;P&gt;Hello, I ran into an issue configuring infrastructure as code (IaC) with Terraform.&lt;/P&gt;&lt;P&gt;Our organization uses IaC as the default method for deploying resources.&lt;BR /&gt;When I try to specify my Spark version using the Databricks provider (v1.96, the latest version) like this:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;data "databricks_spark_version" "latest_lts" {
  provider      = databricks
  spark_version = var.spark_version
}&lt;/LI-CODE&gt;&lt;P&gt;where `var.spark_version` is set to "4" (I also tested "4.0.0" and "4.0"), no valid version is returned.&lt;BR /&gt;When I query the latest version instead, it returns:&lt;/P&gt;&lt;P&gt;id = 16.4.x-scala2.12&lt;/P&gt;&lt;P&gt;as the latest available Spark version.&lt;BR /&gt;What should I do? (Our data engineers want to use Spark version 4.0.0.)&lt;/P&gt;</description>
    <pubDate>Fri, 31 Oct 2025 13:58:16 GMT</pubDate>
    <dc:creator>lubiarzm1</dc:creator>
    <dc:date>2025-10-31T13:58:16Z</dc:date>
    <item>
      <title>Issue with spark version</title>
      <link>https://community.databricks.com/t5/administration-architecture/issue-with-spark-version/m-p/136990#M4321</link>
      <description>&lt;P&gt;Hello, I ran into an issue configuring infrastructure as code (IaC) with Terraform.&lt;/P&gt;&lt;P&gt;Our organization uses IaC as the default method for deploying resources.&lt;BR /&gt;When I try to specify my Spark version using the Databricks provider (v1.96, the latest version) like this:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;data "databricks_spark_version" "latest_lts" {
  provider      = databricks
  spark_version = var.spark_version
}&lt;/LI-CODE&gt;&lt;P&gt;where `var.spark_version` is set to "4" (I also tested "4.0.0" and "4.0"), no valid version is returned.&lt;BR /&gt;When I query the latest version instead, it returns:&lt;/P&gt;&lt;P&gt;id = 16.4.x-scala2.12&lt;/P&gt;&lt;P&gt;as the latest available Spark version.&lt;BR /&gt;What should I do? (Our data engineers want to use Spark version 4.0.0.)&lt;/P&gt;</description>
      <pubDate>Fri, 31 Oct 2025 13:58:16 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/issue-with-spark-version/m-p/136990#M4321</guid>
      <dc:creator>lubiarzm1</dc:creator>
      <dc:date>2025-10-31T13:58:16Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with spark version</title>
      <link>https://community.databricks.com/t5/administration-architecture/issue-with-spark-version/m-p/137304#M4331</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/196005"&gt;@lubiarzm1&lt;/a&gt;&lt;/P&gt;&lt;P class=""&gt;To list all available Spark versions for your Databricks workspace, you can call the following API endpoint:&lt;BR /&gt;&lt;BR /&gt;GET /api/2.1/clusters/spark-versions&amp;nbsp;(&lt;A href="https://docs.databricks.com/api/workspace/clusters/sparkversions" target="_self"&gt;API Docs&lt;/A&gt;)&lt;BR /&gt;&lt;BR /&gt;This request returns a JSON response containing all available Spark runtime versions.&lt;/P&gt;&lt;P class=""&gt;For example:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;{
  "versions": [
    {
      "key": "12.2.x-scala2.12",
      "name": "12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12)"
    },
    {
      "key": "17.3.x-photon-scala2.13",
      "name": "17.3 LTS Photon (includes Apache Spark 4.0.0, Scala 2.13)"
    }
  ]
}&lt;/LI-CODE&gt;&lt;P class=""&gt;You can then choose the Spark version using the value of the &lt;SPAN class=""&gt;"key"&lt;/SPAN&gt; field, for instance:&lt;/P&gt;&lt;P class=""&gt;&lt;STRONG&gt;spark_version = "17.3.x-photon-scala2.13"&lt;/STRONG&gt;&lt;/P&gt;&lt;P class=""&gt;Another quick trick is to open any existing cluster in the Databricks UI, switch to &lt;SPAN class=""&gt;&lt;STRONG&gt;Edit &amp;gt; JSON&lt;/STRONG&gt;&lt;/SPAN&gt;, and inspect how the &lt;SPAN class=""&gt;"spark_version"&lt;/SPAN&gt; field is written.&lt;/P&gt;&lt;P class=""&gt;&lt;STRONG&gt;The naming convention (e.g., whether it includes &lt;SPAN class=""&gt;photon&lt;/SPAN&gt; or not) depends on other parameters such as the &lt;SPAN class=""&gt;runtime engine&lt;/SPAN&gt;.&lt;/STRONG&gt;&lt;/P&gt;&lt;P class=""&gt;You can check these details in the Databricks API docs:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P class=""&gt;&lt;A href="https://docs.databricks.com/api/workspace/clusters/create#runtime_engine" target="_blank" rel="noopener"&gt;Runtime Engine&lt;/A&gt;&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P class=""&gt;&lt;A href="https://docs.databricks.com/api/workspace/clusters/create#spark_version" target="_blank" rel="noopener"&gt;Spark Version&lt;/A&gt;&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P class=""&gt;So yes, sometimes the Databricks API and Terraform provider get slightly out of sync.&lt;/P&gt;&lt;P class=""&gt;Setting the &lt;SPAN class=""&gt;spark_version&lt;/SPAN&gt; manually is a reliable way to verify which runtimes are actually supported in your environment. Try &lt;STRONG&gt;17.3.x-photon-scala2.13&lt;/STRONG&gt;, or &lt;STRONG&gt;17.3.x-scala2.13&lt;/STRONG&gt; together with the runtime engine setting.&lt;/P&gt;&lt;P class=""&gt;Hope this helps &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;BR /&gt;&lt;BR /&gt;Isi&lt;/P&gt;</description>
      <pubDate>Sun, 02 Nov 2025 17:05:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/issue-with-spark-version/m-p/137304#M4331</guid>
      <dc:creator>Isi</dc:creator>
      <dc:date>2025-11-02T17:05:01Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with spark version</title>
      <link>https://community.databricks.com/t5/administration-architecture/issue-with-spark-version/m-p/137357#M4336</link>
      <description>&lt;P&gt;Hi, thanks a lot, setting the version string directly worked.&lt;BR /&gt;In the future I will use the API call to check available versions instead of the Terraform data source.&lt;/P&gt;</description>
      <pubDate>Mon, 03 Nov 2025 08:13:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/issue-with-spark-version/m-p/137357#M4336</guid>
      <dc:creator>lubiarzm1</dc:creator>
      <dc:date>2025-11-03T08:13:18Z</dc:date>
    </item>
  </channel>
</rss>