
Issue with Spark version

lubiarzm1
New Contributor

Hello, I've run into an issue configuring infrastructure as code (IaC) with Terraform.

Our organization uses IaC as the default method for deploying resources.
When I try to specify my Spark version using the Databricks provider (v1.96, the latest version) like this:

data "databricks_spark_version" "latest_lts" {
provider = databricks
spark_version = var.spark_version
}


Here `var.spark_version` has the value "4" (I also tested "4.0.0" and "4.0"), but no valid version is returned.
When I ask for the latest version instead, it shows me:

id = 16.4.x-scala2.12

as the latest available Spark version.
What should I do? (Our data engineers want to use Spark version 4.0.0.)

 

1 REPLY

Isi
Honored Contributor III

Hello @lubiarzm1,

To list all available Spark versions for your Databricks workspace, you can call the following API endpoint:

GET /api/2.1/clusters/spark-versions (see the Clusters API docs)

This request will return a JSON response containing all available Spark runtime versions.

For example:

{
  "versions": [
    {
      "key": "12.2.x-scala2.12",
      "name": "12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12)"
    },
    {
      "key": "17.3.x-photon-scala2.13",
      "name": "17.3 LTS Photon (includes Apache Spark 4.0.0, Scala 2.13)"
    }
  ]
}
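Since you are already working in Terraform, you can also pull this list from within your configuration. Below is a minimal sketch using the hashicorp/http provider; the workspace URL and the var.databricks_token variable are placeholders you would supply yourself:

data "http" "spark_versions" {
  # Placeholder workspace URL; replace with your own.
  url = "https://adb-1234567890123456.7.azuredatabricks.net/api/2.1/clusters/spark-versions"

  request_headers = {
    # Hypothetical variable holding a personal access token.
    Authorization = "Bearer ${var.databricks_token}"
  }
}

output "available_spark_versions" {
  # Decode the JSON response and expose the list of version objects.
  value = jsondecode(data.http.spark_versions.response_body).versions
}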

You can then choose the Spark version using the value of the "key" field — for instance:

spark_version = "17.3.x-photon-scala2.13"

Another quick trick is to open any existing cluster in the Databricks UI, switch to Edit > JSON, and inspect how the "spark_version" field is written.

The naming convention (e.g., whether the key includes photon or not) depends on other cluster settings, such as the runtime engine; you can check these details in the Databricks API docs.

So yes, sometimes the Databricks API and Terraform provider get slightly out of sync.

Setting the spark_version manually is a reliable way to verify which runtimes are actually supported in your environment. Try 17.3.x-photon-scala2.13, or 17.3.x-scala2.13 combined with the runtime engine setting, as sketched below.
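For the second option, the Photon choice is expressed through the runtime_engine attribute rather than the version key itself. Again, only a sketch with placeholder naming and sizing:

resource "databricks_cluster" "spark4_photon" {
  cluster_name   = "engineering-spark4"  # hypothetical name
  spark_version  = "17.3.x-scala2.13"    # non-Photon key
  runtime_engine = "PHOTON"              # selects the Photon engine explicitly
  node_type_id   = "Standard_DS3_v2"     # adjust to your cloud
  num_workers    = 2
}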


Hope this helps, 🙂

Isi
