Hello @lubiarzm1
To list all available Spark versions for your Databricks workspace, you can call the following API endpoint:
GET /api/2.1/clusters/spark-versions
This request will return a JSON response containing all available Spark runtime versions.
For example:
{
  "versions": [
    {
      "key": "12.2.x-scala2.12",
      "name": "12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12)"
    },
    {
      "key": "17.3.x-photon-scala2.13",
      "name": "17.3 LTS Photon (includes Apache Spark 4.0.0, Scala 2.13)"
    }
  ]
}
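If you prefer to script the check, here is a minimal sketch of the same call in Python. It assumes your workspace URL and a personal access token are available in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables; adjust the authentication to whatever you normally use.

import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com (assumption)
token = os.environ["DATABRICKS_TOKEN"]  # personal access token (assumption: PAT auth)

resp = requests.get(
    f"{host}/api/2.1/clusters/spark-versions",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

# Print every runtime key next to its human-readable name
for v in resp.json().get("versions", []):
    print(v["key"], "-", v["name"])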
You can then choose the Spark version using the value of the "key" field — for instance:
spark_version = "17.3.x-photon-scala2.13"
Another quick trick is to open any existing cluster in the Databricks UI, switch to Edit > JSON, and inspect how the field "spark_version" is written.
The naming convention (e.g., whether the key includes photon) depends on other settings, such as the runtime engine. You can check these details in the Databricks API docs.
So yes, sometimes the Databricks API and Terraform provider get slightly out of sync.
Setting the spark_version manually is a reliable way to verify which runtimes are truly supported in your environment. Try 17.3.x-photon-scala2.13, or 17.3.x-scala2.13 with the runtime engine set to Photon, as in the sketch below.
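As a rough sketch only (the cluster name, node type, and worker count are placeholders, and it reuses the host/token variables from the snippet above), this is roughly how the two variants would look in a clusters/create request; exactly which combination your workspace accepts depends on how it names the keys in the spark-versions output.

payload = {
    "cluster_name": "runtime-check",       # placeholder name
    "spark_version": "17.3.x-scala2.13",   # non-Photon key ...
    "runtime_engine": "PHOTON",            # ... with Photon enabled via runtime_engine
    # or instead: "spark_version": "17.3.x-photon-scala2.13" without runtime_engine
    "node_type_id": "i3.xlarge",           # placeholder node type, pick one valid for your cloud
    "num_workers": 1,
}

resp = requests.post(
    f"{host}/api/2.1/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
print(resp.status_code, resp.json())

If the workspace rejects the version key, the error message usually says so directly, which makes this a quick way to cross-check against the spark-versions list.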
Hope this helps, 🙂
Isi