Required versus current compute setup
a month ago - last edited a month ago
To run the demo and lab notebooks, I am required to use the following Databricks runtime: 15.4.x-cpu-ml-scala2.12. However, the compute in my setup runs 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12). Will that be an issue?
I am also seeing a warning for the worker type: "Warning: compute in this mode needs at least 1 worker to run Spark commands or import tables."
How do I increase that?
Labels: Compute
a month ago
Hello @AGnewbie,
Firstly, regarding the Databricks runtime: your compute is currently running Databricks Runtime 11.3 LTS, which will indeed be an issue, because the course materials require the 15.4 LTS ML runtime (15.4.x-cpu-ml-scala2.12) and your current runtime does not provide it. You need to edit your cluster and select 15.4.x-cpu-ml-scala2.12 as the runtime version.
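As a quick sanity check, you can confirm from a notebook which runtime the attached cluster is running. A minimal sketch, assuming the `DATABRICKS_RUNTIME_VERSION` environment variable that Databricks sets on cluster nodes (the `runtime_matches` helper is just for illustration):

```python
import os

def runtime_matches(required_prefix, current):
    """Return True if the cluster's runtime version starts with the required major.minor."""
    return current is not None and current.startswith(required_prefix)

# On a Databricks cluster this env var holds the runtime version, e.g. "15.4".
# Outside Databricks it is unset, so the check simply reports a mismatch.
current = os.environ.get("DATABRICKS_RUNTIME_VERSION")
if runtime_matches("15.4", current):
    print("Runtime OK:", current)
else:
    print("Runtime mismatch: found", current, "but the notebooks require 15.4.x-cpu-ml-scala2.12")
```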
Secondly, regarding the worker-type warning: to run Spark commands or import tables, the cluster needs at least one worker node. The warning indicates that your cluster is currently configured without enough workers, so you need to increase the worker count.
To increase the number of worker nodes:
- Go to your Databricks workspace.
- Click **Clusters** in the sidebar.
- Select the cluster that you are using.
- Click **Edit**.
- In the **Workers** field, specify the number of workers you want.
- Click **Confirm**.
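If you prefer to script this instead of using the UI, both changes (runtime version and worker count) can be made in a single call to the Clusters API (`POST /api/2.1/clusters/edit`). A sketch of the request body, with placeholder values for your cluster ID and node type:

```json
{
  "cluster_id": "<your-cluster-id>",
  "spark_version": "15.4.x-cpu-ml-scala2.12",
  "node_type_id": "<your-node-type>",
  "num_workers": 1
}
```

Note that `clusters/edit` replaces the cluster's configuration with what you send, so include any other attributes (autoscaling, Spark conf, etc.) that you want to keep. The cluster will restart for the new runtime to take effect.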

