Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Required versus current compute setup

AGnewbie
New Contributor

To run the demo and lab notebooks, I am required to have the following Databricks runtime: 15.4.x-cpu-ml-scala2.12. However, the compute in my setup is on runtime version 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12). Will that be an issue?

I am also seeing a warning for the worker type: "Warning: compute in this mode needs at least 1 worker to run Spark commands or import tables."

How do I increase the number of workers?

1 REPLY

Alberto_Umana
Databricks Employee

Hello @AGnewbie,

Firstly, regarding the Databricks runtime: your compute is currently running version 11.3 LTS, which will indeed be an issue because the course materials require the 15.4 LTS ML runtime (15.4.x-cpu-ml-scala2.12). You will need to edit your cluster and switch it to that runtime.
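As a quick sanity check before running the labs, you can print the runtime the notebook is attached to. A minimal sketch in Python, assuming the DATABRICKS_RUNTIME_VERSION environment variable is populated on the driver (Databricks normally sets it); note that it reports the version number only and may not distinguish the ML variant:

```python
import os

# Databricks sets this environment variable on cluster nodes;
# on an 11.3 LTS cluster it typically prints something like "11.3".
runtime = os.environ.get("DATABRICKS_RUNTIME_VERSION", "unknown")
print(f"Attached runtime: {runtime}")

# The labs expect a 15.4 ML runtime, so flag anything else early.
if not runtime.startswith("15.4"):
    print("Warning: this cluster is not on the required 15.4 ML runtime.")
```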

Secondly, regarding the warning for the worker type: to run Spark commands or import tables, you need at least one worker node. If your compute setup currently does not meet this requirement, you need to increase the number of worker nodes.

To increase the number of workers in the cluster UI (a scripted alternative is sketched after these steps):

  1. Go to your Databricks workspace.
  2. Click Compute (formerly Clusters) in the sidebar.
  3. Select the cluster that you are using.
  4. Click Edit.
  5. In the Workers field, specify the number of workers you want.
  6. Click Confirm.
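If you prefer to script the change rather than click through the UI, the same edit can be made with the Clusters REST API (clusters/edit). A minimal sketch in Python using the requests library; the workspace URL, token, cluster ID, and node type below are placeholders you would replace with your own values:

```python
import requests

# Placeholders -- substitute your workspace URL, personal access token, and cluster ID.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
CLUSTER_ID = "<cluster-id>"

# clusters/edit replaces the whole cluster spec, so the required fields
# (spark_version, node_type_id, num_workers) must all be supplied.
payload = {
    "cluster_id": CLUSTER_ID,
    "spark_version": "15.4.x-cpu-ml-scala2.12",  # required ML runtime
    "node_type_id": "<node-type>",               # keep your existing node type
    "num_workers": 1,                            # at least one worker for Spark commands
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/edit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print("Edit submitted; a running cluster restarts with the new configuration.")
```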