You can use the configuration "spark.task.cpus", which specifies the number of CPU cores to allocate to each task. The default value is 1. If you set it to, say, 2, each task reserves two cores, so fewer tasks can run concurrently on each executor (executor cores divided by spark.task.cpus).
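
As a minimal sketch of how this looks in practice (the app name and the executor size of 8 cores are just assumptions for illustration), you can set it on the SparkSession builder:

    import org.apache.spark.sql.SparkSession

    object TaskCpusExample {
      def main(args: Array[String]): Unit = {
        // Assumed executor size: 8 cores per executor.
        // With spark.task.cpus = 2, each task reserves 2 cores,
        // so at most 8 / 2 = 4 tasks run concurrently per executor
        // (instead of 8 with the default of 1).
        val spark = SparkSession.builder()
          .appName("task-cpus-example")          // hypothetical app name
          .config("spark.executor.cores", "8")   // assumed executor size
          .config("spark.task.cpus", "2")        // cores reserved per task
          .getOrCreate()

        // Any job run through this session is scheduled under the
        // 2-cores-per-task constraint.
        val count = spark.range(0, 1000000).count()
        println(s"count = $count")

        spark.stop()
      }
    }

The same setting can also be passed at submit time, e.g. --conf spark.task.cpus=2 with spark-submit. Raising it is typically useful when a single task spawns its own internal threads and you want the scheduler to account for that.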
Thanks,
Saikrishna Pujari
Sr. Spark Technical Solutions Engineer, Databricks