Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Multiple Jobs with different resource requirements on the same cluster

dbrick
New Contributor II

I have a big cluster with auto-scaling enabled (min: 1, max: 25 workers). I want to run multiple jobs on that cluster with different values for Spark properties such as `--executor-cores` and `--executor-memory`, but I don't see any option to specify these when creating the jobs.

I tried the code snippet below in my PySpark application.

from pyspark.sql import SparkSession

# Attempt to limit this job to two executors at session creation time.
spark = SparkSession \
    .builder \
    .config("spark.executor.instances", "2") \
    .appName("SparkWarehouseETL") \
    .getOrCreate()

But when I ran this application, it used all 25 workers instead of only 2. Since one job already occupies all 25 workers, submitting another job to the same cluster is just a waste.
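
For reference, this is roughly the kind of per-job cap I was expecting to be able to express. It's only a sketch: spark.dynamicAllocation.maxExecutors, spark.executor.cores and spark.executor.memory are standard Spark properties, but I'm not sure whether they take effect when the session attaches to an already-running shared cluster rather than to a fresh job cluster.

from pyspark.sql import SparkSession

# Sketch: cap how far this job can scale out and size its executors,
# assuming these settings are honored at session/context creation time.
spark = SparkSession \
    .builder \
    .appName("SparkWarehouseETL") \
    .config("spark.dynamicAllocation.maxExecutors", "2") \
    .config("spark.executor.cores", "2") \
    .config("spark.executor.memory", "4g") \
    .getOrCreate()

# Check which values the running context actually picked up.
print(spark.sparkContext.getConf().get("spark.executor.instances", "not set"))
print(spark.sparkContext.getConf().get("spark.executor.memory", "not set"))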

What am I missing? Please help!

2 REPLIES

Kaniz_Fatma
Community Manager

Hi @Neelesh databricks, what's your DBR version?

Vidula
Honored Contributor

Hi @Neelesh databricks,

Hope everything is going great.

Just wanted to check in to see whether you were able to resolve your issue. If so, would you mind marking an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help you.

Cheers!
