Multiple Jobs with different resource requirements on the same cluster

dbrick
New Contributor II

I have a large cluster with auto-scaling enabled (min: 1, max: 25). I want to run multiple jobs on that cluster with different values for Spark properties (`--executor-cores` and `--executor-memory`), but I don't see any option to specify these when creating the jobs.

I tried the code snippet below in my PySpark application.

spark = SparkSession \
    .builder \
    .config("spark.executor.instances", "2") \
    .appName("SparkWarehouseETL") \
    .getOrCreate()

But when running this application, it used all 25 workers instead of only 2. Since one job already occupies all 25 workers, submitting another job alongside it gains nothing.
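If I understand correctly, `getOrCreate()` returns the session that already exists on the cluster, so cluster-level settings passed to the builder (like executor count) are silently ignored rather than applied. A minimal, purely illustrative Python sketch of that get-or-create semantics (this is a hypothetical stand-in class, not Databricks or Spark code):

```python
# Hypothetical sketch: get-or-create semantics mean a config passed to
# the builder is dropped whenever a session is already active, which
# matches the behavior described above.
class Session:
    _active = None  # module-level "already running" session, like a shared cluster

    def __init__(self, config):
        self.config = dict(config)

    @classmethod
    def get_or_create(cls, config):
        # If a session already exists, return it unchanged;
        # the newly supplied config is silently ignored.
        if cls._active is None:
            cls._active = cls(config)
        return cls._active

# The cluster starts a session first (analogous to the shared cluster)...
cluster_session = Session.get_or_create({"spark.executor.instances": "25"})

# ...so a job asking for 2 executors gets the existing session back.
job_session = Session.get_or_create({"spark.executor.instances": "2"})

print(job_session.config["spark.executor.instances"])  # still "25"
```

If that reading is right, per-job executor limits would presumably need to be set where the cluster itself is configured (for example, Spark's `spark.dynamicAllocation.maxExecutors` property, or the cluster's own autoscaling min/max), rather than inside the application, though I'd welcome confirmation.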

What am I missing? Please help!

2 REPLIES

Kaniz
Community Manager

Hi @Neelesh databricks, what's your DBR version?

Vidula
Honored Contributor

Hi @Neelesh databricks,

Hope everything is going great.

Just wanted to check in to see whether you were able to resolve your issue. If so, would you mark an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help you.

Cheers!
