Multiple Jobs with different resource requirements on the same cluster
07-11-2022 05:57 AM
I have a big cluster with autoscaling enabled (min: 1, max: 25). I want to run multiple jobs on that cluster with different values for the Spark properties `--executor-cores` and `--executor-memory`, but I don't see any option to specify these when creating the jobs.
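For clarity, these are the per-job settings I mean, expressed as Spark conf keys rather than spark-submit flags (the app name and values here are just illustrative):

```python
from pyspark.sql import SparkSession

# Illustrative only: the two resource settings I want to vary per job
spark = (
    SparkSession.builder
    .appName("JobA")                         # hypothetical app name
    .config("spark.executor.cores", "2")     # conf equivalent of --executor-cores
    .config("spark.executor.memory", "4g")   # conf equivalent of --executor-memory
    .getOrCreate()
)
```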
I tried the code snippet below in my PySpark application.
```python
from pyspark.sql import SparkSession

# Ask for only 2 executors for this application
spark = SparkSession \
    .builder \
    .config("spark.executor.instances", "2") \
    .appName("SparkWarehouseETL") \
    .getOrCreate()
```
But when I ran the application, it used all 25 workers instead of only 2. Since one job already consumes all 25 workers, submitting another job alongside it is a waste.
What am I missing? Please help!
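In case it helps pin down what I'm missing, here is a rough, untested sketch of how I would expect to cap a single application under open-source Spark's dynamic allocation; whether a Databricks autoscaling cluster honours these per-application settings is an assumption on my part:

```python
from pyspark.sql import SparkSession

# Untested sketch: under dynamic allocation, spark.executor.instances is only
# the initial count; spark.dynamicAllocation.maxExecutors is what bounds growth.
# Whether Databricks cluster autoscaling respects these per-application settings
# is an assumption here, not something I have confirmed.
spark = (
    SparkSession.builder
    .appName("SparkWarehouseETL")
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "1")
    .config("spark.dynamicAllocation.maxExecutors", "2")
    .getOrCreate()
)
```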
- Labels: JOBS, Multiple Jobs, Pyspark
09-03-2022 01:30 AM
Hi @Neelesh databricks,
Hope everything is going great.
Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please let us know so we can help you.
Cheers!

