With AWS/Azure autoscaling, how do we fine-tune Spark jobs?
12-19-2022 06:38 AM
With the recommended autoscaling settings (see https://docs.databricks.com/clusters/cluster-config-best-practices.html), is it possible to fine-tune a Spark job dynamically, given that the number of executors could change at any time?
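For context, one way to cope with a changing executor count is to let Adaptive Query Execution re-plan shuffles at runtime rather than hard-coding executor-dependent settings. The sketch below is a minimal illustration, not an official Databricks recommendation; the `* 2` partition multiplier is an assumed heuristic.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# AQE re-optimizes shuffle partitioning at runtime, so the plan adapts
# as autoscaling adds or removes executors.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")

# defaultParallelism reflects the cores visible right now; seeding the
# shuffle partition count from it is an assumed heuristic, not a rule.
current_cores = spark.sparkContext.defaultParallelism
spark.conf.set("spark.sql.shuffle.partitions", str(current_cores * 2))
```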
Labels:
- AWS
- Fine Tune Spark Jobs
1 REPLY
12-20-2022 06:04 AM
@Andrew Fogarty
Instead of setting these tuning parameters dynamically per job, I would suggest adding them to the Spark cluster configuration itself; that way you can also save cost.
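For example, a minimal sketch of what that might look like in the cluster's Spark config box (Advanced options > Spark in the Databricks cluster UI), assuming adaptive query execution covers the settings you would otherwise tune per job:

```
spark.sql.adaptive.enabled true
spark.sql.adaptive.coalescePartitions.enabled true
```

Settings entered there apply to every job that runs on the cluster, so you configure them once instead of repeating them in each notebook or job.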
AviralBhardwaj

