Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

With AWS/Azure autoscaling, how do we fine-tune Spark jobs?

auser85
New Contributor III

With the recommended autoscaling setting (e.g., https://docs.databricks.com/clusters/cluster-config-best-practices.html), is it possible to fine-tune a Spark job dynamically, given that the number of executors could change at any time?
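One pattern that can work here is to read the cluster's current parallelism at job start and derive tuning values from it, rather than hard-coding them. A minimal sketch, assuming the job runs on a Databricks cluster where a SparkSession is available (the 2x multiplier is just a common rule of thumb, not an official recommendation):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# defaultParallelism reflects the total cores across the executors the
# cluster currently has, so it tracks autoscaling up or down.
current_cores = sc.defaultParallelism

# Derive shuffle parallelism from the live core count instead of a fixed value.
spark.conf.set("spark.sql.shuffle.partitions", str(current_cores * 2))
```

Note this only captures the executor count at the moment it runs; if the cluster scales mid-job, the setting will not follow it.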

1 REPLY

Aviral-Bhardwaj
Esteemed Contributor III

@Andrew Fogarty

Instead of tuning this dynamically per job, I would suggest setting the configuration on the Spark cluster itself; that way you can also save cost.
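A minimal sketch of what that cluster-level configuration might look like, assuming it is entered in the cluster's Spark config (or the spark_conf field of the Clusters API); enabling adaptive query execution lets Spark choose shuffle partition counts at runtime, which copes with a changing executor count without per-job tuning:

```python
# Hypothetical spark_conf payload for a cluster definition.
cluster_spark_conf = {
    # Adaptive query execution re-plans shuffles based on runtime statistics.
    "spark.sql.adaptive.enabled": "true",
    # Coalesce small shuffle partitions automatically after each stage.
    "spark.sql.adaptive.coalescePartitions.enabled": "true",
}
```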

AviralBhardwaj