Data Engineering
With AWS/Azure Autoscaling, how do we fine tune spark jobs?

auser85
New Contributor III

With the recommended autoscaling settings (e.g., https://docs.databricks.com/clusters/cluster-config-best-practices.html), is it possible to fine-tune a Spark job dynamically, given that the number of executors could change at any time?
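Not from the thread itself, but one common pattern under autoscaling is to derive partition-related settings at runtime from the cluster's current size instead of hard-coding them. A minimal sketch, where the helper name and the 2x-cores multiplier are illustrative assumptions rather than Databricks guidance:

```python
def shuffle_partitions(num_executors: int, cores_per_executor: int, factor: int = 2) -> int:
    """Pick a shuffle partition count proportional to the cores currently
    available, so the setting stays reasonable as the cluster scales."""
    return max(1, num_executors * cores_per_executor * factor)

# On a live cluster you would read the current executor count from Spark
# and apply the setting before a shuffle-heavy stage, for example:
#   spark.conf.set("spark.sql.shuffle.partitions",
#                  shuffle_partitions(current_executors, cores_per_executor))
# where current_executors is obtained from your cluster's runtime state.
```

Note that Spark's Adaptive Query Execution (`spark.sql.adaptive.enabled`) already coalesces shuffle partitions at runtime, which addresses much of this concern without manual tuning.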

1 REPLY

Aviral-Bhardwaj
Esteemed Contributor III

@Andrew Fogarty​ 

Rather than tuning dynamically, I would suggest setting those configurations on the Spark cluster itself; that way you can also save cost.
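For reference, pinning tuned values on the cluster itself would look roughly like this in a cluster's Spark config (the values shown are placeholders, not recommendations):

```
spark.sql.shuffle.partitions 200
spark.sql.adaptive.enabled true
```

With `spark.sql.adaptive.enabled` set, Spark's Adaptive Query Execution coalesces shuffle partitions at runtime, which reduces the need to retune as executors come and go.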
