Hi @Ednexllc ,
It really depends on many factors, such as the workload type, the size of your data, the number of tables, etc. You can check some recommendations given by Databricks here:
Compute configuration recommendations | Databricks on AWS
And also here - you should find useful info in the section called Databricks Cluster Configuration and Tuning:
Comprehensive Guide to Optimize Data Workloads | Databricks
But for me, it's always a process that requires some trial and error at the beginning. I try different settings and, in the end, choose the ones that handled the given workload best.
So, to put it simply - there's no silver bullet. You can use the guidelines as a starting point, but in the end you need to test and align the compute to your workload/environment yourself.
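Just to illustrate the kind of knobs I iterate on: a minimal cluster spec sketch using field names from the Databricks Clusters API. The specific values (node type, autoscale range, Spark conf) are placeholder assumptions - they're exactly the things you'd vary between test runs, not a recommendation:

```json
{
  "cluster_name": "workload-tuning-test",
  "spark_version": "15.4.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "autoscale": {
    "min_workers": 2,
    "max_workers": 8
  },
  "spark_conf": {
    "spark.sql.adaptive.enabled": "true"
  }
}
```

I'd run the same workload against a few variants of this (different `node_type_id`, different autoscale bounds), compare runtime and cost, and keep whichever wins.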