by Mr__D • New Contributor II
- 7083 Views
- 1 reply
- 0 kudos
Hello All, could anyone please suggest the impact of autoscaling on cluster cost? Suppose I have a cluster where the minimum workers is 2 and the maximum is 10, but most of the time only 3 workers are active. Will the cluster be billed for only 3 workers or for 10 worker(...
Latest Reply
@Deepak Bhatt : Autoscaling in Databricks can have a significant impact on cluster cost, as it allows the cluster to dynamically add or remove workers based on the workload. In the scenario you described, if the active worker count is consistently at ...
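To make the billing point concrete, here is a minimal sketch (not an official pricing tool) that estimates compute cost from the workers actually running over time. The DBU rate and price per DBU are placeholder assumptions; real bills depend on instance type, cloud provider, and Databricks plan.

```python
# Hedged sketch: estimating autoscaling cost from *active* workers over time.
# The rates below are placeholder assumptions, not real Databricks pricing.
DBU_PER_WORKER_HOUR = 0.75   # hypothetical DBU consumption per worker-hour
PRICE_PER_DBU = 0.15         # hypothetical $/DBU

def estimate_cost(worker_counts_per_hour):
    """worker_counts_per_hour: list of active worker counts sampled hourly."""
    worker_hours = sum(worker_counts_per_hour)
    return worker_hours * DBU_PER_WORKER_HOUR * PRICE_PER_DBU

# A day where the cluster mostly sits at 3 workers despite max_workers=10:
# the estimate tracks the ~3 active workers, not the configured maximum.
mostly_three = [3] * 22 + [8, 10]
print(f"Estimated daily cost: ${estimate_cost(mostly_three):.2f}")
```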
- 2475 Views
- 2 replies
- 8 kudos
I have a custom application/executable that I upload to DBFS and transfer to my cluster's local storage for execution. I want to call multiple instances of this application in parallel, which I've only been able to successfully do with Python's subpr...
Latest Reply
Autoscaling works for Spark jobs only. It works by monitoring the Spark job queue, which plain Python code won't go into. If it's just Python code, try a single-node cluster. https://docs.databricks.com/clusters/configure.html#cluster-size-and-autoscaling
1 More Reply
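If you do want the cluster to scale with this kind of work, one option is to wrap each invocation of the executable in a Spark task so it goes through the Spark scheduler that autoscaling monitors. The sketch below is an illustration under assumptions, not the poster's setup: the `/tmp/my_app` path and the argument values are hypothetical, and the binary must already be present on every worker (for example, copied there by an init script).

```python
# Hedged sketch: distributing invocations of a local executable as Spark tasks,
# so the work is visible to the Spark scheduler (which autoscaling monitors).
# "/tmp/my_app" and the argument values are hypothetical placeholders.
import subprocess
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def run_instance(arg: int):
    # Runs one instance of the executable on whichever worker the task lands on.
    proc = subprocess.run(["/tmp/my_app", str(arg)],
                          capture_output=True, text=True)
    return (arg, proc.returncode, proc.stdout)

args = list(range(8))
# One partition per argument so the tasks can spread across executors.
results = (spark.sparkContext
           .parallelize(args, numSlices=len(args))
           .map(run_instance)
           .collect())
```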
- 5750 Views
- 7 replies
- 2 kudos
I currently have multiple jobs (each running its own job cluster) for my Spark Structured Streaming pipelines, which run 24x7x365 on DBR 9.x/10.x LTS. My SLAs are 24x7x365 with 1-minute latency. I have already accomplished the following co...
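For context on the latency target described above, a minimal Structured Streaming sketch with a one-minute processing-time trigger might look like the following. The source, sink, and checkpoint paths are hypothetical and not taken from the poster's pipelines.

```python
# Hedged sketch: a long-running Structured Streaming query triggered every
# minute, roughly matching a 1-minute latency SLA. All paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (spark.readStream
          .format("delta")
          .load("/mnt/source/events"))          # hypothetical source table path

query = (stream.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/events")  # hypothetical
         .trigger(processingTime="1 minute")    # fire a micro-batch each minute
         .start("/mnt/sink/events"))            # hypothetical sink path
```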
- 2528 Views
- 1 reply
- 0 kudos
What determines when the cluster autoscaling activates to add and remove workers? Also, can it be adjusted?
Latest Reply
> What determines when the cluster autoscaling activates to add and remove workers?
During scale-down, the service removes a worker only if it is idle and does not contain any shuffle data. This allows aggressive resizing without killing tasks or recom...
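As a rough illustration of the knobs you do control, here is a sketch of the autoscaling-related portion of a cluster spec, shaped like a Databricks Clusters API create request. The cluster name, runtime, and node type are placeholder assumptions; the idle- and shuffle-aware scale-down logic described in the reply is managed by the service rather than exposed as a setting here.

```python
# Hedged sketch: a cluster spec with autoscaling bounds, shaped like a
# Databricks Clusters API create request. Values are placeholder assumptions.
cluster_spec = {
    "cluster_name": "autoscaling-demo",       # hypothetical name
    "spark_version": "10.4.x-scala2.12",      # example LTS runtime
    "node_type_id": "i3.xlarge",              # example node type
    "autoscale": {
        "min_workers": 2,    # lower bound the autoscaler can shrink to
        "max_workers": 10,   # upper bound it can grow to
    },
    "autotermination_minutes": 60,            # idle shutdown, separate from autoscaling
}
```

In practice, the main adjustment available in the spec itself is the min/max worker bounds; when and how workers are added or removed within those bounds follows the built-in behavior quoted above.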