12-16-2024 03:10 AM
Databricks autoscaling dynamically adjusts the number of worker nodes in a cluster based on workload demand, optimizing resource utilization and cost. Understanding the conditions under which Databricks triggers a scaling event requires insight into how it monitors and interprets the Spark workload, in particular the backlog of pending tasks and how busy the existing executors are.
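As a concrete starting point, here is a minimal sketch of enabling autoscaling when creating a cluster with the databricks-sdk Python package. The cluster name, runtime version, and node type below are illustrative placeholders, not recommendations; adjust them to your workspace.

```python
# Minimal sketch, assuming the databricks-sdk package (pip install databricks-sdk).
# Authentication is read from the environment or ~/.databrickscfg.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import AutoScale

w = WorkspaceClient()

# With an autoscale range, Databricks chooses the worker count between
# min_workers and max_workers based on the observed Spark workload.
cluster = w.clusters.create(
    cluster_name="autoscaling-demo",           # placeholder name
    spark_version="14.3.x-scala2.12",          # placeholder runtime version
    node_type_id="i3.xlarge",                  # placeholder instance type
    autoscale=AutoScale(min_workers=2, max_workers=8),
).result()  # blocks until the cluster is running
```

With this configuration the cluster starts near the minimum, scales up when tasks queue faster than the current workers can process them, and scales back down as the workers become idle.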
Regards,
Abhishek Upadhyay