Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Is there any way to control the autoOptimize interval?

brickster_2018
Databricks Employee

I can see my streaming jobs running optimize jobs quite frequently. Is there a property I can set to control the autoOptimize interval?

Accepted Solution

brickster_2018
Databricks Employee
Auto optimize is not triggered on a time basis; it is event-based. Once a Delta table or partition accumulates 50 small files (the default value of spark.databricks.delta.autoCompact.minNumFiles), auto-compaction is triggered.

To reduce the frequency, increase the value of that configuration. Optionally, you can turn off auto-compaction completely and instead run a regular OPTIMIZE command on a daily basis.
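For reference, a minimal sketch of both options as they might look in a Databricks notebook, where spark is the notebook's built-in SparkSession and the table name events is just a placeholder:

# Option 1: raise the file-count threshold so auto-compaction fires less often
# (the default is 50 files per table/partition).
spark.conf.set("spark.databricks.delta.autoCompact.minNumFiles", "200")

# Option 2: disable auto-compaction for this session and compact on a schedule
# with a regular OPTIMIZE run instead (e.g. from a daily job).
spark.conf.set("spark.databricks.delta.autoCompact.enabled", "false")
spark.sql("OPTIMIZE events")

If you prefer not to change session-level settings, auto-compaction can also be controlled per table through the delta.autoOptimize.autoCompact table property.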
