Triggering clean-ups in Spark to handle accumulated metadata

User16826994223
Honored Contributor III
 
1 ACCEPTED SOLUTION

User16826994223
Honored Contributor III

This can be handled by setting the parameter `spark.cleaner.ttl`, or by dividing the long-running job into separate batches and writing the intermediary results to disk.
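A minimal sketch of the first approach, assuming an older Spark release (1.x) where the legacy `spark.cleaner.ttl` setting is still recognized; the value is a duration in seconds after which Spark forgets accumulated metadata. The job script name is hypothetical:

```shell
# Enable periodic metadata clean-up: forget metadata older than one hour.
# spark.cleaner.ttl is a legacy setting from older Spark releases; newer
# versions removed it and rely on the automatic ContextCleaner instead.
spark-submit \
  --conf spark.cleaner.ttl=3600 \
  your_long_running_job.py
```

For the second approach, the idea is to break the job into stages and persist each stage's output to durable storage (for example, writing a DataFrame out as Parquet and reading it back for the next stage), so the lineage and metadata accumulated by earlier stages can be discarded.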

