Hi team,
I am running a weekly purge process from Databricks notebooks that cleans up chunks of records from tables used for audit purposes. The tables are external Delta tables. I need clarification on the items below:
1. Do I need to run the OPTIMIZE and VACUUM commands at all? Only very minimal read queries are executed against the audit tables.
2. If I do need to run them, should I add the OPTIMIZE and VACUUM commands to the same notebook as the purge to shrink the storage layer? (A sketch of what I mean is at the end of this post.)
3. What scenarios should I look for when deciding to run OPTIMIZE and VACUUM on tables involved in the purge process?
4. No action: will Databricks and the Apache Spark framework take care of the optimization internally?
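
For context on question 2, here is a minimal sketch of what I have in mind. The table name `audit_logs`, the `created_at` column, and the 90-day retention window are placeholders for my actual setup:

```python
# Minimal sketch of the weekly purge notebook, assuming a Delta external
# table named `audit_logs` (placeholder) with a `created_at` timestamp column.
# Runs inside a Databricks notebook, where `spark` is already defined.

# 1. Purge audit records older than the retention window (90 days here).
spark.sql("""
    DELETE FROM audit_logs
    WHERE created_at < date_sub(current_date(), 90)
""")

# 2. Compact the small files left behind by the DELETE.
spark.sql("OPTIMIZE audit_logs")

# 3. Physically remove data files no longer referenced by the table.
#    (Default Delta retention is 7 days, so files removed by the purge
#    are only deleted from storage once that window has passed.)
spark.sql("VACUUM audit_logs")
```

Is chaining the three steps in one notebook like this reasonable, or is it overkill given how rarely these tables are read?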