I have a Delta table whose size increases gradually; it currently holds around 1.5 crore (15 million) rows. While running the VACUUM command on this table, I get the error below:
ERROR: Job aborted due to stage failure: Task 7 in stage 491.0 failed 4 times, most recent failure: Lost task 7.4 in stage 481.0 (TID 4116) (10.154.64.26 executor 7): ExecutorLostFailure (executor 7 exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 177186 ms
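Since the failure is a heartbeat timeout rather than an OOM, one workaround I was considering is raising Spark's heartbeat and network timeouts at cluster startup (these are standard Spark properties, but the values below are just illustrative guesses, not tuned numbers):

```
# spark-defaults.conf (or cluster Spark config) -- illustrative values only
# heartbeat interval must stay well below spark.network.timeout
spark.executor.heartbeatInterval  60s
spark.network.timeout             600s
```

I am not sure whether tuning these is the right fix or whether the vacuum itself should be made cheaper (e.g. by running it more frequently), so advice on that is welcome too.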
I also wanted to know the best order in which to run these maintenance commands (here `table` is a placeholder for the actual table name):

VACUUM table RETAIN 168 HOURS;
OPTIMIZE table;
FSCK REPAIR TABLE table;
REFRESH TABLE table;