by Nis • New Contributor II
- 1733 Views
- 1 replies
- 2 kudos
I have a Delta table whose size increases gradually; we now have around 1.5 crore (15 million) rows. While running the VACUUM command on that table, I am getting the below error: ERROR: Job aborted due to stage failure: Task 7 in stage 491.0 failed 4 times, most...
Latest Reply
Do you have access to the Executor 7 logs? Is there high GC pressure or some other event that is making the heartbeat time out? Would you be able to check the failed stages?
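A common first step before retrying on a larger cluster is to see how much work VACUUM is being asked to do. A minimal sketch, assuming a table named my_delta_table (run each statement in its own %sql cell if your runtime expects one statement per cell):
%sql
-- List the files that would be deleted, without deleting anything
VACUUM my_delta_table DRY RUN;
-- If the dry run succeeds, run the actual vacuum with the default 7-day retention
VACUUM my_delta_table RETAIN 168 HOURS;
If the stage still fails with heartbeat timeouts, Spark settings such as spark.executor.heartbeatInterval and spark.network.timeout are cluster-level configurations and would need to be adjusted in the cluster's Spark config rather than from the notebook.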
- 2436 Views
- 2 replies
- 1 kudos
Hello there, I currently have the problem of deleted files still being in the transaction log when trying to query a Delta table. What I found was this statement:
%sql
FSCK REPAIR TABLE table_name [DRY RUN]
But using it returned the following error: Error in ...
Latest Reply
Remove the square brackets and try executing the command:
%sql
FSCK REPAIR TABLE table_name DRY RUN
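The square brackets in the documentation only indicate that DRY RUN is optional; they are not part of the SQL syntax. A minimal sketch, assuming a table named my_delta_table:
%sql
-- Preview the file entries that would be removed from the Delta log
FSCK REPAIR TABLE my_delta_table DRY RUN
%sql
-- Remove the entries for data files that no longer exist in the underlying storage
FSCK REPAIR TABLE my_delta_table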
- 1584 Views
- 1 replies
- 0 kudos
Is it good practice to run the FSCK REPAIR command on a regular basis? I have OPTIMIZE and VACUUM commands scheduled to run every day.
Latest Reply
Unlike OPTIMIZE and VACUUM, FSCK REPAIR is not an operational command that has to be executed on a regular basis. FSCK REPAIR is useful for repairing the Delta metadata by removing references to files that are no longer accessible...
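For contrast, a sketch of how the two kinds of commands are typically used, assuming a table named my_delta_table: OPTIMIZE and VACUUM as routine scheduled maintenance, and FSCK REPAIR only on demand after data files have been removed from storage outside of Delta (run each statement in its own %sql cell if needed):
%sql
-- Routine maintenance, safe to schedule daily
OPTIMIZE my_delta_table;
VACUUM my_delta_table;
-- On-demand repair, only after underlying files were deleted outside of Delta
FSCK REPAIR TABLE my_delta_table DRY RUN;
FSCK REPAIR TABLE my_delta_table;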