11-22-2021 09:26 AM
After dropping a Delta table with the DROP command in Databricks, is there a way to also remove the S3 files without using the rm command? I'm looking for a solution where junior developers can safely drop a table without touching rm, where the recursive option could cause accidental data loss.
thanks
Alina.
11-22-2021 10:41 AM
The official way is to run, before the DROP:
DELETE FROM events
VACUUM events RETAIN 0 HOURS
I agree that there could be some DEEP DROP 🙂
Alternatively, not in SQL but in Python, you could write a custom class/function and preinstall it on clusters, so people would call something like CleanTable(TableName) to validate the data and then run delete + vacuum + drop + rm.
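A minimal sketch of such a helper, assuming a Databricks notebook where spark and dbutils are available. The names (clean_table, clean_table_statements) and the validation step are illustrative, not an existing API; note also that VACUUM ... RETAIN 0 HOURS requires disabling the Delta retention duration check (spark.databricks.delta.retentionDurationCheck.enabled = false).

```python
def clean_table_statements(table_name: str) -> list:
    """Build the SQL statements that empty, vacuum, and drop a table."""
    # Basic validation so juniors can't pass arbitrary strings into SQL.
    if not table_name.replace("_", "").replace(".", "").isalnum():
        raise ValueError(f"Suspicious table name: {table_name!r}")
    return [
        f"DELETE FROM {table_name}",            # logically remove rows
        f"VACUUM {table_name} RETAIN 0 HOURS",  # purge the old data files
        f"DROP TABLE {table_name}",             # remove the table from the metastore
    ]

def clean_table(spark, dbutils, table_name: str) -> None:
    """Run the statements, then remove leftover files (replaces a raw rm)."""
    # Resolve the table's storage location BEFORE dropping it.
    location = (spark.sql(f"DESCRIBE DETAIL {table_name}")
                     .collect()[0]["location"])
    for stmt in clean_table_statements(table_name):
        spark.sql(stmt)
    # Recursive delete, but scoped to the one directory we just resolved,
    # so there is no free-form path for a junior developer to mistype.
    dbutils.fs.rm(location, recurse=True)
```

The key safety property is that the only recursive rm happens on a path resolved from the table itself, never on a user-supplied string.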
11-22-2021 11:51 AM
Hi @Alina Bella ,
Like @Hubert Dudek mentioned, we have a best practice guide for dropping managed tables. You can find the docs here
11-29-2021 11:14 AM
Hi @Alina Bella ,
If @Hubert Dudek's answer solved the issue, would you be happy to mark it as best? That will help others find the solution more easily in the future.