11-22-2021 09:26 AM
After dropping a Delta table with the DROP command in Databricks, is there a way to also delete the underlying S3 files without using the rm command? I'm looking for a solution that lets junior developers safely drop a table without touching rm, where the recursive option could cause accidental data loss.
thanks
Alina.
11-22-2021 10:41 AM
The official way is to run the following before the DROP:
DELETE FROM events
VACUUM events RETAIN 0 HOURS
I agree that a built-in "deep drop" would be nice 🙂
Alternatively, not in SQL but in Python, you could write a custom class/function and preinstall it on clusters, so people would call something like CleanTable(TableName) to validate the data and then run delete + vacuum + drop + rm.
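A minimal sketch of such a helper, assuming the `CleanTable` idea above (the function names and the name-validation rule are illustrative, not an official API). It builds the delete + vacuum + drop sequence as plain SQL strings first, so the statement list can be reviewed or unit-tested before anything destructive runs. Note that `VACUUM ... RETAIN 0 HOURS` normally requires disabling Delta's retention-duration safety check (`spark.databricks.delta.retentionDurationCheck.enabled`).

```python
import re

def clean_table_statements(table_name: str, retain_hours: int = 0):
    """Build the ordered SQL statements that fully remove a Delta table.

    The table name is validated first so that a typo or an injected
    string cannot turn into a destructive command against the wrong
    object -- this is the safety layer the rm command lacks.
    """
    pattern = r"[A-Za-z_][A-Za-z0-9_]*(\.[A-Za-z_][A-Za-z0-9_]*)*"
    if not re.fullmatch(pattern, table_name):
        raise ValueError(f"Invalid table name: {table_name!r}")
    return [
        f"DELETE FROM {table_name}",
        f"VACUUM {table_name} RETAIN {retain_hours} HOURS",
        f"DROP TABLE IF EXISTS {table_name}",
    ]

def clean_table(spark, table_name: str, retain_hours: int = 0):
    """Execute the delete + vacuum + drop sequence on a Spark session."""
    for stmt in clean_table_statements(table_name, retain_hours):
        spark.sql(stmt)
```

Junior developers would then only ever call `clean_table(spark, "events")`, never rm, and a bad table name fails fast with a `ValueError` instead of deleting the wrong path.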
11-22-2021 11:51 AM
Hi @Alina Bella,
Like @Hubert Dudek mentioned, we have a best-practice guide for dropping managed tables. You can find the docs here
11-29-2021 11:14 AM
Hi @Alina Bella,
If @Hubert Dudek's answer solved the issue, would you be happy to mark it as best? That will help others find the solution more easily in the future.