11-22-2021 09:26 AM
After dropping a Delta table with the DROP command in Databricks, is there a way to delete the underlying S3 files without using the rm command? I'm looking for a solution where junior developers can safely drop a table without touching the rm command, since its recursive option can cause accidental data loss.
thanks
Alina.
- Labels: Delta table, rm command, S3 Storage Files
Accepted Solutions
11-22-2021 10:41 AM
The official way is, before DROP:
- Run DELETE FROM:
DELETE FROM events
- Run VACUUM with an interval of zero:
VACUUM events RETAIN 0 HOURS
I agree that there could be some DEEP DROP 🙂
Alternatively, outside of SQL, you could write a custom Python class/function to do this and preinstall it on clusters, so people would call something like CleanTable(TableName) to validate the data and then run delete + vacuum + drop + rm. A rough sketch of that idea follows below.
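As a minimal sketch only (not an official Databricks API): clean_table is a hypothetical helper name, and it assumes it runs in a Databricks notebook or job where spark and dbutils are available and the table is a Delta table.

# Hypothetical helper that bundles delete + vacuum + drop + file cleanup,
# so junior developers never run dbutils.fs.rm / %fs rm directly.
def clean_table(table_name: str) -> None:
    # Capture the storage location before the table is dropped.
    location = spark.sql(f"DESCRIBE DETAIL {table_name}").collect()[0]["location"]

    # 1. Delete all rows so the latest snapshot references no data files.
    spark.sql(f"DELETE FROM {table_name}")

    # 2. Vacuum with zero retention to physically remove the data files.
    #    RETAIN 0 HOURS requires temporarily disabling the retention safety check.
    spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
    try:
        spark.sql(f"VACUUM {table_name} RETAIN 0 HOURS")
    finally:
        spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "true")

    # 3. Drop the table from the metastore.
    spark.sql(f"DROP TABLE IF EXISTS {table_name}")

    # 4. Remove anything left at the location (e.g. the _delta_log directory).
    #    For external tables DROP TABLE does not touch the files, so this step
    #    is what actually cleans up S3.
    dbutils.fs.rm(location, recurse=True)

# Example usage:
# clean_table("events")

Called this way, the whole delete + vacuum + drop + rm sequence lives in one reviewed function, and any data validation you want can be added at the top before anything destructive runs.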
11-22-2021 11:51 AM
Hi @Alina Bella ,
As @Hubert Dudek mentioned, we have a best-practices guide for dropping managed tables. You can find the docs here
11-29-2021 11:14 AM
Hi @Alina Bella ,
If @Hubert Dudek ''s answer solved the issue, would you be happy to mark their answer as best? That will help others find the solution more easily in the future.

