Hi everyone,
We work with sensitive data in Databricks, so it's crucial from both security and regulatory perspectives to purge all data saved in notebook revisions.
Currently, there are two manual methods:
1. Delete the revision history from each notebook individually.
2. Permanently purge revision history for all notebooks via Settings -> Advanced.
Is there any way to automate this process?
I noticed that the API endpoints used for these actions are not documented.
I've tested calling these endpoints; although I receive an HTTP 200 response, the history does not actually get purged.
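For context, here is roughly what my attempt looks like. This is only a sketch: the purge endpoint path and payload are guesses based on what the UI appears to call, since they aren't documented; only the workspace listing call (`/api/2.0/workspace/list`) is part of the public Workspace API.

```python
import os
import requests

# Assumptions: host and token come from environment variables; the purge endpoint
# path and JSON payload below are NOT documented and are only my best guess.
HOST = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

def list_notebooks(path="/"):
    """Walk the workspace tree via the documented Workspace API and yield notebook paths."""
    resp = requests.get(f"{HOST}/api/2.0/workspace/list",
                        headers=HEADERS, params={"path": path})
    resp.raise_for_status()
    for obj in resp.json().get("objects", []):
        if obj["object_type"] == "DIRECTORY":
            yield from list_notebooks(obj["path"])
        elif obj["object_type"] == "NOTEBOOK":
            yield obj["path"]

def purge_revisions(notebook_path):
    """Call the (undocumented, assumed) per-notebook revision purge endpoint."""
    resp = requests.post(f"{HOST}/api/2.0/workspace/purge-revision-history",  # assumed path
                         headers=HEADERS, json={"path": notebook_path})       # assumed payload
    print(notebook_path, resp.status_code)  # returns 200, yet the history stays visible in the UI

for nb in list_notebooks("/"):
    purge_revisions(nb)
```

The workspace listing part works as expected; it's only the purge call that returns 200 without any visible effect on the revision history.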
Has anyone managed to automate notebook revision purging successfully?
Any guidance would be greatly appreciated!