In Databricks, is it possible to get the total amount of Delta Lake storage (in the underlying Parquet format) being used per user?
As a follow-up, what are the best practices for making sure that users saving Delta files are not taking up storage unnecessarily? For example, is vacuuming files the quickest method for clearing things up?
Thanks,
Zach