Measure size of all tables in Azure Databricks

Devsql
New Contributor III

Hi Team,

Currently I am trying to find the size of all tables in my Azure Databricks workspace, to get an idea of current data-loading trends so I can forecast future volume (e.g. if roughly 100 GB came in over the last 2 months, I would expect about 150 GB over the next 2-3 months).
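(For context on the parenthetical: the forecast there is just a linear extrapolation of the observed monthly ingest rate. A minimal sketch, using only the example figures above:)

```python
def forecast_gb(observed_gb: float, observed_months: float, future_months: float) -> float:
    """Linearly extrapolate the observed ingest volume to a future window."""
    monthly_rate = observed_gb / observed_months  # e.g. 100 GB / 2 months = 50 GB/month
    return monthly_rate * future_months

# 100 GB over the last 2 months -> about 150 GB over the next 3 months
print(forecast_gb(100, 2, 3))  # 150.0
```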

My production Azure databricks environment is using Unity Catalog, which hosts:

a- All Bronze Tables

b- All Silver Tables

c- All Gold Tables.

d- Some extra Delta Live Tables, acting as temp tables that hold intermediate calculation results.

e- Some tables created from Excel sheet data.

The above tables are Delta Live Tables, created via DLT-based pipelines/jobs.

So I am looking for a script/code/solution that gives me the total size in GB of all tables in a given database.

A SQL-based solution would be ideal, but an answer based on Python/Scala would also be fine.
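(To illustrate the kind of answer I'm after: one possible approach, sketched under the assumption that `DESCRIBE DETAIL` on each Delta table reports a `sizeInBytes` column, is to loop over the tables in a schema and sum those sizes. The catalog/schema names below are placeholders; the Spark calls are shown as comments because they only run inside a Databricks notebook, while the byte-to-GB aggregation itself is plain Python:)

```python
# In a Databricks notebook, table sizes could be collected roughly like this
# (hypothetical catalog/schema "main.bronze"):
#
#   tables = [r.tableName for r in spark.sql("SHOW TABLES IN main.bronze").collect()]
#   sizes = {t: spark.sql(f"DESCRIBE DETAIL main.bronze.{t}").collect()[0]["sizeInBytes"]
#            for t in tables}
#
# The aggregation step is ordinary Python:

def total_size_gb(sizes_in_bytes: dict) -> float:
    """Sum per-table sizes (in bytes) and convert to GB (1 GB = 1024**3 bytes)."""
    return sum(sizes_in_bytes.values()) / 1024**3

# Example with made-up sizes: two tables of 1 GiB and 2 GiB
print(total_size_gb({"t1": 1024**3, "t2": 2 * 1024**3}))  # 3.0
```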

Also, in the traditional relational-DBMS world there are several built-in reports that show data-loading trends via charts or graphs. Do we have such a built-in feature in Azure Databricks?

Thanks in Advance

Devsql