Rather, everything will be in bytes. Spark SQL has built-in methods to get the table size, also in bytes:
spark.sql("ANALYZE TABLE df COMPUTE STATISTICS NOSCAN")
spark.sql("DESCRIBE EXTENDED df").filter(col("col_name") === "Statistics").show(false)
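The second command prints the `Statistics` row as a string, typically something like `1048576 bytes` (possibly followed by a row count after a full `ANALYZE`). A minimal sketch of turning that string into a human-readable size, shown here in plain Python since the parsing itself is Spark-independent; the exact string format is an assumption based on typical `DESCRIBE EXTENDED` output:

```python
import re


def parse_bytes(stats: str) -> int:
    """Extract the leading byte count from a string like '1048576 bytes, 10 rows'.

    Assumes the format printed by DESCRIBE EXTENDED; returns 0 if no match.
    """
    m = re.search(r"(\d+)\s*bytes", stats)
    return int(m.group(1)) if m else 0


def human_readable(n: int) -> str:
    """Convert a byte count into a readable size using 1024-based units."""
    size = float(n)
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if size < 1024 or unit == "TB":
            return f"{size:.2f} {unit}"
        size /= 1024


print(human_readable(parse_bytes("1048576 bytes, 10 rows")))  # 1.00 MB
```

The same few lines translate directly to Scala if you prefer to keep everything in one notebook language.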
My blog: https://databrickster.medium.com/