Hello,
I administer a self-service-oriented Databricks workspace, and I've noticed that more and more users are storing their data in DBFS out of a lack of knowledge.
They either don't specify a location when creating their schema, or they don't specify a schema at all.
To prevent this, I thought about adding a parameter to the Spark config: "spark.sql.warehouse.dir adl://forbidden".
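For context, here is a minimal sketch of the setup (adl://forbidden is a deliberately invalid path, and the schema name and storage path below are placeholders):

Cluster Spark config:
spark.sql.warehouse.dir adl://forbidden

-- Fails: no LOCATION, so the schema path resolves under adl://forbidden
CREATE SCHEMA sandbox;

-- Works: the explicit location bypasses the warehouse dir
CREATE SCHEMA sandbox LOCATION 'abfss://data@mystorage.dfs.core.windows.net/sandbox';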
It works pretty well ... except when we specify a location on a table.
Why is it trying to use "spark.sql.warehouse.dir" for that?
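Concretely, a statement like this one fails on our cluster even though the path is explicit (names and path are placeholders again):

CREATE TABLE sandbox.events (id INT)
LOCATION 'abfss://data@mystorage.dfs.core.windows.net/sandbox/events';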
Is there another way to block DBFS usage?
PS: I know that Unity Catalog should solve this problem, but it will be deployed only in a few months.
Thank you.