Hello,
I administer a self-service-oriented Databricks workspace, and I've noticed that more and more users are storing their data in DBFS out of a lack of knowledge.
They either don't specify a location when creating their schema, or don't specify a schema at all.
To prevent them from doing so, I thought about adding a parameter to the Spark config: `spark.sql.warehouse.dir adl://forbidden`.
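For context, a sketch of the two cases as I understand them (the storage path below is a placeholder, not our real account): a `CREATE SCHEMA` without a `LOCATION` clause falls back to `spark.sql.warehouse.dir` (the DBFS root by default), while an explicit `LOCATION` keeps data out of DBFS.

```sql
-- No LOCATION: the schema directory is resolved against
-- spark.sql.warehouse.dir, i.e. DBFS by default
CREATE SCHEMA IF NOT EXISTS sales;

-- Explicit LOCATION: the schema (and its managed tables)
-- lands in external cloud storage instead of DBFS
CREATE SCHEMA IF NOT EXISTS sales_ext
LOCATION 'abfss://container@account.dfs.core.windows.net/schemas/sales';
```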
It works pretty well... except when we add a location on a table.
Why is it trying to use `spark.sql.warehouse.dir` for that?
[Screenshot attached: MaximeGendre_1-1718227315313.png]
Is there another way to block DBFS usage ?
PS: I know that Unity Catalog should solve this problem, but it will be deployed only in a few months.
Thank you.