Hi @migq2 ,
Take a look at the snippets below from the documentation. It works on a single user cluster because that mode has full access to DBFS.
You can try granting the ANY FILE permission to make it work on a shared cluster.
How does DBFS work in single user access mode?
Clusters configured with single user access mode have full access to DBFS, including all files in the DBFS root and mounted data.
How does DBFS work in shared access mode?
Shared access mode combines Unity Catalog data governance with Azure Databricks legacy table ACLs. Access to data in the hive_metastore is only available to users that have permissions explicitly granted.
To interact with files directly using DBFS, you must have ANY FILE permissions granted. Because ANY FILE allows users to bypass legacy table ACLs in the hive_metastore and access all data managed by DBFS, Databricks recommends caution when granting this privilege.
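If you go the ANY FILE route, here is a minimal sketch of the grant from a Python notebook cell. The email is a placeholder for the user or group you want to grant, and I'm assuming you have the privileges to grant it:

# `spark` is the SparkSession predefined in Databricks notebooks
spark.sql("GRANT SELECT ON ANY FILE TO `user@example.com`")  # read access
spark.sql("GRANT MODIFY ON ANY FILE TO `user@example.com`")  # add this if writes are needed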
You can also try to hack this library a little bit. Take a look at the Stack Overflow thread linked below. Instead of passing a DBFS path, you could try passing a path to a UC Volume. I don't know if it will work, but it's worth a try.
from mlflow.utils.databricks_utils import _get_dbutils

# Override the temp-dir lookup so MLflow writes to a location you can access:
def fake_tmp():
    return 'path_to_Volume...'  # something writable, e.g. a UC Volume path

_get_dbutils().entry_point.getReplLocalTempDir = fake_tmp
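As a quick sanity check (my own suggestion, not from the thread), you can call the patched method and confirm it returns your writable path before triggering the MLflow call:

print(_get_dbutils().entry_point.getReplLocalTempDir())  # should print the path returned by fake_tmp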
https://stackoverflow.com/questions/77579396/databricks-11-mlflow-error-permission-denied-in-create-...