12-12-2024 06:11 AM
The way I understand it, mount points are deprecated in UC. dbutils.fs.mount() doesn't even seem to work in newer Databricks runtimes.
But what is the solution when Databricks features don't allow using UC volumes? E.g. a compute's cluster log delivery path won't accept volumes.
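For context, cluster log delivery is configured through the `cluster_log_conf` field of the cluster spec, which takes a `dbfs` destination (or `s3` on AWS) but not a `/Volumes/...` path. A minimal sketch of that fragment as a Python dict, assuming the public Clusters API shape (the cluster name and path here are hypothetical):

```python
# Sketch of the cluster_log_conf fragment from the Databricks Clusters
# API. The destination must be a dbfs:/ URI (or s3:// on AWS); UC
# volume paths are not accepted here, which is the limitation discussed.
cluster_spec = {
    "cluster_name": "example-cluster",  # hypothetical name
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}

destination = cluster_spec["cluster_log_conf"]["dbfs"]["destination"]
assert destination.startswith("dbfs:/")
print(destination)  # → dbfs:/cluster-logs
```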
12-12-2024 06:25 AM
Right now the only supported replacement for mounts is Volumes, as you have mentioned; otherwise you can connect directly to the cloud storage object: https://docs.databricks.com/en/connect/unity-catalog/index.html
12-12-2024 08:29 AM
> otherwise you can connect directly to the cloud storage object
You mean using the cloud provider's APIs? That wouldn't solve logging to cloud storage, as the compute's logging path only allows dbfs:/ references. So can I only log to a managed DBFS location?
12-12-2024 08:47 AM
As you cannot use volumes, it seems that this will indeed be your only option.
12-13-2024 02:37 AM
I'm just wondering why notebooks still support mounts, even in my UC workspace?
12-13-2024 04:22 AM - edited 12-13-2024 04:22 AM
I can actually see that the mount point exists when I do
DBUtils.getDBUtils().fs.mounts()
->
MountInfo(/mnt/cluster-logs,wasbs://xxx@xxx.blob.core.windows.net/cluster-logs/,)
But still no logs arrive.
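The lookup above can also be done from Python, where dbutils.fs.mounts() returns MountInfo entries with mountPoint, source, and encryptionType fields. Outside a Databricks notebook we can model those entries with a namedtuple to show the check; the helper name find_mount is hypothetical:

```python
# Sketch: checking whether a given mount point is present in the list
# returned by dbutils.fs.mounts(). MountInfo is modelled here with a
# namedtuple so the snippet runs outside Databricks; in a notebook you
# would pass dbutils.fs.mounts() directly.
from collections import namedtuple

MountInfo = namedtuple("MountInfo", ["mountPoint", "source", "encryptionType"])

def find_mount(mounts, mount_point):
    """Return the MountInfo whose mountPoint matches, or None."""
    return next((m for m in mounts if m.mountPoint == mount_point), None)

# Sample data matching the output shown in the thread (redacted names kept).
mounts = [MountInfo("/mnt/cluster-logs",
                    "wasbs://xxx@xxx.blob.core.windows.net/cluster-logs/", "")]

m = find_mount(mounts, "/mnt/cluster-logs")
print(m.source if m else "not mounted")
```

Note that the mount existing locally does not prove the cluster's identity can write to the underlying storage, which is consistent with the authorization error found below.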
12-13-2024 04:49 AM
Found the error message at last: DatabricksServiceException: IO_ERROR: java.io.IOException: This request is not authorized to perform this operation using this permission. Please see the cause for further information.