a week ago
The way I understand it, mount points are deprecated in UC. dbutils.fs.mount() doesn't even seem to work in newer Databricks runtimes.
But what is the solution when Databricks features don't allow using UC volumes? E.g. specifying a compute's logging path doesn't work with volumes.
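For context on the logging-path limitation: cluster log delivery is configured via the Clusters API's cluster_log_conf, which accepts a DBFS destination. A minimal sketch of such a request fragment, with a hypothetical path:

```python
import json

# Hypothetical cluster spec fragment: cluster_log_conf takes a DBFS
# destination; a Unity Catalog Volumes path is not accepted here.
cluster_log_conf = {
    "dbfs": {
        "destination": "dbfs:/cluster-logs/my-cluster"  # hypothetical path
    }
}

# This fragment would be merged into the full cluster create/edit payload.
payload = json.dumps({"cluster_log_conf": cluster_log_conf}, indent=2)
print(payload)
```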
a week ago
Right now the only supported option is to use Volumes, as you have mentioned; otherwise you can connect directly to cloud object storage: https://docs.databricks.com/en/connect/unity-catalog/index.html
a week ago
> otherwise you can connect directly to the cloud storage object
You mean using the cloud provider's APIs? That wouldn't solve logging to cloud storage, as the compute's logging path only accepts dbfs:/ references. So can I only log to a managed DBFS location?
a week ago
As you cannot use volumes, it seems that this will indeed be your only option.
a week ago
I'm just wondering why notebooks still support mounts, even in my UC-enabled workspace?
a week ago - last edited a week ago
I can actually see that the mount point does exist when I run:
DBUtils.getDBUtils().fs.mounts()
->
MountInfo(/mnt/cluster-logs,wasbs://xxx@xxx.blob.core.windows.net/cluster-logs/,)
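The same check can be done programmatically by filtering the entries returned by dbutils.fs.mounts(). A sketch with a hypothetical helper, using plain tuples shaped like the MountInfo output above in place of the real objects:

```python
# Hypothetical helper: dbutils.fs.mounts() returns MountInfo entries with
# mountPoint and source fields; plain (mountPoint, source) tuples stand in
# for them here so the sketch runs outside a Databricks notebook.
def find_mount(mounts, mount_point):
    """Return the source for mount_point, or None if it is not mounted."""
    for mp, source in mounts:
        if mp == mount_point:
            return source
    return None

# Sample data mirroring the output above; in a real notebook you would pass
# [(m.mountPoint, m.source) for m in dbutils.fs.mounts()].
mounts = [
    ("/mnt/cluster-logs", "wasbs://xxx@xxx.blob.core.windows.net/cluster-logs/"),
]

print(find_mount(mounts, "/mnt/cluster-logs"))
```

Note that a mount appearing in this list only means it is defined, not that writes through it are authorized, as the error below shows.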
But still no logs arrive.
a week ago
Found the error message at last: DatabricksServiceException: IO_ERROR: java.io.IOException: This request is not authorized to perform this operation using this permission. Please see the cause for further information.