Logging to an external location via UC volume
12-12-2024 06:11 AM
The way I understand it, mount points are deprecated in UC. dbutils.fs.mount() doesn't even seem to work in newer DB runtimes.
But what is the solution when Databricks features don't accept UC volumes? E.g., specifying a compute's logging path won't work with volumes.
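For context, the compute logging path mentioned above is set via cluster_log_conf in the cluster spec, which today takes a dbfs destination. A minimal sketch (the path is a placeholder, not a recommendation):

```json
{
  "cluster_log_conf": {
    "dbfs": { "destination": "dbfs:/cluster-logs" }
  }
}
```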
12-12-2024 06:25 AM
Right now the only supported replacement for mounts is to use Volumes, as you mentioned; otherwise you can connect directly to the cloud storage object: https://docs.databricks.com/en/connect/unity-catalog/index.html
12-12-2024 08:29 AM
> otherwise you can connect directly to the cloud storage object
You mean using the cloud provider's APIs? That wouldn't solve logging to cloud storage, as the compute's logging path only allows dbfs:/ references. So can I only log to a managed DBFS location?
12-12-2024 08:47 AM
As you cannot use volumes, it seems that this will indeed be your only option.
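If you do end up logging to a DBFS destination, logs are delivered under <destination>/<cluster-id>/, with driver logs in a driver/ subfolder. A small helper to build that path; driver_log_path is a name made up for this sketch, and the cluster ID below is illustrative:

```python
def driver_log_path(destination: str, cluster_id: str) -> str:
    """Build the expected driver-log folder for a cluster.

    Cluster logs are delivered under <destination>/<cluster_id>/,
    with driver logs in the driver/ subfolder.
    """
    return f"{destination.rstrip('/')}/{cluster_id}/driver"

print(driver_log_path("dbfs:/cluster-logs", "1212-061100-abc123"))
# dbfs:/cluster-logs/1212-061100-abc123/driver
```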
12-13-2024 02:37 AM
I'm just wondering why notebooks still support mounts, even in my UC workspace?
12-13-2024 04:22 AM - edited 12-13-2024 04:22 AM
I can actually see that the mount point exists when I run
DBUtils.getDBUtils().fs.mounts()
which returns
MountInfo(/mnt/cluster-logs,wasbs://xxx@xxx.blob.core.windows.net/cluster-logs/,)
But still no logs arrive.
12-13-2024 04:49 AM
Found the error message at last: DatabricksServiceException: IO_ERROR: java.io.IOException: This request is not authorized to perform this operation using this permission. Please see the cause for further information.
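That "not authorized to perform this operation using this permission" message is the typical Azure Storage RBAC failure: the identity writing the logs lacks a data-plane role on the storage account. One common fix is granting it Storage Blob Data Contributor; a sketch with placeholder IDs (all values below must be replaced with your own):

```shell
# Placeholder IDs; substitute your principal, subscription, resource group and account.
az role assignment create \
  --assignee "<principal-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
```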

