permission denied listing external volume when using vscode databricks extension
01-13-2024 04:30 PM
Hey, I'm using the Databricks extension for VS Code (Databricks Connect v2). When I use dbutils to list an external volume defined in Unity Catalog, I get a permission-denied error.
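The original snippet was not included in the post, but a minimal reproduction presumably looks something like the following. This assumes the documented way of getting dbutils under the VS Code extension (`databricks.sdk.runtime`); the volume path is a hypothetical placeholder, not the path from the original post.

```python
# Hypothetical reproduction of the failing call.
# Assumes Databricks Connect v2 / the VS Code extension is configured;
# the /Volumes/... path below is a placeholder.
from databricks.sdk.runtime import dbutils

# Listing files in a Unity Catalog external volume.
# Under Databricks Connect this raised a permission-denied error,
# while the same line worked in a workspace notebook.
files = dbutils.fs.ls("/Volumes/my_catalog/my_schema/my_volume/")
for f in files:
    print(f.path)
```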
However, running the same statement in the Databricks workspace works fine.
04-16-2024 08:25 AM
We are still facing the problem (UC-enabled shared cluster). Is there any resolution? @Retired_mod
04-17-2024 06:35 AM
Hi @Retired_mod,
thanks for your answer! In the meantime, we received the following answer from Azure Support:
I have checked this internally, engaging the Databricks backline team, and confirmed that Databricks Connect does not support UC volumes. The Databricks engineering team is working on this, but we do not have an ETA as of now.
Can you confirm that accessing volumes is indeed not yet possible using databricks-connect?
Thanks
01-09-2025 03:32 AM
I am able to list volumes on a shared UC cluster. I hope you are no longer facing this.
01-09-2025 03:46 AM
It now works in VS Code with Databricks Connect as well, at least when using dbutils from the WorkspaceClient instead of the Spark session...
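A minimal sketch of that workaround, assuming the Databricks SDK for Python is installed and credentials are picked up from the environment or the VS Code extension; the volume path is a hypothetical placeholder:

```python
# Workaround sketch: use dbutils from the SDK's WorkspaceClient
# rather than the Spark-session dbutils.
# The /Volumes/... path below is a placeholder.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # resolves auth from env vars / extension config
files = w.dbutils.fs.ls("/Volumes/my_catalog/my_schema/my_volume/")
for f in files:
    print(f.path)
```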
01-09-2025 07:07 AM
Great, thanks for confirming. This feature was under development early last year; it is now available.

