10-05-2023 12:19 PM
On Databricks Runtime 12.2, this Python code block lists the contents of our ad_hoc folder in our mounted GCP bucket.
import os
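# /dbfs exposes DBFS (including mounts) as a local FUSE path on the driver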
os.listdir('/dbfs/mnt/hlm/ad_hoc/')
For some reason, in 13.3 the same code block throws a "No such file or directory" error.
Accepted Solutions
10-06-2023 05:24 AM
I've discovered the cause of this issue. The path is fine; what I'm actually running into is a documented restriction of clusters set to "Shared" access mode:
- Cannot use R, RDD APIs, or clients that directly read the data from cloud storage, such as DBUtils.
Changing the cluster access mode resolved the issue (see the sketch below for one way to handle both modes).
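For anyone hitting the same error, here is a minimal sketch of the pattern, assuming the mount path from the question and a standard Databricks notebook environment (where dbutils is available without an import). On a Single User cluster the FUSE path works; on a Shared access mode cluster it raises FileNotFoundError, and the dbutils.fs.ls fallback may itself be blocked by the same restriction, so treat this as illustrative rather than a guaranteed workaround:

import os

MOUNT = '/mnt/hlm/ad_hoc/'  # mount point from the question

try:
    # FUSE route: DBFS paths appear under /dbfs on the driver's local filesystem
    files = os.listdir('/dbfs' + MOUNT)
except FileNotFoundError:
    # Shared access mode does not expose the FUSE mount; try the DBFS API instead
    files = [f.name for f in dbutils.fs.ls(MOUNT)]

print(files)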

