03-27-2025 07:19 AM
Hi everyone,
We're facing a strange issue when trying to access a Databricks Volume from a job that is triggered via the Databricks REST API (not via Workflows). These jobs run on container services, which may be relevant: isolation constraints there could prevent access to certain Databricks-native features.
The job runs, but when we try to perform a basic file operation like:
with open("/Volumes/folder_example/file.txt", "r") as f:
    data = f.read()
We get the following error:
PermissionError: [Errno 1] Operation not permitted: '/Volumes/folder_example/file.txt'
We increased the log level and got more detail in the traceback:
TaskException: Task in task_example failed: An error occurred while calling o424.load.
: com.databricks.backend.daemon.data.common.InvalidMountException: Error while using path /Volumes/folder_example/ for creating file system within mount at '/Volumes/folder_example/'.
at com.databricks.backend.daemon.data.common.InvalidMountException$.apply(DataMessages.scala:765)
From the error message and behavior, we suspect this is related to how container services isolate the job's execution environment, possibly preventing it from accessing Unity Catalog Volumes, since these mounts may not be available or reachable outside of a native Databricks execution context.
However, we haven’t found official documentation clearly explaining whether Databricks Volumes can be accessed in jobs triggered this way, or under which conditions access is denied.
We can access the same data directly from S3 using the instance profile without any issues. This is expected, since the S3 path is read directly with the instance profile credentials.
• Are Volumes intentionally not accessible from container services?
• Is there any official documentation detailing execution contexts and their access to UC/Volumes/Workspace paths?
Thanks in advance! 🙂
Isi
04-08-2025 08:08 PM
Is the volume a Unity Catalog volume?
Does the user/service principal that the job runs as (run_as) have the required access on the volume?
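For example, a quick way to check from a notebook, assuming a UC-enabled cluster (volume name and principal are hypothetical):

spark.sql("SHOW GRANTS ON VOLUME main.default.my_volume").show(truncate=False)
# Grant read access to the run_as principal if it is missing:
spark.sql("GRANT READ VOLUME ON VOLUME main.default.my_volume TO `sp-application-id`")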
04-12-2025 02:55 AM
Hello @rcdatabricks
Yes, it's a Unity Catalog volume and the principal has the required permissions. I can access it from a notebook or a job, but not through these container services...
Any idea?
Thanks,
Isi
04-21-2025 06:48 AM
@Isi Are you using the databricks-sdk library to access these volumes?
example:
https://docs.databricks.com/aws/en/dev-tools/sdk-python#files-in-volumes
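For reference, a minimal sketch of that approach, reading through the SDK's Files API over REST instead of the /Volumes FUSE mount (the volume path is hypothetical):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # authenticates from the environment / config profile
# files.download goes through the REST Files API, so it does not
# depend on /Volumes being mounted in the execution environment
resp = w.files.download("/Volumes/main/default/my_volume/file.txt")
data = resp.contents.read().decode("utf-8")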
07-07-2025 04:02 AM
@Isi Did you find a solution to this issue? I am facing the exact same problem right now.
07-10-2025 01:32 PM
@rxj @Octavian1
No 😞 I didn't, but a volume is basically a door to the underlying storage, so we ended up reading the path directly with boto3.
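Roughly what we do now, assuming the cluster's instance profile has read access to the bucket backing the volume (bucket and key are hypothetical):

import boto3

s3 = boto3.client("s3")  # picks up the instance profile credentials automatically
obj = s3.get_object(Bucket="my-example-bucket", Key="path/to/file.txt")
data = obj["Body"].read().decode("utf-8")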
07-10-2025 06:25 AM
Check also whether the cluster used to run the job has the right access to the specific UC Volume.