
Unable to access Databricks Volume from job triggered via API (Container Services)

Isi
Contributor

Hi everyone,

We’re facing a strange issue when trying to access a Databricks Volume from a job that is triggered via the Databricks REST API (not via Workflows). These jobs run on container services, which may be relevant: isolation constraints could be preventing access to certain Databricks-native features.
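
For reference, this is roughly how we trigger the job; the host, token, and job ID below are placeholders, not our real values:

import requests

# Trigger an existing Databricks job via the Jobs API 2.1 run-now endpoint.
# Workspace host, token, and job_id are placeholders.
resp = requests.post(
    "https://<workspace-host>/api/2.1/jobs/run-now",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={"job_id": 123456789},
)
resp.raise_for_status()
print(resp.json())  # contains the run_id of the triggered run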

The job runs, but when we try to perform a basic file operation like:

# Plain POSIX read through the Volumes FUSE mount
with open("/Volumes/folder_example/file.txt", "r") as f:
    data = f.read()

We get the following error:

PermissionError: [Errno 1] Operation not permitted: '/Volumes/folder_example/file.txt'

We increased the log level and got more detail in the traceback:

TaskException: Task in task_example failed: An error occurred while calling o424.load.
: com.databricks.backend.daemon.data.common.InvalidMountException: Error while using path /Volumes/folder_example/ for creating file system within mount at '/Volumes/folder_example/'.
	at com.databricks.backend.daemon.data.common.InvalidMountException$.apply(DataMessages.scala:765)

From the error message and behavior, we suspect this could be related to how container services isolate the job’s execution environment, possibly preventing it from accessing Unity Catalog Volumes, since these mounts may not be available or reachable outside of a native Databricks execution context.
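
A quick way to narrow this down from inside the job is to compare plain POSIX access with dbutils (the path is a placeholder, and dbutils may itself not be available in this execution context):

import os

# Is the Volumes FUSE mount visible to this process at all?
print(os.path.exists("/Volumes"))
print(os.path.exists("/Volumes/folder_example"))

# For comparison, go through Databricks' own file-system layer instead of FUSE.
try:
    entries = dbutils.fs.ls("/Volumes/folder_example/")
    print([e.path for e in entries])
except Exception as e:
    print(f"dbutils access failed too: {e}")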

However, we haven’t found official documentation clearly explaining whether Databricks Volumes can be accessed in jobs triggered this way, or under which conditions access is denied.

We can access the same data directly from S3 using the instance profile without any issues. This is expected, since the S3 path is read directly with the instance profile credentials.
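
For example, something along these lines works fine from the same job (bucket and key are placeholders):

import boto3

# Direct S3 read via the instance profile; no Databricks file-system layer involved.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-bucket", Key="folder_example/file.txt")
data = obj["Body"].read().decode("utf-8")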

• Are Volumes intentionally not accessible from container services?

• Is there any official documentation detailing execution contexts and their access to UC/Volumes/Workspace paths?

Thanks in advance! 🙂

Isi

2 REPLIES

rcdatabricks
New Contributor II

Is the Volume a Unity Catalog volume?

Does the user/service principal set as the job's run_as identity have the required access on the volume? A quick way to check is shown below.
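
For instance, from a notebook you could inspect the grants like this (the three-level volume name is a placeholder):

# List the privileges granted on the volume; the run_as principal needs
# READ VOLUME here, plus USE CATALOG and USE SCHEMA on the parents.
spark.sql("SHOW GRANTS ON VOLUME main.default.folder_example").show(truncate=False)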

Isi
Contributor

Hello @rcdatabricks,

Yes, it's a Unity Catalog volume and the permissions are in place. I can access it from a notebook or a regular job, but not through these container services...

Any idea?
Thanks,
Isi