Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Mount Workspace to Docker container

ivanychev
Contributor II

Is there a way to mount the Workspace folder (WSFS) into a Docker container when I'm using Databricks Container Services to run a general-purpose cluster?

If I create a cluster without a Docker image, the `!ls` command in a Databricks notebook shows a `/Workspace` folder that contains user folders as well as their mounted Git repositories.
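For context, this is the kind of check I run from a notebook cell to see whether WSFS is visible (a minimal sketch; the helper name is my own):

```python
import os

def workspace_mounted(root="/Workspace"):
    """Return True if the WSFS mount point is visible on the driver."""
    return os.path.isdir(root)

# On a standard (non-Docker) cluster this prints the per-user folders and
# Repos checkouts; on a Container Services cluster the directory may be absent.
if workspace_mounted():
    print(sorted(os.listdir("/Workspace")))
else:
    print("/Workspace is not mounted on this cluster")
```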

Is there a way to make this folder available when running a cluster with a Docker image? 

Sergey
2 REPLIES

User16539034020
Databricks Employee

Hello:

Thanks for contacting Databricks Support!

I'm afraid that mounting WSFS directly into a Docker container isn't supported. The Databricks workspace is a specialized environment and isn't analogous to a regular filesystem.

When you use custom containers in Databricks, your notebooks still run inside the Databricks workspace, and you can still access DBFS and other Databricks-specific features. The container essentially provides the runtime environment, but you are still operating within the Databricks ecosystem. So, when you execute commands within a notebook, they run in that Databricks context, and you should still have access to /Workspace and other Databricks-specific directories.

Regards,


"So, when you execute commands within a notebook, they run in that Databricks context, and you should still have access to /Workspace and other Databricks-specific directories."

This does not appear to be true. When I try to open any path under /Workspace from a Python notebook on a cluster running, for example, the image projectglow/databricks-glow:10.4, I get "No such file or directory" errors.
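A minimal reproduction of what I see (the probe helper is just for illustration):

```python
import os

def probe(path):
    """Return directory entries, or the error message if the path is missing."""
    try:
        return sorted(os.listdir(path))
    except FileNotFoundError as e:
        return str(e)

# On the custom-container cluster this returns an error string like
# "[Errno 2] No such file or directory: '/Workspace'" instead of a listing.
print(probe("/Workspace"))
```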
