04-29-2025 05:46 AM
I am running Databricks Premium and looking to create a compute resource running conda. It seems that the best way to do this is to boot the compute from a Docker image. However, under ```create_compute > advanced``` I cannot see the Docker option, nor can I see it in the other tabs of the create compute page.
Am I missing permissions, or should I alter some underlying configuration?
04-30-2025 05:58 AM
Hello @Askenm!
Have you enabled Databricks Container Services in your workspace settings?
Without enabling DCS, the Docker option won't appear when you create a compute cluster.
For reference: https://docs.databricks.com/aws/en/compute/custom-containers
04-30-2025 09:16 PM
Advika is correct; Databricks Container Services needs to be explicitly enabled for your workspace.
A workspace admin can enable it using the Databricks CLI with the following command:
```
databricks workspace-conf set-status --json '{"enableDcs": "true"}'
```
For reference: https://docs.databricks.com/aws/en/compute/custom-containers#enable
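To confirm the change took effect, an admin can read the setting back. A minimal sketch, assuming your CLI version supports `workspace-conf get-status` and that the `/api/2.0/workspace-conf` REST endpoint is reachable with a personal access token:

```bash
# Read back the enableDcs flag (should print "true" once the setting is applied).
databricks workspace-conf get-status enableDcs

# Equivalent REST call, assuming DATABRICKS_HOST and DATABRICKS_TOKEN are set
# in your environment:
curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "$DATABRICKS_HOST/api/2.0/workspace-conf?keys=enableDcs"
```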
11-16-2025 07:23 PM
@NandiniN @Advika I've followed the documentation and enabled DCS via the Databricks CLI by running:
```
databricks workspace-conf set-status --json '{"enableDcs": "true"}'
```
I even verified the setting by running get-status.
However, one month later, the Docker tab is still not showing up in the Advanced section of the Create Compute page. I tried setting the Access Mode to Dedicated: Single User, and it still doesn't show. I tried No User Isolation, and that doesn't work either.
11-16-2025 07:30 PM
I am using SINGLE_USER access mode and Databricks Runtime 16.4 LTS on GCP, but I do not see the Docker tab for custom containers. Is Databricks Container Services enabled for my workspace, and is there any tier or backend restriction?
Monday
Even though I've set enableDcs to true, the Docker tab still doesn't show up.
Monday - last edited Monday
Hi @Askenm
In Databricks Premium, the Docker option for custom images is not available on all compute types, and it is not controlled by user-level permissions. Custom Docker images are only supported on clusters that use the legacy VM-based compute with the standard access mode; they are not available on serverless compute, shared access mode, or some of the newer simplified cluster UIs. If you do not see the Docker option in the advanced settings, it usually means your workspace or the selected compute type does not support custom container images, and there is no underlying configuration you can change to enable it in those cases.

The recommended alternatives are to use init scripts to install and activate conda on cluster startup (see the sketch below), to use Databricks-supported environments such as the ML runtimes, which include conda, or to manage Python dependencies with Databricks libraries. If Docker-based compute is a strict requirement, you need to ensure the workspace allows legacy clusters and that you select a compatible runtime and access mode.
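If you go the init-script route, here is a minimal sketch of a cluster-scoped init script that installs Miniconda at node startup. The install path, the installer URL, and the environment spec location are illustrative assumptions, not a Databricks-prescribed layout:

```bash
#!/bin/bash
# Hypothetical init script: install Miniconda on each cluster node at startup.
set -euo pipefail

CONDA_DIR=/databricks/miniconda   # assumed install location

# Download and run the Miniconda installer in batch (non-interactive) mode.
curl -fsSL https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh \
  -o /tmp/miniconda.sh
bash /tmp/miniconda.sh -b -p "$CONDA_DIR"

# Expose conda to later shells on the node.
echo "export PATH=$CONDA_DIR/bin:\$PATH" > /etc/profile.d/conda.sh

# Optionally build an environment from a spec you host yourself (path is illustrative):
# "$CONDA_DIR/bin/conda" env create -f /Volumes/main/configs/environment.yml
```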