04-29-2025 05:46 AM
I am running Databricks Premium and looking to create a compute resource running conda. It seems that the best way to do this is to boot the compute from a Docker image. However, under ```Create Compute > Advanced``` I cannot see the Docker option, nor can I see it in any other tab of the Create Compute page.
Am I missing permissions, or should I alter some underlying configuration?
04-30-2025 05:58 AM
Hello @Askenm!
Have you enabled Databricks Container Services in your workspace settings?
Without enabling DCS, the Docker option won't appear when you create a compute cluster.
For reference: https://docs.databricks.com/aws/en/compute/custom-containers
04-30-2025 09:16 PM
Advika is correct, Databricks Container Services needs to be explicitly enabled for your workspace.
A workspace admin can enable it using the Databricks CLI with the following command:
databricks workspace-conf set-status --json '{"enableDcs": "true"}'
Refer - https://docs.databricks.com/aws/en/compute/custom-containers#enable
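To confirm the flag actually took effect, you can query the `workspace-conf` endpoint directly. A sketch using the REST API via curl; `$DATABRICKS_HOST` and `$DATABRICKS_TOKEN` are assumed to be set for an admin principal:

```shell
# Read back the DCS flag after setting it.
# Assumes DATABRICKS_HOST (e.g. https://<workspace-url>) and a valid
# DATABRICKS_TOKEN are exported in the shell.
curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "$DATABRICKS_HOST/api/2.0/workspace-conf?keys=enableDcs"
# A response of {"enableDcs": "true"} indicates the setting is active.
```

The CLI's `get-status` subcommand queries the same endpoint, so either method should agree.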
Sunday
@NandiniN @Advika I've followed the documentation and enabled DCS by using the Databricks CLI and running
databricks workspace-conf set-status --json '{"enableDcs": "true"}'
I even checked by running get-status.
However, one month later the Docker tab is still not showing up in the Advanced section of the Create Compute page. I tried setting the Access Mode to Dedicated (Single User), and it still doesn't show. I tried No User Isolation, and that doesn't work either.
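One way to narrow this down when the UI tab is missing: the Clusters REST API accepts a `docker_image` block regardless of what the UI renders, so attempting a create through the API will either work or return an error message stating why custom containers are rejected. A sketch only; the cluster name, `spark_version`, `node_type_id`, and image URL below are placeholders to adapt to your workspace:

```shell
# Try creating a DCS cluster via the Clusters API to surface a concrete error.
# Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are exported; all field values
# here are placeholders, not a recommended configuration.
curl -s -X POST -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "$DATABRICKS_HOST/api/2.1/clusters/create" \
  -d '{
    "cluster_name": "conda-dcs-test",
    "spark_version": "16.4.x-scala2.12",
    "node_type_id": "n2-highmem-4",
    "num_workers": 1,
    "docker_image": { "url": "myregistry/conda-image:latest" }
  }'
# If DCS is blocked at the workspace or platform level, the error message in
# the response is usually more specific than a missing UI tab.
```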
Sunday
I am using SINGLE_USER access mode and Databricks Runtime 16.4 LTS on GCP, but I do not see the Docker tab for custom containers. Is Databricks Container Services enabled for my workspace, and is there any tier or backend restriction?