I'm trying to use a custom Docker image for my job. This is my Dockerfile:
FROM databricksruntime/standard:12.2-LTS
USER root
COPY . .
RUN /databricks/python3/bin/pip install -U pip
RUN /databricks/python3/bin/pip install -r requirements.txt
My job uses a pool. First I tried going directly to the job -> Compute -> Advanced -> Docker and setting my image there, but the job fails with the following:
Unexpected user error while preparing the cluster for the job. Cause: INVALID_PARAMETER_VALUE: The target instance pool InstancePoolId(xxxxx) does not have docker images configured, thus not supporting cluster creation with docker image. Please update your cluster attribute or create a separate instance pool for docker image clusters.
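Reading that message, it sounds like the pool itself has to be created with the Docker image preloaded rather than setting it only on the job cluster. I'm guessing the pool definition would need something like this (the `preloaded_docker_images` field is from the Instance Pools API; the image URL, pool name, and node type here are placeholders, not my real values):

```json
{
  "instance_pool_name": "docker-pool",
  "node_type_id": "Standard_DS3_v2",
  "preloaded_docker_images": [
    {
      "url": "myregistry.example.com/my-custom-image:latest"
    }
  ]
}
```

Is that the right direction, or is there a way to keep using the existing pool?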
So instead I tried creating a new all-purpose cluster with my custom image defined, but when the cluster tries to initialize it fails with this error:
23/07/25 13:40:48 ERROR DriverDaemon$: stderr:
/databricks/spark/scripts/setup_container_iptables_rules.sh: line 32: iptables: command not found
23/07/25 13:40:48 ERROR DriverDaemon$: XXX Fatal uncaught exception. Terminating driver.
org.apache.spark.api.python.PythonSecurityException: Failed to run: 'enable iptables restrictions for Python'
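From that stack trace, the cluster setup script apparently expects the iptables binary to exist inside the container. I'm wondering if the fix is simply to install it in the image, something like this near the top of my Dockerfile (untested guess on my part):

```dockerfile
# Assumption: the Databricks setup script calls iptables at boot,
# so the binary must be present in the container image
USER root
RUN apt-get update && apt-get install -y iptables
```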
Any advice?