Hi @Kevin Kim! We cannot export cluster configs into a Docker image. The purpose of providing custom Docker images is to pre-install the necessary dependencies on the cluster nodes, instead of installing them during cluster startup. The image itself cannot carry the Spark or other cluster configs.
However, if you would like to create a cluster with the same configs as an existing cluster in another workspace, you can use the Clusters REST API:
https://docs.databricks.com/dev-tools/api/latest/clusters.html