Thanks @Renu_, that makes sense. I understand that in the DBX workspaces you can choose between the two 16.4 Spark versions when creating a compute.
My confusion was with using the Docker image in a local environment. I pull it from the registry, but then I'm also installing dependencies with pip. So if I understand it correctly, I will still pull the one and only 16.4 Docker image, but then, depending on which Scala version I want to test with, I need to install the packages compiled for that Scala version.
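For context, this is roughly what my local setup looks like today (the image name/tag below is just a placeholder for whatever the 16.4 image is actually called in the registry, not something I've verified):

```bash
# Placeholder image tag -- substitute the actual 16.4 image from the registry.
docker pull databricksruntime/standard:16.4-LTS
docker run -it databricksruntime/standard:16.4-LTS bash

# ...then inside the container, dependencies come from pip, e.g.:
pip install pyspark   # the PyPI wheel only bundles Scala 2.12 jars
```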
I haven't found a simple way of pip-installing Scala 2.13-compiled packages, and what I did find is this open issue (PySpark installation doesn't support Scala 2.13 binaries):
https://issues.apache.org/jira/browse/SPARK-39995
If anyone has a suggestion for how to set up a local Dockerfile that runs the 16.4 image with Scala 2.13 packages, it would be greatly appreciated.
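For reference, the rough direction I've been imagining is something like the sketch below: instead of relying on the Scala 2.12 jars bundled with the pip-installed pyspark wheel, download an Apache Spark distribution built against Scala 2.13 and point SPARK_HOME at it. The base-image tag, Spark version, and download URL are all assumptions on my part, not something I've confirmed works:

```dockerfile
# Sketch only -- base image tag and Spark/Scala versions are unverified assumptions.
FROM databricksruntime/standard:16.4-LTS

ARG SPARK_VERSION=3.5.2
ARG SCALA_SUFFIX=scala2.13

# Download a Spark distribution prebuilt with Scala 2.13 from the Apache archive
# (assumes curl is available in the base image).
RUN curl -fsSL "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3-${SCALA_SUFFIX}.tgz" \
      | tar -xz -C /opt \
 && ln -s "/opt/spark-${SPARK_VERSION}-bin-hadoop3-${SCALA_SUFFIX}" /opt/spark-scala2.13

# Point PySpark at the Scala 2.13 jars rather than the ones bundled in the wheel.
ENV SPARK_HOME=/opt/spark-scala2.13
ENV PATH="${SPARK_HOME}/bin:${PATH}"

# Python-side dependencies still come from pip; the pyspark version should match
# the Spark distribution above so the Python API and the jars line up.
RUN pip install "pyspark==${SPARK_VERSION}"
```

My understanding is that the pip-installed pyspark launches the JVM via whatever SPARK_HOME points to, so this would swap in the Scala 2.13 jars, but I haven't actually gotten it working end to end, which is why I'm asking.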