Regarding the --files option in the spark-submit task of Databricks jobs, I would like to understand how it works and what the correct syntax is for passing multiple files to --files.
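To make the question concrete, this is roughly the kind of task definition I have been experimenting with, written here as the parameters list of a spark_submit_task. All paths, class names, and file names are placeholders, and I am assuming --files takes a comma-separated list as it does in plain spark-submit, which may be exactly the part I have wrong:

```python
# Sketch only: placeholder paths/classes, and I am unsure whether this
# comma-separated form of --files is the right syntax on Databricks.
spark_submit_task = {
    "parameters": [
        "--class", "com.example.MyJob",                                 # placeholder main class
        "--files", "dbfs:/configs/app.conf,dbfs:/configs/lookup.csv",   # comma-separated list?
        "dbfs:/jars/my-job-assembly.jar",                               # placeholder application jar
        "arg1", "arg2",                                                 # application arguments
    ]
}
```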
I tried using both --files and --py-files. My understanding is that they should make the listed files available in the working directory of the driver and the executors, but that is not what I am seeing.
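For reference, this is how our on-premise jobs consume a file shipped with --files, and what I expected to keep working after the migration (the file name "app.conf" is a placeholder):

```python
# Expected behavior based on our on-premise spark-submit jobs: a file passed
# via --files is distributed to the driver/executors and can be resolved
# through SparkFiles. On Databricks this lookup does not find the file for me.
from pyspark import SparkFiles
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

conf_path = SparkFiles.get("app.conf")   # local path to the distributed copy
with open(conf_path) as f:
    print(f.read())
```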
There are no clear examples of the syntax or the behavior of the spark-submit task in the documentation. This makes it a real struggle to migrate legacy/on-premise Spark jobs that rely on spark-submit to Databricks: it is not a direct lift and shift, and that is complicating things.