Hi,
We tried to simulate the issue on our end: we packaged a module inside a .whl file.
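For reference, here is a minimal layout that could produce such a wheel. Everything below is a sketch using the same hypothetical names as the rest of this post (somewhlpackage, module_two, ModuleTwo), and the setup.py assumes setuptools and the wheel package are installed:

somewhlpackage/
    __init__.py
    module_two.py
setup.py

# somewhlpackage/module_two.py -- hypothetical module shipped in the wheel
class ModuleTwo:
    def print(self):
        print("Hello from ModuleTwo")

# setup.py -- minimal packaging sketch
from setuptools import setup, find_packages

setup(
    name="somewhlpackage",
    version="1.0.0",
    packages=find_packages(),
)

Running "python setup.py bdist_wheel" from the directory containing setup.py should produce dist/somewhlpackage-1.0.0-py3-none-any.whl.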
Now, to access the wheel file, we created another Python file, test_whl_locally.py. Inside test_whl_locally.py, to access the contents of the wheel you first have to import the module or the class you want to use, e.g.:
# Syntax:
# from <wheelpackagedirname>.<module> import <className>
# refVar = <className>()
# Example:
from somewhlpackage.module_two import ModuleTwo
moduleTwo = ModuleTwo()
moduleTwo.print()
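To sanity-check the wheel locally before uploading it (assuming it was built as sketched above), you can install it with pip and run the calling file:

pip install dist/somewhlpackage-1.0.0-py3-none-any.whl
python test_whl_locally.py

If the import works locally, the same code should work once both files are on DBFS.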
Now upload both the wheel package (in your case it will be the egg file) and the calling Python file (in our case test_whl_locally.py, in your case job_main.py) to DBFS; one way to script this upload is sketched at the end of this post. Once done, configure your spark-submit parameters:
["--py-files","dbfs:/FileStore/tables/whls/somewhlpackage-1.0.0-py3-none-any.whl","dbfs:/FileStore/tables/whls/test_whl_locally.py"]
If you look closely, we have provided the fully qualified DBFS path in --py-files, so when spark-submit runs it will install the wheel/egg file into the virtual environment that it creates.
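As promised above, here is one way to script the upload step. This assumes the (legacy) Databricks CLI is installed and configured against your workspace:

databricks fs cp dist/somewhlpackage-1.0.0-py3-none-any.whl dbfs:/FileStore/tables/whls/
databricks fs cp test_whl_locally.py dbfs:/FileStore/tables/whls/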