10-06-2021 11:58 PM
Error: missing application resource
I am getting this error while running a job with spark-submit. I gave the following parameters when creating the job:
--conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=databricks/path/python3
--py-files dbfs/path/to/.egg job_main.py
These were given as expected by the Databricks spark-submit syntax.
Can anyone please let me know if anything is missing in the spark-submit parameters?
Accepted Solutions
10-11-2021 11:32 AM
Hi,
We tried to simulate the question on our end: we packaged a module inside a .whl file.
To access the wheel we created another Python file, test_whl_locally.py. Inside test_whl_locally.py, to access the contents of the wheel you first have to import the module or the class you want to use, e.g.:
# Syntax:
# from <wheelpackagedirname>.<module> import <className>
# refVar = <className>()
# Example:
from somewhlpackage.module_two import ModuleTwo
moduleTwo = ModuleTwo()
moduleTwo.print()
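For reference, the wheel in this example is assumed to be built from a layout roughly like the one below; only the package, module and class names come from the snippet above, and the body of ModuleTwo is a guess:
# Assumed layout behind somewhlpackage-1.0.0-py3-none-any.whl
# somewhlpackage/
#     __init__.py
#     module_two.py

# somewhlpackage/module_two.py (hypothetical contents)
class ModuleTwo:
    def print(self):
        # Placeholder body; the real class can do anything.
        print("ModuleTwo imported from the wheel")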
Now upload both the wheel package (in your case it will be the egg file) and the calling Python file (in our case test_whl_locally.py, in your case job_main.py) to DBFS. Once done, configure your spark-submit parameters:
["--py-files","dbfs:/FileStore/tables/whls/somewhlpackage-1.0.0-py3-none-any.whl","dbfs:/FileStore/tables/whls/test_whl_locally.py"]
If you look closely, we have provided the fully qualified dbfs:/ path in --py-files, so when --py-files is processed it will install the wheel/egg file into the virtual environment that it creates.
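In your case, job_main.py plays the role of test_whl_locally.py and imports from the egg in the same way. A minimal sketch (the package and class names below are placeholders carried over from this example, not taken from your egg):
# job_main.py - driver file passed as the last spark-submit argument
from pyspark.sql import SparkSession
from somewhlpackage.module_two import ModuleTwo  # resolved from the --py-files archive

def main():
    spark = SparkSession.builder.appName("egg-job").getOrCreate()
    ModuleTwo().print()
    spark.stop()

if __name__ == "__main__":
    main()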
10-11-2021 07:19 AM
Hi @D M ,
Good day.
Can you please try with
--py-files /dbfs/path/to/.egg job_main.py
The above path will invoke the DBFS FUSE driver.
If the spark-submit still fails, can you please provide the full stack trace?
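For example, the full parameter list might then look like the sketch below, which keeps the placeholder paths from the question, corrects the PYSPARK_PYTHON spelling, and gives the main file a fully qualified path as in the accepted answer:
["--conf","spark.yarn.appMasterEnv.PYSPARK_PYTHON=databricks/path/python3","--py-files","/dbfs/path/to/.egg","/dbfs/path/to/job_main.py"]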
10-12-2021 03:18 AM
Thanks a lot for looking into this issue and providing the above solution, but my scenario is one where I want to read the main-function .py file from a .zip package (containing a number of .py files). Can you please tell me how to pass the main-function Python file, or how it will be referenced?
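One pattern that may apply here (a sketch only; the package and function names are hypothetical, not taken from the actual .zip): pass the .zip through --py-files so its packages land on the Python path, and pass a thin driver .py as the application file that imports and calls the packaged main function.
# run_job.py - hypothetical thin driver passed as the application file
from mypackage.entry import main  # hypothetical module inside the zip

if __name__ == "__main__":
    main()

# Corresponding parameters (placeholder paths):
# ["--py-files","dbfs:/path/to/package.zip","dbfs:/path/to/run_job.py"]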

