Data Engineering
Run a PySpark job from a Python egg package using spark-submit on Databricks

ItsMe
New Contributor II

Error: missing application resource

I am getting this error while running a job with spark-submit. I provided the following parameters when creating the job:

--conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=databricks/path/python3

--py-files dbfs/path/to/.egg job_main.py

These were given as expected per the Databricks spark-submit syntax.

Can anyone please let me know if anything is missing from the spark-submit parameters?
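
In the job's spark-submit parameters field, this is entered roughly as the following array (the paths are shortened placeholders, not the real file names):

["--conf","spark.yarn.appMasterEnv.PYSPARK_PYTHON=databricks/path/python3","--py-files","dbfs/path/to/<package>.egg","job_main.py"]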

1 ACCEPTED SOLUTION

User16752246494
Contributor

Hi,

We tried to simulate the question on our end: we packaged a module inside a .whl file.

Now, to access the wheel file, we created another Python file, test_whl_locally.py. Inside test_whl_locally.py, to access the contents of the wheel you first have to import the module or class you want to use, e.g.:

#Syntax
# from <wheelpackagedirname>.<module> import <className>
# refVar = <className>()
# example :
 
from somewhlpackage.module_two import ModuleTwo
 
moduleTwo = ModuleTwo()
moduleTwo.print()
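
For reference, the module packaged inside the wheel could look roughly like this (the package, module, and class names are assumed from the import above):

# somewhlpackage/module_two.py -- assumed layout inside the wheel
class ModuleTwo:
    def print(self):
        # Hypothetical body: just confirms the class was imported from the wheel.
        print("ModuleTwo imported from somewhlpackage")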

Now upload both the wheel package (in your case, the egg file) and the calling Python file (in our case test_whl_locally.py, in your case job_main.py) to DBFS. Once that is done, configure your spark-submit parameters:

["--py-files","dbfs:/FileStore/tables/whls/somewhlpackage-1.0.0-py3-none-any.whl","dbfs:/FileStore/tables/whls/test_whl_locally.py"]

If you look closely, we have provided the fully qualified dbfs:/ path in --py-files, so when --py-files is processed it installs the wheel/egg file into the virtual environment that it creates; the last argument, the .py file, is picked up as the application resource.
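
For completeness, the calling file passed as the application resource could be as small as the following sketch (it assumes the ModuleTwo class above; your job_main.py would do the real Spark work):

# test_whl_locally.py -- minimal sketch of the driver/application file
from pyspark.sql import SparkSession
from somewhlpackage.module_two import ModuleTwo

if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    ModuleTwo().print()
    spark.stop()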


4 REPLIES

Kaniz
Community Manager

Hi @Dhananjay! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, I will get back to you soon. Thanks.

User16752246494
Contributor

Hi @D M,

Good day.

Can you please try with

--py-files /dbfs/path/to/.egg job_main.py

The above will invoke the FUSE driver.
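
In the job's spark-submit parameters field, that could look roughly like this (placeholder paths; the main script is also given with a fully qualified path, as in the accepted solution above):

["--py-files","/dbfs/path/to/<your_package>.egg","dbfs:/path/to/job_main.py"]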

If the spark-submit still fails, can you please provide the full stack trace?


Thanks a lot for looking into this issue and providing the above solution, but my expected scenario is one where I want to run the main-function .py file from a .zip package (containing a number of .py files). Can you please tell me how to pass the main-function Python file, or how it will be referenced?
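
One possible approach (a sketch only, not something confirmed in this thread): keep a small job_main.py outside the .zip as the application resource, pass the .zip through --py-files so its modules land on the Python path, and have job_main.py import and call the packaged entry point. The paths, package, and function names below are placeholders:

["--py-files","dbfs:/path/to/<your_package>.zip","dbfs:/path/to/job_main.py"]

# job_main.py -- hypothetical thin entry point that imports from the zip
from mypackage.main_module import main

if __name__ == "__main__":
    main()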
