How can I add jars ("spark.jars") to pyspark notebook?
10-14-2019 12:29 PM
I want to add a few custom jars to the Spark conf. Typically they would be passed along with the spark-submit command, but in a Databricks notebook the Spark session is already initialized. So I want to set the jars through the "spark.jars" property in the conf. Even when I create a new session with the new conf, it does not seem to pick up the jars (see the sketch below). Is there a better way to add jars?
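For illustration, this is roughly what I am trying; the jar paths are just placeholders for my custom jars:

```python
from pyspark.sql import SparkSession

# Stop the session Databricks created for the notebook and try to build a
# new one with the extra jars listed on "spark.jars".
spark.stop()
spark = (
    SparkSession.builder
    .config(
        "spark.jars",
        "dbfs:/FileStore/jars/xgboost4j.jar,dbfs:/FileStore/jars/xgboost4j-spark.jar",
    )
    .getOrCreate()
)
```

The new session is created, but jobs that need classes from those jars still fail, so the jars do not appear to reach the driver or executor classpaths.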
- Labels:
  - Pyspark
  - Spark
  - Spark-submit
  - Xgboost
  - Xgboost4j
10-14-2019 11:05 PM
Hi @dbansal, install the libraries/jars while initialising the cluster.
Please go through the documentation below:
https://docs.databricks.com/libraries.html#upload-a-jar-python-egg-or-python-wheel
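If you prefer to script it rather than use the UI, you can also attach an uploaded jar to an existing cluster through the Libraries API. A minimal sketch, assuming the jar has already been uploaded to DBFS; the workspace URL, token, cluster ID, and jar path are placeholders:

```python
import requests

host = "https://<databricks-instance>"      # placeholder workspace URL
token = "<personal-access-token>"           # placeholder API token

# Install a jar library on a running cluster via the Libraries API.
resp = requests.post(
    f"{host}/api/2.0/libraries/install",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_id": "<cluster-id>",
        "libraries": [{"jar": "dbfs:/FileStore/jars/xgboost4j-spark.jar"}],
    },
)
resp.raise_for_status()
```

Once the library is installed on the cluster, its classes are available on the driver and executor classpaths for every notebook attached to that cluster, so there is no need to recreate the Spark session.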