How can I add jars ("spark.jars") to pyspark notebook?
I want to add a few custom jars to the Spark conf. Typically they would be submitted along with the spark-submit command, but in a Databricks notebook the Spark session is already initialized. So I want to set the jars in the "spark.jars" property of the Spark conf from within the notebook.
Latest Reply
Hi @dbansal, install the libraries/jars while initializing the cluster. Please go through the documentation on this here: https://docs.databricks.com/libraries.html#upload-a-jar-python-egg-or-python-wheel
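For reference, "spark.jars" only takes effect at session startup, which is why it cannot be set on the already-running session in a Databricks notebook. Below is a minimal sketch of how the property can be set when you control session creation yourself (e.g. plain PySpark outside Databricks); the jar path is a hypothetical placeholder:

```python
from pyspark.sql import SparkSession

# Sketch: pass custom jars via "spark.jars" before the session is built.
# The value is a comma-separated list of jar paths; "/path/to/custom-lib.jar"
# is a placeholder, not a real artifact.
spark = (
    SparkSession.builder
    .appName("custom-jars-example")
    .config("spark.jars", "/path/to/custom-lib.jar")
    .getOrCreate()
)
```

In a Databricks notebook, where `spark` already exists, the equivalent is to attach the jar to the cluster as a library (per the documentation linked above) so it is on the classpath when the session starts.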