Data Engineering
How can I add jars ("spark.jars") to pyspark notebook?

dbansal
New Contributor

I want to add a few custom jars to the Spark conf. Typically they would be passed along with the spark-submit command, but in a Databricks notebook the Spark session is already initialized. So I tried setting the jars in the "spark.jars" property of the conf. Even when I create a new session with the new conf, it does not seem to pick up the jars. Is there a better way to add jars?
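For context, this is a sketch of the standard approach outside a managed notebook, where the session has not been created yet (assumes PySpark is installed locally; the jar path is hypothetical). It also illustrates why setting the property in a notebook fails: `spark.jars` only takes effect if it is set before the JVM and session start.

```python
# Minimal sketch, assuming a local PySpark install; the jar path is hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("with-custom-jars")
    # spark.jars must be set BEFORE the session's JVM starts. Setting it on an
    # already-running session (or a "new" session sharing the same JVM, as in a
    # Databricks notebook) has no effect on the classpath.
    .config("spark.jars", "/path/to/custom.jar")
    .getOrCreate()
)
```

Because a Databricks notebook attaches to a cluster whose JVM is already running, this builder-time configuration has to happen at cluster level instead, as the reply below suggests.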

1 REPLY

shyam_9
Valued Contributor

Hi @dbansal, install the libraries/jars when initializing the cluster.

Please see the documentation below:

https://docs.databricks.com/libraries.html#upload-a-jar-python-egg-or-python-wheel
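Besides uploading the jar through the Libraries UI described in that page, one alternative is to reference the jar in the cluster's Spark configuration (cluster edit page, under Advanced options), so it is on the classpath before the driver JVM starts. A minimal sketch, assuming the jar has already been uploaded to DBFS (the path below is hypothetical):

```
spark.jars dbfs:/FileStore/jars/custom.jar
```

Either way, the jar must be in place before the cluster starts; a restart is needed for the setting to take effect.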
