I want to add a few custom jars to the Spark conf. Normally I would pass them along with the spark-submit command, but in a Databricks notebook the Spark session is already initialized. So I tried setting the jars via the "spark.jars" property in the conf. Even when I create a new session with the new conf, it doesn't seem to pick up the jars. Is there a better way to add jars?
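Here is roughly what I tried (the jar path is just a placeholder):

```python
from pyspark.sql import SparkSession

# Attempt: set "spark.jars" on the builder and get a session with it.
spark = (
    SparkSession.builder
    .config("spark.jars", "/dbfs/FileStore/jars/my-custom-lib.jar")
    .getOrCreate()
)

# In the notebook, getOrCreate() just returns the already-running
# session, and even after stopping it and re-creating one, the jar
# still doesn't seem to be picked up by the cluster.
```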