<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic How can I add jars (&quot;spark.jars&quot;) to pyspark notebook? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-can-i-add-jars-quot-spark-jars-quot-to-pyspark-notebook/m-p/27683#M19544</link>
    <description>&lt;P&gt;I want to add a few custom jars to the Spark conf. Typically they would be submitted along with the spark-submit command, but in a Databricks notebook the Spark session is already initialized. So I want to set the jars via the "spark.jars" property in the conf. Even when I create a new session with the new conf, it does not seem to pick up the jars. Is there a better way to add jars?&lt;/P&gt;</description>
    <pubDate>Mon, 14 Oct 2019 19:29:00 GMT</pubDate>
    <dc:creator>dbansal</dc:creator>
    <dc:date>2019-10-14T19:29:00Z</dc:date>
    <item>
      <title>How can I add jars ("spark.jars") to pyspark notebook?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-add-jars-quot-spark-jars-quot-to-pyspark-notebook/m-p/27683#M19544</link>
      <description>&lt;P&gt;I want to add a few custom jars to the Spark conf. Typically they would be submitted along with the spark-submit command, but in a Databricks notebook the Spark session is already initialized. So I want to set the jars via the "spark.jars" property in the conf. Even when I create a new session with the new conf, it does not seem to pick up the jars. Is there a better way to add jars?&lt;/P&gt;</description>
      <pubDate>Mon, 14 Oct 2019 19:29:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-add-jars-quot-spark-jars-quot-to-pyspark-notebook/m-p/27683#M19544</guid>
      <dc:creator>dbansal</dc:creator>
      <dc:date>2019-10-14T19:29:00Z</dc:date>
    </item>
    <item>
      <title>Re: How can I add jars ("spark.jars") to pyspark notebook?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-add-jars-quot-spark-jars-quot-to-pyspark-notebook/m-p/27684#M19545</link>
      <description>&lt;P&gt;Hi @dbansal, install the libraries/jars while initializing the cluster.&lt;/P&gt;&lt;P&gt;Please go through the documentation on this below:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/libraries.html#upload-a-jar-python-egg-or-python-wheel" target="_blank"&gt;https://docs.databricks.com/libraries.html#upload-a-jar-python-egg-or-python-wheel&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 15 Oct 2019 06:05:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-add-jars-quot-spark-jars-quot-to-pyspark-notebook/m-p/27684#M19545</guid>
      <dc:creator>shyam_9</dc:creator>
      <dc:date>2019-10-15T06:05:00Z</dc:date>
    </item>
  </channel>
</rss>