<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Run spark code in notebook by setting spark conf instead of databricks connect configure in runtime in Get Started Discussions</title>
    <link>https://community.databricks.com/t5/get-started-discussions/run-spark-code-in-notebook-by-setting-spark-conf-instead-of/m-p/62149#M2776</link>
    <description>&lt;P&gt;Hi community,&amp;nbsp;&lt;/P&gt;&lt;P&gt;I wanted to understand whether there is a way to pass config values to the Spark session at runtime, rather than using databricks-connect configure, to run Spark code.&amp;nbsp;&lt;/P&gt;&lt;P&gt;One way I found is given here: &lt;A href="https://stackoverflow.com/questions/63088121/configuring-databricks-connect-using-python-os-module" target="_blank"&gt;https://stackoverflow.com/questions/63088121/configuring-databricks-connect-using-python-os-module&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The other way was running code like:&amp;nbsp;SparkSession.builder.appName('NewSpark').getOrCreate(), and then setting the Spark conf credentials, e.g.:&amp;nbsp;&lt;BR /&gt;spark.conf.set("spark.databricks.service.token", "&amp;lt;token&amp;gt;")&lt;BR /&gt;spark.conf.set("spark.databricks.service.address", "&amp;lt;address&amp;gt;"), etc.&amp;nbsp;&lt;/P&gt;&lt;P&gt;But the above approach gives me an error:&amp;nbsp;Caused by: java.lang.RuntimeException: Config file /home/ec2user/.databricks-connect not found. Please run `databricks-connect configure` to accept the end user license agreement and configure Databricks Connect.&lt;/P&gt;&lt;P&gt;Is there a way to run Spark code via the Spark conf settings above without creating/populating the .databricks-connect config file?&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Tue, 27 Feb 2024 19:17:46 GMT</pubDate>
    <dc:creator>Surajv</dc:creator>
    <dc:date>2024-02-27T19:17:46Z</dc:date>
    <item>
      <title>Run spark code in notebook by setting spark conf instead of databricks connect configure in runtime</title>
      <link>https://community.databricks.com/t5/get-started-discussions/run-spark-code-in-notebook-by-setting-spark-conf-instead-of/m-p/62149#M2776</link>
      <description>&lt;P&gt;Hi community,&amp;nbsp;&lt;/P&gt;&lt;P&gt;I wanted to understand whether there is a way to pass config values to the Spark session at runtime, rather than using databricks-connect configure, to run Spark code.&amp;nbsp;&lt;/P&gt;&lt;P&gt;One way I found is given here: &lt;A href="https://stackoverflow.com/questions/63088121/configuring-databricks-connect-using-python-os-module" target="_blank"&gt;https://stackoverflow.com/questions/63088121/configuring-databricks-connect-using-python-os-module&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The other way was running code like:&amp;nbsp;SparkSession.builder.appName('NewSpark').getOrCreate(), and then setting the Spark conf credentials, e.g.:&amp;nbsp;&lt;BR /&gt;spark.conf.set("spark.databricks.service.token", "&amp;lt;token&amp;gt;")&lt;BR /&gt;spark.conf.set("spark.databricks.service.address", "&amp;lt;address&amp;gt;"), etc.&amp;nbsp;&lt;/P&gt;&lt;P&gt;But the above approach gives me an error:&amp;nbsp;Caused by: java.lang.RuntimeException: Config file /home/ec2user/.databricks-connect not found. Please run `databricks-connect configure` to accept the end user license agreement and configure Databricks Connect.&lt;/P&gt;&lt;P&gt;Is there a way to run Spark code via the Spark conf settings above without creating/populating the .databricks-connect config file?&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 27 Feb 2024 19:17:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/run-spark-code-in-notebook-by-setting-spark-conf-instead-of/m-p/62149#M2776</guid>
      <dc:creator>Surajv</dc:creator>
      <dc:date>2024-02-27T19:17:46Z</dc:date>
    </item>
  </channel>
</rss>