<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to properly import spark functions? in Get Started Discussions</title>
    <link>https://community.databricks.com/t5/get-started-discussions/how-to-properly-import-spark-functions/m-p/39871#M5638</link>
    <description>&lt;P&gt;Thanks for your reply.&lt;/P&gt;&lt;P&gt;I have redefined the function so that the SparkSession is passed in explicitly:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;def get_info(spark: SparkSession):
    return spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")&lt;/LI-CODE&gt;&lt;P&gt;After implementing this change, it works.&lt;/P&gt;&lt;P&gt;Thank you for the explanation and the suggested approach.&lt;/P&gt;&lt;P&gt;Best,&lt;/P&gt;</description>
    <pubDate>Mon, 14 Aug 2023 17:46:28 GMT</pubDate>
    <dc:creator>daniel23</dc:creator>
    <dc:date>2023-08-14T17:46:28Z</dc:date>
    <item>
      <title>How to properly import spark functions?</title>
      <link>https://community.databricks.com/t5/get-started-discussions/how-to-properly-import-spark-functions/m-p/39500#M5636</link>
      <description>&lt;P&gt;I have the following command that runs in my Databricks notebook.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")&lt;/LI-CODE&gt;&lt;P&gt;I have wrapped this command in a function (simplified):&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;def get_info():
    return spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")&lt;/LI-CODE&gt;&lt;P&gt;I have then added this function to a .py module, which I install as a private package in the environment of my workspace. I am able to import this function and call it.&lt;/P&gt;&lt;P&gt;However, when I run the function, I receive an error message:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;get_info()
&amp;gt;&amp;gt;&amp;gt; NameError: name 'spark' is not defined&lt;/LI-CODE&gt;&lt;P&gt;If I define the same function in the body of the notebook, I can run it without problems.&lt;/P&gt;&lt;P&gt;- Why does moving this function to a separate module force me to import spark? What is the proper way of creating a separate module with Spark functions, and how do I import them?&lt;/P&gt;&lt;P&gt;- If possible: what is happening under the hood that makes the function work when I define it in the notebook, but fail when I import it?&lt;/P&gt;</description>
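      The failure above can be reproduced in plain Python, with no Spark involved: a function resolves global names in the module where it was defined, not in the caller's namespace. Databricks notebooks inject `spark` into the notebook's own globals, which an imported module never sees. A minimal sketch (the module name `mylib` is a hypothetical stand-in for the private package):

```python
# A function defined inside a module looks up globals in that module's
# namespace, not in the namespace of whoever calls it.
import types

mylib = types.ModuleType("mylib")
exec("def get_info():\n    return spark", mylib.__dict__)

spark = "session"  # lives in this namespace, like the notebook-injected session

error = None
try:
    mylib.get_info()  # 'spark' is not in mylib's globals
except NameError as e:
    error = e

print(error)  # name 'spark' is not defined
```

      This is why the same function body works when defined in the notebook: there, the function's globals are the notebook's globals, where `spark` already exists.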
      <pubDate>Thu, 10 Aug 2023 08:06:23 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/how-to-properly-import-spark-functions/m-p/39500#M5636</guid>
      <dc:creator>daniel23</dc:creator>
      <dc:date>2023-08-10T08:06:23Z</dc:date>
    </item>
    <item>
      <title>Re: How to properly import spark functions?</title>
      <link>https://community.databricks.com/t5/get-started-discussions/how-to-properly-import-spark-functions/m-p/39871#M5638</link>
      <description>&lt;P&gt;Thanks for your reply.&lt;/P&gt;&lt;P&gt;I have redefined the function so that the SparkSession is passed in explicitly:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;def get_info(spark: SparkSession):
    return spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")&lt;/LI-CODE&gt;&lt;P&gt;After implementing this change, it works.&lt;/P&gt;&lt;P&gt;Thank you for the explanation and the suggested approach.&lt;/P&gt;&lt;P&gt;Best,&lt;/P&gt;</description>
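      The pattern from the reply, sketched end to end in plain Python: the session is an explicit argument, so the function no longer depends on a notebook-injected global. `_FakeSession` and the `demo-rg` value are hypothetical stand-ins so the sketch runs without a cluster; on Databricks the real notebook `spark` object would be passed instead.

```python
# Stand-in objects that mimic the SparkSession.conf.get interface.
class _FakeConf:
    def __init__(self, values):
        self._values = values

    def get(self, key):
        return self._values[key]

class _FakeSession:
    def __init__(self, values):
        self.conf = _FakeConf(values)

def get_info(spark):
    """Read a cluster tag from whatever session-like object is passed in."""
    return spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup")

session = _FakeSession(
    {"spark.databricks.clusterUsageTags.managedResourceGroup": "demo-rg"}
)
print(get_info(session))  # demo-rg
```

      Passing the session in also makes the module testable outside Databricks, since any object exposing `conf.get` can be substituted.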
      <pubDate>Mon, 14 Aug 2023 17:46:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/get-started-discussions/how-to-properly-import-spark-functions/m-p/39871#M5638</guid>
      <dc:creator>daniel23</dc:creator>
      <dc:date>2023-08-14T17:46:28Z</dc:date>
    </item>
  </channel>
</rss>

