<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to use Databricks Unity Catalog as metastore for a local spark session in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101444#M40668</link>
    <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/135258"&gt;@furkancelik&lt;/a&gt;&amp;nbsp;Glad it helps.&lt;/P&gt;
&lt;P&gt;I just found &lt;A href="https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-catalog-assets/ba-p/97533" target="_self"&gt;this article&lt;/A&gt;, which I believe will clarify many of your doubts. Go straight to the "Accessing Databricks UC from the PySpark shell" section. Note that the &lt;STRONG&gt;"unity"&lt;/STRONG&gt; in the configuration strings will be your UC default catalog name.&lt;/P&gt;
&lt;P&gt;To answer your question, the UNITY_CATALOG_ENDPOINT should be in the format:&lt;/P&gt;
&lt;P&gt;https://&amp;lt;account-name&amp;gt;.cloud.databricks.com/api/2.1/unity-catalog&lt;/P&gt;
&lt;P&gt;You do not need to create an endpoint in Unity Catalog settings; there is one available by default that you can try out first.&lt;/P&gt;</description>
    <pubDate>Mon, 09 Dec 2024 09:22:27 GMT</pubDate>
    <dc:creator>VZLA</dc:creator>
    <dc:date>2024-12-09T09:22:27Z</dc:date>
    <item>
      <title>How to use Databricks Unity Catalog as metastore for a local spark session</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101176#M40576</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I would like to access Databricks Unity Catalog from a Spark session created outside the Databricks environment. Previously, I used the Hive metastore and didn’t face any issues connecting this way. Now I’ve switched the metastore to Unity Catalog and want to connect a local Spark session to it in the same way.&lt;/P&gt;&lt;P&gt;The Unity Catalog documentation includes some guidance on this, and the following configuration was &lt;A href="https://docs.unitycatalog.io/integrations/unity-catalog-spark/#__tabbed_2_2" target="_self"&gt;shared&lt;/A&gt;:&lt;/P&gt;&lt;LI-CODE lang="bash"&gt;bin/pyspark --name "local-uc-test" \
 --master "local[*]" \
 --packages "io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \
 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
 --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \
 --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \
 --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \
 --conf "spark.sql.catalog.unity.token=" \
 --conf "spark.sql.defaultCatalog=unity"&lt;/LI-CODE&gt;&lt;P&gt;However, I’m not sure how to adapt this configuration for Databricks Unity Catalog. I would appreciate your assistance on this matter.&lt;/P&gt;</description>
      <pubDate>Fri, 06 Dec 2024 07:42:06 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101176#M40576</guid>
      <dc:creator>furkancelik</dc:creator>
      <dc:date>2024-12-06T07:42:06Z</dc:date>
    </item>
    <item>
      <title>Re: How to use Databricks Unity Catalog as metastore for a local spark session</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101258#M40599</link>
      <description>&lt;P&gt;To connect to Databricks Unity Catalog from a local Spark session, you'll need to configure your Spark session with the appropriate dependencies, extensions, and authentication. Here's a general setup:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Install the required libraries, such as io.delta:delta-spark_2.12:3.2.1 and io.unitycatalog:unitycatalog-spark_2.12:0.2.0. You'll need to check for compatible versions; try looking in your cluster's /databricks/jars directory or the driver's classpath for matching library versions.&lt;/LI&gt;
&lt;LI&gt;Configure your Spark session with the following settings (note that spark.sql.catalog.unity must be registered as a UCSingleCatalog, as in your original configuration, for the unity.uri and unity.token settings to take effect):&lt;BR /&gt;--name "local-uc-test"&lt;BR /&gt;--master "local[*]"&lt;BR /&gt;--packages "io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0"&lt;BR /&gt;--conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension"&lt;BR /&gt;--conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog"&lt;BR /&gt;--conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog"&lt;BR /&gt;--conf "spark.sql.catalog.unity.uri=&amp;lt;YOUR_UNITY_CATALOG_ENDPOINT&amp;gt;"&lt;BR /&gt;--conf "spark.sql.catalog.unity.token=&amp;lt;YOUR_ACCESS_TOKEN&amp;gt;"&lt;BR /&gt;--conf "spark.sql.defaultCatalog=unity"&lt;/LI&gt;
&lt;LI&gt;For authentication, use a valid personal access token for spark.sql.catalog.unity.token. You can generate one in your Databricks workspace with appropriate permissions (Settings -&amp;gt; Developer -&amp;gt; Generate New Token).&lt;/LI&gt;
&lt;LI&gt;Once the Spark session starts, validate the setup by running SHOW CATALOGS in a Spark SQL query to confirm access to Unity Catalog (a Python sketch of the full session setup follows this list).&lt;/LI&gt;
&lt;/OL&gt;
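&lt;P&gt;For reference, below is a minimal sketch of the same setup built directly with SparkSession.builder in Python. The endpoint and token values are placeholders you must replace with your own, and the package versions are the ones from step 1, so adjust them to match your environment:&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;from pyspark.sql import SparkSession

# Placeholders -- replace with your workspace endpoint and personal access token.
UC_ENDPOINT = "https://&amp;lt;account-name&amp;gt;.cloud.databricks.com/api/2.1/unity-catalog"
UC_TOKEN = "&amp;lt;YOUR_ACCESS_TOKEN&amp;gt;"

spark = (
    SparkSession.builder.appName("local-uc-test")
    .master("local[*]")
    # Pull the Delta and Unity Catalog connector jars; match versions to your cluster.
    .config("spark.jars.packages",
            "io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "io.unitycatalog.spark.UCSingleCatalog")
    # Register a catalog named "unity" backed by the Unity Catalog endpoint.
    .config("spark.sql.catalog.unity", "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.unity.uri", UC_ENDPOINT)
    .config("spark.sql.catalog.unity.token", UC_TOKEN)
    .config("spark.sql.defaultCatalog", "unity")
    .getOrCreate()
)

# Validate the setup: confirm the session can see Unity Catalog.
spark.sql("SHOW CATALOGS").show()&lt;/LI-CODE&gt;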
&lt;P&gt;If any issues arise, ensure network connectivity to the Unity Catalog endpoint and verify that your environment has the necessary configurations.&lt;/P&gt;</description>
      <pubDate>Fri, 06 Dec 2024 16:16:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101258#M40599</guid>
      <dc:creator>VZLA</dc:creator>
      <dc:date>2024-12-06T16:16:28Z</dc:date>
    </item>
    <item>
      <title>Re: How to use Databricks Unity Catalog as metastore for a local spark session</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101295#M40619</link>
      <description>&lt;P&gt;Thank you for replying to my question&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/34618"&gt;@VZLA&lt;/a&gt;&amp;nbsp;; it is super helpful. I have a small question:&lt;BR /&gt;Is the UNITY_CATALOG_ENDPOINT something like the below, or should I create an endpoint in UC settings?&lt;/P&gt;&lt;LI-CODE lang="text"&gt;https://&amp;lt;account-name&amp;gt;.cloud.databricks.com/api/2.1/unity-catalog&lt;/LI-CODE&gt;</description>
      <pubDate>Sat, 07 Dec 2024 00:09:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101295#M40619</guid>
      <dc:creator>furkancelik</dc:creator>
      <dc:date>2024-12-07T00:09:37Z</dc:date>
    </item>
    <item>
      <title>Re: How to use Databricks Unity Catalog as metastore for a local spark session</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101444#M40668</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/135258"&gt;@furkancelik&lt;/a&gt;&amp;nbsp;Glad it helps.&lt;/P&gt;
&lt;P&gt;I just found &lt;A href="https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-catalog-assets/ba-p/97533" target="_self"&gt;this article&lt;/A&gt;, which I believe will clarify many of your doubts. Go straight to the "Accessing Databricks UC from the PySpark shell" section. Note that the &lt;STRONG&gt;"unity"&lt;/STRONG&gt; in the configuration strings will be your UC default catalog name.&lt;/P&gt;
&lt;P&gt;To answer your question, the UNITY_CATALOG_ENDPOINT should be in the format:&lt;/P&gt;
&lt;P&gt;https://&amp;lt;account-name&amp;gt;.cloud.databricks.com/api/2.1/unity-catalog&lt;/P&gt;
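&lt;P&gt;If helpful, here is a minimal sketch for sanity-checking the endpoint and token before pointing Spark at it, using the requests library against the catalog-list route (the host and token below are placeholders):&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;import requests

# Placeholders -- replace with your workspace host and a valid personal access token.
ENDPOINT = "https://&amp;lt;account-name&amp;gt;.cloud.databricks.com/api/2.1/unity-catalog"
TOKEN = "&amp;lt;YOUR_ACCESS_TOKEN&amp;gt;"

# List catalogs; a 200 response with a "catalogs" array means endpoint and token work.
resp = requests.get(f"{ENDPOINT}/catalogs",
                    headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
for catalog in resp.json().get("catalogs", []):
    print(catalog["name"])&lt;/LI-CODE&gt;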
&lt;P&gt;You do not need to create an endpoint in Unity Catalog settings; there is one available by default that you can try out first.&lt;/P&gt;</description>
      <pubDate>Mon, 09 Dec 2024 09:22:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-databricks-unity-catalog-as-metastore-for-a-local/m-p/101444#M40668</guid>
      <dc:creator>VZLA</dc:creator>
      <dc:date>2024-12-09T09:22:27Z</dc:date>
    </item>
  </channel>
</rss>

