<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Missing Delta-live-Table in hive-metastore catalog in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/missing-delta-live-table-in-hive-metastore-catalog/m-p/110619#M43626</link>
    <description>&lt;P&gt;A Delta Live Tables pipeline runs successfully and writes files to an external ADLS2 location, but the resulting table does not appear under hive_metastore in the catalog.&lt;/P&gt;</description>
    <pubDate>Wed, 19 Feb 2025 15:35:01 GMT</pubDate>
    <dc:creator>BobCat62</dc:creator>
    <dc:date>2025-02-19T15:35:01Z</dc:date>
    <item>
      <title>Missing Delta-live-Table in hive-metastore catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/missing-delta-live-table-in-hive-metastore-catalog/m-p/110619#M43626</link>
      <description>&lt;P&gt;Hi experts,&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I defined my Delta table in an external location as follows:&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE&gt;&lt;SPAN&gt;%sql&lt;BR /&gt;CREATE OR REFRESH STREAMING TABLE pumpdata (&lt;BR /&gt;  Body string,&lt;BR /&gt;  EnqueuedTimeUtc string,&lt;BR /&gt;  SystemProperties string,&lt;BR /&gt;  _rescued_data string,&lt;BR /&gt;  Properties string&lt;BR /&gt;)&lt;BR /&gt;USING DELTA&lt;BR /&gt;LOCATION 'abfss://mdwh@XXXX.dfs.core.windows.net/Bronze/pumpdata'&lt;/SPAN&gt;&lt;/PRE&gt;&lt;P&gt;I have a Delta Live Tables pipeline with these settings. As you can see, I have defined the same external location and set Hive metastore as the storage option:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Bild1.png" style="width: 801px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14949i91EC24BB62A2560E/image-size/large?v=v2&amp;amp;px=999" role="button" title="Bild1.png" alt="Bild1.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;and this definition:&lt;/SPAN&gt;&lt;/P&gt;&lt;PRE&gt;&lt;SPAN&gt;import dlt&lt;BR /&gt;from pyspark.sql.functions import col&lt;BR /&gt;&lt;BR /&gt;json_path = "abfss://schachtwasser@XXXX.dfs.core.windows.net/XXXX/*/*/*/*/*.JSON"&lt;BR /&gt;&lt;BR /&gt;@dlt.create_table(&lt;BR /&gt;    name="pumpdata",&lt;BR /&gt;    table_properties={"quality": "raw"},&lt;BR /&gt;    comment="Data ingested from an ADLS2 storage account."&lt;BR /&gt;)&lt;BR /&gt;def pumpdata():&lt;BR /&gt;    return (&lt;BR /&gt;        spark.readStream.format("cloudFiles")&lt;BR /&gt;        .option("cloudFiles.format", "JSON")&lt;BR /&gt;        .load(json_path)&lt;BR /&gt;    )&lt;/SPAN&gt;&lt;/PRE&gt;&lt;P&gt;&lt;SPAN&gt;I can successfully run my DLT pipeline, and Parquet files are written to the storage account, but I cannot see my table in the catalog under hive_metastore:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Bild2.png" style="width: 392px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14951iA5F19FFD260A51A5/image-size/large?v=v2&amp;amp;px=999" role="button" title="Bild2.png" alt="Bild2.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Bild3.png" style="width: 611px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14952i64F1341A6F1BAA72/image-size/large?v=v2&amp;amp;px=999" role="button" title="Bild3.png" alt="Bild3.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Bild4.png" style="width: 283px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/14950i93CA1988E3D0A6E5/image-size/large?v=v2&amp;amp;px=999" role="button" title="Bild4.png" alt="Bild4.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 19 Feb 2025 15:35:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/missing-delta-live-table-in-hive-metastore-catalog/m-p/110619#M43626</guid>
      <dc:creator>BobCat62</dc:creator>
      <dc:date>2025-02-19T15:35:01Z</dc:date>
    </item>
    <item>
      <title>Re: Missing Delta-live-Table in hive-metastore catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/missing-delta-live-table-in-hive-metastore-catalog/m-p/110932#M43743</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/66116"&gt;@BobCat62&lt;/a&gt;, try these steps:&lt;/P&gt;&lt;P&gt;1. Manually register the table in hive_metastore. Run this in a Databricks notebook:&lt;/P&gt;&lt;PRE&gt;CREATE TABLE hive_metastore.default.pumpdata&lt;BR /&gt;USING DELTA&lt;BR /&gt;LOCATION 'abfss://mdwh@XXXX.dfs.core.windows.net/Bronze/pumpdata/tables/pumpdata';&lt;/PRE&gt;&lt;P&gt;2. Then verify the table by running:&lt;/P&gt;&lt;PRE&gt;SHOW TABLES IN hive_metastore.default&lt;/PRE&gt;&lt;P&gt;This should register the table under hive_metastore.&lt;/P&gt;</description>
      <pubDate>Fri, 21 Feb 2025 21:37:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/missing-delta-live-table-in-hive-metastore-catalog/m-p/110932#M43743</guid>
      <dc:creator>KaranamS</dc:creator>
      <dc:date>2025-02-21T21:37:37Z</dc:date>
    </item>
    <item>
      <title>Re: Missing Delta-live-Table in hive-metastore catalog</title>
      <link>https://community.databricks.com/t5/data-engineering/missing-delta-live-table-in-hive-metastore-catalog/m-p/110948#M43747</link>
      <description>&lt;P&gt;Hey &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/66116"&gt;@BobCat62&lt;/a&gt;, this might help:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ashraf1395_0-1740203798056.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/15034i5D3A842D1DFAB800/image-size/medium?v=v2&amp;amp;px=400" role="button" title="ashraf1395_0-1740203798056.png" alt="ashraf1395_0-1740203798056.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;DLT pipelines use direct publishing mode by default. If you select hive_metastore, you must specify the default schema in the DLT pipeline settings. If you don't set it there, pass a fully qualified name (schema_name.pumpdata) at the time of defining the DLT table, for example default.pumpdata, which stores the table in the default schema of hive_metastore.&lt;/P&gt;</description>
      <pubDate>Sat, 22 Feb 2025 05:59:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/missing-delta-live-table-in-hive-metastore-catalog/m-p/110948#M43747</guid>
      <dc:creator>ashraf1395</dc:creator>
      <dc:date>2025-02-22T05:59:37Z</dc:date>
    </item>
  </channel>
</rss>

