<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: unable to open external table created under hive_metastore (Data view) in azure . in Data Governance</title>
    <link>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10764#M441</link>
    <description>&lt;P&gt;@karthik p​&amp;nbsp;Unity Catalog is supported on DBR 11.2 and above. If you create Hive external tables on these DBR versions, the storage credential is used by default for authentication to the storage account. You can create a storage credential and an external location, then create the external table with the three-level namespace notation `hive_metastore`.`schema`.`table`. If you want to configure storage access through Spark configuration instead, use DBR 11.1 or lower for your use case.&lt;/P&gt;</description>
    <pubDate>Sat, 28 Jan 2023 16:50:52 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2023-01-28T16:50:52Z</dc:date>
    <item>
      <title>unable to open external table created under hive_metastore (Data view) in azure .</title>
      <link>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10761#M438</link>
      <description>&lt;P&gt;We have enabled Unity Catalog in Azure and have a requirement to create an external table in hive_metastore. We configured ADLS Gen2 and created the external table using the different access methods ADLS Gen2 supports. We can view the data when we run a SELECT query from a notebook, but in the Data view under the hive_metastore catalog we can only see the table name; when we select the table, we get the error below.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;ADLS Gen2 access config:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;service_credential = dbutils.secrets.get(scope="&amp;lt;scope&amp;gt;",key="&amp;lt;service-credential-key&amp;gt;")&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;spark.conf.set("fs.azure.account.auth.type.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "OAuth")&lt;/P&gt;&lt;P&gt;spark.conf.set("fs.azure.account.oauth.provider.type.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")&lt;/P&gt;&lt;P&gt;spark.conf.set("fs.azure.account.oauth2.client.id.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "&amp;lt;application-id&amp;gt;")&lt;/P&gt;&lt;P&gt;spark.conf.set("fs.azure.account.oauth2.client.secret.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", service_credential)&lt;/P&gt;&lt;P&gt;spark.conf.set("fs.azure.account.oauth2.client.endpoint.&amp;lt;storage-account&amp;gt;.dfs.core.windows.net", "https://login.microsoftonline.com/&amp;lt;directory-id&amp;gt;/oauth2/token")&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;"Failure to initialize configuration"&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;We have configured the data access configuration settings and added the service credential. We also tried adding "spark.hadoop.fs.azure.account.key.hiveexternalstore.blob.core.windows.net &amp;lt;Key&amp;gt;", but we are still unable to view the table.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Jan 2023 17:29:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10761#M438</guid>
      <dc:creator>karthik_p</dc:creator>
      <dc:date>2023-01-24T17:29:21Z</dc:date>
    </item>
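The OAuth service-principal setup quoted in the question can be sketched as a small helper that builds the Spark configuration key/value pairs; `mysa`, the application id, and the directory id in the usage note are placeholder values, and in a real notebook each pair would be applied with `spark.conf.set(key, value)`.

```python
def adls_oauth_conf(storage_account: str, application_id: str,
                    service_credential: str, directory_id: str) -> dict:
    """Build the Spark conf entries for OAuth access to an ADLS Gen2 account.

    Sketch only: in a notebook, apply each pair with spark.conf.set(k, v).
    All argument values are placeholders supplied by the caller.
    """
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": application_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": service_credential,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{directory_id}/oauth2/token",
    }
```

Keeping the keys in one place makes it easy to see that all five settings target the same `<storage-account>.dfs.core.windows.net` suffix, which is a common place for the "Failure to initialize configuration" error to creep in.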
    <item>
      <title>Re: unable to open external table created under hive_metastore (Data view) in azure .</title>
      <link>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10762#M439</link>
      <description>&lt;P&gt;Hi, Unity Catalog and the Hive metastore are two different things. If the cluster is Unity Catalog enabled (and especially a shared one), it doesn't support the above configuration. The best approach is probably to register a storage credential and an external location in Unity Catalog and then register the external table.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Jan 2023 18:23:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10762#M439</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2023-01-24T18:23:11Z</dc:date>
    </item>
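The approach suggested in this reply can be sketched as the two SQL statements a notebook would run via `spark.sql`; the location name `my_ext_loc` and credential name `my_storage_cred` are hypothetical, and the sketch assumes the storage credential was already created (for example through the Catalog Explorer UI).

```python
# Sketch of the suggested flow: register an external location backed by an
# existing Unity Catalog storage credential, then create the external table.
# my_ext_loc and my_storage_cred are hypothetical names; the URL is taken
# from the thread.
url = "abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables"

statements = [
    # Assumes a storage credential named my_storage_cred already exists.
    f"CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_loc "
    f"URL '{url}' WITH (STORAGE CREDENTIAL my_storage_cred)",
    # A table with an explicit LOCATION is an external table.
    f"CREATE TABLE IF NOT EXISTS hive_metastore.test.hiveextmount "
    f"USING DELTA LOCATION '{url}'",
]

# In a notebook: for stmt in statements: spark.sql(stmt)
```

Once the external location covers the path, the table is readable from the Data view without per-cluster access keys, which is the behaviour the original poster was missing.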
    <item>
      <title>Re: unable to open external table created under hive_metastore (Data view) in azure .</title>
      <link>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10763#M440</link>
      <description>&lt;P&gt;@Hubert Dudek​&amp;nbsp;Yes, you are right. When we enable UC and create the external table using an external location, we can view it with a SELECT statement from a notebook (we need to run the access-key code first), but when we add the access key to the cluster or SQL warehouse and go to Data --&amp;gt; hive_metastore, we can see the table; when we click on the table we get the issue.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Please find the query below:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;CREATE EXTERNAL TABLE test.hiveextmount&lt;/P&gt;&lt;P&gt;USING delta&lt;/P&gt;&lt;P&gt;LOCATION 'abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables'&lt;/P&gt;&lt;P&gt;SELECT * from CSV.`abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables`;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;As per Databricks, even with UC enabled, if we don't specify a catalog we should be able to create external/managed tables in hive_metastore, which is the default metastore. We can view the old tables we created via mounts when we were on the wasbs scheme. After changing wasbs to abfss and creating the table with the above query, we started seeing the issue.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If we create the table under a new catalog with a three-level namespace, that works fine, but as a backup we are also checking hive_metastore, since UC has a lot of limitations.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Jan 2023 23:04:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10763#M440</guid>
      <dc:creator>karthik_p</dc:creator>
      <dc:date>2023-01-24T23:04:36Z</dc:date>
    </item>
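The query quoted in this post has two likely issues: a CTAS needs the `AS` keyword before `SELECT`, and with an explicit `LOCATION` the `EXTERNAL` keyword is redundant on Databricks. A corrected sketch, keeping the table and path names from the post (note the post uses the same path for the CSV source and the Delta target, which would normally be two different directories):

```python
# Corrected form of the statement from the post above: CTAS requires AS,
# and CREATE TABLE with LOCATION is already an external table.
# Paths are copied verbatim from the post; in practice the CSV source and
# the Delta target should be distinct directories.
create_stmt = """
CREATE TABLE test.hiveextmount
USING DELTA
LOCATION 'abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables'
AS SELECT * FROM csv.`abfss://testcontainer@testexternalstore.dfs.core.windows.net/tables`
""".strip()

# In a notebook: spark.sql(create_stmt)
```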
    <item>
      <title>Re: unable to open external table created under hive_metastore (Data view) in azure .</title>
      <link>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10764#M441</link>
      <description>&lt;P&gt;@karthik p​&amp;nbsp;Unity Catalog is supported on DBR 11.2 and above. If you create Hive external tables on these DBR versions, the storage credential is used by default for authentication to the storage account. You can create a storage credential and an external location, then create the external table with the three-level namespace notation `hive_metastore`.`schema`.`table`. If you want to configure storage access through Spark configuration instead, use DBR 11.1 or lower for your use case.&lt;/P&gt;</description>
      <pubDate>Sat, 28 Jan 2023 16:50:52 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unable-to-open-external-table-created-under-hive-metastore-data/m-p/10764#M441</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2023-01-28T16:50:52Z</dc:date>
    </item>
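The three-level-namespace DDL the accepted answer describes can be sketched as a small formatter; all names below are placeholders, and on DBR 11.2+ with Unity Catalog the external location's storage credential handles authentication, so no `spark.conf` access keys are needed.

```python
def external_table_ddl(catalog: str, schema: str, table: str, url: str) -> str:
    """Build a three-level-namespace external-table DDL string.

    Sketch only; on DBR 11.2+ the storage credential behind the external
    location covering `url` is used for auth, not cluster Spark conf keys.
    """
    return (f"CREATE TABLE `{catalog}`.`{schema}`.`{table}` "
            f"USING DELTA LOCATION '{url}'")
```

For the thread's case the catalog would be `hive_metastore`, giving a statement that the Data view can open without per-cluster access keys.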
  </channel>
</rss>

