<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: unity catalog external migration and downtime in Data Governance</title>
    <link>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16676#M592</link>
    <description>&lt;P&gt;You can check with the Databricks team on this. Send an email to help@databricks.com and they will help you.&lt;/P&gt;</description>
    <pubDate>Fri, 16 Dec 2022 15:53:15 GMT</pubDate>
    <dc:creator>Aviral-Bhardwaj</dc:creator>
    <dc:date>2022-12-16T15:53:15Z</dc:date>
    <item>
      <title>unity catalog external migration and downtime</title>
      <link>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16675#M591</link>
      <description>&lt;P&gt;Hi Team,&lt;/P&gt;&lt;P&gt;After implementing Unity Catalog and starting to migrate external tables from the legacy Hive metastore to Unity Catalog, I am seeing in articles that we need to change our workloads to use the three-level namespace.&lt;/P&gt;&lt;P&gt;E.g.: if I have 50 notebooks that use the legacy Hive metastore, there is usually no three-level namespace and we can run the notebooks with standard Spark queries. After migrating to Unity Catalog, if I need to run those notebooks and I am not setting a default hive_metastore config at the Spark level, do I need to manually change them to use the catalog followed by &lt;B&gt;schema.tablename&lt;/B&gt; in the Spark/Python code for the notebooks to work without issues?&lt;/P&gt;&lt;P&gt;Note: I have seen an article saying we can use both the legacy metastore and Unity Catalog side by side until we are confident the functionality matches our requirements, and then drop the legacy tables. If we drop the legacy tables, do we need to change all notebooks to use a catalog and the three-level namespace?&lt;/P&gt;&lt;P&gt;Do we have a backup mechanism in case we want to revert to the legacy metastore? Is that an INFORMATION_SCHEMA backup or some other mechanism?&lt;/P&gt;</description>
      <pubDate>Thu, 15 Dec 2022 17:19:55 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16675#M591</guid>
      <dc:creator>karthik_p</dc:creator>
      <dc:date>2022-12-15T17:19:55Z</dc:date>
    </item>
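    <!--
      A minimal sketch of the namespace change the question above asks about, not a definitive migration recipe.
      It assumes a Databricks notebook on a Unity Catalog enabled cluster where `spark` is predefined, and the
      catalog name `main` plus the schema/table names are hypothetical placeholders. The legacy metastore stays
      reachable as the hive_metastore catalog, and USE CATALOG can keep existing two-level references working
      without editing every query.

      # Before Unity Catalog: two-level names resolve against the legacy Hive metastore.
      legacy_df = spark.table("sales_db.orders")            # hive_metastore.sales_db.orders

      # After migration, option 1: fully qualify with the three-level namespace.
      uc_df = spark.table("main.sales_db.orders")

      # Option 2: set the session default catalog so existing two-level references
      # keep resolving, now against Unity Catalog instead of hive_metastore.
      spark.sql("USE CATALOG main")
      same_df = spark.table("sales_db.orders")              # now resolves to main.sales_db.orders

      # The legacy metastore remains addressable side by side during the transition.
      old_df = spark.table("hive_metastore.sales_db.orders")
    -->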
    <item>
      <title>Re: unity catalog external migration and downtime</title>
      <link>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16676#M592</link>
      <description>&lt;P&gt;You can check with the Databricks team on this. Send an email to help@databricks.com and they will help you.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Dec 2022 15:53:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16676#M592</guid>
      <dc:creator>Aviral-Bhardwaj</dc:creator>
      <dc:date>2022-12-16T15:53:15Z</dc:date>
    </item>
    <item>
      <title>Re: unity catalog external migration and downtime</title>
      <link>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16677#M593</link>
      <description>&lt;P&gt;@Aviral Bhardwaj&amp;nbsp;that will be normal priority and it takes a lot of time to get a response.&lt;/P&gt;&lt;P&gt;Team, we are looking into the possibility of downtime and how to handle or change workflows and notebooks to use the Unity Catalog three-level namespace. Is there any utility we can run directly that takes care of modifying all notebooks globally?&lt;/P&gt;&lt;P&gt;E.g.: if I have 15 notebooks and multiple databases, do I need to manually change the notebooks to use Unity Catalog and its databases with the three-level namespace? Also, if I have a few jobs that use external mount points, do I need to modify them to use Unity Catalog, or is that taken care of automatically once the external tables are migrated to UC? @pat&amp;nbsp;any suggestions please.&lt;/P&gt;</description>
      <pubDate>Mon, 19 Dec 2022 21:31:57 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16677#M593</guid>
      <dc:creator>karthik_p</dc:creator>
      <dc:date>2022-12-19T21:31:57Z</dc:date>
    </item>
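    <!--
      A sketch of how the external-table upgrade itself can be scripted, relevant to the follow-up above; treat it
      as an assumption-laden example rather than a complete procedure. It assumes a UC-enabled cluster, an existing
      UC catalog `main`, and hive_metastore external tables whose storage is already covered by a Unity Catalog
      external location and storage credential; the catalog, schema, and table names are hypothetical. SYNC registers
      the tables in Unity Catalog without moving the underlying files, so the source tables keep working during the
      transition. Jobs that read mount points directly are not rewritten by this step; they would typically be moved
      to UC external locations separately.

      # Dry run first: reports what would be upgraded without changing anything.
      preview = spark.sql("SYNC SCHEMA main.sales_db FROM hive_metastore.sales_db DRY RUN")
      preview.show(truncate=False)

      # Upgrade every eligible external table in the schema into Unity Catalog.
      result = spark.sql("SYNC SCHEMA main.sales_db FROM hive_metastore.sales_db")
      result.show(truncate=False)

      # A single table can be upgraded the same way.
      spark.sql("SYNC TABLE main.sales_db.orders FROM hive_metastore.sales_db.orders")
    -->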
    <item>
      <title>Re: unity catalog external migration and downtime</title>
      <link>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16678#M594</link>
      <description>&lt;P&gt;Yes, right. Maybe @Kaniz Fatma&amp;nbsp;can help you here.&lt;/P&gt;</description>
      <pubDate>Tue, 20 Dec 2022 01:44:23 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unity-catalog-external-migration-and-downtime/m-p/16678#M594</guid>
      <dc:creator>Aviral-Bhardwaj</dc:creator>
      <dc:date>2022-12-20T01:44:23Z</dc:date>
    </item>
  </channel>
</rss>