<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Unity Catalog &amp; Delta Lake in Data Governance</title>
    <link>https://community.databricks.com/t5/data-governance/unity-catalog-delta-lake/m-p/22414#M774</link>
    <description>&lt;P&gt;When setting up a Databricks account, we need to provide storage (S3 buckets) to store the final cleaned or aggregated data. If Unity Catalog is enabled at the beginning of account setup, we need to provide S3 buckets again to store the metadata. My question: can one S3 bucket hold both the final cleaned data and the Unity Catalog, or do they need to be separate S3 buckets?&lt;/P&gt;</description>
    <pubDate>Mon, 14 Nov 2022 21:22:02 GMT</pubDate>
    <dc:creator>Tram</dc:creator>
    <dc:date>2022-11-14T21:22:02Z</dc:date>
    <item>
      <title>Unity Catalog &amp; Delta Lake</title>
      <link>https://community.databricks.com/t5/data-governance/unity-catalog-delta-lake/m-p/22414#M774</link>
      <description>&lt;P&gt;When setting up a Databricks account, we need to provide storage (S3 buckets) to store the final cleaned or aggregated data. If Unity Catalog is enabled at the beginning of account setup, we need to provide S3 buckets again to store the metadata. My question: can one S3 bucket hold both the final cleaned data and the Unity Catalog, or do they need to be separate S3 buckets?&lt;/P&gt;</description>
      <pubDate>Mon, 14 Nov 2022 21:22:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unity-catalog-delta-lake/m-p/22414#M774</guid>
      <dc:creator>Tram</dc:creator>
      <dc:date>2022-11-14T21:22:02Z</dc:date>
    </item>
    <item>
      <title>Re: Unity Catalog &amp; Delta Lake</title>
      <link>https://community.databricks.com/t5/data-governance/unity-catalog-delta-lake/m-p/22415#M775</link>
      <description>&lt;P&gt;Hi @Tram Nguyen,&lt;/P&gt;&lt;P&gt;I am not sure I understood correctly, but I believe you are referring to:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;B&gt;Root storage for a workspace&lt;/B&gt;: root storage for workspace objects like cluster logs, notebook revisions, job results, and libraries (https://docs.databricks.com/administration-guide/account-api/aws-storage.html#configure-aws-storage)&lt;/LI&gt;&lt;LI&gt;&lt;B&gt;Metastore storage&lt;/B&gt;: each metastore is configured with a root storage location in an S3 bucket in your AWS account. This storage location is used for metadata and &lt;A href="https://docs.databricks.com/data-governance/unity-catalog/index.html#managed-tables" alt="https://docs.databricks.com/data-governance/unity-catalog/index.html#managed-tables" target="_blank"&gt;managed tables&lt;/A&gt; data (https://docs.databricks.com/data-governance/unity-catalog/index.html#metastores)&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;It is not recommended to reuse the workspace S3 bucket for the metastore.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/1201i733AD0B60FC55C38/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Pat.&lt;/P&gt;</description>
      <pubDate>Tue, 15 Nov 2022 15:27:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unity-catalog-delta-lake/m-p/22415#M775</guid>
      <dc:creator>Pat</dc:creator>
      <dc:date>2022-11-15T15:27:27Z</dc:date>
    </item>
    <item>
      <title>Re: Unity Catalog &amp; Delta Lake</title>
      <link>https://community.databricks.com/t5/data-governance/unity-catalog-delta-lake/m-p/22416#M776</link>
      <description>&lt;P&gt;@Tram Nguyen, the links Pat provided above have the detailed information, but as an overview:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;The account (workspace root) S3 bucket should be separate.&lt;/LI&gt;&lt;LI&gt;The Unity Catalog S3 bucket (the metastore, in technical terms) should be separate.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;The Unity Catalog bucket should be configured in the same region as your account so that no network issues arise. Multiple workspaces in that account can use the same Unity Catalog metastore (which is backed by an S3 bucket).&lt;/P&gt;</description>
      <pubDate>Tue, 15 Nov 2022 18:22:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-governance/unity-catalog-delta-lake/m-p/22416#M776</guid>
      <dc:creator>karthik_p</dc:creator>
      <dc:date>2022-11-15T18:22:54Z</dc:date>
    </item>
  </channel>
</rss>