<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: bucket ownership of s3 bucket in databricks in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/bucket-ownership-of-s3-bucket-in-databricks/m-p/4349#M1095</link>
    <description>&lt;P&gt;I suspect you provided a DBFS path when saving the data, which is why it was saved under your workspace root bucket.&lt;/P&gt;&lt;P&gt;For the workspace root bucket, the Databricks workspace interacts with the Databricks credential to ensure that Databricks has access to the bucket and is able to modify it.&lt;/P&gt;</description>
    <pubDate>Tue, 06 Jun 2023 00:27:40 GMT</pubDate>
    <dc:creator>User16752239289</dc:creator>
    <dc:date>2023-06-06T00:27:40Z</dc:date>
    <item>
      <title>bucket ownership of s3 bucket in databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/bucket-ownership-of-s3-bucket-in-databricks/m-p/4348#M1094</link>
      <description>&lt;P&gt;We had a Databricks job with strange behavior:&lt;/P&gt;&lt;P&gt;when we passed the string 'output_path' to &lt;B&gt;saveAsTextFile&lt;/B&gt; instead of the output_path variable, the data was saved to the following path:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt; s3://dev-databricks-hy1-rootbucket/nvirginiaprod/3219117805926709/output_path&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;The problem is that each object saved under the path above is owned not by us but by aws-prod.&lt;/P&gt;&lt;P&gt;So the question is: why is the object owner aws-prod and not our prod AWS account?&lt;/P&gt;&lt;P&gt;&lt;B&gt;Thanks for the help.&lt;/B&gt;&lt;/P&gt;&lt;P&gt;Attached are the notebook for the job and a screenshot of the S3 bucket.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="s3"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/204i4842F04F994BC735/image-size/large?v=v2&amp;amp;px=999" role="button" title="s3" alt="s3" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 15 May 2023 16:13:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/bucket-ownership-of-s3-bucket-in-databricks/m-p/4348#M1094</guid>
      <dc:creator>LidorAbo</dc:creator>
      <dc:date>2023-05-15T16:13:15Z</dc:date>
    </item>
    <item>
      <title>Re: bucket ownership of s3 bucket in databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/bucket-ownership-of-s3-bucket-in-databricks/m-p/4349#M1095</link>
      <description>&lt;P&gt;I suspect you provided a DBFS path when saving the data, which is why it was saved under your workspace root bucket.&lt;/P&gt;&lt;P&gt;For the workspace root bucket, the Databricks workspace interacts with the Databricks credential to ensure that Databricks has access to the bucket and is able to modify it.&lt;/P&gt;</description>
      <pubDate>Tue, 06 Jun 2023 00:27:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/bucket-ownership-of-s3-bucket-in-databricks/m-p/4349#M1095</guid>
      <dc:creator>User16752239289</dc:creator>
      <dc:date>2023-06-06T00:27:40Z</dc:date>
    </item>
  </channel>
</rss>

