<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Can Databricks write query results to s3 in another account via the API in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/can-databricks-write-query-results-to-s3-in-another-account-via/m-p/102188#M41007</link>
    <description>&lt;P&gt;Have you been able to get a response on this topic? Based on the information I can see, writing to an S3 bucket outside your account might not be supported.&lt;/P&gt;</description>
    <pubDate>Mon, 16 Dec 2024 03:13:08 GMT</pubDate>
    <dc:creator>Walter_C</dc:creator>
    <dc:date>2024-12-16T03:13:08Z</dc:date>
    <item>
      <title>Can Databricks write query results to s3 in another account via the API</title>
      <link>https://community.databricks.com/t5/data-engineering/can-databricks-write-query-results-to-s3-in-another-account-via/m-p/83940#M37079</link>
      <description>&lt;P&gt;I work for a company where we are trying to create a Databricks integration in Node using the&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/23348"&gt;@DataBricks&lt;/a&gt;/sql package to query customers' clusters or warehouses. I see &lt;A href="https://docs.databricks.com/en/ingestion/cloud-object-storage/copy-into/temporary-credentials.html" target="_self"&gt;documentation&lt;/A&gt; for loading data from S3 via a query using STS tokens, where you do something like:&lt;/P&gt;&lt;PRE&gt;COPY INTO my_json_data
FROM 's3://my-bucket/jsonData' WITH (
  CREDENTIAL (AWS_ACCESS_KEY = '...', AWS_SECRET_KEY = '...', AWS_SESSION_TOKEN = '...')
)
FILEFORMAT = JSON&lt;/PRE&gt;&lt;P&gt;but the inverse doesn't seem to be true: you can't use these credentials to copy query results directly into S3 with something like&lt;/P&gt;&lt;PRE&gt;COPY INTO 's3://my-bucket/jsonData'
FROM (SELECT * FROM my_json_data) WITH
( CREDENTIAL (AWS_ACCESS_KEY = '...', AWS_SECRET_KEY = '...', AWS_SESSION_TOKEN = '...') )
FILEFORMAT = JSON&lt;/PRE&gt;&lt;P&gt;Does anyone know if capabilities like that exist? We'd like to be able to run a query and have the customer's Databricks instance write results to an S3 bucket in our account.&lt;/P&gt;</description>
      <pubDate>Thu, 22 Aug 2024 15:34:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/can-databricks-write-query-results-to-s3-in-another-account-via/m-p/83940#M37079</guid>
      <dc:creator>kirkj</dc:creator>
      <dc:date>2024-08-22T15:34:10Z</dc:date>
    </item>
    <item>
      <title>Re: Can Databricks write query results to s3 in another account via the API</title>
      <link>https://community.databricks.com/t5/data-engineering/can-databricks-write-query-results-to-s3-in-another-account-via/m-p/102188#M41007</link>
      <description>&lt;P&gt;Have you been able to get a response on this topic? Based on the information I can see, writing to an S3 bucket outside your account might not be supported.&lt;/P&gt;</description>
      <pubDate>Mon, 16 Dec 2024 03:13:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/can-databricks-write-query-results-to-s3-in-another-account-via/m-p/102188#M41007</guid>
      <dc:creator>Walter_C</dc:creator>
      <dc:date>2024-12-16T03:13:08Z</dc:date>
    </item>
  </channel>
</rss>

