Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Can Databricks write query results to s3 in another account via the API

kirkj
New Contributor

I work for a company where we're building a Databricks integration in Node using the @databricks/sql package to query customers' clusters and SQL warehouses. I see documentation for loading data from S3 via a query using STS tokens, where you do something like:

COPY INTO my_json_data
FROM 's3://my-bucket/jsonData' WITH (
  CREDENTIAL (AWS_ACCESS_KEY = '...', AWS_SECRET_KEY = '...', AWS_SESSION_TOKEN = '...')
)
FILEFORMAT = JSON
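
For reference, a statement like that can be executed from Node roughly as follows. This is a minimal sketch following the documented @databricks/sql client flow; the host, HTTP path, and token values are placeholders:

import { DBSQLClient } from '@databricks/sql';

async function runCopyInto(): Promise<void> {
  const client = new DBSQLClient();
  await client.connect({
    host: 'my-workspace.cloud.databricks.com', // placeholder workspace host
    path: '/sql/1.0/warehouses/abc123',        // placeholder warehouse HTTP path
    token: process.env.DATABRICKS_TOKEN ?? '', // placeholder access token
  });

  const session = await client.openSession();

  // The short-lived STS credentials are inlined in the SQL text itself.
  const operation = await session.executeStatement(`
    COPY INTO my_json_data
    FROM 's3://my-bucket/jsonData' WITH (
      CREDENTIAL (AWS_ACCESS_KEY = '...', AWS_SECRET_KEY = '...', AWS_SESSION_TOKEN = '...')
    )
    FILEFORMAT = JSON
  `);
  await operation.fetchAll(); // drain the result so the operation completes
  await operation.close();

  await session.close();
  await client.close();
}

runCopyInto().catch(console.error);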

but the inverse doesn't seem to be supported: there's no apparent way to use those same credentials to copy query results directly into S3 with something like

COPY INTO 's3://my-bucket/jsonData' 
FROM (SELECT * FROM my_json_data) WITH
( CREDENTIAL (AWS_ACCESS_KEY = '...', AWS_SECRET_KEY = '...', AWS_SESSION_TOKEN = '...') )
FILEFORMAT = JSON

Does anyone know if a capability like that exists? We'd like to run a query and have the customer's Databricks instance write the results to an S3 bucket in our account.

1 REPLY

Walter_C
Databricks Employee

Have you been able to get a response on this topic? Based on the information I can see, writing to an S3 bucket outside your own account might not be supported.
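
If the inline-credential route isn't supported for writes, one workaround that stays within supported APIs is to fetch the results through the driver and upload them to S3 from your own service, so credentials for your bucket never leave your account. Below is a minimal sketch assuming @databricks/sql plus the AWS SDK v3 S3 client; the host, warehouse path, bucket, key, and region are all placeholders:

import { DBSQLClient } from '@databricks/sql';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

async function queryToS3(): Promise<void> {
  const client = new DBSQLClient();
  await client.connect({
    host: 'my-workspace.cloud.databricks.com', // placeholder
    path: '/sql/1.0/warehouses/abc123',        // placeholder
    token: process.env.DATABRICKS_TOKEN ?? '',
  });
  const session = await client.openSession();

  // Run the query on the customer's warehouse and pull the rows back to Node.
  const operation = await session.executeStatement('SELECT * FROM my_json_data');
  const rows = await operation.fetchAll();
  await operation.close();
  await session.close();
  await client.close();

  // Upload from our side with credentials for our own AWS account, so the
  // customer's Databricks workspace never needs access to our bucket.
  const s3 = new S3Client({ region: 'us-east-1' }); // placeholder region
  await s3.send(new PutObjectCommand({
    Bucket: 'my-bucket',          // placeholder destination bucket
    Key: 'jsonData/results.json', // placeholder object key
    Body: rows.map((row) => JSON.stringify(row)).join('\n'), // newline-delimited JSON
  }));
}

queryToS3().catch(console.error);

Note that this pulls the entire result set through the driver, so it's only practical for modestly sized results; larger exports would need chunked fetching (e.g. fetchChunk) or a different mechanism.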
