Delivery audit logs to multiple S3 buckets

jgrycz
New Contributor III

Hi!
Am I able to configure delivery of Databricks audit logs to multiple S3 buckets (on different AWS accounts)? 

Thanks in Advance!

1 ACCEPTED SOLUTION

Kaniz
Community Manager

Hi @jgrycz, yes, you can configure delivery of Databricks audit logs to multiple S3 buckets on different AWS accounts.

This is done by creating a separate storage configuration for each S3 bucket using the Databricks Account API.

Here is an example of how you can configure a new storage configuration:

bash
# Create a storage configuration that points at the target S3 bucket
curl -X POST 'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/storage-configurations' \
  --header "Authorization: Bearer $OAUTH_TOKEN" \
  -d '{"storage_configuration_name": "databricks-workspace-storageconf-v1", "root_bucket_info": {"bucket_name": "my-company-example-bucket"}}'

You would need to repeat this process for each S3 bucket you want to deliver logs to, for example with a small loop like the sketch below.
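Just as a rough sketch (the bucket names, configuration names, and the OAUTH_TOKEN variable here are placeholders, not values from your account):

bash
# Sketch only: replace the account ID, token, and bucket names with your own.
for BUCKET in audit-logs-account-a audit-logs-account-b; do
  curl -X POST 'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/storage-configurations' \
    --header "Authorization: Bearer $OAUTH_TOKEN" \
    -d "{\"storage_configuration_name\": \"storageconf-${BUCKET}\", \"root_bucket_info\": {\"bucket_name\": \"${BUCKET}\"}}"
done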

Please note that if you want to deliver logs to an S3 bucket in an AWS account other than the one used for your Databricks workspace, you must also add a bucket policy on that bucket.
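As an illustration only: the role ARN, bucket name, and action list below are assumptions, so take the exact policy from the Databricks log delivery documentation for your setup. A cross-account bucket policy applied with the AWS CLI could look roughly like this:

bash
# Illustration only: the role ARN, bucket name, and actions are placeholders;
# use the exact bucket policy from the Databricks log delivery documentation.
aws s3api put-bucket-policy --bucket my-company-example-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowDatabricksLogDelivery",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::<aws-account-id>:role/<log-delivery-role>"},
    "Action": ["s3:GetBucketLocation", "s3:ListBucket", "s3:PutObject"],
    "Resource": ["arn:aws:s3:::my-company-example-bucket", "arn:aws:s3:::my-company-example-bucket/*"]
  }]
}'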


2 REPLIES


jgrycz
New Contributor III

Thank you, @Kaniz!