
Delivering audit logs to multiple S3 buckets

jgrycz
New Contributor III

Hi!
Am I able to configure delivery of Databricks audit logs to multiple S3 buckets (on different AWS accounts)? 

Thanks in advance!

1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @jgrycz, yes, you can configure delivery of Databricks audit logs to multiple S3 buckets in different AWS accounts.

This can be achieved by setting up a separate storage configuration for each S3 bucket using the Databricks API.

Here is an example of how you can configure a new storage configuration:

bash
curl -X POST 'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/storage-configuration...' \
  --header "Authorization: Bearer $OAUTH_TOKEN" \
  -d '{
    "storage_configuration_name": "databricks-workspace-storageconf-v1",
    "root_bucket_info": { "bucket_name": "my-company-example-bucket" }
  }'

You would need to repeat this process for each S3 bucket that you want to deliver logs to. 
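For completeness, each storage configuration (together with a credential configuration for the IAM role that Databricks assumes) is then referenced by a log delivery configuration. The sketch below is a hedged example of that call; the config name and the <credentials-id> / <storage-configuration-id> placeholders are illustrative values you would take from the responses to the earlier calls, and the exact fields should be checked against the current Account API reference.

bash
curl -X POST 'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/log-delivery' \
  --header "Authorization: Bearer $OAUTH_TOKEN" \
  -d '{
    "log_delivery_configuration": {
      "config_name": "audit-logs-to-bucket-1",
      "log_type": "AUDIT_LOGS",
      "output_format": "JSON",
      "credentials_id": "<credentials-id>",
      "storage_configuration_id": "<storage-configuration-id>"
    }
  }'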

Please note that if you want to deliver logs to an AWS account other than the one used for your Databricks workspace, you must add an S3 bucket policy.
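As a rough illustration of that cross-account setup, a policy along these lines is attached to the destination bucket so the log delivery role can write to it. The bucket name, AWS account ID, and role name below are placeholders, and the action list is a minimal sketch; the Databricks log delivery documentation lists the exact set of actions required.

bash
aws s3api put-bucket-policy \
  --bucket my-company-example-bucket \
  --policy '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "DatabricksLogDeliveryWrite",
        "Effect": "Allow",
        "Principal": { "AWS": "arn:aws:iam::<aws-account-id>:role/<log-delivery-role>" },
        "Action": ["s3:GetBucketLocation", "s3:PutObject"],
        "Resource": [
          "arn:aws:s3:::my-company-example-bucket",
          "arn:aws:s3:::my-company-example-bucket/*"
        ]
      }
    ]
  }'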


2 REPLIES


jgrycz
New Contributor III

Thank you, @Kaniz_Fatma!
