Hi @jgrycz, yes, you can configure delivery of Databricks audit logs to multiple S3 buckets in different AWS accounts.
This can be achieved by setting up a separate storage configuration for each S3 bucket using the Databricks API.
Here is an example of how you can create a new storage configuration:
```bash
curl -X POST \
  'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/storage-configurations' \
  --header "Authorization: Bearer $OAUTH_TOKEN" \
  --data '{ "storage_configuration_name": "databricks-workspace-storageconf-v1", "root_bucket_info": { "bucket_name": "my-company-example-bucket" } }'
```
You would need to repeat this process for each S3 bucket that you want to deliver logs to.
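For completeness, once you also have a credentials configuration, you tie the storage configuration to audit log delivery by creating a log delivery configuration. The sketch below uses placeholder IDs (`<credentials-id>`, `<storage-config-id>`) that would come from the earlier API calls; please verify the field names against the current Account API log delivery documentation:

```bash
# Create an audit log delivery configuration pointing at one of the
# storage configurations created above (IDs below are placeholders).
curl -X POST \
  'https://accounts.cloud.databricks.com/api/2.0/accounts/<databricks-account-id>/log-delivery' \
  --header "Authorization: Bearer $OAUTH_TOKEN" \
  --data '{
    "log_delivery_configuration": {
      "config_name": "audit-logs-to-bucket-1",
      "log_type": "AUDIT_LOGS",
      "output_format": "JSON",
      "credentials_id": "<credentials-id>",
      "storage_configuration_id": "<storage-config-id>",
      "delivery_path_prefix": "audit-logs"
    }
  }'
```

You would create one such log delivery configuration per target bucket, each referencing the corresponding storage configuration.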
Please note that if you want to deliver logs to an AWS account other than the one used for your Databricks workspace, you must add an S3 bucket policy on the target bucket so that the IAM role used for log delivery can write to it.
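As a rough illustration only, such a cross-account bucket policy generally grants the IAM role from your credentials configuration write access to the bucket. The role ARN, bucket name, and action list below are placeholders and assumptions on my part; use the policy shown in the Databricks log delivery documentation as the authoritative version:

```bash
# Hypothetical example: attach a bucket policy in the target AWS account so the
# log delivery IAM role (from the credentials configuration) can write objects.
aws s3api put-bucket-policy \
  --bucket my-company-example-bucket \
  --policy '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::<your-aws-account-id>:role/<log-delivery-role>"},
      "Action": ["s3:PutObject", "s3:GetBucketLocation", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-company-example-bucket",
        "arn:aws:s3:::my-company-example-bucket/*"
      ]
    }]
  }'
```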