As a Databricks account owner or account admin, you can configure daily delivery of billable usage logs in CSV format to an AWS S3 bucket, where the data becomes available for further analytics.
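Log delivery is set up through the Databricks Account API. The sketch below is a minimal, illustrative example, assuming a credentials configuration and a storage configuration (pointing at the IAM role and S3 bucket) have already been created; the account ID, IDs, path prefix, and token are placeholders, and authentication details may differ for your account.

```python
# Minimal sketch: create a billable usage log delivery configuration
# via the Databricks Account API. All IDs and the token are placeholders.
import requests

ACCOUNT_ID = "<databricks-account-id>"   # placeholder
TOKEN = "<account-level-token>"          # placeholder; assumes token-based auth
API = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/log-delivery"

payload = {
    "log_delivery_configuration": {
        "config_name": "billable-usage-csv",
        "log_type": "BILLABLE_USAGE",
        "output_format": "CSV",
        "credentials_id": "<credentials-id>",               # placeholder, created beforehand
        "storage_configuration_id": "<storage-config-id>",  # placeholder, created beforehand
        "delivery_path_prefix": "billable-usage",            # placeholder S3 prefix
    }
}

resp = requests.post(API, json=payload, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json())
```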
A simple pipeline for configuring budget thresholds and alerting could look like this:

- Configure billable usage log delivery to the S3 bucket.
- Configure Auto Loader to load the delivered logs into a Delta table (see the sketch after this list).
- Configure dashboards in Databricks SQL for reporting.
- Configure alerts in Databricks SQL by scheduling a query that checks spend against a threshold.
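The ingestion step could look like the following sketch. It assumes it runs in a Databricks notebook or job where `spark` is available, that the cluster can read the delivery bucket, and that the bucket path, checkpoint path, and target table name (all placeholders here) match your setup.

```python
# Minimal Auto Loader sketch: incrementally ingest the delivered CSV usage
# logs from S3 into a Delta table. Paths and the table name are placeholders.

usage_path = "s3://<log-delivery-bucket>/billable-usage/csv/"       # placeholder
checkpoint_path = "s3://<log-delivery-bucket>/_checkpoints/usage/"  # placeholder

df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("header", "true")
    .option("cloudFiles.inferColumnTypes", "true")
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(usage_path)
)

(
    df.writeStream
    .option("checkpointLocation", checkpoint_path)
    .trigger(availableNow=True)       # process all new files, then stop (suits a daily job)
    .toTable("billing.usage_logs")    # placeholder Delta table name
)
```

From there, a Databricks SQL query against the ingested table (for example, summing DBUs per workspace per month) can back both the reporting dashboard and a scheduled alert that fires when the aggregate crosses your budget threshold.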