Hi @APJESK,
Audit logs are critical for security: they track user actions, such as who accessed what data, who ran a job, and who changed permissions. Delivering these logs to an S3 bucket is a standard, must-have practice.
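If you want to inspect the delivered audit logs yourself, here is a minimal sketch (run in a notebook, where `spark` is predefined). The bucket path is a placeholder, and the column names / `serviceName` filter are assumptions based on the standard audit log JSON schema, so adjust them to your own delivery configuration:

```python
# Minimal sketch: read audit logs delivered to S3 as JSON.
# "s3://my-audit-bucket/audit-logs/" is a placeholder for your delivery path.
from pyspark.sql import functions as F

audit_df = spark.read.json("s3://my-audit-bucket/audit-logs/")

# Example: recent account-level actions (e.g. permission changes) and who made them.
(audit_df
 .select("timestamp", "userIdentity.email", "serviceName", "actionName", "requestParams")
 .filter(F.col("serviceName") == "accounts")
 .orderBy(F.col("timestamp").desc())
 .show(20, truncate=False))
```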
Additionally, you have system tables, which hold your account's operational data in the system catalog. System tables can be used for historical observability across your account. Note that you may have to configure your VPC endpoint policy to allow access to the S3 bucket where your region's system tables data is stored.
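For example, the audit events are also exposed as the `system.access.audit` system table, so you can query them directly. A minimal sketch, assuming system tables are enabled in your account and you have SELECT on the `system` catalog:

```python
# Minimal sketch: query the audit log system table for recent permission-related actions.
recent_changes = spark.sql("""
    SELECT event_time, user_identity.email AS actor, service_name, action_name
    FROM system.access.audit
    WHERE event_date >= current_date() - INTERVAL 7 DAYS
      AND action_name ILIKE '%permission%'
    ORDER BY event_time DESC
    LIMIT 50
""")
display(recent_changes)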
Please also refer to the Monitoring section of the official docs for setting up dashboards: https://docs.databricks.com/aws/en/compute/serverless/best-practices#monitor-the-cost-of-serverless-...
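As a starting point for such a dashboard, something like the query below aggregates serverless DBU usage from `system.billing.usage`. This is a sketch; the `%SERVERLESS%` SKU filter is an assumption, so refine it for the SKUs you actually use:

```python
# Minimal sketch: daily serverless DBU consumption over the last 30 days.
serverless_usage = spark.sql("""
    SELECT usage_date,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= current_date() - INTERVAL 30 DAYS
      AND sku_name LIKE '%SERVERLESS%'
    GROUP BY usage_date, sku_name
    ORDER BY usage_date
""")
display(serverless_usage)
```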
Serverless compute is Databricks-managed, so as a developer you do not have to worry about infrastructure, scaling, etc. However, if you prefer more visibility:
- Spark logs are not available when using serverless notebooks and jobs; users only have access to client-side application logs, so those are what can be collected.
- The Spark UI is not available. Instead, use the query profile to view information about your Spark queries (see Query profile). This information is also accessible via the REST API (see the sketch after this list).
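For the REST route, the Query History API returns the query metrics that back the query profile. A minimal sketch, assuming a personal access token in `DATABRICKS_TOKEN` and the workspace URL in `DATABRICKS_HOST`; the parameters and response fields shown (`include_metrics`, `res`) should be verified against the current API reference:

```python
# Minimal sketch: list recent SQL queries with metrics via the Query History API.
import os
import requests

resp = requests.get(
    f"{os.environ['DATABRICKS_HOST']}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    params={"max_results": 25, "include_metrics": "true"},
)
resp.raise_for_status()
for q in resp.json().get("res", []):
    print(q.get("query_id"), q.get("status"), q.get("duration"))
```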
You could also explore https://docs.databricks.com/aws/en/lakehouse-monitoring/
Thanks & Regards,
Nandini