We recently started using the Data Profiling / Lakehouse Monitoring feature from Databricks (https://learn.microsoft.com/en-us/azure/databricks/data-quality-monitoring/data-profiling/). Data Profiling runs its profiling job on serverless compute.
Is there any way to tag this serverless compute or the data profiling (monitoring) job? Cost tracking is essential for our use case, and as far as I understand, the only way to achieve it is by tagging the jobs, compute, etc.
I know we can get the overall cost of Data Profiling for a workspace from the system tables. But our workspace runs multiple use cases, and cost needs to be tracked at the use-case level. We usually achieve this with tags, but for Data Profiling that seems impossible.
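For context, this is roughly how we pull the workspace-level Data Profiling cost today. A minimal sketch: the column names follow the `system.billing.usage` schema, but the exact `billing_origin_product` value for Lakehouse Monitoring and the workspace id are assumptions, so treat the filter as illustrative rather than definitive.

```python
# Hypothetical workspace id for illustration only.
WORKSPACE_ID = "1234567890"

# Workspace-level cost query against the billing system table.
# custom_tags is included to show the gap: for monitoring jobs it comes
# back empty, so there is nothing to group by at the use-case level.
query = f"""
SELECT
  usage_date,
  SUM(usage_quantity)    AS dbus,
  any_value(custom_tags) AS tags  -- empty for profiling/monitoring jobs
FROM system.billing.usage
WHERE workspace_id = '{WORKSPACE_ID}'
  AND billing_origin_product = 'LAKEHOUSE_MONITORING'  -- assumed value
GROUP BY usage_date
ORDER BY usage_date
"""

print(query)
```

This gives us a daily total for the whole workspace, but with no tags on the serverless compute there is no way to split it per use case.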