can we parameterize the tags in the job compute
08-05-2024 04:23 AM
I want to monitor the cost of my Databricks job computes better.
I am using tags on the cluster to monitor cost.
The tag values are static as of now.
Can we parameterize the job cluster compute so that I can pass the tag values at runtime and monitor the cost better?
08-05-2024 04:50 AM
Hi @,
If you're using ADF, you can look at the article below:
Applying Dynamic Tags To Databricks Job Clusters in Azure Data Factory | by Kyle Hale | Medium
If not, I think you can write some code that uses the endpoint below. The idea is that, before executing the actual job, you call this endpoint to overwrite the existing tags:
Update job settings partially | Jobs API | REST API reference | Databricks on AWS
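A minimal sketch of that idea, assuming the Jobs API 2.1 `/api/2.1/jobs/update` endpoint; the workspace URL, token, job ID, and the `job_cluster_key` value (`"main"`) are all placeholders you'd replace with your own:

```python
import json
import urllib.request

# Hypothetical values -- substitute your workspace URL and a valid PAT.
HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"


def build_tag_update(job_id, tags):
    """Build the partial-update payload that overwrites the job
    cluster's custom_tags with the given dict."""
    return {
        "job_id": job_id,
        "new_settings": {
            "job_clusters": [
                {
                    # Assumed cluster key -- must match the key in your job spec.
                    "job_cluster_key": "main",
                    "new_cluster": {"custom_tags": tags},
                }
            ]
        },
    }


def update_job_tags(job_id, tags):
    """POST the payload to the partial-update endpoint before
    triggering the actual run."""
    req = urllib.request.Request(
        f"{HOST}/api/2.1/jobs/update",
        data=json.dumps(build_tag_update(job_id, tags)).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Note this mutates the job definition itself, so the tags apply to whatever run starts next on that job.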
08-05-2024 05:02 AM
I am not using ADF but Databricks Workflows,
so option 1 is not applicable to my use case.
Regarding option 2:
I am calling my workflow via the REST API from a third-party scheduler,
and I am running multiple instances of the workflow in parallel,
so changing the job settings may not be a good idea for me.
I was hoping there is a way to pass the tag as a parameter while calling the workflow?
08-05-2024 05:35 AM
Unfortunately, your options are pretty limited here. As you can see, you can't pass a tags object when triggering a run 😕
https://docs.databricks.com/api/workspace/jobs/runnow#job_parameters

